X Imposes Strict AI Disclosure Rules for Military Content; Violators Face 90-Day Monetization Ban

X Cracks Down on AI-Generated Military Content: New Penalties for Creators Amid Global Conflicts

Nikita Bier, Head of Product at X, has announced a significant update to the platform’s Creator Monetization standards. In light of ongoing global conflicts, X is prioritizing information accuracy and moving to prevent the spread of misleading AI-generated war imagery and videos.

Mandatory Disclosure for AI Warfare Content

Under the new regulations, creators who produce AI-generated videos related to military operations, combat, or high-tension geopolitical events must explicitly disclose the use of AI.

  • The Penalty: If a creator fails to label AI content and is caught by the platform’s detection systems, their monetization privileges will be suspended for 90 days.

  • Zero Tolerance for Recurrence: Any repeat violations will result in a permanent ban from the X Creator Ad Revenue Sharing program.
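The escalation rule in the bullets above is a simple two-step policy, which can be sketched as a lookup. This is an illustrative model only; the function name and penalty strings are hypothetical, not X's actual implementation:

```python
def penalty_for(violation_count: int) -> str:
    """Map a creator's count of unlabeled-AI violations to the
    sanction described in the policy: first offense suspends
    monetization for 90 days, any repeat is a permanent ban."""
    if violation_count <= 0:
        return "no action"
    if violation_count == 1:
        return "90-day monetization suspension"
    return "permanent ban from Ad Revenue Sharing"
```
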

Multi-Layered Detection System

X is employing a robust verification process to enforce these rules. Detection is not limited to automated software; the platform is also leaning heavily on Community Notes, X's crowdsourced fact-checking feature, to identify and flag unlabeled synthetic media in real time.

Over the past year, many creators have used AI to produce hyper-realistic fake war scenes to boost engagement and earn ad revenue. This policy aims to break the cycle of monetizing "fake panic," which can affect real-world stability.

X's use of Community Notes as an enforcement signal cuts both ways: automated AI-detects-AI tools may be less accurate than human reviewers examining the context of images and videos, but crowdsourcing that judgment creates a decentralized fact-checking system built on user transparency.

It's anticipated that X may begin using the C2PA standard (from the Coalition for Content Provenance and Authenticity), provenance metadata embedded in image and video files by various AI providers (e.g., OpenAI, Adobe, Google), to automatically verify content origins. If that metadata indicates AI generation but the creator has not disclosed it, the system could immediately flag the post.
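To make the provenance check concrete: C2PA Content Credentials are serialized into JUMBF boxes embedded in the media file itself, so even a crude byte scan can hint at their presence. The sketch below is a heuristic assumption about how such a flagging step might look, not X's actual pipeline; a real implementation would parse and cryptographically validate the manifest with a C2PA SDK.

```python
def may_contain_c2pa_manifest(data: bytes) -> bool:
    """Cheap heuristic: C2PA manifests live in JUMBF boxes, so the
    JUMBF box type ("jumb") and the C2PA manifest label ("c2pa")
    both tend to appear as raw bytes in files carrying credentials."""
    return b"jumb" in data and b"c2pa" in data

def needs_flag(data: bytes, creator_disclosed_ai: bool) -> bool:
    # Flag only when provenance metadata suggests AI generation
    # and the creator did not add the required AI label.
    return may_contain_c2pa_manifest(data) and not creator_disclosed_ai
```
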

The 90-day monetization suspension is a severe penalty for news influencers and accounts that track world events, and it may push some creators toward other platforms or toward far greater caution in using AI tools.

 


 

Source: TechCrunch 

