
X Imposes Strict AI Disclosure Rules for Military Content; Violators Face a 90-Day Monetization Ban

X Cracks Down on AI-Generated Military Content: New Penalties for Creators Amid Global Conflicts

Nikita Bier, Head of Product at X, has announced a significant update to the platform’s Creator Monetization standards. In light of ongoing global conflicts, X is prioritizing information accuracy and moving to prevent the spread of misleading AI-generated war imagery and videos.

Mandatory Disclosure for AI Warfare Content

Under the new regulations, creators who produce AI-generated videos related to military operations, combat, or high-tension geopolitical events must explicitly disclose the use of AI.

  • The Penalty: If a creator fails to label AI content and is caught by the platform’s detection systems, their monetization privileges will be suspended for 90 days.

  • Zero Tolerance for Recurrence: Any repeat violations will result in a permanent ban from the X Creator Ad Revenue Sharing program.
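As a purely illustrative sketch of the escalation described above (the function name and return strings are assumptions for illustration, not X's actual API or internal logic), the two-tier penalty can be expressed as:

```python
def penalty_for(unlabeled_ai_violations: int) -> str:
    """Map a creator's count of unlabeled AI-content violations to the
    penalty described in X's policy. Illustrative only."""
    if unlabeled_ai_violations <= 0:
        return "none"
    if unlabeled_ai_violations == 1:
        # First offense: temporary loss of monetization privileges.
        return "90-day monetization suspension"
    # Any repeat offense: permanent removal from the revenue program.
    return "permanent removal from Ad Revenue Sharing"
```

The key design point is that the policy is not graduated beyond two tiers: a single repeat offense jumps straight to a permanent ban.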

Multi-Layered Detection System

X is employing a robust verification process to enforce these rules. Detection is not limited to automated software; the platform is heavily integrating Community Notes, X's crowdsourced fact-checking feature, to identify and flag unlabelled synthetic media in real time.

Over the past year, many creators have used AI to create overly realistic war scenes to boost engagement and earn ad revenue. This policy aims to break the cycle of monetization from "fake panic," which impacts real-world stability.

X's reliance on Community Notes as an enforcement signal is a double-edged sword: it is a tacit admission that AI detecting other AI may be less accurate than human reviewers examining the context of images and videos. The result is a decentralized fact-checking layer that puts transparency in users' hands.

It's anticipated that X may begin using the C2PA (Coalition for Content Provenance and Authenticity) standard, which embeds provenance metadata in image and video files produced by major AI providers (e.g., OpenAI, Adobe, Google), to automatically verify content origins. If that metadata indicates AI generation but the post carries no disclosure label, the system would immediately flag it.
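As an illustrative sketch of how such a check might start (this is an assumption about the approach, not X's implementation): C2PA manifests are carried in JUMBF metadata boxes whose label includes the string "c2pa", so a naive first-pass detector can simply scan a file's bytes for that marker. A real verifier would parse the box structure and validate the cryptographic signatures rather than rely on a byte search.

```python
def looks_like_c2pa(data: bytes) -> bool:
    """Naive heuristic: does this file's byte stream contain the C2PA
    JUMBF manifest-store label? A production verifier would parse the
    JUMBF boxes and check the manifest's signature chain instead."""
    return b"c2pa" in data
```

For example, `looks_like_c2pa(open("photo.jpg", "rb").read())` would return `True` for a file carrying Content Credentials and `False` for one with no such marker (with obvious false-positive risk, since any file containing those four bytes would match).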

The 90-day monetization suspension is a severe penalty for news influencers and accounts that monitor world events; it may drive some creators to other platforms or push them toward greater caution in using AI tools.

 

Google and Back Market Team Up to Launch Official ChromeOS Flex USB Drives for Eco-Friendly PC Revivals. 

 

Source: TechCrunch 
