Illustration: Nick Barclay / The Verge
Meta announced Wednesday that it would require advertisers to disclose when potentially misleading AI-generated or altered content is featured in political, electoral, or social issue ads.
The new rule applies to advertisements on Facebook and Instagram that contain “realistic” images, videos, or audio falsely showing someone doing something they never did, or depicting a real event unfolding differently than it actually did. Ads featuring realistic-looking but fabricated people or events would also need to be disclosed. The policy is expected to go into effect next year.
“In the New Year, advertisers who run ads about social issues, elections & politics with Meta will have to disclose if image or sound has been created or altered…