Meta asks advertisers to disclose use of AI in political, social ads


As the side-effects of generative AI begin to haunt people, Meta has asked advertisers to disclose whenever a social issue, electoral, or political ad contains an image or video that has been digitally created or altered.

Advertisers will now have to disclose whenever political ads contain a photorealistic image or video, or realistic-sounding audio, that was digitally created or altered to depict a real person saying or doing something they did not say or do.

The policy will also apply to ads that depict a realistic-looking person who does not exist or a realistic-looking event that did not happen, that alter footage of a real event, or that depict a realistic event that allegedly occurred but is not captured in a true image, video, or audio recording.

“Advertisers running these ads do not need to disclose when content is digitally created or altered in ways that are inconsequential or immaterial to the claim, assertion, or issue raised in the ad,” Meta said in a blog post late on Wednesday.

This may include adjusting image size, cropping an image, colour correction, or image sharpening, unless such changes are consequential or material to the claim, assertion, or issue raised in the ad.

The new policy will go into effect in the new year and will be required globally. Meta will add information on the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered. This information will also appear in the Ad Library.

“If we determine that an advertiser doesn’t disclose as required, we will reject the ad and repeated failure to disclose may result in penalties against the advertiser. We will share additional details about the specific process advertisers will go through during the ad creation process,” said the company.

