“We’re announcing a new policy to help people understand when a social issue, election, or political advertisement on Facebook or Instagram has been digitally created or altered, including through the use of AI. This policy will go into effect in the new year and will be required globally,” the blog said.
Advertisers will have to disclose when a digitally modified photorealistic image or video depicts a real person saying or doing something they did not say or do.
Disclosure will also be required if the altered image or video depicts a realistic-looking person who does not exist or a realistic-looking event that did not happen, or alters footage of a real event.
The rule will likewise apply to digitally altered images or videos depicting a realistic event that allegedly occurred but for which the content is not a true image, video, or audio recording of that event.
The development comes a day after the Ministry of Electronics and IT issued an advisory to social media platforms, following the circulation of a deepfake video of actress Rashmika Mandanna on those platforms.
Meta, which owns Facebook, Instagram and WhatsApp, said it will add information to the ad when an advertiser discloses in the advertising flow that the content is digitally created or altered.
“This information will also appear in the Ad Library. If we determine that an advertiser doesn’t disclose as required, we will reject the ad and repeated failure to disclose may result in penalties against the advertiser,” the social media firm said.