Platforms with AI bias won’t get sanctuary under safe harbour provision: MoS IT


Union Minister of State for Electronics and IT Rajeev Chandrasekhar said on Thursday that the safe harbour provision will not apply to social media platforms if they promote search bias, algorithmic bias or AI bias.

Responding to a user on ‘X’, the minister said that search bias, algorithmic bias and biased AI models are real violations of the safety and trust obligations placed on platforms under Rule 3(1)(b) of the IT Rules, part of India’s regulatory framework.

“Those who are aggrieved by this can file FIRs against such platforms and safe harbor/immunity under Sec 79 will not apply to these cases,” he told the X user.

The safe harbour provision gives social media platforms legal immunity for content shared by users on their platforms.

The government intends to remove such provisions under the proposed Digital India Bill.

The X user had posted screenshots of a conversation with Google Bard in which he asked the chatbot to summarise an article from a news website. Bard responded that it could not summarise the article because the website allegedly spreads false information and is biased.

Last month, the Ministry of Electronics and IT issued notices to social media intermediaries X (formerly Twitter), YouTube, and Telegram, warning them to remove any kind of Child Sexual Abuse Material (CSAM) from their platforms on the Indian internet or face action.

“The government is determined to build a safe and trusted internet under the IT rules. The rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms. If they do not act swiftly, their safe harbour under section 79 of the IT Act would be withdrawn and consequences under the Indian law will follow,” the minister had warned.

Google-owned YouTube said it did not detect any CSAM on its platform. Encrypted messaging platform Telegram said it is “always committed” to upholding legal and ethical standards on its platform.
