The government should introduce a codified data protection regime at the earliest to ensure data protection and enforcement mechanisms, and to address the privacy risks associated with facial recognition technology (FRT), says a recent NITI Aayog discussion paper.

The Aayog, in the paper titled 'Responsible AI for All', recommended that rigorous standards for the processing, storage and retention of sensitive biometric data be adequately addressed in any proposed data protection regime.

"It is pertinent to mention that FRT like other intelligent algorithms, is fundamentally a data intensive technology. In order to ensure propriety and legality in the manner in which data processing happens to train and develop FRT systems, it is imperative to have a codified data protection regime in the country at the earliest," it said.

In 2019, the government introduced the Personal Data Protection (PDP) Bill in parliament, which was subsequently withdrawn this year.

The government has clarified that this withdrawal is temporary, and a new data protection bill will be tabled in parliament.

"The new data protection bill must retain the framework to ensure data protection, including obligations, enforcement mechanisms, a regulatory agency, penalties, and remedies from the PDP Bill, 2019," the Aayog added.

FRT refers to an artificial intelligence (AI) system that identifies or verifies a person from image or video data processed by an underlying algorithm. FRT primarily performs three functions: facial detection, feature extraction and facial recognition.
FRT has generated domestic and international debate: it promises more efficient and timely execution of existing processes in different sectors, yet it also poses risks to basic human and fundamental rights such as individual privacy, equality, free speech and freedom of movement.
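
For illustration, the three functions the paper lists correspond to a typical processing pipeline. The sketch below is a minimal example assuming the open-source face_recognition Python library and placeholder file names; the paper does not prescribe any particular tool or implementation.

```python
# Illustrative FRT pipeline: detection -> feature extraction -> recognition.
# Library choice and file names are assumptions for illustration only.
import face_recognition

# 1. Facial detection: locate faces in an enrolment image.
image = face_recognition.load_image_file("enrolment_photo.jpg")
face_locations = face_recognition.face_locations(image)

# 2. Feature extraction: encode each detected face as a 128-dimensional vector.
known_encodings = face_recognition.face_encodings(image, face_locations)

# 3. Facial recognition: compare a new face against the stored encodings.
probe = face_recognition.load_image_file("probe_photo.jpg")
probe_encodings = face_recognition.face_encodings(probe)
if probe_encodings:
    matches = face_recognition.compare_faces(known_encodings, probe_encodings[0])
    print("Match found" if any(matches) else "No match")
```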

According to the Aayog's discussion paper, such a regime must not be limited to regulating data processing by private entities but must adequately codify protections for the fundamental right to privacy against state agencies (including law enforcement).

"Sensitive personal data should be protected under the new data protection law, including biometric data such as facial images and scans," it emphasised.

The Supreme Court set out a three-pronged test of legality, reasonability and proportionality in the Puttaswamy judgement, it added.

The NITI Aayog also suggested that organisations deploying an AI system could constitute an ethics committee to assess the ethical implications and oversee mitigation measures.

In India, as part of its efforts to improve the travel experience, the Ministry of Civil Aviation has initiated the Digi Yatra programme, under which FRT and facial verification technology (FVT) will be used at different process points.

FVT will be used at airports to verify travellers' identity, validate tickets and carry out other checks as needed, based on the operational needs of airport processes.
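
As a rough sketch of what such one-to-one facial verification involves (an illustration only; Digi Yatra's actual implementation is not described in the paper), a traveller's face captured at a checkpoint can be compared against the single face template enrolled with their ticket. The library, file names and threshold below are assumptions.

```python
# Hypothetical 1:1 verification step (facial verification, not 1:N recognition):
# compare the face captured at a checkpoint with the single enrolled template.
import face_recognition

enrolled = face_recognition.load_image_file("enrolled_selfie.jpg")   # captured at enrolment
live = face_recognition.load_image_file("checkpoint_capture.jpg")    # captured at the gate

enrolled_enc = face_recognition.face_encodings(enrolled)
live_enc = face_recognition.face_encodings(live)

if enrolled_enc and live_enc:
    # A smaller distance means a closer match; 0.6 is the library's default threshold.
    distance = face_recognition.face_distance([enrolled_enc[0]], live_enc[0])[0]
    print("Verified" if distance < 0.6 else "Not verified")
```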

The Aayog has sought views on the paper till November 30.