Worldcoin, a project co-founded by Sam Altman, the CEO of OpenAI (the company behind ChatGPT), has repeatedly run up against privacy regulations. Most recently, Hong Kong's privacy regulator ruled that Worldcoin violated the Personal Data (Privacy) Ordinance and ordered it to stop collecting users' iris and facial images.
Worldcoin uses iris scanning to verify that a user is a unique human. According to its official website, more than 5 million people in over 160 countries have already completed verification.
Citing the privacy concerns raised by collecting iris data, the Hong Kong Office of the Privacy Commissioner for Personal Data (PCPD) stated that between December 2023 and January 2024 it carried out 10 enforcement actions at Worldcoin's six operating locations in Hong Kong: Yau Ma Tei, Kwun Tong, Wan Chai, Cyberport, Central, and Causeway Bay. In these actions, officers or investigators posed as ordinary members of the public to gather evidence covertly. After obtaining a court warrant on January 31, 2024, the PCPD searched the operating locations and, following two rounds of inquiries, completed the investigation.
Based on the investigation results, Privacy Commissioner for Personal Data Ada Chung Lai-ling ruled that Worldcoin's operations in Hong Kong violated the Privacy Ordinance.
Further reading:
Foreign media: Worldcoin may collaborate with OpenAI and PayPal! If true, will it raise more regulatory concerns?
Worldcoin’s collection of facial and iris images deemed unnecessary
The Office of the Privacy Commissioner for Personal Data found that the Worldcoin project's collection of facial and iris images is unnecessary: verifying that a user is human does not require iris scanning and could instead be assessed by staff at the operating locations.
Moreover, biometric data is sensitive personal information, and its improper use or disclosure can have serious consequences. In the Office's view, collecting facial and iris images is not justified when less privacy-intrusive means of verification are available.
Lack of important information in Chinese and failure to proactively inform the public of risks
In addition, Worldcoin's Privacy Statement and Biometric Data Consent Form had no Chinese versions. Staff at the operating locations neither explained the documents to participants nor confirmed that they understood their contents, and they did not inform participants of the risks of providing biometric data to the project or answer questions from the public.
Excessive retention of personal data
The investigation also found that Worldcoin retains personal data for up to 10 years to train the AI models used in its identity verification program. The Office of the Privacy Commissioner for Personal Data considers this retention period excessive.
The Hong Kong Privacy Commissioner has issued an enforcement notice to Worldcoin, requiring the project to stop using iris scanning devices to collect the iris and facial images of Hong Kong residents. The Office added that anyone who finds iris scanning devices still in place at Worldcoin project locations can report them directly so that it can take enforcement action. Worldcoin's operations are therefore effectively “banned” in Hong Kong.
Sources:
CoinDesk, PCPD, Reuters