Recently, TikTok made a change to its U.S. privacy policy, allowing the company to “automatically” collect new types of biometric data, including what it describes as “faceprints” and “voiceprints.” TikTok’s unclear intent, the permanence of the biometric data and potential future uses for it have caused concern among experts who say users’ security and privacy could be at risk.
On June 2, TikTok updated the “Information we collect automatically” portion of its privacy policy to include a new section called “Image and Audio Information,” giving itself permission to gather certain physical and behavioral characteristics from its users’ content. The increasingly popular video-sharing app may now collect biometric information such as “faceprints and voiceprints,” but the update doesn’t define these terms or explain what the company plans to do with the data.
“Generally speaking, these policy changes are very concerning,” Douglas Cuthbertson, a partner in Lieff Cabraser’s Privacy & Cybersecurity practice group, tells TIME. “The changes are vague in a lot of ways. TikTok does not explain what it will do with this biometric information, how and when it will seek consent before taking it, and what it means by ‘faceprints and voiceprints,’ which aren’t defined.”
To put TikTok’s popularity—and the amount of information it has access to—in perspective, the app has 689 million active users worldwide and ranks as the seventh most used social network as of January 2021. More than 100 million Americans use TikTok every month, and 50 million are on the app every day, according to figures shared by the company in August 2020. TikTok did not immediately respond to TIME’s request for comment.
Alessandro Acquisti, a professor of information technology and public policy at Carnegie Mellon University, notes that biometrics, and especially facial biometrics, are unique and permanent identifiers. He says that TikTok’s “faceprints” could potentially be used to re-identify an individual across a variety of scenarios. Since the information isn’t critical to the functioning of the app and the phrasing of the update is vague, Acquisti says it’s difficult to determine TikTok’s precise intent.
“Biometrics’ range of potential uses is vast: from benign, such as secure access to the app—think about how [Apple’s] iOS uses facial recognition for authentication—to chilling, such as mass re-identification and surveillance,” he says.
The provisions for how TikTok can use the data collected under the privacy policy’s “Image and Audio Information” section are broad.
“We may collect information about the images and audio that are a part of your User Content, such as identifying the objects and scenery that appear, the existence and location within an image of face and body features and attributes, the nature of the audio, and the text of the words spoken in your User Content,” the new section reads. “We may collect this information to enable special video effects, for content moderation, for demographic classification, for content and ad recommendations, and for other non-personally-identifying operations.”
It’s the last use on this list, “other non-personally-identifying operations,” that Cuthbertson says he takes particular issue with.
“It’s disingenuous to say these are ‘non personally-identifying’ operations,” he says, pointing out that a person’s unique ‘faceprint’ or ‘voiceprint’ could inherently be used to identify someone. “That’s not the way the mobile data ecosystem works anymore. You don’t need someone’s social security number to figure out who they are and how to monetize them.”
Users should also take note of the open-ended nature of the uses listed in this section, says Derek Riley, the director of the Milwaukee School of Engineering’s computer science program. “If you want to have funny face filters that engage users, gathering this kind of information is necessary. But there are a lot of other potentially alarming things that can be done with it too,” he tells TIME. “Capturing that information means TikTok could use it within their application, or they could turn and share it with another actor, government or company.”
While TikTok’s privacy policy states that it “does not sell personal information to third parties,” it also says it may share the information it collects for “business purposes.”
“It’s one thing if TikTok can discreetly say, we’re taking this narrow band of information, here’s our description of the information so that you, as a user, really understand what we mean and here’s this very narrow way we’re going to use it,” Cuthbertson says. “Instead we have vague definitions of what the data even is and TikTok itself is vague about how and why they need to use it.”
The fact that TikTok is owned by the Chinese company ByteDance may also play a role in how people view this policy update, Riley says. While President Joe Biden signed an executive order on June 9 revoking former President Donald Trump’s attempts to ban TikTok in the U.S., some still view the app as a potential national security threat. TikTok has said it doesn’t share data with the Chinese government and wouldn’t do so if asked.
TikTok has also previously faced legal action over privacy-related issues. In February, the company agreed to pay $92 million to settle a class-action lawsuit alleging that it violated Illinois’ Biometric Information Privacy Act, the federal Video Privacy Protection Act, and other consumer and privacy protection laws by collecting users’ personal data, including data harvested by facial recognition technology, without consent and sharing it with third parties, some of which were based in China.
Now, the updated policy states that TikTok will seek user permission for this type of data collection “where required by law,” but doesn’t specify whether it’s referring to state law, federal law or both.
While there’s no federal U.S. law regulating the collection and use of biometric data, some states began passing their own laws more than a decade ago. Illinois led the way in 2008, with Texas, Washington, California, New York and Virginia all enacting their own biometric privacy protections in the years since. But it’s this legal gray area that demonstrates the need for more stringent standards, Cuthbertson says.
“Is it state law? Is it federal law? Even if it’s every applicable law, it’s still highly problematic,” he says. “That they will do what’s required by law as defined under the vague term ‘U.S. laws’ really highlights the need for more robust privacy laws and regulations that govern the collection of biometric information.”
Ultimately, maintaining awareness of what you’re consenting to by using the app is crucial, Riley says, especially when it comes to the app’s younger users. “It’s really important for individuals like teachers and parents to be able to inform younger individuals who see TikTok as a fun way to engage with their friends of the implications of this type of data collection,” he says. “It has a web of tangential outcomes that could turn out to be really problematic.”