TikTok is facing a fresh European Union privacy investigation into user data sent to China, Ireland’s Data Protection Commission said Thursday.
The Data Protection Commission opened the inquiry as a follow-up to a previous investigation that ended earlier this year with a 530 million euro ($620 million) fine, after it found the video-sharing app put users at risk of spying by allowing staff in China remote access to their data.
The Irish national watchdog serves as TikTok’s lead data privacy regulator in the 27-nation EU because the company’s European headquarters is based in Dublin.
During an earlier investigation, TikTok initially told the regulator it didn’t store European user data in China, and that data was only accessed remotely by staff in China. However, it later backtracked and said that some data had in fact been stored on Chinese servers. The watchdog responded at the time by saying it would consider further regulatory action.
“As a result of that consideration, the DPC has now decided to open this new inquiry into TikTok,” the watchdog said.
“The purpose of the inquiry is to determine whether TikTok has complied with its relevant obligations under the GDPR in the context of the transfers now at issue, including the lawfulness of the transfers,” the regulator said, referring to the European Union’s strict privacy rules, known as the General Data Protection Regulation.
TikTok, which is owned by China’s ByteDance, has been under scrutiny in Europe over how it handles personal user information amid concerns from Western officials that it poses a security risk.
TikTok noted that it was the one that notified the Data Protection Commission, after it discovered the issue during a data localization project called Project Clover, which involved building three data centers in Europe to ease security concerns.
“Our teams proactively discovered this issue through the comprehensive monitoring TikTok implemented under Project Clover,” the company said in a statement. “We promptly deleted this minimal amount of data from the servers and informed the DPC. Our proactive report to the DPC underscores our commitment to transparency and data security.”
Under the GDPR, European user data can be transferred outside of the bloc only if safeguards are in place to ensure the same level of protection. Only 15 countries or territories are deemed to have data privacy standards equivalent to the EU’s, and China is not one of them.
For many years, data brokers have existed in the shadows, exploiting gaps in privacy laws to harvest our information—all for their own profit. They sell our precise movements without our knowledge or meaningful consent to a variety of private and state actors, including law enforcement agencies. And they show no sign of stopping.
This incentivizes other bad actors. If companies collect any kind of personal data and want to make a quick buck, there’s a data broker willing to buy it and sell it to the highest bidder, often law enforcement and intelligence agencies.
One recent investigation by 404 Media revealed that the Airlines Reporting Corporation (ARC), a data broker owned and operated by at least eight major U.S. airlines, including United Airlines and American Airlines, collected travelers’ domestic flight records and secretly sold access to U.S. Customs and Border Protection (CBP). Despite selling passengers’ names, full flight itineraries, and financial details, the data broker barred CBP from revealing it as the source of the information. So not only is the government doing an end run around the Fourth Amendment to get information it would otherwise need a warrant for; it has also been trying to hide how it knows these things about us.
ARC’s Travel Intelligence Program (TIP) aggregates passenger data and contains more than one billion records spanning 39 months of past and future travel by both U.S. and non-U.S. citizens. CBP, which sits within the U.S. Department of Homeland Security (DHS), claims it needs this data to support local and state police keeping track of people of interest. But at a time of growing concerns about increased immigration enforcement at U.S. ports of entry, including unjustified searches, law enforcement officials will use this additional surveillance tool to expand the web of suspicion to even larger numbers of innocent travelers.
More than 200 airlines settle tickets through ARC, giving it information on more than 54% of flights taken globally. ARC’s board of directors includes representatives from U.S. airlines like JetBlue and Delta, as well as international airlines like Lufthansa, Air France, and Air Canada.
In selling law enforcement agencies bulk access to such sensitive information, these airlines—through their data broker—are putting their own profits over travelers' privacy. U.S. Immigration and Customs Enforcement (ICE) recently detailed its own purchase of personal data from ARC. In the current climate, this can have a detrimental impact on people’s lives.
Apple will no longer provide its highest security offering in Britain in the wake of a government order to let security officials see protected data.
Our first network security analysis of the popular Chinese social media platform RedNote revealed numerous issues with the Android and iOS versions of the app. Most notably, we found that both the Android and iOS versions of RedNote fetch viewed images and videos without any encryption, which enables network eavesdroppers to learn exactly what content users are browsing. We also found a vulnerability in the Android version that enables network attackers to learn the contents of files on users’ devices. We disclosed these issues to RedNote and its vendors NEXTDATA and MobTech, but did not receive a response from any party. This report underscores the importance of using well-supported encryption implementations, such as transport layer security (TLS). We recommend that users who are highly concerned about network surveillance from any party refrain from using RedNote until these security issues are resolved.
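The plaintext-fetch problem described above can be made concrete with a short sketch. This is illustrative only, not RedNote’s actual code: the helper names and URL are hypothetical. Any media fetched over plain `http://` travels unencrypted and is readable by an on-path observer, and the baseline mitigation is to require TLS (`https://`) before fetching.

```python
from urllib.parse import urlparse, urlunparse

def is_eavesdropper_readable(url: str) -> bool:
    # Plain-http fetches travel unencrypted, so any on-path observer
    # (Wi-Fi operator, ISP, state actor) can read the fetched content.
    return urlparse(url).scheme != "https"

def require_tls(url: str) -> str:
    # Illustrative mitigation: upgrade http URLs to https before fetching.
    # This only helps if the server actually serves the resource over TLS.
    parts = urlparse(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunparse(parts)

# Hypothetical media URL of the kind the report describes:
video = "http://cdn.example.com/feed/video.mp4"
assert is_eavesdropper_readable(video)
assert not is_eavesdropper_readable(require_tls(video))
```

In practice, well-supported TLS stacks (the platform’s default HTTPS client with certificate verification enabled) do the heavy lifting; the point of the sketch is simply that the scheme of every media fetch decides whether an eavesdropper can see what a user is viewing.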
The iPhone Mirroring feature in macOS Sequoia and iOS 18 may expose employees’ private applications to corporate IT environments.
To attract users across the Global Majority, many technology companies have introduced “lite” versions of their products: Applications that are designed for lower-bandwidth contexts. TikTok is no exception, with TikTok Lite estimated to have more than 1 billion users.
Mozilla and AI Forensics research reveals that TikTok Lite doesn’t just reduce required bandwidth, however. In our opinion, it also reduces trust and safety. In comparing TikTok Lite with the classic TikTok app, we found several discrepancies in trust and safety features that could have potentially dangerous consequences in the context of elections and public health.
Our research revealed that TikTok Lite lacks basic protections afforded to other TikTok users, including labels for graphic content, AI-generated content, misinformation, and videos depicting dangerous acts. TikTok Lite users also encounter arbitrarily shortened video descriptions that can easily eliminate crucial context.
Further, TikTok Lite users have fewer proactive controls at their disposal. Unlike traditional TikTok users, they cannot filter offensive keywords or implement screen management practices.
Our findings are concerning, and they reinforce a pattern of double standards. Technology platforms have a history of neglecting users outside of the US and EU, where there is markedly less potential for constraining regulation and enforcement. As part of our research, we discuss the implications of this pattern and also offer concrete recommendations for how TikTok Lite can improve.
Secure and private AI processing in the cloud poses a formidable new challenge. To support advanced features of Apple Intelligence with larger foundation models, we created Private Cloud Compute (PCC), a groundbreaking cloud intelligence system designed specifically for private AI processing. Built with custom Apple silicon and a hardened operating system, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale.
Apple and the satellite-based broadband service Starlink each recently took steps to address new research into the potential security and privacy implications of how their services geo-locate devices. Researchers from the University of Maryland say they relied on publicly available…
Academics have suggested that Apple's Wi-Fi Positioning System (WPS) can be abused to create a global privacy nightmare.
In a paper titled "Surveilling the Masses with Wi-Fi-Based Positioning Systems," Erik Rye, a PhD student at the University of Maryland (UMD) in the US, and Dave Levin, associate professor at UMD, describe how the design of Apple's WPS facilitates mass surveillance, even of those not using Apple devices.