Chrome has been under fire for its “creepy” new tracking technology for some time, but now we have a more detailed technical analysis of the very serious risks to users and an update from Apple that highlights exactly why Chrome is such a privacy nightmare. Apple users should certainly not be using Chrome on any of their devices.
“Chrome is the only major browser that doesn’t offer meaningful protection from tracking,” rival Mozilla told me this week, as the Firefox developer released its stark technical analysis of Chrome’s new user tracking. Days earlier, Apple had launched a thinly veiled attack on Google, with Safari’s “privacy by design” update exposing its stark differences with Chrome when it comes to the dark art of user fingerprinting.
“The opportunities for fingerprinting have been removed,” Apple said of its Safari update, just as Mozilla warned that Chrome’s latest update poses “significant [fingerprinting] risks.” In March, Google heralded its “privacy first web.” But since then, Chrome has been under attack for data harvesting, tracking and control.
This has been a pivotal year for user privacy. And while Apple’s battle with Facebook over App Tracking Transparency and Privacy Labels has taken center stage, the stark differences between Apple and Google are arguably much more significant.
Forget shiny gadgets and glitzy functionality. You have a deeper decision to make when you select the apps you use, the platforms you subscribe to, the operating systems you adopt. And if there’s one thing to keep in mind, to distill everything down to an easy-to-understand concept, then it’s this: Follow the money.
It’s this principle that should guide you when you download apps onto your phone. If you’re not paying for the app, then it’s generating revenue some other way—put more simply, if you’re not buying the product, then you are the product. Someone is buying you—or your data, to be more precise. It’s this business model which has rocketed Google (and Facebook) into the stratosphere and spawned the vast app industry.
The web is a vast network of interconnected trackers and data brokers, algorithmically measuring and manipulating your behavior. And right at the heart of this is tracking—identifying you as an individual by your digital fingerprint, and then targeting you with ads or manipulative messages based on an AI assessment of how you will respond.
Clearly, if you use a browser provided by an advertising company (Google), then what happens next is unsurprising. More broadly, unless you adjust your privacy settings and select apps and platforms that put privacy first, you can assume that you are being followed (digitally) everywhere you go, that everything you do is being watched.
This concept of “fingerprinting” is the public internet at its worst. It draws data from multiple sources, tracking, correlating, cross-referencing behind the scenes. I know what someone with your characteristics is likely to buy, and so I push that to you, and measure when you transact. All of which inflates the value of my data to an advertiser.
And there are no lines. That’s why when you google something in one room, your partner sees an ad for that same product in their Facebook feed shortly afterward in another room. It’s a great way to manipulate you, to turn you into a buyer. You share an IP address. You’re matched on a social graph. Clearly, you’re fair game.
Let’s be very clear here. This is not okay. When Apple talks about privacy as a fundamental human right, it’s this kind of data grazing that it has in mind.
Apple has been cracking down on cross-site tracking for years with its Intelligent Tracking Prevention. And it has added app tracking into the mix with iOS 14.5. Now it’s going a significant step further with Private Relay, essentially breaking the identity chain so no one can combine your IP address and other identifiers with your browsing history, not even Apple. This is a move to stop data brokers fingerprinting its users.
Apple’s new Private Relay prevents ISPs and WiFi operators gathering DNS queries, which, it says, “can be used to fingerprint a user and build a history of their activity over time.” It also stops web servers “determining user location… fingerprinting user identity and recognizing users across different websites.” Such tracking still takes place “even when tools like Intelligent Tracking Prevention in Safari” are enabled.
iCloud+ Private Relay
Private Relay has been described as a VPN, but that’s not accurate. A VPN creates a secure tunnel, masking your IP address from the access point or service provider and the web servers you visit. You are essentially surfing on a private network, but in doing so, you explicitly trust the VPN provider to safeguard your security and privacy. While you mask your data and activity from the public internet, you provide it all to the VPN provider and trust their policies and technology.
If you do use a VPN, your traffic will bypass Private Relay. And you will still need a VPN for geographic masking, and there are certain countries where Private Relay will not operate in any case—most notably and unsurprisingly, China. And, as I have warned multiple times before, make sure you use a trusted, premium VPN. You must not install free VPNs from shadowy developers—they’re actually dangerous.
VPNs are a prerequisite when you’re accessing the internet from public WiFi hotspots, hotels, restaurants and other public places. They mask your identity and encrypt your traffic. Because you’re tunneling your web activity, you can also misrepresent your location to the web servers you access, which is why they have real value in certain geographies or when you want to stream domestic broadcasts while overseas.
Private Relay has a different purpose—to break the identity chain in Safari, so that no-one, not even Apple as the quasi-VPN provider, can see everything. “It is critical to note, that no one in this chain—not even Apple—can see both the client IP address and what the user is accessing. The opportunities for fingerprinting have been removed.”
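The split-knowledge idea behind that quote can be sketched in a few lines of code. This is a toy conceptual model, not Apple’s implementation: the class names, the string-based “encryption” and the example addresses are all illustrative. The point it demonstrates is the division of knowledge — the first hop sees the client IP but only an encrypted destination, while the second hop sees the destination but never the client IP.

```python
def decrypt(blob: str) -> str:
    # Stand-in for the real per-hop encryption; here we just strip a prefix.
    return blob.removeprefix("enc:")

class EgressRelay:
    """Second hop (a third party): sees the destination, never the client IP."""
    def forward(self, encrypted_destination: str) -> str:
        destination = decrypt(encrypted_destination)
        # The client IP was never passed along, so this hop cannot
        # pair the destination with a real user identity.
        return f"GET {destination} from egress relay"

class IngressRelay:
    """First hop (Apple): sees the client IP, but only an encrypted destination."""
    def forward(self, client_ip: str, encrypted_destination: str,
                egress: EgressRelay) -> str:
        # client_ip is deliberately NOT forwarded to the next hop.
        return egress.forward(encrypted_destination)

# Neither hop can pair "203.0.113.7" with "example.com".
result = IngressRelay().forward("203.0.113.7", "enc:example.com", EgressRelay())
```

No single party in the chain holds both halves of the identity, which is exactly the property a conventional single-provider VPN lacks.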
And so, back to Chrome. The contrast between Apple’s and Google’s approach to your privacy is showcased by the ways their browsers behave. Google has promised an end to dreaded tracking cookies as it moves to what it calls “a more privacy-first web.” Google describes advertising as the “economic foundation” of the internet but warns that users are now so fearful of being tracked that “if digital advertising doesn’t evolve to address the growing concerns… we risk the future of the free and open web.”
This is why, Google says, it is banishing third-party cookies from Chrome and why “once third-party cookies are phased out, we will not build alternate identifiers to track individuals as they browse across the web, nor will we use them in our products.” But Google’s business model relies on targeted ads. It needs a compromise, “innovations that protect anonymity while still delivering results for advertisers and publishers.”
You’ll likely have read about Federated Learning of Cohorts—FLoC—Google’s proposed compromise. This is a pitch for the best of both worlds: to categorize people enough to target them with ads, but not to actually identify them.
Last month, we reported on new research from Tommy Mysk and Talal Haj Bakry that highlighted tracking weaknesses in Chrome. Now the researchers, who previously exposed Apple’s clipboard vulnerability and Facebook’s link harvesting, have turned their attention to FLoC, warning that FLoC IDs “still don’t eliminate tracking.”
FLoC, which is now undergoing a secretive trial on tens of millions of Chrome browsers, has been lambasted by the privacy lobby since the start. A user is assigned to a cohort inside their Chrome browser, based on their prior week’s browsing, and Google says it doesn’t link those cohorts to other identifiers it might have.
But the advertising and tracking industry has made no such promises. FLoC might work in isolated, laboratory conditions, but the real world doesn’t work that way. Clearly, once a user’s cohort is presented to a website and its underlying trackers, other factors can be added into the mix. IP addresses, browser specifics and known site identities all raise the specter of fingerprinting. The risk with FLoC, the privacy lobby says, is that it might actually make the situation for users significantly worse.
It’s not just Mozilla warning of “significant [FLoC] risks.” Vivaldi warns “Google’s new data harvesting venture is nasty… a dangerous step that harms user privacy,” while Brave dismisses FLoC’s privacy claims as “beyond the point of being taken seriously.”
“The FLoC algorithm assumes that a user’s browsing behavior changes every seven days,” Mysk and Haj Bakry explain. “But this implies that users having a consistent browsing behavior will keep getting the same FLoC ID every time… Clearing the entire browsing history will cause Chrome to recompute the FLoC ID. So even doing so, a user with a consistent browsing behavior will eventually get the same ID. This is to be contrasted with clearing cookies where new identifiers will be created.”
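The researchers’ point is easy to see once you notice that a FLoC ID is a pure function of browsing behavior, while a cookie is just a random token. Here is a deliberately simplified sketch: the hash function and the cohort count are stand-ins (Chrome’s actual trial used a SimHash-based scheme), but the deterministic-versus-random contrast is the real mechanism.

```python
import hashlib
import secrets

def floc_cohort_id(visited_domains: set, num_cohorts: int = 8000) -> int:
    # Toy stand-in for Chrome's cohort assignment: the ID is a pure
    # function of browsing behavior, so identical behavior always
    # reproduces the identical ID. (num_cohorts is illustrative.)
    digest = hashlib.sha256(",".join(sorted(visited_domains)).encode()).digest()
    return int.from_bytes(digest[:4], "big") % num_cohorts

habits = {"news.example", "shop.example", "mail.example"}

id_before = floc_cohort_id(habits)
# "Clear browsing history" — but the user's habits don't change, so the
# recomputed ID converges straight back to the same value.
id_after_clearing = floc_cohort_id(habits)
assert id_before == id_after_clearing

# A cookie, by contrast, is a random token: clearing it yields a
# genuinely new, unlinked identifier.
cookie_before = secrets.token_hex(8)
cookie_after = secrets.token_hex(8)  # fresh value after clearing
```

Clearing cookies breaks the link to your past self; clearing history merely delays the moment your stable habits regenerate the same FLoC ID.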
Mozilla highlighted Chrome’s fingerprinting risk in a new technical analysis published this week, warning that “FLoC still allows for significant linkability of user behavior… it may be possible to identify individual users by using both FLoC IDs and relatively weak fingerprinting vectors… FLoC has the potential to significantly increase the power of cross-site tracking.”
I asked Mozilla about fingerprinting with FLoC. “Imagine you have a fingerprinting technique which divides people up into about 8,000 groups,” I was told, “this isn’t enough to identify people individually, but if it’s combined with FLoC using cohort sizes of about 10,000, then the number of people in each fingerprinting group/FLoC cohort pair is going to be very small, potentially as small as one.”
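Mozilla’s back-of-envelope math is worth making explicit. Assuming (as Mozilla’s example does) that the fingerprinting buckets are roughly independent of the browsing cohorts, the expected number of people sharing both identifiers is simply the cohort size divided by the number of fingerprint buckets — all figures here are the illustrative ones from the quote, not measurements:

```python
# Mozilla's scenario: a weak fingerprint that only sorts users into
# ~8,000 buckets, combined with a FLoC cohort of ~10,000 people.
fingerprint_buckets = 8_000   # coarse fingerprinting technique
cohort_size = 10_000          # people sharing one FLoC cohort ID

# Expected number of people sharing BOTH the fingerprint bucket and
# the cohort ID, assuming the two signals are roughly independent:
expected_overlap = cohort_size / fingerprint_buckets
print(expected_overlap)  # 1.25 — often a single individual
```

Neither signal identifies you on its own; their intersection very nearly does, which is the whole fingerprinting objection in one line of arithmetic.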
For its part, Google says that “advertisers don’t need to track individual consumers across the web to get the performance benefits of digital advertising,” assuring that it can “hide individuals within large crowds of people with common interests.” That may be true. But advertisers will continue to fingerprint users, and they will also risk tripping over sensitive information, despite Google’s best efforts to prevent this. “A particular cohort,” EFF warns, “may over-represent users who are young, female, and Black; another cohort, middle-aged Republican voters; a third, LGBTQ+ youth.”
Google assures that it doesn’t cheat the FLoC system by linking cohort IDs with other identifiers it stores, and that nothing is sent to Google’s servers as part of the cohort assignment process. But Google isn’t the only issue here: FLoC can and will be abused by the avaricious data brokers and trackers sitting behind the public internet.
And if you’re not sure whether this is a genuine risk, consider that the advertising industry has essentially admitted as much: FLoC IDs become just another “signal” it can use to resolve identities and firm up profiles on target users.
As Digiday confirms, “advertising companies are already strategically gathering FLoC IDs and linking them to identifiable data or analyzing them in an attempt to uncover information about people that may not have been known before, mimicking how they have parsed what third-party cookies told them about people’s behaviors.”
“FLoC’s approach,” Brave told me last month, “increases the risk of users being identified across sites, profiles and browsing sessions, by adding significant fingerprinting surface.” And there’s an additional risk, that users will offer up more and not less information about themselves. “FLoC exposes additional, new information about you (identified as you specifically) to sites who already know who you are.”
The researchers say that they have found “more SQL statements for storing data used during computing the FLoC ID” in the latest iOS version of Chrome, such as “SELECT visit_id, floc_protected_score, categories, page_topics_model_version, annotation_flags FROM content_annotations WHERE visit_id=?”
The case against FLoC was already compelling and gaining serious traction. Now Apple has thrown a seriously heavyweight fox into an already vulnerable chicken coop. Apple has shown how fingerprinting can be prevented. And, in doing so, has also shown how far off the mark Google’s plans for Chrome are in their current guise.
Just as Apple has made its move to eradicate fingerprinting, Mozilla told me that “we want to see all browsers protect their users from cross-site tracking—our plan in the Firefox browser is to continue to ratchet up the privacy protection, with the goal of eliminating cross-site tracking from the browser entirely.”
If Google really wants a privacy-first web, then it needs to acknowledge these issues for itself: its balance is far from right. There needs to be a rethink that errs on the side of privacy and not data-driven advertising. Meanwhile, all users—not just Apple’s—need to keep that “follow the money” mantra in mind. You should stop using Chrome now, especially while the secretive FLoC trial is taking place. And you should only go back if/when the cookie replacement technology poses less of a tracking risk.