Google Messages Warning For All Users As Radical Update Launches

Google’s warning is very clear—make sure you read it and check your settings before using this new update…

Gemini has now started to hit Google Messages. And so we return to a warning first floated some months ago, but which now impacts millions of users. There is a security disconnect between AI chatbots and end-to-end encrypted messaging that users need to understand and take seriously. Because those blurred lines are dangerous and little understood.

“Am I alone in my concern and bewilderment that Gemini in Google Messages is not end-to-end encrypted?” one user posted on Reddit. “I can’t bring myself to enable it because of that.”

“Google provides both the communication service and Gemini, so by definition it literally cannot be end to end encrypted,” came one reply. “End to end encryption means the communication provider (or anyone other than users) can’t read user messages.” Another poster asked, “Why would you ever expect this to be encrypted? Google clearly wants to use the data for training future models.”

Both these points get to the heart of the issue. You can’t end-to-end encrypt queries into a cloud-based AI chatbot, because by definition the service provider needs access to the content. And while whether that content is used to further train models is a matter of policy and user controls, it’s clearly a risk users need to weigh.

Google has been clear on this issue—its support site highlights both points as Gemini in Messages goes live.

“Important,” Google says, even bolding the text. “Chats with Gemini aren’t end-to-end encrypted.” And then further down the page is a link to the Gemini Apps Privacy Hub, which says “Google collects your Gemini Apps conversations, related product usage information, info about your location, and your feedback. Google uses this data, consistent with our Privacy Policy, to provide, improve, and develop Google products and services and machine learning technologies.”

This warning applies to all cloud-based AI from Google and everyone else. The privacy-centric alternative is on-device AI, which Google provides via Gemini Nano. But that is substantially lower-powered than cloud-based alternatives—at least for now. And not to state the obvious, but the AI models need their training data from somewhere—and, put simply, you’re it.

While those warnings apply to every app and service onboarding AI, there’s a specific disconnect with secure messaging. In recent years we have been conditioned to expect end-to-end encryption (with obvious exceptions), and so we chat privately as if no one is listening. But here, unless you dig into the settings, someone might be.

“To help with quality and improve our products (such as generative machine-learning models that power Gemini Apps),” Google says, “human reviewers read, annotate, and process your Gemini Apps conversations.”

And while Google assures that “we take steps to protect your privacy as part of this process. This includes disconnecting your conversations with Gemini Apps from your Google Account before reviewers see or annotate them,” it also warns—again in bold: “Please don’t enter confidential information in your conversations or any data you wouldn’t want a reviewer to see or Google to use to improve our products, services, and machine-learning technologies.”

The real warning here is much broader than Google Messages or even Gemini more widely. We treat our AI chatbots like trusted friends, not multi-billion-dollar data centers with human reviewers and long-term storage. We want to chat about business plans, relationships, health and finances, but we really shouldn’t. And in the encrypted messaging world, this means managing private and non-private chats side by side.

Gemini in Messages is currently available only to adult beta testers, and only in English—except in Canada, where French is also available. But you can expect it to deploy more widely across the platform in the coming months.

When it does, take a moment to check your privacy settings and ensure you know what you’re doing.
