Google offers bargain: Sell your soul to Gemini, and it'll give you smarter answers

Google on Wednesday began inviting Gemini users to let its chatbot read their Gmail, Photos, Search history, and YouTube data in exchange for possibly more personalized responses.

Josh Woodward, VP of Google Labs, Gemini and AI Studio, announced the beta availability of Personal Intelligence in the US. Access will roll out over the next week to US-based Google AI Pro and AI Ultra subscribers.

The use of the term "Intelligence" is more aspirational than accurate. Machine learning models are not intelligent; they predict tokens based on training data and runtime resources. 

Perhaps "Personalized Predictions" would be insufficiently appealing and "Personalized Artificial Intelligence" would draw too much attention to the mechanized nature of chatbots that now attracts active opposition. Whatever the case, access to personal data comes with the potential for more personally relevant AI assistance.

Woodward explains that Personal Intelligence can feed information from Google apps like Gmail, Photos, Search, and YouTube to the company's Gemini model, which may help the model respond to queries using personal or app-specific data from those applications.

"Personal Intelligence has two core strengths: reasoning across complex sources and retrieving specific details from, say, an email or photo to answer your question," said Woodward. "It often combines these, working across text, photos and video to provide uniquely tailored answers."

As an example, Woodward recounted a recent trip to buy tires. Standing in line for service, he didn't know his tire size and needed his license plate number to look it up. So he asked Gemini, and the model fetched the number by scanning his photo library, finding an image of his car, and converting the pictured license plate to text.

Whether that scenario is better than recalling one's plate number from memory, searching for it on phone-accessible messages, or glancing at the actual plate in the parking lot depends on whether one sees the mind as a use-it-or-lose-it resource. Every automation is an abdication of autonomy.

To Google's credit, Personal Intelligence is off by default and must be enabled per app. If Personal Intelligence is anything like AI Overviews or Gemini in Google Workspace apps, expect notifications, popups, hints, nudges, and recommendations during app interactions as a way to encourage adoption.

Woodward argues that what differentiates Google's approach from rival AI agents is that user data "already lives at Google securely." There's no privacy intrusion when the call is coming from inside the house.

Gemini, he said, will attempt to cite the source of output based on personalization, so recommendations can be verified or corrected. And there are "guardrails" that try to keep sensitive information (e.g. health data) out of Gemini conversations, to avoid responses like "I've cancelled your appointments next year based on your prognosis in Gmail."

It's ancient history now, but in 2012 Google's change to its privacy policy to share data across its different services was controversial. The current trend is to encourage customer complicity in data sharing.

Woodward insists Google's aim is to provide a better Gemini experience while keeping personal data secure and under the user's control.

"Built with privacy in mind, Gemini doesn't train directly on your Gmail inbox or Google Photos library," he said. "We train on limited info, like specific prompts in Gemini and the model's responses, to improve functionality over time."

Pointing to his anecdote about his vehicle, he said that Google would not use the photos of the relevant road trip, the license plate in those photos, or his Gmail messages for model training. But the prompts and responses, filtered to remove personal information, would get fed back to the model as training data.

"In short, we don't train our systems to learn your license plate number; we train them to understand that when you ask for one, we can locate it," he said.

Google's Gemini Apps Privacy Hub page offers a more comprehensive view of how Google uses the information made available to its AI model.

The company says that human reviewers (including trained reviewers from partner service providers) review some of the data that it collects for purposes like improving and maintaining services, customization, measurement, and safety. "Please don't enter confidential information that you wouldn't want a reviewer to see or Google to use to improve our services, including machine-learning technologies," it warns.

The Personalization with Connected Apps page offers a similar caution.

Google's support boilerplate also states that Gemini models may provide inaccurate or offensive responses that do not reflect Google's views.

"Don't rely on responses from Gemini Apps as medical, legal, financial, or other professional advice," Google's documentation says.

But for anything less consequential, maybe Personal Intelligence will help. ®

Source: The Register