Someone is under a kitchen sink. Their phone is propped on the cabinet door, camera on. They are not typing. They are talking. Google Search sees the shutoff valve, identifies the fitting, and walks them through the repair in real time.
This is not a demo. It is available right now, to anyone with the Google app, in more than 200 countries.
This week, Google expanded Search Live globally. That means real-time voice and live camera search, powered by Gemini 3.1 Flash Live, available across every language and location where AI Mode runs. You point your phone at something broken, something unfamiliar, something you don't have words for, and Search responds to what it actually sees. Not a photo. Not a description. A continuous live feed of whatever is in front of you.
That is a meaningful technical distinction. A photo captures a moment. A live feed captures a situation. For the first time, search can watch something happen and respond to it as it does.
The instinct will be to file this under "better voice search" and move on. That instinct is wrong.
Search Live is not an improvement to the existing internet. It is a preview of a different one — an internet that does not require a screen, does not wait to be opened, and does not need a human to translate their situation into a query. An internet that can see.
The phone under the sink is the version visible today. It still has a screen. A human is still holding it. But the logic of what it's doing (an always-available, visually-aware agent that perceives your environment and responds to it in real time) does not stay in that form. It is already migrating into every connected surface around us.
The next generation of home appliances will not wait for their owners to notice a problem. A refrigerator compressor drawing unusual current. An HVAC unit cycling inefficiently. A water heater showing early failure patterns. These systems will see themselves the way Search Live sees the pipe, continuously and in real time, and act accordingly.
Not by turning on a light and waiting. By initiating: "Your compressor is showing early failure patterns. I found three certified technicians available this week. Should I book the earliest slot?"
The owner says yes. The appointment is made. No search query. No browser. No call to a 1-800 number. The agent did the searching, the comparing, the selecting.
This is not the distant future. The sensors exist. The connectivity exists. The agent layer that coordinates perception, decision, and action is what's being built right now. Search Live is part of that build. And that's why we've suggested AI might become your best salesperson, if you teach it right. If you don't, it'll learn from someone else and push customers there.
Modern vehicles have been diagnosing themselves for years. GM's OnStar, Tesla's remote diagnostics, FordPass: these telematics systems already pull real-time data from the vehicle and transmit it. The car knows what's wrong. It has known for a while.
Right now, that data goes to the manufacturer.
The question is not whether a car can sense a failing brake pad, flag an oil pressure anomaly, or detect that a transmission is running outside normal parameters. It already can. The question is why, in 2025, that information travels to a corporate server in Detroit instead of to the shop two miles away that could actually fix it.
The infrastructure exists. The diagnostic capability exists. The connectivity exists. What doesn't exist yet is the agent layer that takes the vehicle's self-knowledge and routes it to a service decision, one that checks the owner's calendar, identifies the most reputable shop nearby, confirms availability, and asks a single question: "Should I book it?"
That layer is what's being built right now. Search Live is part of the same build. An AI that can see a leaky pipe in real time and talk a homeowner through the repair is the same AI, pointed at a different problem — one where the camera is already inside the machine, the data is already flowing, and the only missing piece is an agent with the autonomy to act on it.
The car driving itself to the shop is not science fiction. It is telematics plus agency. Both halves already exist. They just haven't been connected yet.
Every service business built in the last thirty years was built on the same assumption: the customer initiates. They have a problem, they search, they call, they walk in. The business waits to be found.
That assumption breaks when the problem finds the business itself.
When the appliance schedules its own technician. When the car drives itself to the shop. When Search Live, having walked someone as far as it can under that sink, selects a plumber to recommend and the homeowner just says yes — there is no search results page. No comparison shopping. No moment where a blue link gets clicked.
There is only the business the agent selected, and the businesses that were not in the conversation.
Getting into that conversation requires something different from what most businesses have built. Not a website that describes services. A documented, verifiable, machine-readable record of actual work, actual customers, actual results: the kind of signal that AI systems can find, read, and trust when they are making decisions on someone else's behalf.
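One existing convention for that kind of machine-readable record is schema.org structured data, which automated systems already parse. A minimal sketch, using the real schema.org `Plumber`, `AggregateRating`, and `Review` types with placeholder values, might look like this:

```python
import json

# A sketch of a machine-readable business record in schema.org
# JSON-LD. The vocabulary (@type, aggregateRating, review) is real
# schema.org; the business, ratings, and review text are placeholders.
record = {
    "@context": "https://schema.org",
    "@type": "Plumber",
    "name": "Example Plumbing Co.",
    "areaServed": "Springfield",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.9",
        "reviewCount": "212",
    },
    "review": [{
        "@type": "Review",
        "reviewBody": "Replaced the shutoff valve under our "
                      "kitchen sink the same day.",
        "reviewRating": {"@type": "Rating", "ratingValue": "5"},
    }],
}

print(json.dumps(record, indent=2))
```

Whether agents will weigh exactly these fields is an open question; the principle is not. A record like this is legible to software in a way a marketing page is not.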
We built the internet on screens because screens were how we interacted with it. But the internet was never really about the screen. It was always about information, connection, and access. Those things do not require glass and pixels to exist.
Search Live points a camera at a pipe and talks back. That is the smallest version of what this becomes. The version where the camera is in the appliance, in the vehicle, in the walls of the building — where the agent is not waiting to be opened but is simply on, watching, ready. That version is not a prediction. It is an extrapolation from what already exists.
The rules of traditional SEO were written for a human doing the searching. The rules being written now assume the opposite: an agent searching on a human's behalf, in a world where the query was never typed, the browser was never opened, and the decision was made before the owner looked up from what they were doing.
Google Search Live gave AI search real-time eyes and ears. The rest of the body is being built.