🚀Google Launches Search Live—Voice AI Chats in Real Time

PLUS: Gemini 2.5 Pro & Flash-Lite Go Stable, Meta Previews Oakley & Prada AI Glasses... More!

Sponsored by

Looking for unbiased, fact-based news? Join 1440 today.

Join over 4 million Americans who start their day with 1440 – your daily digest for unbiased, fact-centric news. From politics to sports, we cover it all by analyzing over 100 sources. Our concise, 5-minute read lands in your inbox each morning at no cost. Experience news without the noise; let 1440 help you make up your own mind. Sign up now and invite your friends and family to be part of the informed.

Good morning! Today is Wednesday, June 18, 2025.

We have some exciting AI news today: Google is rolling out Search Live, a voice-based conversational AI in Search, and deepening its Gemini 2.5 rollout with new stable and lightweight variants.

1. Google Launches Search Live: Talk to AI Like a Friend While Browsing the Web

Google just rolled out “Search Live” for Android and iOS users in its AI Mode experiment, letting you talk to Search in a natural voice conversation. You can ask follow-up questions, get audio responses, and explore related web links all at once — perfect for multitasking or searching on the go. Powered by a custom Gemini model, Search Live brings AI voice interaction straight into your pocket, making it easier than ever to ask, listen, and learn in real time.

2. Ultrahuman Launches AI Blood Test That Predicts Cancer and Fatigue Risks for $800 a Year

Health tech startup Ultrahuman has unveiled Blood Vision, an advanced AI-powered blood test that analyzes over 120 biomarkers to assess risks for cancer, fatigue, glucose imbalances, and more. Users provide 8 to 10 vials of blood at a lab, and results are delivered through the Ultrahuman app and interpreted by a clinician with AI support. Priced at $800 annually, the test offers a detailed snapshot of longevity and wellness, though it also raises privacy concerns around storing sensitive health data on a startup's platform. The launch begins July 15 in the US.

3. Hexagon Unveils AEON, a Humanoid Robot Built with NVIDIA to Tackle Labor Gaps and Build Digital Twins

Hexagon has revealed AEON, a humanoid robot developed in partnership with NVIDIA to help fill critical labor shortages across industries like manufacturing, logistics, and aerospace. AEON can perform advanced tasks such as scanning environments for digital twin creation, inspecting parts, and manipulating objects in complex settings. Trained using NVIDIA’s simulation-first AI platforms and powered by onboard Jetson Orin computers, AEON learns skills quickly and operates autonomously in real-world environments. The robot is set to roll out in factories and warehouses, potentially transforming how companies build and interact with digital replicas of the physical world.

4. Google Launches Stable Gemini 2.5 Pro and Cheaper Flash-Lite for Developers

Google has officially released the stable version of its Gemini 2.5 Pro AI model, giving developers the green light to build on it after months of testing. Alongside it, Google introduced Gemini 2.5 Flash-Lite, a new ultra-efficient model designed for high-volume workloads at a fraction of the cost. Flash-Lite is now in preview and offers major savings for developers working with text, images, and video. While these updates won’t change much for everyday users, they show Google’s push to compete with OpenAI by offering more flexible and affordable AI options for different use cases.

5. Meta and Oakley Team Up to Launch Sporty Smart Glasses This Week

Meta is set to launch a new pair of smart glasses in collaboration with Oakley on June 20, expanding its wearable tech lineup beyond the Ray-Ban Meta line. Teased in a cryptic video featuring both brands’ logos, the new glasses are expected to be a sportier, performance-focused version aimed at athletes and outdoor enthusiasts. Since both Oakley and Ray-Ban are owned by Luxottica, Meta isn’t switching partners but rather diversifying its smart eyewear strategy. Get ready for AI-powered lenses designed for action.

6. MIT Study Finds AI Can Learn to Think More Like Humans

MIT researchers have discovered that AI agents can make better decisions when they learn from human reasoning. In a study, most humans were willing to pay a penny over budget to buy flour for a friend’s birthday cake, but AI models rigidly refused. The research suggests that exposing AI to human decision-making can help it make smarter, more flexible choices in real-world situations. While fully autonomous AI is still far off, this work could pave the way for AI that better understands context and makes decisions more like we do.

Zephyr HQ: Save Time & Stack Cash with AI

Zephyr HQ puts all the AI guides, prompts, and automations you need in one place. Sign up, pick a plan, and instantly unlock workflows that run for you—cutting hours off your to-do list and boosting your income. New blueprints land weekly, so you stay ahead and profit faster.

How would you rate today's newsletter?

Vote below to help us improve the newsletter for you.


Stay tuned for more updates, and have a fantastic day!

Cheers,
Zephyr