Showing posts with label Google Search. Show all posts

Google’s Circle to Search Gets a Major Upgrade with 'Scroll and Translate'.

Scroll and Translate Feature in Gemini

Google is expanding the capabilities of its popular Circle to Search feature on Android with a significant update to its translation function. This new enhancement, dubbed "Scroll and Translate," brings a seamless, continuous real-time translation experience that eliminates the need to restart the process every time you scroll.

What is Circle to Search?

Circle to Search is an AI-powered feature that allows you to quickly search for anything on your phone's screen without switching apps. You can circle, highlight, scribble, or tap on text or images to get instant search results by simply long-pressing the home button or navigation bar. The feature integrates with Google Lens and provides a powerful, intuitive way to get information on the go.

The New "Scroll and Translate" Feature.

Previously, using Circle to Search to translate text was a one-off action. If you were scrolling through a social media feed, a long document, or a webpage and wanted to translate content, you would have to trigger the translation for each new section of the screen that appeared. This could be frustrating, especially when reading through long articles or feeds.

The "Scroll and Translate" update changes this entirely. The new feature creates a continuous translation experience. Once activated, the translation will keep working as you scroll down the page or even switch to a different app. This is a game-changer for users who frequently consume content in different languages, as it provides a fluid and uninterrupted reading experience.

How to Use Scroll and Translate in Circle to Search.

Using the new feature is straightforward:

  1. Activate Circle to Search: Long-press the home button or navigation bar on your compatible Android device.
  2. Tap the Translate Icon: Look for the "Translate" icon that appears on the Circle to Search overlay.
  3. Enable Continuous Translation: Tap the "Scroll and translate" option.

Once enabled, the feature will automatically translate the text on your screen as you continue to scroll, making it easier than ever to read foreign-language content.

Availability.

The "Scroll and Translate" feature is a significant improvement for Google's ecosystem and is a great example of how AI can be used to make everyday tasks easier. The update is beginning to roll out this week on select Samsung Galaxy devices and will likely expand to other Android devices, including the Pixel lineup, in the coming weeks.

This update not only improves an existing tool but also showcases Google’s commitment to using AI to break down language barriers and enhance the user experience.

Also Read: Google Translate Introduces AI-Powered Live Translation and Language Learning.

Reddit Aims to Become a Full-Fledged Search Engine as Q2 Profits Soar.

Reddit Logo
Key Takeaway.
  • Reddit plans to evolve into a full-fledged search engine powered by AI and real user discussions.
  • The platform reported its most profitable quarter ever, with a 78% year-over-year revenue increase.

Reddit is making a bold move toward becoming more than just a social platform; it now wants to be your go-to search engine. In its Q2 2025 shareholder letter, Reddit CEO Steve Huffman announced that the company is “concentrating its resources” to turn Reddit into a serious search competitor. With more than 70 million weekly active users engaging with its built-in search tools, Reddit is confident in the unique value it offers: real human answers from real community conversations.

One of the biggest drivers of this vision is Reddit Answers, an AI-powered search feature that delivers responses based on Reddit’s vast archive of posts and comments. Launched late last year, Reddit Answers has quickly grown from 1 million weekly users in Q1 to over 6 million in Q2. Huffman shared plans to integrate Reddit Answers more deeply into the platform's core search and roll it out globally. Unlike traditional search engines that offer AI summaries, Reddit’s model keeps the human element front and center, using trusted community content to generate results.

This ambitious shift comes on the heels of Reddit’s most profitable quarter yet. In Q2 2025, the company reported $500 million in revenue—a 78% increase from the previous year—and $89 million in net income, giving it a net margin of 18%. Its adjusted EBITDA also reached $167 million, with a 33% margin. These strong financials give Reddit the freedom to invest heavily in search innovation and expand its global footprint.
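The reported figures are internally consistent; a quick sanity check of the margins and the implied prior-year revenue, using only the numbers cited above:

```python
# Sanity check of Reddit's reported Q2 2025 figures.
revenue = 500_000_000        # Q2 2025 revenue
net_income = 89_000_000      # Q2 2025 net income
adj_ebitda = 167_000_000     # Q2 2025 adjusted EBITDA
yoy_growth = 0.78            # 78% year-over-year revenue growth

net_margin = net_income / revenue                 # reported as 18%
ebitda_margin = adj_ebitda / revenue              # reported as 33%
prior_year_revenue = revenue / (1 + yoy_growth)   # implied Q2 2024 revenue

print(f"Net margin:    {net_margin:.1%}")                      # 17.8%
print(f"EBITDA margin: {ebitda_margin:.1%}")                   # 33.4%
print(f"Implied Q2 2024 revenue: ${prior_year_revenue/1e6:.0f}M")  # $281M
```

The rounded margins match the letter's 18% and 33%, and the growth rate implies roughly $281 million in revenue a year earlier.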

Huffman emphasized that Reddit’s unique advantage lies in its ability to deliver nuanced, people-driven insights. As tech giants like Google pivot to AI-driven answers, Reddit sees an opportunity to win over users who value a human perspective. The goal is clear: keep users on the platform by offering them deeper, more authentic answers powered by community knowledge.

With over 416 million weekly active users and a rising interest in alternative search experiences, Reddit’s transformation into a search engine could signal a major shift in how people look for information online.

Google Brings Live Camera Input Into Search AI Mode.

Google has officially rolled out Search Live, a major enhancement to its AI Mode that lets users interact with Google Search using live camera input. This update allows users to point their Android device camera at objects and speak their questions while the AI responds in real time, a fusion of visual and voice interaction designed to enrich search experiences.

What is Search Live, and how does it work?

Search Live builds on Project Astra’s live capabilities and integrates into Google’s AI Mode interface within the official Google app. Once enabled in Search Labs, users will see a new Live button in AI Mode at the top or bottom right. Tapping it opens a live camera viewfinder. In this mode, users can ask questions about what the camera sees, such as food ingredients, plants, and street signs, and receive detailed, contextual responses alongside relevant links, videos, and summaries.

The interface also adapts visually when active. Google’s signature colored arc dips down during AI responses, and integrated options let users mute the microphone or view transcripts without interrupting the conversation.

Search Live echoes the capabilities of Gemini Live, which previously supported voice and screen sharing. The new feature takes that experience directly into Search, weaving together Lens and generative AI to create a seamless multimodal tool.

Live AI Mode Search

Why the Search Live Feature Is Useful.

Search Live represents a new level of interactivity in everyday search behavior. Instead of typing or tapping into apps, users can now ask questions about their environment and receive AI responses based on what they see. This opens possibilities for real-time assistance—such as meal prep help, plant care tips, translation of signage, or even product lookups in stores.

Because the feature works within Search’s AI Mode, it benefits from Google’s query fan‑out system. That means it can cross-reference multiple data sources and generate concise answers with links to sources—all while keeping the interaction in a conversational format. 

Availability of the Search Live Feature.

Search Live is currently rolling out to users enrolled in Search Labs in the U.S. Users on recent Google app versions, specifically 16.28 (stable) or 16.29 (beta) on Android, have already reported seeing the Live icon and viewfinder during AI Mode sessions. The search bar or AI Mode interface adapts on the fly to include the Live camera option.

Google may expand the feature globally over time. Because it is managed server-side, users may need to wait a few days or restart the app to see the option, even if they meet the version requirements.

Google Adds AI Mode Shortcut to Android Search Widget.

Google AI Mode Search
Key Takeaway.
  • Android users can now launch AI Mode from the Search widget with a dedicated shortcut, boosting access to Gemini-powered search.
  • The customizable widget shortcut is rolling out with Google app 16.28 and enhances usability without needing Search Labs enrollment.

Google is now rolling out a convenient shortcut to AI Mode directly on the Android Search widget, giving users one-tap access to its AI-powered search interface. The AI Mode icon appears in its own circle next to the voice and Lens shortcuts, making it quick to launch full-screen Gemini‑powered search responses.

What’s New in Google AI Mode and How to Use It.

Starting with app version 16.28, both beta and stable users can now customize the Google Search widget to include the AI Mode button. Long-pressing the widget brings up a Customize menu where you can enable AI Mode under Shortcuts. It will then appear alongside existing icons for voice search and Lens.

Here is a step-by-step process of how to enable Google AI Mode:

Step 1: Open the Google Search App on your Android phone.

Step 2: Tap your profile picture in the top-right corner and go to Settings.

Step 3: Select Customise Search widget and then select Shortcuts.

Google Search Settings

Step 4: Inside Shortcuts, you will get an option to add AI Mode to your Google Search.

AI Mode in Google Search

When you tap the AI Mode shortcut, it launches a full-screen interface where you can enter any prompt and receive AI-generated responses. It functions like a conversational search tool, using Gemini’s query fan-out technique to break down your question into subtopics and provide comprehensive information.

Users not enrolled in Search Labs may see the older Locate AI interface, where AI Mode is available in a pill-style button within the Discover feed instead of the widget area. Google encourages users to join Search Labs for a cleaner and more integrated experience.

Also Read: Google Launches AI-Powered Web Guide to Organize Search Results.

How Google AI Mode is Useful for the User.

The widget shortcut makes AI Mode more accessible and intuitive. It removes the need to open the Google app first and streamlines access for users who want next-generation search directly from their home screen.

This update reflects Google’s broader push to integrate AI deeply across its products. While new AI tools like Deep Search and Gemini 2.5 Pro are reserved for subscribers, the widget shortcut brings AI Mode to more casual users in a familiar format.

Google Launches AI-Powered Web Guide to Organize Search Results.

Google Web Guide

Key Takeaway.
  • Web Guide uses Gemini AI to organize search results into useful subtopics and related questions within the Web tab.
  • The experiment combines AI summaries with traditional links for faster and more intuitive browsing.

Google has started testing a new search feature called Web Guide, which uses AI to group search results into helpful categories. The Verge reports that this experimental tool is currently available to users who opt into Search Labs, bringing a smarter, more structured browsing experience.

What Is Web Guide and How Does It Work?

Web Guide is a Search Labs experiment powered by a customized version of Google’s Gemini AI. It analyzes open-ended or complex search queries and presents results in organized sections, such as subtopics or focused questions. Gemini performs multiple related searches simultaneously—an approach known as “query fan‑out”—to better understand the query and present more relevant groupings.
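Google has not published the internals of query fan-out, but the behavior described above, decomposing an open-ended query into related sub-queries, running them in parallel, and grouping the results, can be sketched roughly as follows. Every function name here (`fan_out_subqueries`, `run_search`, `web_guide`) is hypothetical, and the sub-query split is hard-coded for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out_subqueries(query: str) -> list[str]:
    # In Web Guide, a customized Gemini model generates these sub-queries.
    # Here we hard-code an illustrative split into focused subtopics.
    return [
        f"{query} basics",
        f"{query} in specific climates",
        f"troubleshooting {query}",
    ]

def run_search(subquery: str) -> list[str]:
    # Stand-in for a real search backend call.
    return [f"result for '{subquery}'"]

def web_guide(query: str) -> dict[str, list[str]]:
    """Run all sub-queries in parallel and group results by subtopic."""
    subqueries = fan_out_subqueries(query)
    with ThreadPoolExecutor() as pool:
        results = pool.map(run_search, subqueries)
    return dict(zip(subqueries, results))

sections = web_guide("mango tree care")
for topic, hits in sections.items():
    print(topic, "->", hits)
```

The key idea is that the parallel sub-searches let the system cover several angles of a broad question at once, then present each angle as its own labeled section.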

This tool appears within the Web tab of Google Search. Users can easily toggle between the traditional “10 blue links” and the AI-enhanced Web Guide format. Early examples include searches like "how to care for a mango tree," which generated sections like “Mango Tree Care in Specific Climates” and “Troubleshooting Mango Tree Issues.” Results came from educational sites, forums, and even Reddit discussions, thanks to past content partnerships.

Google Search Labs

Why Web Guide Matters.

Web Guide bridges the gap between conventional and AI-enhanced search. While Google’s AI Overviews rely on direct summaries, Web Guide reintroduces link-based exploration but in a more helpful format. It allows users to scan categorized results quickly and dive deeper into the topics that matter most, reducing the time spent scrolling through endless links.

By presenting both AI-generated context and traditional link structures, Web Guide enhances discoverability. Users can explore unfamiliar subtopics with confidence, guided by intuitive sections rather than disparate search results. This aligns with Google’s vision of making AI innovations like Gemini more useful and integrated into everyday search.

How To Access the Web Guide?

To access Web Guide, users need to opt into Search Labs. Once enabled, the Web tab will display categorized AI-assisted results alongside regular search listings. Google plans to expand the tool into the All tab over time as usage insights and feedback roll in.

Search Labs offers a controlled environment where Google can measure performance and tweak features based on user behavior. As Web Guide evolves, it may include deeper nested categories, richer summaries, and broader availability across search tabs.

Perplexity CEO Dares Google to Choose Between Ads and AI Innovation

Google Vs Perplexity
Key Takeaway.
  • Perplexity CEO Aravind Srinivas urges Google to choose between protecting ad revenue or embracing AI-driven browsing innovation.
  • As Perplexity’s Comet browser pushes AI-first features, a new browser war looms, challenging Google’s traditional business model.

In a candid Reddit AMA, Perplexity AI CEO Aravind Srinivas criticized Google's reluctance to fully embrace AI agents in web browsing. He believes Google faces a critical choice: either commit to autonomous AI features that reduce ad clicks, accepting short-term losses to stay competitive, or cling to its ad-driven model and fall behind.

Srinivas argues that Google’s deeply entrenched advertising structure and bureaucratic layers are impeding innovation, especially as Comet, a new browser from Perplexity, pushes AI agents that summarize content, automate workflows, and offer improved privacy. He described Google as a “giant bureaucratic organisation” constrained by its need to protect ad revenue.

Comet, currently in beta, integrates AI tools directly within a Chromium-based browser, allowing real-time browsing, summarization, and task automation via its “sidecar” assistant. Srinivas warned that large tech firms will likely imitate Comet’s features, but cautioned that Google must choose between innovation and preservation of its existing monetization model.

Industry experts are watching closely as a new "AI browser war" unfolds. While Google may eventually incorporate ideas from Comet, such as Project Mariner, Srinivas remains confident that Perplexity's nimble approach and user-first subscription model give it a competitive edge.

Google’s AI Can Now Make Phone Calls on Your Behalf

Google Advance AI Search

Key Takeaway
  • Google's Gemini AI can now call local businesses for users directly through Search to gather information or book services.
  • The feature uses Duplex technology and is available in the U.S., with opt-out options for businesses and premium access for AI Pro users.

Google has taken a major step forward in AI-powered assistance by rolling out a new feature in the U.S. that allows its Gemini AI to make phone calls to local businesses directly through Google Search. This tool, first tested earlier this year, lets users request information like pricing, hours of operation, and service availability without ever picking up the phone.

When someone searches for services such as pet grooming, auto repair, or dry cleaning, they may now see an option labeled “Ask for Me.” If selected, Gemini will use Google’s Duplex voice technology to place a call to the business. The AI introduces itself as calling on the user’s behalf, asks relevant questions, and then returns the response to the user via text or email.

This move transforms the search experience into a more active and intelligent assistant. Users can now delegate simple but time-consuming tasks like making inquiries or scheduling appointments. It’s part of Google’s broader strategy to make AI more agent-like, capable of taking real-world actions on behalf of users.

Making call to Local Business in Google search
Credit: Google

Businesses that don’t want to participate in this feature can opt out using their Google Business Profile settings. For users, the functionality is available across the U.S., but those subscribed to Google’s AI Pro and AI Ultra plans benefit from more usage credits and access to advanced Gemini models like Gemini 2.5 Pro. These premium tiers also include features like Deep Search, which can generate in-depth research reports on complex topics using AI reasoning.

As AI integration deepens in everyday apps, this feature showcases a new phase of interaction, where digital tools not only inform but also act on our behalf. Google’s move reflects the future of AI as not just a search engine assistant, but a personal concierge for real-world tasks.

Google’s Discover Feed Gets AI Summaries, Alarming Publishers.

Google Discover Summary Feature

Google has quietly introduced AI-generated summaries into its Discover feed on the Google Search app, a move that’s already sparking alarm across the publishing industry. With this update, users in the U.S. using Android and iOS devices will now see short, three-line summaries instead of traditional article headlines and source names when browsing trending topics in categories like entertainment and sports.

How does it work?

The summaries are produced by Google’s in-house artificial intelligence models and combine information from multiple sources. These summaries appear at the top of the Discover cards, with a small overlay of icons that indicate how many sources were used to generate the content. Tapping on the icons opens a list of original sources, which users can then click on to read the full articles. A subtle “See more” link expands the summary, and each summary is accompanied by a disclaimer that states: “Generated with AI, which can make mistakes.”
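The card layout described above maps naturally onto a simple data structure. The field names below are illustrative only, not Google's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class DiscoverCard:
    """Illustrative model of an AI-summary Discover card."""
    summary: str                                       # three-line AI summary
    sources: list[str] = field(default_factory=list)   # original articles
    disclaimer: str = "Generated with AI, which can make mistakes"

    @property
    def source_count(self) -> int:
        # Shown as the small overlay of icons on the card.
        return len(self.sources)

card = DiscoverCard(
    summary="Short three-line summary of a trending sports story.",
    sources=["https://example.com/a", "https://example.com/b"],
)
print(card.source_count)  # 2
```

The structure makes the publisher concern concrete: the summary is the primary content, while the source links are secondary, reachable only by tapping the icon overlay.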

Google Discover Now Uses AI Summaries

While this change might improve user convenience by offering quick insights without needing to click through, it is generating strong backlash from digital publishers. Industry experts warn that AI summaries could accelerate the existing trend of "zero-click" behavior, where users find their answers directly on Google's platform and no longer visit the actual news sites. Publishers argue that this could deal yet another blow to traffic volumes that are already declining due to previous algorithm changes and the growing prominence of AI-powered search features.

Why Publishers Are Concerned.

Recent data supports these concerns. According to analytics firm Similarweb, traffic to news sites from Google Search has sharply decreased, falling from over 2.3 billion visits in August 2024 to under 1.7 billion by May 2025. During that same period, the number of searches ending without a single click rose from 56% to 69%. Several digital media outlets, including BuzzFeed News, Laptop Mag, and Giant Freakin Robot, have either shut down or significantly downsized in recent months, with loss of referral traffic cited as a contributing factor.
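Those Similarweb figures translate into a steep relative decline; a quick calculation from the numbers above:

```python
# Decline in Google Search referral traffic to news sites (Similarweb).
visits_aug_2024 = 2.3e9   # visits in August 2024
visits_may_2025 = 1.7e9   # visits by May 2025

drop = (visits_aug_2024 - visits_may_2025) / visits_aug_2024
zero_click_rise = 69 - 56  # percentage-point rise in zero-click searches

print(f"Referral traffic fell roughly {drop:.0%}")       # 26%
print(f"Zero-click share rose {zero_click_rise} points")  # 13 points
```

A roughly 26% drop in referral visits in nine months, alongside a 13-point rise in zero-click searches, is the backdrop for the publisher backlash.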

Independent publishers in Europe have gone a step further by filing an antitrust complaint with the European Commission, alleging that Google’s AI-driven tools undermine competition and fail to offer publishers meaningful options to opt out of AI training or display. Similar complaints are also under consideration by the UK’s Competition and Markets Authority, putting regulatory pressure on Google to reconsider how it integrates AI into search and content feeds.

Despite mounting criticism, Google maintains that AI summaries help users explore a broader range of content and that its platforms still drive billions of clicks to publishers every day. The company also introduced monetization tools like Offerwall, which allows publishers to generate revenue through subscriptions, micropayments, and newsletter signups. However, many industry voices argue that such measures are insufficient to counteract the loss of direct web traffic.

As the rollout continues, media companies are grappling with how to adapt. Some are testing their own AI tools to produce summary-style content in a format optimized for visibility within Google's evolving ecosystem. Outlets like Yahoo, Bloomberg, and The Wall Street Journal have started experimenting with article highlights and bullet-point takeaways to compete within the AI-influenced landscape. Still, concerns remain that even these efforts may not be enough to recover lost visibility and revenue.

In the coming months, Google is expected to expand the AI summary feature to cover additional content categories and possibly introduce it in international markets. Meanwhile, publishers and regulators alike are closely watching how this move will affect the future of digital journalism, news distribution, and the broader internet economy. The tension between technological advancement and fair access to digital audiences continues to intensify, with the stakes higher than ever.

© all rights reserved
made by WorkWithG