Google has just announced a beta rollout of a new feature that integrates Google Lens directly into YouTube Shorts. This update, detailed on Google’s support page, allows users to pause a Short, tap “Lens” from the top menu, and visually search anything in the video without ever leaving the app.
The process is seamless. When watching a Short on the YouTube mobile app:
- Tap to pause the video.
- Select the “Lens” option from the top of the screen.
- Use your finger to tap, draw around, or highlight the object you want to search.
Google Lens will then analyze the video frame and show search results right on top of the paused Short. Once you are done exploring, you can easily jump back into the video without disruption.
During the beta phase, ads will not be shown in the Lens search results. Additionally, Lens is not available on Shorts that include YouTube Shopping affiliate links or paid product promotions.
You might wonder why Google is introducing Lens in Shorts when it already offers the "Circle to Search" feature on Android devices. The answer lies in accessibility and user behavior. "Circle to Search" works system-wide across any screen, but it is limited to select Android phones (such as the Pixel and some Samsung models) and is not available on iOS or on most other Android devices. Integrating Lens directly into the YouTube app ensures that more users, regardless of their device, can access visual search.
By embedding Lens into the Shorts player, Google is trying to enhance the experience in a way that is hyper-contextual. If a viewer is watching a travel Short and spots an intriguing landmark, they no longer need to guess the location or switch apps. This instant access to information turns passive viewing into active engagement.
YouTube has been transitioning from just a video platform to a broader discovery engine. With the rise of Shorts, Google wants to capitalize on spontaneous curiosity. This feature encourages deeper interaction with content and keeps users within the app ecosystem longer.
Shorts contain vast, diverse video content from all over the world. Lens integration could feed anonymized visual data back to improve Google's AI and object recognition systems. It is a clever way to train and refine its algorithms with live user input.
This move is another sign of Google’s intent to turn video into an interactive layer of the internet. With AI and visual search becoming core pillars of the company’s strategy, we can expect even more immersive, real-time tools in video content—possibly extending beyond Shorts into full-length YouTube videos, Stories, and even live streams.