The Google Pixel 2 is arguably the phone of the moment and is in close competition with the iPhone X.
Since Google doesn’t want to lose ground in the high-end smartphone race, it has added a powerful new feature to its Pixel phones.
Google Lens, the company’s intelligent camera software that can analyze the world around you, is now rolling out to Google Assistant on Pixel phones. Announced earlier this year, Lens is one of the company’s most important new products as it offers an early look at what the future of search will look like for Google.
Though Google Lens has been available within Google Photos since the Pixel 2 launched, this update, which is rolling out “over the coming weeks,” marks the first time the feature has been available outside of Google Photos.
This means Pixel owners will be able to use Google Lens with their smartphone camera in real-time, rather than simply using the feature to analyze photos they’ve previously taken.
As Tech Correspondent Ray Wong noted in his Pixel 2 review, it’s still very early days for Google Lens, so the new feature will be limited to a handful of use cases, including text recognition, barcode scanning, and the ability to identify books, movies, art, and landmarks.
Even so, the addition of Lens stands to make Assistant much more useful. Instead of manually searching for that type of information, you can simply point your camera at it to get the answer you need. And while other camera apps can recognize text or barcodes, Lens’ landmark-identifying feature offers a particularly intriguing look at where the feature is heading.
Point your camera at a landmark, and Assistant will identify it and tell you more about it.
Eventually, those computer vision abilities could extend to a number of other types of places and objects as well. Again, it’s still early days for the technology, but it’s clear that Google is envisioning Lens as an important part of the future of search.