Google Lens: The Lens of the Future!
Google Lens, Google's image-recognition technology, uses neural-network-based visual analysis to understand what your camera sees. Point your smartphone at something, say, a flower, and Lens not only detects the type of flower but understands what it has detected and offers actions based on the object, such as showing nearby florists. When Google Lens was released in 2017, it was exclusive to Google Pixel devices, integrated into Google Assistant and Google Photos. Today it is available on most Android devices, including those from LG, Motorola, Xiaomi, Sony and OnePlus, either as a standalone app or as a feature within Google Assistant or Photos.
How does it work?
Google Lens combines search and computer vision, powered by AI and AR, and can do many things: scan business cards or barcodes, identify restaurants, monuments and objects, or even find book reviews. It can also get you online: point your camera at the SSID sticker on the back of a Wi-Fi router, and your phone can connect to the network without you having to read out the password while typing it in. With Google Lens, you can literally point and shoot.
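Wi-Fi sharing stickers and QR codes commonly encode the network details in the ZXing `WIFI:` payload format, which is what makes the point-and-connect trick possible. The format itself is real; the parser below is a toy illustration of how such a payload might be unpacked, not Google's actual implementation (a robust parser would also handle backslash-escaped `;` characters).

```python
# Sketch: parsing a ZXing-style "WIFI:" QR payload like the ones many
# router stickers encode. Illustrative only, not Google's code.

def parse_wifi_qr(payload: str) -> dict:
    """Parse 'WIFI:T:WPA;S:MyNet;P:secret;;' into its fields."""
    if not payload.startswith("WIFI:"):
        raise ValueError("not a Wi-Fi configuration payload")
    fields = {}
    for part in payload[len("WIFI:"):].split(";"):
        if ":" in part:
            key, _, value = part.partition(":")
            fields[key] = value
    return {
        "ssid": fields.get("S", ""),
        "security": fields.get("T", ""),   # e.g. WPA, WEP, nopass
        "password": fields.get("P", ""),
    }

creds = parse_wifi_qr("WIFI:T:WPA;S:HomeNet;P:hunter2;;")
print(creds["ssid"], creds["security"])   # HomeNet WPA
```

Once the payload is parsed, the operating system's Wi-Fi API can join the network with the extracted SSID and password.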
You can, of course, type a text query and search for image results manually, but being able to take a picture of something and get all the related details and information has made life that much easier.
The ability of Google Lens to recognise restaurants, cafes and bars is more than impressive. Point your lens at a cafe or a bar and Google Lens will present a pop-up window showing reviews, address details and opening times.
What else can Google Lens do?
Apart from the scenarios described above, Google Lens has the following capabilities:
Smart Text Selection:
This is one of the most useful and intriguing features of Google Lens. Point Lens at text, from a book or a menu, for instance, and it can highlight that text within Google Lens and copy it for use on your phone. For example, point your lens at a menu and tap a dish to see an explanation of anything you are not familiar with.
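Conceptually, smart text selection pairs OCR with hit-testing: the recognizer returns words with bounding boxes, and a tap selects whichever word contains the tapped point. Here is a minimal, hypothetical sketch of that selection step; the menu data and helper names are invented for illustration.

```python
# Toy sketch of tap-to-select over OCR output. Real OCR engines return
# recognized words with bounding boxes; selecting text then reduces to
# hit-testing the tap coordinates against those boxes.

from typing import NamedTuple, Optional

class Word(NamedTuple):
    text: str
    x: int      # top-left corner of the bounding box
    y: int
    w: int      # box width and height in pixels
    h: int

def word_at(words: list, tap_x: int, tap_y: int) -> Optional[str]:
    """Return the recognized word whose box contains the tap, if any."""
    for word in words:
        if word.x <= tap_x < word.x + word.w and word.y <= tap_y < word.y + word.h:
            return word.text
    return None

# Hypothetical OCR output for one line of a menu.
menu = [
    Word("Gnocchi", x=10, y=40, w=90, h=20),
    Word("alla",    x=105, y=40, w=40, h=20),
    Word("Romana",  x=150, y=40, w=80, h=20),
]
print(word_at(menu, 30, 50))   # Gnocchi
```

A real implementation would then extend the selection to whole phrases and hand the text to search or the clipboard.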
Smart Text Search:
Smart text search comes in handy when you need to look up the definition of a word: the highlighted text can be searched with Google Assistant.
Search for clothing or decor:
This feature lets users identify a piece of clothing or household decor and then compare offers from multiple retailers, surfacing the most relevant reviews and shopping options.
Search for objects around you:
With Google Lens, you can point your camera at your surroundings to get details and take actions. Looking at a building and not sure what it is? Curious about the species of a rare flower? Want the reviews and synopsis of a book at a bookstore? Google Lens is the perfect sidekick for telling you what is around you.
How do you use Google Lens?
Google offers a standalone Google Lens app for some Android devices, and Lens is also integrated into Google Assistant and Google Photos across a wide range of Android devices; on some phones it is built directly into the camera app. Whichever route you take, the experience and functionality are more or less the same. In Google Assistant, tap the Lens icon in the bottom-right corner and point your phone at anything, the show timings outside a theatre, for instance, and you are taken to the same view you get in the Lens app. Suggestions then appear in the viewfinder, and you can use Google Lens to get all the information you need.
In Google Photos, the Google Lens icon, an Instagram-esque dot with three-quarters of a square around it, sits at the bottom of the window. When you tap it, scanning dots appear on the picture, and Google immediately serves up suggestions based on the scanned image. Depending on your photo, you can check details of the image, take an action or find similar products.
Smart Lens in Google Assistant
With Google Photos, you first have to take a picture with the camera app, then open it in Google Photos and load it into Google Lens. It is easier, smoother and much quicker to identify something with Lens in Google Assistant. Lens in Assistant is also smarter than the tool built into Google Photos: it can even recognize people, be they politicians or actors.
Google Lens's forte is identifying just about any common object, from a cappuccino to a MacBook Pro, from a lily to a pint of beer. It can correctly identify buildings, monuments and landmarks, showing opening and closing times along with a brief description. It can also identify popular pieces of art, naming the artist and even the date a work was painted. Lens comes in handy for looking up extra information or running a related search on any kind of media: CDs, DVDs, book covers and so on. It can even pick up text such as phone numbers and addresses; snap a picture of an address and then see where it is in Google Maps. However, Google Lens does not always give you the result you want: photograph a brand's logo, for instance, and you might see matching products from the brand rather than the exact item you had in mind.
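Under the hood, object identification like this typically ends with a classifier producing scores over known labels, and the app shows only suggestions whose confidence clears a threshold. Here is a minimal, self-contained sketch of that final step; the labels, scores and threshold are made up for illustration and this is in no way Google's actual pipeline.

```python
# Toy sketch of turning raw classifier scores ("logits") into ranked
# label suggestions, the kind of post-processing an app might do before
# showing "this looks like a lily" to the user.

import math

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(logits)                       # subtract max for stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def suggest(labels, logits, threshold=0.5):
    """Return (label, probability) pairs above the confidence threshold."""
    probs = softmax(logits)
    ranked = sorted(zip(labels, probs), key=lambda lp: lp[1], reverse=True)
    return [(label, round(p, 2)) for label, p in ranked if p >= threshold]

# Hypothetical scores for a photo of a flower.
labels = ["lily", "tulip", "cappuccino"]
logits = [4.1, 1.2, -0.5]
print(suggest(labels, logits))   # [('lily', 0.94)]
```

The confidence threshold is what keeps Lens from guessing wildly on ambiguous photos: low-scoring labels simply never reach the screen.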
The future of Google Lens:
Google announced at its I/O conference that it would bring 3D and augmented reality to search results. Users will be able to translate text in real time, and to view and place 3D objects from search directly into their own space, giving a sense of scale and detail. The evolving features of Google Lens will provide more visual answers to visual questions.
Additionally, Google Lens will offer real-time language detection and translation: point the camera at text and Lens will automatically detect the language and translate it, which should make travelling abroad far less stressful. Scan any document and Google Lens will translate it into your chosen language. Another new feature is that Lens will be able to read the translated text out loud. Google Lens will also gain a smarter Google Assistant that processes speech on-device with near-zero latency, running locally on the phone without being connected to the web.
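As a rough illustration of the detect-then-translate flow described above, here is a toy sketch: a trivial stopword-counting language guesser feeding a lookup-table "translator". Everything here, the word lists, the phrase table and the function names, is invented for illustration; real systems use neural models for both steps.

```python
# Toy sketch of a detect-then-translate pipeline. Real pipelines use
# neural language identification and neural machine translation; this
# stopword/phrase-table version only illustrates the control flow.

STOPWORDS = {
    "es": {"el", "la", "es", "una", "donde"},
    "fr": {"le", "la", "est", "une", "où"},
    "en": {"the", "is", "a", "where"},
}

# Tiny hand-written phrase table standing in for a translation model.
PHRASES = {
    ("es", "donde esta la estacion"): "where is the station",
    ("fr", "où est la gare"): "where is the station",
}

def detect_language(text):
    """Guess the language by counting stopword hits (toy heuristic)."""
    words = set(text.lower().split())
    scores = {lang: len(words & stops) for lang, stops in STOPWORDS.items()}
    return max(scores, key=scores.get)

def translate(text):
    lang = detect_language(text)
    if lang == "en":
        return text                       # nothing to do
    return PHRASES.get((lang, text.lower()), f"[no translation for {lang!r}]")

print(translate("donde esta la estacion"))   # where is the station
```

On-device speech and translation, as announced, would run models like these locally instead of sending the text to a server.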