Google Lens, a new computer vision technology, can turn the phone’s camera into a search engine. In the future, the camera itself may be the only search interface you need.
Google Lens has shown that with a few inexpensive cameras, microphones, and other sensors, plus some AI, better perception can be achieved with minimal hardware. The camera seems to be replacing the keyboard. Snapchat appears to have understood this from the start, which is why it positioned itself as a camera company rather than a social company in its IPO. And in recent months, Google and Facebook have all but confirmed it: progress is slow, but there is little doubt that the keyboard is being displaced by the camera.

Look at Snapchat: the company built its business on the observation that people would rather exchange pictures than strings of text. The idea proved so forward-looking that Facebook and Instagram copied it openly. On Facebook and Snapchat, users can layer exaggerated effects on top of a photo – not something that can be done with text. Google, meanwhile, has taken a more practical path: turning the camera into an input device much like the keyboard itself. Point your camera at a tree, and it will tell you what kind of tree it is.

Last week at Google’s annual developer conference, the company showcased Google Lens, a new computer vision technology that turns the phone’s camera into a search engine. This simplicity is important. Google CEO Sundar Pichai said we now interact with our machines in a more natural way, so the keyboard is becoming redundant. Take a picture of a new restaurant in your neighborhood, and Google Lens will show you the menu and a few recommended dishes, and can even help you book a table. Google’s image recognition scans this information, processes it on your phone, and automatically takes you to the right website. Google is not the only company that sees image recognition as a future trend.
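The "camera as keyboard" idea described above can be sketched in a few lines: the output of an image-recognition step becomes a text search query, replacing typed keywords. This is a minimal illustration, not Google's actual pipeline; the `recognize` stub and its label data are hypothetical stand-ins for a real vision model.

```python
def recognize(image_bytes):
    """Stub for an image-recognition model: returns (label, confidence) pairs.

    Hypothetical output for a photo of a restaurant storefront; a real
    system would call a cloud vision service here.
    """
    return [("restaurant", 0.92), ("storefront", 0.85), ("sign", 0.40)]

def labels_to_query(labels, threshold=0.5):
    """Keep confident labels and join them into a text search query."""
    return " ".join(name for name, score in labels if score >= threshold)

# The photo, not the keyboard, produces the query string.
query = labels_to_query(recognize(b"<jpeg bytes>"))
print(query)  # restaurant storefront
```

The point of the sketch is the interface: everything after `recognize` is an ordinary text search, which is why the camera can slot in where the keyboard used to be.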
Amazon’s 2014 Fire Phone supported image-based search, meaning you could point your camera at a book or a box of cereal and have it delivered via Amazon Prime. Baidu’s search also supports image recognition. But image recognition means more than that. Computerworld columnist Mike Elgan recently wrote that the real significance of Google Lens is that it shows us the future of general-purpose sensors. Thanks to machine learning and AI, he argues, the cloud can now take a single real sensor (a camera) and create a million different sensors in software – a "super sensor." All of the perception (and the actions taken on it) becomes a software problem: the raw data feeds software-defined virtual sensors. There is no new equipment to install, no batteries to replace, no additional physical sensors.

This month, CMU researchers also announced their own super-sensor technology. These signs suggest that the old "trillions of sensors" model of the Internet of Things will be killed off by AI; the super sensor is about to rise. In the future, a few inexpensive cameras, microphones, and other sensors will let you create almost any sensor in software, quickly and at low cost, with the sensors running on AI and cloud services. You will no longer need a dedicated physical sensor for each measurement. The most profound change the super sensor brings is that AI applications can achieve better results with fewer physical sensors – AI has killed the need for trillions of them. Google has also discussed a partnership with Particle, an Internet of Things (IoT) platform developer, exploring a way to make IoT devices aware of their geolocation without installing GPS modules.
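The super-sensor idea above can be made concrete with a toy sketch: one physical sensor (a camera, stubbed here as lists of grayscale pixel values) feeds several virtual sensors defined purely in software. The simple heuristics below are hypothetical stand-ins for the machine-learning models a real super sensor would run in the cloud.

```python
def light_level(frame):
    """Virtual ambient-light sensor: mean pixel brightness of one frame."""
    return sum(frame) / len(frame)

def motion(prev, cur):
    """Virtual motion sensor: mean absolute difference between two frames."""
    return sum(abs(a - b) for a, b in zip(prev, cur)) / len(cur)

def occupancy(prev, cur, threshold=10):
    """Virtual occupancy sensor: motion above a threshold implies presence."""
    return motion(prev, cur) > threshold

# One camera feed (two toy 4-pixel grayscale frames)...
frame1 = [20, 22, 21, 19]   # dim, static scene
frame2 = [80, 85, 90, 75]   # lights on, someone walks through

# ...yields three sensor readings, none requiring extra hardware.
print(light_level(frame2))        # 82.5
print(occupancy(frame1, frame2))  # True
```

Adding a new "sensor" here is just adding a function over the same frames, which is exactly the economic argument against installing trillions of dedicated physical devices.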
Google always seems to be ahead of the game. When it was all about web pages and text, Google created the best search engine. Now that the world is moving to images, Google has upped its game accordingly.
What do you think about Google Lens? Share with us in the comments below!