Searching with Google is without a doubt one of the easiest ways to quickly find answers, but it can also raise new questions. Sometimes those questions arise just as you finish a search — and now Google has announced search updates that can surface answers to questions you haven't even asked.
At its Search On event on Wednesday, Google announced a number of new features, including a Google Lens tool that lets you search with keywords and images together, and a new "Things to know" box that answers queries related to your search.
The new features are powered by MUM, or Multitask Unified Model, which can extract information from formats beyond text, including images, audio, and video. MUM can also transfer knowledge across the 75 languages it was trained on.
With the help of machine learning, Google can now interpret content beyond search terms, which can be useful if you're stuck on which word or phrase to use. In the coming months, Google's search results will begin to become more relevant to you than ever before.
One example the company provided is researching acrylic painting: you can type the phrase into the search bar and then scroll to the section labeled "Things to know." There, Google will explain what acrylic paint is and where you can buy it, and link to guides on how to use acrylic paint, styles of acrylic painting, and tips for getting started with household items. In effect, Google's AI broadens your search and takes you down the acrylic-paint rabbit hole. Google pulls all of this from existing search results, and it's possible that "Things to know" will discourage you from clicking through to other sites if the answer appears right in the box. But Google will still link to the pages where it found the information, so the sources aren't cut out entirely (though site owners may have reason to be apprehensive).
Google Lens uses machine learning to let you ask questions about what you see in real time. For example, if you see a picture of a sweater you like, you can use Lens to search for other types of clothing in the same color.
Perhaps the most useful application of this tool, however, is identifying something you don't know how to describe in a search. Google demonstrated this with a photo of a bike with a broken derailleur — which I didn't realize was a bike part until now. You can point the Google Lens-enabled camera at the broken part of the bike and ask Google how to fix it, without ever needing to know what the part is called. This Google Lens feature won't be available until early next year, and it's not yet known whether it will be baked into the standalone Google Lens app or into the Android phone's camera app.
Google is also using machine learning to recognize moments in videos and identify topics related to them. For example, if you're watching a video about an animal that isn't mentioned in the video's title or description, Google can still recognize the topic and return related links. Google says this will start rolling out in the coming weeks, with "more visual improvements" arriving in the coming months.
Finding relevant images will soon be easier with a newly designed, visually browsable results page. This should make it easier to search for project ideas and other visual inspiration, and you can try it in Google Search now.
Two other upcoming changes, separate from Google's machine-learning work, are optional "Refine this search" and "Broaden this search" buttons that appear when you search online. These will help you narrow your original query or expand it. Google says they will arrive in the coming months.