For the past two years, Facebook AI Research (FAIR) has been working with 13 universities around the world to assemble the largest-ever dataset of first-person video, specifically to train image-recognition models. AIs trained on the dataset should be better at controlling robots that interact with people, or at interpreting images from smart glasses. “Machines will be able to help us in our daily lives only if they really understand the world through our eyes,” says Kristen Grauman of FAIR, who leads the project.
Such technology could be used to help people who need assistance around the home, or to guide people through tasks they are learning to complete. “The video in this dataset is much closer to how humans observe the world,” says Michael Ryoo, a computer vision researcher at Google Brain and Stony Brook University in New York, who is not involved in Ego4D.
But the potential misuses are clear and worrying. The research is funded by Facebook, a social media giant that was recently accused in the Senate of putting profits ahead of people’s well-being, an accusation echoed by MIT Technology Review’s own reporting.
The business model of Facebook, and of other Big Tech companies, is to extract as much data as possible from what people do online and sell it to advertisers. The AI outlined in the project could extend that reach to people’s everyday offline behavior, revealing the objects around a person’s home, the activities they enjoy, who they spend time with, and even where their gaze lingers – a degree of personal information that has never been available before.
“There is work on privacy that needs to be done as you take this out of the world of exploratory research and turn it into a product,” says Grauman. “That work could even be inspired by this project.”
Ego4D is a step change. The largest previous dataset of first-person video consists of 100 hours of footage of people in the kitchen. The Ego4D dataset contains 3,025 hours of video recorded by 855 people in 73 different locations across nine countries (US, UK, India, Japan, Italy, Singapore, Saudi Arabia, Colombia, and Rwanda).
The participants were of different ages and backgrounds; some were recruited for their visually interesting occupations, such as bakers, mechanics, carpenters, and architects.
Previous datasets typically consist of semi-scripted video clips only a few seconds long. For Ego4D, participants wore head-mounted cameras for up to 10 hours at a time and captured first-person video of unscripted daily activities, including walking along a street, reading, doing laundry, shopping, playing with pets, playing board games, and chatting with other people. Some of the footage also includes audio, data about where the participants’ gaze was focused, and multiple perspectives on the same scene. It’s the first dataset of its kind, says Ryoo.
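For readers who want a concrete picture of what one of these multimodal recordings might contain, the sketch below is a purely hypothetical Python representation of a single egocentric clip, based only on the modalities described above (video plus optional audio, gaze data, and additional camera views); the class and field names are invented for illustration and are not Ego4D’s actual schema or API.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Purely illustrative: a hypothetical record for one egocentric clip,
# mirroring the modalities described in the article. Field names are
# assumptions for this sketch, not Ego4D's real data format.

@dataclass
class EgocentricClip:
    video_path: str                   # first-person (head-mounted camera) video
    duration_s: float                 # clip length in seconds
    activity: str                     # e.g. "doing laundry", "shopping"
    audio_path: Optional[str] = None  # only some footage includes audio
    gaze_samples: List[Tuple[float, float, float]] = field(default_factory=list)
    # (timestamp_s, x, y): where the wearer's gaze was focused, if recorded
    other_views: List[str] = field(default_factory=list)
    # synchronized recordings of the same scene from other cameras, if any


# Example usage with made-up values:
clip = EgocentricClip(
    video_path="clips/laundry_001.mp4",
    duration_s=412.0,
    activity="doing laundry",
    audio_path="clips/laundry_001.wav",
    gaze_samples=[(0.0, 0.52, 0.48), (0.033, 0.53, 0.47)],
    other_views=["clips/laundry_001_room_cam.mp4"],
)
print(f"{clip.activity}: {clip.duration_s:.0f}s, "
      f"{len(clip.gaze_samples)} gaze samples, "
      f"{len(clip.other_views)} extra views")
```

A real loader would stream frames and align gaze timestamps to them; this record only illustrates which kinds of data can travel together in a single clip.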