The AI program was way less cute than a real baby. But like a baby, it learned its first words by seeing objects and hearing words.
The new model keeps things simple, and small — a departure from many of the large language models, or LLMs, that underlie today’s chatbots. Those models learned to talk from enormous pools of data. “These AI systems we have now work remarkably well, but require astronomical amounts of data, sometimes trillions of words to train on,” says computational cognitive scientist Wai Keen Vong, of New York University.
To narrow the inputs down from the entirety of the internet, Vong and his colleagues trained an AI program with the actual experiences of a real child, an Australian baby named Sam. A head-mounted video camera recorded what Sam saw, along with the words he heard, as he grew and learned English from 6 months of age to just over 2 years.
Through this iterative process, the model picked up some key words. Vong and his team then tested it with a task similar to a lab test used to find out which words babies know. The researchers gave the model a word ("crib," for instance), then asked it to find the picture containing a crib from a group of four pictures. The model landed on the right answer about 62 percent of the time. Random guessing would have yielded correct answers 25 percent of the time.
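In spirit, that four-picture test works like a nearest-neighbor lookup: the model scores each candidate image against the queried word and picks the best match. The sketch below illustrates the idea with made-up toy vectors; the embedding values and function names are hypothetical stand-ins, not the team's actual system:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def four_way_choice(word_vec, image_vecs):
    """Return the index of the image embedding most similar to the word embedding."""
    scores = [cosine(word_vec, v) for v in image_vecs]
    return max(range(len(scores)), key=scores.__getitem__)

# Toy example: a word embedding and four candidate image embeddings.
# Image 2 points in nearly the same direction as the word, so it should win.
word = [1.0, 0.0]
images = [[0.0, 1.0], [-1.0, 0.0], [0.9, 0.1], [0.0, -1.0]]
print(four_way_choice(word, images))  # picks index 2
```

With four candidates, a model guessing at random would be right one time in four, which is why the article compares the 62 percent result against a 25 percent baseline.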