Speaking with AI Psychologists
How do you actually build a brain out of computer code? The Augmented City spent two days with a Los Angeles-based company called AGI3, which is building a cognitive architecture for natural language understanding of English text.
A big part of that work involves understanding how humans process and comprehend language, then explaining that phenomenon in a way software engineers can use to build a working system. Hence the job title "AI Psychologist."
The folks at AGI3 were kind enough to let me interview two AI psychologists and a software engineer to try to figure out just what a computer needs to understand language, not just process symbols.
Originally from Estonia, Dasha is an AI psychologist and a second-language speaker of English. She often asks herself, "Well, what would a computer need to know to communicate effectively in English?", treating the machine like one of the English students she taught back in Estonia.
Another AI psychologist, Maxwell, describes the job as helping computers solve problems they have historically been bad at. Among these are inferring context from the data that is there and, more difficult still, reasoning about the right context when the data is not there.
Brad is a software engineer who works with the AI psychologists to create a common development language bridging abstract human analysis and concrete computer code. Asked what he expects to see over the next decade, Brad says that by 2027, people will be teaching intelligent machines more than operating them. He believes we need to make teaching computers as cool as programming them.
Our conversation showed that advanced language understanding and AI software development are crossing new disciplinary boundaries between social science and engineering. The old right-brain/left-brain dichotomy is in for a major overhaul.