Definitely. LLMs have gotten so much press that many people arguing about 'AI' are really thinking about LLMs.
And LLMs can't do everything a human can. Language is not all there is to a human.
But there is an argument that part of the brain produces language, and that part has some LLM-like characteristics. It's just that the brain is bigger and does more than an LLM, so the brain as a whole is not an LLM.
The brain has many components. What happens when you take the problem solving of something like AlphaGo/AlphaStar, combine it with the vision processing in self-driving cars or DALL-E, and add the language processing of an LLM? Then add in hearing and touch.
It's not that the brain is bigger than an LLM; it's that the way we learn written language (and spoken language too, tbh) is different from how LLMs learn language, and the way we think about the world isn't derivative of language.
We don't learn to read or write by doing token prediction (if we did, subjects like spelling would be much easier). In fact, there was a movement in schools to teach reading by asking students to predict what a word might be from the context of the sentence. It was a disaster: it led to increased illiteracy rates, and schools have started shifting back to phonics. Not only do we not learn that way; when we try to learn that way, it leads to worse educational outcomes.
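To make "token prediction" concrete: at its core, an LLM is trained to guess the next token from the tokens before it. Here's a toy sketch of that idea (my own illustration, not anyone's actual training code) using simple bigram counts:

```python
from collections import Counter, defaultdict

# Tiny "training corpus" — in a real LLM this would be billions of tokens.
corpus = "the cat sat on the mat the cat ate the rat".split()

# Count which word follows each word: the simplest possible
# next-token predictor (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed next token after `word`."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — it follows "the" most often in the corpus
```

A real LLM swaps these raw counts for a neural network conditioned on a long context, but the training objective is the same: predict what comes next. The point above is that children learning to read are not doing this.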
The reason brains are not like LLMs is not that we also have eyes and an LLM doesn't. Even if you isolate just our language "models", they are trained differently and interact with the rest of our brains differently.
If the language centers of our brains worked like an LLM, we would expect language skills to develop faster than the reasoning expressed in our writing and speech. A primitive LLM like GPT-2 has very limited reasoning ability but can still imitate a wide range of styles and "speak" in a grammatically correct way. Humans are the opposite: we start out trying to communicate complex ideas while using language poorly. We become competent thinkers before we master language as a tool.
Put those pieces together, and it starts to look like the components of a brain.