> On the contrary, we have one working example of general intelligence (humans)
I think some animals probably have what most people would informally call general intelligence, but maybe there’s some technical definition that makes me wrong.
I do not know how "general intelligence" is defined, but there is a set of features we humans have that other animals mostly lack, according to the philosopher Roger Scruton[1]. I am reproducing them from memory (errors mine):
1. Animals have desires, but do not make choices
We can choose to do what we do not desire, and choose not to do what we desire. For animals, one does not need this distinction to explain their behavior (Occam's razor): they simply do what they desire.
2. Animals "live in a world of perception" (Schopenhauer)
They only engage with things as they are. They do not reminisce about the past, plan for the future, or fantasize about the impossible. They do not ask "what if?" or "why?". They lack imagination.
3. Animals do not have the higher emotions that require a conceptual repertoire
such as regret, gratitude, shame, pride, guilt, etc.
4. Animals do not form complex relationships with others
Because such relationships require the higher emotions, like gratitude and resentment, and concepts such as rights and responsibilities.
5. Animals do not get art or music
We can pay disinterested attention to a work of art (or nature) for its own sake, taking pleasure in the exercise of our rational faculties upon it.
6. Animals do not laugh
I do not know if the science/philosophy of laughter is settled, but it appears to me to be some kind of phenomenon that depends on civil society.
7. Animals lack language
in the full sense of being able to engage in reason-giving dialogue with others, justifying one's actions and explaining one's intentions.
Scruton believed that all of the above arise together.
I know this is perhaps a little OT, but I seldom if ever see these issues mentioned in discussions about AGI. Maybe less applicable to super-intelligence, but certainly applicable to the "artificial human" part of the equation.
[1] Philosophy: Principles and Problems. Roger Scruton
> Sure, it won't be the size of an ant, but we definitely have models running on computers that have much more complexity than the life of an ant.
Do we? Where is the model that can run an ant: navigate a 3D environment, parse vision and other senses to orient itself, and figure out where it can climb to get where it needs to go? Then put it in an average forest, where it must navigate trees and other insects, cooperate with other ants, and find its way back. Or build an anthill: an ant can build an anthill, full of tunnels, that doesn't collapse, without using a plan.
Do we have such a model? I don't think we have anything that can do that yet. Waymo is trying to solve a much simpler problem and still struggles, so I am pretty sure we can't yet run anything even remotely as complex as an ant. Maybe a simple worm, but not an ant.
Having aptitude in mathematics was once considered the highest form of human intelligence, yet a simple pocket calculator can beat the pants off most humans at arithmetic tasks.
Conversely, something we regard as simple, such as selecting a key from a keychain and using it to unlock a door not previously encountered, is beyond the current abilities of any machine.
I suspect you might be underestimating the real complexity of what bees and ants do. Self-driving cars, too, seemed like a simpler problem before concerted efforts were made to build one.
> Having aptitude in mathematics was once considered the highest form of human intelligence, yet a simple pocket calculator can beat the pants off most humans at arithmetic tasks.
Mathematics has been a lot more than arithmetic for... a very long time.