It is quasi-deterministic (sans a temperature parameter) and it only ever responds to a query. It is not at all autonomous. If you let it do chain-of-thought for too long, or any sort of continuous feedback loop, it always goes off the rails. It is an inference engine. Inference by itself is not intelligence. Chollet makes a good case that intelligence requires both inference and search/program synthesis. If you haven't read his papers about the ARC-AGI benchmark, you should check them out.


> It is quasi-deterministic (sans a temperature parameter)

Human brains are quasi-deterministic too. It’s just chaos arising from ultimately deterministic phenomena, which can be modeled as a “temperature parameter”.
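
To make the temperature point concrete, here's a minimal sketch of temperature sampling in Python (toy logits, not any particular model's API): at temperature 0 the sampler collapses to argmax and is fully deterministic; above 0 you get the quasi-determinism being discussed.

    import numpy as np

    def sample_token(logits, temperature=1.0, rng=None):
        # Temperature 0 degenerates to greedy argmax: fully deterministic.
        if temperature == 0.0:
            return int(np.argmax(logits))
        rng = rng or np.random.default_rng()
        scaled = np.asarray(logits, dtype=float) / temperature
        probs = np.exp(scaled - scaled.max())   # numerically stable softmax
        probs /= probs.sum()
        return int(rng.choice(len(probs), p=probs))

    logits = [2.0, 1.0, 0.5]
    print(sample_token(logits, temperature=0.0))  # always index 0
    print(sample_token(logits, temperature=1.5))  # varies run to run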

> it only ever responds to a query. It is not at all autonomous.

We can give it feedback loops like CoT, and you can even have it talk to itself. If you then think of the feedback loop as the entire system, it is autonomous. Humans are doing the same thing; our internal thought process is by definition a feedback loop. A rough sketch of what I mean is below.
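
(A sketch only; `generate` is a hypothetical stand-in for whatever completion API you're calling.)

    def generate(prompt: str) -> str:
        # Hypothetical wrapper around some LLM completion API.
        raise NotImplementedError

    def self_talk(task: str, steps: int = 5) -> str:
        # Treat the whole loop as the system: each output is appended
        # to the transcript and fed back in as the next prompt.
        transcript = task
        for _ in range(steps):
            thought = generate(transcript + "\nContinue reasoning:")
            transcript += "\n" + thought
        return transcript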

> If you let it do chain-of-thought for too long or any sort of continuous feedback loop it always goes off the rails.

But this isn’t scripted. It’s more that the model goes off the rails on its own. “Scripted” isn’t a characterization that accurately describes anything that’s going on.

That the AI hallucinates and goes off the rails isn’t characteristic of scripting; it’s characteristic of a lack of control. We can’t fully control AI.



