
Is it possible to run Cursor entirely with local models? My Mac can comfortably run relatively massive models. I would experiment so much more with AI in my codebases knowing that I wouldn't slam into a brick wall due to quotas, connection issues, etc.


You can power it with local models, but you can't use it without internet or without sending your data to Cursor, since they do a bunch of preprocessing and orchestration on their backend before handing everything off to the model.
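
For the "power it with local models" part, a minimal sketch of what that usually means: Cursor (or any client) talks to a local OpenAI-compatible server instead of a hosted one. Assuming something like Ollama is serving on its default port 11434, you can sanity-check the local endpoint like this; the model name and port here are assumptions for illustration, not Cursor's own config:

    # Quick check that a local OpenAI-compatible server (e.g. Ollama) responds.
    import requests

    resp = requests.post(
        "http://localhost:11434/v1/chat/completions",  # Ollama's OpenAI-compatible endpoint
        json={
            "model": "llama3.1",  # whatever model you've pulled locally
            "messages": [{"role": "user", "content": "Say hello in one word."}],
        },
        timeout=60,
    )
    resp.raise_for_status()
    print(resp.json()["choices"][0]["message"]["content"])

If that works, a client with a configurable base URL can be pointed at it, but as noted above, Cursor's own indexing and orchestration still run on their servers, so this doesn't make the editor fully offline.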



