Hacker News | HenryNdubuaku's comments

Valid question. Our perspective is that there can be multiple players; there are 7 billion devices to power, and everyone will get a slice.


You don’t have to bundle the weights as an asset; you can do over-the-air updates, where new weights are simply downloaded.
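To make the over-the-air idea concrete, here is a minimal sketch of version-gated weight downloads. The manifest shape, endpoint, and function names are all hypothetical (the app supplies its own update mechanism); this is not the Cactus API.

```typescript
// Hypothetical manifest the server publishes for the latest weights.
interface WeightsManifest {
  version: number;
  url: string;
}

// Compare the cached weights version against the server manifest.
function needsUpdate(localVersion: number, manifest: WeightsManifest): boolean {
  return manifest.version > localVersion;
}

// Fetch new weights only when the manifest advertises a newer version;
// saveToDisk is app-supplied and returns the local path of the saved file.
async function ensureWeights(
  localVersion: number,
  manifest: WeightsManifest,
  saveToDisk: (bytes: ArrayBuffer) => Promise<string>,
): Promise<string | null> {
  if (!needsUpdate(localVersion, manifest)) return null; // cached copy is current
  const res = await fetch(manifest.url);
  const bytes = await res.arrayBuffer();
  return saveToDisk(bytes);
}
```

The point is that the model file is just data on disk: ship a tiny version check, and the app picks up new weights without a store release.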


Neat, but that doesn’t really address my point. My point is that you still need to roll out changes, while LLM APIs just work.


Apologies for this, but you have nothing to worry about; no one is suing you. We are experimenting with the license and monetisation for corporations, not indie developers. Please keep using Cactus the way you want, and take this response as explicit permission while we go away and chew on your feedback.


Yes, you can fine-tune a model for any task. What do you have in mind?


You can add a web_search tool. Check out what these guys did with Cactus: https://anythingllm.com/mobile
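A rough sketch of what a web_search tool can look like, using the common JSON "function tool" pattern. The `Tool` interface, the search endpoint, and `runToolCall` are illustrative assumptions, not the Cactus tool-calling API.

```typescript
// Illustrative tool shape: a name and description the model sees,
// plus an executor the app runs when the model emits a tool call.
interface Tool {
  name: string;
  description: string;
  execute: (args: Record<string, string>) => Promise<string>;
}

// Example tool: query whatever search backend the app has access to
// (the URL here is a placeholder).
const webSearch: Tool = {
  name: "web_search",
  description: "Search the web and return top result snippets.",
  execute: async ({ query }) => {
    const res = await fetch(
      `https://example.com/search?q=${encodeURIComponent(query)}`,
    );
    return res.text();
  },
};

// Dispatch a model-emitted tool call to the matching registered tool.
async function runToolCall(
  tools: Tool[],
  call: { name: string; args: Record<string, string> },
): Promise<string> {
  const tool = tools.find((t) => t.name === call.name);
  if (!tool) throw new Error(`unknown tool: ${call.name}`);
  return tool.execute(call.args);
}
```

The on-device model only produces the tool call; the search request itself runs as ordinary app code, so a small local model can still answer questions that need fresh data.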


Thanks, I will take a look.


Ok, looking forward to it!


We host on HuggingFace. Were you able to get it to work eventually?


We are focused on phones, and we did add some benchmarks and will add more. However, anyone can see performance for themselves by running the repo directly.


Real-time video and audio inference.


This was one of the issues we set out to solve, so not as much as you’d expect.

