Hacker News

That’s actually an awesome idea, and it really helps cut down on wasted context: move repeatable instructions into a SKILL.md, and once they’re repeatable with no remaining variability in their input, turn them into a tool. Rinse, repeat.
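For illustration, a minimal SKILL.md could look something like this (the frontmatter layout follows Anthropic's Agent Skills convention of a `name` and `description` above markdown instructions; the specific skill and its steps are hypothetical):

```markdown
---
name: changelog-entry
description: Adds a formatted entry to CHANGELOG.md. Use when the user asks to record a change.
---

# Adding a changelog entry

1. Read CHANGELOG.md and find the "Unreleased" section.
2. Append a bullet under the matching category (Added / Changed / Fixed).
3. Keep the entry to one line, present tense, no trailing period.
```

The point of the pattern is that this text only enters the context window when the skill is actually triggered, instead of sitting in the system prompt on every request.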

Oh nice, and you could eventually turn the whole process, inference included, into an app, so the LLM is cut out of the loop entirely and you save execution time too.
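A toy sketch of what "cutting out the LLM" means in practice (the changelog task and helper name here are hypothetical): once a skill's steps are fully deterministic, the model call can be replaced with plain code that does the same thing every time.

```python
def add_changelog_entry(changelog: str, category: str, text: str) -> str:
    """Insert a one-line bullet under the given category heading.

    This is the deterministic endpoint of the skill -> tool -> app
    progression: no model call, just string manipulation.
    """
    out = []
    inserted = False
    for line in changelog.splitlines():
        out.append(line)
        # Match a "### Added" / "### Fixed" style heading, case-insensitively.
        if not inserted and line.strip().lower() == f"### {category}".lower():
            out.append(f"- {text}")
            inserted = True
    if not inserted:
        # Category heading missing: append it at the end.
        out += [f"### {category}", f"- {text}"]
    return "\n".join(out)
```

The trade-off is the usual one: you lose the flexibility to handle unanticipated inputs, but you gain speed, zero token cost, and reproducibility.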




