
Congrats on the launch. I think you should get the LLM to pre-compute the paths and store the results. Having it generate every branch of the story as you go presents a few problems:

- When it is busy (like it is now), everyone is just waiting for the next section of text.

- You burn tokens even when multiple people are following the same path.

That said, the way you have done it means each playthrough is roughly unique.
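
For the caching half of that suggestion, here is a minimal sketch of the idea, assuming a hypothetical generate_next_section() wrapper around whatever LLM call the app already makes, and an in-memory dict standing in for a real store (Redis, SQLite, etc.). The generated text is keyed by the ordered list of choices taken so far, so two readers on the same path only pay for one generation:

    # Minimal sketch: cache story sections by the path of choices taken.
    # generate_next_section is a hypothetical stand-in for the real LLM call.
    import hashlib
    import json
    from typing import Callable

    # In-memory cache; a real deployment would use Redis, SQLite, etc.
    _story_cache: dict[str, str] = {}

    def path_key(choices: list[str]) -> str:
        """Derive a stable cache key from the ordered list of choices."""
        return hashlib.sha256(json.dumps(choices).encode()).hexdigest()

    def next_section(choices: list[str],
                     generate_next_section: Callable[[list[str]], str]) -> str:
        """Return the next story section, generating it only on a cache miss."""
        key = path_key(choices)
        if key not in _story_cache:
            _story_cache[key] = generate_next_section(choices)
        return _story_cache[key]

Pre-computation then just becomes walking the choice tree ahead of the reader and warming the same cache, rather than generating on demand while people wait.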


