hey HN — roni here, full-stack dev out of india (ex-gsoc @ internet archive).
spent last weekend hacking on ZON-TS because json was torching half my openai/claude budget on redundant keys. i hit that wall hard while prototyping agent chains.
result: a tiny TS lib (<2kb, 100% test coverage) that shrinks payloads by ~50% (692 tokens vs 1300 in the gpt-5-nano benches) while staying human-readable and lossless, with no parse tax.
drop-in for the openai sdk, langchain, claude, llama.cpp, zod validation, streaming... just added a full langchain chain example to the readme (encode prompt → llm call → decode + validate, saves real $$ on subagent loops); there's a rough sketch of that loop below the quick-try snippet.
quick try:
```ts
// npm i zon-format
import { encode, decode } from 'zon-format';

const zon = encode({ foo: 'bar' }); // compact ZON string
console.log(decode(zon));           // { foo: 'bar' } back out, lossless
```
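and here's roughly what that encode → llm call → decode + validate loop looks like in practice (a sketch, not the readme's exact code; the openai client call, model name, and zod schema are placeholders i picked for illustration):

```ts
import OpenAI from 'openai';
import { z } from 'zod';
import { encode, decode } from 'zon-format';

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

// shape we expect back from the model (placeholder schema)
const Result = z.object({ summary: z.string(), score: z.number() });

async function runStep(context: Record<string, unknown>) {
  // 1. pack the structured context as ZON instead of raw json to cut tokens
  const packed = encode(context);

  // 2. one llm call, asking for a ZON reply with the keys the schema expects
  const res = await client.chat.completions.create({
    model: 'gpt-5-nano', // illustrative, swap in whatever model you're on
    messages: [
      { role: 'system', content: 'Reply in ZON with keys: summary, score.' },
      { role: 'user', content: packed },
    ],
  });

  // 3. decode the reply and validate it with zod before the next subagent hop
  return Result.parse(decode(res.choices[0].message.content ?? ''));
}
```

the saving compounds on subagent loops, since every hop ships the compact ZON string both ways instead of a bloated json blob.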
github → https://github.com/ZON-Format/ZON-TS
benches + site → https://zonformat.org
YC’s fall rfs nailed it: writing effective agent prompts is brutal when every token adds up. if you’re in a batch grinding on observability (helicone/lemma vibes) or hitting gemini limits like nessie did, what’s your biggest prompt bloat headache right now? paste a sample below and i’ll zon it live.
feedback (harsh ok) very welcome
cheaper tokens ftw