Hacker News | tpae's comments

Osaurus | Full-Time | Remote | Protocol Engineer

Osaurus is a native macOS AI platform. The core idea: inference is a commodity — everything else (context, memory, tools) should be owned by you, locally. Think of it as the runtime layer for AI on Mac that works with any provider (OpenAI, Anthropic, xAI, local models via MLX).

3.7K+ GitHub stars and 54K+ downloads with zero marketing spend. Open source, local-first, provider-agnostic.

Looking for a protocol engineer who wants to build agent-to-agent networking and communication infrastructure. You'd be the first hire.

What we work with: Swift, macOS native APIs, Apple Silicon optimization, provider-agnostic API layer, plugin architecture. You don't need Swift experience specifically — strong systems fundamentals and the ability to pick things up fast matter more.

If this sounds interesting, try the app (https://github.com/osaurus-ai/osaurus), share your GitHub, and tell me what you'd change.

[email protected]


The framing here assumes you need to ship native to all 3 platforms to justify leaving Electron. You don't.

I've been building a native macOS AI client in Swift — it's 15MB, provider-agnostic, and open source: https://github.com/dinoki-ai/osaurus

Committing to one platform well beats a mediocre Electron wrapper on all three.


As somebody who frequently switches between Windows and Linux, I will pretty much never install an app that's only on one platform. Cross-platform options mean I don't need to pay for or learn separate apps for the same thing on each OS.

I don't care if it's Qt or a WebView or something else, I just want to lower friction for myself.


Link gives 404


just fixed it


Yeah, like you don't need to write three different clients. You can write a native macOS client and ship your Electron client for the less relevant platforms.


You can check out my project here: https://github.com/dinoki-ai/osaurus

I'm focused on building it for the macOS ecosystem.


I'm working on https://github.com/dinoki-ai/osaurus, a native macOS LLM server with MCP support. Looking for more feedback!
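For anyone curious what talking to it looks like: since osaurus is described as provider-agnostic, a reasonable assumption is that it speaks the OpenAI-style chat completions protocol on localhost. The port (1337) and model name below are placeholders, not confirmed defaults — check your own install. A minimal sketch:

```python
import json
import urllib.request

# Assumed base URL for a local osaurus instance -- port is hypothetical.
BASE_URL = "http://127.0.0.1:1337/v1"

def build_chat_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build an OpenAI-style chat completion request for a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return f"{BASE_URL}/chat/completions", json.dumps(payload).encode()

url, body = build_chat_request("llama-3.2-3b", "Say hello in five words.")

# Uncomment to send against a running local server:
# req = urllib.request.Request(
#     url, data=body, headers={"Content-Type": "application/json"}
# )
# resp = json.load(urllib.request.urlopen(req))
# print(resp["choices"][0]["message"]["content"])
```

The point of the OpenAI-compatible shape is that existing clients and SDKs can be pointed at the local server just by swapping the base URL.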


This looks cool. I actually created Tweaks - https://tweaks.io/


Great name! I'm sure we'll be sending some mistaken traffic to each other


Your work is super awesome, love the osaurus tool!


I've been building with local AI on Apple Silicon. It's only 8 MB, but runs 30% faster than Ollama.

https://github.com/dinoki-ai/osaurus


did you really solo develop this entire application? including dinoki-ai which appears to be SAAS?


Not yet, but it can be supported in the future.


I just finished https://tweaks.io, it's open-source and MIT licensed. Let me know your thoughts!


I'm working on Osaurus - https://github.com/dinoki-ai/osaurus, a native, Apple Silicon–only local LLM server. Open source, MIT-licensed.


Check out Osaurus - a MIT-licensed, native, Apple Silicon–only local LLM server - https://github.com/dinoki-ai/osaurus


thank you

