When you turn on the exhaustive, exhaustruct and wrapcheck linters in golangci-lint, you get such a massive safety boost that it makes you fly through writing Go.
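To make that concrete, here's a small illustrative sketch (the Charge/Status types are made up) of the kind of code each of those linters flags:

```go
package payments

import "os"

type Status int

const (
	StatusPending Status = iota
	StatusPaid
	StatusRefunded
)

type Charge struct {
	ID     string
	Amount int64
	Status Status
}

func describe(s Status) string {
	// exhaustive: flags this switch because StatusRefunded is not handled.
	switch s {
	case StatusPending:
		return "pending"
	case StatusPaid:
		return "paid"
	}
	return "unknown"
}

func newCharge(id string) Charge {
	// exhaustruct: flags this literal because Amount and Status are omitted.
	return Charge{ID: id}
}

func loadKey(path string) ([]byte, error) {
	b, err := os.ReadFile(path)
	if err != nil {
		// wrapcheck: typically flags returning an error from another package
		// unwrapped; it wants e.g. fmt.Errorf("read key %s: %w", path, err).
		return nil, err
	}
	return b, nil
}
```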
I initially worked on a code generator for OpenAPI -> MCP in January, but we very quickly found issues relating to poor-quality operation (tool) names and descriptions. Not to mention that each API endpoint does not cleanly map to an MCP tool, and yet it all gets dumped into the context window - sometimes exhausting it if you have hundreds of schemas and endpoints.
Gram is our attempt to make better use of APIs by adding a curation layer:
- You upload your OpenAPI document (or any number of them).
- You then subset them into "toolsets" by selecting only the operations relevant to a domain (e.g. reading Stripe charges) or business process (e.g. understanding customer health by reading info from your CRM, data warehouse and so on).
- Optionally, you create custom tools (these are prompt templates under the hood) that describe how to make a series of tool calls to solve a problem.
- Finally, every toolset is automatically exposed as a hosted/managed MCP server. No waiting for build or deploy steps.
- You can edit the names and descriptions of all imported tools and they are instantly reflected in the MCP server.
The net result is you have a rapid iteration loop to create effective tools from your API.
I hope you have a chance to try it out. I'll be around to answer any questions in the meantime :)
I just love how this language marches forward. I have so many colleagues who hate many aspects of it, but I sit here combining Go, Goa and SQLc, writing mountains of code with a fairly good compiler behind me. I understand what I'm missing out on by not using stricter languages, and so often it's a totally fine trade-off.
Go is the only language where I've come back to a nontrivial source code after 10 years of letting it sit and have had zero problems building and running. That alone, for me, more than makes up for its idiosyncrasies.
As a more sysadmin/ops focused guy it really is the killer feature.
Static binaries and a more Java-esque resource profile than, say, Python are the cherries on top.
Okay, C++ is believable, but can you really build a Java / .NET project that was not touched for 20+ years with no changes to the code or the build process (while also using the latest version of the SDKs)?
I imagine you can _make_ a project compile with some amount of effort (thinking maybe a week at most) but they wouldn't be exactly "unzip the old archive and execute ./build.bat".
Yes, because Ant has existed since 2000, Maven since 2004, and MSBuild since 2003.
Before central package management was common practice, we used to store libraries (JARs and DLLs) directly in source control, in some libs folder.
Afterwards, even with central package management, enterprise software done right is not calling the Internet on every build; rather, there are internal repositories curated by legal and IT, and only those packages are allowed to be used in projects.
So the tooling has naturally been around for 20+ years; no one is doing YOLO project management when playing with customers' money.
As for the "...latest version of the SDKs...", that is moving the goalposts; there is no mention of it in:
> Go is the only language where I've come back to a nontrivial source code after 10 years of letting it sit and have had zero problems building and running. That alone, for me, more than makes up for its idiosyncrasies.
Ant and Maven have existed for a long time, but for me they didn't prevent Java (and other JVM language) projects from suffering significant bitrot in the build process.
For example, I worked on a project that just stopped being able to be built with Maven one day, with no changes to the JVM version, any of the dependencies, or the Maven version itself. After a while I gave up trying to figure it out, because the same project was able to be built with Gradle!
Older Scala projects were a pain in the ass to build because the Typesafe repositories stopped accepting plain HTTP connections, requiring obscure configuration changes to sbt. I've never had to deal with things like that in the world of Go.
> As for the "...latest version of the SDKs..", that is moving the goal posts, there is no mention of it on [...]
I thought it was implied since tooling & library breakages over the years happen and sometimes you can't just get the old SDK to run on the latest Windows / macOS. If the languages and Ant/Maven are backwards compatible to that extent, that's actually pretty good!
I had to deal with moving a .NET Framework 4.7 project to .NET Standard 2.0 and it wasn't effortless (although upgrading to each new .NET release after that has been pretty simple so far). We took a couple of weeks even though we had minimal dependencies since we're careful about that stuff.
This. Maintainability and refactorability are some of the major Go superpowers for me, enabling me to get into any code base and update it. They're supported by features like static typing, fast compile times, etc.
Of note, I've found this to be very important with AI-generated code, which becomes easy to grok and refactor.
In all fairness, 10 years ago the deps would have been vendored in, which sidesteps a whole set of problems if security, remote API version compatibility and new features are not a major need.
Yes, but with all the v2 packages popping up in the stdlib we will get a lot of outdated code and a lot of "I need to know v1 and v2, because I will come across both".
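math/rand is a concrete case of that split (it was the first stdlib package to get a v2) - a minimal sketch of reading both generations side by side:

```go
package main

import (
	"fmt"
	randv1 "math/rand"
	randv2 "math/rand/v2"
)

func main() {
	// v1: Intn, and older code usually seeded the global source explicitly.
	fmt.Println(randv1.Intn(10))
	// v2: renamed to IntN, no manual seeding of the global source.
	fmt.Println(randv2.IntN(10))
}
```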
Also, most of this can be automated with `go install golang.org/x/tools/gopls/internal/analysis/modernize/cmd/modernize@latest && modernize -fix ./...`
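A hedged example of the kind of rewrite a modernize-style fix aims for (the exact fixers depend on the tool version): interface{} becomes any, and a hand-rolled minimum becomes the built-in min.

```go
package example

// Before: pre-generics / pre-Go 1.21 style.
func smallestOld(xs []int) interface{} {
	m := xs[0]
	for _, x := range xs[1:] {
		if x < m {
			m = x
		}
	}
	return m
}

// After: the shape a modernize-style fix produces.
func smallestNew(xs []int) any {
	m := xs[0]
	for _, x := range xs[1:] {
		m = min(m, x)
	}
	return m
}
```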
The difference is that going back to Go code you've written a few years ago isn't nearly as bad as going back to Perl code you've written a few years ago!
And having written a lot of Common Lisp, I find Go code extraordinarily straightforward, in the sense that every developer writes in almost the exact same style.
This is not true for Common Lisp (even though it's not as bad as people make it out to be).
I feel the exact same way with C versus C++, even if I was the person to write the C++.
I've gotten used to golang, though it's still not my favourite language to program in by any stretch. One issue I've been having, though, is the documentation.
Documentation for third-party modules in Python is fantastic, almost universally so. In nearly every case of using a third-party library, large or small, there's sufficient documentation to get up and running.
Golang libraries, however, seem to be the opposite. In most cases there's either no documentation whatsoever on how to use things, or, more commonly, there is example code in the readme which is out of date and does not work at all.
The IDE integration with golang is great, and it makes some of this a bit easier, but I also still get a ton of situations where my editor will offer some field or function that looks like what I want (and is what I'm typing to see if it will autocomplete), but once I select it, it complains that there's no such field or function. Still haven't figured that out.
So yeah, I dunno. The language is 'great'; it certainly has some extreme strengths and conveniences, like the fact that 'run this function with these arguments in a separate thread' is a language keyword and not some deep dive into subprocess or threading or concurrent.futures; the fact that synchronization functionality is trivially easy to access; sync.Once feels so extremely obvious for a language where concurrency is king, and so on.
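A minimal sketch of what that buys you in practice - the go keyword plus sync.Once (the config map is just a stand-in):

```go
package main

import (
	"fmt"
	"sync"
)

var (
	configOnce sync.Once
	config     map[string]string
)

// loadConfig is safe to call from many goroutines; the Do body runs exactly once.
func loadConfig() map[string]string {
	configOnce.Do(func() {
		config = map[string]string{"env": "prod"} // stand-in for real work
	})
	return config
}

func main() {
	var wg sync.WaitGroup
	for i := 0; i < 3; i++ {
		wg.Add(1)
		// "run this function in a separate thread" is just the go keyword.
		go func(id int) {
			defer wg.Done()
			fmt.Println(id, loadConfig()["env"])
		}(i)
	}
	wg.Wait()
}
```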
Still, the ecosystem is... a bit of a mess, at the best of times. Good modules are great, all other modules are awful.
Generally gophers just use the standard library as much as possible. There isn't the usual set of "must-have" dependencies, and generally speaking when a gopher tries to solve a problem, the first step isn't to search for a 3rd party library that solves it for them.
Obviously this is a broad generalisation and there are plenty of gophers who swear by using one or more libraries, and there are plenty of gophers who do rely on third-party dependencies. But this is still noticeably less prevalent than in many other languages, especially the more popular ones in web dev.
As others have said, it also helps that Go code is easy to read and emphasises simplicity. The code is often more readable than the documentation, for sure. Whether you consider this bad documentation is up to you ;)
I quite frankly will just read the code. Go generally discourages abstractions so any code you jump into is fairly straightforward (compared to a hierarchy of abstract classes, dependency injected implementations, nested pattern matching with destructuring etc etc).
Regarding your IDE issues: I've found the new wave of Copilot/Cursor behavior to be the culprit. Sometimes I just disable it and use the agent if I want it to do something. But it'll completely fail to suggest an autocomplete for a method that absolutely exists.
> Go generally discourages abstractions so any code you jump into is fairly straightforward
This is a really anti-intellectual take. All of software engineering is about building abstractions. Not having abstractions makes the structure less easy to understand because they're made implicit, and forces developers to repeat themselves and use brittle hacks. It's not a way to build robust or maintainable software.
I think the more charitable interpretation is "Go generally discourages metaprogramming." Which I would agree with, and I think positively distinguishes it from most popular languages.
Go mostly only have abstractions that the language designers put into the language. It is (mostly) hostile to users defining their own new abstractions.
A case in point is that arrays and maps (and the 'make' function etc) were always generic, but as a user until fairly recently you couldn't define your own generic data structures and algorithms.
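For illustration, the kind of user-defined generic container that only became expressible with Go 1.18 type parameters (a throwaway sketch, not from any particular library):

```go
package stack

// Stack is a user-defined generic container - something you couldn't write
// yourself before Go 1.18, even though maps and slices were always generic.
type Stack[T any] struct {
	items []T
}

func (s *Stack[T]) Push(v T) {
	s.items = append(s.items, v)
}

func (s *Stack[T]) Pop() (T, bool) {
	var zero T
	if len(s.items) == 0 {
		return zero, false
	}
	v := s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return v, true
}
```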
Go discouraging abstractions is sorta just... wrong anyways. Go doesn't discourage building abstractions; it discourages building deep / layered abstractions.
That is a key point in my opinion. A typical stack trace of a Spring (Java) application can easily be 1000 to 2000 lines long. That is not so common in Go, as far as I know (I'm not a Go expert ...).
Not really, it's more like it encourages "wide" abstraction (lots of shallow abstractions) that get pieced together vs heavily nested abstractions that encapsulate other abstractions. It's a very imperative language.
Did you cherry-pick that part of the sentence and ignore "(compared to a hierarchy of abstract classes, dependency injected implementations, nested pattern matching with destructuring etc etc)" on purpose, or?
Of course, you'll probably retreat and say "Go is better for small projects", but every large project started as a small one, and it's really hard to justify rewriting a project in a new language in a business context.
You don't need a hierarchy of abstract classes, dependency injected implementations, nested pattern matching with destructuring, etc for any project. If one decides to implement these techniques in an ad-hoc basis in Go to solve problems, that's more to do with trying to apply principles and techniques from other languages in Go.
Really, there is nothing in the language that prevents you from creating crazy AbstractFactoryFactories or doing DI. What really prevents this is the community. In enterprise C# / Java, insanity is essentially mandated.
I enjoy the Go ecosystem quite a bit and haven't found many issues with documentation. I love how open source modules are documented on pkg.go.dev, including those from major providers, like AWS, Google, etc. Every library has the same references. When examples are useful, such as with charting modules, I've found that the projects do provide them. On the occasion where the README.md code is out of date, it's been easy for me to check pkg.go.dev and update it myself.
> my editor will offer some field or function that looks like what I want (and is what I'm typing to see if it will autocomplete) but once I select it it complains that there's no such field or function
Generally I find an updated example in one of the test files, or I can work out how to use a library by reading the test files in the repo. For me it's the opposite problem: Python documentation is too long in some cases, and it's not intuitive to find what I want if it's not trivial, so I have to resort to web search or an LLM.
Python package documentation is abysmal. It tends to read like a novel and yet still only covers surface-layer details with simplistic examples. It's next to impossible to just "get an overview" of what's available: just show me the modules, classes, functions, etc. Don't make me spend 30 minutes trying to find an explanation for that one function which just takes kwargs, which ends up only being covered in the footnote of some random page in the documentation on something otherwise completely unrelated.
I wrote a lot of Java in a past life, and the documentation situation is night and day, for sure. I think it's partly a syntax/tooling issue, and partly a cultural thing. Luckily Go's standard library (+ `/x/` modules) lets me avoid third-party dependencies in many cases. The documentation from the Go team is very good in my opinion.
This is so true and unfortunate, because golang has built-in Example functions that closely follow the test-function convention. It means that all that really needs to change is how godoc promotes or badges libraries with examples.
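For anyone who hasn't seen the convention, a minimal sketch (the greet module and its Hello function are hypothetical): `go test` compiles and runs it, asserting the `// Output:` comment against stdout, and godoc/pkg.go.dev renders it as a runnable example.

```go
package greet_test

import (
	"fmt"

	"example.com/greet" // hypothetical module path
)

func ExampleHello() {
	fmt.Println(greet.Hello("gopher"))
	// Output: Hello, gopher!
}
```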
I did not like it at first but it has grown on me. I still have my gripes, which are mostly things that come from its overall architecture and will never be resolved, but it is pretty enjoyable to use for the limited domain I use it in at work.
I know this won’t apply to everyone but I’ve been able to, relatively successfully, stick to the constraint that most of my dev tools are static binaries: zoxide, fzf, eza, btop and so on. There is one glaring exception which is docker but I let it pass.
By sticking to this I’m able to avoid Homebrew’s mess and Nix’s complexity. I use mise (https://mise.jdx.dev/) to manage the binaries and languages I have on my machine. It handles installing multiple versions and is directory-aware.
The introduction of the WWW-Authenticate challenge is so welcome. Now it's much clearer that the MCP server can punt the client to the resource provider's OAuth flow and sit back waiting for an `Authorization: Bearer ...`.
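Roughly what that looks like server-side - a sketch only, with a placeholder metadata URL, following the Bearer challenge / protected-resource-metadata pattern:

```go
package main

import (
	"log"
	"net/http"
)

// requireAuth rejects unauthenticated requests with a 401 and a
// WWW-Authenticate challenge pointing at protected-resource metadata.
// The metadata URL is a placeholder.
func requireAuth(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Header.Get("Authorization") == "" {
			w.Header().Set("WWW-Authenticate",
				`Bearer resource_metadata="https://mcp.example.com/.well-known/oauth-protected-resource"`)
			http.Error(w, "unauthorized", http.StatusUnauthorized)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/mcp", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok")) // stand-in for the actual MCP handler
	})
	log.Fatal(http.ListenAndServe(":8080", requireAuth(mux)))
}
```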
It's somewhat disappointing to see a bunch of "well, duh" comments here. We're often asking for research and citations and this seems like a useful entry in the corpus of "effects of AI usage on cognition".
On the topic itself, I am very cautious about my use of LLMs. It breaks down into three categories for me: 1. replacing Google, 2. getting a first review of my work, and 3. taking away mundane tasks around code editing.
Point 3 is where I can become most complacent and increasingly miscategorize tasks as mundane. I often reflect after a day working with an LLM on coding tasks because I want to understand how my behavior is changing in its presence. However, I do not have a proper framework to work out "did I get better because of it or not".
I still believe we need to get better as professionals and it worries me that even this virtue is called into question nowadays. Research like this will be helpful to me personally.
If you limit yourself to common programming languages and single-binary programs (e.g. ones written in Rust/Go) then I cannot recommend Mise enough to you. Absolute bliss compared to homebrew, asdf and other projects especially if you want multiple versions of tools in different projects.
Very similar in appearance to Redpanda Connect (Benthos), which isn't a bad thing at all. It would be good to elaborate on how error handling is done and what message delivery guarantees it comes with.