At any kind of formal dining? Yes, absolutely, I would expect there to be a bread roll & a pat of butter served at the beginning of the meal. Both in restaurants & formal dinners in my experience.
It's not an absolute rule though & you generally wouldn't expect bread to be served like this at home in the UK. I think the French are more likely to serve bread at home as well.
“All the ways GPT-5.3-Codex cheated while solving my challenges, progressively more insane:
It hardcoded specific types and shapes of test inputs into the supposed solution.
It caught exceptions so tests don't fail.
It probed tests with exceptions to determine expected behavior.
It used RTTI to determine which test it's in.
It probed tests with timeouts.
It used a global reference to count solution invocations.
It updated config files to increase the allocation limit.
It updated the allocation limit from within the solution.
It updated the tests so they would stop failing.
It combined multiple of the above.
It searched reflog for a solution.
It searched remote repos.
It searched my home folder.
It nuked the testing library so tests always pass.”
It seems that, unless you keep a close eye on them, the most recent Codex variants are prone to achieving the goals set for them by any means necessary — which is a bit concerning if you're worried about things like alignment.
Same. The Pixel 4a was the perfect phone for me: light, a screen exactly the right size to navigate with a single thumb whilst holding the phone in one hand, enough battery life, and small enough to fit comfortably in my jeans pockets.
But people buy big phones in preference to small ones, so that’s what Google & Apple manufacture. Nobody (from the POV of Apple/Google decision makers) buys these smaller phones.
> I've had an AMD card for the last 5 years, so I kinda just tuned out of local LLM releases because AMD seemed to abandon rocm for my card (6900xt) - Is AMD capable of anything these days?
Sure. Llama.cpp will happily run these kinds of LLMs using either HIP or Vulkan.
Vulkan is easier to get going using the Mesa OSS drivers under Linux; HIP might give you slightly better performance.
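For reference, a rough sketch of building llama.cpp with either backend. The CMake flag names below are taken from the project's build options at the time of writing and may change between releases, so treat them as assumptions and check the repo's build docs for your version:

```shell
# Hedged sketch: building llama.cpp for an AMD card (e.g. a 6900 XT).
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# Vulkan backend -- works with the Mesa RADV driver, no ROCm install needed:
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j

# HIP/ROCm backend instead (gfx1030 is the RX 6900 XT's GPU target):
# cmake -B build -DGGML_HIP=ON -DAMDGPU_TARGETS=gfx1030
# cmake --build build --config Release -j
```

The Vulkan route is usually the lower-friction one on a card ROCm no longer officially supports, since it rides on the ordinary Mesa graphics driver.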
Nah it’s massive overkill - much bigger and bulkier than it needs to be for mostly historical reasons. I’d take it over the US’s plug and socket but that's not saying much - there are plenty of better ones than either!
If Google and OpenAI and the rest said this as loudly as they praise their models, I would never write comments like that. But this is the fine print, buried somewhere. And so we need to bring it up, because, lo and behold, it matters.
If you look at their comment history it's quite clear that's what they are.
What's the HN stance on AI bots? To me it just seems rude - this is a space for people to discuss topics that interest them & AI contributions just add noise.
(It’s the bacteria in the roots that do the actual nitrogen chemistry.)