
I'd love for someone to give me an alternative to CUDA, but I don't primarily use GPUs for inference; I do 64-bit unsigned integer workloads, and the only people who seem to care even a little about this currently are NVidia, if imperfectly.

I _really_ want an alternative, but the architecture churn imposed by targeting ROCm for, say, an MI350X is brutal. The way their wavefronts and everything else work is different enough that if you're trying to get last-mile perf (which for GPUs unfortunately yawns back into the 2-5x stretch), you're eating a lot of pain to get the same cost-efficiency out of AMD hardware.

FPGAs aren't really any more cost-effective unless the $/kWh goes into the stratosphere, which is a hypothetical I don't care to contemplate.


That's new to me -- what sorts of workloads are centered on 64-bit uints?


cryptography

*ghostty


Ghostty is very janky (though less on Mac than on Linux). It’s promising but needs a lot of polish. So in the meantime I reverted to iTerm.


>If you wanted to build a "scripting language" version of Rust, you could probably lose (2).

Not really, no. I work on an interpreted-language runtime in Rust professionally, and Rust is still a huge help even though you're eating perf pain on the interpreted language itself for the same reasons everyone else does. There's more benefit to Rust than you're capturing here, but that's to be expected; it's a short comment.

Here are some other things we get from using Rust for interpreted languages:

- The `unsafe` parts are relatively limited in scope and we have much better and more automated verification tools for `unsafe`, especially WRT undefined behavior

- Being able to make large architectural changes and work through them mechanically/quickly because of the type-checking and borrow-checking is absurdly powerful for any project, all the more so in this context.

- The library ecosystem for Rust has been fantastic for all kinds of projects but it's especially good for PL runtimes.

- LLMs are a lot better at Rust than at other programming languages. I have a lot of experience using LLMs across a variety of domains and programming languages, and they're far better at Rust than at anything else that's expressly about programming. Arguably they're even better at Terraform and Ansible, but I consider that a different category. Controversial point maybe, but I get tremendous yield out of it.

- It's not just that Rust is fast. It is on par w/ C/C++ all else being equal. What's significant here is that it is a _lot_ quicker/easier to hit the 80/20 perf targets as well as the bleeding edge performance frontier in a Rust application than it is in C and C++. A lot of C and C++ projects leave performance on the table either because it's too hard to make the ownership model human-maintainable/correct or because it would be too much work to refactor for the hoped-for perf yield. Not as much an issue in Rust. You can gin up hypothetical perf improvements in Rust with gpt-5 lickety-split and the types/borrowck will catch most problems while the agent is iterating.

Shared, mutable data isn't really banned; we use it strategically in our Rust interpreter. It's just not permitted by default. Aliasing is precisely the distinction between a safe reference and an unsafe pointer in Rust: aliasing mutable raw pointers isn't UB, it just takes `unsafe` to dereference them. OTOH, aliasing a mutable reference _is_ UB and not allowed in Rust. Miri will catch you if you do this.
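
To make that concrete, here's a minimal, illustrative sketch of the distinction (nothing from our runtime, just the textbook case):

    use std::ptr;

    fn main() {
        let mut x: u64 = 0;

        // Two aliasing *raw* pointers to the same u64. Dereferencing them
        // requires `unsafe`, but the aliasing itself is not UB, and Miri
        // (`cargo miri run`) is happy with this program.
        let p1: *mut u64 = ptr::addr_of_mut!(x);
        let p2: *mut u64 = p1;
        unsafe {
            *p1 += 1;
            *p2 += 1;
        }
        assert_eq!(x, 2);

        // By contrast, two simultaneously live `&mut u64` to the same
        // memory are UB. Safe code can't even spell it (borrowck rejects
        // it), and if you conjure it through raw pointers, Miri flags it.
    }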

On top of all that, you have some nice kit for experimenting with JIT like Cranelift.


> You can gin up hypothetical perf improvements in Rust with gpt-5 lickety-split and the types/borrowck will catch most problems while the agent is iterating.

I am a huge Rust fan, but never really got a chance to write it in the modern LLM era. It makes absolute sense that the borrow checker would make LLM agent-driven refactors easier.


Debugging the Rust is the easy part. I write vanilla CUDA code that integrates with Rust, and that's the hard part. Abstracting over the GPU backend with more Rust isn't a big deal; most of it's SPIR-V anyway. I'm planning to stick with vanilla CUDA integrating with Rust via FFI for now, but I'm eyeing this project since it could give me some options for a more maintainable and testable stack.
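
For anyone curious what that FFI seam looks like, here's a rough sketch of the Rust side; the library name, wrapper function, and signature are hypothetical placeholders, not my actual setup:

    // The real work lives in a separately compiled .cu file that exposes
    // a C-ABI launch wrapper; `kernels` and `launch_u64_kernel` are
    // made-up names for illustration.
    #[link(name = "kernels", kind = "static")]
    extern "C" {
        // Wrapper that allocates device buffers, copies `len` u64s in,
        // launches the kernel, copies the results back, and returns a
        // cudaError_t-style status code (0 == success).
        fn launch_u64_kernel(input: *const u64, output: *mut u64, len: usize) -> i32;
    }

    fn run_on_gpu(input: &[u64]) -> Result<Vec<u64>, i32> {
        let mut output = vec![0u64; input.len()];
        // Safety: both pointers are valid for `len` elements for the
        // duration of the call; the wrapper synchronizes before returning.
        let status = unsafe {
            launch_u64_kernel(input.as_ptr(), output.as_mut_ptr(), input.len())
        };
        if status == 0 { Ok(output) } else { Err(status) }
    }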


"Quake terminal" has been a term of art for a regular terminal that drops down from the top of the screen like the terminal in Quake did for the last 20-22 years. I remember this being a thing when I started using Linux back then.

https://github.com/Guake/guake

https://babbagefiles.xyz/quake-drop-down-terminal-history/

Yakuake, Kuake (2003), etc.


Oh, interesting. I'm surprised I didn't already know that. Perhaps every time I encountered the term "quake terminal" I just assumed it literally meant the Quake terminal proper.

I'll let my original comment stand since it seems likely that at least some others will have the same impedance mismatch as I did.

In any case, thanks for pointing it out.


No worries, mate! When the popular narrative is "you can run Doom on a pregnancy test" your blind spot tracks.

By the bye, I know Foone hates their threads being posted here, so I won't, but I absolutely adore their hardware and software finagling. It's also made me a better developer!


I’ve been using Linux since 1999 (thanks for the Halloween memos, Microsoft!) and this is the first I’m hearing of Quake terminals. I would also have assumed it involved the actual game somehow.


I think the much more common name is the console? I never heard about a Quake “terminal” tbh. It has always been the console, where you input console commands.


Interesting. Pretty sure it was called the 'console' in Quake. Guess it's a microscopic step from 'console' to 'terminal'.


Yes, I remember yakuake. Today I use tilda[1]; it is really nice.

[1] https://github.com/lanoxx/tilda


TIL, and TIL that Windows Terminal supports it. By default it uses the tilde key, like the real one, which isn't available on my keyboard. Pain.


It never stopped being a thing!! :)


I'm also curious because I've (recently) compiled more or less identical programs in Zig and Rust and they took the same amount of time to compile. I'm guessing people are just making Zig programs with less code and fewer dependencies and not really comparing apples to apples.


Zig is starting to migrate to custom backends for debug builds (instead of using LLVM) plus incremental compilation.

All Zig code is built in a single compilation unit and everything is compiled from scratch every time you change something, including all dependencies and all the parts of the stdlib that you use in your project.

So you've been comparing Zig rebuilds that do all the work every time with Rust rebuilds that cache all dependencies.

Once incremental is fully released you will see instant rebuilds.


When does this land in Zig? Will aarch64 be supported?


When targeting x86_64, the self-hosted backend is already enabled by default on the latest builds of Zig (when compiling in Debug mode). The self-hosted aarch64 backend currently isn't generally usable (so we still default to LLVM when targeting aarch64), but it's likely to be the next ISA we focus on codegen for.


I assume x86_64 is Linux-only, correct?


Not quite: any ELF or Mach-O target is enabled by default already. Windows is waiting on some COFF linker bug fixes.


I was comparing end-to-end builds. I pay more attention to build times than someone who doesn't do it for a living typically does.

https://bitemyapp.com/blog/rebuilding-rust-leptos-quickly/

https://old.reddit.com/r/rust/comments/1i2pr2e/improve_rust_...

https://old.reddit.com/r/rust/comments/ua09tc/experimental_f...

https://old.reddit.com/r/rust/comments/1k9ihhn/does_breaking...

https://old.reddit.com/r/rust/comments/x9z4cm/speeding_up_in...

https://old.reddit.com/r/rust/comments/rlszeq/the_best_cpu_f...

https://old.reddit.com/r/rust/comments/1hpuy01/why_you_need_...

https://old.reddit.com/r/rust/comments/1h9bdbr/rust_llvm_by_...

https://old.reddit.com/r/rust/comments/1j1rvy1/help_me_under...

Anyhoodle, I'm looking forward to testing the compile times again when Zig says the build times are fast now.

But if we're talking about incrementality, I look forward to testing that too when it's ready. I've got a Bazel build for work that pulls together Rust along with a custom (very slow) compiler. The cached successful pipelines take 2-3 minutes (mostly the benchmark run), and the ones that churn the non-Rust parts take 15-18 minutes, of which about 10-11 minutes is just compiling the non-Rust code.


I'm a fan, but Bill Clinton was sweeping away a lot of Cold War apparatus that no longer seemed necessary. It's going to be a lot harder to do anything sustainable or worthwhile with our current system without entitlement reform.


I remember Bill Clinton made some significant changes to entitlements in terms of unemployment benefits. At the time it was kind of low hanging fruit. I'm not sure what you'd go for these days but the US medical system seems massively expensive and somewhat inefficient.


Or, you know, we could roll back the massive tax cuts passed since his presidency.

In the last half century, every Republican administration (except George Bush Sr.) passed huge tax cut bills, and for some reason, everyone here acts as if they are some edicts from above that cannot be touched.

It's like the federal government quit its tech job to go work fast food and became convinced the only solution to its debt is to start living in a tent under a nearby bridge rather than try getting the tech job back. The current administration's policies are similar to selling off the tent and using a newspaper for cover because it would rather start panhandling than continue working a real job.


Foster was basically the rallying point for people opposed to the grammarian methods of teaching languages that started in Classics but ended up taking over how foreign language is taught in most schools and contexts. Virtually everyone actually fluent in Latin today (reading, listening, or speaking) either learned directly from him, learned from a tutor using Ossa Latinitatis Sola, or was downstream of that.

Striking contrast with the most well known classicist in the UK being unable, by their own admission, to comfortably read Latin text basically at all.

Abandoning the old ways has cost us a lot in almost every area of human endeavour. Especially in pedagogy.


> Striking contrast with the most well known classicist in the UK being unable, by their own admission, to comfortably read Latin text basically at all.

That's hard to believe. A friend was a Latin teacher; high school students read actual Roman Latin in their second year.

I've heard that few can speak Latin 'correctly', because the skill is almost useless - you can't talk to Romans or almost anyone else; it's all written. (I don't know about the Catholic or other churches, but I do recall that 'church Latin' differs from classical Latin.)


Second-year high school students do read actual Roman texts, but they typically do so very slowly and laboriously - a day’s homework might be translating a single paragraph.

I studied Latin from 7th grade through my early undergraduate years (1990s to early 00s), and that dynamic didn’t change as much as you might expect - the focus remains on deeply reading a few texts, rather than building the fluency required to quickly read and understand new texts on unfamiliar subjects. The corpus of texts for standardized exams is also relatively small and well-known - I didn’t see a single unfamiliar passage on either AP Latin exam.

Perhaps some classics professors read Latin as fluently as the average Spanish literature professor reads a Madrid newspaper, but I certainly never met any outside Reginaldus’s orbit.


The Latin teacher at my school and my French teacher would discuss private matters in Latin, confident that we 13-year-olds wouldn't understand.

I've no proof, but my assumption is there are students of Latin casually speaking the language to show off at places like Cambridge University.


Did the Latin students have a rivalry with the goth kids and the vandals?


So good!


How could you not gain that fluency after years? Every human naturally learns languages; you don't need a Ph.D. at all.


I think it's because it's "classics." Your first couple of years excepted, everyone hears about a text years before they read it. By the time they read it, they already know a lot about it, and they read it closely and systematically to get a deep understanding of it.

There isn't a firehose of new text being created in Latin, and you never (or very rarely) scan over something to find out what it's about, extract a quick fact from it, or decide if it's worth reading. You know what's in it, you know the standard take-away from it, there's a good chance you've read the highlights in translation already, you may even know one or two hair-splitting academic controversies about it, and you are sitting down for a good hour or several hours with it. It's a completely different kind of reading from scanning a web site or a newspaper to find something worth reading more closely, looking for the answer to a concrete question, or scanning something to decide if you can afford to not really read it.


It’s certainly possible to gain that fluency, as Reginaldus demonstrated. But it seemed to me that fluency reading unfamiliar texts simply wasn't the goal of my Latin education; instead, we were studying to know Catullus, Ovid, Horace, Cicero, and Vergil, with a small smattering of other Roman authors. It was an education in classics, not the Latin language. We just weren’t asked to extract information from large volumes of text, speak extemporaneously, or comprehend casual conversation.

The best analogy I can give is this: imagine taking Spanish from grades 7-12, culminating in a full year reading and understanding selections of Don Quixote. The entire curriculum builds towards this capstone year, and other areas of inquiry get very short shrift. Nobody cares if you can live comfortably in a Spanish-speaking country or watch Spanish-language TV. Nobody cares about modern idiom, or any more recent works of literature, or technical writing. s/Don Quixote/Aeneid + a small corpus of Roman poems/g and you have the bulk of my Latin education.

This sounds negative - we weren’t fluent in Latin! But for a teenager, it was a wonderfully deep exploration of Rome’s greatest hits. I loved it.


>Foster was basically the rallying point for people opposed to the grammarian methods of teaching languages that started in Classics but ended up taking over how foreign language is taught in most schools and contexts

Humans naturally learn languages when they are immersed in the language. It sounds like Latin instruction was more focused on rules, and didn't provide that immersion before Foster. I can attest that many other foreign language classes also don't provide enough immersion to really learn the language, although being limited to ~10 hours a week makes that virtually impossible.


Ph.D.'s are rather immersed. Imagine how much time you spend staring at Latin texts over many years.


It's because you only ever translate but never speak or synthesize Latin, except in a few church circles where it is or was used as a lingua franca (as depicted in Conclave last year). I understand the original post to be about the profound difference this makes in acquiring a language intuitively.


I'm skeptical how much speaking/synthesizing the language matters if you only care about reading.

I can read German moderately well (can get through newspaper articles pretty easily, and novels with some effort), but I have very little ability to synthesize it (it'd take me quite a lot of effort to construct a sentence in writing, and I can't really speak at all). But the lack of ability to produce the language doesn't seem to negatively impact my reading ability.


And this is the case for most scholars of ancient languages besides Latin and Ancient Greek. While those two big ones get the occasional translation of a modern work like Harry Potter or The Hobbit, nobody is writing new works in Sumerian or Middle Egyptian, although reading existing works is what these scholars do.


> Every human naturally learns languages

... in the first few years of life. Beyond that, it's an intentional, conscious and often challenging effort for many.

Some people, even as adults, are far more adept at learning new languages than others. For the rest of us, it typically requires devotion to the subject for years.


Small children spend years of devoted effort learning the language. Being unable to communicate your basic needs is a strong motivator.


Yes, though what I meant (but neglected to specify) was devoted, conscious study. Young children do not learn language the way adults typically do.


> I've heard that few can speak Latin 'correctly', because the skill is almost useless - you can't talk to Romans or almost anyone else; it's all written.

Because Latin has died out as a spoken language, it doesn't really change over time like modern languages do. If you find a sentence written 2000 years ago and another elsewhere written 1500 years ago, it's likely they mean the exact same thing.

"Latin is a dead language" is actually a positive statement about the continued use of Latin, especially in the church; so much of the writing of the early church and the church fathers was in Latin, and we can know that we're interpreting it faithfully (or at least as faithfully as we have done for centuries) because the language is static.


While Latin has indeed evolved very little after it stopped being a native language, its vocabulary continued to expand until two centuries ago.

Until around the beginning of the 19th century, Latin had remained the most important language for the publication of scientific works and for international correspondence between well-educated people, and during this time many words have been added for naming things unknown to the Romans.

Also the preference for various grammatical variants or for certain word orders has been strongly influenced by some features common to the evolution of European languages, so a Latin text written during the Middle Ages feels quite different from a text written during the Roman Empire.


The use of it by the Catholic Church means its vocabulary continues to expand. After all, it’s necessary to speak of the modern world in church documents whose official version will be in Latin.


> Latin had remained the most important language for the publication of scientific works and for international correspondence between well-educated people

... and important to the Catholic Church.


That is right, but while I have found it very useful even today to read in the original the works of Georg Bauer, Newton, von Linné, Gauss and the like, there is much less interest in reading the many ecclesiastic documents that treated subjects with only a limited temporal relevance (unless you are a historian of that time period).

In general, I strongly recommend reading carefully, in the original, the scientific literature of the 17th, 18th and 19th centuries, even if that requires the ability to read at least Latin, German, French and English, because by reading the original sources you frequently find that the authors said things quite different (and wiser) from what many university textbooks or popular science books claim they said.

In science and technology, there is very little that becomes truly obsolete, because the optimal solutions to practical problems often cycle through the space of solutions over the years, depending on how the balance between various advantages and disadvantages shifts with the evolution of the available technologies. So those who believe that it is enough to read the up-to-date literature are typically wrong, because tomorrow's miracle new solution is frequently the same one that was best 50 years ago, or even earlier, but has since been forgotten.


> there is much less interest in reading the many ecclesiastic documents [...] In science and technology, there is very little that becomes truly obsolete

Documents that contain administrative minutiae or legal rulings or whatever may only have value for historical study, yes, but one major reason for the very existence and authority of the Catholic Church is to serve as guardian of her doctrines and their development, and to communicate them faithfully across generations. Meaning, the doctrines of the faith are never made obsolete, or else the faith, and certainly the authority of the Church, is undermined. The understanding of them can be deepened and expanded over time, but the doctrines themselves are fixed.


That is great; thanks.

I think of it as reading the very best writing, e.g., Charles Darwin, and not just another paper or book. Wouldn't you love to have a conversation with Darwin? That's what you get when you engage with their writing.


The Vatican publishes a newspaper which, among other things, has a sports page, written in Latin.


To add a bit of detail: At least in English etymologies, there are significant differences between classical Latin and post-classical Latin.

But post-classical Latin unhelpfully covers Rome from ~200 CE into the 20th century, including the Catholic Church and all those scholars and scientists. I'm not sure what differences arose before or after the fall of Rome in 476 CE, which began the Middle Ages.


There were shifts in the meaning of words as well as shifts in some grammatical structures. English has seen similar shifts in meaning, e.g., villain originally meant a person from a village.


> Because Latin has died out as a spoken language

It evolved into Italian, Spanish, etc:

* https://en.wikipedia.org/wiki/Romance_languages


With an intermediate stage of “Vulgar Latin” of which little trace remains because it was predominantly a spoken language. Literate people, even if they might use Vulgar Latin themselves in conversation, generally chose to write in the more formal classical style. As I recall, most of what we have from Vulgar Latin is in the form of graffiti and other informal writing.


> I've heard that few can speak Latin 'correctly', because the skill is almost useless

Not useless at all - speaking Latin helps you to better appreciate both prose and poetry. Understanding the sound of the language helps you to appreciate the word play and nuance. Also as children we learn language mostly by listening and speaking, not by reading, so it makes sense to learn Latin in that way.

There's been significant research on reconstructing classical pronunciation. But Latin was spoken as a primary language for over a thousand years, so the pronunciation naturally changed over that time and there were of course regional dialects - some of which evolved into Romance languages.

Latin doesn't have a lot of silent letters (it does have some), so it's quite easy to read a Latin sentence aloud once you understand the basic phonetics. In classical times poems like the Aeneid were recited aloud, so doing so today makes sense.

Fluency is a somewhat subjective concept, but the growth of the internet has spawned a growing community of Latin speakers internationally. (I speak Latin at roughly a B2 level and am constantly improving).


I think it's a bit out of context. I think they are referring to Mary Beard, who is a classicist / historian who said her Latin wasn't that good, but may have been exaggerating because she was IIRC arguing against gatekeeping in history (like saying physicists don't need advanced math, because Einstein wasn't the best at math compared to a few other top theoretical physicists).


and her standard of "good" is probably quite high.


I guess that one would have to know what "comfortably" means and what sort of texts. At the speed of English? Caesar or Tacitus?

The essayist Sydney Smith, himself an Anglican clergyman, said something teasing about "false quantities" in Roman Catholic services. I can tell you that the pronunciation varies in church Latin: c and g can be "softened" when followed by e or i; v is v, not w.

You don't hear a great deal of Latin in Catholic services these days: in the Tridentine rite the congregation doesn't get much to say. The Novus Ordo Latin Mass is awfully rare.


>That's hard to believe.

I understand why you'd feel that way, but classics departments aren't what they used to be. These days it's pretty common for even elite universities not to require grad students to understand, across the board, the languages of the cultures they purportedly study, let alone Latin.

https://blogicarian.blogspot.com/2019/03/argumentum-ad-ignor...


I was a high school student studying Latin. Like almost all high school language students, we could not read fluently. It took a long time and potentially many trips to the dictionary.


They're referring to this story from a few years back:

Latin as She is Spoke: How Classicists Tricked Themselves Jan 2022 (171 points, 191 comments)

https://news.ycombinator.com/item?id=30124101


It is fascinating that a language that still has so much study devoted to it has died right in front of our eyes.


But among the dead languages it's one of the liveliest.


> Striking contrast with the most well known classicist in the UK being unable, by their own admission, to comfortably read Latin text basically at all.

Sorry, what? Who is this? Even the PhD students I knew in classics, the ones who were specializing in history or literature, were comfortable reading texts written during their time periods of interest.


https://blogicarian.blogspot.com/2019/03/argumentum-ad-ignor... — per this blog post (which takes https://www.the-tls.co.uk/regular-features/mary-beard-a-dons... as the starting point), most classicists “can't sight-read a complex Latin text all that well” [unlike medievalists and Renaissance scholars].


Granted I’m 35 years out of undergrad, but my classics professors could most definitely sight-read complex Latin just fine.


I've written parsers and compilers in Rust. I used DAGs in a fairly standard way for the stuff that needs them, like references. I also work on a complicated VM with a custom memory allocation strategy professionally. Some of my other Rust projects include: a frontend web application that runs in the browser, a client library that has facades for Node, Python, and Ruby, and a Kafka consumer that handles ~20 GiB/second of data throughput with relatively modest resourcing.

What he's saying here doesn't make any sense. It sounds like they threw someone who doesn't know Rust at all, and who didn't bother to ask any questions or reference any existing code, into writing custom memory-management strategies and data structures, and then bounced off the surface. That isn't how you do things in any language; it's bizarre and sounds like Rust was set up to fail. I wouldn't expect this scenario to succeed in any language on a non-trivial project like the TypeScript compiler.

What's even more bizarre is that TypeScript actually has better support for ADTs than Golang (which is especially impactful when writing things like type checkers and compilers). I don't even _like_ TypeScript and I can see that. I've written 5-figures-ish lines of Golang for some work projects, like a custom Terraform provider, and it's horrific to model even a JSON schema in Golang's type system. Some of the problem is Hashicorp's terrible SDK, but not all of it by any means.
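
To make "ADT" concrete, here's a throwaway Rust sketch (Rust enums and TypeScript's discriminated unions both give you this shape; Go has no direct equivalent, so you end up encoding it with interfaces and nil checks):

    // A JSON-ish value as an algebraic data type. Every variant is
    // explicit and `match` is checked for exhaustiveness, which is
    // exactly what you want in a type checker or compiler.
    enum Json {
        Null,
        Bool(bool),
        Number(f64),
        Str(String),
        Array(Vec<Json>),
        Object(Vec<(String, Json)>),
    }

    fn type_name(v: &Json) -> &'static str {
        match v {
            Json::Null => "null",
            Json::Bool(_) => "boolean",
            Json::Number(_) => "number",
            Json::Str(_) => "string",
            Json::Array(_) => "array",
            Json::Object(_) => "object",
        }
    }

    fn main() {
        let v = Json::Array(vec![Json::Bool(true), Json::Number(1.0)]);
        assert_eq!(type_name(&v), "array");
    }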

Usually the problem is someone just not knowing how to write Rust. The modal subjective experience of writing Rust as an experienced user is "Python with nice types and async that executes fast." If you need to handle stuff with weird non-deterministic lifetimes, you can use crates like `slotmap`. If you need DAGs, use `petgraph` (see the sketch below). I think pitching Rust as a "low-level systems language" might be misleading some people about how ~95% of Rust users actually work in the language.
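
A rough sketch of what I mean by those two crates (assumes `slotmap` and `petgraph` in Cargo.toml; the interpreter-flavored names are just illustrative):

    use petgraph::algo::toposort;
    use petgraph::graph::DiGraph;
    use slotmap::{DefaultKey, SlotMap};

    fn main() {
        // slotmap: stable keys instead of lifetimes for objects that come
        // and go at unpredictable times (e.g. interpreter heap objects).
        let mut objects: SlotMap<DefaultKey, String> = SlotMap::new();
        let a = objects.insert("closure".to_string());
        let b = objects.insert("string".to_string());
        objects.remove(a); // `a` is now a stale key, not a dangling pointer
        assert!(objects.get(a).is_none());
        assert_eq!(objects[b], "string");

        // petgraph: an explicit graph instead of hand-rolled pointer
        // soup, e.g. module or type dependencies that must stay acyclic.
        let mut deps = DiGraph::<&str, ()>::new();
        let core = deps.add_node("core");
        let parser = deps.add_node("parser");
        deps.add_edge(parser, core, ()); // parser depends on core
        let order = toposort(&deps, None).expect("dependency cycle");
        assert_eq!(order, vec![parser, core]);
    }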


Let's face it, Rust is better for some things (low-level programming) than others (not-so-low-level programming).

You pay a price in accidental complexity when using Rust and accordingly get some benefit for it (performance).

And hearing this kind of reasoning behind the choice from Anders Hejlsberg (creator of C# & TypeScript) makes me more of a fan of him.


The Mega Drive is probably the earliest that was really capable of it, but the Namco System 21 and especially the Sega Model 1 (1990) were designed with 3D/polygons in mind yet have relatively old chips in them. Programming for those things could not have been easy.

