Hacker News | _jyty's comments

As someone who isn't in academia, I've heard of this being a problem before, but for research related to computer science, it seems like private labs at companies like Microsoft might be a better fit. A lot of interesting research comes out of Microsoft, and I don't think they have a problem of over-incentivizing publication speed. That said, I'm not in academia (and have never done research) nor employed by Microsoft; I'm an undergraduate in computer science. Just speculating. Do you think this is plausible, or is it way off?


This article about async in Python helped me understand it pretty well, since it explains async/await in terms of coroutines, which are very intuitive to me: https://mleue.com/posts/yield-to-async-await/

Another thing that helps me get it is comparing it to continuation-passing style, where you never return from a function: you just take an argument that's basically a function pointer bound to an environment, and at the end of the function, instead of returning, you call that input function, giving it another function and environment as input, repeating the cycle. It's very similar to the transformation of the callbacks-within-callbacks pattern in JavaScript into the async/await pattern.
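To make the comparison concrete, here's a minimal Python sketch I'd use (the function names and the string "I/O" are just illustrative stand-ins, not anyone's real API): the same two-step pipeline written once in continuation-passing style and once with async/await.

```python
import asyncio

# Continuation-passing style: instead of returning, each step
# receives the "rest of the program" as a callback and invokes it.
def fetch_cps(url, k):
    data = f"body of {url}"   # stand-in for real I/O
    k(data)                   # "return" by calling the continuation

def parse_cps(data, k):
    k(data.upper())

results = []
# Nested callbacks: the continuation of fetch is parse, whose
# continuation is results.append -- the JavaScript "pyramid of doom".
fetch_cps("example.com", lambda d: parse_cps(d, results.append))

# The same pipeline with async/await: the runtime threads the
# continuation for you, so the code reads top-to-bottom again.
async def fetch(url):
    return f"body of {url}"

async def main():
    data = await fetch("example.com")
    results.append(data.upper())

asyncio.run(main())
print(results)  # both styles append the same value
```

Both styles compute the same thing; async/await just hides the explicit continuation argument.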


The thing is, we tried coroutines in C (embedded) y-e-a-r-s ago. It was all the rage for a bit. There were a couple of different macro libraries you could use, with Duff's device and other trickery, to get coroutine-ish things in C.

Maybe the implementation just wasn't up to where it needed to be compared with these newer/slicker/more integrated versions, but mine (and others') issues with them wasn't the weakness/caveats of the implementation, but rather with the mess of spaghetti it made as your coroutine use grew with any degree. In onesie-twosies, under nice demo cases (look ma, I get some data from the intertubes with this synchronous-looking thing), they're great, but my experience was that they're a mess when scaled.

I'm happy to be proven wrong. I get to use them a bunch in Kotlin, and I'm trying not to be a victim of my own experience. I'm still on the fence.


Those libraries were always somewhat of a hack. Async Rust is an official language-backed syntax.


OP did say:

> but mine (and others') issues with them wasn't the weakness/caveats of the implementation, but rather with the mess of spaghetti it made as your coroutine use grew with any degree.


Once you realize async/await is just sugar over the familiar callback hell, a lot of the mystery fades away and it's easier to grok.


Rust async is a bit different from other languages'. It's more like sugar over state machines rather than sugar over callbacks.

This is what makes it work nicely on embedded. The compiler-generated state machines are structs with fixed size so they can be statically allocated. Callbacks would have to be heap-allocated and garbage-collected/refcounted.
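As a rough illustration of what "sugar over state machines" means, here's a hand-rolled Python analogy of what a compiler generates when it lowers an `async fn` (the class and method names are made up for the example): all locals become fields of one fixed-size object, and each await point becomes a numbered state.

```python
# A sentinel standing in for Rust's Poll::Pending.
PENDING = object()

class Add10Future:
    """Desugared form of roughly: async fn add10(x) { let y = ready(x).await; y + 10 }."""
    def __init__(self, x):
        self.state = 0
        self.x = x          # locals live in the struct, not on a stack

    def poll(self):
        if self.state == 0:
            self.state = 1      # suspend at the first await point
            return PENDING      # Poll::Pending
        elif self.state == 1:
            self.state = 2      # the awaited value is "ready" now
            return self.x + 10  # Poll::Ready(value)
        raise RuntimeError("polled after completion")

fut = Add10Future(32)
assert fut.poll() is PENDING   # first poll: not ready yet
result = fut.poll()            # second poll: completes with 42
```

Because `Add10Future` has a known, fixed set of fields, it's the kind of object that can be statically allocated, which is the point being made about embedded.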

(disclaimer: Embassy maintainer here)


> It's more like sugar over state machines instead of sugar over callbacks

They are equivalent [1]. There are Scheme compilers (a language with first-class continuations and often heap-allocated stack frames) that compile everything down to a giant C switch statement.

[1] well, continuations are strictly more powerful of course, but the stackless subset needed for async/await is the same.


This misses the actual async part, which is more like polling a task queue. Callback-hell sugar is only a thing in languages that already have event loops built in (i.e. JavaScript, which I assume is also what you're referring to)
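"Polling a task queue" can be sketched in a few lines of Python (a toy scheduler of my own, not any real runtime's API): tasks are generators, and the loop round-robins them until each one finishes.

```python
from collections import deque

def run(tasks):
    """Toy event loop: poll each task once, requeue it if not done."""
    queue = deque(tasks)
    order = []
    while queue:
        task = queue.popleft()
        try:
            order.append(next(task))   # poll the task once
            queue.append(task)         # not done: requeue it
        except StopIteration:
            pass                       # task completed, drop it
    return order

def worker(name, steps):
    for i in range(steps):
        yield f"{name}{i}"             # cooperative yield point

order = run([worker("a", 2), worker("b", 2)])
print(order)  # interleaved: ['a0', 'b0', 'a1', 'b1']
```

No threads anywhere: concurrency comes entirely from tasks voluntarily yielding back to the queue.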


The point is that it actually is insightful to think of it as callback sugar. That gives you a better understanding of how the threading is handled when the calling method yields to the callee, and conversely how it must be different when the caller doesn't yield.


It generally applies to event loops, not just to languages with a built-in event loop. E.g. it certainly also applies to raw boost::asio (which uses callbacks), libuv, libevent, Qt, GObject, Netty, etc.


That’s true in JS but less so in a language like Rust where there are threading implications.


There are no threading implications to async in Rust. The executor you're using may add some requirements on your futures because it wants to run them on multiple threads[1], but that's not related to async itself, and you can always use a single-threaded executor if you don't want these limitations (and they don't apply to embedded anyway).

[1] namely, your futures will need to be Send + 'static.


I'm not as familiar with Rust's implementation, but even in C# it's mostly true. Threads are only hit when a callback is not directly awaited. There's more to it, but it starts you down the right path, I think.


Or realize that CPS, channels, and the actor model are basically equivalent.


If you have a lot of different symbols, it makes sense that ligatures would help distinguish them more easily. So ligatures probably also help for Haskell, since there are so many infix operators in Haskell. There are also a bunch of old papers that showcase Haskell code and use the ligature representation of its operators.

However, most programming languages really just have a few symbols: arithmetic operators (+, -, /, *), comparison operators (<, >, <=, >=), and logical operators (&&, ||, !). C also has bitwise operators (~, &, |). It's already easy enough to distinguish these without ligatures, so the benefit is negligible here, IMO. I prefer to go without ligatures, because I have no trouble remembering or distinguishing them, and, like someone else mentioned, I find ligatures in a monospace font kind of weird (how they morph into a two-column character as you write them, and turn back into one-column characters when you delete half of one).


However, not all IDEs, text editors, and terminal emulators support disabling the use of ligatures, so it's often easiest to just switch to a font that doesn't include them if you don't want them.


Huh, I'm surprised from searching that there isn't a quick tool to just disable ligatures given a font file. It should just be a matter of removing the ligature substitutions from the font's GSUB table.
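A rough sketch of that idea with fontTools (pip install fonttools) -- the feature tags and the exact GSUB traversal are my assumptions and worth checking against the OpenType spec; emptying each feature's lookup list, rather than deleting the feature records, avoids renumbering the indices that the ScriptList references:

```python
# Ligature-related OpenType feature tags (assumed set; 'rlig' is left
# alone on purpose, since required ligatures matter for e.g. Arabic).
LIGATURE_TAGS = {"liga", "dlig", "calt"}

def strip_ligature_lookups(feature_records):
    """Empty the lookup lists of ligature-related GSUB features.

    feature_records: objects with .FeatureTag and .Feature.LookupListIndex,
    as found at font["GSUB"].table.FeatureList.FeatureRecord in fontTools.
    Returns how many features were stripped.
    """
    stripped = 0
    for record in feature_records:
        if record.FeatureTag in LIGATURE_TAGS:
            record.Feature.LookupListIndex = []  # no lookups -> no substitutions
            stripped += 1
    return stripped

# Intended usage (not run here -- needs a real font file):
# from fontTools.ttLib import TTFont
# font = TTFont("SomeFont-Regular.ttf")
# strip_ligature_lookups(font["GSUB"].table.FeatureList.FeatureRecord)
# font.save("SomeFont-NoLig.ttf")
```

This is a sketch, not a battle-tested tool; some fonts hide ligature-like behavior in other contextual features too.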


Just switch the font variant to NL - no need for editor support.


Doesn't that depend on the font author providing a NL version?

e.g. if you want the FiraCode improvements[1] over FiraMono without the ligatures. Without editor support, you'd have to build it yourself, since they don't distribute an NL version.

[1] https://github.com/tonsky/FiraCode#whats-in-the-box


That's what I mean; it's often easier to use a font that doesn't have ligatures (like the NL variant) than to use a font with ligatures and hope your editor lets you disable ligatures.


>"Rust is so hard to learn that the AI doesn’t get it" is a bad place to be for rust.

I don't think this is really a big point against Rust; in my experience, ChatGPT doesn't write good code in general.

To be honest, I think Rust's borrow checker won't seem very hard if you've used C++ smart pointers before; they're like Rust's references and Box<T>, just without a borrow checker. The borrow checker just formalizes and enforces existing good practice for using pointers in C and C++, and makes sure it's followed 100% of the time unless you specifically opt out.

What actually seems more interesting to me than Rust is the ATS language. Rather than a borrow checker, you have proof objects that you pass around with pointers (and these proof objects exist only at compile time), which prove to the compiler that the pointer has a valid address. What's cool is that you can do things like pointer arithmetic in a type safe way. The downside is that it's more verbose than either C or Rust. I'm still trying to learn it.


I'm using a Dell laptop, but I'm running vanilla GNOME on EndeavourOS (I took out the distro-specific customizations in the installer), and its trackpad support seems perfect on mine. Everything is as smooth as I've seen on Windows or macOS. The settings menu for it is nice. I especially like how it comes with a three-finger swipe left or right to switch workspaces/desktops, and a three-finger swipe up to get an overview of your current desktop. It all feels natural, and it gives me a sense of control; if I swipe slowly, it switches slowly, rather than just making the transition deterministically after I've swiped in some direction (something I've experienced elsewhere on Linux).

I also like that you can use two fingers to zoom in on a webpage in Firefox, just as on Windows.

GNOME is the first DE I've tried where all of this touchpad support was implemented so well, although I haven't tried that many, to be fair.

Sorry if I sound like an advertisement; I just wanted to be specific about what I like about GNOME's trackpad support as opposed to just saying that it "works fine for me."


I'm running stock Fedora (which uses GNOME) on a Framework laptop and it's the same for me. It's the first time I've had Mac-level trackpad support and UI responsiveness out of any non-Mac, Windows or Linux.


Compared side-by-side, Aptos looks rounder and less square, and it has more natural looking curves.

The problem is that differences in fonts are kind of subtle when their goals are very similar. Both Roboto and Aptos seem like they're trying to be high x-height grotesque sans serif fonts. They're going to look similar to some other font, because there are so many in that category already.

(However, the capitals of Aptos remind me more of Gill Sans than of Helvetica, so I guess it's not strictly grotesque.)


When they refer to Swiss typography, they're likely referring to the International Typographic Style, also known as the Swiss Style (it's on Wikipedia). It's a style of graphic design that typically uses a grotesque sans serif font, like Helvetica or Univers (as opposed to a humanist sans serif).


Closing the loop a bit, Helvetica is the second word of the Latin name of Switzerland (Confoederatio Helvetica). It's why the ISO 3166 code for the country is .ch.


That would be a great name for a font site, but it’s Helvetia, from the same root in the Helvetii peoples’ name.


No, it's really "Confoederatio Helvetica", as you can see e.g. on this coin: https://worldcoinsinfo.com/world_coins/swiss-5-franken-1888-...

"Helvetia" exists too, as the female national personification of Switzerland.

Both are indeed derived from the Helvetii.

