
It's also unusual given how much English you'll hear in Germany nowadays (at least in major, tourist-attracting cities) in just about any other context.

Too much English. I noticed this indoctrination way back when they released Ice Age over there for kids. The title wasn't even translated into German.

"Das Boot"

> didn't support chip-and-PIN or contactless payments.

As opposed to... swiping the card?

Are there really cards out there that exclusively support that?


Chip and signature, which often means just the chip without further authentication.

EMV has multiple options. Many countries (including the US) chose the signature option for credit cards for convenience and use PINs only with debit cards. Before contactless payment apps became common, that was a major source of friction when using American credit cards in Europe.


I'd argue we picked it for legacy reasons: Americans weren't used to the chip-and-PIN concept, and the US adopted EMV very late because of a massive installed base of mag-stripe equipment and systems already built to absorb the inherently somewhat higher fraud.

Very few Norwegian-issued cards, if any, have a magnetic stripe. It's too easily cloned.

Both my DNB and Nordea cards, as well as my personal and corporate Norwegian AMEX cards, all have magnetic stripes, and they’ve all been issued fairly recently.

If this story was more than a few years ago it's plausible that the card didn't have a chip. I still have a VISA debit card without a chip, and it was issued only two years ago.

Also, chip-and-PIN is mostly not enabled on American credit cards or card payment terminals.


Prepaid gift cards (please note: these are not store-issued) don't have chips, and it is sometimes a problem to use them. But I doubt someone would buy a plane ticket with one.

A gift card isn't a credit card, though... ?

> A gift card isn't a credit card, though... ?

I suppose it's a matter of semantics: is a prepaid credit card that is gifted not a "gift card"?


You are likely thinking of branded gift cards that are specific to one store (or one grouping of stores). An Apple or Target gift card, for example.

There are also gift cards that are credit cards. Or, really, debit cards. See “open-loop cards” at https://www.bitsaboutmoney.com/archive/more-than-you-want-to...


I am not sure whether it counts as a debit or a credit card, because from a technology point of view the division is blurry.

They are perhaps prepaid debit cards. But you can set and change a PIN on them just like on old credit cards, because that's what they are: old technology.

They don't have a chip, so you have to swipe them.

Some employers give out these gift cards instead of cash, and I think the cards use the older technology to be cheaper (the chip costs a few cents to manufacture). After you clear the card's balance you basically throw it away (very ecological), though if they wanted, they could add more money to the same card. They usually don't, since people lose the cards, so they issue new ones, and you end up with a lot of plastic. Think of it as some pocket money every 3-6 months.

It says VISA on the card.

The companies could give cash, but due to some obscure law and for psychological reasons they give cards. The card is still better than the paper Sodexo gift cards they would give out years ago, which were a pain to use since few shops accepted them. But it is still a pain, since you often end up with a small balance and need to pay part with the gift card (to bring it down to zero) and part with cash.


> isn't semantic

It's certainly better than calling everything a div.

> breaks accessibility features

I don't know if I'd call it breakage to just... not use them where they should be used. Of course if a real tag exists that adequately matches the author's intent, that should be preferred over a made-up one.


> I don't know if I'd call it breakage to just... not use them where they should be used.

Accessibility has only two states, "Working" and "Broken"; there's no third state for "I didn't bother".


> It's certainly better than calling everything a div.

It's not. For semantic purposes <my-element> is the same as <div class=my-element>. So on the surface they are equivalent.

But if you are in the habit of using custom elements, then you will likely continue to use them even when a more useful element is available, writing <my-aside> rather than <aside class=my-aside>. So in practice it is probably worse, even if theoretically identical.

Basically, divs with classes provide no semantic information but create a good pattern for using semantic elements when they fit. Custom elements also provide no semantic information, and they make the semantic elements you do use look different and unusual.
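A contrived sketch of the two patterns (markup invented purely for illustration):

    <!-- custom element: no semantics exposed to the browser -->
    <my-aside>Related reading</my-aside>

    <!-- classed native element: same styling hook, real semantics -->
    <aside class="my-aside">Related reading</aside>

To assistive technology the first is just an unknown element with no role, while the second is announced as complementary content.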


> But if you are in the habit of using custom elements then you will likely continue to use them even when a more useful element is available

This article is written for web developers. I’m not sure who you think you are addressing with this comment.

In any case, the argument is a weak one. To the extent people make the mistake you allege, they can make it with classed div and span tags as well, and I've seen this in practice.


> But if you are in the habit of using custom elements then you will likely continue to use them even when a more useful element is available

You could say the same about divs. I’ve seen pages where everything is a div. No paragraphs, no headings, no tables, no lists, just divs.


That is a strawman. I never said everyone who uses classes perfectly uses semantic elements.

My point is that if you are using <div class=my-element>, you don't have to change your .my-element CSS selector or JS selection code to improve your markup to <p class=my-element>. If you are using <my-element>, it is much more work to change your selectors, and now you have two ways of doing things (a tag selector or a class selector) depending on whether you are using a native semantic element or a div. You have made your styling code depend on your element choice, which makes it harder to change.
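A minimal sketch of the selector difference (same made-up names as above):

    /* class selector: survives swapping the element */
    .my-element { display: block; margin: 1em 0; }
    /* matches <div class=my-element> and <p class=my-element> alike */

    /* tag selector: tied to the custom element name */
    my-element { display: block; margin: 1em 0; }
    /* must be rewritten if <my-element> is later upgraded to <aside class=my-element> */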


I've seen <span class=italics>, and it made me want to break things.

> For semantic purposes

But semantic purposes are not all possible purposes.


> > If you find yourself writing layouts like this you're probably ignoring a bunch of useful stuff like <aside> <article> <menu> etc.

> It's certainly better than calling everything a div.

Yes, but it's worse than <aside> <article> <menu> etc., as the comment you are replying to mentions.


I also saw this at https://lyra.horse/blog/2025/08/you-dont-need-js/#fn:3 :

> You are allowed to just make up elements as long as their names contain a hyphen. Apart from the 8 existing tags listed at the link, no HTML tags contain a hyphen and none ever will. The spec even has <math-α> and <emotion-> as examples of allowed names. You are allowed to make up attributes on an autonomous custom element, but for other elements (built-in or extended) you should only make up data-* attributes. I make heavy use of this on my blog to make writing HTML and CSS nicer and avoid meaningless div-soup. ↩

(HN filtered out a "face with heart eyes" emoji from the second example.)
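For the curious, a minimal sketch of the pattern that footnote describes (the tag and attribute names here are invented):

    <style>
      note-box { display: block; border: 1px solid; padding: 0.5em; }
      note-box[data-kind="warning"] { border-color: red; }
    </style>

    <note-box data-kind="warning">
      A made-up element (the name contains a hyphen) with a made-up
      data-* attribute; no JS registration is needed just for styling.
    </note-box>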


> Note that some of the examples can run even faster with Shed Skin by providing --nowrap/--nobounds options, which disable wrap-around/bounds-checking respectively. I'm not sure if PyPy has any options to make it run faster, at the cost of certain features (in the distant past there was talk of RPython - does that still exist?).

My understanding: RPython was never meant for that sort of thing. Its purpose is to make the bootstrapping process possible; it isn't really a separate language, just the restricted subset of Python that the core implementation confines itself to, so that the core implementation can translate itself to C. (Much like the limitations on what Shedskin can accept for compilation.) Once the implementation is bootstrapped, it's unclear whether restricting end users to RPython could help squeeze out any more JIT performance; it doesn't appear that they're interested in trying.


I'm not familiar with the thing you're recreating (I gather it's something to do with getting better responses out of LLMs by manipulating the context, or something like that?), but I appreciate that you haven't, like so many others, dropped on us ten paragraphs of Markdown-formatted press release (without bothering to check whether the formatting even works here) echoing a bunch of marketing-speak from a README.

Haha, I just wanted my repo to be out here. If someone finds it interesting they can always just check the repo. And you're close: it's about getting faster responses from the model by manipulating the request queues and memory.

I would not be able to handle that due to video streaming, web clients for things like email, etc. And some sites I trust (including HN) provide useful functionality with JS (while degrading gracefully).

But I use NoScript and it is definitely a big help.


https://en.wiktionary.org/wiki/die#Noun

"dice" is the plural for the object used as a source of randomness, but "dies" is the plural for other noun uses of "die".


I'm reading this thread on an 11-year-old desktop with 8GB of RAM and not feeling any particular reason to upgrade, although I've priced it out a few times just to see.

Mint 22.x doesn't appear to be demanding any more of my machine than Mint 20.x did, and neither are Firefox or most websites, although YouTube chat still leaks memory horrendously. (Of course, download sizes have increased.)


I've been enjoying running Mint on my terrible-spec Chromebook. It only has 3GB of RAM, but usage rarely exceeds 2GB even with random additions and heavy Firefox use. The battery life is obscenely good too; I easily break 20 hours on it as long as I'm not doing something obviously taxing.

Modern software is fine for the most part. People look at browsers using tens of gigabytes on systems with 32GB+ and complain about waste, rather than being thrilled that the browser is doing a fantastic job of caching stuff to run quickly.


Mint is probably around 0.05% of desktop/laptop users.

I think the root comment is looking at the overall picture of what customers can get for their money, and sees it getting worse.

This wasn't mentioned, but it's a new thing for everyone to experience, since the general trend for computer hardware is that it gets cheaper and more powerful over time. Maybe not exponentially anymore, but at least linearly cheaper and more powerful.


> I think root comment is looking at the overall picture of what all customers can get for their money, and see it getting worse.

A $999 MacBook Air today is vastly better than the same $999 MacBook Air 5 years ago (and even more so once you count inflation).


The OP is not looking at the static point (the price of that item) but at the trend, i.e. the derivative of price versus quality. It was on a steep upward incline, and now it's flattening.

This is a text-based website, though. It should be fast on everything. Most websites are a bloated mess, and a lot slower.

Try Win11 on that old PC, and you'll really feel the need for more RAM and a better CPU.

I sometimes feel M$ is deliberately making its Windows OS clunkier so that it can be turned into a SaaS offering with a pricey subscription, as it has already successfully done with its MS Office suite (Office 365 is the norm in corporate settings these days, though individuals have to shell out $100 per year for the MS Office 365 Personal edition). We can still buy MS Office 2024 as a standalone edition, but it is not cheap, because Micro$oft knows the alternatives on the market aren't good enough to be a serious threat.


You seem to have dropped the intended [1] link?

Ah, indeed I did! It's a searchable term in general, but I'm particularly fond of this write-up: https://plato.stanford.edu/entries/frame-problem/

yet again I am begging, weeping, pleading for markdown support on HN, my lord

