niobe's comments | Hacker News

Also super unlikely, and only worth discussing as a theoretical exercise. We just aren't undoing centuries of law. Same reason trusts are here to stay.


Less than 200 years unless I'm misreading. Doesn't seem that difficult to change.


On the other hand, SV development culture is often obsessed with early release and adding features, traits I would identify with short-termism, rather than with improving usability and minimising bloat (trends, not rules). So a little old-school conservatism might go a long way when it comes to software... although the problems of too much silence in the face of seniority are also well known from the airline industry.


I say, what a splendid tale.


paywalled



Could simply be 41 instances of the same server in 41 regions, not necessarily a cause for concern; Starlink is a global service, after all. I'd be more concerned if 41 instances were sharing one key.


> I'd be more concerned if 41 instances were sharing one key.

Dozens, hundreds, or thousands of web servers can easily share one private key in a certificate; public keys offer even more options in sane designs. Directly authenticating 41 servers using SSH keys is just poor, slapdash engineering.


You're asking for something that is supported in X.509, but OpenSSH wrote its own certificate exchange standard that does not support those features.

HTTPS uses X.509. OpenSSH has no interest in supporting X.509 or, AFAIK, in changing its format to support anything but "self-signed" keys.


> You're asking for something that is supported in X.509

There's more than one way to skin this cat, and no, I'm not asking for the specific solution you suggested.

SpaceX can implement any internal auth scheme they choose to connect to a handful (not 41) of intermediate SSH instances, which then connect to the terminals.


I would argue reusing private keys worldwide is slapdash engineering. You generally want to minimize exposure in the event of a breach, not maximize it.


I wonder what the reasonable balance between reuse and overexposure is; I'd think you would want fewer keys per device, and less key overlap (i.e. more keys overall). But forty-two sounds high, and isn't it now just 42x more at risk?


> I would argue reusing private keys worldwide is slapdash engineering

I wasn't suggesting it, and frankly can't see how that could be a solution in this instance. I was making a comparison against current practice on a harder problem to solve, i.e. safely scaling a single private key in an SSL certificate across many servers is solved today without a 1:1 server-to-certificate ratio.


Is it a better idea to share private keys? In the case of a server breach, you will have a much harder time, won't you?


A better idea would be the terminal trusting one or two core certificate authorities, and those authorities creating time-limited certificates when needed.

So the terminal accepts "sshauthority1".

Then the 41 remote sites contact sshauthority1 to get a 1-hour (10-minute, 10-day, whatever) certificate for "site18".

If a remote site is compromised, sshauthority1 stops issuing it certificates, and within an hour (10 minutes, 10 days, etc.) the compromised site can no longer reach the terminals.

Revoking a key from that many terminals (many of which will be offline) if one of the 41 keys is exposed is not trivial.

Now if sshauthority1 is compromised you've got the same rotation issue (although you can CRL it), but it's easier to secure one or two authorities than 41 keys.
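
For concreteness, here's a minimal sketch of what the signing side of such an sshauthority1 might look like using plain OpenSSH certificates; the CA path, principal name, and key file names are made up for illustration, not anything SpaceX actually uses:

    # Minimal sketch of the "sshauthority1" signing step (assumed names/paths).
    import subprocess

    CA_KEY = "/etc/ssh/terminal_ca"  # CA private key; illustrative path

    def issue_short_lived_cert(site_pubkey: str, site_id: str, validity: str = "+1h") -> str:
        """Sign a site's SSH public key, valid only for `validity` (+10m, +1h, +10d, ...)."""
        subprocess.run(
            ["ssh-keygen",
             "-s", CA_KEY,            # certify with the CA key
             "-I", site_id,           # certificate identity, e.g. "site18"
             "-n", "terminal-admin",  # principal the terminals authorize
             "-V", validity,          # short lifetime, so site-key revocation is rarely needed
             site_pubkey],
            check=True,
        )
        # ssh-keygen writes the certificate next to the public key
        return site_pubkey.removesuffix(".pub") + "-cert.pub"

    # Each terminal then trusts only the CA, e.g. in its sshd_config:
    #   TrustedUserCAKeys /etc/ssh/terminal_ca.pub

The terminals never need to know the 41 site keys individually; they only carry the CA's public key.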


> Is it a better idea to share private keys

It is not, and I can't see how my earlier comment can be read as recommending that. This is a solved problem for private keys (using load balancers, for example), so public keys are lower-hanging fruit than that.

Edit: upon rereading, I can see how the word "share" would be ambiguous in the context of a private key. I meant "jointly make use of", rather than "distribute copies throughout the fleet". I have edited my root comment to make my meaning clearer.


Is that normal? I would imagine that if I were managing such a large deployment, I would just use a CA for the keys and issue CA-signed certificates, so that I don't need to add a bunch of random keys to authorized_keys.


Notably, I believe 41 is the current count of points of presence that Starlink has around the globe.


calling it "last" is defeating their own premise - that tests need to keep pace developments in ability


The name is very intentional: this isn't "AI's Last Evaluation", it's "Humanity's Last Exam". There will absolutely be further tests for evaluating the power of AIs, but the intent of this benchmark is that any more difficult benchmark will either be

- Not an "exam" composed of single-correct-answer closed-form questions with objective answers

- Not consisting of questions that humans/humanity is capable of answering.

For example, a future evaluation for an LLM could consist of playing chess really well or solving the Riemann Hypothesis or curing some disease, but those aren't tasks you would ever put on an exam for a student.


Isn't FrontierMath a better "last exam"? Looking through a few of the questions, they seem less reasoning-based and more fact-based. There's no way that one could answer "How many paired tendons are supported by this sesamoid bone [bilaterally paired oval bone of hummingbirds]" without either having a physical model to dissect or just regurgitating the info found somewhere authoritative. It seems like the only reason a lot of the questions can't be solved yet is that the knowledge is specialized enough that it simply is not found on the web; you'd have to phone up the one guy who worked on it.


Wow, old Google seemed to care about the quality of their data and the service they were providing to users, and then to apply reasoning to achieve those aims.


The Google of old wanted users to get the best results using their software. The Google of new wants customers to get the best results using their ad network.


Go outside, everyone's real... at least for the time being!


They are boring, though.


Only seems that way until you come down off the hyperstimulation.


Excel is in our DNA and will never die


Unfortunately, our DNA is also in Excel. Several genes had to be renamed because they kept being identified as dates.


Excellent


Funnily enough, our DNA does not use a fixed-length offset mechanism. It uses null-termination sequences (and start sequences too, for some reason).

Which is closer to the storage mechanism of Excel (XML) than to its visualization interface (tables).
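
A toy sketch of that framing, treating the start codon (ATG) as the start sequence and the stop codons (TAA/TAG/TGA) as the terminators; real gene structure is of course far messier than this:

    STOP = {"TAA", "TAG", "TGA"}

    def find_orfs(seq: str):
        """Yield (start, end) of simple open reading frames in one strand and frame."""
        i = 0
        while i + 3 <= len(seq):
            if seq[i:i + 3] == "ATG":          # start sequence
                j = i + 3
                while j + 3 <= len(seq):
                    if seq[j:j + 3] in STOP:   # termination sequence ends the record
                        yield (i, j + 3)
                        break
                    j += 3
            i += 3                             # scan codon by codon; no fixed record offsets

    print(list(find_orfs("ATGAAATTTTAGCCCATGGGGTGA")))  # -> [(0, 12), (15, 24)]

The point being that a record's length is discovered by scanning for the terminator, not computed from a stored offset.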


Interesting. Well yeah null termination seems better if (a) you don’t have an integer encoding and (b) you have random ”bit” flips.


I don't think you need integer encoding to process fixed lengths. They do it just fine at the word level for codons. You would need a specific mechanical processor for each different schema length pattern, though.

I think bit flips have no effect on the appropriateness of either fixed-length or null-terminated encoding. But omissions and commissions are probably why anything fixed-length doesn't work.


I have been a technical musician for 40 years and I couldn't understand the points he was trying to make... poorly explained.


What is a "technical musician"?

