seeknotfind's comments

Yeah, it sounds Platonic the way it's written, but it seems more like a hyped model-compression technique.

As the simplest theory, my default position is that the universe is computable and that everything in the universe is computable. Note that those are not the same claim.

Some intuition:

1. If the universe contains an uncomputable thing, then you could use it to build a super-Turing computer. This would only make CS more interesting.

2. If the universe extends beyond the observable universe, and it's infinite, and on some level it exists, and there is some way it all moves forward (not necessarily time, since time is uneven), then that's an infinite amount of information, which can never all be stepped forward at once (so it's not computable). The paper itself touches on this, requiring that time not break down. Though that may be the case, my default assumption is that the universe does not "step" infinitely much information.

One quick aside: this paper uses a proof with model theory. I stumbled upon this subfield of mathematics a few weeks ago, and I deeply regret not learning about it during my time studying formal systems/type theory. If you're interested in CS or math, make sure you know the compactness theorem.
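For the curious, the statement is short (this is the standard statement, nothing specific to this paper):

```latex
% Compactness theorem for first-order logic:
% a set of sentences has a model iff every finite subset of it does.
\[
\Sigma \text{ has a model}
\iff
\text{every finite } \Sigma_0 \subseteq \Sigma \text{ has a model.}
\]
```

The surprising direction is right-to-left: finite satisfiability everywhere forces a single model of the whole, possibly infinite, theory.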

Paper direct:

https://jhap.du.ac.ir/article_488.html

I enjoyed some commentary here:

https://www.reddit.com/r/badmathematics/comments/1om3u47/pub...

See also:

https://en.wikipedia.org/wiki/Mathematical_universe_hypothes...


There are a lot of things in the universe that are not computable.

Do you mean like ghosts or like quantum randomness and Heisenberg's uncertainty principle?

We cannot compute exactly what happens because we don't know what it is, and there's randomness. Superdeterminism is a common cop-out here. However, when I talk about whether something is computable, I mean whether that interaction produces a result more complicated than a Turing-complete computer can produce. If it's random, it can't be predicted. So perhaps a more precise statement would be: my default assumption is that "similar" enough realities or sequences of events can be computed, given access to randomness, where "similar" means the simulation cannot be distinguished from reality by any means.


Please compute the last digit of Pi.

The last digit of pi doesn't exist, since its decimal expansion never terminates (pi is irrational). Chaitin's constant, later busy beaver numbers, or any number of functions may be uncomputable, but since they are uncomputable, I'd be assuming that their realizations don't exist. Sure, we can talk about the concept, and they have a meaning in the formal system, but that's precisely what I'm saying: they don't exist in this world. They only exist as an idea.
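For contrast, every digit of pi at a finite position is computable. A minimal sketch using Gibbons' streaming spigot algorithm (the generator name is mine): you can stream digits forever, but there is no last one.

```python
def pi_digits():
    """Yield decimal digits of pi one at a time, indefinitely.

    Gibbons' streaming spigot based on Lambert's continued fraction;
    uses only integer arithmetic, so every emitted digit is exact.
    """
    q, r, t, i = 1, 180, 60, 2
    while True:
        u = 3 * (3 * i + 1) * (3 * i + 2)
        y = (q * (27 * i - 12) + 5 * r) // (5 * t)
        yield y  # next exact digit of pi
        q, r, t, i = (10 * q * i * (2 * i - 1),
                      10 * u * (q * (5 * i - 2) + r - y * t),
                      t * u,
                      i + 1)


if __name__ == "__main__":
    g = pi_digits()
    print([next(g) for _ in range(10)])  # first ten digits: 3.141592653...
```

The point being made in the comment survives: this generator proves any *particular* digit exists and is computable, while "the last digit" names nothing at all.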

Say, for instance, that you could arrange quarks in some way, and out pops, from the fabric of the universe, a way to find the next busy beaver numbers. Well, we'd really be feeling sorry then, not least because "computable" would turn out to be a misnomer in the formalism, and we'd have to call this clever party trick "mega"-computable. We'd have discovered something that exists beyond Turing machines; we'd have discovered, say, a Turing oracle. Then we'd be able to "mega"-compute these constants. Another reason we'd really feel sorry is that it would break all our crypto.

However, that's different than the "idea of Chaitin's constant" existing. That is, the idea exists, but we can't compute the actual constant itself; we only have a metaphor for it.


And likewise any digit of Chaitin's constant.

I'm looking forward to seeing everyone coming up together, creating better fonts for free, and wiping out any of those profits. Is Monotype trying to destroy their own industry, or do they really think this will work?

How many times does the rate need to be increased 10x before it's a problem?


If I remember correctly, 6.022x10^23 protons (with electrons) is one gram of hydrogen.


Yeah, people tend to add rather than improve. It's possible to add into lower levels without breaking things, but it's hard. Growing up as a programmer, I was taught the UNIX philosophy as a golden rule, but there are sharp corners on this one:

To do a new job, build afresh rather than complicate old programs by adding new "features".


Wasn't it: "do one thing, do it well"?



"It’s not likely to be something you’ll ever have at home" Pessimistic much?


I think the more convincing argument is that most known applications of quantum computers (sidestepping any hardware practicalities) are for niche problems (in my wheelhouse, quantum simulation); the average person has no (practically advantageous) reason to own a quantum computer.


I suspect that once quantum computers actually scale up so that you can play with them, we'll find all sorts of interesting things to do with them.

However, even now, you can imagine that if quantum computers were small enough, it would be worth having one just for asymptotically fast integer factoring with Shor's algorithm. I don't think that's too far-fetched. Of course, people wouldn't necessarily need to know they have a quantum computer, but they don't necessarily know the workings of their computers today anyway.
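For context, the quantum speedup in Shor's algorithm lives entirely in the period-finding step; the surrounding reduction from factoring is classical. A toy sketch (function names are mine, and the period is found by brute force here, standing in for the quantum subroutine):

```python
from math import gcd


def find_period(a, N):
    """Smallest r > 0 with a**r == 1 (mod N), found by brute force.

    This is the step a quantum computer does in polynomial time;
    classically it takes exponential time in the bit length of N.
    """
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r


def shor_classical_part(N, a):
    """Classical post-processing of Shor's algorithm for a chosen base a.

    Returns a pair of nontrivial factors of N, or None if this base
    fails (odd period, or a**(r/2) == -1 mod N) and a retry is needed.
    """
    r = find_period(a, N)
    if r % 2 == 1:
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None
    return gcd(y - 1, N), gcd(y + 1, N)


if __name__ == "__main__":
    print(shor_classical_part(15, 7))  # factors 15 via the period of 7 mod 15
```

The textbook example: 7 has period 4 mod 15, so gcd(7**2 - 1, 15) and gcd(7**2 + 1, 15) yield the factors 3 and 5. Swap `find_period` for quantum order finding and the whole pipeline runs in polynomial time.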


Thousands of hours on this. Love it.


It's nice that only a few weird behaviors are produced. Often use-after-free leads to so many different random bugs, you might gorble a hubalu.


"Xia" would map to a single character (code point) in Chinese. For instance, in simplified Chinese, it could be 下 (xia, meaning down), 侠 (martial arts - like the xia in wuxia), or any number of other homophones. Since the characters are already combinatorial, I'm not sure a Chinese speaker would think of this as a portmanteau.


If you've seen Steve Martin's The Jerk, you'd know he becomes a real jerk in the modern sense during the movie.

