
So what the heck has happened with LK-99, really? (Disclaimer: I'm no physicist nor chemist, but I have co-written a report on three LK-99 papers [1] and am tracking the Twitter discussion as much as I can. I also got some help from knowledgeable friends; many thanks for proof-reading.)

It turned out that the LK folks were not talking about some stupid shit. Specifically, they were among the last believers of a long-forgotten Russian theory of superconductivity, pioneered by Nikolay Bogolyubov. The accepted theory is entirely based on Cooper pairs, but this theory suggests that a sufficient constraint on electrons may allow superconductivity without actual Cooper pairs. This requires carefully positioned point defects in the crystalline structure, which contemporary scientists consider unlikely, and such a mode of SC was never formally categorized, unlike type-I and type-II SC. Professor Tong-seek Chair (최동식) expressed regret about this status quo (in the 90s, but it still applies today): the theory was largely forgotten, without a proper assessment, after the fall of the USSR. It was also a very interesting twist that Iris Alexandria, "that Russian catgirl chemist", had an advisor, a physicist-cum-biochemist, who had studied this theory; as a result she was so familiar with it that she could tell whether replications followed the theoretical prediction.

Fast forward to today: students of the late Chair continued the research and produced a possible superconducting substance, LK-99, based on the Russian theory. A lot can be said about the papers themselves, but it should first be noted that this substance is not a strict superconductor under the current theory. Prof. Chair once suggested that we need to trade off some (less desirable) properties of superconductors for room-temperature superconductivity, and that property seems to be isotropy. This particularly weakens the Meissner effect criterion due to the much-reduced eddy currents, so there is a possibility that LK-99, even if it's real, might not be accepted as a superconductor in the traditional sense. The LK folks, on the other hand, think it should also be considered a superconductor, but they are probably already aware of this possibility.

If we allow anisotropy into this discussion, we already have lots of such things, most importantly carbon nanotubes. Scientists even considered the possibility that they may function as typical superconductors [2], though without any success. So it might be appropriate to say that LK-99 is a substance that mimics them in one direction but is much more malleable. And that is an actually significant result (if true, of course), because for most uses a strict type-I superconductor is far more than sufficient, while the practical implications of superconductivity become much more achievable. We have so far looked for strict superconductors only because we didn't know an effective way to trigger superconductivity otherwise; LK-99 might change that situation.

This whole discourse should make you more careful about concluding whether LK-99 is a superconductor or not, because we may well end up with a revised definition of SC as a result. If LK-99 makes superconductivity much easier to trigger, it should be considered a superconductor in the macroscopic sense, the authors would argue. Only time will tell whether they have indeed made such a substance and whether it is malleable enough to substitute for other superconductors, but they have a long history and have arguably received unfair treatment. And they are about to fight back.

[1] https://hackmd.io/@sanxiyn/S1hejVXo3 (Semi-automatically translated: https://hackmd.io/DMjYGOJFRheZw5XZU8kqKg)

[2] For example, https://twitter.com/MichaelSFuhrer/status/168696072754495897...

----

This post is now also available as a standalone version: https://hackmd.io/@lifthrasiir/lk-99-prehistory & https://twitter.com/senokay/status/1687360854315151360


The issue is not single-pass vs multi-pass. It is, instead, what constitutes a compilation unit, i.e., a pass over what?

Clojure, like many Lisps before it, does not have a strong notion of a compilation unit. Lisps were designed to receive a set of interactions/forms via a REPL, not to compile files/modules/programs etc. This means you can build up a Lisp program interactively in very small pieces, switching between namespaces as you go, etc. It is a very valuable part of the Lisp programming experience. It implies that you can stream fragments of Lisp programs as small as a single form over sockets, and have them be compiled and evaluated as they arrive. It implies that you can define a macro and immediately have the compiler incorporate it in the compilation of the next form, or evaluate some small section of an otherwise broken file. Etc, etc. That "joke from the 1980's" still has legs, and can enable things large-unit/multi-unit compilers cannot. FWIW, Clojure's compiler is two-pass, but the units are tiny (top-level forms).
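
A minimal sketch of what that buys you at a Clojure REPL (the names here are illustrative, not from the original comment):

    ;; Each top-level form is its own compilation unit: it is read,
    ;; compiled, and evaluated before the next form is even read.
    (defmacro unless* [test & body]
      `(if ~test nil (do ~@body)))

    ;; The very next form can already use the macro, because the
    ;; compiler incorporated it as soon as the previous form finished.
    (defn warn-if-empty [xs]
      (unless* (seq xs)
        (println "empty collection!")))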

What Yegge is really asking for is multi-unit (and larger unit) compilation for circular reference, whereby one unit can refer to another, and vice versa, and the compilation of both units will leave hanging some references that can only be resolved after consideration of the other, and tying things together in a subsequent 'pass'. What would constitute such a unit in Clojure? Should Clojure start requiring files and defining semantics for them? (it does not now)
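
For concreteness, the kind of circular, cross-unit reference in question might look like this (hypothetical namespaces; as written, Clojure refuses to load them with a cyclic-dependency error):

    ;; alpha.clj -- wants to call beta/pong
    (ns alpha (:require [beta]))
    (defn ping [n] (when (pos? n) (beta/pong (dec n))))

    ;; beta.clj -- wants to call back into alpha/ping
    (ns beta (:require [alpha]))   ; circular require: loading fails here
    (defn pong [n] (when (pos? n) (alpha/ping (dec n))))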

Forward reference need not require multi-pass nor compilation units. Common Lisp allows references to undeclared and undefined things, and generates runtime errors should they not be defined by then. Clojure could have taken the same approach. The tradeoffs with that are as follows:

1) less help at compilation time

2) interning clashes

While #1 is arguably the fundamental dynamic language tradeoff, there is no doubt that this checking is convenient and useful. Clojure supports 'declare' so you are not forced to define your functions in any particular order.
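
A small example of 'declare' (illustrative names), so definition order stops mattering:

    ;; Interns the var for my-odd? up front, so the next form compiles
    ;; even though my-odd? has no definition yet.
    (declare my-odd?)

    (defn my-even? [n]
      (if (zero? n) true (my-odd? (dec n))))

    (defn my-odd? [n]
      (if (zero? n) false (my-even? (dec n))))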

#2 is the devil in the details. Clojure, like Common Lisp, is designed to be compiled, and does not in general look things up by name at runtime. (You can of course design fast languages that look things up, as do good Smalltalk implementations, but remember that those languages focus on dealing with dictionary-carrying objects; Lisps do not.) So, both Clojure and CL reify names into things whose addresses can be bound in the compiled code (symbols for CL, vars for Clojure). These reified things are 'interned', such that any reference to the same name refers to the same object, and thus compilation can proceed referring to things whose values are not yet defined.
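
One way to see the reified var at work (a sketch; the behavior is standard Clojure, the names are made up):

    (defn greet [] "hello")

    ;; shout compiles against the var #'greet, not against its current
    ;; value, so redefining greet later is immediately visible through it.
    (defn shout [] (.toUpperCase (greet)))

    (shout)                 ; => "HELLO"
    (defn greet [] "bye")   ; re-interning the same name hits the same var
    (shout)                 ; => "BYE"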

But, what should happen here, when the compiler has never before seen bar?

    (defn foo [] (bar))
or in CL:

    (defun foo () (bar))
CL happily compiles it, and if bar is never defined, a runtime error will occur. Ok, but what reified thing (symbol) did it use for bar during compilation? The symbol it interned when the form was read. So, what happens when you get the runtime error and realize that bar is defined in another package you forgot to import? You try to import other-package and, BAM!, another error: other-package:bar conflicts with read-in-package:bar. Then you go learn about uninterning.

In Clojure, the form doesn't compile, you get a message, and no var is interned for bar. You require other-namespace and continue.
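
Roughly what that interaction looks like at a Clojure REPL (error text paraphrased from memory, namespace names invented):

    (defn foo [] (bar))
    ;; => CompilerException: Unable to resolve symbol: bar in this context
    ;;    No var was interned for bar, so nothing is left behind to clash.

    (require '[other-namespace :refer [bar]])
    (defn foo [] (bar))   ; now compiles against other-namespace's var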

I vastly prefer this experience, and so made these tradeoffs. Many other benefits came about from using a non-interning reader, and interning only on definition/declaration. I'm not inclined to give them up, nor the benefits mentioned earlier, in order to support circular reference.

Rich

