I hadn't tried Magit before now, but on the suggestion of this blog I'm giving it a look through. Looks good so far. I also enjoy the style the blog is using: anyone know what backend the author is using?
One thing I hated about magit was that I couldn't stage multiple files with a region or similar. I'm currently on my phone for the rest of the evening so I won't have a chance to check - if you notice this has arrived in your testing, do let me know!
I usually use a shell emulator + company for dir/file autocomplete. This is the fastest I've got to staging individual files or dirs, especially with projects that have a dir depth greater than 1.
In magit-status-mode I'm able to stage files based on the region.
I can also expand them and highlight multiple chunks to stage just those chunks.
There are only two major things I don't know how to do with the current stable version of Magit (1.4). One is starting an interactive rebase. It can take you through the commits and let you edit the buffer, but I haven't figured out whether it's possible to start an interactive rebase without using the "!" command line.
The other thing is checking out files in order to revert them; I'm sure there's a way, but using "!" and pasting in the file names still seems fastest.
I've also been wondering if it's possible to write a command to manage the `--skip-worktree` status of files, showing which are currently skipped in magit-status-mode. That would be a useful thing for me to have.
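For what it's worth, the plumbing is simple enough that a helper script can manage this outside Emacs while waiting for a Magit command. A rough Python sketch (the function names are my own invention; it just shells out to `git ls-files -v`, where an `S` prefix marks skip-worktree entries, and to `git update-index`):

```python
import subprocess

def parse_skipped(ls_files_v_output):
    # In `git ls-files -v` output, each tracked path is prefixed with a
    # status letter; 'S' marks entries flagged --skip-worktree.
    return [line[2:] for line in ls_files_v_output.splitlines()
            if line.startswith("S ")]

def skipped_files(repo="."):
    """Return the paths currently marked --skip-worktree in `repo`."""
    out = subprocess.run(["git", "ls-files", "-v"], cwd=repo,
                         capture_output=True, text=True, check=True).stdout
    return parse_skipped(out)

def set_skip_worktree(path, skip=True, repo="."):
    """Toggle the skip-worktree bit on a single tracked file."""
    flag = "--skip-worktree" if skip else "--no-skip-worktree"
    subprocess.run(["git", "update-index", flag, path],
                   cwd=repo, check=True)
```

Surfacing that list in magit-status-mode would still need an Emacs-side section, but the data is all there in the two git commands.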
This feature has been available for some time, I believe. In magit-status, just select the files with the region and then press `s` to stage the highlighted files.
Somehow, even though Collapse is insane if you think about it hard enough, and even though Pilot-waves are semi-classical and seem to be adding an additional complication that may not be necessary, both of these seem to be more popular among physicists than Many-Worlds. Somehow, even though superposition is observed and observable, even though the scale on which we can observe it is climbing steadily larger, people keep assuming that it either dissipates sometime before our scale, or that it's an illusion produced by some sort of semi-classical, overcomplicated reinterpretation.
The original idea—that the wave function is the whole deal—works just fine if you accept that we can be in superposition too. If you simply accept that the entire classical concept might be an illusion, and work up from the wave function, there's no reason to shoehorn in an idea like Collapse or Pilot-Wave.
Yes, Copenhagen is wrong. Yes, Collapse is patently absurd. Yes, assuming that there is exactly one, stochastic, probabilistic reality makes no sense. But that doesn't mean we need to add something complicated like an underlying superfluid that supports all of spacetime. This smells like Aether.
Many Worlds is a far, far more popular interpretation than Bohm[1].
The strong objection to Many Worlds is not that macroscopic objects cannot be in superposition. There are many objections [2], but the principal one is the difficulty of deriving the Born Rule.
This is a deep objection. The Born rule predicts the results of quantum measurements, and it's not clear how to get those results out of MWI. The Born rule in MWI is either inserted ad hoc afterwards, or arises via some weird "world-counting" formalism that doesn't naturally connect to probabilities. So MWI has more the flavor of a visualization than of a theory that aims at making predictions.
When you say "collapse is wrong," it depends on what is being collapsed. Sure, inserting some special "wavefunction collapse dynamics" separate from ordinary evolution is a pretty rough approach. But when the wavefunction is understood as encoding probabilities, then it's not something physical, and its collapse is no more mysterious than the probability of the Giants winning the World Series "collapsing" to 100% once the final game was played.
> But when the wavefunction is understood as encoding probabilities, then it's not something physical, and its collapse is no more mysterious than the probability of the Giants winning the World Series "collapsing" to 100% once the final game was played.
No, from my understanding this is not correct - what you say would be true if we lived in a classical world.
The problem with this approach - that is, interpreting the wave function as encoding probabilities of different states of the world that merely reflect our ignorance of the true state - is that it doesn't explain how we can get interference effects between those different potential states of the world.
> interpreting the wave function as encoding probabilities of different states of the world that merely reflect our ignorance of the true state
The wavefunction definitely encodes probabilities - that's the Born rule, and it's a key result of QM. But probabilities of what? Not the probabilities of the system being in different states, for there is only one state, which is described by the wavefunction. Instead it encodes the probabilities of the results of measurements.
For example, in the double-slit experiment, the wavefunction tells us, if we were to deploy a measuring device at the left slit, or right slit, or at various points on the screen, what the probability of measuring an electron would be. It does not tell us the probability that the electron went through the left or right slit. That would prohibit quantum interference, as you say!
A key point (of non-Bohmian interpretations) is there is no underlying "true state," i.e. predetermined values of observables. The uncertainty principle drives that home.
> we can get interference effects between those different potential states of the world
If these possibilities were classical, that would be impossible, as you say. But they are quantum possibilities, and quantum possibilities can interfere. AFAIK this has to be made a postulate of the theory. But once you've done that, and specified the mapping from quantum probabilities to classical measurements (i.e. the Born rule), you can show how classical measurements reflect the interference.
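To make that concrete, here's a toy numerical sketch (Python, with made-up amplitudes - not a physical calculation) of the difference between adding classical probabilities and adding quantum amplitudes before applying the Born rule:

```python
import cmath

# Toy amplitudes for "went through the left slit" and "went through
# the right slit", both arriving at the same point on the screen.
a_left = cmath.rect(1 / 2**0.5, 0.0)         # magnitude 1/sqrt(2), phase 0
a_right = cmath.rect(1 / 2**0.5, cmath.pi)   # same magnitude, phase pi

# Classical ignorance: the electron "really" took one slit, we just
# don't know which - so the probabilities simply add.
p_classical = abs(a_left) ** 2 + abs(a_right) ** 2

# Born rule: add the amplitudes first, then take the squared modulus.
p_quantum = abs(a_left + a_right) ** 2

print(p_classical)  # ~1.0: no interference possible
print(p_quantum)    # ~0.0: complete destructive interference
```

The classical sum can never dip below either individual probability; the amplitude sum can, which is exactly the dark fringe on the screen.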
The key idea is that an electron's wavefunction is "made of" probabilities or numbers, not electron-stuff or matter or anything physical. Then "wavefunction collapse" is just a change in subjective knowledge, not a physical process.
I agree with the comment regarding the Born Rule, which is indeed a deep objection.
A further objection is aimed at the claim that "we can be in a superposition", and it is: "If we can be in a superposition, why aren't we aware of it?"
This is a general question that any interpretation of QM must answer, and it comes down to: "Why is there a classical world at all?"
There is a sense in which every interpretation of QM is an attempt to answer this question, but none do.
Decoherence, for example, simply asserts that we cannot be aware of quantum effects except via interference phenomena. Why not? Why can't we be directly conscious of the various incoherent components of the wavefunction in the same sense of directly that I am directly conscious of my cat sitting beside me? [1] I don't have to do any fancy interferometry or statistical inference to be aware of the cat, so why do I have to mess about with statistics and interferometry to be aware of the wavefunction, given I myself am described by one?
The fact that my multiple incoherent states do not interfere with each other is irrelevant unless you have some reason to believe that it is only via coherent interactions (interference patterns) that the wavefunction manifests itself to consciousness - and why would that be?
When I measure a gamma decay, why am I aware of an event at a moment in time rather than a continuous probability wave? Likewise, why can't I be conscious of the holistic universe that Many Worlds implies?
I've focused on consciousness here because there is no doubt we are directly aware of the classical world but only indirectly aware of the quantum world, though there is very little reason to believe there is anything particularly special about consciousness in this regard. More likely, the brain, body, planet, etc., all "partake in" classical physics, none of which makes any sense from a quantum perspective.
That is: if all you knew about was quantum mechanics, you would never come up with Born's Rule or anything like it, because you would never have any reason to talk about the results of classical measurement. You would not be aware that anything like classical measurement could exist.
So if we believe that QM is somehow foundational or fundamental to the classical world (and who doesn't?) then the fact that it gives no indication that the classical world even exists is something of a problem.
[1] To belabour the point: I do not mean "direct" in any Cartesian sense, but simply that there is a perfectly ordinary causal relationship between my cat and my awareness of my cat, which is quite different from my awareness of wavefunctions, which can only be via indirect means. We might have any number of additional senses, but all of them would be direct in this sense: none of them would allow me the immediate, simultaneous perception of a photon travelling through both slits at once. As Feynman said: this is the fundamental mystery.
> A further objection is aimed at the claim that "we can be in a superposition", and it is: "If we can be in a superposition, why aren't we aware of it?"
Sounds rather like the fish being unaware of water. What would not being in a superposition feel like?
> Decoherence, for example, simply asserts that we cannot be aware of quantum effects except via interference phenomena. Why not? Why can't we be directly conscious of the various incoherent components of the wavefunction in the same sense of directly that I am directly conscious of my cat sitting beside me? [1] I don't have to do any fancy interferometry or statistical inference to be aware of the cat, so why do I have to mess about with statistics and interferometry to be aware of the wavefunction, given I myself am described by one?
So "you" is a quantum computer or something behaving like one, right? For you to "be aware of" a wavefunction, you'd have to causally interact with it. And that's very hard because of e.g. the no-cloning theorem; all you can do is entangle a qubit in your head with the qubit you're trying to measure, but what does that actually get you? What does that subjectively feel like? What operation would you expect to be able to perform that you can't?
I've seen a couple new sites out here in the bay try paying "beta testers" $5 or so each to contribute a little content. I remember doing that with BlockAvenue (they advertised via Craigslist SFBay).
Color me impressed. I do wonder if there's any way to reliably influence the pattern of the mutations. On the surface, with the differences in thread-scheduling patterns, I'd say it seems like a meaningful improvement on the randomness of the system PRNG.
If you have a sufficiently well-developed model of thread-scheduling patterns, then the randomness reduces in quality to that of the system's PRNG. But thread-scheduling is a serious pain, so this seems promising.
I'm pretty sure you can reduce the entropy of the output by saturating the processor or memory. If the various accesses are being spread out more because other processes are executing then you're going to have substantially less entropy. Sounds like bait for a DoS attack... hit the server really hard and all of a sudden the RNG is producing bad random numbers.
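As a crude way to eyeball that kind of degradation, you can estimate the empirical Shannon entropy of a timing-jitter source under different loads (a toy Python sketch - the 4-bit mask and sample count are arbitrary choices of mine, and this is nothing like a proper randomness test suite):

```python
import time
from collections import Counter
from math import log2

def shannon_entropy(samples):
    """Empirical Shannon entropy of a sample sequence, in bits per sample."""
    n = len(samples)
    return -sum(c / n * log2(c / n) for c in Counter(samples).values())

# Low 4 bits of successive high-resolution timer reads, as a stand-in
# for a scheduling/timing jitter source. Run once on an idle machine
# and once while hammering the box, and compare the two figures.
jitter = [time.perf_counter_ns() & 0xF for _ in range(10_000)]
print(f"{shannon_entropy(jitter):.2f} bits/sample (4.00 max)")
```

If saturating the machine pushes that figure down, that's at least consistent with the DoS worry, though a real attack analysis would need min-entropy estimates, not Shannon entropy.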
I doubt it's more efficient than something like Mersenne Twister. Generating entropy this way is pretty damn expensive in terms of processor cycles.
Some of this (especially the new post indicators) can be provided by a Chrom{e,ium} extension I've been using called Hacker News Enhancement Suite [0].
I just installed this. How do I know when a comment is new? I just looked at a few threads I hadn't opened before, and I was expecting all of them to have the "new" UI element, but they all looked normal (other than the new UI that the plugin brings to all of HN).
edit: and as I post this, my own post is marked as new. For those interested, new posts seem to be denoted by a border on the left.
The key is that it's just like normal web apps, except they talk to each other easily, despite possibly belonging to different suites, and don't put your data at as much risk by operating on an external server.
I don't know if basing things on source is necessarily a negative. Perhaps better binary distribution mechanisms would be worthwhile, but maintaining the source-basis is a good way to ensure we're less susceptible to falling into a pit of proprietariness.
They're right that people don't like creativity or dissent. There was a good post on Less Wrong about this [1], and about how the rejection of creativity - the rejection of what is, to you, blindingly obvious - is key in the break between the creative person, the independent thinker, and their trust in society's expectations and sanity [2]. Seeing the things that the established process missed requires valuing finding a better idea more than worrying about how many people you might piss off along the way. It may take several ideas to get there, but you have to learn to reward your brain for producing ideas, and you can't expect society to provide that reward. You have to produce it all on your own, until you find the right idea and succeed with it. Then everyone shuts up for a moment, before talking about how they knew you'd figure it out all along, and how they were starting to think in that direction just the other day.
This is where the Less Wrong post on Occam's Razor [1] comes in handy. It helps make clear how to respond to someone who argues that backward perspective.