Hacker News | new | past | comments | ask | show | jobs | submit | JMostert's comments

Which is why no other browser does anything with it, which is why it works so nicely.

The only drawback is that it doesn't validate. You can either wrap it in a conditional comment or incur the wrath of the standardistas and leave it be.


"If anything, cancer resistance (which would probably mean less cell mutation ability) would be disadvantageous from an evolution standpoint because you would have less variation and consequently may have trouble adapting."

Evolution doesn't work that way. It doesn't favor mutation because mutation is "good" for evolution. You have to look at it from individual genes, and there the story is just the opposite.

Evolution happens in spite of, not because of genetic predisposition to mutation, because it's vastly more likely for a mutation to be pointless or harmful than beneficial. If you have a gene that makes mutation in the genepool overall less likely, that gene will be more successful at spreading itself, because the mutants have, on the whole, far less chance of success than the originals (which have already proven their worth in a long chain of ancestor individuals carrying them). Of course every so often a mutant gets lucky and is more successful (or at least not less successful) than the original -- this is what selection is all about -- but statistics are against them.

To elaborate your scenario: suppose the environment changed very rapidly, and adaptation in some direction were necessary to survive it. If by some stroke of luck a gene that increased the mutation rate happened to be there at that moment, defying the odds, it would find a favorable environment, but only for as long as the rapid change lasts. Generalizing this: a successful genepool is one where the mutation rate is "just right" for adapting to the environment. Too much mutation and the genepool isn't stable and will eliminate itself, too little mutation and the genepool will be eliminated by the environment. In this balance, the odds are stacked in favor of avoiding mutation.

And all this is orthogonal to the discussion of how cancer (the uncontrolled reproduction of cells) is related to the capacity for mutations (infidelity in the copying process). Cancer resistance and mutation resistance are related, but by no means the same thing. To borrow a computer image: your computer has copy processes with very high fidelity (only the occasional alpha particle to spoil the fun) but this doesn't protect it against worms, which copy themselves unwantedly but with the same high fidelity. This analogy is badly flawed in obvious ways (regular programs are not designed to copy themselves, cells are), but I hope it gets the point across.


The stark irony of this is, of course, that very few games on the Commodore 64 used or required BASIC, and certainly none of the popular ones. BASIC games were the first generation, before people mastered assembly. Some games may make proficient use of internal BASIC routines, but most will bank the BASIC ROM out of sight and never look back. Without BASIC, you won't get that READY. prompt and you can't enter LOAD "*",8,1, but that hardly seems an obstacle.

The simple fact of the matter is that any emulator, or for that matter any programming environment that can be used to create one, will allow users to run "illegal copies of games" (though admittedly with slowdown that could be crippling). They should ban emulators altogether, and anything that even smells like custom code execution (subject to Apple's discretion, of course). I propose "applications may not supply a Turing-complete environment to users", though regrettably this may mean spreadsheets and anything using macros may have to get the axe as well...


Interestingly, neither the original image nor the manipulated one can be called racist. The trouble only comes from realizing we're looking at a before and after.

Microsoft generally goes to great lengths to ensure people are not offended. It has to, because of the sheer size and diversity of its markets. Raymond Chen has blogged about one apparently innocuous issue [http://blogs.msdn.com/oldnewthing/archive/2003/08/22/54679.a...], and there must have been many, many more across Microsoft products.

Marketing works better if you reflect your audience's tastes. Doubtlessly someone calculated that the other image better suited Polish tastes, for which a racially diverse cast only serves as a vivid reminder that the image was marketed for an American audience. This doesn't make either the Poles or the marketing racist. The only thing "racist" was the mistake to clumsily manipulate an existing image instead of using a brand new one. Penny wise, pound foolish.


Marketing works better if you reflect your audience's tastes. Doubtlessly someone calculated that the other image better suited Polish tastes, for which a racially diverse cast only serves as a vivid reminder that the image was marketed for an American audience.

Supplementary point of information: Poland's ethnic make-up diverges massively from that of the USA -- according to Wikipedia, the last census suggested there were a grand total of 4500-odd black people living in Poland (pop. 38 million). They have some Vietnamese, but they're still pretty rare; Poland's ethnic minorities are almost all eastern European. So the original ad would push "exotic foreign company" buttons, which is not what you want to do if you're trying to present yourself as a regular local business.

(On the other hand? "Mindlessly stupid" just about begins to sum up what they did with photoshop ...)


> Microsoft generally goes to great lengths to ensure people are not offended.

Except with Clippy. Ok, ok, I know... I couldn't help myself.


The perils of DRM were no different back then, except that the term hadn't been coined yet.

I remember disassembling and working around the tape loader for a game on my C64 just so I could copy it to disk. I had no intention of sharing it with anybody, I just wanted to get around the atrocious loading times.

Today I download cracks for the games I buy (no need to write them yourself anymore) so I can run them without needing to have the DVD inserted. The more things change, the more they stay the same.


Did you read Dijkstra's article? He's making the case for 0, though he's doing it a little abstractly. Simply put, if you start numbering at 1, you are setting yourself up for more boundary problems and off-by-one errors than if you start at 0 (and by extension, inclusive lower bounds and exclusive upper bounds). That's not to say that some algorithms are not in fact easier expressed by numbering things from 1, just that they're not the majority.

Oh, and obviously 0 is even. Why? Because 0 mod 2 = 0, 0 is evenly divided by 2, and that's what "even" means. If you need more intuition, though: 1 is indisputably odd, and even and odd numbers alternate, so 0 is even. The rest is philosophy -- you can probably find definitions for "odd" and "even" where 0 is a problem. That's fine, but those don't help. Ignore them. Mathematically there's no problem whatsoever.

The trickiness only comes if you insist on thinking in terms of "the first element", rather than "element number 0". Some people use "zeroth", but this seems to invite more confusion because it induces two meanings for all the other ordinal forms -- if there's a zeroth element, does that mean the "first" element is in fact the second element? Best to avoid ordinals altogether -- you usually don't need any more than "first" and "last" anyway.


> Did you read Dijkstra's article?

Yes, I did. Does it hurt to offer a different perspective? I feel it doesn't. Approaching a subject with an open mind or from a different angle tends to be good for discussion. Offering a contradicting opinion or observation helps everyone to understand an issue better.

> Oh, and obviously 0 is even.

I might counter this with a question similar to the one you asked me: "did you read my post?" I didn't make a case that 0 was odd, I asked how I should look at it: as element 0, or as the first element. If you look at it as element 0, the first element of the vector is an even element. If, on the other hand, you look at it as the first element, you will think it is an odd element. This is the case I made, not that 0 is even.


> Yes, I did. Does it hurt to offer a different perspective?

No, I was questioning whether you had considered his arguments at all, given the "why do we start counting at 0" question which he tried to answer.

> I didn't make a case that 0 was odd, I asked how I should look at it: as element 0, or as the first element.

You can look at it either way, as both are correct.

> If you look at it as element 0, the first element of the vector is an even element. If, on the other hand, you look at it as the first element, you will think it is an odd element.

Ah, I see your point now. To me this wouldn't be a question because everyone knows programmers start counting at 0 -- a machine instruction would operate on indexes, not ordinals. "The first element" is the element with index 0. It is not "element number one", or if you do want to see it that way, "number one" is the ordinal you use when you start counting on your fingers, which is something we deliberately ignore.


> No, I was questioning whether you had considered his arguments at all

Ah, yes. No, I wasn't trying to disprove him. I just gave a different approach to the issue. My mind is not made up on this subject. I've been a programmer for too long: 0 is burned in my fingers and my mind.

> a machine instruction would operate on indexes, not ordinals

The interesting aspect isn't the difference between indexes and ordinals, but the difference between indexes and cardinals. The cardinals match the ordinals for normal people. For normal people, indexes have no meaning.

But the difference between indexes and cardinals isn't that easy for programmers either. In an example presented by Dijkstra, a young programmer used 0 in everyday language as a cardinal replacing 1. Indexes have little meaning in real life, but cardinals do.

Good response, BTW. Thanks. It's interesting to link the concepts of ordinals and cardinals to the discussion.


> I didn't make a case that 0 was odd, I asked how I should look at it: as element 0, or as the first element.

element 0, aka the 0th element.


I often wonder if Dijkstra's arguments still hold sway, since algorithms written in modern languages differ greatly from those written when Dijkstra wrote this. Rather than writing hash functions or shortest-route algorithms, a majority of my algorithms are centered around list and/or dictionary manipulations. There are still many situations where it makes more sense to start indexing at 0, but I wonder if it is still the majority of cases.


Well, I think that there is a very important human reason to keep sticking to 0 as the start: convention. It has always been this way. Changing it would cause a whole lot of weird bugs.

To be honest, I think one reason why this grew is probably the way C handles array indexing. a[0] = *(a + 0) and this symmetry is a nice property to have in many cases. I think I use that symmetry in my MSc project a lot more than I use vec_mule/vec_mulo.

(Yeah, I'm not of a fixed mind in this matter :-))


> you usually don't need any more than "first" and "last" anyway.

Apart from FP people - they need "the first" and "the rest" ;)


http://en.wikipedia.org/wiki/Evenness_of_zero

Learning that such an article could exist (it DOES exist), and reading it, was a shocking experience for me.


Relevant quote from the article:

'A second-year was "quite convinced" that zero was odd, on the basis that "it is the first number you count"'


"obviously 0 is even"

Sounds like a bit of an odd statement to me.


I feel like some qualification is in order. I can't make any excuses for SOAP (maybe someone else wants to take that up), but COM, at its core, is indeed a simple and dare I even say elegant technology. COM is alive and well on Windows, because it still serves a basic need (cross-language binary interop) and it does so well.

The problem is that COM just couldn't stay simple -- the basic idea of querying an object for reference-counted interfaces was quickly buried under incomprehensible layers of pUnkOuter and IMoniker and apartment states and OLE activation sites and god knows what other stuff that you have to know about but that doesn't help you get work done.

I've seen people who can get carried away adding layers of stuff to something simple and beautiful without ever realizing that it stopped being simple and beautiful quite a while ago, because they still understand everything. Boiling frog design, if you will. I don't know Don Box well enough to tell if he's one of those people, but COM and related technologies seem to have seen a lot of that crowd.


It appears he was of that crowd:

http://news.ycombinator.com/item?id=335734


Good advice from the irrepressible Dijkstra on this matter: "Before embarking on an ambitious project, try to kill it." (http://www.cs.utexas.edu/users/EWD/ewd10xx/EWD1055A.PDF)

If you try honestly and still fail to kill it completely, whatever's left that you can't kill because it seems doable right now is one way to go.


Funny that he has this under "Signs that you shouldn't be a programmer": "you think the presence of code in a program will affect its runtime behavior, even if it is never invoked".

Anyone who has written code in C can probably cite an anecdote where exactly this was the case, be it from a compiler bug or as fallout from a memory corruption bug.

Maybe we should add "does not understand the von Neumann architecture" to the list of signs that you're a mediocre programmer... On second thought, let's not -- the piece is bad enough to begin with. The "alternative careers" sections are just odious, and the whole thing has an unconstructive air.


This happens in JVM languages as well. Most optimizing compilers will do things differently depending on what happens elsewhere in a function, even if it's "dead code" (and especially if it's dead code that the compiler is too stupid to realize is dead).

Granted, this usually only affects runtime speed, but in some circumstances things can get worse. Multithreaded programs are particularly susceptible to such optimizations, especially if they're not written correctly. Yes, the real problem may be that you've written your code wrong, but there are very real situations where the presence of dead code changes the optimization path taken enough that the program either works or doesn't, which is a very real effect.

And that's even before we take into account compiler bugs, which just make things worse.


It's quite possible there is none, and they're independent "inventions". There aren't that many four-letter words composed of hex digits (by my reckoning, the only common ones are BABE, BEAD, BEEF, CAFE, DADA, DEAD, DEAF, DEED, FACE, FADE and FEED), and stringing two together to get a magic number must be a popular pastime. In fact, Wikipedia has a list of them at http://en.wikipedia.org/wiki/Magic_number_%28programming%29 under "magic debug values", which demonstrates that 1337 can get you some more.


In the late 90s, I cracked a shareware program but didn't do it too well; it was a quick and dirty jump flip. The developer was testing for cracking attempts, and as soon as I left the debugger and stepped back into ring 3, I noticed my Windows Start button now read "0xDEADBEEF".


It takes some courage to admit that on a site populated by software entrepreneurs.

(I'm not condemning you -- but I do think that cracking shareware is the computer equivalent of meticulously planning how to steal coins from the collection plate at a slum church.)

