Just to elaborate on this: the CPU performance of the T420 should not be a concern. The T420 (and T430) use 35-watt processors, whereas modern ultrabooks use 15-watt processors. A T420 from 2012 can handily beat today's base-model MacBook Air in CPU performance despite the technology upgrades of the past 5 years; it will just produce more heat doing it. And that's before the possibility of a quad-core upgrade.
The good thing about going to the T430 is that it has the later Intel Ivy Bridge processors (as opposed to Sandy Bridge). The Ivy Bridge and newer chipsets brought USB 3.0 with them; even the i5 version of the T430 has two USB 3.0 ports.
The NES's PPU generates sync pulses on the video line: a short one at the end of each line to make the electron beam return to the left side, and a long one at the end of each frame to return the beam to the top-left. Circuitry in the TV uses these pulses to keep the picture stable.
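If it helps to see the shape of it, here's a rough Python sketch of that signal structure. The numbers are approximate NTSC values for illustration, not NES-specific ones:

    # Rough sketch of NTSC-style sync timing (approximate values).
    LINE_US = 63.5          # one scanline lasts ~63.5 microseconds
    HSYNC_US = 4.7          # short pulse ending each line; beam returns left
    LINES_PER_FIELD = 262   # one field; the last few line-times carry vsync

    def scanline():
        # One line: active picture, then a sync pulse dropping below black level.
        return [("video", LINE_US - HSYNC_US), ("sync", HSYNC_US)]

    def field():
        # One field: visible lines, then a long vertical sync interval. The TV
        # tells vsync apart from hsync purely by its much longer duration.
        lines = [scanline() for _ in range(LINES_PER_FIELD - 3)]
        lines.append([("sync", 3 * LINE_US)])
        return lines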
This story does not take into account the recent Supreme Court ruling in Riley v. California, which held that the search-incident-to-arrest exception does not apply to the information carried in cell phones. Even if your cell phone is on your person and unlocked when you are arrested, police need a search warrant to look at the data it contains.
I'd recommend that anyone with an interest in civil rights read Supreme Court decisions - they tend to be written in a much more understandable style than you'd expect.
The case you cite, "the SR server was located by myself and another member...", is not ambiguous at all. For "by myself" to take on the "on my own" meaning, the sentence would need "I" as its subject:
"I located the SR server by myself.", not
"The SR server was located by myself", which I don't think anyone, even those using this objectionable form, would produce.
I understand and sympathize with your distaste for the expression: it goes against previous rules of formal written English, and it signals the style adopted by people speaking for organizations attempting to convey power and control, which goes against the anti-authoritarian ethos of HN. But these sorts of judgments are usually motivated by deeply-held feelings and then rationalized as pleas for clarity and precision in language.
Not only are Finnish and Hungarian related, but they are (as far as historical linguistics can tell) completely unrelated to the Indo-European languages spoken across the rest of Europe. Going by family resemblance, you'd expect English speakers to have an easier time learning Persian or Sanskrit than Finnish.
The thing to note about this card is that it's a dual-GPU card: two chips with some sort of intra-card PCIe bridge between them. You could get equivalent gaming performance by buying two GTX 780 Tis for about $1,500. That level of performance is roughly what you want if you want to game at 4K resolution at 60 FPS, so there are gaming applications for this card, even if it falls into "insanely high-end". The niche for this sort of card is in mini-ITX and similarly small computers, where there isn't enough room for a pair of cards and the builder wants "throwing money away" levels of performance.
That said, the selling point of the Titan cards is that their GPUs don't have the same restrictions on double-precision (FP64) compute performance as the standard gamer cards. NVIDIA locks this down on its GeForce cards in order to protect its lucrative GPGPU business, so this is really more of an entry-level card for scientific computing and other compute applications.
Cryptocurrency mining would be an obvious application, but a quirk of NVIDIA's and AMD's differing architectures means that AMD cards are vastly faster at the specific integer operations needed to mine cryptocoins.
While a few years ago this card would have been great at mining cryptocoins, nowadays no GPU is ever going to be worth it, as ASICs are 2-3 orders of magnitude faster and more efficient.
A distro with enough options to satisfy the creators of every fork would be an unholy mess. 1,000 separate contributors are very good at adding every option under the sun to a piece of software. But reducing choices to the point where you can just burn a CD and "install Debian" is anti-parallelizable.
I think 10 different distributions are fine; 1,000 are not. Fedora is good for GNOME, Ubuntu is good for Unity, openSUSE is good for KDE, Arch Linux is good for minimalism, etc. All these distributions are good at something radically different, which is fine. But "I want to use a different theme, let's create a new distribution" or "some driver didn't work on my system, so I am going to create a new distribution" is hardly a valid reason.
For me, freedom means two things: (1) being able to contribute to the upstream project, and (2) being able to write free software.
The comparison is hardly disingenuous: the i5 may not carry Intel's highest branding designation, but it is an enthusiast processor and only a slight step down from the top-of-the-line i7-4770K, lacking only Hyper-Threading (and a bit of L3 cache).
And this is completely irrelevant, since the i5-4670K ships with Intel's highest integrated graphics option for socketed desktop chips, which is what is being compared to the A10-7850K.
At the moment AMD's processors can't compete with Intel at the high end. It makes no sense to berate a company for not doing what it can't.
Also, AMD has chosen to fab Kaveri in a bulk process that trades off clock frequency for density (more chips per wafer, so cheaper).
The HSA stuff could really be something exciting if they can get the software and OS support. Winning both latest-generation consoles is bound to get some clever people to spend cycles on it.
I actually suspect that K-series CPUs are just regular CPUs with a non-working IOMMU, so instead of throwing them out, Intel unlocks the multiplier and sells them as unlocked CPUs.
About 10 hours of crawling through forums and VMware documentation, and a couple of conversations with people about ESXi. So all of it pretty unreliable ;)
Enthusiasts aren't generally using the onboard HD 4600 graphics -- a CPU's power budget means its integrated graphics remain seriously underpowered compared to standalone GPUs that draw hundreds of watts. But for those who do want to maximize integrated graphics with Intel, the highest option for desktop chips is the Iris Pro 5200 (on the i7-4770R, for instance), which is some 60%+ faster than the 4600.
Nonetheless, these are usually targeted at business PCs and the like (hence the "9 out of 10": businesses consume the overwhelming majority of PCs).
This article reads like a really bad press release. For instance:
"The new chips show that AMD is moving in a very different direction from Intel"
Fair point. Here's a more interesting benchmark [1]:
Thus we’ve discovered and confirmed Kaveri’s biggest advantage over Richland, performance per watt. At the high-end Kaveri doesn’t have a lot to offer non-gamers but once you bring TDPs down into standard small form factor or laptop ranges the performance profile of AMD’s newest chip is a lot more competitive. At the present time Kaveri’s performance appears to be a little behind, but still near what we’ve seen from Intel’s ~45 Watt Iris Pro or GT3e graphics solution.
or [2]
It is interesting to note that at the lower resolutions the Iris Pro wins on most benchmarks, but when the resolution and complexity is turned up, especially in Sleeping Dogs, the Kaveri APUs are in the lead.
It seems that Kaveri might not beat Intel on the desktop, but might do so in laptops.
The 45-watt version of Kaveri isn't even out yet and is nothing more than a paper launch. The 15-watt laptop version hasn't even paper-launched yet. I really don't think this is going to provide any advantage in the laptop space.
One nitpick: it is not really possible to get Iris Pro on the desktop. The 4770R (all the R chips, actually) is FCBGA and not sold at retail.
Maybe the big brand guys sell some desktops with Iris Pro, and I know Gigabyte has it in one of their NUC alternatives, but otherwise 'enthusiasts' can't get their hands on one.
"it is not possible to really get Iris Pro on the desktop"
Current iMacs come with Iris Pro, and of course, as you mentioned, there are integrated products with it. While you can't buy the chip at retail, you can certainly get Intel-equipped desktops with it, which was the point I was discussing.
Which is certainly by design on Intel's part, based on an understanding of their market: they put higher-performance graphics in their mobile and FCBGA chips because those are the markets where it is actually likely to be demanded -- from companies like Apple, or in mobile, where it is the primary graphics. When they sell a chip at retail, it is overwhelmingly likely the buyer is going to couple it with a stand-alone graphics card, so there really isn't much of a point.
Which is the issue that AMD is going to come up against. They are selling something as an enthusiast chip while providing graphics capabilities that lie in that no-man's land: overpowered for a standard business desktop, but underpowered for the market that is likely paying attention.
I'm sorry that I completely ignored Apple. Although I am not sure I would consider someone buying an iMac (or any brand name prebuilt desktop system) to be an 'enthusiast.'
* It does give pretty decent performance (notably, it outperforms the new Mac Pro on some workloads)
* It looks attractive
* A prebuilt OS X system means less futzing around with drivers, etc.
I don't play games (beyond the occasional Minecraft session with my son) and I'm not particularly price-sensitive. I've built many of my own computers, going back to a 386DX40, and I'm happy to do it again if I see a good reason. But at the moment I don't.
Desktop computers that seem attractive to me at the moment:
This came off as much more judgmental than I intended, especially towards Apple which I respect as a company and whose products I admire from a design and integration perspective. I was also not trying to belittle Apple fans or customers of any of the other big name manufacturers.
I also really like OS X. As a FreeBSD user for many years, seeing OS X be successful is even a little gratifying, because I know there's a lot of cross-pollination going on behind the scenes. I'm not a mobile/laptop kind of guy, but I did use an MBP for a couple of years, and it was without question the nicest laptop I've ever used. If I were to buy a laptop today, it would probably be a MacBook Pro.
I've been building computers from parts for 30 years. I enjoy the research, part selection and construction aspect of the process. I like that I can go into the process with a specific set of criteria and come out with something that satisfies them exactly or, barring that, that I'm in control of the compromises. I like that if these criteria change or I find I made a mistake (more likely), I can just swap out a part and continue on. This is possible with most of the name-brand PC desktops, less so with the Apple products, but I like building it all myself the most.
Also, as a FOSS user, I find hardware support is often an issue. Sometimes it feels like various industries either do not care about me as a user or actively want me to suffer; building a modern PC that doesn't have support issues is a challenge that brings a small amount of satisfaction when overcome. I understand if people think this is silly.
So, I consider myself an enthusiast. Given this explanation, hopefully my original comment makes more sense.
You're forgetting about portable (notebook, tablet, etc.), where most personal computers are sold.
One reason I even consider buying a desktop anymore is that discrete-GPU notebooks tend to be enormous and/or awful. If AMD can produce an APU with reasonable graphics performance, I'll gladly buy a notebook with that in it rather than a new desktop rig.
I just did this. Unless you're targeting something really high-end with a lot of 4K and gaming, the 4600 is more than enough to handle typical HTPC duties.
I'd certainly look at Kaveri instead of Haswell for an HTPC if I were doing it all over again, though.
I've gone integrated on my last two HTPC builds. My current one is going on 5 years old now, and I have been considering a jump. Ironically, it's by far the slowest computer in my apartment, and the one I use the most.
Something like this really appeals to me. I'd been keeping an eye on the F2 line and will probably go that way, or maybe a NUC, for my next HTPC; it just feels like the NUC options are a little underpowered.
The Gigabyte BRIX has an Iris Pro-equipped model if that's something you want.
In my case I wanted an in-box optical drive and TV tuner, which eliminates the NUC stuff. There are a handful of right-sized 'HTPC' mini-ITX cases that enable this, and a whole bunch of nice mini-ITX boards now.
In general: Super Mario World is being played back on a Super Nintendo emulator using prerecorded inputs (a file exists that says which buttons should be held down on each frame). But these inputs aren't a recording of someone actually playing; the button presses were constructed frame-by-frame, very carefully, to produce these specific effects. Theoretically, if you could manipulate a Super Nintendo controller with perfect precision 60 times per second, you could reproduce this.
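To make that concrete, such an input file boils down to something like this (a hypothetical minimal format made up for illustration; real movie formats carry more metadata):

    # One button bitmask per frame; these bit assignments are invented.
    B, Y, SELECT, START = 1 << 0, 1 << 1, 1 << 2, 1 << 3
    UP, DOWN, LEFT, RIGHT = 1 << 4, 1 << 5, 1 << 6, 1 << 7

    movie = [
        RIGHT,        # frame 0: hold Right
        RIGHT | B,    # frame 1: hold Right and B
        RIGHT | B,    # frame 2: keep holding
        0,            # frame 3: release everything
        # ...one entry per frame, authored by hand rather than recorded
    ]

    def poll_controller(frame_number):
        # What the emulator feeds the game instead of a real controller.
        return movie[frame_number] if frame_number < len(movie) else 0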
Specifically, some objects in-game have pointers to code associated with them ("what to do if this block gets hit by a turtle shell", that sort of thing). By coincidence, the P-switch has one of these pointers set to a very special value: it points to the memory region where button presses are mirrored. This pointer is never supposed to be followed, but by manipulating a bunch of objects very carefully, the authors can glitch the game into jumping to that memory address. Once execution is there, they can write a bootloader by making sure the button inputs on each frame correspond to the correct opcodes, letting them execute arbitrary code fed in through the controller ports.
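Here's a toy model of that idea; every address and name below is invented for illustration, and the real exploit is far more involved:

    # Memory holds both object "behavior" pointers and the region where
    # controller inputs are mirrored each frame.
    memory = bytearray(0x10000)

    PAD_MIRROR = 0x0A42             # invented: where button states land each frame
    P_SWITCH_BEHAVIOR = PAD_MIRROR  # the coincidence: the pointer aims at the inputs

    def execute_from(addr):
        # From here, whatever bytes sit at addr run as 65816 opcodes. Those
        # bytes ARE the controller inputs, so each frame's button presses are
        # literally machine code chosen by the TAS authors.
        pass

    def glitched_interaction():
        # The careful object setup tricks the game into the equivalent of
        # JMP (pointer), landing execution in the input mirror.
        execute_from(P_SWITCH_BEHAVIOR)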
I wasn't involved in the production of this TAS, so I'm not an expert, but that's my understanding of what's going on.
Imagine the host doesn't know where the car is. If you guess right, he opens a door with a goat behind it and you find yourself in the classic situation.
But if you guess wrong, there's a 50% chance that the host reveals a car by accident.
So the probabilities look like:
33%: you guess right, the host reveals a goat
33%: you guess wrong, the host reveals a goat
33%: you guess wrong, the host reveals a car
So, of the situations where the host reveals a goat, the car will be behind your door 50% of the time.
If the host knows where the car is and deliberately avoids revealing it, the probabilities look like this:
33%: you were right, the host reveals a goat
33%: you were wrong, the host reveals a goat behind door B
33%: you were wrong, the host reveals a goat behind door C
So in the situations where the host knows about and avoids the car, 66% of them have the car behind the other door.
In the case where the host doesn't know where the car is, picking the car on your first choice guarantees that the host reveals a goat, while a wrong first pick gives him only a 50% chance of doing so. That's the difference.
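If the arithmetic doesn't convince you, a quick throwaway simulation of both hosts shows the same split:

    import random

    def trial(host_knows):
        doors = [0, 1, 2]
        car = random.choice(doors)
        pick = random.choice(doors)
        if host_knows:
            # The host always opens a door that is neither your pick nor the car.
            opened = random.choice([d for d in doors if d != pick and d != car])
        else:
            # An ignorant host opens any door other than your pick, and may
            # accidentally reveal the car; those rounds are discarded.
            opened = random.choice([d for d in doors if d != pick])
            if opened == car:
                return None
        switched = next(d for d in doors if d != pick and d != opened)
        return switched == car

    for knows in (True, False):
        rounds = [r for r in (trial(knows) for _ in range(100000)) if r is not None]
        print("host knows:", knows, "switching wins:", sum(rounds) / len(rounds))

    # Prints roughly 0.667 when the host knows and 0.500 when he doesn't.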