Hacker News

I'm actually expecting Nvidia and Apple to catch up to Intel on CPU performance before AMD does, and I think it will happen within a year or two, after they switch to 16nm FinFET. One reason this can happen is that Intel hasn't focused much on raw CPU performance since Sandy Bridge. They mainly focus on power consumption and GPU performance these days, which obviously means compromising on CPU performance.


You need to appreciate the economics to realize why it won't happen: a new chip foundry for these types of processors costs something like $5 billion, and that price has only gone up over the years (since we keep wanting to put more and more mass-produced nanotechnology into these things).

Every single time you change a process in any way, millions of dollars of equipment - minimum - is ripped out, retooled, and replaced. That's fine, because this industry is all about economies of scale, but it means Intel has a huge advantage: they can build more chips. They can convert several fabrication lines and simply get more chips out the door and onto the market than their competitors, which means they can afford a price drop that others can't - because competitors still need to pay for the upkeep, running costs, and loans taken out to build those fab plants in the first place.

Intel is focusing on power and GPU because that's where the gains are to be had and what the market needs, and because they have to - current-gen high-end CPUs have a higher thermal output density than a stove hotplate. Power use had to drop to have any hope of scaling performance in the future, and anyone hoping to compete faces exactly the same problem. And since new battery technology isn't happening, mobile has to find power savings on the demand side.


TSMC, which is where Nvidia and others make their chips, already has a 16nm factory running.


You probably won't see anything on their 16nm node until some time in mid-to-late 2015, and it should be pointed out that it is not a real node shrink but a half-step: they use the same 20nm process but introduce FinFET transistors. While this gets them improved energy efficiency, it does not deliver the other benefit of a node shrink (increased transistor density), which means fewer chips per wafer than a true shrink would yield, keeping the cost per chip high.
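A back-of-envelope calculation shows why the density point matters. The die areas below are made-up round numbers, and the formula is just a common gross-die approximation (wafer area over die area, minus an edge-loss term), but it illustrates the mechanism: a FinFET-on-20nm "16nm" chip keeps the same die area and so the same die count per wafer, while a true shrink roughly doubles it.

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross-die estimate: wafer area / die area, minus an
    edge-loss correction for partial dies at the wafer rim."""
    radius = wafer_diameter_mm / 2
    die_side = math.sqrt(die_area_mm2)
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / (math.sqrt(2) * die_side))

# Hypothetical 100 mm^2 die: "16nm" FinFET on the 20nm process keeps
# the same die area, so the same number of dies per wafer as 20nm.
print(gross_dies_per_wafer(100))  # -> 640

# A true node shrink that halved die area would roughly double output
# from the same wafer, which is where the cost-per-chip drop comes from.
print(gross_dies_per_wafer(50))   # -> 1319
```

Since wafer cost is roughly fixed per wafer processed, cost per chip only falls when the die count per wafer rises, which is exactly what a FinFET-only transition fails to provide.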


If they don't shift that focus, in 5-7 years they'll find themselves kicking ass at the top of the scale in a market no one can afford. Buying 20 of their competitors' chips would be cheaper, just as fast, and save a boatload of cash in power, cooling, and equipment design compromises.

No one will beat Intel at their own game; competitors need to keep them moving sideways.


I really doubt this, to be honest, as everything I have seen indicates that Intel's foundries are only pulling further ahead of the competition.


Impossible - their CPUs today are about 3-4 times slower than Haswell. That's too big a gap to close in two or even three years.



