http://www.sweclockers.com/nyhet/164...slutet-av-aret
This is very exciting, once a few of the engineering samples get out we might get some rogue benchmarks.
Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450
Well, I'm taking this with a grain of salt, because Intel's rumors are always better than the production silicon.
But if the HD 5000 integrated graphics are really double what the HD 4000 is capable of, that would put both Nvidia and AMD in trouble for their low-end cards; performance at that level would be on par with a 7770 or a 650.
However, I think we will have more issues with Haswell overclocking. Ivy Bridge's problem is that the thermal wall is so steep it knocked water cooling out as a viable option, partly because the smaller die concentrates the heat. Dropping again to 14nm will only amplify this, meaning lower overall clocks on air and water, and we will need exotic cooling like phase change to reach the high overclocks we easily enjoy currently.
Also, one thing that annoys me is the lack of direction with die shrinks. They used to be about fitting more transistors on the same-sized chip, i.e. you could add more cores without needing a CPU the size of a dinner plate, but instead we have simply been making the chips smaller. With 14nm we could easily have 10, 12 or even 16 core chips in an LGA 1155 socket, and double that in an LGA 2011 socket; the actual silicon real estate of the 50-core Knights Ferry co-processor is roughly the same as a 10-core E7 Xeon's. Instead they are just focusing on power efficiency.
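The die-shrink arithmetic behind both points can be sketched as a quick back-of-envelope (ideal scaling only; the numbers are my own illustration, and real dies lose area to uncore, cache, and yield):

```python
# Ideal process scaling: feature area shrinks with the square of the
# node size. So shrinking the same design concentrates the heat, while
# keeping the same die area leaves room for proportionally more cores.
def shrink_factor(old_nm: float, new_nm: float) -> float:
    """Ideal density gain going from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

gain = shrink_factor(22, 14)
print(f"22nm -> 14nm ideal density gain: ~{gain:.1f}x")            # ~2.5x
print(f"heat concentration if the die shrinks: ~{gain:.1f}x")
print(f"cores in a 4-core Ivy Bridge die area: ~{int(4 * gain)}")  # ~9
```

That ~2.5x ideal gain is roughly where the "10 or 12 cores in the same socket" figure comes from, and equally why the same wattage in a smaller die runs hotter per mm².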
Well, to be fair, we're starting to reach a point where MOAR POWER isn't as big a priority as power efficiency. With a 3570K at 4.2GHz and a 7970 I can run any game at max settings and still have a good 50-60% of my CPU headroom free. I can't imagine how much would go unused on your insane computer.
This might not hold true once we start getting 4K monitors in a few years, but as of right now we're pretty much fine.
This obviously only applies to games, not hardcore video rendering, 3D modelling, code compiling, etc.
You're right about the cooling issue though.
I don't suppose it would be possible to spread the cores around the CPU, would it? Surely the heat generation wouldn't be such an issue if it were spread out over a larger area?
http://eu.battle.net/wow/en/characte...rning/advanced
i5-3570K @ 4.4GHz - R9 280X @ 1150MHz on stock voltage - 8GB of DDR3 RAM @ 1866MHz
It's too bad Intel's drivers are still terrible. That in itself is puzzling, considering the good people they have writing their compilers. They have the resources to do a better job, but they don't.
No need to worry about 14nm until Broadwell.
No one needs 16 cores or 50 cores for their desktop; no one even has software to use that. (And by "no one" I mean the majority of users.) The focus on power efficiency is a response to the market, where the majority of systems being sold are laptops.
So how powerful would said integrated graphics be?
If they pushed out more cores on their CPUs, the software market would sooner or later have to adapt. More cores doesn't mean less efficiency either; you can just put some cores to sleep when they aren't needed.
---------- Post added 2013-01-19 at 02:26 AM ----------
http://www.anandtech.com/show/6600/i...eforce-gt-650m
This is pretty much the only info available.
Quad-core technology is still a long way off from being fully utilized, and we have had quad-core processors on the market since 2005 or 2006. It's getting close to a decade, and still too much software is designed around single threading. Why bother increasing the core count until there is demand for it? It was a wise move at the time, but these days there isn't a huge need for 6, 8, 12, or 16 core processors unless you are running servers.
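The single-threading point is exactly what Amdahl's law predicts: if only a fraction p of a program can run in parallel, n cores give a speedup of 1 / ((1 - p) + p/n). A quick sketch with illustrative numbers (not a benchmark):

```python
# Amdahl's law: why more cores stall when most code is single-threaded.
def speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup on `cores` cores if `parallel_fraction`
    of the work parallelizes perfectly."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# A program that is only 50% parallel barely benefits past 4 cores:
for n in (2, 4, 8, 16):
    print(n, round(speedup(0.5, n), 2))   # 1.33, 1.6, 1.78, 1.88
```

Even with infinite cores, 50%-parallel code tops out at 2x, which is why extra cores sit idle for most desktop software.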
Some more info: http://www.digitimes.com/news/a20130118PD210.html
This is due to the console market, though. Nothing is pushing the PC to evolve, so it doesn't have to. That, and Intel not having any competition: they can shrink and save power all they want. Right now Intel is at the point where no one is upgrading except enthusiasts. When even a gen-1 chip will do anything a casual user needs, why upgrade? People will keep their gen-1/gen-2 systems for YEARS before even thinking of upgrading.
We need a new Crysis to push the PC to new limits, but no one is willing to make one when consoles are the market. Cheaper, more subtle PCs are where the market is right now, with a lot of the focus on console-style gaming, so reducing power consumption is all Intel will really benefit from.
You read the link, right?
In gaming, perhaps, but imagine if you could have a 50-core CPU: you could run an entire Active Directory and VDI system off a single chip, or simulate whole LANs on a single Windows 8 desktop.
As for laptops, personally I would love more cores in my laptop; part of the reason I went with an Optimus configuration is so I could use the GTX 675 as a CUDA co-processor.
The thing is, parallel compute is the future until quantum computers become viable. It will take some time for the code to develop, especially with some of the best programmers making stupid crap for phones. We could have the world's first compute-based OS; instead we have Angry Birds.
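The parallel-compute point is easy to sketch with an embarrassingly parallel workload (hashing here, a stand-in I chose, not anything from the thread). A minimal Python example, assuming CPython, where hashlib releases the GIL on large buffers so the threads genuinely run on separate cores:

```python
import hashlib
import os
from concurrent.futures import ThreadPoolExecutor

def digest(chunk: bytes) -> str:
    # sha256 over a large buffer releases the GIL in CPython,
    # so these calls can run concurrently on separate cores.
    return hashlib.sha256(chunk).hexdigest()

# Eight 1 MB chunks of dummy data to spread across the cores.
chunks = [bytes([i]) * 1_000_000 for i in range(8)]

with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    digests = list(pool.map(digest, chunks))

print(len(digests))  # one digest per chunk
```

Work like this scales almost linearly with core count, which is the kind of code that would actually feed a 16- or 50-core desktop chip.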
No but worrying about cooling 14nm chips when talking about Haswell is relevant how?
Parallel computing has been "the future" for a decade. It's no more relevant to the mass market today than it was in 2003. The use cases you bring up are irrelevant to the overwhelming majority. Even fewer would need such power in a laptop, and if you did stuff such a chip in there, the computer would likely need to be plugged into the wall anyhow. It's ironic that you complain about the focus on efficiency when it's that efficiency that would make cramming more cores into a laptop possible.
Going by what the overwhelming majority thinks it needs is pointless; the overwhelming majority of computer users wouldn't know which end of a computer is up if it wasn't labeled. And even among gamers, the overwhelming majority needs to upgrade and quit using their crappy dual-core CPUs. The limitation on PC gaming is more their fault than the consoles': as much as developers don't want to spend time porting a console game, they also don't want to invest time into making a game that only 3% of PC gamers can run.
I have to agree with this. I know people who are still sitting on a Pentium 4 or Athlon XP and playing WoW every day.
Take a look at this and you will see why we don't get more demanding games.