  1. #1

    Intel has started Haswell production.

    http://www.sweclockers.com/nyhet/164...slutet-av-aret

    This is very exciting, once a few of the engineering samples get out we might get some rogue benchmarks.

  2. #2
    Cyanotical
    Well, taking this with a grain of salt, because Intel's rumors are always better than the actual production.

    But if the HD 5000 integrated graphics are really double what the HD 4000 is capable of, that would put both Nvidia and AMD in trouble on their low-end cards; performance at that level would be on par with a 7770 or a 650.

    However, I think we will have more issues with Haswell overclocking. Ivy has the problem of the thermal wall being so steep that it knocked watercooling out as a viable option, and part of that is due to the smaller die concentrating the heat. Dropping again to 14nm will only amplify this, meaning we will have lower overall clocks on air and water, and will need exotic cooling like phase change to reach the high overclocks we easily enjoy currently.

    Also, one thing that annoys me is the lack of direction with die shrinks. They used to be about fitting more transistors on the same-sized chip, i.e. you could add more cores without needing a CPU the size of a dinner plate, but instead we have simply been making chips smaller. With 14nm we could easily have 10, 12, or even 16 core chips in an LGA 1155 socket, and double that in a 2011 socket; the actual silicon real estate of the 50-core Knights Ferry co-processor is roughly the same as a 10-core E7 Xeon's. Instead they are just focusing on power efficiency.
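    To put rough numbers on the heat-concentration point (using the commonly cited die sizes and TDPs for the quad-core parts, so treat this as ballpark math rather than official figures):

        Sandy Bridge 4C (32nm): ~95 W over ~216 mm^2 ≈ 0.44 W/mm^2
        Ivy Bridge   4C (22nm): ~77 W over ~160 mm^2 ≈ 0.48 W/mm^2

    Total power went down with the shrink, but power density still went up, and the hotspots directly under the cores are denser still.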

  3. #3
    This name sucks
    Quote Originally Posted by Cyanotical View Post
    Well, taking this with a grain of salt, because Intel's rumors are always better than the actual production.

    But if the HD 5000 integrated graphics are really double what the HD 4000 is capable of, that would put both Nvidia and AMD in trouble on their low-end cards; performance at that level would be on par with a 7770 or a 650.

    However, I think we will have more issues with Haswell overclocking. Ivy has the problem of the thermal wall being so steep that it knocked watercooling out as a viable option, and part of that is due to the smaller die concentrating the heat. Dropping again to 14nm will only amplify this, meaning we will have lower overall clocks on air and water, and will need exotic cooling like phase change to reach the high overclocks we easily enjoy currently.

    Also, one thing that annoys me is the lack of direction with die shrinks. They used to be about fitting more transistors on the same-sized chip, i.e. you could add more cores without needing a CPU the size of a dinner plate, but instead we have simply been making chips smaller. With 14nm we could easily have 10, 12, or even 16 core chips in an LGA 1155 socket, and double that in a 2011 socket; the actual silicon real estate of the 50-core Knights Ferry co-processor is roughly the same as a 10-core E7 Xeon's. Instead they are just focusing on power efficiency.
    Well, to be fair, we're reaching the point where MOAR POWER isn't as big a priority as improving power efficiency. With a 3570K at 4.2GHz and a 7970 I can run any game at max settings and still have a good 50-60% of my CPU power to spare. I can't imagine how much sits idle on your insane computer.

    Now, this might not hold once we start getting 4K monitors in a few years, but as of right now we're pretty much fine.

    This obviously only applies to games, not hardcore video rendering, 3D modelling, code compiling, etc.

    You're right about the cooling issue, though.

  4. #4
    Butler Log

    I don't suppose it would be possible to spread the cores out across the CPU, would it? Surely heat generation wouldn't be such an issue if it were spread over a larger area?

  5. #5
    Quote Originally Posted by Butler Log View Post
    I don't suppose it would be possible to spread the cores out across the CPU, would it? Surely heat generation wouldn't be such an issue if it were spread over a larger area?
    Nope, AFAIK each pair of cores shares cache, so they need to be stuck together.
    Also, Ivy gets hot simply because of the cheap TIM they use under the lid; check out the folks who de-lidded their CPUs and dropped temperatures by 20°C or more in some cases, making them better than Sandy.
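    For anyone curious what is actually shared on their own chip, the Linux kernel exposes the cache topology through sysfs; here is a minimal sketch that reads it (Linux-only, using the standard sysfs layout, inspecting cpu0 only):

        // Print which CPUs share each cache level with cpu0 (Linux-only).
        #include <fstream>
        #include <iostream>
        #include <string>

        // Read the first line of a sysfs file; empty if it doesn't exist.
        static std::string read_line(const std::string& path) {
            std::ifstream f(path);
            std::string s;
            std::getline(f, s);
            return s;
        }

        int main() {
            const std::string base = "/sys/devices/system/cpu/cpu0/cache/index";
            for (int i = 0;; ++i) {
                const std::string dir = base + std::to_string(i) + "/";
                const std::string level = read_line(dir + "level");
                if (level.empty()) break;  // no more cache indices
                std::cout << "L" << level << " (" << read_line(dir + "type") << ")"
                          << " shared by CPUs: " << read_line(dir + "shared_cpu_list")
                          << "\n";
            }
        }

    On an Ivy Bridge part this typically shows private L1/L2 per core and one L3 shared by every core, rather than strict per-pair sharing.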

  6. #6
    Cows For Life

    Quote Originally Posted by Cyanotical View Post
    Well, taking this with a grain of salt, because Intel's rumors are always better than the actual production.

    But if the HD 5000 integrated graphics are really double what the HD 4000 is capable of, that would put both Nvidia and AMD in trouble on their low-end cards; performance at that level would be on par with a 7770 or a 650.
    It's too bad Intel's drivers are still terrible. That in itself is puzzling, considering the good people they have writing their compilers. They have the resources to do a better job, but they don't.

    Quote Originally Posted by Cyanotical View Post
    However, I think we will have more issues with Haswell overclocking. Ivy has the problem of the thermal wall being so steep that it knocked watercooling out as a viable option, and part of that is due to the smaller die concentrating the heat. Dropping again to 14nm will only amplify this, meaning we will have lower overall clocks on air and water, and will need exotic cooling like phase change to reach the high overclocks we easily enjoy currently.
    No need to worry about 14nm until Broadwell.

    Quote Originally Posted by Cyanotical View Post
    Also, one thing that annoys me is the lack of direction with die shrinks. They used to be about fitting more transistors on the same-sized chip, i.e. you could add more cores without needing a CPU the size of a dinner plate, but instead we have simply been making chips smaller. With 14nm we could easily have 10, 12, or even 16 core chips in an LGA 1155 socket, and double that in a 2011 socket; the actual silicon real estate of the 50-core Knights Ferry co-processor is roughly the same as a 10-core E7 Xeon's. Instead they are just focusing on power efficiency.
    No one needs 16 or 50 cores for their desktop, and no one even has software that can use that. (And by 'no one' I mean the majority of users.) The focus on power efficiency is a response to the market, where the majority of systems being sold are laptops.

  7. #7
    Belize
    So how powerful would said integrated graphics be?

  8. #8
    n0cturnal

    Quote Originally Posted by Cows For Life View Post
    No one needs 16 or 50 cores for their desktop, and no one even has software that can use that. (And by 'no one' I mean the majority of users.) The focus on power efficiency is a response to the market, where the majority of systems being sold are laptops.
    If they pushed out more cores on their CPUs, the software market would sooner or later have to adapt. More cores doesn't mean less efficiency, either; you can just put cores to sleep when they aren't needed.
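    A minimal sketch of the kind of adaptation meant here: instead of assuming a fixed core count, a program can ask at runtime how many hardware threads exist and split the work accordingly, so the same binary uses 4 cores today and 16 on some future chip (the workload is a dummy summation, purely for illustration):

        // Query the hardware thread count at runtime and split a
        // summation across that many worker threads.
        #include <algorithm>
        #include <iostream>
        #include <numeric>
        #include <thread>
        #include <vector>

        int main() {
            const unsigned n = std::max(1u, std::thread::hardware_concurrency());
            std::vector<double> data(1 << 22, 1.0);  // dummy workload
            std::vector<double> partial(n, 0.0);     // one slot per worker

            std::vector<std::thread> workers;
            const std::size_t chunk = data.size() / n;
            for (unsigned t = 0; t < n; ++t) {
                workers.emplace_back([&, t] {
                    const std::size_t lo = t * chunk;
                    const std::size_t hi = (t + 1 == n) ? data.size() : lo + chunk;
                    partial[t] = std::accumulate(data.begin() + lo,
                                                 data.begin() + hi, 0.0);
                });
            }
            for (auto& w : workers) w.join();

            std::cout << "sum = "
                      << std::accumulate(partial.begin(), partial.end(), 0.0)
                      << " using " << n << " threads\n";
        }

    Idle cores cost nothing here; the OS simply parks whatever the program doesn't use.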

    ---------- Post added 2013-01-19 at 02:26 AM ----------

    Quote Originally Posted by Belize View Post
    So how powerful would said integrated graphics be?
    http://www.anandtech.com/show/6600/i...eforce-gt-650m

    This is pretty much the only info available.

  9. #9
    This name sucks
    Quote Originally Posted by Belize View Post
    So how powerful would said integrated graphics be?
    Approximately a 7770, but it's too early to tell for sure.

  10. #10
    Rennadrel
    Quote Originally Posted by Cyanotical View Post
    Well, taking this with a grain of salt, because Intel's rumors are always better than the actual production.

    But if the HD 5000 integrated graphics are really double what the HD 4000 is capable of, that would put both Nvidia and AMD in trouble on their low-end cards; performance at that level would be on par with a 7770 or a 650.

    However, I think we will have more issues with Haswell overclocking. Ivy has the problem of the thermal wall being so steep that it knocked watercooling out as a viable option, and part of that is due to the smaller die concentrating the heat. Dropping again to 14nm will only amplify this, meaning we will have lower overall clocks on air and water, and will need exotic cooling like phase change to reach the high overclocks we easily enjoy currently.

    Also, one thing that annoys me is the lack of direction with die shrinks. They used to be about fitting more transistors on the same-sized chip, i.e. you could add more cores without needing a CPU the size of a dinner plate, but instead we have simply been making chips smaller. With 14nm we could easily have 10, 12, or even 16 core chips in an LGA 1155 socket, and double that in a 2011 socket; the actual silicon real estate of the 50-core Knights Ferry co-processor is roughly the same as a 10-core E7 Xeon's. Instead they are just focusing on power efficiency.
    Quad-core technology is still a long way from being fully utilized, and we have had quad-core processors on the market since 2005 or 2006. It's getting close to a decade, and too much software is still designed around single threading. Why bother increasing the core count until there is demand for it? It was a wise move at the time, but these days there isn't a huge need for 6, 8, 12, or 16 core processors unless you are running servers.

  11. #11
    Belize
    Quote Originally Posted by n0cturnal View Post

    http://www.anandtech.com/show/6600/i...eforce-gt-650m

    This is pretty much the only info available.
    More or less a GTX 650M? Eh, that's a lot better than today's integrated graphics, that's for sure. You could easily play games on lower graphics settings with that. Although the demo game they used (Dirt 3) isn't exactly the... prettiest or most demanding game, lol.

  13. #13
    clampy (Deleted)
    Quote Originally Posted by Methanar View Post
    Well, to be fair, we're reaching the point where MOAR POWER isn't as big a priority as improving power efficiency. With a 3570K at 4.2GHz and a 7970 I can run any game at max settings and still have a good 50-60% of my CPU power to spare. I can't imagine how much sits idle on your insane computer.
    This is due to the console market, though. Nothing is pushing the PC to evolve, so it doesn't have to. That, and Intel not having any competition, so they can shrink and save power all they want. Right now Intel is at the point where no one is upgrading except enthusiasts. When even a gen 1 chip will do anything a casual user needs, why upgrade? People will keep their gen 1/2s for YEARS before even thinking of upgrading.

    We need a new Crysis to push the PC to new limits, but no one is willing to make one when consoles are the market. Cheaper, more modest PCs are what sell right now, with a lot of the focus on console-style gaming, so reducing power consumption is all Intel will really benefit from.

  14. #14
    This name sucks
    Quote Originally Posted by clampy View Post
    This is due to the console market, though. Nothing is pushing the PC to evolve, so it doesn't have to. That, and Intel not having any competition, so they can shrink and save power all they want. Right now Intel is at the point where no one is upgrading except enthusiasts. When even a gen 1 chip will do anything a casual user needs, why upgrade? People will keep their gen 1/2s for YEARS before even thinking of upgrading.

    We need a new Crysis to push the PC to new limits, but no one is willing to make one when consoles are the market. Cheaper, more modest PCs are what sell right now, with a lot of the focus on console-style gaming, so reducing power consumption is all Intel will really benefit from.
    This too. They've been suffocating the market since the PS2, when consoles suddenly became hyper-popular.

  15. #15
    Cyanotical
    Quote Originally Posted by Cows For Life View Post
    No need to worry about 14nm until Broadwell.
    You read the link, right?

    Quote Originally Posted by Cows For Life View Post

    No one needs 16 or 50 cores for their desktop, and no one even has software that can use that. (And by 'no one' I mean the majority of users.) The focus on power efficiency is a response to the market, where the majority of systems being sold are laptops.
    Quote Originally Posted by Rennadrel View Post
    Quad-core technology is still a long way from being fully utilized
    In gaming, perhaps. But imagine if you could have a 50-core CPU: you could run an entire Active Directory and VDI system off a single chip, or simulate whole LANs on a single Windows 8 desktop.

    As for laptops, personally I would love more cores in my laptop. Part of the reason I went with an Optimus configuration is so I could use the GTX 675 as a CUDA co-processor.

    The thing is, parallel compute is the future until quantum computers become viable. It will take some time for the code to develop, especially with some of the best programmers making stupid crap for phones. We could have the world's first compute-based OS; instead we have Angry Birds.

  16. #16
    Deleted
    Quote Originally Posted by Cyanotical View Post
    instead we have Angry Birds
    Quite amusing with your avatar, it sounds like you are the angry bird! (But I do agree.)

  17. #17
    Cows For Life

    Quote Originally Posted by Cyanotical View Post
    You read the link, right?
    No, but how is worrying about cooling 14nm chips relevant when we're talking about Haswell?

    Quote Originally Posted by Cyanotical View Post
    In gaming, perhaps. But imagine if you could have a 50-core CPU: you could run an entire Active Directory and VDI system off a single chip, or simulate whole LANs on a single Windows 8 desktop.

    As for laptops, personally I would love more cores in my laptop. Part of the reason I went with an Optimus configuration is so I could use the GTX 675 as a CUDA co-processor.

    The thing is, parallel compute is the future until quantum computers become viable. It will take some time for the code to develop, especially with some of the best programmers making stupid crap for phones. We could have the world's first compute-based OS; instead we have Angry Birds.
    Parallel computing has been "the future" for a decade; it's no more relevant to the mass market today than it was in 2003. The use cases you bring up are irrelevant to the overwhelming majority, and even fewer people would need such power in a laptop. If you did stuff such a chip in there, the computer would likely need to be plugged into the wall anyway. It's ironic that you complain about the focus on efficiency when it's exactly that efficiency that would make cramming more cores into a laptop possible.

  18. #18
    Cyanotical
    Quote Originally Posted by Cyanotical View Post
    You read the link, right?
    Quote Originally Posted by Cows For Life View Post
    No.

    Quote Originally Posted by Cows For Life View Post
    Parallel computing has been "the future" for a decade; it's no more relevant to the mass market today than it was in 2003. The use cases you bring up are irrelevant to the overwhelming majority, and even fewer people would need such power in a laptop. If you did stuff such a chip in there, the computer would likely need to be plugged into the wall anyway. It's ironic that you complain about the focus on efficiency when it's exactly that efficiency that would make cramming more cores into a laptop possible.
    Going by what the overwhelming majority thinks it needs is pointless; the overwhelming majority of computer users wouldn't know which end of a computer is up if it wasn't labeled. And even among gamers, the overwhelming majority needs to upgrade and quit using their crappy dual-core CPUs. The limitation on PC gaming is more their fault than the consoles'; as much as developers don't want to spend time porting a console game, they also don't want to invest time in making a game that only 3% of computer gamers can run.

  19. #19
    Quote Originally Posted by Cyanotical View Post
    Going by what the overwhelming majority thinks it needs is pointless; the overwhelming majority of computer users wouldn't know which end of a computer is up if it wasn't labeled. And even among gamers, the overwhelming majority needs to upgrade and quit using their crappy dual-core CPUs. The limitation on PC gaming is more their fault than the consoles'; as much as developers don't want to spend time porting a console game, they also don't want to invest time in making a game that only 3% of computer gamers can run.
    I have to agree with this. I know people who are still sitting on a Pentium 4 or an Athlon XP and play WoW every day.

    Take a look at this and you will see why we don't get more demanding games.

  20. #20
    DeltrusDisc
    i5-4xxx? i7-4xxx? NOOOOOOOOOOOOOOOOOO!!!

    I am annoyed. They need to figure something new out, holy crap.
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water
