  1. #81
    Wikipedia says the first iGPUs on Intel processors were on the Clarkdale CPUs (Celeron G1xxx, Pentium G6xxx, i3 5xx, i5 6xx product lines). These were released in 2010. Before this point, the integrated graphics were on the motherboard (Northbridge) chipset (Intel Extreme Graphics, Intel Graphics Media Accelerator).

  2. #82
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by Butler Log View Post
    Wikipedia says the first iGPUs on Intel processors were on the Clarkdale CPUs (Celeron G1xxx, Pentium G6xxx, i3 5xx, i5 6xx product lines). These were released in 2010. Before this point, the integrated graphics were on the motherboard (Northbridge) chipset (Intel Extreme Graphics, Intel Graphics Media Accelerator).
    yay, my memory served me correctly

  3. #83
    Quote Originally Posted by inux94 View Post
    You can't compare clock speeds between 2 different architectures.

    4.5 GHz on a 3570k is a very respectable overclock, just like the 5 GHz overclock on the 8350.
    Over at OCN, the argument was that a 'respectable' OC with a top-end AIO like the H100 would be 4.8-5.0 GHz for the 8350 and 4.6-4.7 GHz for the 3570K. They argue that TekSyndicate's rationale of going +1 GHz on both CPUs is unfair to the Intel CPU, since it has some OC headroom remaining whereas the 8350 does not.
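
    To put numbers on the +1 GHz argument, here's a rough sketch. The clocks are the approximate stock and overclocked figures discussed in this thread, not measured values; note that an equal absolute bump is actually a slightly larger *relative* boost for the i5, which is why the OCN argument centres on remaining headroom rather than percentages.

    ```python
    # Relative boost from an equal +1 GHz overclock on each chip.
    # Base clocks are the approximate figures discussed in this thread.
    stock = {"FX-8350": 4.0, "i5-3570K": 3.5}  # GHz, approximate

    for chip, base in stock.items():
        oc = base + 1.0
        gain = (oc / base - 1) * 100
        print(f"{chip}: {base:.1f} -> {oc:.1f} GHz (+{gain:.1f}%)")
    ```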

    There are more problems with their review besides the overclocked speeds. Most notably, they're comparing CPU performance in games that aren't really CPU-limited. They definitely should have done SC2, and maybe WoW or GW2 as well.

    Moreover, they use an unoptimized game and state that it should be fair to both AMD and Intel due to its unoptimized nature, which doesn't make sense at all for several reasons: the game could have been developed on an AMD CPU; optimization may not have taken advantage of certain features of Intel CPUs; and if Intel spends more money developing 'optimization' tools for their CPUs, that's a legitimate benefit for Intel (assuming there's no anti-competitive behavior).
    Last edited by yurano; 2013-01-26 at 11:20 PM.

  4. #84
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,102
    I'd like to see what would happen if both CPUs were just OC'd to their limits, if that means 5GHz each, then hey. Sounds pretty fair to me.

    Frankly though, I think he should have also done the tests without the overclocks, as, if I understand correctly, he didn't do that.
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water

  5. #85
    Brewmaster Majesticii's Avatar
    Quote Originally Posted by DeltrusDisc View Post
    I'd like to see what would happen if both CPUs were just OC'd to their limits, if that means 5GHz each, then hey. Sounds pretty fair to me.

    Frankly though, I think he should have also done the tests without the overclocks, as if I am to understand correctly, he didn't do that.
    He did, they compared non-OC to OC. But even then the 500 MHz difference remains, on top of the fact that it has more "cores". That makes this a very apples-to-oranges review. What yurano said, and what I tested myself: this cannot be decided on these simple merits, because each has its perks. Intel has the higher per-core performance, and AMD has the extra threads. They just stressed the GPU beyond all reason and chose games that were multithreaded to give the AMD an edge. I could make an equivalent review where the Intel comes out on top every time.
    Therefore, either stupid or biased.
    Last edited by Majesticii; 2013-01-26 at 11:32 PM.

  6. #86
    They're testing the 8350's performance vs the Intel chips while STREAMING. It's said so many times in the video. The 8350 is better for streaming, but the i5s will pwn it when you're not streaming.

  7. #87
    Regardless of overclocking headroom or the overclock over stock ratios, we all should still take one thing from this:

    AMD isn't the absolute trash that fanboys make them out to be, and their current lineup is fully capable of maxing anything on the market.

    That's all that matters IMHO.
    i7-4770k - GTX 780 Ti - 16GB DDR3 Ripjaws - (2) HyperX 120s / Vertex 3 120
    ASRock Extreme3 - Sennheiser Momentums - Xonar DG - EVGA Supernova 650G - Corsair H80i

    build pics

  8. #88
    Quote Originally Posted by Ron Burgundy View Post
    the i5-3570k is better all around.
    Pretty much. Performance per core (IPC) is better atm than having more cores with lower per-core performance (FX series). However, the FX series is pretty good for the price (CPU + MB) and will run everything at very acceptable performance, especially in multithreaded workloads.
    Last edited by Warrax; 2013-01-26 at 11:37 PM.
    Warrax, Fury Warrior
    Silika, BM Hunter

  9. #89
    Brewmaster Majesticii's Avatar
    Quote Originally Posted by protput View Post
    They're testing the 8350's performance vs the intel chips while STREAMING. It's being said so many times in the video. The 8350 is better for streaming, but the i5's will pwn it when you're not streaming.
    Because of the threads. This just adds more bias, because it's unfair; it would be the same as comparing an i7 to a FX4###.
    Most games tested easily saturate 4 threads, leaving no room for the streaming.
    Besides, though everyone talks about it, who actually streams? This just ruins the benchmark for silly reasons.

    Quote Originally Posted by glo View Post
    AMD isn't the absolute trash that fanboys make them out to be, and their current lineup is fully capable of maxing anything on the market.
    Honestly, I haven't said anything about AMD or Intel in particular, other than to emphasize how wrong the video is.
    Last edited by Majesticii; 2013-01-26 at 11:46 PM.

  10. #90
    Quote Originally Posted by RicardoZ View Post
    I'd be willing to pay the extra money up front if I can avoid having to go through a hassle and spend more money later. I want to take my computer in the mail, take it out of the box, turn it on, and play games. No fuss no muss.
    Who gets a computer from the mall?

  11. #91
    The Unstoppable Force DeltrusDisc's Avatar
    Quote Originally Posted by Lilfrier View Post
    Who gets a computer from the mall?
    He said he wants to take it to the mall.

  12. #92
    Quote Originally Posted by Biernot View Post
    To the people complaining about the overclock of i5-3570k vs FX-8350:

    The reason why the i5-3570k was only overclocked to 4.5 GHz is simple: Ivy Bridge can't be overclocked as high as Sandy Bridge, because it hits a thermal wall around the 4.5-4.6 GHz mark. Pushing it past this point requires very high voltages and good cooling.
    The FX-8350, on the other hand, can be OC'd to 5 GHz very easily. A friend of mine has it running at 4.8 GHz with stock voltage and the stock cooler.
    The reason Ivy hits a thermal wall at a relatively low overclock (it runs hotter than its predecessor in general) is that Intel used a thick layer of cheap thermal paste between the CPU die and the heat spreader. Sandy used fluxless solder instead, which resulted in lower temperatures due to its much higher heat-transfer rate.

    One can, in fact, remove the heat spreader, clean off the crappy thermal paste, replace it with a high-quality paste, and get much, much lower temperatures. NOT RECOMMENDED unless you know what you are doing: this will void your warranty and can start a fire if you do it wrong. The curious can look up "Ivy Bridge delidding" for more information.

  13. #93
    Quote Originally Posted by Lilfrier View Post
    Who gets a computer from the mall?
    Quote Originally Posted by DeltrusDisc View Post
    He said he wants to take it to the mall.
    Really guys? Not mall but mail.
    Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
    Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450

  14. #94
    Quote Originally Posted by Majesticii View Post
    Honestly, i haven't said anything about AMD or Intel in particular, other than to emphasize how wrong the video is.
    What's wrong with the video? Arguing that Fraps is hindering the results isn't accurate whatsoever. Fraps only utilizes a single thread by default.

    The reason people can't accept that AMD is competitive is their own doing. Results like these shouldn't shock anyone who's taken a remotely unbiased look at the CPU market since the Bulldozer revisions took place (43xx/63xx/etc.).
    Last edited by glo; 2013-01-26 at 11:57 PM.

  15. #95
    Brewmaster Majesticii's Avatar
    Quote Originally Posted by glo View Post
    What's wrong with the video? Trying to argue that fraps is hindering results isn't accurate whatsoever. Fraps only utilizes a single thread by default.

    The reason people can't accept that AMD is competitive is their own doing. Results like these aren't shocking anyone that's remotely had an unbiased look at the CPU market since the Bulldozer revisions took place (43xx/63xx/etc).
    I didn't say Fraps was hindering performance. Fraps was used on both systems, and is therefore a constant. I think the video is wrong for a number of reasons, reasons I've already elaborated on enough in previous posts.
    And neither am I biased; I'm the exact opposite. However, AMD should be judged against Intel on the same merits. Comparing a 4 GHz 8-core to a 3.5 GHz quad-core is just beyond ridiculous, no matter what brand names are stamped on them.
    The fact that AMD is selling them bottom-dollar is not a reason to disregard these facts. And for a self-proclaimed 'unbiased' review, they sure have set the ground rules up to heavily favor AMD: here it either ties or completely annihilates the Intel, whereas in every other review they are roughly tied or the Intel edges ahead. Why? Well, I've explained that in a previous, elaborate test.
    Last edited by Majesticii; 2013-01-27 at 12:12 AM.

  16. #96
    Quote Originally Posted by glo View Post
    What's wrong with the video? Trying to argue that fraps is hindering results isn't accurate whatsoever. Fraps only utilizes a single thread by default.
    The fact that their Far Cry 3 results differ hugely from every other review's.

  17. #97
    Quote Originally Posted by n0cturnal View Post
    The fact that their Far Cry 3 results differ hugely from every other review's.
    No, it doesn't.

    http://www.techspot.com/review/615-f...nce/page6.html

    See how much better AMD scales with overclocking? The link I just posted only took it to 4.5 GHz; the YouTube video had it at 5 GHz, which would logically make it walk all over the 3570k.

    I don't know; I'm sure people will claim that there's now an entire ring of sites in on the conspiracy.

  18. #98
    Quote Originally Posted by glo View Post
    No, it doesn't.

    http://www.techspot.com/review/615-f...nce/page6.html

    See how much better AMD scales with overclocking? The link I just posted only took it to 4.5 GHz; the YouTube video had it at 5 GHz, which would logically make it walk all over the 3570k.

    I don't know; I'm sure people will claim that there's now an entire ring of sites in on the conspiracy.
    That link shows the i5-3470 beating the 8350 at stock speeds. Do you really think the 8350 will double the performance of an overclocked 3570k when the 8350 is also overclocked? I don't think so.

    It could possibly beat the 3570k, but only by a few FPS, not by 30+ FPS like in the video.

  19. #99
    Brewmaster Majesticii's Avatar
    Quote Originally Posted by glo View Post
    No, it doesn't.
    Yes it does; you're just reading those benchmarks incorrectly. The AMD scales better because the Intel saturates the game at a lower frequency. If you look at relative performance, the i5-3470 at 3.2 GHz is faster than the FX-8350 at 4.0 GHz.
    Now compare those results to the 25 fps (i5-3570K) vs 57 fps (FX-8350) they got in the TekSyndicate video, and you understand why we don't believe it.
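
    As a quick sanity check on those video numbers (25 fps and 57 fps, at the 4.5 GHz and 5.0 GHz overclocks discussed in this thread; both figures are as quoted here, not independently measured), normalising by clock shows the per-clock gap they would imply:

    ```python
    # TekSyndicate Far Cry 3 figures as quoted in this thread: (fps, GHz).
    results = {"i5-3570K": (25, 4.5), "FX-8350": (57, 5.0)}

    per_ghz = {chip: fps / ghz for chip, (fps, ghz) in results.items()}
    for chip, v in per_ghz.items():
        print(f"{chip}: {v:.1f} fps per GHz")

    # The FX would need roughly double the i5's per-clock throughput for
    # these numbers to hold, the opposite of every stock-clock comparison.
    ratio = per_ghz["FX-8350"] / per_ghz["i5-3570K"]
    print(f"implied per-clock advantage for the FX: {ratio:.2f}x")
    ```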
    Last edited by Majesticii; 2013-01-27 at 12:21 AM.

  20. #100
    Quote Originally Posted by glo View Post
    http://www.techspot.com/review/615-f...nce/page6.html

    See how much better AMD scales with overclocking? The link i just showed only put it to 4.5Ghz, the youtube video had it at 5Ghz which would logically make it walk all over the 3570k.
    Fallacy of extrapolation.

    You can't extrapolate unless you know the governing equation, and even then extrapolation is a shaky idea.
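
    The extrapolation point can be sketched with entirely synthetic numbers: fit a line to FPS measured at low clocks, where the game is CPU-bound, then extrapolate past the GPU limit. The model and all values below are made up purely to illustrate the fallacy.

    ```python
    # Hypothetical "true" behaviour: FPS rises linearly with clock while
    # CPU-bound, then saturates at a 60 fps GPU limit.
    def true_fps(ghz):
        return min(60.0, 15.0 * ghz)

    xs = [3.0, 3.5]                 # clocks we "measured" at (synthetic)
    ys = [true_fps(x) for x in xs]

    # Linear extrapolation from the two measured points out to 5 GHz.
    slope = (ys[1] - ys[0]) / (xs[1] - xs[0])
    predicted = ys[1] + slope * (5.0 - xs[1])
    actual = true_fps(5.0)

    # The straight line sails past the GPU cap the real curve flattens at.
    print(f"extrapolated: {predicted:.1f} fps, actual: {actual:.1f} fps")
    ```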
