  1. #41
    The Lightbringer inux94's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Nuuk, Greenland
    Posts
    3,352
    Quote Originally Posted by n0cturnal View Post
    Isn't that highly irrelevant since you have a different CPU and GPU?
    The FX-6300 is supposedly at the same level of performance in games as the 8350.

    I use the same GPU as them.

    Quote Originally Posted by inux94 View Post
    I've got an FX-6300, which is supposedly just as powerful for games (with a GTX 670).

    I'll test it once I get my case and start overclocking.
    i7-6700k 4.2GHz | Gigabyte GTX 980 | 16GB Kingston HyperX | Intel 750 Series SSD 400GB | Corsair H100i | Noctua IndustrialPPC
    ASUS PB298Q 4K | 2x QNIX QH2710 | CM Storm Rapid w/ Reds | Zowie AM | Schiit Stack w/ Sennheiser HD8/Antlion Modmic

    Armory

  2. #42
    Quote Originally Posted by inux94 View Post
    The FX-6300 is supposedly at the same level of performance in games as the 8350.

    I use the same GPU as them.
    Oh, I read your signature, but still: are you running the same clock on your GTX 670? And isn't the 8350 higher clocked?
    Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
    Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450

  3. #43
    The Lightbringer inux94's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Nuuk, Greenland
    Posts
    3,352
    Quote Originally Posted by n0cturnal View Post
    Oh, I read your signature, but still: are you running the same clock on your GTX 670? And isn't the 8350 higher clocked?
    Yeah, my signature is outdated :P

    I'll check, one mo.

    FX-6300 base clock: 3.5 GHz
    FX-8350 base clock: 4.0 GHz

    I'm bumping my CPU to 5 GHz and will redo the benchmarks sometime next week, when my case arrives and I move the stuff around.

    The GTX 670 is at the same clock speed.
    i7-6700k 4.2GHz | Gigabyte GTX 980 | 16GB Kingston HyperX | Intel 750 Series SSD 400GB | Corsair H100i | Noctua IndustrialPPC
    ASUS PB298Q 4K | 2x QNIX QH2710 | CM Storm Rapid w/ Reds | Zowie AM | Schiit Stack w/ Sennheiser HD8/Antlion Modmic

    Armory

  4. #44
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    If we juggle the variables around this easily, I see no reason for this thread to continue, because it loses all relevance/coherency. Either cooperate (1080p, benchmarkable games) or /endthread.

    It's not the same. The 6300 is a 3.5 GHz six-core and the 8350 is a 4.0 GHz eight-core. How is that even remotely the same?

  5. #45
    The Lightbringer inux94's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Nuuk, Greenland
    Posts
    3,352
    I already told you I would benchmark the games they played once I get my overclocking done.

    Why are you so angry? There's no point in 1080p when there's obviously more juice to be squeezed out at 2560x1440.

    6300 and 8350 should perform the same, provided they're at the same clock speeds.
    i7-6700k 4.2GHz | Gigabyte GTX 980 | 16GB Kingston HyperX | Intel 750 Series SSD 400GB | Corsair H100i | Noctua IndustrialPPC
    ASUS PB298Q 4K | 2x QNIX QH2710 | CM Storm Rapid w/ Reds | Zowie AM | Schiit Stack w/ Sennheiser HD8/Antlion Modmic

    Armory

  6. #46
    Quote Originally Posted by Majesticii View Post
    If we juggle the variables around this easily, I see no reason for this thread to continue, because it loses all relevance/coherency. Either cooperate (1080p, benchmarkable games) or /endthread.

    It's not the same. The 6300 is a 3.5 GHz six-core and the 8350 is a 4.0 GHz eight-core. How is that even remotely the same?
    I see nothing wrong with Inux's testing method. Logan and the guys at Tek Syndicate tested at 1440p as well, so it is entirely possible to compare Inux's benchmarks of the 6300 with theirs of the 8350 and 3570K...

  7. #47
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by inux94 View Post
    I already told you I would benchmark the games they played once I get my overclocking done.

    Why are you so angry? There's no point in 1080p when there's obviously more juice to be squeezed out at 2560x1440.

    6300 and 8350 should perform the same, provided they're at the same clock speeds.
    I'm angry because of your total lack of scientific rigour. The 6300 and 8350 do not perform the same at the same clock speeds; that only holds in games where IPC is all that matters. If you tested a multithreaded game and assumed the processors were the same, you'd get inconsistent results.
    Playing at 1440p only moves the issue further away from the CPU, since you're just loading the GPU more heavily. That's why you run CPU tests at 800x600. Granted, those are synthetic benchmarks, and you should go for something more realistic like 1080p.
    But what's the relevance of your benchmarks when you're comparing them to the video with a different processor that isn't the same in any regard, at a resolution that can barely be compared with the rest of this forum because practically no one owns a 1440p screen?
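
    To make the bottleneck point concrete, here's a toy model in Python (made-up per-frame costs, not measurements): each frame takes roughly max(CPU time, GPU time), so once the GPU side dominates, two CPUs of different speed produce nearly the same framerate.

    # Toy model, not a measurement: per-frame costs below are invented.
    # Whichever of CPU/GPU is slower sets the frame time, so heavier GPU
    # load (higher resolution) hides CPU differences.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    cpus = {"faster CPU": 6.0, "slower CPU": 9.0}           # ms of CPU work per frame
    gpus = {"800x600": 4.0, "1080p": 10.0, "1440p": 16.0}   # ms of GPU work per frame

    for res, gpu_ms in gpus.items():
        print(res, {name: round(fps(cpu_ms, gpu_ms), 1) for name, cpu_ms in cpus.items()})
    # 800x600 -> 166.7 vs 111.1 fps; 1440p -> 62.5 fps for both CPUs.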

    Quote Originally Posted by noteworthynerd View Post
    I see nothing wrong with Inux's testing method. Logan and the guys at Tek Syndicate tested at 1440p as well, so it is entirely possible to compare Inux's benchmarks of the 6300 with theirs of the 8350 and 3570K...
    fml, science has gone AWOL these days.
    Last edited by Majesticii; 2013-01-26 at 03:33 PM.

  8. #48
    Quote Originally Posted by Majesticii View Post
    fml, science has gone AWOL these days. Forget it, I'm out.
    He's not exactly trying to be scientific, is he? The entire point of the benchmarks that Tek Syndicate used was to show how the CPUs perform in real-world situations (like playing a game).

    If he were trying to do "true benchmarks", he would set it to 800x600, but he doesn't play games at that resolution, so it would hardly be real-world, would it?

  9. #49
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by noteworthynerd View Post
    He's not exactly trying to be scientific, is he? The entire point of the benchmarks that Tek Syndicate used was to show how the CPUs perform in real-world situations (like playing a game).

    If he were trying to do "true benchmarks", he would set it to 800x600, but he doesn't play games at that resolution, so it would hardly be real-world, would it?
    True, but you still need to set some ground rules, otherwise it has no point. If you say "the 6300 is sort of the same as an 8350", nothing that comes after that is trustworthy. I'm sorry.

    Also, he's free to test it at 1440p; I never said he wasn't. But not supplying the 1080p results makes it rather pointless, since barely anyone can compare, except against the video, which I have reason to believe is biased, or at least poorly executed at some point. There is just no way an i5-3570K does 25 fps while the 8350 does 57 fps.
    Last edited by Majesticii; 2013-01-26 at 03:39 PM.

  10. #50
    The Lightbringer inux94's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Nuuk, Greenland
    Posts
    3,352
    Quote Originally Posted by Majesticii View Post
    I'm angry because of your total lack of scientific rigour. The 6300 and 8350 do not perform the same at the same clock speeds; that only holds in games where IPC is all that matters. If you tested a multithreaded game and assumed the processors were the same, you'd get inconsistent results.
    Playing at 1440p only moves the issue further away from the CPU, since you're just loading the GPU more heavily. That's why you run CPU tests at 800x600. Granted, those are synthetic benchmarks, and you should go for something more realistic like 1080p.
    But what's the relevance of your benchmarks when you're comparing them to the video with a different processor that isn't the same in any regard, at a resolution that can barely be compared with the rest of this forum because practically no one owns a 1440p screen?



    fml, science has gone AWOL these days.
    The games tested aren't that heavily multithreaded; only ARMA II is.
    i7-6700k 4.2GHz | Gigabyte GTX 980 | 16GB Kingston HyperX | Intel 750 Series SSD 400GB | Corsair H100i | Noctua IndustrialPPC
    ASUS PB298Q 4K | 2x QNIX QH2710 | CM Storm Rapid w/ Reds | Zowie AM | Schiit Stack w/ Sennheiser HD8/Antlion Modmic

    Armory

  11. #51
    If you want to see it from a scientific standpoint, you can look at it as him trying to compare his CPU to theirs, not trying to replicate/validate their results.
    If he is running the same GPU at the same speeds and the games at the same settings, you could argue there is some value to his results from a CPU-performance standpoint.
    Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
    Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450

  12. #52
    Quote Originally Posted by n0cturnal View Post
    If you want to see it from a scientific standpoint, you can look at it as him trying to compare his CPU to theirs, not trying to replicate/validate their results.
    Exactly. This is how I was looking at it (a comparison of the 6300 to the 8350 and the 3570K).

  13. #53
    Legendary!
    10+ Year Old Account
    Join Date
    Sep 2009
    Location
    Not in Europe Anymore Yay
    Posts
    6,931
    Quote Originally Posted by Majesticii View Post
    I don't understand how the Far Cry 3 numbers can vary that much compared to EVERY other benchmark review.
    I call BS on this video. Also, 25 fps with a GTX 670 at 1080p maxed in FC3... what? I have a 670 combined with an i5-760 (yes, old) and I get 45-50 fps maxed at 1080p...
    This is pretty much what I wanted to post after reading that. Same specs as you, same results, vastly different from the results they got.

  14. #54
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by noteworthynerd View Post
    Exactly. This is how I was looking at it (a comparison of the 6300 to the 8350 and the 3570K).
    I understand that too, but then why put the benchmarks into a format only comparable to an already questionable video? If the purpose of this thread is to find out whether the results are valid, you need real-world results outside of this video, i.e. from the majority of people with 1080p screens. I'll do some tests at 1080p in a minute.

    Quote Originally Posted by RoKPaNda View Post
    This is pretty much what I wanted to post after reading that. Same specs as you, same results, vastly different from the results they got.
    Judging from the numbers, though, they cranked the MSAA up to an absurd level. Maybe the AMD somehow gains there, I don't know. Still really strange.

  15. #55
    Quote Originally Posted by Majesticii View Post
    Still really strange.
    I agree.

    The fact that their "findings" go against every other benchmark I've seen makes me extremely skeptical.

    I watch Tek Syndicate for the Tek, but I think their testing methods are questionable at best (even as "real world" testing). Even while watching the video I was raising an eyebrow, especially when they went through the hardware they were using (an ITX board for benchmarking?)...
    Last edited by noteworthynerd; 2013-01-26 at 04:08 PM.

  16. #56
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Also, the lack of explanation makes me wary. Posting numbers at this stage is just not going to cut it. As you can judge from the reactions here, people know better. Their target audience, though, is newbies who want to build a gaming rig, and that much is apparent from the oblivious reactions on YouTube.

    Also, their perspective on "real-world" is warped. Who is satisfied with a 30 fps average (I can understand a minimum, but not an average), and who would crank the MSAA up to 8x/16x to play at 15 fps? There's no FXAA/MLAA/MSAA comparison or explanation of what they used, which might explain the difference. Their overclocking is iffy too; as posted before, 4.5 GHz is hardly a high OC, and then they compare it to an AMD chip with more cores at a higher clock frequency. I mean... what? The sad truth is probably that they don't know themselves, and the lack of explanation and/or justification for the results is down to their own ignorance on the matter.

    They simply crank up the settings, run the game on both platforms and dump the results. This is hardly the way to do it.

    ---------- Post added 2013-01-26 at 05:45 PM ----------

    Running Benchmarks now.
    Max Settings @ 1080p, 4xMSAA
    Core i5-760 at 2.8ghz stock and 4.0ghz.
    ASUS GTX670 (non TOP).

    I will build this post up as I go; can't be bothered dumping it all into a file and posting it once, since the OC requires me to restart.

    Crysis Benchmark Tool, gpu_test (guru3d has this tool, V1.05). CPU @ 2.8ghz.

    26-1-2013 15:07:14 - Vista 64
    Beginning Run #1 on Map-island, Demo-benchmark_gpu
    DX10 1920x1080, AA=4x, Vsync=Disabled, 32 bit test, FullScreen
    Demo Loops=3, Time Of Day= 5
    Global Game Quality: VeryHigh
    ==============================================================
    TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
    !TimeDemo Run 0 Finished.
    Play Time: 47.22s, Average FPS: 42.35
    Min FPS: 20.27 at frame 142, Max FPS: 60.54 at frame 1612
    !TimeDemo Run 1 Finished.
    Play Time: 41.45s, Average FPS: 48.25
    Min FPS: 20.27 at frame 142, Max FPS: 64.34 at frame 1651
    !TimeDemo Run 2 Finished.
    Play Time: 41.60s, Average FPS: 48.08
    Min FPS: 20.27 at frame 142, Max FPS: 67.81 at frame 110

    I guess on the first run there are some buffering issues from the hard drive/memory; hence you loop these runs.

    METRO 2033 BENCHMARK RESULTS @ 2.8ghz. (tool is in the root folder)
    26-1-2013 18:03:32
    frontline
    Options: Resolution: 1920 x 1080; DirectX: DirectX 11; Quality: Very High; Antialiasing: MSAA 4X; Texture filtering: AF 16X; Advanced PhysX: Enabled; Tesselation: Enabled; DOF: Disabled
    Run 0 (Frontline)
    Total Frames: 2510, Total Time: 59,55149 sec
    Average Framerate: 42.25
    Max. Framerate: 107.17 (Frame: 2221)
    Min. Framerate: 8.85 (Frame: 926)

    Run 1 (Frontline)
    Total Frames: 2577, Total Time: 59,85393 sec
    Average Framerate: 43.16
    Max. Framerate: 190.88 (Frame: 2536)
    Min. Framerate: 11.81 (Frame: 1235)

    Run 2 (Frontline)
    Total Frames: 2553, Total Time: 59,83719 sec
    Average Framerate: 42.77
    Max. Framerate: 157.16 (Frame: 2294)
    Min. Framerate: 11.96 (Frame: 1232)

    Far Cry 2 benchmark, also in root folder (Small ranch, no fixed-step 30fps run) cpu @ 2.8ghz.

    Run 1
    Settings: Demo(Ranch Small), 1920x1080 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(4x), VSync(No), Overall Quality(Ultra High), Vegetation(Very High), Shading(Ultra High), Terrain(Ultra High), Geometry(Ultra High), Post FX(High), Texture(Ultra High), Shadow(Ultra High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)
    Loop 1
    Total Frames: 6512, Total Time: 51,00s
    Average Framerate: 127,68
    Max. Framerate: 215,80 (Frame:961, 5,98s)
    Min. Framerate: 83,93 (Frame:3392, 25,17s)

    Loop 2
    Total Frames: 6645, Total Time: 51,00s
    Average Framerate: 130,30
    Max. Framerate: 198,06 (Frame:979, 6,21s)
    Min. Framerate: 88,16 (Frame:4501, 34,25s)

    Loop 3
    Total Frames: 6629, Total Time: 51,00s
    Average Framerate: 129,97
    Max. Framerate: 199,87 (Frame:1057, 6,60s)
    Min. Framerate: 89,31 (Frame:4452, 34,04s)

    Average Results
    Average Framerate: 129,31
    Max. Framerate: 201,18
    Min. Framerate: 89,24

    I think you already get the gist: if you want to do this properly, it turns into a real number-crunching exercise.
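
    If you'd rather crunch these logs than eyeball them, a quick Python sketch like this works. It's my own, not part of the benchmark tool, it assumes the exact "Average FPS"/"Min FPS" line format pasted above, and the filename is hypothetical.

    # Quick sketch: pull Average/Min FPS out of a Crysis timedemo log.
    # Assumes the exact line format pasted above; not part of any benchmark tool.
    import re

    with open("crysis_timedemo.log") as f:   # hypothetical filename
        log = f.read()

    avgs = [float(x) for x in re.findall(r"Average FPS: ([\d.]+)", log)]
    mins = [float(x) for x in re.findall(r"Min FPS: ([\d.]+)", log)]

    print("runs:", len(avgs))
    print("avg FPS (skipping warm-up run 0):", round(sum(avgs[1:]) / len(avgs[1:]), 2))
    print("worst min FPS:", min(mins))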
    ---------------------------------------------------------------------------------------------

    Crysis Benchmark Tool, gpu_test (guru3d has this tool, V1.05). CPU @ 4.0ghz.

    26-1-2013 18:36:13 - Vista 64
    Beginning Run #1 on Map-island, Demo-benchmark_gpu
    DX10 1920x1080, AA=4x, Vsync=Disabled, 32 bit test, FullScreen
    Demo Loops=3, Time Of Day= 5
    Global Game Quality: VeryHigh
    ==============================================================
    TimeDemo Play Started , (Total Frames: 2000, Recorded Time: 111.86s)
    !TimeDemo Run 0 Finished.
    Play Time: 36.82s, Average FPS: 54.32
    Min FPS: 34.76 at frame 172, Max FPS: 73.62 at frame 959
    !TimeDemo Run 1 Finished.
    Play Time: 32.12s, Average FPS: 62.28
    Min FPS: 34.76 at frame 172, Max FPS: 75.79 at frame 116
    !TimeDemo Run 2 Finished.
    Play Time: 32.01s, Average FPS: 62.47
    Min FPS: 34.76 at frame 172, Max FPS: 75.79 at frame 116
    TimeDemo Play Ended, (3 Runs Performed)
    ==============================================================

    The minimum framerate went up from an unplayable 20 to 34, and the average also increased by about 14 fps. The only variable that changed is the CPU overclock.

    METRO 2033 BENCHMARK RESULTS @ 4.0ghz. (tool is in the root folder)
    26-1-2013 18:46:54
    frontline
    Options: Resolution: 1920 x 1080; DirectX: DirectX 11; Quality: Very High; Antialiasing: MSAA 4X; Texture filtering: AF 16X; Advanced PhysX: Enabled; Tesselation: Enabled; DOF: Disabled
    Run 0 (Frontline)
    Total Frames: 2626, Total Time: 59,81219 sec
    Average Framerate: 44.00
    Max. Framerate: 122.62 (Frame: 2511)
    Min. Framerate: 8.79 (Frame: 731)

    Run 1 (Frontline)
    Total Frames: 2654, Total Time: 59,83586 sec
    Average Framerate: 44.45
    Max. Framerate: 136.57 (Frame: 2234)
    Min. Framerate: 12.14 (Frame: 1291)

    Run 2 (Frontline)
    Total Frames: 2640, Total Time: 59,88383 sec
    Average Framerate: 44.19
    Max. Framerate: 109.66 (Frame: 2021)
    Min. Framerate: 11.89 (Frame: 1285)

    Which illustrates that this is not a good game to test CPU performance with. A 2 fps gain I would call 'jitter', not a performance gain.

    Far Cry 2 benchmark, also in root folder (Small ranch, no fixed-step 30fps run) cpu @ 4.0ghz.
    Run 1
    Settings: Demo(Ranch Small), 1920x1080 (60Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(4x), VSync(No), Overall Quality(Ultra High), Vegetation(Very High), Shading(Ultra High), Terrain(Ultra High), Geometry(Ultra High), Post FX(High), Texture(Ultra High), Shadow(Ultra High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)
    Loop 1
    Total Frames: 7713, Total Time: 51,00s
    Average Framerate: 151,23
    Max. Framerate: 230,35 (Frame:1140, 6,26s)
    Min. Framerate: 94,32 (Frame:0, 0,01s)

    Loop 2
    Total Frames: 7833, Total Time: 51,00s
    Average Framerate: 153,58
    Max. Framerate: 231,39 (Frame:1154, 6,11s)
    Min. Framerate: 110,14 (Frame:3364, 20,64s)

    Loop 3
    Total Frames: 7694, Total Time: 51,00s
    Average Framerate: 150,86
    Max. Framerate: 232,81 (Frame:879, 6,08s)
    Min. Framerate: 0,62 (Frame:110, 2,31s)

    Average Results
    Average Framerate: 151,89
    Max. Framerate: 229,34
    Min. Framerate: 113,62

    avg/min/max all went up by roughly 25-30 fps. It's an old game, not really pushing the GPU to 99% load, so it's almost entirely reliant on CPU IPC: a very noticeable CPU gain. So this differs wildly per game and per achieved framerate. Cranking up the settings, and thus dragging the framerate down to a crawl, is NO viable way of testing CPU performance (see the Metro 2033 results). Yes, in this particular game it doesn't matter, but in other games it's worth 30 fps. You can't base your theory on a few games where you run into massive GPU bottlenecks and then claim the performance is the same and/or better.
    And I still call BS on their Far Cry 3 results.
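
    Quick sanity check on that, done in Python with the averages I posted above (nothing new measured here): a 2.8 -> 4.0 GHz overclock is roughly a 43% clock bump, and the three games respond to it very differently.

    # Rough arithmetic on the averages reported above (2.8 GHz vs 4.0 GHz runs).
    clock_gain = 4.0 / 2.8 - 1   # ~43% higher clock

    games = {
        "Crysis (runs 1-2)":   (48.2, 62.4),
        "Metro 2033 (3 runs)": (42.7, 44.2),
        "Far Cry 2 (overall)": (129.3, 151.9),
    }

    print("clock increase: {:.0%}".format(clock_gain))
    for game, (fps_28, fps_40) in games.items():
        print("{}: {} -> {} fps (+{:.0%})".format(game, fps_28, fps_40, fps_40 / fps_28 - 1))
    # Crysis ~+29%, Far Cry 2 ~+17%, Metro 2033 ~+4%: the GPU-bound run barely moves.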

    Even this fairly elaborate test still has some inconsistencies; for example, you can't express smoothness in fps. Perhaps I could incorporate frametimes for that, but I can't be bothered. Go figure how much trouble I have with that video :P Look at this wall of text, and all I've achieved is showing how much IPC matters in these three titles, never mind how much they benefit from extra cores etc.
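
    On the frametime point: if the tools exported per-frame times, smoothness could be summarised with a percentile instead of an average. A rough Python sketch with made-up frametimes (the tools above only report avg/min/max fps, so this is illustrative only):

    # Illustrative only: the frametime list below is invented, since the tools
    # quoted above don't export per-frame times.
    import statistics

    frametimes_ms = [16.7] * 90 + [33.4] * 8 + [80.0] * 2   # mostly smooth, a few spikes
    frametimes_ms.sort()

    avg_fps = 1000.0 / statistics.mean(frametimes_ms)
    p99_ms = frametimes_ms[int(0.99 * len(frametimes_ms)) - 1]   # ~99th percentile frametime

    print("average: {:.1f} fps".format(avg_fps))
    print("99th percentile frametime: {:.1f} ms (~{:.0f} fps)".format(p99_ms, 1000.0 / p99_ms))
    # A decent average (~52 fps here) can hide the 80 ms spikes; the percentile doesn't.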
    Last edited by Majesticii; 2013-01-26 at 06:25 PM.

  17. #57
    Their results are highly suspicious, especially since they took them in-game. Why? Well, first they would have to record the results at the same time under the same workload conditions, and it's also easy to manipulate in-game results. Don't believe me? Pick a game, set it to max, then stand facing a corner. Your FPS will go through the roof.

  18. #58
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by gmike View Post
    Their results are highly suspicious, especially since they took them in-game. Why? Well, first they would have to record the results at the same time under the same workload conditions, and it's also easy to manipulate in-game results. Don't believe me? Pick a game, set it to max, then stand facing a corner. Your FPS will go through the roof.
    To be fair, they mentioned doing the same thing for all runs, i.e. playing through the same scene again and again. That doesn't make the results any more reliable, though.

  19. #59
    Immortal Schattenlied's Avatar
    10+ Year Old Account
    Join Date
    Aug 2010
    Location
    Washington State
    Posts
    7,475
    I want to know why they are comparing a quad-core to an eight-core; eight-core processors can't be properly utilized by the vast majority of games out there. Using the AMD FX-4170 would have been a much better comparison.
    A gun is like a parachute. If you need one, and don’t have one, you’ll probably never need one again.

  20. #60
    Brewmaster Majesticii's Avatar
    10+ Year Old Account
    Join Date
    Feb 2010
    Location
    Netherlands
    Posts
    1,414
    Quote Originally Posted by Schattenlied View Post
    I want to know why they are comparing a quad-core to an eight-core; eight-core processors can't be properly utilized by the vast majority of games out there. Using the AMD FX-4170 would have been a much better comparison.
    If it matters not, then why should it influence the results?
