  1. #21
    Temp name
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by azza125 View Post
    Thank you, I appreciate it!

    Thank you as well, Alc, for your information.
    Just to add to what I said before: I was in a bit of a hurry and didn't properly proofread.

    If you don't OC super hard, just get a board from a decent brand; if you do plan to OC super hard, do some research into which particular board handles it well. It mostly comes down to power delivery to the CPU, but with Z390 being fairly new, it's anyone's game there.

  2. #22
    Deleted
    Quote Originally Posted by Temp name View Post
    Just to add to what I said before: I was in a bit of a hurry and didn't properly proofread.

    If you don't OC super hard, just get a board from a decent brand; if you do plan to OC super hard, do some research into which particular board handles it well. It mostly comes down to power delivery to the CPU, but with Z390 being fairly new, it's anyone's game there.
    Sounds good bud, thank you!

    Won't be overclocking super hard as I don't have the know-how.

  3. #23
    Quote Originally Posted by stevenho View Post
    We finally start seeing games using all cores and bottleneck the GPU hard on 4-core CPUs.
    Go ahead and list those up.

    Don't worry, I'll wait.

  4. #24
    The Lightbringer Shakadam
    Join Date
    Oct 2009
    Location
    Finland
    Posts
    3,300
    Quote Originally Posted by Kagthul View Post
    Go ahead and list those up.

    Don't worry, I'll wait.
    4c/8t is still decent, but maybe not very future-proof; minimum FPS can suffer in certain scenarios, and especially if you do anything else on your computer while gaming (something in the background, or watching a video on another monitor, etc.) you're definitely going to notice the performance loss.

    4c/4t is not something that should be considered anymore. Average FPS is noticeably lower, and there's a lot more stuttering and dips to very low FPS.

    (embedded benchmark charts)
    https://www.guru3d.com/articles-page...chmarks,4.html
    Last edited by Shakadam; 2018-11-21 at 09:54 PM.

  5. #25
    Quote Originally Posted by Kagthul View Post
    Go ahead and list those up.

    Don't worry, I'll wait.
    The 1st I played was AC: Origins. Although it dropped below 60 in places like Alexandria and other cities, I endured. Then came AC: Odyssey and it was even worse. I never thought a game would force me to buy an i7, but it happened. It was glorious after that.

    The OP would have a very bad experience trying to stream one of those games on 4 cores.

  6. #26
    Quote Originally Posted by Shakadam View Post
    4c/8t is still decent, but maybe not very future-proof; minimum FPS can suffer in certain scenarios, and especially if you do anything else on your computer while gaming (something in the background, or watching a video on another monitor, etc.) you're definitely going to notice the performance loss.

    4c/4t is not something that should be considered anymore. Average FPS is noticeably lower, and there's a lot more stuttering and dips to very low FPS.

    (embedded benchmark charts)
    https://www.guru3d.com/articles-page...chmarks,4.html

    (ASCII trollface meme)


    Those charts are all but meaningless. They don't list clock speeds, and in some of them not even which CPUs are being used. The one with actual CPUs listed doesn't list clock speeds.

    And... I know this is going to be a shocker for you guys, since you can't ever seem to separate anecdotal belief from the argument, but most people (something like 90%+ of the market) are fine with 1080p/60.

    Which everything on that list provides all the way to 4K.

    So if you're trying to make some kind of "man, 4c/4t just can't cut it anymore" argument with these charts, you're blowing your own argument into tiny little pieces. Yeah, I mean, how DARE someone have to settle for 80 FPS at ultra settings. Just fuckin' unlivable, that is.

    So thanks for making my point for me.

    - - - Updated - - -

    Quote Originally Posted by stevenho View Post
    The 1st I played was AC: Origins. Although it dropped below 60 in places like Alexandria and other cities, I endured. Then came AC: Odyssey and it was even worse. I never thought a game would force me to buy an i7, but it happened. It was glorious after that.
    Funny, my wife's got a 6600K @ 4.4GHz and a GTX 1080... and had zero issues with EITHER of those games at 1080p/60 using Fast Sync. It's almost like anecdotal experience is useless.

    The OP would have a very bad experience trying to stream one of those games on 4 cores.
    ... or Quicksync.

    For fuck's sake, you don't have to stream using CPU encoding, particularly if you're just a hobbyist at it.
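    [Editor's note: for context, the difference is just which encoder the streaming tool uses. A minimal, video-only sketch of both approaches with ffmpeg — the RTMP URL and stream key are hypothetical placeholders, and this assumes a Windows ffmpeg build with Quick Sync support:]

```shell
# Software (CPU) encode with x264 -- this is what hammers a 4-core while gaming:
#   ffmpeg -f gdigrab -framerate 60 -i desktop -c:v libx264 -preset veryfast \
#          -b:v 4500k -f flv rtmp://live.example.com/app/STREAM_KEY

# Intel Quick Sync hardware encode -- offloads to the iGPU at near-zero CPU cost:
ffmpeg -f gdigrab -framerate 60 -i desktop -c:v h264_qsv -preset fast \
       -b:v 4500k -f flv rtmp://live.example.com/app/STREAM_KEY
```

    [On a GeForce card without an iGPU, NVENC (`-c:v h264_nvenc`) fills the same role.]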

  7. #27
    Quote Originally Posted by Kagthul View Post
    Those charts are all but meaningless. They don't list clock speeds, and in some of them not even which CPUs are being used. The one with actual CPUs listed doesn't list clock speeds.

    And... I know this is going to be a shocker for you guys, since you can't ever seem to separate anecdotal belief from the argument, but most people (something like 90%+ of the market) are fine with 1080p/60.

    Which everything on that list provides all the way to 4K.

    So if you're trying to make some kind of "man, 4c/4t just can't cut it anymore" argument with these charts, you're blowing your own argument into tiny little pieces. Yeah, I mean, how DARE someone have to settle for 80 FPS at ultra settings. Just fuckin' unlivable, that is.

    So thanks for making my point for me.

    Funny, my wife's got a 6600K @ 4.4GHz and a GTX 1080... and had zero issues with EITHER of those games at 1080p/60 using Fast Sync. It's almost like anecdotal experience is useless.
    The charts aren't meaningless; you're just pissed off that technology is progressing. Also, FYI, the R5 2600X is beating the 6600K despite being clocked similarly.

    Who gives a fuck? Most people on consoles are fine with 900p/30 FPS; you gonna start campaigning for that too?

    You're missing the point: 80 FPS at ultra is terrible when your video card is capable of 100+. And that bottleneck is only going to get worse, as games are only getting more demanding, not less so.

    The fact is that quad cores are ancient technology, and games are becoming too demanding for them even at high clock speeds. When even Intel, one of the greediest corporations on the planet, is moving past quad cores, it should tell you something.
    Last edited by Courierrawr; 2018-11-22 at 12:51 AM.

  8. #28
    Quote Originally Posted by Kagthul View Post
    (ASCII trollface meme)


    Those charts are all but meaningless. They don't list clock speeds, and in some of them not even which CPUs are being used. The one with actual CPUs listed doesn't list clock speeds.

    And... I know this is going to be a shocker for you guys, since you can't ever seem to separate anecdotal belief from the argument, but most people (something like 90%+ of the market) are fine with 1080p/60.

    Which everything on that list provides all the way to 4K.

    So if you're trying to make some kind of "man, 4c/4t just can't cut it anymore" argument with these charts, you're blowing your own argument into tiny little pieces. Yeah, I mean, how DARE someone have to settle for 80 FPS at ultra settings. Just fuckin' unlivable, that is.

    So thanks for making my point for me.

    Funny, my wife's got a 6600K @ 4.4GHz and a GTX 1080... and had zero issues with EITHER of those games at 1080p/60 using Fast Sync. It's almost like anecdotal experience is useless.

    For fuck's sake, you don't have to stream using CPU encoding, particularly if you're just a hobbyist at it.
    I gotta say, you're a pretty funny guy, advocating in a pretty funny manner for preserving a pretty funny 10-year-old 4-core technology.
    Your fuming is meaningless, though; there are plenty of videos clearly showing 4-core processors bottlenecking GPUs in the last two AC games, and there is not much to discuss here; you can LITERALLY WATCH IT.

    You're hilarious.

    Last edited by stevenho; 2018-11-22 at 02:52 AM.

  9. #29
    I'm enjoying some popcorn and just thinking about how the Ryzen 7 1700X was available today for $129. Talk about a performance bargain.

  10. #30
    The Unstoppable Force Puupi
    Join Date
    Jan 2009
    Location
    Finland
    Posts
    23,402
    How do extra threads/cores affect performance if you are running multiple games at the same time? E.g. Black Desert or EVE clients in the background while playing something else.
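    [Editor's note: a tangent, but you can see the contention from code. Every client's threads share the same pool of logical CPUs, and on Linux you can even pin a background client to a subset so it stops competing with the foreground game. A rough stdlib-only sketch; the affinity calls are Linux-specific:]

```python
import os

# Logical CPUs the scheduler can use (cores x threads with SMT enabled).
# Two game clients on a 4c/4t chip fight over these same four slots.
print(f"{os.cpu_count()} logical CPUs")

# On Linux, pin the current process (pid 0 = self) to a CPU subset,
# leaving the remaining CPUs uncontested for the foreground game.
if hasattr(os, "sched_setaffinity"):
    os.sched_setaffinity(0, {0})            # confine to logical CPU 0
    print(sorted(os.sched_getaffinity(0)))  # prints [0]
```

    [In practice, tools like Process Lasso or Task Manager's affinity dialog do the same thing on Windows.]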

  11. #31
    Quote Originally Posted by azza125 View Post
    Yeah, I upgraded my GPU from a GTX 770 to a GTX 1070 last year, but I'm still running an i5 4690K and 8GB of RAM from at least 4 years ago, so I'd like to start looking at potential upgrades etc.

    I see the difference between an i7 8700K and a 9700K is about £25 for me... would I be better off going for the 9700K purely based on future-proofing, as it has more cores?

    What RAM and mobo would I choose with these?
    I disagree with @Alcsaar, because in real scenarios the difference is next to negligible (you're not going to get 15% lower FPS in games, nor stream at lower quality; hell, it could even be better, since parallel work is Ryzen's strong point), but he's correct on his data. If you want a motherboard recommendation, choose a good B450 one (or X470), and while RAM speed makes a difference, going above 3200 is imho a waste.

    Point is, Ryzen systems are a hell of a bang for the buck right now. You can look around for benchmarks, and basically everyone agrees that gaming-wise Intel and Ryzen trade blows, as some games favor parallel workloads while others prefer single-core performance. They're practically equivalent in a realistic scenario.

    Anyway, if you plan to go with the blue team, go with the 9700K. Rocket speed and 8 real cores (no HT bullshit); way better than the 8700K imho. The soldered IHS should provide better thermals for OC, but I've actually seen mixed opinions on this. Again, for the mobo go with Z370/Z390 and some nice 3200MHz RAM.

    Last thing, since I'm reading a lot about it: "future-proofing" is sort of a bet. I mean, it's worth spending more on a GPU/CPU to get better performance, and it's definitely true that better parts will last you longer. However, it all depends on the technology jumps. Example: I had a GTX 470 that served me perfectly for 9 years of gaming at 1080p. I swapped it for a 960 and was fine until the 10 series came out, monitors started to increase base resolution, and games made a hefty jump in quality, to the point that I was trying 1440p and, while doable, I was already at the limit. So I snatched a used 1080 Ti; I know this will last me far longer, but I replaced my 960 just because the 10 series was so much better, and right now, given that the 20 series is lackluster, I think I made a good deal.

    CPU-wise, you should be fine with any option you pick now; it's mostly a matter of budget.

  12. #32
    Quote Originally Posted by Coldkil View Post
    Point is, Ryzen systems are a hell of a bang for the buck right now. You can look around for benchmarks, and basically everyone agrees that gaming-wise Intel and Ryzen trade blows, as some games favor parallel workloads while others prefer single-core performance. They're practically equivalent in a realistic scenario.
    The tests I watched about a month ago when considering an upgrade path showed Ryzen lagging behind Intel by about 10% in pure gaming, sometimes more, sometimes less. Those were CPUs with a comparable number of cores/threads. But they were also MUCH cheaper than the Intels.
    If I were budget-conscious I would choose AMD right now, especially because gaming is not the only thing I do on the PC.

    As for future-proofing: in my experience it is both cheaper and better to replace gear more often with stuff that's not top-shelf than to wait a long time and pay a premium for the top offerings. This is especially true for graphics cards.

  13. #33
    Quote Originally Posted by Kagthul View Post
    Go ahead and list those up.

    Don't worry, I'll wait.
    4-5 games a year have good multi-core support, and people scream every year that all new games support it; every year, for 10 years now...

  14. #34
    Quote Originally Posted by stevenho View Post
    I gotta say, you're a pretty funny guy, advocating in a pretty funny manner for preserving a pretty funny 10-year-old 4-core technology.
    Your fuming is meaningless, though; there are plenty of videos clearly showing 4-core processors bottlenecking GPUs in the last two AC games, and there is not much to discuss here; you can LITERALLY WATCH IT.

    You're hilarious.

    You need to learn to read. I'm not advocating for anything. I'm pointing out that people who are like "OH EM GEEZ, FOUR COARS NOT ENUFFZ" are completely full of shit, to the point it is streaming out of their ears.

    Single-digit percentages of the market are playing at 1440p or higher. Like sub-5%.

    1440p results are meaningless. Why the fuck would you build a budget machine and expect to get enthusiast levels of performance?

    Are you that much of a fucking idiot?

    So, wow, news at 11: a budget CPU doesn't provide amazeballs performance at an enthusiast level of settings.

    Fucking shocker.

    Meanwhile, a quad-core CPU will be plenty to game on for the vast majority of people who are gaming, for the foreseeable future. You know, the people who actually make money for these companies. Which are not enthusiasts.

    - - - Updated - - -

    Quote Originally Posted by Miyagie View Post
    4-5 games a year have good multi-core support, and people scream every year that all new games support it; every year, for 10 years now...

    Pretty much exactly my point. And NONE of the games that have good multi-core support NEED a lot of cores to perform well. You can still get well over 60 FPS in BF5 at 1080p with an i3.

  15. #35
    Quote Originally Posted by Kagthul View Post
    --fuming continues, sense not yet made--
    Why are you even speaking? You just got rekt by a video, lol.
    Advocating 4-core technology to a poster who wants a future-proof setup and ALSO streams is next-level troll patrol.
    Are you confusing this forum with 4chan and posting 4-core meme material?

  16. #36
    Quote Originally Posted by stevenho View Post
    Why are you even speaking? You just got rekt by a video, lol.
    Advocating 4-core technology to a poster who wants a future-proof setup and ALSO streams is next-level troll patrol.
    Are you confusing this forum with 4chan and posting 4-core meme material?
    The OP from the $400 Build Thread never said anything about a future-proof PC; he wants a PC that can be upgraded in the future.
    A 2400G is perfect for that.

  17. #37
    Quote Originally Posted by Miyagie View Post
    The OP from the $400 Build Thread never said anything about a future-proof PC; he wants a PC that can be upgraded in the future.
    A 2400G is perfect for that.
    I wonder what led you to post a comment about a guy building a $400 PC in a thread made by another guy who wants a future-proof PC with streaming capability.

  18. #38
    Quote Originally Posted by stevenho View Post
    I wonder what led you to post a comment about a guy building a $400 PC in a thread made by another guy who wants a future-proof PC with streaming capability.
    Since Kagthul never advocated a quad-core CPU to the OP in this thread, you must be talking about the $400 PC Build Thread, where Kagthul advocated a quad-core CPU.
    Kagthul just pointed out that you don't need 64 cores for gaming and that a quad core is still fine.

  19. #39
    Quote Originally Posted by Miyagie View Post
    Since Kagthul never advocated a quad-core CPU to the OP in this thread, you must be talking about the $400 PC Build Thread, where Kagthul advocated a quad-core CPU.
    Not sure if you're serious now, but the guy made a fuming post about how 4/4 is absolutely perfect for 90+ percent of everyone, and how his actual wife plays AC: OD absolutely fine on it.
    And well... in the other thread he wrote that her game "rarely drops into the 40s"...
    So yeah, I don't know what you're trying to say exactly, but the OP of this thread wants a future-proof gaming PC with streaming capability, and I wrote that 4 cores will not cut it.
    Do you disagree or something?

  20. #40
    Quote Originally Posted by Alcsaar View Post
    AMDs have a lower price specifically because they're for a lower budget, and you get what you pay for. If you want longevity, an Intel is probably better just because it will be slightly more future-proofed. The question is, will you really notice the difference? Hard to say.

    I build my PCs with the plan that I don't upgrade the parts for at least 2 years, so I always go Intel. I only just replaced my GTX 970 last week with an RTX 2070. That was a big upgrade, much more impactful than spending $500 a couple of years ago for a GTX 1070 or 1080 that was only slightly better.
    An Intel processor is hardly future-proofing. All of the "Intel is better for gaming" talk is based on games that are single-core. Games designed to be multi-core are starting to appear, which invalidates Intel's biggest selling point for games.

    The real advice: don't bother trying to future-proof. You will only get burned. Buy an Intel today for today's games and upgrade in three years. You never know when the next breakthrough will happen: 64-bit processors, dual core, the Intel Core series, Ryzen. These are all things that sort of just turned up and disrupted the industry, and that's just processors.
