  1. #21
    The ROG Maximus offers plenty of M.2 storage -- five slots -- which of course is welcomed by video editors, content creators, and gamers with ample game libraries. It opens the door for future expansion if needed.

    In particular, the ROG Maximus also offers one M.2 PCIe 5.0 slot.

    The only Strix that has M.2 5.0 is the ROG Strix Z790-E. The other three models only have the 4.0 version. If I were to go with a Strix, I'd go with the Z790-E. But I do like the Maximus I/O and storage flexibility, even if it's overkill.

  2. #22
    Quote Originally Posted by Agall View Post
    I actually agree with Kagthul on this. The low-to-mid range boards are generally the best value for most people. I'm likely to spend $1k on my next motherboard, but that's because there are specific features on some boards that I've grown accustomed to, though whether those features still exist is a decision for when my platform needs to be upgraded.
    And, at least in the case of their Z690 lineup, the best Strix board actually beat most of the Maximus boards in areas that may actually matter to an average consumer, like connectivity. Particularly SSD connectivity, which is where the Maximus boards (the exception being the Maximus Apex, if I recall correctly) cut some corners to achieve their super duper cooling and power stages that either don't matter to most people or don't mean anything to them. All while being much, much cheaper (and the last time I checked, the price gap only got worse over time, further in favor of the Strix board). Some of the Maximus boards also made some weird choices in the rear I/O. The biggest connectivity benefit the Z690 Maximus boards had over the Strix was a second internal USB 3 header for cases with four USB 3 ports in the front I/O, but that doesn't really justify the significant price increase.
    Quote Originally Posted by Kangodo View Post
    Does the CIA pay you for your bullshit or are you just bootlicking in your free time?
    Quote Originally Posted by Mirishka View Post
    I'm quite tired of people who dislike or disagree with something while attacking/insulting anyone who disagrees with them. It's as if at some point, people forgot how opinions work.

  3. #23
    Immortal Ealyssa's Avatar
    10+ Year Old Account
    Join Date
    Jun 2009
    Location
    Switzerland, Geneva
    Posts
    7,002
    Is it the 90s? Are we hyped for motherboards again?

  4. #24
    I am Murloc! Cyanotical's Avatar
    10+ Year Old Account
    Join Date
    Feb 2011
    Location
    Colorado
    Posts
    5,553
    2 x PCIe 5.0 x16 (@x16 or x8/x8)
    1 x PCIe 4.0 x4

    Not enough PCIe lanes is the main drawback of Intel atm; that really needs to be x16/x16/x8 in 2022.
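    To put rough numbers on the lane math (a back-of-envelope sketch; the per-lane figures are approximate, one-direction throughput after encoding overhead):

    ```python
    # Approximate one-direction PCIe throughput per lane, in GB/s,
    # after 128b/130b (Gen 3+) encoding overhead. Ballpark figures.
    PER_LANE_GBPS = {3: 0.985, 4: 1.969, 5: 3.938}

    def slot_bandwidth(gen: int, lanes: int) -> float:
        """Approximate one-direction bandwidth of a PCIe slot in GB/s."""
        return PER_LANE_GBPS[gen] * lanes

    # Splitting the Gen 5 x16 slot into x8/x8 still leaves each device
    # with roughly the same bandwidth as a full Gen 4 x16 slot:
    print(round(slot_bandwidth(5, 8), 1))   # Gen 5 x8  -> ~31.5 GB/s
    print(round(slot_bandwidth(4, 16), 1))  # Gen 4 x16 -> ~31.5 GB/s
    ```

    So the x8/x8 split hurts less than it sounds on paper; the lane-count complaint is really about how many devices you can attach at full speed at once.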

  5. #25
    Quote Originally Posted by Medievaldragon View Post
    The ROG Maximus offers plenty of M.2 storage -- five slots -- which of course is welcomed by video editors, content creators, and gamers with ample game libraries. It opens the door for future expansion if needed.
    Nothing you are doing will benefit from PCIe 5.0 NVMe drives. Literally nothing. (Not to mention you'd have to spend WAY more on expensive PCIe 5.0 drives.) Gen 4 is already so fast that unless you're editing RAW 8K footage at native res, you won't see a difference. While manufacturers love to push the newest and the bestest, it isn't needed for 99% of people. Your games will not load faster. The bottleneck is already elsewhere.

    There really isn't even much perceptible gain from PCIe 3.0 to PCIe 4.0 in loading times (like... less than a second). It gets *slightly* better when you take Resizable BAR/SAM into account, but even then it's just a second or three.

    When people say "content creators" and "video editors" they mean people shooting 8K uncompressed footage. That ain't you (no offense). They mean people like Jayz2Cents, BitWit, Paul, GN, etc. And even LTT doesn't really care about PCIe 5.0 because they edit straight off their servers via 10GbE.

    Notice that sometimes the PCIe 3 drives load faster anyway: https://www.techspot.com/review/1893...vs-pcie-3-ssd/

    Even if you're recording 4k footage and editing it for YouTube (which will be compressed), you're not going to max out the 4GB/s read/write on PCIe 4.0, much less 5.0. Nor are you going to do so in the future in the reasonable lifetime of this machine. And even if you did - you can get an external Thunderbolt enclosure that will be just as fast.
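    Quick math backs this up (a rough sketch; it assumes fully uncompressed 10-bit 4:4:4 video, which is already a worst case -- real footage is compressed and far smaller):

    ```python
    def raw_video_rate_gbps(width: int, height: int, fps: int, bits_per_px: int) -> float:
        """Data rate of uncompressed video in GB/s (1 GB = 1e9 bytes)."""
        return width * height * fps * bits_per_px / 8 / 1e9

    # Worst-case 4K60: every frame stored raw at 30 bits per pixel.
    rate = raw_video_rate_gbps(3840, 2160, 60, 30)
    print(round(rate, 2))  # ~1.87 GB/s -- far below what fast Gen 4 NVMe drives sustain
    ```

    Even the pathological uncompressed case sits comfortably inside Gen 4 headroom, which is the whole argument against paying the Gen 5 premium.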

    I'll re-mention the extra cost of PCIe 5.0 over even 4.0, as well.

    In particular, the ROG Maximus also offers one M.2 PCIe 5.0 slot.
    So does the Strix E, as far as I can tell.

    The only Strix that has M.2 5.0 is the ROG Strix Z790-E. The other three models only have the 4.0 version. If I were to go with a Strix, I'd go with the Z790-E. But I do like the Maximus I/O and storage flexibility, even if it's overkill.
    As far as I/O is concerned, they all have Thunderbolt 4. You can get a single TB4 Dock that will have any possible I/O you need.

    Now, I'm not saying "definitely do not under any circumstances get the Maximus". It's your money, and it's a perfectly good board. It's just, IMO, way more than you need to achieve your goals, and you did say in your other thread that the budget wasn't infinite.

    I'd point out that the extra outlay for the Maximus is close to 2/3 of the way to simply building a second rig to stream and edit on (since it'll be sitting there only doing the streaming/recording and compiling, you don't need to go ham on the components, and can re-use some stuff from your existing rig), which you initially said was far outside your budget.

    But if you want the Maximus and that sweet looking RGB I/O shield, you do you man.

  6. #26
    Quote Originally Posted by Kagthul View Post
    Now, I'm not saying "definitely do not under any circumstances get the Maximus". It's your money, and it's a perfectly good board. It's just, IMO, way more than you need to achieve your goals, and [you did say in your other thread that the budget wasn't infinite].
    Just a correction: You are confusing me with someone else who might have said their budget wasn't infinite.

    I asked in a different thread about the AMD 7950X ($699) and the Intel i9-13900K ($629). Now we're discussing the ASUS ROG Maximus.

    Other than that, I welcome the rest of your feedback.

  7. #27
    Quote Originally Posted by Medievaldragon View Post
    Just a correction: You are confusing me with someone else who might have said their budget wasn't infinite.

    I asked in a different thread about the AMD 7950X ($699) and the Intel i9-13900K ($629). Now we're discussing the ASUS ROG Maximus.

    Other than that, I welcome the rest of your feedback.
    Could have sworn that was you; I at one point recommended that a better solution might be to build a streaming/editing rig separate from your main rig. It would cost a little more but would provide a higher-quality stream, and you could do renders without locking yourself out of your only rig. Maybe it was another thread, but I could have sworn it was you.

    If you can do that, I'd do that, honestly. My wife does that (runs her PC video output through a 144Hz-capable splitter and runs the stream from her M1 Mac Mini). The streaming rig itself doesn't even have to be massively powerful. Yeah, the renders might take longer, but you can run them while you're still playing, or overnight.
    Last edited by Kagthul; 2022-09-27 at 11:07 PM.

  8. #28
    Quote Originally Posted by Kagthul View Post
    Could have sworn that was you; I at one point recommended that a better solution might be to build a streaming/editing rig separate from your main rig. It would cost a little more but would provide a higher-quality stream, and you could do renders without locking yourself out of your only rig. Maybe it was another thread, but I could have sworn it was you.

    If you can do that, I'd do that, honestly. My wife does that (runs her PC video output through a 144Hz-capable splitter and runs the stream from her M1 Mac Mini). The streaming rig itself doesn't even have to be massively powerful. Yeah, the renders might take longer, but you can run them while you're still playing, or overnight.
    Does she use a capture card (i.e. Elgato)?

  9. #29
    She has two HDMI 2.0 to USB 3.0 input devices, similar to this one:

    https://www.amazon.com/Capture-Strea...29093730&psc=1

    Not that exact one, but a similar generic Chinesium brand. One handles the video from the PC and the other takes video from her Canon mirrorless that she uses instead of a webcam. I'm sure the Elgato solutions are as good or better, but she's never had an issue with the generics. She uses the Mac version of OBS.
    Last edited by Kagthul; 2022-09-27 at 11:35 PM.

  10. #30
    Quote Originally Posted by Kagthul View Post
    She has two HDMI 2.0 to USB 3.0 input devices, similar to this one:

    https://www.amazon.com/Capture-Strea...29093730&psc=1

    Not that exact one, but a similar generic Chinesium brand. One handles the video from the PC and the other takes video from her Canon mirrorless that she uses instead of a webcam. I'm sure the Elgato solutions are as good or better, but she's never had an issue with the generics. She uses the Mac version of OBS.
    Thanks for sharing. I got an Elgato external device about a year ago to record D2R. I never got it to work properly with OBS + a second PC. Hopefully, this one helps.

    I'm curious, though. You mentioned 144Hz, but the USB device outputs to 60Hz. Is there a higher Hz version of this device?

  11. #31
    Quote Originally Posted by Medievaldragon View Post
    Thanks for sharing. I got an Elgato external device about a year ago to record D2R. I never got it to work properly with OBS + a second PC. Hopefully, this one helps.

    I'm curious, though. You mentioned 144Hz, but the USB device outputs to 60Hz. Is there a higher Hz version of this device?
    No idea. The only reason she needs the splitter to be 144Hz is because her gaming monitor is 144Hz. You're not uploading anything past 60fps anyway, so it doesn't matter what the input into the target PC is, as long as it's 60fps, which is all Twitch or YouTube support anyway.

    And like I said, that's not the exact one she uses; I don't even remember what the name was (there are dozens of different Chinese brands that sell the exact same items with their logo on them). The ones she uses are USB 3.0, not 2.0. I was just giving an example.

    As to the settings required to use a second PC, I have no idea. I don't stream at all; that's my wife's thing. But you could throw together a small mITX or mATX rig with a 12400 and a used Nvidia GPU that has the beef to use NVENC (a 1650 would be ideal) for encoding, for super cheap, particularly if you have old parts you can re-use (RAM from your current rig, case, etc).

    I know it's what a lot of professional Twitch guys use though (a two-system setup), because it removes any possibility of the streaming setup impacting game performance. It's not required by any means. It just offers some benefits that I think are worthwhile -- you can edit and render on the streaming rig without basically taking down your gaming rig, etc.

    YMMV.

  12. #32
    Quote Originally Posted by Kagthul View Post
    No idea. The only reason she needs the splitter to be 144Hz is because her gaming monitor is 144Hz. You're not uploading anything past 60fps anyway, so it doesn't matter what the input into the target PC is, as long as it's 60fps, which is all Twitch or YouTube support anyway.

    And like I said, that's not the exact one she uses; I don't even remember what the name was (there are dozens of different Chinese brands that sell the exact same items with their logo on them). The ones she uses are USB 3.0, not 2.0. I was just giving an example.

    As to the settings required to use a second PC, I have no idea. I don't stream at all; that's my wife's thing. But you could throw together a small mITX or mATX rig with a 12400 and a used Nvidia GPU that has the beef to use NVENC (a 1650 would be ideal) for encoding, for super cheap, particularly if you have old parts you can re-use (RAM from your current rig, case, etc).

    I know it's what a lot of professional Twitch guys use though (a two-system setup), because it removes any possibility of the streaming setup impacting game performance. It's not required by any means. It just offers some benefits that I think are worthwhile -- you can edit and render on the streaming rig without basically taking down your gaming rig, etc.

    YMMV.
    I see. That makes sense. I'll look into the 3.0 version. I gave that second PC to my parent, so err... I guess I'm not getting that back. lol

    If I build an i7-13700K PC, it will be from scratch. I can use my current ASUS Z170-Pro (i5-6600K) PC as the streaming one.

    From your feedback, I'm probably going with the ASUS ROG Strix Z790-E. I have to look up what the supported RAM is, and I'm probably going with an EVGA PSU; that's what I've purchased for the past 20 years. I also have to look for water cooling and a case based on the mobo size.


    WISH LIST

    i7-13700K (overclocked above 6.0GHz -- need to see reviews)
    ASUS ROG Strix Z790-E
    Corsair 5000D case ($110)
    EVGA GQ850 850W power supply ($124)
    RAM (TBD)
    ASUS ROG Strix 3080 OC.

    • I checked the mobo support list. It only supports 3 EVGA PSUs. That one is the highest.

    • The case's front I/O has two USB 3.0 ports and one USB 3.1 USB-C, and for airflow: bottom, top, and side (potential radiators). I could add water cooling on the top for the CPU, and on the side for the graphics card (if I go that extra mile). I'm flexible on the case, though.

    • No RAM compatibility list yet, but it supports 4 x DIMM, max. 128GB, DDR5 7200(OC)/7000(OC)/6800(OC)/6600(OC)/6400(OC)/6200(OC)/6000(OC)/5800(OC)/5600/5400/5200/5000/4800 non-ECC, un-buffered memory*

    • I'm kinda torn on the video card. I need to see what the Intel Arc brings to the table, but I'm not sure I'm willing to lose the Nvidia codec for video. TBD. Pricing of the 3070 Ti OC and 3080 Ti OC is kinda close -- about a $100 difference -- so I'm not sure what to pick there. Still kinda expensive. The 4090/4080 are out of the question.
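    For context on those DDR5 speed grades, peak bandwidth scales linearly with the transfer rate (a rough sketch; it assumes the usual 64 bits per DIMM and dual-channel operation):

    ```python
    def ddr_bandwidth_gbs(transfers_mt_s: int, channels: int = 2) -> float:
        """Peak DRAM bandwidth in GB/s: MT/s x 8 bytes per 64-bit channel x channels."""
        return transfers_mt_s * 8 * channels / 1000

    print(ddr_bandwidth_gbs(5600))  # 89.6 GB/s at a stock DDR5-5600 grade
    print(ddr_bandwidth_gbs(7200))  # 115.2 GB/s at the top OC grade
    ```

    In other words, the jump from the stock grades to the top OC grade is a bandwidth bump of roughly 30%, not a different class of memory.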
    Last edited by Medievaldragon; 2022-09-28 at 10:47 AM.

  13. #33
    Are you working for ASUS or something?

  14. #34
    Quote Originally Posted by Thunderball View Post
    Are you working for ASUS or something?
    Of course not. My current PC has an ASUS Z170-PRO Gaming motherboard.

    I upgrade every 8 years or so. I saw that the ASUS Z790 ROG announcement was on Sept 27, and wanted to share it with the community. There might be others interested in any of the new models.
    Last edited by Medievaldragon; 2022-09-28 at 12:26 PM.

  15. #35
    Quote Originally Posted by Medievaldragon View Post

    • I checked the mobo support list. It only supports 3 EVGA PSUs. That one is the highest.
    ....what? Does it have some kind of special connector? There's absolutely no reason for that to be true.

    • I'm kinda torn on the video card. I need to see what the Intel Arc brings to the table, but I'm not sure I'm willing to lose the Nvidia codec for video. TBD. Pricing of the 3070 Ti OC and 3080 Ti OC is kinda close -- about a $100 difference -- so I'm not sure what to pick there. Still kinda expensive. The 4090/4080 are out of the question.
    Get the best GPU you can afford at the moment. The 3080 Ti is substantially better than the 3070 Ti. For the $100, I'd jump on it. If you look around, you can probably find a 3090 or 3090 Ti for not a lot more; they are fire-saling those things right now.

    - - - Updated - - -

    Looks like the 3090 is another $100. IMO, it's not enough of a step up over the 3080 Ti to bother. The 3080 Ti at around $800 is a great deal. Grab one before they go back up. I'd suggest either an ASUS or Zotac card, simply due to their customer service. With EVGA out of the picture (though they will support their cards for the warranty duration), those are your two best bets for CS.
    Last edited by Kagthul; 2022-09-28 at 09:06 PM.

  16. #36
    Quote Originally Posted by Kagthul View Post
    <snip>
    The 7600X is $300, the 13600K is $310, and the 5800X3D is $430. But I think that on price/performance you'd do better with the 5800X3D, especially if you make frugal choices on parts: DDR4, a lower-wattage power supply, a much lower-cost motherboard.

    - - - Updated - - -

    Quote Originally Posted by Medievaldragon View Post

    i7-13700K (overclocked above 6.0GHz -- need to see reviews)
    ASUS ROG Strix Z790-E
    Corsair 5000D case ($110)
    EVGA GQ850 850W power supply ($124)
    RAM (TBD)
    ASUS ROG Strix 3080 OC.

    • I checked the mobo support list. It only supports 3 EVGA PSUs. That one is the highest.

    • The case's front I/O has two USB 3.0 and one 3.1 USB-C, and for air flow: bottom, top, and side (potential radiators). I could add watercooling on the top for the CPU, and for the Graphic Card on the side (if I go that extra mile). Flexible on the case, though.

    • No RAM compatibility list yet, but it supports 4 x DIMM, max. 128GB, DDR5 7200(OC)/7000(OC)/6800(OC)/6600(OC)/6400(OC)/6200(OC)/6000(OC)/5800(OC)/5600/5400/5200/5000/4800 non-ECC, un-buffered memory*

    • I'm kinda partial on the video card. I need to see what the Intel Arc brings to the table, but I am not sure I am willing to lose the Nvidia codec for video. TBD. Pricing of 3070 Ti OC and 3080 Ti OC are kinda close for about $100 difference -- so not sure there what to pick. Still kinda expensive. The 4090/4080 are out of the question.
    • The motherboard only requires the 24-pin and two 8-pin CPU connectors. As long as the power supply has those, you're fine. That list, wherever you found it, is silly. If you're serious about overclocking any of your components, 850W may not be enough.

    •And you have a front panel usb3 header.

    • For RAM, I'd suggest 2x8GB DDR5-5000; anything extra seems overkill. Maybe 4x8GB if you do something that needs the RAM.

    • I'm hesitant to suggest an A770; it comes in at $330 and will probably have stock issues on release day (10/12). If you want a decent midrange card, the 6700 (non-XT) is damn good price/performance.

  17. #37
    Quote Originally Posted by Kagthul View Post
    ....what? Does it have some kind of special connector? There's absolutey no reason for that to be true.
    Step 1: If you go to the ROG Strix page, there are four tabs: Features, Specs, Gallery and Support. Click the Support tab.
    https://rog.asus.com/motherboards/ro...ng-wifi-model/

    Step 2: This opens a new page. Under CPU/Memory Support tab, there are two tabs: CPU Support and Other Devices. Click Other Devices.

    Step 3: In the Other Devices tab, there is a search box. Next to that search box, there is a dropdown menu: HDD Devices, Peripheral, and Power Supplies. Click Power Supplies.

    That lists all of the Power Supplies compatible with the ROG Strix Z790-E motherboard.
    https://rog.asus.com/motherboards/ro...sk_qvl_device/

    Only 3 EVGA power supplies listed there:


    EVGA 700 GQ 700W
    EVGA SuperNOVA 750 GT 750W
    EVGA GQ850 850W

    A bit strange, but that's what they have listed there. I don't see any compatible RAM yet; I guess it's to be announced. Maybe more EVGA power supplies will be added later.


    Quote Originally Posted by Linkedblade View Post
    •The motherboard only requires the 24 pin and two 8-pin cpu connectors. As long as the power supply has those you're fine. That list, wherever you found it, is silly. If you're serious about overclocking any of your components 850w may not be enough.

    Well, the power supply wasn't by choice. It's just one of only three compatible EVGA supplies listed on their support page =/

    I see Antec has a 1000W PSU ($225), and ASUS offers a 1200W PSU (gasp: $295). I assume it's just that EVGA hasn't announced one for this mobo yet.
    Last edited by Medievaldragon; 2022-09-28 at 09:40 PM.

  18. #38
    Quote Originally Posted by Linkedblade View Post
    The 7600X is $300, the 13600k is $310, and the 5800X3D is $430. But i think that a build based on price/performance you'd do better with the 5800X3D, especially if you make a frugal choice on parts: ddr4, lower wattage power supply, much lower cost motherboard choice.
    The 5800X3D barely beats the 12600K (and often doesn't). The 13600K will beat it soundly (even at stock), and is cheaper. You can use a 600-series motherboard with it, and it still supports DDR4. There is no world in which the 5800X3D is the better choice over the 13600K, especially since it's now 14 cores/20 threads.

    •Ram I'd suggest 2x8GB 5000, anything extra seems overkill. Maybe 4x8GB if you do something that needs the ram.
    Because of how DDR5 is packaged, 2x8 is a poor financial choice, since it isn't much cheaper than 2x16. He's also potentially going to be doing video editing and renders on this rig. Given the lack of substantial savings from only going with 16GB (2x8), there's no point in not getting 32GB.
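    The point is easy to check yourself with a quick price-per-GB comparison (illustrative only -- the kit prices below are hypothetical placeholders, not real quotes):

    ```python
    def price_per_gb(price_usd: float, capacity_gb: int) -> float:
        """Cost per gigabyte of a RAM kit."""
        return price_usd / capacity_gb

    # Hypothetical example prices for two DDR5 kits:
    print(round(price_per_gb(90.0, 16), 2))   # 2x8GB kit,  $/GB
    print(round(price_per_gb(120.0, 32), 2))  # 2x16GB kit, $/GB
    ```

    If the 32GB kit's $/GB comes out lower, as it typically did for DDR5 at the time, the bigger kit is the obvious buy.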

    •I'm hesitant to suggest an A770, it comes in at $330, probably will have stock issues on release day 10/12. If you want a decent midrange card the 6700 (non-xt) is damn good price/performance.
    Since he straight up said he's getting a 3070 Ti or better, I'm not even sure why you brought this up.

    - - - Updated - - -

    Quote Originally Posted by Medievaldragon View Post
    Step 1: If you go to the ROG Strix page, there are four tabs: Features, Specs, Gallery and Support. Click the Support tab.
    https://rog.asus.com/motherboards/ro...ng-wifi-model/

    Step 2: This opens a new page. Under CPU/Memory Support tab, there are two tabs: CPU Support and Other Devices. Click Other Devices.

    Step 3: In the Other Devices tab, there is a search box. Next to that search box, there is a dropdown menu: HDD Devices, Peripheral, and Power Supplies. Click Power Supplies.

    That lists all of the Power Supplies compatible with the ROG Strix Z790-E motherboard.
    https://rog.asus.com/motherboards/ro...sk_qvl_device/

    Only 3 EVGA power supplies listed there:


    EVGA 700 GQ 700W
    EVGA SuperNOVA 750 GT 750W
    EVGA GQ850 850W

    A bit strange, but that's what they got listed there. I don't see any compatible RAM yet. I guess To-Be-Announced. Maybe more EVGA power supplies will be announced later.
    You can safely ignore this entirely (the PSU thing).

    As long as the PSU has enough power on the correct rails, and the right plugs, it will work just fine. This isn't a 12VO board. Also, I'd get a Seasonic over an EVGA; not that EVGA's PSUs are terrible or anything, but Seasonic has a much better warranty and their customer service is even better than EVGA's.

    As for RAM compatibility, all that is is a list of stuff that ASUS has tested with the board. Especially with Intel, this is usually a non-issue (AMD's memory controllers tend to be a bit more temperamental). I've literally never had a set of RAM not work with an Intel chip at its rated XMP, regardless of whether or not it was on the QVL list. Just stick to a decent brand and you'll be fine.

    - - - Updated - - -

    https://pcpartpicker.com/product/jWF...pply-ssr-850fx

    That's the Seasonic unit I'd recommend. You might want to kick it up to 1000W for potential "future proofing" for a higher-end GPU, but honestly I think that by the time you're seriously contemplating replacing the GPU, you're going to have to replace the PSU anyway due to the switch to the new ATX 3.0 connectors.
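    A rough way to sanity-check the 850W-vs-1000W question is a simple power budget (a ballpark sketch; the component wattages below are rough estimates, not measured figures):

    ```python
    def psu_covers(psu_watts: int, parts: dict, margin: float = 0.30) -> bool:
        """True if the PSU covers the summed draw plus a transient/efficiency margin."""
        total = sum(parts.values())
        return psu_watts >= total * (1 + margin)

    # Rough estimates for the build being discussed:
    build = {
        "i7-13700K (PL2 turbo limit)": 253,
        "RTX 3080 Ti (typical board power)": 350,
        "rest of system (board, RAM, drives, fans)": 100,
    }
    print(psu_covers(850, build))   # False -- ~914W wanted with a 30% margin
    print(psu_covers(1000, build))  # True
    ```

    With a 30% cushion for transient spikes and efficiency, 850W is cutting it close for this combo, which is why bumping to 1000W is a defensible call.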

  19. #39
    Quote Originally Posted by Kagthul View Post
    The 5800X3D barely beats the 12600K (and often doesn't). The 13600K will beat it soundly (even at stock), and is cheaper. You can use a 600-series motherboard with it, and it still supports DDR4. There is no world in which the 5800X3D is the better choice over the 13600K, especially since it's now 14 cores/20 threads.



    Because of how DDR5 is packaged, 2x8 is a poor financial choice, since it isn't much cheaper than 2x16. He's also potentially going to be doing video editing and renders on this rig. Given the lack of substantial savings from only going with 16GB (2x8), there's no point in not getting 32GB.



    Since he straight up said he's getting a 3070 Ti or better, I'm not even sure why you brought this up.

    - - - Updated - - -



    You can safely totally ignore this (the PSU thing).

    As long as the PSU has enough power on the correct rails, and the right plugs, it will work just fine. This isn't a 12VO board. Also, I'd get a Seasonic over an EVGA; not that EVGA's PSUs are terrible or anything, but Seasonic has a much better warranty and their customer service is even better than EVGA's.

    As for RAM compatibility, all that is is a list of stuff that ASUS has tested with the board. Especially with Intel, this is usually a non-issue (AMD's memory controllers tend to be a bit more temperamental). I've literally never had a set of RAM not work with an Intel chip at its rated XMP, regardless of whether or not it was on the QVL list. Just stick to a decent brand and you'll be fine.

    - - - Updated - - -

    https://pcpartpicker.com/product/jWF...pply-ssr-850fx

    That's the Seasonic unit I'd recommend. You might want to kick it up to 1000W for potential "future proofing" for a higher-end GPU, but honestly I think that by the time you're seriously contemplating replacing the GPU, you're going to have to replace the PSU anyway due to the switch to the new ATX 3.0 connectors.
    Thanks. Seasonic's ATX 3.0 (PCIe 5.0) units are coming mid-December 2022.

  20. #40
    Quote Originally Posted by Kagthul View Post
    The 5800X3D barely beats the 12600K (and often doesn't). The 13600K will beat it soundly (even at stock), and is cheaper. You can use a 600-series motherboard with it, and it still supports DDR4. There is no world in which the 5800X3D is the better choice over the 13600K, especially since it's now 14 cores/20 threads.
    GN showed it perfectly clearly while reviewing the 7000 series. The charts are built in a way that deliberately creates the most CPU-bottlenecked configuration, to see how the architecture scales up: 1080p, low settings, etc., to see those 600+ fps in Rainbow Six Siege.

    Once you start being more GPU-bound, the scaling flattens immediately. They tested SotTR at 1440p and high details, and the first 7 or 8 CPUs were all around the same fps. So, in the end, all of those are perfectly viable and other factors should be the deciding ones.

    The 7000 series is a terrible value proposition due to the new platform being very costly right now (plus I don't like their cooling solution and the totally unnecessary toasting). The 13600K is surely interesting for the reasons you mentioned, but at that point I could go 12600K and not even see the difference if my GPU can sustain the 144 fps my monitor can actually display.

    Gaming-wise, any CPU relatively high in the stack is basically the same due to generally high core counts and greatly improved tech.

    EDIT: even Intel in their own charts added the 5800X3D in a sketchy way, because it was consistently beating or performing on par with the 13900K in gaming.
