  1. #41
    Deleted
    Quote Originally Posted by vesseblah View Post
    Hard to make fair comparison with the silicon lottery.
    Well, not really; with Ivy/Sandy, 10-15% additional performance is statistically very likely (when talking about "K" chips, of course).

  2. #42
    Rennadrel
    Quote Originally Posted by Cyanotical View Post
    Something to keep in mind: Crysis 3 is one of the few games that actually takes advantage of multicore CPUs and depends less on IPC (compared to, say, WoW), so there is nothing wrong with the FX-8350 beating the 3570K in that situation. If more and more games start to take advantage of multicore CPUs, the days of quad cores being optimal for gaming are done, but everyone knew that already; it's just a matter of whether people are willing to accept it.
    I think, though, that in the next couple of years multithreading is going to be very important to gaming and the computer industry as a whole. Next-generation consoles are coming and will usher in a new era of graphics technology for the TV gamer market. Studios will start optimizing for processor efficiency instead of what they do now on consoles, which is maxing out the CPU just to render the graphics properly, and that isn't efficient for the processor itself. Advanced lighting, shading, and textures are impossible on current-generation consoles (Wii U excluded), and physics doesn't work well at all because the systems can't process it. With the next-generation systems, and PCs alike, physics and advanced graphics features will be available on all platforms. I think you will see better use of PC technology, since consoles will be competitive with it for a few years.

    PCs aren't even optimized to run effectively, since so many games are designed with consoles in mind and then ported to the PC. I think the opposite is going to happen with the next generation: the PC will still be more powerful, but the technology in the consoles will all be pretty similar, so developers will design for the computer first and then port as needed. In terms of cost effectiveness, having all 3 consoles running on x86 next generation is going to help as well.
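The multithreading point above can be put in rough numbers with Amdahl's law: extra cores only help as far as the parallelizable fraction of the engine's work allows. A minimal sketch, with purely illustrative fractions rather than measured data:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Best-case speedup on `cores` cores when `parallel_fraction`
    of the per-frame work can run in parallel (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A mostly serial engine (60% parallel) vs. a well-threaded one (95%):
for p in (0.60, 0.95):
    gains = [round(amdahl_speedup(p, n), 2) for n in (2, 4, 8)]
    print(f"{int(p * 100)}% parallel -> speedup on 2/4/8 cores: {gains}")
```

With only 60% of the work parallel, eight cores buy barely more than a 2x speedup, which is why per-core IPC still matters today; at 95% parallel, the extra cores keep paying off.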

  3. #43
    Quote Originally Posted by Rennadrel View Post
    PCs aren't even optimized to run effectively, since so many games are designed with consoles in mind and then ported to the PC. I think the opposite is going to happen with the next generation: the PC will still be more powerful, but the technology in the consoles will all be pretty similar, so developers will design for the computer first and then port as needed. In terms of cost effectiveness, having all 3 consoles running on x86 next generation is going to help as well.
    I'm pretty sure they'll still be designed for consoles first, since there's a lot more money to be made there than in the PC market. But hopefully we'll see better ports once they're no longer stuck developing games for 7-year-old hardware.
    Quote Originally Posted by Karragon View Post
    I'd like WoW to be a single player game

  4. #44
    Quote Originally Posted by blargh312 View Post
    I'm pretty sure they'll still be designed for consoles first since there's a lot more money to be made there vs. the PC market. But hopefully we'll see better ports since they're not stuck developing games for 7 year old hardware.
    When the new consoles come out, they will be based on year-old midrange PC hardware, because of long development times and the need to keep prices and cooling at a reasonable level. My bet is they will be about on par with an A10-5800 processor and a Radeon 7670-series GPU, not much higher than that.
    Never going to log into this garbage forum again as long as calling obvious troll obvious troll is the easiest way to get banned.
    Trolling should be.

  5. #45
    Quote Originally Posted by vesseblah View Post
    When the new consoles come out those will be based on year old midrange PC hardware because of long development times and the requirement to push prices and cooling into reasonable level.
    1-year-old midrange hardware is still a shitload better than 7-year-old midrange hardware.

  6. #46
    Quote Originally Posted by blargh312 View Post
    1 year old midrange hardware is still a shitload better than 7 year old midrange hardware.
    And in 2-3 years consoles will be hopeless shit again compared to the DIY gaming PCs.

  7. #47
    Deleted
    Quote Originally Posted by vesseblah View Post
    And in 2-3 years consoles will be hopeless shit again compared to the DIY gaming PCs.
    Optimist. :P No way they're going to keep up, considering how fast mobile chips are advancing now.
    I give it 2 years tops.

  8. #48
    Cyanotical
    The current expectation is that, with both new consoles being x86-based, games will be developed for PC first and then ported down. However, in the case of the PS4, with it running Linux and having direct hardware access, a port script could be made to easily port PS4 games to OpenGL for Linux-based computers.

  9. #49
    Quote Originally Posted by Cyanotical View Post
    in the case of the PS4, with it running Linux, and having direct hardware access, that means a port script could be made to easily port PS4 games to OpenGL for Linux-based computers
    Not gonna be that easy, because Linux doesn't really give games any more direct hardware access than Windows does.

    Quote Originally Posted by Cyanotical View Post
    the current expectation is that due to both the new consoles being X86 based, games will be developed for PC first, then ported down
    That would be really nice, but it remains to be seen. Porting code nowadays is fairly trivial anyway, since people use cross-platform engines like Unity to make the games. The problem is texture quality, which is kept low for the consoles' limited video RAM. Making high-res textures and scaling them down is a lot more expensive than making low-res textures and just feeding PC players shit like they do now.

  10. #50
    Cyanotical
    Quote Originally Posted by vesseblah View Post
    Not gonna be that easy because linux doesn't really give direct hardware access to games any more than windows does.
    The PS4 gets direct hardware access, which means you could just write a script to port from the hardware profile to OpenGL.

    Quote Originally Posted by vesseblah View Post
    ...and just keep feeding PC players shit like they do now.
    Cross some fingers and hope that doesn't happen; right now I would say it's 50/50.

  11. #51
    Hmm, I think this is going slightly off topic, but I'd like to share this video on how to make your CPU run at 100% to gain higher GPU loads in Crysis 3. Check Nocturnal's pics of the CPU loads, where you can see the CPU isn't hitting its limit yet.

    It's only going to be useful if you run cards like a GTX 690 in SLI or Titans in SLI. With a single 680 this video isn't needed, but it's still worth checking out. It seems he was able to push 12-15% extra on each of his three 680s.


  12. #52
    Quote Originally Posted by Faithh View Post
    Hmm, I think this is going slightly off topic, but I'd like to share this video on how to make your CPU run at 100% to gain higher GPU loads in Crysis 3. Check Nocturnal's pics of the CPU loads, where you can see the CPU isn't hitting its limit yet.

    It's only going to be useful if you run cards like a GTX 690 in SLI or Titans in SLI. With a single 680 this video isn't needed, but it's still worth checking out. It seems he was able to push 12-15% extra on each of his three 680s.
    To be honest, his problem is running $1500 worth of GPUs off of a $300 CPU, not to mention any PCIe lane problems from using Z77 instead of X79.

    A 3-to-1 price ratio (GPU to CPU) is already pushing it, but 5-to-1 in a physics-heavy game just isn't going to cut it.
    Last edited by yurano; 2013-03-04 at 02:02 AM.

  13. #53
    Quote Originally Posted by yurano View Post
    To be honest, his problem is running $1500 worth of GPU off of a $300 CPU. Not to mention any PCIe lane problems with Z77 instead of x79.

    A 3 to 1 price ratio (GPU to CPU) is already pushing it but 5 to 1 in a physics heavy game just isn't going to cut it.
    Why would the lanes be a problem?

  14. #54
    Quote Originally Posted by Faithh View Post
    Why would the lanes be a problem?
    Background: AFAIK, a 3770K has 23 PCIe 3.0 lanes. Some are used for USB3/SATA bus so for purposes of PCIe cards, there are only 20 lanes available. For SLI/CF the lanes are split as follows: 16x for one card, 8x/8x for two cards or 8x/8x/4x for three cards (depends on motherboard). In addition, PCIe 3.0 8x has the same bandwidth as PCIe 2.0 16x.

    http://i.imgur.com/4sYbB.jpg

    The linked image is a test by /r/gamingpc comparing PCIe 2.0 16x/16x (equivalent in bandwidth to PCIe 3.0 8x/8x) against PCIe 3.0 16x/16x for a two-card SLI setup. In the bandwidth-limited case (PCIe 2.0 16x/16x) there's some reduction in performance at 1920x1080, albeit marginal. The penalty is much worse at 5760x1080. The YouTube video you linked seemed to be using 1440p, which is roughly halfway between 1920x1080 and 5760x1080 in terms of pixel count.

    We must further consider that the YouTuber used 3-way SLI (8x/8x/4x) instead of 2-way SLI. In the following link, the same bandwidth bottleneck is observed.

    http://www.evga.com/forums/tm.aspx?m=1537816&mpage=1
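The bandwidth equivalence claimed above can be sanity-checked with back-of-the-envelope arithmetic from the nominal PCIe signaling rates and encodings (a sketch of the raw link rates; real-world throughput is lower due to protocol overhead):

```python
# Usable per-lane bandwidth in GB/s: PCIe 2.0 signals at 5 GT/s with
# 8b/10b encoding; PCIe 3.0 signals at 8 GT/s with 128b/130b encoding.
PER_LANE_GBPS = {2: 5.0 * (8 / 10) / 8, 3: 8.0 * (128 / 130) / 8}

def link_bandwidth(gen, lanes):
    """Nominal one-direction link bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth(2, 16))               # PCIe 2.0 x16 -> 8.0 GB/s
print(round(link_bandwidth(3, 8), 2))      # PCIe 3.0 x8  -> ~7.88 GB/s

# Pixel counts behind the "1440p is roughly halfway" remark:
for w, h in [(1920, 1080), (2560, 1440), (5760, 1080)]:
    print(f"{w}x{h}: {w * h} pixels")
```

PCIe 3.0 x8 lands within about 2% of PCIe 2.0 x16, which is the equivalence used in the comparison above, and 2560x1440 does sit between the two tested resolutions in raw pixel count.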

  15. #55
    Quote Originally Posted by yurano View Post
    Background: AFAIK, a 3770K has 23 PCIe 3.0 lanes. Some are used for USB3/SATA bus so for purposes of PCIe cards, there are only 20 lanes available. [...]
    I think you're making a big mistake. A lane bottleneck would mean the GPU loads could never go higher, but in his case they did go higher. The lanes aren't the bottleneck here; the CPU was, which is why it was running at 100% while the GPUs were chilling at 60% load.

    He could use a board like the Maximus V Extreme, which supports quad SLI at 8x/8x/8x/8x thanks to its PLX chip. The PLX chip adds lag, or more precisely it increases frame render latency, but that's separate from the load question again.

    Does a GTX 690 in a single PCI Express slot run at 16x or 8x? I'm not sure about this. If it runs at x16 and you add another 680, you'd have 8x (690) / 8x (680), so I don't see the problem here?
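The CPU-at-100%, GPUs-at-60% pattern being argued here follows from a simple pipeline picture: whichever stage takes longer per frame sets the frame rate, and the faster stage reports partial load. A toy model, with hypothetical frame times rather than measurements from the video:

```python
# Toy frame-pipeline model: each frame needs some CPU work and some GPU
# work; the slower stage sets the frame time and the faster one idles.
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)
    return {
        "fps": 1000.0 / frame_ms,
        "cpu_load": cpu_ms / frame_ms,  # fraction of frame the CPU is busy
        "gpu_load": gpu_ms / frame_ms,  # fraction of frame the GPU is busy
    }

# CPU-bound case: 16 ms of CPU work vs. 9.6 ms of GPU work per frame
# reproduces the pattern described above (CPU 100%, GPU 60%).
print(frame_stats(cpu_ms=16.0, gpu_ms=9.6))
```

In this picture, raising GPU load without changing the cards means shrinking the CPU's per-frame time (the point of the video), not adding PCIe bandwidth.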

  16. #56
    Can you even run tri-SLI 680 with a 690 and a 680?

  17. #57
    Quote Originally Posted by Faithh View Post
    Think you're making a big mistake. Lanes bottleneck means you never can get higher gpu loads, in his case he did reach higher gpu loads. [...]
    And he's clearly using a Maximus V with PLX? 16x + 8x is more than the 20 PCIe lanes available to a standard Z77 chipset.

    Again, I'd like to reiterate that his problem is trying to run $1500 worth of graphics cards off a $300 CPU.

  18. #58
    Quote Originally Posted by yurano View Post
    And he's clearly using a Maximus V with PLX? 16x + 8x is more than the 20 PCIe lanes available to a standard Z77 chipset.

    Again, I'd like to reiterate that his problem is trying to run $1500 worth of graphics cards off a $300 CPU.
    Yes he is using a Maximus V extreme. http://www.youtube.com/watch?v=YUKathtsFEw

    So your bandwidth-limitation argument doesn't hold here.

    ---------- Post added 2013-03-04 at 06:57 PM ----------

    Quote Originally Posted by Butler Log View Post
    Can you even run tri-SLI 680 with a 690 and a 680?
    Seems like it's only possible with a 6990 + 6970, so no, I guess.

  19. #59
    Cyanotical
    Quote Originally Posted by Butler Log View Post
    Can you even run tri-SLI 680 with a 690 and a 680?
    No, only with CrossFire; SLI requires the same model, i.e. all 660 Tis or all Titans, no mixing and matching.
