1. #5101
    Quote Originally Posted by Adappy View Post
    Makes me wonder how much VRAM BF3 would use on full ultra with an Eyefinity setup... (I'm sure you'll run out of GPU power before you hit the VRAM limit, though)
    Isn't that what Crossfire and SLI were invented for?

  2. #5102
    The Unstoppable Force DeltrusDisc's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Illinois, USA
    Posts
    20,098
    Quote Originally Posted by Butler Log View Post
    Isn't that what Crossfire and SLI were invented for?
    Yes, but we're wondering what just one of these cards can do.
    "A flower.
    Yes. Upon your return, I will gift you a beautiful flower."

    "Remember. Remember... that we once lived..."

    Quote Originally Posted by mmocd061d7bab8 View Post
    yeh but lava is just very hot water

  3. #5103
    Brewmaster Fierae's Avatar
    10+ Year Old Account
    Join Date
    Jul 2010
    Location
    Northampton, UK
    Posts
    1,331
    Quite a lot; the single-GPU world record on HWBot is now held by the 7970, afaik.

    Some crazy overclocker hit 1700MHz as well.
    Digital Rumination
    Plays: Sylvanas EU - Fierae (Druid) | HotS | EVE | PUBG
    Played: Rift | Guild Wars 2 | SW:TOR | BF4 | Smite | LoL | Skyrim
    Ryzen 1920X - 32GB - 980Ti SLI - PCIE NVMe 1GB SSD - Enthoo Primo - Full WC - 4K

  4. #5104
    Anybody care to explain exactly what this means though:
    Quote Originally Posted by JCPUser@overclock.net
    I don't want to be the one to point this out, but look at the top where the first "Da Orginal" is crossed out. It references a 1000MHz core clock and a number of shaders that is heavily crossed out. That shader number looks to be 2304, which would mean a 36-CU chip... in other words, just like the GTX 480, the 7970 may be a flagship card with part of the die disabled. Here, look at this source as they pick up on the same thing.
    Does it mean that they simply disabled part of the graphics card?
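    For anyone wondering where the 36-CU figure comes from, here is a rough sketch of the arithmetic, assuming GCN's 64 stream processors (shaders) per compute unit; the 2304 number is just the rumoured figure from the slide, not a confirmed spec:

    Code:
    # Rough sketch of the "shaders -> compute units" arithmetic from the quote above.
    # Assumption: GCN packs 64 stream processors (shaders) into each compute unit.
    SHADERS_PER_CU = 64

    def shaders_to_cus(shader_count: int) -> int:
        """Convert a GCN shader count into a compute-unit count."""
        return shader_count // SHADERS_PER_CU

    print(shaders_to_cus(2304))  # 36 -> the crossed-out figure on the slide
    print(shaders_to_cus(2048))  # 32 -> the shader count the HD 7970 actually lists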

  5. #5105
    Warchief sizzlinsauce's Avatar
    10+ Year Old Account
    Join Date
    May 2010
    Location
    Bellforest, Tower state
    Posts
    2,188
    Quote Originally Posted by DeltrusDisc View Post
    $200 for a 1080p 25"? Idk if I would think of that as a deal, tbqh. Just got a 21.5" Acer LED-backlit 1080p monitor yesterday for $115. =X
    That price isn't bad. I paid $319 each for my 27" ASUS monitors.

  6. #5106
    Titan
    10+ Year Old Account
    Join Date
    Oct 2010
    Location
    America's Hat
    Posts
    14,143
    Quote Originally Posted by Butler Log View Post
    What about the liquid cooled 1335 MHz version with 3GB?
    That's probably the one I would get if I were going liquid cooling.

  7. #5107
    Quote Originally Posted by Mortalis71 View Post
    My camera is terrible, but here is my rig right now:



    I will be upgrading to a 670 (when they come out), a waterblock for the 670, and a 200mm rad up front. It looks so much better in person since my fan and tubing are UV reactive and I have some cathodes.
    Slightly photoshopped? It ended up looking like my 'shopped copy of the pic below.





    Quote Originally Posted by Rennadrel View Post
    I guess I shouldn't say anything, my case is flat black and the only other colour on it are the blue LED for the power button and the blue LED on my PSU.
    But blue is cool!

    I also removed the white power/HDD LEDs from my 700D front panel and replaced them with blue LEDs.
    5800X | XFX 7900XTX | Prime X570 Pro | 32GB | 990Pro + SN850 2TB | Define 7

  8. #5108
    Quote Originally Posted by Butler Log View Post
    Anybody care to explain exactly what this means though:


    Does it mean that they simply disabled part of the graphics card?
    Either that, or it could mean that they are intentionally limiting it so they can push out a newer GPU later, when nVidia pushes Kepler. Because nobody expects the Spanish Inquisition.

    It would be quite brilliant: lulling nVidia into a false sense of security, and then bam. We'll see what happens later. If this is in fact true at all, of course!
    (Also, it's much less expensive to develop a single GPU and cut it down than to develop several different ones.)
     

  9. #5109
    Brewmaster Fierae's Avatar
    10+ Year Old Account
    Join Date
    Jul 2010
    Location
    Northampton, UK
    Posts
    1,331
    The thing is, without knowing the specs of the Kepler cards, it could end up going really well for AMD, or really badly!
    Digital Rumination
    Plays: Sylvanas EU - Fierae (Druid) | HotS | EVE | PUBG
    Played: Rift | Guild Wars 2 | SW:TOR | BF4 | Smite | LoL | Skyrim
    Ryzen 1920X - 32GB - 980Ti SLI - PCIE NVMe 1GB SSD - Enthoo Primo - Full WC - 4K

  10. #5110
    Why?
    I mean, it's not like AMD is as afraid of nVidia as the other way around.
    And there have been leaked specs already.
     

  11. #5111
    Brewmaster Fierae's Avatar
    10+ Year Old Account
    Join Date
    Jul 2010
    Location
    Northampton, UK
    Posts
    1,331
    Specs, sure, but specs don't always mean performance.

    It could go badly if the nVidia cards wipe the floor, even with this unknown AMD card. Neither company is really afraid of the other, I don't think, as it's quite clear what their target markets are.
    Digital Rumination
    Plays: Sylvanas EU - Fierae (Druid) | HotS | EVE | PUBG
    Played: Rift | Guild Wars 2 | SW:TOR | BF4 | Smite | LoL | Skyrim
    Ryzen 1920X - 32GB - 980Ti SLI - PCIE NVMe 1GB SSD - Enthoo Primo - Full WC - 4K

  12. #5112
    The high-end segment doesn't really matter for anything other than prestige. AMD could drop their interest in the high end (HD x8xx and higher) and likely increase their profit.

    It would only matter to the gamer and enthusiast base, and that's a very small part of the PC world. And many, many gamers hold on to their GPUs for years.
     

  13. #5113
    Quote Originally Posted by tetrisGOAT View Post
    No. There are none.
    Chaud lied to me? D: He told me to wait a month so the 6xx series would be released when I order my computer.

  14. #5114
    He must've mixed it up with the HD 7000 series.

  15. #5115
    Quote Originally Posted by Drunkenvalley View Post
    He must've mixed it up with the HD 7000 series.
    Ahh, when will those be released then?

  16. #5116
    Quote Originally Posted by Poppasan View Post
    Chaud lied to me? D: He told me to wait a month so the 6xx series would be released when I order my computer.
    A month? More like late spring or early summer. Last I heard, it's "May". However, nVidia is said to be releasing them ground-up, meaning the low-performance and mobile parts hit first, ending with their flagship.
    Whether they start this in May, or the highest end hits in May, I do not know.

    So much for the people promising the GTX 600-series would be released before the fall. Of 2011.

    ---------- Post added 2011-12-31 at 05:58 PM ----------

    Quote Originally Posted by Poppasan View Post
    Ahh, when will those be released then?
    January for the HD 7970, the highest-end part. Find reviews in the big thread about it.
    The rest are likely to be announced in January and released in February.
    Preliminary guesses here.
     

  17. #5117
    Brewmaster Fierae's Avatar
    10+ Year Old Account
    Join Date
    Jul 2010
    Location
    Northampton, UK
    Posts
    1,331
    nVidia should be Q2, but they offset their quarters about a month behind for some reason - so Q1 is roughly Feb-Apr, Q2 is May-Jul, etc., afaik (rough sketch of that mapping below).
    Digital Rumination
    Plays: Sylvanas EU - Fierae (Druid) | HotS | EVE | PUBG
    Played: Rift | Guild Wars 2 | SW:TOR | BF4 | Smite | LoL | Skyrim
    Ryzen 1920X - 32GB - 980Ti SLI - PCIE NVMe 1GB SSD - Enthoo Primo - Full WC - 4K
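    To illustrate the offset, here is a minimal sketch assuming a fiscal year that starts in February (about one month behind the calendar year); the exact month boundaries are an assumption for illustration only:

    Code:
    # Minimal sketch of the offset-quarter idea, assuming a fiscal year that
    # starts in February (about one month behind the calendar year).
    MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
              "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]
    FISCAL_YEAR_START = 1  # index of "Feb" in MONTHS (assumption for illustration)

    def fiscal_quarter_months(quarter: int) -> list[str]:
        """Return the three calendar months covered by fiscal quarter 1-4."""
        start = FISCAL_YEAR_START + (quarter - 1) * 3
        return [MONTHS[(start + i) % 12] for i in range(3)]

    for q in range(1, 5):
        print(f"Q{q}:", "-".join(fiscal_quarter_months(q)))
    # Q1: Feb-Mar-Apr, Q2: May-Jun-Jul, Q3: Aug-Sep-Oct, Q4: Nov-Dec-Jan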

  18. #5118
    So, I just bought L.A. Noire, and it seems to run at an unplayable FPS (10 at idle), even though I should have more than the minimum requirements. Is this a known problem or just me? I have all my settings at lowest, btw.

    Should have posted specs:
    GPU - GT 220
    Some random 4GB of RAM
    CPU - Pentium(R) Dual-Core E5200
    Last edited by Poppasan; 2012-01-01 at 02:06 PM.

  19. #5119
    I forgot what your rig looks like, Poppasan, so it's hard for me to tell.

  20. #5120
    Deleted
    Quote Originally Posted by Poppasan View Post
    Should have posted specs:
    GPU - GT 220
    Some random 4GB of RAM
    CPU - Pentium(R) Dual-Core E5200
    The GT 220 is an incredibly weak and outdated card. I see nothing weird in it running L.A. Noire at such low framerates, whatever the resolution might be.

    The minimum requirements list an 8600 GT, which to my knowledge is more powerful than a GT 220. Perhaps if you bring all settings as low as possible and run it at 1024x768, the framerate will rise to "playable"; say 15-20 fps.
    Last edited by mmoc7c6c75675f; 2012-01-01 at 02:31 PM.
