1. #1

    Question Can I CrossFire 2 different cards? (HIS 6870 with another manufacturer's 6870)

    The title asks the question: can I CrossFire my HIS 6870 with another 6870 that's a different brand? Also, if I can do this, can I only use the 6800 series?

  2. #2
    Moderator Uggorthaholy's Avatar
    Join Date
    Feb 2009
    Location
    Granbury, TX
    Posts
    3,046
    Quote Originally Posted by wombinator04
    The title asks the question: can I CrossFire my HIS 6870 with another 6870 that's a different brand? Also, if I can do this, can I only use the 6800 series?
    Everything must be identical, sans manufacturer, I believe. As long as both are 6870s with the same VRAM and clock speeds, you should be good! (Clock speeds may be able to differ; don't quote me on that.)

  3. #3
    TOTALLY NOT
    Banned
    tetrisGOAT's Avatar
    Join Date
    Jun 2008
    Location
    Sweden
    Posts
    12,895
    Quote Originally Posted by uggorthaholy
    Everything must be identical, sans manufacturer, I believe. As long as both are 6870s with the same VRAM and clock speeds, you should be good! (Clock speeds may be able to differ; don't quote me on that.)
    Oh, but I will quote you. - just because I'm an ass!

    This is only true for Nvidia and SLI, which is REALLY picky about GPU and memory size (though the faster card will downclock itself).

    AMD and CrossFire, however, let you mix and match however you want. HD6950 1GiB and an HD6970 2GiB? No worries. HD6950 2GiB and an HD6990 4GiB, a dual-GPU card? Nooo worries.
    HD6870 with an HD6790 or an HD6850? No worries there. (It does not work with the HD6770, though, so don't make that mistake. The 6800 series and the 6900 series are not compatible either.)
    They have to be within the same family, though, i.e. Barts, Cayman, etc.

    Manufacturer, however, does not need to match for either chipset.

    In short, any HD6870 will suit you nicely.
    Last edited by tetrisGOAT; 2011-10-19 at 03:27 AM.
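The pairing rule in the post above boils down to "same GPU family, brand irrelevant." A minimal sketch (purely illustrative, not anything AMD ships; the `FAMILY` table just encodes the cards named in this thread) could look like:

```python
# Family assignments follow the post: Barts and Cayman pair within
# themselves; the HD6770 is a Juniper part, so it won't pair with Barts.
FAMILY = {
    "HD6790": "Barts",
    "HD6850": "Barts",
    "HD6870": "Barts",
    "HD6950": "Cayman",
    "HD6970": "Cayman",
    "HD6990": "Cayman",   # dual-GPU Cayman card
    "HD6770": "Juniper",  # rebadged 5770-class part, not Barts
}

def can_crossfire(card_a, card_b):
    """True if both cards are known and share a GPU family."""
    return card_a in FAMILY and FAMILY.get(card_a) == FAMILY.get(card_b)

print(can_crossfire("HD6870", "HD6850"))  # True  (both Barts)
print(can_crossfire("HD6870", "HD6770"))  # False (Barts vs. Juniper)
print(can_crossfire("HD6950", "HD6990"))  # True  (both Cayman)
```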

  4. #4
    Quote Originally Posted by tetrisGOAT
    Oh, but I will quote you.

    This is only true for Nvidia and SLI, which is REALLY picky about GPU and memory size (though the faster card will downclock itself).

    AMD and CrossFire, however, let you mix and match however you want. HD6950 1GiB and an HD6970 2GiB? No worries. HD6950 2GiB and an HD6990 4GiB, a dual-GPU card? Nooo worries.
    HD6870 with an HD6790? No worries there.

    Manufacturer, however, does not need to match for either chipset.

    In short, any HD6870 will suit you nicely.
    Yeah, thanks. I bought the 6870 off Newegg, but they deactivated it so you can't get it anymore. I'd have to buy it off Amazon from a third-party reseller.

  5. #5
    TOTALLY NOT
    Banned
    tetrisGOAT's Avatar
    Join Date
    Jun 2008
    Location
    Sweden
    Posts
    12,895
    I should mention that I expressed myself poorly. Even with AMD, if you had chosen a 'lesser' Barts model (the 6790), the HD6870 would turn off stream processors and downclock itself automatically.

    So yeah, any 6870 and you're safe. It doesn't matter if it's a reference design or not.

    And uggorthaholy is awesome, I'm just an ass to him sometimes. (Want some candy uggo? :3 )

  6. #6
    Moderator Uggorthaholy's Avatar
    Join Date
    Feb 2009
    Location
    Granbury, TX
    Posts
    3,046
    Quote Originally Posted by tetrisGOAT
    I should mention that I expressed myself poorly. Even with AMD, if you had chosen a 'lesser' Barts model (the 6790), the HD6870 would turn off stream processors and downclock itself automatically.

    So yeah, any 6870 and you're safe. It doesn't matter if it's a reference design or not.

    And uggorthaholy is awesome, I'm just an ass to him sometimes. (Want some candy uggo? :3 )
    *opens mouth* I would love candy

    PS - very nice information Tetris. <3

  7. #7
    Although this thread seems nearly finished, I felt like having some fun and throwing a wrench into your candy party. While it's true that Nvidia is more controlling when it comes to SLI "freedom", there are a few things that can be done with a varying degree of "freedom". I have two Nvidia 9800 GTX+ cards in SLI. One is overclocked, the other is not. The overclocked card, according to GPU-Z, continues to run at its overclocked rate even when SLI is active (not sure how accurate this is; maybe it's not updating). I have not attempted to overclock the standard GTX+ card due to potential cooling problems.

    I have put in my 8800 GTX and my 7800 GTX cards, which have no issue being paired with a 9800 GTX. While it's true that I can't throw a 9600 GS in with a 9800 GT, or a 9800 GT with a 9800 GTX, I still think the downclock would hurt the better card's potential if that were possible. Still, being able to make use of my 8800 GTX (which is not a whole lot different from the 9800 GTX) is amazing, although having two 9800 GTXs in SLI is much better. I just can't justify the downclock for anything less. Eventually I'll purchase an Nvidia 560 Ti, and my 9800 GTX+ OC will become a dedicated PhysX card until I can afford a second 560 Ti.
    "Quit rolling your fat, greasy, cheetos grubbing fingers all over the keyboard!" - Random Burning Crusade Raider circa 2007
    "...tinkering with hardware...more or less electric LEGO for masochists." - Partial quote from Joel Johnson of Kotaku.com
    1/11/06 N/A Ten Day Guest Pass Expired = Vanilla Veteran Proof! (Haters still gonna hate)
    Never forget, that you were warned about the Mists of Pandaria. YOU were WARNED! (10/21/11)

  8. #8
    Pit Lord Wries's Avatar
    Join Date
    Jul 2009
    Location
    Stockholm, Sweden
    Posts
    2,429
    About this bit on SLI downclocking one of the cards if they have different clock speeds: it turns out they don't do that.

    They are able to co-operate even though the clock speeds differ. I'm not an expert in SLI rendering, but I think it's pretty slave-ish in that each card renders every other frame.

    In general, I think there's no benefit to one of the cards being clocked higher; it just doesn't need to downclock, and won't do so either.
    Obsidian 350D | Intel Core i7 2700K @ 4.8GHz | ASUS Maximus V Gene Z77 | 32GB RAM | Nvidia Geforce GTX 980 | 500GB SSD | LG 34" 21:9 34UM95-P
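The "each card renders every other frame" behaviour Wries describes is alternate-frame rendering (AFR). A toy sketch of that scheduling (hypothetical helper name, nothing from any driver API) shows why mismatched clocks need no downclock: each GPU simply owns its own frames, and a faster card just finishes its frames sooner.

```python
def assign_frames(frame_count, gpu_count=2):
    """Map each frame index to the GPU that renders it, round-robin (AFR)."""
    return {frame: frame % gpu_count for frame in range(frame_count)}

schedule = assign_frames(6)
print(schedule)  # {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
```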

  9. #9
    Quote Originally Posted by Wries
    About this bit on SLI downclocking one of the cards if they have different clock speeds: it turns out they don't do that.

    They are able to co-operate even though the clock speeds differ. I'm not an expert in SLI rendering, but I think it's pretty slave-ish in that each card renders every other frame.

    In general, I think there's no benefit to one of the cards being clocked higher; it just doesn't need to downclock, and won't do so either.
    Also, it turns out, after I did some research, that certain cards will sometimes actually overclock to match the already-overclocked card they are paired with in SLI. Albeit rare, I think that's pretty cool in itself.
    "Quit rolling your fat, greasy, cheetos grubbing fingers all over the keyboard!" - Random Burning Crusade Raider circa 2007
    "...tinkering with hardware...more or less electric LEGO for masochists." - Partial quote from Joel Johnson of Kotaku.com
    1/11/06 N/A Ten Day Guest Pass Expired = Vanilla Veteran Proof! (Haters still gonna hate)
    Never forget, that you were warned about the Mists of Pandaria. YOU were WARNED! (10/21/11)
