  1. #1

    Wanting to switch to Intel and Nvidia

    I want to make sure I'm not downgrading. I currently have a Radeon HD 6950 and an AMD FX-8120 (8 cores).

    I was looking at getting an i7-3770K and a GeForce GTX 670.

    I'm a little disappointed by the lack of support devs have been giving AMD users recently. Assassin's Creed 3 is the most recent example: three months after release there are still multiple forum threads per month (most reaching 80+ pages) of people complaining that their AMD hardware only nets them 10-20 fps, myself included. A few people have said it's also causing problems in GW2; I've noticed people with Intel are getting almost twice my fps due to the game's heavy reliance on the CPU.

    Anyway, back to the topic: is it an upgrade? I'm a little worried about downgrading, but I'm pretty sure I'm safe with the CPU; I'm just unsure about the GPU. I get a little confused when comparing the brands.

  2. #2
    I used to be a hardcore AMD fan for many, MANY years. However, over the past couple of years AMD has dropped the ball in several places. There have been many complications with games, and at times (not all the time) their latest chips are sub-par compared to Intel's. After so many years with AMD, I finally decided to make the switch to Intel. Best thing I ever did. The performance just blasts the AMD chip I had out of the water. On top of that, the stability of the Intel platform far exceeds the frequent issues I had with my AMD setups.

    BTW: I also ditched my ATI cards and went with Nvidia. Second best thing I did. (In my opinion) a lot more games work better on Nvidia. ATI drivers are flimsy at best, and I always had issues getting them to work properly. It also didn't help that with ATI, months (sometimes almost a year) would go by before a new driver release came out, and half the time it never really addressed the bugs I was having. Since the switch I'm enjoying great performance from my Nvidia cards, and I don't have to worry about drivers because they put out beta/release drivers on a regular schedule.

    Mind you, I'm sharing my experience as an Intel convert. I'm sure there are others out there who would beg to differ.

  3. #3
    Quote Originally Posted by Xruptor View Post
    BTW: I also ditched my ATI cards and went with Nvidia. Second best thing I did. (In my opinion) a lot more games work better on Nvidia. ATI drivers are flimsy at best, and I always had issues getting them to work properly. It also didn't help that with ATI, months (sometimes almost a year) would go by before a new driver release came out, and half the time it never really addressed the bugs I was having. Since the switch I'm enjoying great performance from my Nvidia cards, and I don't have to worry about drivers because they put out beta/release drivers on a regular schedule.

    Mind you, I'm sharing my experience as an Intel convert. I'm sure there are others out there who would beg to differ.
    If you were still using an ATI card, you definitely should have upgraded. And yes, ATI drivers were a pile of poo. Good thing that ATI no longer produces graphics cards and drivers and has not done so for the last 3-4 years, huh?

  4. #4
    An upgrade by far. Do it and enjoy your games without hassle. If you're only interested in gaming with that computer, buy the i5-3570(K) and save a few AUD.

  5. #5
    First off, I think you're using the wrong FX processor: most current games don't use more than four cores, so half of your eight cores sit idle right there.

    Secondly, it is an upgrade, but I wouldn't recommend the i7-3770K unless you plan to overclock, and even then you can save money by getting the i5-3570K. Your current GPU should easily carry over, so unless you have money to burn, so to speak, I wouldn't recommend getting the GTX 670 for the time being.

  6. #6
    I made the switch around Christmas time after noticing that WoW: MoP was running really poorly on my AMD Phenom 940 and ATI HD 5770. I switched to an FX 6100 and a GTX 660, ended up not liking the FX 6100, and changed out my motherboard and CPU for an i5 3570K and an ASRock Z77 board. Saw a massive improvement over my first chip and video card. Stick with the name brands for GPUs, though stay away from EVGA unless it's a really good deal and a non-reference card.

  7. #7
    moremana (joined Dec 2008, Florida, 3,618 posts)
    An upgrade by far, as George has pointed out. You'll notice it in games more than in apps.

    Don't be fooled by the "8 core" AMD fad; they aren't as good as they're hyped up to be, and they run hot as hell, literally.

  8. #8
    Quote Originally Posted by jholdaway View Post
    I made the switch around Christmas time after noticing that WoW: MoP was running really poorly on my AMD Phenom 940 and ATI HD 5770.
    Well, yeah. The Phenom 940 is a pretty old chip, and the 5770 would have been about three years old at that point.

    I switched to an FX 6100 and a GTX 660, ended up not liking the FX 6100, and changed out my motherboard and CPU for an i5 3570K and an ASRock Z77 board. Saw a massive improvement over my first chip and video card.
    Yes, because going from a Phenom 940 to a 3570K is like going from a Pentium 166 to a Core 2 Quad.

  9. #9
    Quote Originally Posted by moremana View Post
    Don't be fooled by the "8 core" AMD fad; they aren't as good as they're hyped up to be, and they run hot as hell, literally.
    Literally? That means the temperature of lava and people burning in eternal fire? Wouldn't the whole PC catch fire with such a CPU?
    Atoms are liars, they make up everything!

  10. #10
    moremana (joined Dec 2008, Florida, 3,618 posts)
    Quote Originally Posted by Kryos View Post
    Literally? That means the temperature of lava and people burning in eternal fire? Wouldn't the whole PC catch fire with such a CPU?
    yes, yes it would, buy one and watch your pc go up in flames...

    My son decided to go the 8150 route. We both have Gigabyte boards: he has the 990FX, I have the Z77 with an i5 3570K and an HD 7770, and he has a Radeon 7850. In WoW I get 12-25 fps more than he does, and his 200 mm fan blows out hot air while my two 120s on top blow out cool air.

  11. #11
    If they are so bad, why are the PS4 and Xbox 720 going to use AMD 8-core CPUs and AMD video chipsets? The Wii U uses an AMD video chipset too. AMD took them all!

  12. #12
    Quote Originally Posted by Kryos View Post
    Literally? That means the temperature of Lava and people buring in eternal fire? Wouldn't the whole PC catch fire with such a CPU?
    Hell is supposed to have rivers of sulphur (brimstone) and can therefore be no hotter than 444.6°C (boiling point of Sulphur).

    Regardless, I think he was using "hell" in the figurative sense of "hotter than any other CPU on the market".



    Quote Originally Posted by Yova View Post
    If they are so bad, why are the PS4 and Xbox 720 going to use AMD 8-core CPUs and AMD video chipsets? The Wii U uses an AMD video chipset too. AMD took them all!
    Because consoles work differently to PCs. No annoying DirectX API to bother with, for one thing.

  13. #13
    I had AMD chips before, and they were slow and put out way too much heat.
    I went to Intel, and performance was better even with a chip that looked worse on paper.
    The only problem with Intel is that the memory controller is integrated into the CPU, which makes it more expensive but lowers memory latency (this is part of why Intel is faster).
    AMD has the memory controller on the motherboard, making the motherboard more expensive, but you keep the same CPU socket for longer than with Intel. Intel basically switches sockets every generation, so when you want to change the CPU you need to change the motherboard as well.

    I once had an ATI TV tuner, and it was the worst buy I ever made. I never knew ATI was such a crap company: great hardware, but stupid software (drivers) that didn't work at all. I sold the tuner to someone, bought some cheap Chinese one instead, and it lasted more than five years with no problems or crashes.

    When I started gaming it was the era of the Voodoo3 3000 and 3dfx. When they went bust (Nvidia bought them), I switched to Nvidia and was never sorry for it. I have never had any serious problem with their drivers.

    My suggestion: go buy an Intel CPU and an Nvidia card and get rid of that AMD chip and ATI card. Save your nerves.

  14. #14
    Thanks for the replies, guess I'll see if I can order them next week.

    Quote Originally Posted by Yova View Post
    If they are so bad, why are the PS4 and Xbox 720 going to use AMD 8-core CPUs and AMD video chipsets? The Wii U uses an AMD video chipset too. AMD took them all!
    I'm pretty sure that has to do with price too.

    Quote Originally Posted by moremana View Post
    An upgrade by far, as George has pointed out. You'll notice it in games more than in apps.

    Don't be fooled by the "8 core" AMD fad; they aren't as good as they're hyped up to be, and they run hot as hell, literally.
    Lol, so true. Running at 85°C (185°F) at the moment playing GW2. Pretty sure my motherboard is going to shut down my PC any minute now :/ considering idle is somewhere under 55°C if I remember correctly, lol.
    Last edited by Asphyxiate9; 2013-02-09 at 02:01 PM.
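
    For anyone double-checking the temps quoted above: the conversion is Fahrenheit = Celsius × 9/5 + 32. A quick sketch in Python (the helper name is just for illustration):

    ```python
    def c_to_f(celsius):
        """Convert degrees Celsius to degrees Fahrenheit."""
        return celsius * 9 / 5 + 32

    # The load and idle temps mentioned in the post above:
    print(c_to_f(85))  # 185.0
    print(c_to_f(55))  # 131.0
    ```

    So 85°C really is 185°F, which is well above what most desktop CPUs of that era were rated for under load.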

  15. #15
    Quote Originally Posted by Yova View Post
    If they are so bad, why are the PS4 and Xbox 720 going to use AMD 8-core CPUs and AMD video chipsets? The Wii U uses an AMD video chipset too. AMD took them all!
    And since no one is really buying them, they have to use them somehow or lose money. That's just my thoughts on the matter lol.

  16. #16
    Quote Originally Posted by Yova View Post
    If they are so bad, why are the PS4 and Xbox 720 going to use AMD 8-core CPUs and AMD video chipsets? The Wii U uses an AMD video chipset too. AMD took them all!
    They are cheap, and the demand is lower than for Intel or Nvidia, so they can produce as many chips as MS and Sony need. A few months ago AMD was close to bankruptcy and had to let a lot of people go; when a company desperately needs money, you can get great deals.

  17. #17
    Quote Originally Posted by orangelemonrain View Post
    Lol, so true. Running at 85°C (185°F) at the moment playing GW2. Pretty sure my motherboard is going to shut down my PC any minute now :/ considering idle is somewhere under 55°C if I remember correctly, lol.
    The maximum safe operating temperature for the FX-8120 is 61°C core temp. Your CPU is probably throttling heavily, and I really hope your sensors are reading those temps incorrectly, or it will break down pretty soon.

    You should take a serious look into the cooling you use.
    Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
    Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450

  18. #18
    Quote Originally Posted by orangelemonrain View Post
    Lol, so true. Running at 85°C (185°F) at the moment playing GW2. Pretty sure my motherboard is going to shut down my PC any minute now :/ considering idle is somewhere under 55°C if I remember correctly, lol.
    I have an i5-3350P with the stock cooler, and while playing SWTOR at a stable 60 FPS on max details I get 51-57°C on the CPU cores and 55-65°C on the graphics card (GTX 660).
    When idle, the CPU is at around 33°C and the GPU at 35°C. All stock cooling! So I hope now you can see the difference between AMD and Intel. My CPU is also rated to operate at a max of 69°C: over that temp the motherboard downclocks the CPU and voltage, and over 75°C it shuts down the computer. 85°C is very, very close to cooking the cores, so I suggest you not play at those temperatures if you don't want to throw your CPU in the trash.

  19. #19
    Quote Originally Posted by Numeanor View Post
    I have an i5-3350P with the stock cooler, and while playing SWTOR at a stable 60 FPS on max details I get 51-57°C on the CPU cores and 55-65°C on the graphics card (GTX 660).
    When idle, the CPU is at around 33°C and the GPU at 35°C. All stock cooling! So I hope now you can see the difference between AMD and Intel. My CPU is also rated to operate at a max of 69°C: over that temp the motherboard downclocks the CPU and voltage, and over 75°C it shuts down the computer. 85°C is very, very close to cooking the cores, so I suggest you not play at those temperatures if you don't want to throw your CPU in the trash.
    You can't compare Intel CPU temps to AMD's, and your temp limits aren't correct either: TjMax for Ivy Bridge CPUs is 105°C.
    Intel i5-3570K @ 4.7GHz | MSI Z77 Mpower | Noctua NH-D14 | Corsair Vengeance LP White 1.35V 8GB 1600MHz
    Gigabyte GTX 670 OC Windforce 3X @ 1372/7604MHz | Corsair Force GT 120GB | Silverstone Fortress FT02 | Corsair VX450

  20. #20
    If you are set on spending that amount of money on an upgrade, wouldn't a 3570K and a 7970 GHz Edition / GTX 680 be a better option?
