  1. #21
    Quote Originally Posted by Nathasil View Post
    How exactly is that a "bad" or "fundamentally flawed" or "ass" product? It's too expensive, I get that. But so are Apple products. Are Apple products "ass", too? The 4060 Ti is a good GPU. Depending on your exact preferences, it is even a very good GPU.
    Yeah, there aren't bad products, just bad prices. And Nvidia right now is just crazy with how high their prices are. The 4060 Ti, both models, is $100-150 over what it should be. The 4090 is about $800-1,000 over the price it should be. Nvidia's entire RTX 4000 product stack has this problem.

  2. #22
    Quote Originally Posted by Djuntas View Post
    https://www.youtube.com/watch?v=rtt6...sableadblock=1 Enjoy, HUB.

    Anyway, the lower wattage is more down to Nvidia power-limiting the 40 series (IMO), plus other efficiency gains.
    Anyway, why don't you want 1440p? I see many screens nowadays not even made in 1080p. I doubt we'll see any 1080p OLED screens anytime soon. When OLED drops in price, that's an instant buy for me... I recently bought the stupidly popular LG GP850 or whatever it's called (HUB reviewed it as well), and nah, man... My cheap LG A1 OLED TV, literally ½ the price of many of the OLED monitors, looks so much better it's not even close... And I don't care about Hz, lag and all that shit (my old 144 Hz TN screen is still fine, just not as pretty colours as IPS; I'm too slow and bad at games anyway), but OLED is also king in that.
    1440p is just not an option for me. The size of my room and desk and the distance I sit from my screen do not allow for a monitor larger than typical 1080p displays... so it's just physically not an option without getting a sore neck and tired eyes as a bonus, because I would sit way too close to the screen.

    But maybe more important than that: I would not WANT to play at 1440p. I personally - SUBJECTIVELY - see absolutely no difference in image quality between 1080p and 1440p. I DO see a difference between medium details + no RT @ 40fps and ultra details + RT @ 60fps... so I would personally always opt for the latter. I do not need more than a solid 60fps, either. So a card that delivers 1080p at maximum settings at a solid 60fps is absolutely perfect for me. That the 4060 Ti is also low on power consumption (which also implies it's quiet and cool) is just a bonus. A big bonus.
    Last edited by Nathasil; 2023-05-25 at 09:30 PM.

  3. #23
    Quote Originally Posted by Nathasil View Post
    1440p is just not an option for me. The size of my room and desk and the distance I sit from my screen do not allow for a monitor larger than typical 1080p displays... so it's just physically not an option without getting a sore neck and tired eyes as a bonus, because I would sit way too close to the screen.
    Not even sure how to take this comment. Very vague...and what's your definition of 'typical 1080p displays'? Seems like you need to do some more research on displays.

    "My memory... since when? If everything is a dream, don't wake me." -Cloud Strife, Final Fantasy VII

  4. #24
    Quote Originally Posted by Pann View Post
    Are you sure you're not thinking of the 6700 XT? Every benchmark I've seen has the 6800 XT some way ahead of the 4060 Ti.
    You are right. I only checked Cyberpunk, but out of the four games GN tested at 1080p + RT, Cyberpunk was the only one where the 6800 XT was weaker; it performed better in the other three.

    Tomb Raider:
    6800 XT: 103 fps low, 135 fps avg
    4060 Ti: 92 fps low, 113 fps avg

    Control:
    6800 XT: 76 fps low, 93 fps avg
    4060 Ti: 65 fps low, 75 fps avg

    F1 2022:
    6800 XT: 66 fps low, 99 fps avg
    4060 Ti: 62 fps low, 83 fps avg

    I currently have a 240 Hz FreeSync display, but before that I always played at 60 Hz + vsync, and honestly, I experience so much screen flickering now that I intend to go back to 60 Hz + vsync with my next setup. That means I would not benefit from anything beyond 60fps anyway; I would only be hit hard by anything below 60fps. So the higher average frames of the 6800 XT are not really of any benefit to me... the 140 watts less, however, would be, as would the lower price.
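
    To put rough numbers on that trade-off, here is a minimal Python sketch using the GN averages quoted above; the ~140 W draw difference and the ~31 cent/kWh rate are figures from this thread, not fresh measurements:

    Code:
    # Rough value comparison from the GN 1080p+RT averages quoted above.
    avg_fps = {  # game: (6800 XT avg, 4060 Ti avg)
        "Tomb Raider": (135, 113),
        "Control": (93, 75),
        "F1 2022": (99, 83),
    }

    for game, (amd, nv) in avg_fps.items():
        print(f"{game}: 6800 XT is {100 * (amd / nv - 1):.0f}% faster on average")

    # Extra running cost of the hungrier card, using this thread's numbers.
    extra_kw = 0.140            # ~140 W higher draw for the 6800 XT (assumed)
    eur_per_kwh = 0.31          # the German rate quoted later in the thread
    print(f"Extra running cost: {100 * extra_kw * eur_per_kwh:.1f} cents/hour")

    So the 6800 XT averages roughly 19-24% more fps in those three games, at about 4.3 cents per hour of extra electricity at that rate.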

    - - - Updated - - -

    Quote Originally Posted by TbouncerT View Post
    Not even sure how to take this comment. Very vague...and what's your definition of 'typical 1080p displays'? Seems like you need to do some more research on displays.
    I don't understand what's vague about it. The typical 1080p display is 24-25 inches visible. Bigger than that is typically 1440p and up. How big your screen should be is a direct function of your viewing distance: it should never be so large that you need to turn your head or constantly move your eyes. There are a lot of "formulas" for that out there, and display manufacturers try to rewrite them constantly because they want to sell you bigger displays. But I can say that a display larger than 25" would not only be too large for me subjectively - it would also quite physically not fit on my desk.
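
    For reference, one of those rules of thumb is the THX-style ~40 degree horizontal viewing-angle guideline; a minimal sketch of the math, where the 40 degree angle and the 75 cm viewing distance are illustrative assumptions rather than anything from this thread:

    Code:
    import math

    def max_diagonal_inches(distance_cm: float, angle_deg: float = 40.0) -> float:
        # Screen width that fills the given horizontal viewing angle.
        width_cm = 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)
        # For a 16:9 panel, diagonal = width * sqrt(16^2 + 9^2) / 16.
        diagonal_cm = width_cm * math.hypot(16, 9) / 16
        return diagonal_cm / 2.54  # centimetres to inches

    print(f"~{max_diagonal_inches(75):.0f} in max at 75 cm")  # roughly 25 in

    At a typical desk distance, that kind of formula does land in the 24-27" range, consistent with the point above.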

    Then there is the more important question: Why do it? Why should I buy a new desk, refit my room and buy a bigger screen (which needs more watts to run and produces more heat), just to then have to buy a more powerful PC and sit further away from the screen, only to have the image quality be SUBJECTIVELY exactly as it was before? That makes absolutely zero sense.

    My room is very small. It's also beneath the roof. Temperature is a HUGE issue for me, especially in summer, and the coming years will only exacerbate this issue. I can FEEL the watts of heat my PC is blowing at me. Getting the machine to run cooler is a much, much, MUCH higher priority for me than going from 1080p to 1440p.

    Again: that is my personal priority and my subjective view on image quality. I do not claim those are objective facts, or that I am in some way speaking for the "average gamer" or anything like that. I can only say what my gaming reality is and what my priorities are. And those are running 1080p at the highest possible image quality and a solid 60fps while consuming as little power as possible.
    Last edited by Nathasil; 2023-05-25 at 10:20 PM.

  5. #25
    https://pcpartpicker.com/products/mo...t=price&page=1

    Plenty of 1440p monitors at 24 to 25”.

    Not that a 27” monitor is actually much bigger, and that’s the most common size for 1440p.

  6. #26
    Quote Originally Posted by Kagthul View Post
    https://pcpartpicker.com/products/mo...t=price&page=1

    Plenty of 1440p monitors at 24 to 25”.

    Not that a 27” monitor is actually much bigger, and that’s the most common size for 1440p.
    https://pcpartpicker.com/products/mo...000&sort=price

    1,470 1080p displays at 24" vs 42 1440p displays... how is that not "typical"? I think that pretty much is the definition of "typical".

    My opinion on resolution: you need to run a resolution at which you personally are no longer disturbed by pixel artifacts. That point is subjective. It's not the same for everybody, just like some people have much, much higher expectations of the colour clarity or lighting of their display while others see absolutely no difference. Do I personally see pixel artifacts running 1080p on a 25" display? No, I don't. I really don't. Which means going to 1440p would not be of "small" benefit to me... it would be of ZERO benefit to me. If I were running a 30" display, things would be different. I would probably start to see pixels at 1080p then and would want to upgrade to 1440p. But given the choice, I would ALWAYS run 1080p at higher settings instead of going for 1440p. That is my personal opinion on resolution.
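
    That subjective threshold can be put in numbers as pixels per degree of visual angle; a quick Python sketch, where the 60 cm viewing distance is an illustrative assumption (one arcminute per pixel, i.e. ~60 px/deg, is the commonly cited 20/20 acuity figure):

    Code:
    import math

    def ppi(h_px: int, v_px: int, diagonal_in: float) -> float:
        # Pixels per inch of a panel from its resolution and diagonal.
        return math.hypot(h_px, v_px) / diagonal_in

    def pixels_per_degree(ppi_val: float, distance_cm: float) -> float:
        # How many pixels one degree of visual angle spans at this distance.
        return ppi_val * (distance_cm / 2.54) * math.tan(math.radians(1))

    for name, (h, v) in {"1080p": (1920, 1080), "1440p": (2560, 1440)}.items():
        d = ppi(h, v, 25)
        print(f"{name} @ 25in: {d:.0f} PPI, {pixels_per_degree(d, 60):.0f} px/deg")

    At 25" and 60 cm that works out to roughly 36 px/deg for 1080p and 48 px/deg for 1440p, so whether the jump is visible really does depend on eyes and distance.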

    Anyway: this thread was not (and should not be) about 1080p vs 1440p, I think. It was about a GPU that is specifically marketed as a top-end 1080p gaming GPU being called "a waste of sand". I disagree. I think it is an excellent GPU for that resolution, not just in specs; it is also reasonably priced against AMD cards that can deliver 60fps at 1080p ultra + RT. The whole current gen of GPUs is too expensive, but that includes AMD just as well.

    My takeaway from all of this is that GN (and some posters in this thread) think playing at 1080p simply does not "deserve" 60fps at ultra + RT. Which I find incredibly conceited. "If you cannot afford a 4K setup, why should you even bother with fps or fidelity settings?" There is no other explanation for why GN does not take the cheapest currently available GPU on the market that can consistently deliver 60fps at 1080p ultra + RT seriously.

    If there were a bunch of cheaper, better GPUs available that could run 1080p ultra at 60fps, it would be a totally different story. But there aren't. There really aren't. Of all the GPUs discussed in this thread, the 4060 Ti is the cheapest, best value-per-frame card that can do that.

    Edit: Also, aside from not taking 1080p seriously, GN criticised the 4060 Ti for not being a serious generational upgrade over the 3060 Ti... this upsets me even more. As they themselves showed in their own benchmarks, the 4060 Ti is 10-20% faster at 1080p ultra. It also consumes 20% less power (which many here don't seem to care about, but I do) at the SAME PRICE. In which world is 10-20% more performance at 20% less power consumption for the same price not a generational improvement? I don't know! But in my world it is. Even if the performance were exactly the same, I would still consider delivering it at 20% less power for the same price a reasonable generational upgrade.
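
    In performance-per-watt terms those two figures compound; a small sketch using only the 10-20% and 20% numbers above (the thread's figures, not independent benchmarks):

    Code:
    def perf_per_watt_gain(speedup: float, power_ratio: float) -> float:
        # speedup: relative fps (1.10 = +10%); power_ratio: new power / old power.
        return speedup / power_ratio - 1

    for speedup in (1.10, 1.20):
        gain = perf_per_watt_gain(speedup, 0.80)
        print(f"+{(speedup - 1) * 100:.0f}% fps at 80% power -> +{gain * 100:.0f}% perf/watt")

    That is a 38-50% jump in performance per watt generation over generation, whatever one thinks of the price.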

    The 4090 is ~50% faster than the 4060 Ti at 1080p ultra. But it is 400% of the price. So is that card a "waste of sand", too? GN would say: "NO! You cannot compare a 4090 at 1080p! That's not the resolution this GPU is intended for!"... yeah, exactly...
    Last edited by Nathasil; 2023-05-26 at 01:20 AM.

  7. #27
    I'm relatively convinced games are just kneecapped on release so they won't work on older cards, to force people to buy these "upgrades", because there's really no other reason to do it.

  8. #28
    Just to emphasize the power usage a little more: where I live (Germany), electricity goes for 31 euro cents (~33 dollar cents) per kWh on average. I know electricity is subsidised heavily in the US, but your bills will go up in the coming years, or they will have to charge you those subsidies in other areas. Anyway, I will use the 33 dollar cents from where I live.

    A 4060 Ti is 400 dollars. It saves me 0.04 kWh per hour of gaming compared to a 3060 Ti. That's 1.32 dollar cents per hour. This means on pure energy costs alone the card has "paid for itself" after 30,303 hours; that's ~3.5 years of 24/7 activity under full load. Does my GPU run under full load 24/7? No, it does not. A card used for mining would, but those days are long past. I personally game a LOT of hours per day... so for me it would be roughly three times that timespan, so ~10 years. The last GPU I used for an extended time was a GTX 970, which was released almost 10 years ago. The GTX 970 also had roughly the same power consumption as the 3060 Ti, by the way.
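
    The arithmetic, for anyone who wants to plug in their own rate (a minimal sketch reproducing the figures above; note it counts the full 400-dollar price, not just the premium over an alternative card):

    Code:
    price_usd = 400.0           # 4060 Ti launch price
    saved_kwh_per_hour = 0.04   # vs a 3060 Ti, per hour of gaming
    usd_per_kwh = 0.33          # ~German electricity price

    saving_per_hour = saved_kwh_per_hour * usd_per_kwh   # ~1.3 cents/hour
    breakeven_hours = price_usd / saving_per_hour        # ~30,300 hours
    print(f"{saving_per_hour * 100:.2f} cents/hour saved")
    print(f"break-even after {breakeven_hours:,.0f} hours "
          f"(~{breakeven_hours / (24 * 365):.1f} years at 24/7 full load)")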

    So what I want to say is: if you are the type of gamer who upgrades the GPU only every few years, the 4060 Ti's low power consumption means it does offset a good chunk of its inflated price through energy savings. If you run that card for 10 years, like I almost did with my 970, it basically pays for itself and you get all the better features and performance for free.

    I simply disagree with completely glossing over the power argument.
    Last edited by Nathasil; 2023-05-26 at 01:46 AM.

  9. #29
    Quote Originally Posted by Nathasil View Post
    So what I want to say is: if you are the type of gamer who upgrades the GPU only every few years, the 4060 Ti's low power consumption means it does offset a good chunk of its inflated price through energy savings. If you run that card for 10 years, like I almost did with my 970, it basically pays for itself and you get all the better features and performance for free.

    I simply disagree with completely glossing over the power argument.
    Can I ask what your setup is like? Most desks are a standard 80 cm deep (anything else is almost custom), and most are 120-160 cm wide. When I recently got my 27-inch I was also skeptical, since I don't want to sit too close either, but it turned out to be more about the right posture, sitting up straight, etc.

    Anyway, I'll just say this... the 4060 Ti is bad value because it will not age well, nor does it deliver performance today in demanding games. I find buying such cards, unless you don't care about high settings/PC gaming, to be a waste of money. Even my 4070 Ti + 7800X3D system can barely max Red Dead Redemption 2 at 60 FPS, and can't max Cyberpunk with ray tracing at native rendering. Well, I think I got around 45 FPS with ray tracing, or was it closer to 75?

    What I want from a system is also different from what others want, I guess. In general, I want a system to last 8-10 years at least. That's what all my PCs have done in my life. So buy costly stuff and keep it for longer. Yes, my system is kind of overkill for WoW or older games, but in the future it won't be for new titles.

    Two of my friends are slowly upgrading their systems... I find that pointless. Just keep it as long as you can, then buy all new parts. For example, my friend bought a 6750 XT and newer DDR4 RAM but still keeps his 6700K... as he said, he gained no real FPS upgrade in WoW, since it's CPU-bound.
    Last edited by Djuntas; 2023-05-26 at 10:07 AM.
    Youtube channel: https://www.youtube.com/c/djuntas ARPG - RTS - MMO

  10. #30
    reemi
    If you want performance and plan to keep your PC a few years, you don't go with a 4060.

  11. #31
    Gaidax
    If you want a GPU that will last and you intend to stay in 1440p territory for the next few years, you go for either a 4080 or a 4070 Ti.

    DLSS will carry these for years to come, and the 4080 on top of that comes with a frame buffer that will be all 1440p needs for the next five years. The 4070 Ti is practically a 3080 Ti with DLSS 3, which is a nice thing going forward (and don't reee at me about latency - it's just an option to keep in mind).

    The 4060 Ti and the like - that's an "I need to shove something in now to keep up, and I'll upgrade in 1-2 years at most" kind of card.
    Last edited by Gaidax; 2023-05-26 at 12:30 PM.

  12. #32
    I played at 1080p 10 years ago. I am playing at 1080p now. The chance that I will play at anything other than 1080p in the next four years is zero percent. The chance that I will play at higher resolutions in more than four years is minimal.

    Which GPU available on the market RIGHT NOW offers the best performance per dollar for me? It's the 4060 Ti.

    Are there a lot of gamers who play exclusively at 1080p? Yes, there are. What GPU would you recommend to them? A 6700 XT cannot run current-gen games at 1080p maxed out. It just can't. The 6800 XT can, but it costs a lot more, and it will then cost you even more to run the card because of twice the power consumption, while offering absolutely ZERO benefit to a 1080p gamer. It's also no more "future proof" than a 4060 Ti. It's just a worse pick for a 1080p gamer any way you look at it.

    So, again, if somebody wants to play at 1080p maxed out, what is the best GPU available on the market for them?

    Edit: The 4070 is a better GPU and also very efficient; you can throttle it down and basically run it like a 4060 at 1080p... but it is also 200 dollars more, and the 4070 Ti is 400 dollars more. Why would you spend that much more money just to then throttle down your GPU? How does that help the "the product is overpriced" argument?
    Last edited by Nathasil; 2023-05-26 at 04:36 PM.

  13. #33
    Gaidax
    Quote Originally Posted by Nathasil View Post
    Which GPU available on the market RIGHT NOW offers the best performance per dollar for me? It's the 4060 Ti.

    Are there a lot of gamers who play exclusively at 1080p? Yes, there are. What GPU would you recommend to them?
    What you say is fine, but should such a GPU cost $400?

    If this GPU had been named the 4050 Ti and cost $300 as it should have, nobody would even peep. Heck, people would probably praise Nvidia.

  14. #34
    Nvidia seems to want 1080p gamers to buy a new GPU every time a new series comes out.

    The 4060 series might run some titles on ultra today, but as new games come out you will be reduced to awful performance in as little as 12 months.

    Crippling the memory was the worst choice possible, and I sure as heck won't be buying a GPU that barely outperforms its predecessor while offering no future-proofing of any kind.

  15. #35
    Quote Originally Posted by Nathasil View Post
    You are right. I only checked Cyberpunk, but out of the four games GN tested at 1080p + RT, Cyberpunk was the only one where the 6800 XT was weaker; it performed better in the other three.
    Cyberpunk is the only game where Nvidia has always had, and will keep, a clear advantage over AMD, especially with RT on.

    In pretty much any other game, the raster performance of AMD cards is simply better. And the situation will probably improve VASTLY once FSR 3 is out.

  16. #36
    I just watched another review of the 4060 Ti (https://www.youtube.com/watch?v=WLk8xzePDg8) and was about to disagree with it in the comments when their conclusion at the end stopped me, because it was kind of inspiring and I agree with their message: PC should always be the technically superior gaming platform compared to consoles, and products like the 4060 Ti do not help make it so.

    I can see that I am biased. I am an exclusive 1080p gamer who does not plan to go any higher in the lifetime of this GPU generation, the next, and probably even the one after that, and I also care more about power consumption than the "average guy", because I melt in my room in summer and my electricity is among the most expensive in the world according to this chart: https://www.globalpetrolprices.com/electricity_prices/

    So for me the saved energy is a bigger factor than it probably is for most players, and even big enough to offset a good chunk of the inflated price.

    This GPU right now is so perfectly tailored to me that I probably glossed over its shortcomings, just as I blamed others for glossing over its strengths. Of course I would be in favour of it costing only 300 dollars.

    My only problem with reviews dismissing it as junk out of hand is that right now there really are no other GPUs that perform better or equally at 1080p ultra and offer better value per frame. Even the review I posted above (which totally shits on the GPU) has it as the most cost-efficient GPU for 1440p(!!!) right now at MSRP prices... so the only way to get better value per frame is to rely on discounted AMD cards from the last generation, most of them running out of stock soon. That is for 1440p, mind you - a resolution at which the 4060 Ti sucks. And they completely ignore the power aspect, which in my personal case adds ~50 dollars in value to the 4060 Ti PER YEAR compared to the most efficient GPUs of last gen. The last-gen AMD GPUs that are better value per frame than the 4060 Ti when discounted are NOT the most efficient ones, so the saving per year would be even greater for me.
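
    A quick sanity check on that ~50 dollars per year figure (the ~140 W draw difference and the ~33 cent/kWh rate come from earlier in the thread; the hours of gaming per day are an illustrative assumption):

    Code:
    watt_delta = 140        # assumed draw difference vs a less efficient card
    usd_per_kwh = 0.33      # the rate quoted earlier in the thread

    for hours_per_day in (2, 3, 4):
        kwh_per_year = watt_delta / 1000 * hours_per_day * 365
        print(f"{hours_per_day} h/day -> ~{kwh_per_year * usd_per_kwh:.0f} USD/year saved")

    At roughly three hours of gaming per day, a 140 W difference does come out to about 50 dollars per year at that rate.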

    But again, my preferences are not everybody's. I get that. If the 4060 Ti were 300 dollars, I would not even be writing about it here... I would have already bought one, because at that price point, saving ~50 dollars on energy in just one fucking year, there really would be nothing you could do wrong with it, even if you intend to use it for two years max.
    Last edited by Nathasil; 2023-05-28 at 03:03 AM.

  17. #37
    Quote Originally Posted by Nathasil View Post

    My only problem with reviews dismissing it as junk out of hand is that right now there really are no other GPUs that perform better or equally at 1080p ultra and offer better value per frame. Even the review I posted above (which totally shits on the GPU) has it as the most cost-efficient GPU for 1440p(!!!) right now at MSRP prices... so the only way to get better value per frame is to rely on discounted AMD cards from the last generation, most of them running out of stock soon.
    At MSRP... The 6700 can currently be purchased for 289 USD, with a faster memory bus and slightly more VRAM. The video you linked shows the limitations of the 4060 Ti. You can set things to ultra, but they look like crap in recent games, and DLSS 3 can't save it from this. That doesn't bode well for the future. The 4060 Ti is not priced well for what it does. Cards with 8 GB of VRAM and a 128-bit bus should be less than $300, if they are released at all.

    Regardless, it is your money. Spend it how you wish. Just don't expect the card to hold on to ultra settings for much longer, if it does at all.

  18. #38
    Okay, I've re-evaluated my opinion:
    Since I am currently using a PCIe 3.0 mainboard, and the 4060 Ti only uses 8 lanes and is specced for PCIe 4.0, I would have to upgrade my whole system just to accommodate the 4060 Ti. Which isn't worth it.
    At least not for a 500€ (16 GB version) low-end graphics card.
    Last edited by LordVargK; 2023-05-28 at 04:05 PM.

  19. #39
    Quote Originally Posted by LordVargK View Post
    Okay, I've re-evaluated my opinion:
    Since I am currently using a PCIe 3.0 mainboard, and the 4060 Ti only uses 8 lanes and is specced for PCIe 4.0, I would have to upgrade my whole system just to accommodate the 4060 Ti. Which isn't worth it.
    At least not for a 500€ (16 GB version) low-end graphics card.
    PCIe 3 is fine? Check https://www.techgoing.com/nvidia-rtx...herboard-test/
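
    For context, the bandwidth gap at x8 is easy to compute from the published per-lane rates (8 GT/s for PCIe 3.0, 16 GT/s for PCIe 4.0, both with 128b/130b encoding); whether it matters in practice then mostly depends on how often the card spills past its 8 GB of VRAM:

    Code:
    def link_gb_per_s(gt_per_s: float, lanes: int) -> float:
        # 128b/130b encoding: 128 payload bits per 130 transferred bits.
        return gt_per_s * lanes * (128 / 130) / 8

    print(f"PCIe 3.0 x8: {link_gb_per_s(8, 8):.1f} GB/s")    # ~7.9 GB/s
    print(f"PCIe 4.0 x8: {link_gb_per_s(16, 8):.1f} GB/s")   # ~15.8 GB/s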
    Teamwork is essential - it gives the enemy someone else to shoot at!

  20. #40
    bloodkin
    Quote Originally Posted by Nathasil View Post
    Which GPU available on the market RIGHT NOW offers the best performance per dollar for me? It's the 4060 Ti.
    The fact that you stick ONLY to the 4060 Ti just goes to show how stupid the market has gotten. Not only is Nvidia gouging the market themselves (while crying about scalpers not too long ago, top kek), the products are flawed at launch: you only get half the PCIe lanes its predecessor had! Sure, it might have far more memory, but it's still functionally at the same performance as a 3060 for a card that costs an arm and a leg for no good reason. Nvidia is selling you a shit product at an even worse price, they're approaching Apple levels of mistreatment, and people still suck it up. The same goes for AMD.

    I really hope that Intel, of all brands, can break open the market and get some good board partners; the A750 and A770 (16 GB version) are only getting better and more competitive. Second-hand cards are also a good alternative, if you're willing and able to check the card and know what you're looking for. I'm also really enjoying the escalating launches of each card; Nvidia should feel in their own wallet that they can't keep fucking with the market.

    When I had to upgrade from a 1660 Super, I chose to buy second-hand (got a 3070). Unless someone truly abused the F out of a card and/or physically tore it apart, most cards will run fine for more than 5 to 10 years as long as you keep up decent maintenance. The only other alternative for me was to go for a brand-new Intel A770 16 GB, and I regret not getting it.

    - - - Updated - - -

    Quote Originally Posted by Nathasil View Post
    I can see that I am biased. I am an exclusive 1080p gamer who does not plan to go any higher in the lifetime of this GPU generation, the next, and probably even the one after that, and I also care more about power consumption than the "average guy", because I melt in my room in summer and my electricity is among the most expensive in the world according to this chart: https://www.globalpetrolprices.com/electricity_prices/

    So for me the saved energy is a bigger factor than it probably is for most players, and even big enough to offset a good chunk of the inflated price.
    I'd suggest watching der8auer's second review of the card: https://www.youtube.com/watch?v=uU5jYCgnT7s. Even in Germany, where energy prices can be very high, it's still not worth upgrading to that card from a 3060 if it isn't a good fit for your PCIe slot (8 lanes, LMAO), and by the time 1080p requires more than 8 GB of VRAM we'll probably be at the next generation of cards. By then Intel will probably have a competitive card out (even though the A770 16 GB is already a good deal) at a far better price. Nvidia, and AMD to a somewhat lesser degree, are gouging gamers now while they still can, until they have to really compete against a third player in the market.

    Edit: I'm kind of in the same boat: I game at 1080p 144 Hz+, as I haven't got a 1440p screen and don't like too much heat production in my case (so low energy draw is preferred). I've put a lot of thought into it, but I outright refuse to give money directly to a company that treats its customers and board partners as poorly as Nvidia does (RIP EVGA). I'm pretty happy with my second-hand EVGA 3070. I don't know what I'll upgrade to in the future (depends on whether I get a better screen), but I know it won't be a brand-new Nvidia card.
    Last edited by bloodkin; 2023-05-28 at 05:15 PM.
    'Something's awry.' -Duhgan 'Bel' beltayn

    'A Man choses, a Slave obeys.' -Andrew Rayn
