https://www.geforce.com/hardware/des...specifications
They're up-front about the fact that there's a DDR4 variant and that it has much lower bandwidth. One might be upset that they don't create a unique model name for every variation, but does an uninformed buyer really get more info out of "1030 vs 1030 SE" than out of "GT 1030 2GB GDDR5 vs GT 1030 2GB DDR4"? Either way, the smart thing is to do a bit of research.
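The bandwidth gap is easy to put into numbers: peak memory bandwidth is just the effective transfer rate times the bus width. A rough sketch in Python, assuming the commonly listed specs for the two variants (6 Gbps effective GDDR5 vs 2100 MT/s DDR4, both on the GT 1030's 64-bit bus):

```python
# Peak memory bandwidth (GB/s) = effective rate (MT/s) * bus width (bytes) / 1000
def peak_bandwidth_gbs(rate_mts: float, bus_bits: int) -> float:
    return rate_mts * (bus_bits / 8) / 1000

# Commonly listed specs for the two GT 1030 variants (assumed here):
gddr5 = peak_bandwidth_gbs(6000, 64)  # GDDR5 version: 6 Gbps effective, 64-bit bus
ddr4 = peak_bandwidth_gbs(2100, 64)   # DDR4 version: 2100 MT/s, 64-bit bus

print(f"GDDR5: {gddr5} GB/s, DDR4: {ddr4} GB/s")  # GDDR5: 48.0 GB/s, DDR4: 16.8 GB/s
```

Whatever the exact clocks on a given board, the DDR4 card ships with roughly a third of the GDDR5 card's memory bandwidth, which is why their benchmark results diverge so hard.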
The majority of people, according to the Steam survey. People here need to realize that you don't need the best, or even mid-range, performance to be able to game. But this issue affects mid- to high-end products as well. The GTX 1060 6GB and the 1060 3GB are not the same card; they're more like the 1070 vs the 1080, just without a name change. We also had the GTX 970 4GB fiasco, where we as consumers defended Nvidia - not that AMD is a saint by any means. Again, Nvidia calls it 4GB when it's really 3.5GB of usable memory.
But again, Nvidia gets away with it because we have review sites that keep their old benchmarks of these products up. If Nvidia wants to call it a 1030, that's fine so long as the old 1030 benchmarks are removed in favor of these DDR4 versions.
I believe the homeless bum in an early issue of Spawn said it best- "No matter how much you polish a turd, it's still a turd".
MMO-Champion Rules and Guidelines
you'd have to be willfully oblivious to be "fooled" by this
it's going to say DDR4 on the box
no one is obliged to change their GPU model name because of a memory change
if you don't do research --> proceed at your own risk .. be responsible for your own self
now if they made an actual directed effort to make this DDR4 card appear absolutely indistinguishable in every way (on the box, on their site etc.) from a regular GT 1030 then yeah it'd be a "scam"
"the old 1030 benchmarks are removed"
now that is definitely a stupid anti-Nvidia circlejerk reaction
and yeah, GT 1030 is definitely not a real gaming card even for budget builds, assuming we are talking about modern 3D games (even on low settings) and not ~Hearthstone
There are more people like this than you'd expect. I work with those people - I'm the guy they call when they buy random stuff and I need to fix their fuckup (and they still don't realize that buying cheap and uninformed ends up costing more than buying the right stuff the first time).
Sometimes they get it, sometimes they don't. Especially in the case of parents buying a "home PC that everyone will use", who don't want to spend money on it and need it to last forever, with a teenage kid who believes he's smarter than them and tries to make them buy gaming/top-tier components; it ALWAYS ends with the parents going to a random PC store, or even worse Amazon, and buying random crap because they remember half the names.
I'm not saying it's Nvidia's fault. Hell, it's better for me too, since I make money off of them. Yet you cannot say it's not a blatant attempt to squeeze money out of people who don't know any better.
It's just the experience I get every day.
EDIT: yes, I also don't agree with the removal of old benchmarks - it's not like traditional GDDR5 cards will stop existing. It simply doesn't make sense.
Don't trust me if your heart fails you.
What AMD initially did was just as bad (though I would argue going from 16 CUs to 14 CUs isn't as bad a downgrade as going from GDDR5 to DDR4). AMD did go to the vendors and tell them to fix their labeling (with mixed success).
The shitty thing is, this didn't need to happen. They could have made it a GT 1020. There's no NEED to reuse the 1030 label. This is just greedy Nvidia hoping to trick buyers into buying a lemon.
The 1030 DDR4 benchmarks also include the GDDR5 version - I don't see any reviewers benchmarking these cards without it. Consumers will see both and make up their own minds, as opposed to consumers who Google "GT 1030", see how well the GDDR5 1030 does, and then pick up the DDR4 version without thinking twice, thus getting screwed. Especially because the older 1030 GDDR5 benchmarks don't mention a DDR4 version at all, while the newer 1030 DDR4 benchmarks will mention the GDDR5 version.
Nvidia deserves to be punished, and this is how you do it. After all, this is the reason they didn't call it a GT 1020: they called it a 1030 to capitalize on the old 1030 GDDR5 benchmarks.
And how many consumers know what DDR4 or GDDR5 translates to in performance? Most people aren't technically savvy like us. Most people just want to play their favorite game and will pick up the DDR4 version because it's the cheapest.
You think MSI or EVGA are going to label the boxes DDR4 clearly? It's going to be on the side in fine print. Go to Newegg and find me a box that clearly shows DDR4. For that matter, even Newegg's specifications don't show the memory bandwidth of the DDR4 version, which would be a bigger indicator of performance than just calling it "DDR4".
"no one is obliged to change their GPU model name because of a memory change"
"if you don't do research --> proceed at your own risk .. be responsible for your own self"
"now if they made an actual directed effort to make this DDR4 card appear absolutely indistinguishable in every way (on the box, on their site etc.) from a regular GT 1030 then yeah it'd be a 'scam'"
https://www.newegg.com/Product/Produ...E&gclsrc=aw.ds
"now that is definitely a stupid anti-Nvidia circlejerk reaction"
Or pro-consumer, depending on how you look at it.
"and yeah, GT 1030 is definitely not a real gaming card even for budget builds, assuming we are talking about modern 3D games (even on low settings) and not ~Hearthstone"
You're justifying an anti-consumer practice on the grounds that you have declared these cards not for gaming. The person who buys one to play Overwatch isn't going to care what you think.
There are people who would like to game but can't afford a 1050, especially in poorer countries. The integrated Intel option just doesn't cut it for gaming. The AMD APUs are an option, but high-speed memory pricing used to make the 1030 a moderately attractive one. After this change, the AMD APU options are a no-brainer.
Ultimately, my point is just that there are people who can't afford anything more than a 1030. Should they not be allowed to game?
My problem: who the fuck bothers producing a GPU of any tier with DDR4? Who looks for DDR4, when nothing like it has ever been used for discrete GPUs? What use is this card? People say GT cards are not for low-end gaming, so please come up with another use for a $100 GPU.
I can guarantee you that my sister knows nothing about DDR4 yet she is the one who is buying a gaming PC for her kids. The vast majority of people buying computers know absolutely nothing about what is inside them. They believe whatever the sales person tells them about the PC.
Yeah, I lol'd at all the people who said that, or who look down on the uninformed: since they're "clearly stupid", they deserve to be taken advantage of, so anti-consumer practices are OK in this case - because it makes them feel good or something. I bet they think kids should have lootboxes too.
Who isn't spending at least $300 on a GPU? If you're on a WoW forum, playing WoW, do you really have an excuse NOT to own a proper GPU?