  1. #1
    Misuteri (Bloodsail Admiral)
    Join Date: Nov 2007
    Location: The Nexus
    Posts: 1,182

    Thoughts on Intel Graphics Cards?

    Anyone really thought about putting one of these in their next build?

    I have yet to see any pre-built makers embrace them, and the most consistent criticism has been driver issues that range from minor to crippling depending on the title.

    The price point seems very competitive, but is this a case where it's best not to be an early adopter?
    The most persecuted minority is the individual.

  2. #2
    Quote Originally Posted by Misuteri View Post
    Anyone really thought about putting one of these in their next build?

    I have yet to see any pre-built makers embrace them, and the most consistent criticism has been driver issues that range from minor to crippling depending on the title.

    The price point seems very competitive, but is this a case where it's best not to be an early adopter?
    The drivers are in poor shape. The software package is atrocious, and features like XeSS are terrible. If you normally play older titles that still use DirectX 9, I wouldn't even bother. In modern titles they do okay.

    If you can handle being an early adopter, and actually want to file bug reports and troubleshoot, go for it. Otherwise I'd sit Arc Alchemist out.

  3. #3
    Bloodsail Admiral
    Join Date: Sep 2020
    Posts: 1,083
    The performance numbers are OK for lower-end casual gaming. But since Intel is new to the GPU market (after endless delays), drivers will probably be bumpy for a couple of years; early adopters are basically doing the beta testing. AAA releases also usually get a sponsorship from Nvidia or AMD and are then fine-tuned for the sponsor, with some small performance/detail advantages. It remains to be seen whether Intel will jump into that fray, but until they do, Intel GPU owners will be behind on the bigger releases there too. Benchmarks also have the Intel GPUs behind AMD's and Nvidia's last-gen cards. OK for lower-res gaming, but if that's all you need, an older Nvidia or AMD card will do the trick and be much more stable.

    So they aren't an option I'd recommend yet due to potential driver problems. A year or two down the road, if/when Intel shows it can keep its drivers up to date with game releases, I'd give them more serious thought. Then again, Intel could pull the plug on its GPU program by then if sales disappoint (see Optane). In the meantime, performance and bugs are likely to be hit and miss: OK in some games, not so good in others.

  4. #4
    I have mostly read bad reviews of Intel's latest graphics cards. Check YouTube reviews, too. Many reviewers had issues getting the cards to work and had to resort to workarounds. Plus, what Linkedblade said above: drivers are in poor shape.

  5. #5
    Quote Originally Posted by Medievaldragon View Post
    I have mostly read bad reviews of Intel's latest graphics cards. Check YouTube reviews, too. Many reviewers had issues getting the cards to work and had to resort to workarounds. Plus, what Linkedblade said above: drivers are in poor shape.
    Yeah, most people buy budget cards to play modern games at decent settings, which Intel's GPUs can handle, but anything a few years old that uses DX9 or older they can't. Spending $300-400 on an A750 or A770 only to play modern titles, and then be frustrated by silly issues like the screen displaying only at 400x300 or not at all, isn't worth it.

    IMO the only card of value in the stack is the A310, which will probably be $100-150 but worth it for AV1 encode/decode once the transition from H.265 to AV1 happens. But that's only for video snobs, not gaming. Hopefully Intel's QA team keeps working on the drivers; it looks like they are honestly trying, but time will tell.
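    For reference, hardware AV1 encoding on these cards goes through Intel's Quick Sync (QSV) path. Below is a minimal Python sketch of driving it via FFmpeg; it assumes an FFmpeg build that includes the av1_qsv encoder and working Intel media drivers, and the filenames and bitrate are placeholders, not a tested recipe.

    # Hedged sketch: hardware AV1 encode on an Arc card through FFmpeg's QSV path.
    # Assumes FFmpeg was built with the av1_qsv encoder; file names are illustrative.
    import subprocess

    cmd = [
        "ffmpeg",
        "-i", "input_h265.mkv",   # source clip (placeholder name)
        "-c:v", "av1_qsv",        # Intel hardware AV1 encoder
        "-b:v", "4M",             # target bitrate, adjust to taste
        "-c:a", "copy",           # pass audio through untouched
        "output_av1.mkv",
    ]
    subprocess.run(cmd, check=True)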

  6. #6
    As long as you play mostly newer titles, they're fine. They are cheaper than performance-equivalent cards from AMD and nVidia.

    If you play a lot of older titles... given the issues (which Intel has basically said it has zero intention of fixing, because it simply does not care about them), I wouldn't. While those games run, and run at acceptable framerates (still well above 60 in most cases), it's still terrible compared to even a 3050 or RX 6500.

    If you just need an "I need more outputs" GPU, the A310 is FAR better than anything nVidia and AMD are offering.

  7. #7
    If I ever upgrade my gaming PC, I still see no reason to pick anything other than nVidia.
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english

  8. #8
    Quote Originally Posted by ldev View Post
    If I ever upgrade my gaming PC, I still see no reason to pick anything other than nVidia.
    Well, one reason is that their prices are insane, and AMD has similarly performing cards at better prices. The 6950 XT is at $800 USD and the 3090 is at least $1,000, and that price difference was much more extreme at MSRP. Otherwise, if you have a G-Sync display or want to use ray tracing, stick with Nvidia. If you're looking at rasterization performance, AMD is better on price/performance.
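    If you want to sanity-check that kind of claim, the price/performance math is just division. A rough Python sketch is below; only the $800 and $1,000 prices come from the post, while the relative rasterization index values are illustrative placeholders, not measured benchmark results.

    # Toy perf-per-dollar comparison; raster_index values are assumed, not benchmarks.
    cards = {
        "RX 6950 XT": {"price_usd": 800,  "raster_index": 100},  # assumed baseline
        "RTX 3090":   {"price_usd": 1000, "raster_index": 102},  # assumed: roughly on par
    }

    for name, c in cards.items():
        print(f"{name}: {c['raster_index'] / c['price_usd']:.3f} index points per USD")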

  9. #9
    Quote Originally Posted by Linkedblade View Post
    Well, one reason is that their prices are insane, and AMD has similarly performing cards at better prices. The 6950 XT is at $800 USD and the 3090 is at least $1,000, and that price difference was much more extreme at MSRP. Otherwise, if you have a G-Sync display or want to use ray tracing, stick with Nvidia. If you're looking at rasterization performance, AMD is better on price/performance.
    Yea that's not a reason for me. I want a good GPU, not a bargain.
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english

  10. #10
    Quote Originally Posted by ldev View Post
    Yea that's not a reason for me. I want a good GPU, not a bargain.
    Yep, and that's the problem. In everyday life, Nvidia is a monopoly now. They are becoming like Apple... no, they are Apple now. No competition.

    The only thing AMD is good for is putting pressure on Nvidia's prices, but hey... Nvidia only caters to the high end now anyway. OR WAIT, NO! Because the bastards just refreshed/planned a new 3060 SKU with only a 128-bit memory interface and just 8GB of VRAM (worse than the original 3060)... Woooh, a 2-year-old midrange GPU.

    Sigh. DLSS 3.0 is also so far ahead of FSR. Inb4 the 4060 costing 6000 Danish kroner when it launches. And just to note, a 3060 is 3000 Danish kroner and the 3060 Ti is 4000... rough prices, etc.

    Denmark is now at 11% inflation year over year. I ran the numbers through an inflation calculator, and the build in my signature was about 11.000 kroner in 2015 money; now it's more like 13.500 kroner. So if I buy a new desktop I guess I'm gonna spend 13-15.000 Danish fucking kroner... And even if I undervolt the system, it will probably use more power too. Oh well, a single 4090 starts at 15-16.000 Danish kroner, so hey, w/e...
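    For what it's worth, that adjustment is just compound inflation. A rough Python sketch is below; the 11.000 kroner starting price and the 11% figure for 2022 come from the post, while the ~2% assumed for 2016-2021 is a placeholder average, not official CPI data.

    # Rough compound-inflation check; pre-2022 rates are assumed, not official figures.
    def adjust_for_inflation(amount_dkk, yearly_rates):
        for rate in yearly_rates:
            amount_dkk *= (1 + rate)
        return amount_dkk

    rates = [0.02] * 6 + [0.11]          # assumed ~2% for 2016-2021, then the 11% spike
    print(round(adjust_for_inflation(11_000, rates)))  # ~13,750 DKK, same ballpark as above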
    Last edited by Djuntas; 2022-10-27 at 09:00 PM.
    Youtube channel: https://www.youtube.com/c/djuntas ARPG - RTS - MMO

  11. #11
    Quote Originally Posted by Linkedblade View Post
    Well, one reason is that their prices are insane, and AMD has similarly performing cards at better prices. The 6950 XT is at $800 USD and the 3090 is at least $1,000, and that price difference was much more extreme at MSRP. Otherwise, if you have a G-Sync display or want to use ray tracing, stick with Nvidia. If you're looking at rasterization performance, AMD is better on price/performance.
    Only right now, this second. When the respective cards launched, they weren't much cheaper and didn't perform as well. And when you're spending enthusiast money… why the hell would you only care about rasterization? (And I literally just saw 3090s going for 850, so I'm not sure where you pulled that "at least" 1000 number from.)

    As for price… yes and no? People get too hung up on the names of the cards and don't pay attention to their place in the stack.

    They frequently compare card prices based on name when that is irrelevant (because naming schemes can change overnight).

    The top SKU of the 3000 series (mid-gen additions always mess with this, mind you) was the 3090… at 1499. The 4090 (also the top SKU) is… 1599. The additional cost can be almost entirely laid at the feet of increased production costs.

    The penultimate card in the *launch* stack is generally in the 700 range, and they fucked that up this time (with the 4080 being 1199), but that seems to be because their launch strategy changed somewhat: instead of saving the mid-generation insert card, they just launched it (the 4080 being roughly the same price as the 3080 Ti). This would put the 4080 12GB card (now un-launched and almost assuredly being renamed the 4070, as well as price-cut to $700 according to insiders at board partners) in the same position as the 3080: third card down the stack, around $700. Yeah, they tried to pull a fast one and increase the #3 SKU price by $200, but the market immediately corrected them and put the kibosh on that. But looking at the “mature” product stack (mid-gen upgrades, inserts, and replacements), the prices have been relatively stable for generations (and have even come down for the halo SKU: Titans used to run over 2K).

    And while AMD’s “equivalents” of the enthusiast SKUs (the top three SKUs, generally) were cheaper than their nVidia competitors… they kind of have to be, given that the nVidia cards DO perform better (even if only barely in the case of the 6950 XT vs the 3080 Ti), AND have features that the AMD cards simply cannot replicate (NVENC, DLSS (more performant than FSR), and RT you can actually use). And in those spaces, those features matter. Enthusiasts aren't buying 3090s to NOT use RT or DLSS.

    - - - Updated - - -

    Quote Originally Posted by Djuntas View Post
    Because the bastards just refreshed/planned a new 3060 SKU with only a 128-bit memory interface and just 8GB of VRAM (worse than the original 3060)... Woooh, a 2-year-old midrange GPU.
    This is one of those things that often baffles me. Just because the memory interface isn't as wide doesn't mean it will limit performance per se. And "just" 8GB of VRAM?

    As compared to... what? There is no game that needs anywhere near 8GB of frame buffer at resolutions and settings the 3060 can handle. The original 3060 only had 12GB because it was cheaper for nVidia to use that interface. Every card above it had less until you got to the 3090: the 3060 Ti, 3070, and 3080 had 8, 8, and 10.

    The "new" 3060 also has GDDR6X, which is faster than the original's memory, so the card will likely perform the same or better. And it's supposed to be cheaper.

    And it being two years old means... what? It still knocks 1080p Ultra at high refresh out of the park. And it's not like AMD has anything competitive for nVidia to worry about. The 6600 is categorically a worse card (by a fair bit), and the 6600 XT, while closer, still loses out (and really tanks when you use FSR on it, whereas the 3060 has plenty of tensor cores for DLSS) and is only about $30 cheaper.

    Yes, I'm sure AMD eventually has a putative 7600 XT coming down the pipe... but nVidia almost assuredly also has a 4060 in the wings.

  12. #12
    Quote Originally Posted by ldev View Post
    Yea that's not a reason for me. I want a good GPU, not a bargain.
    So, obvious Nvidia fanboying. Got it.

  13. #13
    Quote Originally Posted by Linkedblade View Post
    So, obvious Nvidia fanboying. Got it.
    Uhh.. no.

    He simply wants the card with the most comprehensive features. AMD GPUs are competitive solely on pure rasterization, and no one spending past the midrange is going to do *only* rasterization. You don't spend $1,000 on a GPU to NOT get bells and whistles.

    FSR is not as good as DLSS.
    AMD has no equivalent coming down the pipe for DLSS 3.0 (which people already enabled on 30 series cards)
    AMD has no equivalent to DLDSS
    AMD’s RT implementation sucks.

    Yeah, they're cheaper, and at performance tiers where only rasterization really matters, a better “bargain”. At the enthusiast level? They don't compete at all. All they are is cheaper. And that's not enough at that price point. A card can be cheaper all it wants; if it packs it in and gets crippled when I try to enable RT, who the fuck cares?

  14. #14
    Quote Originally Posted by Kagthul View Post
    Uhh.. no.

    He simply wants the card with the most comprehensive features. AMD GPUs are competitive solely on pure rasterization, and no one spending past the midrange is going to do *only* rasterization. You don't spend $1,000 on a GPU to NOT get bells and whistles.

    FSR is not as good as DLSS.
    AMD has no equivalent coming down the pipe for DLSS 3.0 (which people already enabled on 30 series cards)
    AMD has no equivalent to DLDSS
    AMD’s RT implementation sucks.

    Yeah, they're cheaper, and at performance tiers where only rasterization really matters, a better “bargain”. At the enthusiast level? They don't compete at all. All they are is cheaper. And that's not enough at that price point. A card can be cheaper all it wants; if it packs it in and gets crippled when I try to enable RT, who the fuck cares?
    It was a future-encompassing statement, covering features and cards yet to be released. We simply don't know what will happen. If we compare AMD now to AMD three years ago, it's completely different. We simply cannot make a blanket statement that one company will always be better than another.

  15. #15
    Quote Originally Posted by mrgreenthump View Post
    It was a future-encompassing statement, covering features and cards yet to be released. We simply don't know what will happen. If we compare AMD now to AMD three years ago, it's completely different. We simply cannot make a blanket statement that one company will always be better than another.
    Saving AMD by saying "well, you don't know what will happen in 20 years!" Sad.

    "Near future." Better? Not everyone's native language is English, and when you have to go "what do words actually literally mean" to make AMD not suck complete fucking balls, AMD already lost. Deal with it.

    Why can't people accept that some products are simply better? nVidia is better than AMD, Intel/AMD is better than the other (no idea who's winning atm), Lexus is better than Toyota, etc. etc.
    Last edited by ldev; 2022-10-28 at 11:59 AM.
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english

  16. #16
    Quote Originally Posted by Kagthul View Post
    Uhh.. no.

    He simply wants the card with the most comprehensive features. AMD GPUs are competitive solely on pure rasterization, and no one spending past the midrange is going to do *only* rasterization. You don't spend $1,000 on a GPU to NOT get bells and whistles.

    FSR is not as good as DLSS.
    AMD has no equivalent coming down the pipe for DLSS 3.0 (which people already enabled on 30 series cards)
    AMD has no equivalent to DLDSS
    AMD’s RT implementation sucks.

    Yeah, they're cheaper, and at performance tiers where only rasterization really matters, a better “bargain”. At the enthusiast level? They don't compete at all. All they are is cheaper. And that's not enough at that price point. A card can be cheaper all it wants; if it packs it in and gets crippled when I try to enable RT, who the fuck cares?
    If you're spending money on a 3090 or 4090, then you don't even need DLSS. DLSS 3.0 makes competitive gaming worse, because it does not improve latency. The tier of cards where that kind of machine learning would benefit doesn't exist yet; the 3000 series gets no benefit. If you're an enthusiast and competitive, it's useless.

    - - - Updated - - -

    Quote Originally Posted by ldev View Post
    AMD already lost. Deal with it.
    Lost what exactly?

    Quote Originally Posted by ldev View Post
    nVidia is better than AMD, Intel/AMD is better than the other (no idea who's winning atm)
    You're admitting you don't know enough to have an input.

  17. #17
    Quote Originally Posted by Linkedblade View Post

    You're admitting you don't know enough to have an input.
    Yea on CPUs, duh doi, yes, that is exactly what I did, thanks captain?

    Anyways, people don't need DLSS? Because everyone has a 1440p display, and 5K or 6K doesn't exist, I guess.
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english

  18. #18
    Quote Originally Posted by ldev View Post
    Yea on CPUs, duh doi, yes, that is exactly what I did, thanks captain?

    Anyways, people don't need DLSS? Because everyone has a 1440p display, and 5K or 6K doesn't exist, I guess.
    You never mentioned talking about CPUs. The thread is about Intel graphics cards...

    Yes, people don't need DLSS. It's just a way to pad fps counters, and the quality is subjective at best. It works for fast-paced games, because the blurring it causes isn't as noticeable. In games with static elements or text it's just bad. Like I said, with cards like the xx80 or xx90 from Nvidia, why would you use a gimmick for fps when the card can handle the fps and render real frames? This is especially true for cards like the 4080/4090.

  19. #19
    Quote Originally Posted by Linkedblade View Post
    You never mentioned talking about CPUs. The thread is about Intel graphics cards...

    Yes, people don't need DLSS.
    Yes, they do. I have a 3080. If I want to play with ray tracing on, I need to use DLSS to maintain enthusiast framerates. It might help if you had any idea what you're fucking talking about.

    It's just a way to pad fps counters, and the quality is subjective at best.
    The quality is so close that GN called it "indistinguishable" outside of careful side-by-side comparison, i.e. you would never notice without two rigs side by side and careful examination. Their recommendation was unequivocally "just turn it on".

    It works for fast-paced games, because the blurring it causes isn't as noticeable.
    It just isn't that noticeable. It's almost like dozens of objective experts have tested it and weighed in on it. If you think you can see a massive difference, then it's a reverse placebo effect: since you "know" it can't be as good, you're inventing shit in your head.

    In games with static elements or text it's just bad. Like I said, with cards like the xx80 or xx90 from Nvidia, why would you use a gimmick for fps when the card can handle the fps and render real frames?
    Because you can't always get those frames. And we weren't talking only about enthusiast levels. On a card like a 3060 or 3060 Ti, DLSS is a big fucking deal. It can take you from "barely getting 60fps" to "I get 90-100fps"... easily.

    This is especially true for cards like the 4080/4090.
    It's rather obvious you don't own one, or you wouldn't think that. Even on the 4090, RT is a killer. Without DLSS, at 4K in Cyberpunk, it was barely keeping 60fps.

    - - - Updated - - -

    Quote Originally Posted by Linkedblade View Post
    DLSS 3.0 makes competitive gaming worse, because it does not improve latency.
    Okay? Enthusiasts seeking massive framerates at 4K have zero crossover with competitive gamers, who almost entirely run 1080p at 240 or 360Hz. So... who gives a fuck?

    The tier of cards where that kind of machine learning would benefit doesn't exist yet; the 3000 series gets no benefit.
    Turns out DLSS 3.0 can be enabled on 30 series cards with a text file edit; a guy already did it. So it's an entirely artificial "restriction" that can be easily worked around (and will be by any enthusiast who cares). And once word gets out more widely that it's possible, the backlash will likely cause nVidia to officially support it.

    If you're an enthusiast and competitive, it's useless.
    In that Venn diagram, there are zero people. You're talking about a fraction of a fraction of a fraction of a fraction of even enthusiasts, who are already a fraction of gamers. Enthusiasts want to play on 4K displays (or high-refresh 1440p or ultrawide 1440p); they are not the sweaty basement-dwelling spitbeards who obsess over higher uncapped framerates in CS:GO for some imaginary advantage (not that there isn't one possible, because testing shows there can be, merely that 99% of those guys don't have the skill for it to matter and are deluding themselves if they think they do). Those guys play at 1080p and turn settings down to get framerate. No enthusiast wants screen tearing fucking up his image quality. Competitive gamers != enthusiast builders.

    Lost what exactly?
    Well, the CPU race at the very least, for both gaming and productivity. At least until mid next year when the X3D variants of the 7000 series hit. As it stands, though, at every price point Intel has a better, cheaper product (or marginally more expensive but CRUSHINGLY better, in the case of the 13600K vs the 7600X, and that doesn't take into account that the TPC of the Intel rig would be ~$200 lower).

    But this thread wasn't about CPUs.

    As for GPUs... they lost the RX 6000 series vs RTX 3000 series matchup across the board. Only late in the cycle, when massive price drops hit, did the lower-end RX 6000s pull ahead as a value proposition (the RX 6600 is better than the RTX 3050 and cheaper, and the RX 6600 XT is close enough to the 3060 and cheaper, but if you're going to use DLSS or anything like it, it's still better to get the 3060, and people playing at that budget will get a lot of mileage out of DLSS).

    I'm willing to wait and be wowed by RX 7000. But I'm not betting on that. Not remotely. My bet is on "as good at rasterization as RTX 4000 for cheaper" and "still can't compete on the compelling features, but better/closer than RX 6000".

    Ooh, also, DLDSR (though I think I late-night-brained and called it DLDSS) is a thing (and I mentioned it already). It's a feature AMD has NO equivalent for, and it can make even 4K images look better: something to use all those extra horses that you claim make DLSS irrelevant. Yeah, AMD can do plain DSR-style downsampling, but that MURDERS performance. DLDSR is MUCH more performant for the same outcome. So that IS a feature that enthusiasts can and will use if they "don't need DLSS".

    You're admitting you don't know enough to have an input.
    I'll just leave this here. https://www.merriam-webster.com/dictionary/irony

    And this. https://www.merriam-webster.com/dictionary/projection (6b)
    Last edited by Kagthul; 2022-10-29 at 06:34 AM.

  20. #20
    Quote Originally Posted by Kagthul View Post
    <snip>
    You're all over the place. Are we talking about GPUs on a gaming forum, or an enthusiast one? Most people don't play on SKUs like the RTX/GTX xx80 or xx90, so they don't have access to ray tracing, DLSS, or the ability to reach 4K at decent fps. The only reason you need DLSS is to fake 4K or to counter the performance hit of ray tracing, which, again, if you're playing competitive games or gaming in general, you don't need or want. If you look at Steam's hardware survey, too few people are playing on 30 series cards for that to matter, and most of those are xx60-class cards.

    And don't conflate the market with yourself, especially while humblebragging about buying a 3080.

    I don't know why you're talking about CPUs at all. This thread was about Intel discrete graphics. And no one really lost or won anything; you claiming they did is nonsense.

    What's worse, all the talk about how great Nvidia's most expensive cards are is just as nonsensical. We were talking about cards like the A770.

    And the advice "just buy Nvidia" is as stupid as ever.
