MMO-Champion Rules and Guidelines
But the server segment has accelerators that Intel could profit from, and the R&D that goes into those has a side effect of creating tech that is also good for the fast math that graphics engines need.
And Intel is also, iirc, looking forward to getting rid of sockets altogether.
- - - Updated - - -
Ray tracing can be done on any card; it just takes a lot of compute away from everything else. You should be able to do it on Vega or Pascal, but with severe performance penalties. The RTX 2000 series just has silicon dedicated to performing those calculations.
Will they actually have hardware? I wasn't really following the announcements, so all I heard was that there will be ray tracing, but I'm not sure anybody said it's going to be hardware-accelerated.
- - - Updated - - -
Maybe it came out wrong. I just meant that, from an architecture point of view, it's not that impressive if they achieve the same performance at the same power while being a node ahead. Probably better than before, but they still have a long way to go.
Yeah, in the end, for a 7nm part it's not impressive from that angle. It's a big leap for AMD, but I'm sure Nvidia's equivalent 7nm 2070/2080 parts will be far faster. However, how they price those parts will be a huge issue.
In the end, the 5700 XT is only $50 higher than I'd hoped, and that's completely acceptable given its performance so far and the fact that it looks to have really solid overclocking/AiB value.
A 10%+ overclock puts it just behind the 1080 Ti, and even the 2080 in some cases. I find that pretty compelling myself.
I know, but I'm more interested in a fairly priced card punching above its weight.
The 5700 XT performs like an overclocked 2070 as is, and with its overclocking headroom it sits right next to the big hitters.
Yes, you can overclock a 2080. But really, so what?
The 2080 and 1080ti sit in a zone where overclocking them tends not to offer any tangible benefits.
Playing Tomb Raider, for example, at 90 fps vs 95 isn't going to change anything.
$449 is about the same in € because of VAT.
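The rule of thumb behind that claim, as a quick sketch. The exchange rate and the 24% Finnish VAT rate are my assumptions for mid-2019, not figures from this thread; the point is just that US MSRPs are quoted pre-tax while EU prices include VAT, so the numbers land in the same ballpark.

```python
# US MSRPs exclude sales tax; EU retail prices include VAT.
usd_msrp = 449.0      # 5700 XT launch MSRP in USD, pre-tax
eur_per_usd = 0.89    # assumed exchange rate (mid-2019 ballpark)
vat = 0.24            # assumed Finnish VAT rate

eur_pre_tax = usd_msrp * eur_per_usd
eur_retail = eur_pre_tax * (1 + vat)
print(f"{eur_pre_tax:.0f} EUR pre-tax -> {eur_retail:.0f} EUR with VAT")
```

So a $449 card works out to roughly 400€ pre-tax and high-400s€ at the till, i.e. within about 10% of the dollar figure.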
Not sure where you're checking prices, but the RTX 2070 is around 550€ for multiple models on hinta.fi. Although, as usual, it's pointless buying anything in this expensive country of ours, since you can get them about 100€ cheaper from Germany.
Also of note: Navi is AMD's mainstream GPU line. Vega is still their high-end lineup, with Vega likely receiving a refresh in 2020.
My understanding is that the next Navi will replace Vega in the gaming lineup. Basically, Navi is their gaming offering and Vega will be the compute one. AMD's hands are a bit tied with this release: they can't get into a cat fight with Nvidia because they don't have a competitive top end, and the Radeon VII is too expensive to drop in price much. Hopefully they will be in a better position with their next Navi release.
- - - Updated - - -
Exactly, Nvidia isn't standing still either. But, as we are seeing, Nvidia isn't getting the same jumps in performance. Nvidia doesn't have much left to gain on the architecture side, but they will be able to improve by moving to 7nm, for example. AMD still have work to do on their architecture: they aren't getting the same gaming performance as Nvidia for the same compute performance.
Edit: Just to explain what I mean by architectural changes. AMD are behind Nvidia at the moment with their GPU architecture from a gaming standpoint. Design changes can net you very big gains in performance; for example, there are string-searching algorithms that can net you a 20-fold speedup over a brute-force byte-by-byte comparison. That means AMD have more places they can optimize. They are already on 7nm, so their jumps in performance will have to come from design changes.
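To make the string-searching analogy concrete, here's a minimal sketch of the kind of algorithmic win being described: Boyer-Moore-Horspool skips whole chunks of the text using a bad-character table, while the brute-force version compares byte by byte at every position. The exact speedup depends on the data; the 20-fold figure is the post's illustration, not a benchmark of this code.

```python
def brute_force_search(text: bytes, pat: bytes) -> int:
    """Check every alignment, byte by byte. O(n*m) worst case."""
    n, m = len(text), len(pat)
    for i in range(n - m + 1):
        if text[i:i + m] == pat:
            return i
    return -1

def horspool_search(text: bytes, pat: bytes) -> int:
    """Boyer-Moore-Horspool: skip ahead by a bad-character shift."""
    n, m = len(text), len(pat)
    if m == 0:
        return 0
    if m > n:
        return -1
    # For each byte in pat[:-1], distance from its last occurrence
    # to the end of the pattern; unseen bytes shift a full pattern.
    shift = {pat[i]: m - 1 - i for i in range(m - 1)}
    i = m - 1  # text index aligned with the last byte of the pattern
    while i < n:
        if text[i - m + 1:i + 1] == pat:
            return i - m + 1
        i += shift.get(text[i], m)
    return -1
```

Same result either way, e.g. both return 6 for `search(b"hello world", b"world")`; the Horspool version just examines far fewer positions on typical data, which is the point about design changes beating brute force.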
Last edited by Gray_Matter; 2019-06-12 at 07:42 AM.
R5 5600X | Thermalright Silver Arrow IB-E Extreme | MSI MAG B550 Tomahawk | 16GB Crucial Ballistix DDR4-3600/CL16 | MSI GTX 1070 Gaming X | Corsair RM650x | Cooler Master HAF X | Logitech G400s | DREVO Excalibur 84 | Kingston HyperX Cloud II | BenQ XL2411T + LG 24MK430H-B
In the sense that it uses GCN as its ISA, yes. But that's like saying Zen is still x86-64 when comparing it to Bulldozer. RDNA drastically changes the compute unit relative to the NCU (the actual Vega uArch name).
Their uArch naming has been a bit of a mess, though, calling it the same name as the ISA for quite a while. They've tried to stray away from calling their uArchs GCN generations for like 3-4 years now, and people are still calling Vega GCN gen 5.
You should read the RDNA architecture presentation (https://gpureport.cz/info/Graphics_A...e_06102019.pdf). RDNA isn't GCN; it still has ties to GCN, but it's very different. Page 10 onward is where it gets interesting. It's certainly not as efficient as Nvidia's architecture, but it's a big step in the right direction, and they are making further changes for the next revision.
RDNA (5700 XT) has fewer transistors than Vega 64 (10.3B vs 12.5B) and fewer TFLOPS than Vega 64 (9.75 vs 12.58), yet we are seeing jumps in performance of upwards of 20%. It's far more efficient from a gaming perspective than Vega with GCN. For comparison, the 2070 has only 7.465 TFLOPS, but it performs on a par with the 5700 XT.
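A quick back-of-the-envelope check of that efficiency claim, using the TFLOPS figures quoted above and treating the 5700 XT (and the 2070, which performs on a par with it) as ~1.2x a Vega 64 in games. The ~20% uplift is the post's figure, not a measurement of mine:

```python
# Peak compute (TFLOPS) as quoted in the post.
tflops = {"Vega 64": 12.58, "RX 5700 XT": 9.75, "RTX 2070": 7.465}
# Relative gaming performance, Vega 64 = 1.00 (assumed ~20% uplift).
rel_perf = {"Vega 64": 1.00, "RX 5700 XT": 1.20, "RTX 2070": 1.20}

baseline = rel_perf["Vega 64"] / tflops["Vega 64"]  # Vega's perf per TFLOP
for gpu in tflops:
    per_tflop = rel_perf[gpu] / tflops[gpu]
    print(f"{gpu}: {per_tflop / baseline:.2f}x Vega 64 gaming perf per TFLOP")
```

On those numbers the 5700 XT extracts roughly 1.5x the gaming performance per TFLOP of Vega 64, and Turing roughly 2x, which matches the post's point: RDNA closes a lot of the gap but Nvidia is still ahead per unit of compute.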
I am pretty sure that NVidia also build on their previous architectures with each new release and don't throw everything away and start from scratch each time.
No. If you look at Zen vs Bulldozer, the design is completely different. Yeah, evidently it's a very similar idea, but the technology and implementation are completely different. RDNA is literally GCN with minor pipeline optimizations and faster cache.
- - - Updated - - -
Doesn't look interesting to me. They're basically trying to save money here by optimizing GCN for gaming. I don't mean they're doing anything wrong; it looks like they're targeting all the right things with RDNA, mostly their usual problems with huge memory bandwidth requirements (which they have to address if they're moving away from HBM2) and CPU usage. But poor scalability is still not addressed; shitty blower coolers on reference models are still not addressed; overclocking support is getting worse again; and power consumption is still very high (more than direct competitors, even though Nvidia is still on 12nm). So yeah, it looks like they're targeting midrange again: cheaper, smaller GPUs with cheaper memory.
At the same time, Turing's design isn't that interesting compared to Pascal either, btw; it's just optimizations they had to make as DX11 falls off, plus an attempt to bring their advancements in compute hardware to the consumer market.
That is very likely.
Today, building a general-purpose CPU or GPU from scratch is an effort of epic proportions; it requires a mad amount of funding, highly skilled manpower, and a long time. If a company already has a functioning architecture, it will never throw away what it has.
Apple is building its stuff right now from scratch, but the amount of money they can pour into this cannot be matched.
The new RX 5700/5700 XT will be dead on arrival. NV is preparing to launch the Super lineup, and I'm sure the MSRPs of the normal/Ti/Super cards will fill the gaps so that anyone picking up a new Navi card is wasting their money. What a shame, AMD! They should have learned from their mistakes against NV's 10 series, when NV pulled the same trick.
Last edited by Ballistic; 2019-06-15 at 11:10 AM.