  1. #21
    Quote Originally Posted by Remilia View Post
    If they want to keep backwards compatibility, yes, actually. The CPU is going to bottleneck that hard. Then there's the software side, which is completely up to the developers, not Microsoft.
    If they go for a new CPU, then they're going to deal with the PR fallout of no backwards compatibility, whether or not it matters to any given person. It's all PR.
    All PR?

    LMAO, what a joke, dude. Yeah, they're totally going to waste money throwing a 6 TF GPU in a console just for PR.

    This is the part where you act like you know more than professional engineers again. The system does not need to use a Jaguar APU to have Xbox One backwards compatibility, and it would not have to emulate it either. You're making a lot of assumptions based on nothing.

    Yeah, just gonna throw a stronger-than-980 Ti GPU in the system for PR and have it bottlenecked. LOL, seen it all in this thread now.

  2. #22
    Remilia
    Quote Originally Posted by Tech614 View Post
    All PR?

    LMAO, what a joke, dude. Yeah, they're totally going to waste money throwing a 6 TF GPU in a console just for PR.

    This is the part where you act like you know more than professional engineers again. The system does not need to use a Jaguar APU to have Xbox One backwards compatibility, and it would not have to emulate it either. You're making a lot of assumptions based on nothing.

    Yeah, just gonna throw a stronger-than-980 Ti GPU in the system for PR and have it bottlenecked. LOL, seen it all in this thread now.
    The PR was referring to backwards compatibility, not the power of the machine.

    I assume you don't know what bare metal programming is. It's programming with little to no abstraction. A high level of abstraction allows for better compatibility across varied hardware, but it also reduces the potential for hardware-specific programming and optimization. That's how it's always worked.
    It has nothing to do with knowing more than a professional. Maybe you should look into software once in a while instead of just talking about it. I can go dig up some posts from zlatan on Anandtech; he's a programmer for these things and has mentioned that for the PS4 they can't change a lot of things like the cache latency, ISAs, and so on. He's been accurate on all fronts for years.
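    To make that concrete, here's a rough C sketch of the trade-off (my own toy example, assuming an x86 target for the low-level version): the portable loop compiles anywhere, while the hand-written SSE version is faster on the right hardware but is tied to that one ISA.

    #include <immintrin.h>  /* x86 SSE intrinsics: ties the code to one ISA */

    /* Abstracted version: plain C, compiles for any CPU the compiler
     * targets, but leaves the vectorization strategy to the compiler. */
    void scale_portable(float *v, float s, int n) {
        for (int i = 0; i < n; i++)
            v[i] *= s;
    }

    /* Closer to the metal: hand-written SSE, 4 floats per instruction.
     * Wins only on hardware that actually has SSE; it won't even build
     * for a different ISA. That is the compatibility trade-off. */
    void scale_sse(float *v, float s, int n) {
        __m128 sv = _mm_set1_ps(s);
        int i = 0;
        for (; i + 4 <= n; i += 4)
            _mm_storeu_ps(v + i, _mm_mul_ps(_mm_loadu_ps(v + i), sv));
        for (; i < n; i++)   /* scalar tail for leftover elements */
            v[i] *= s;
    }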
    https://en.wikipedia.org/wiki/Abstra...engineering%29
    https://en.wikipedia.org/wiki/Low-le...mming_language

    Although this mainly pertains to PS4, it gives you an idea of what to expect.
    http://www.gamasutra.com/view/featur...k_.php?print=1
    "Next, to support the case where you want to use the GPU L2 cache simultaneously for both graphics processing and asynchronous compute, we have added a bit in the tags of the cache lines, we call it the 'volatile' bit. You can then selectively mark all accesses by compute as 'volatile,' and when it's time for compute to read from system memory, it can invalidate, selectively, the lines it uses in the L2. When it comes time to write back the results, it can write back selectively the lines that it uses. This innovation allows compute to use the GPU L2 cache and perform the required operations without significantly impacting the graphics operations going on at the same time -- in other words, it radically reduces the overhead of running compute and graphics together on the GPU."
    Thirdly, said Cerny, "The original AMD GCN architecture allowed for one source of graphics commands, and two sources of compute commands. For PS4, we’ve worked with AMD to increase the limit to 64 sources of compute commands -- the idea is if you have some asynchronous compute you want to perform, you put commands in one of these 64 queues, and then there are multiple levels of arbitration in the hardware to determine what runs, how it runs, and when it runs, alongside the graphics that's in the system."
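    To picture the 'volatile' bit, here's a toy software model (my own sketch, nothing like real GPU hardware): every L2 line gets one extra tag bit, and compute invalidates or writes back only the lines it marked, so the graphics working set is left alone.

    #include <stdbool.h>
    #include <string.h>

    #define NUM_LINES  64
    #define LINE_BYTES 64

    /* Toy model of an L2 line carrying the extra 'volatile' tag bit. */
    typedef struct {
        bool valid, dirty;
        bool vol;            /* set on lines touched by async compute */
        unsigned tag;        /* which block of system memory it caches */
        unsigned char data[LINE_BYTES];
    } line_t;

    /* Compute wants fresh data from system memory: drop only its own
     * (volatile) lines, leaving the graphics lines cached. */
    void invalidate_volatile(line_t *l2) {
        for (int i = 0; i < NUM_LINES; i++)
            if (l2[i].vol)
                l2[i].valid = false;
    }

    /* Compute is done: write back only its own dirty lines. */
    void writeback_volatile(line_t *l2, unsigned char *sysmem) {
        for (int i = 0; i < NUM_LINES; i++)
            if (l2[i].vol && l2[i].dirty) {
                memcpy(sysmem + (size_t)l2[i].tag * LINE_BYTES,
                       l2[i].data, LINE_BYTES);
                l2[i].dirty = false;
            }
    }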
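    And the 64 compute queues come down to something like this; again a toy sketch of mine with a single round-robin pass, where the real hardware has multiple levels of arbitration and priorities:

    #define NUM_QUEUES 64    /* the 64 compute command sources */
    #define QDEPTH     256

    typedef struct { int cmds[QDEPTH]; int head, tail; } queue_t;

    static int q_empty(const queue_t *q) { return q->head == q->tail; }

    /* Pick the next compute command: scan from just past the last
     * winner and dispatch from the first non-empty queue. Returns the
     * winning queue index, or -1 if nothing is pending. Producers
     * would push at tail; this arbiter only pops at head. */
    int arbitrate(queue_t q[NUM_QUEUES], int *last, int *cmd_out) {
        for (int i = 1; i <= NUM_QUEUES; i++) {
            int qi = (*last + i) % NUM_QUEUES;
            if (!q_empty(&q[qi])) {
                *cmd_out = q[qi].cmds[q[qi].head];
                q[qi].head = (q[qi].head + 1) % QDEPTH;
                *last = qi;
                return qi;
            }
        }
        return -1;
    }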
    If these don't exist in the same or a similar manner, then it just won't run properly at all without abstraction. Microsoft could very well have built a level of abstraction into the software and API they're using on their console, but it adds overhead that rules out the kind of fine-grained control mentioned above. Naughty Dog can't do what they do with that level of abstraction.

    And again, TFLOPS means nothing, and it never will. A Fury X has higher TFLOPS than a 980 Ti, but performs basically the same at 3840x2160 and worse at 1920x1080 because of the design.
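    The TFLOPS number itself is trivial arithmetic, shader count times clock times 2 FLOPs per fused multiply-add, which is exactly why it can't capture how a design behaves in games. With the public specs (my own calculation, 4096 SPs at 1.05 GHz vs 2816 cores at about 1.0 GHz base):

    #include <stdio.h>

    /* Peak FP32 TFLOPS = shaders * clock(GHz) * 2 (one FMA = 2 FLOPs),
     * divided by 1000 to go from GFLOPS to TFLOPS. */
    static double tflops(int shaders, double ghz) {
        return shaders * ghz * 2.0 / 1000.0;
    }

    int main(void) {
        printf("Fury X: %.2f TFLOPS\n", tflops(4096, 1.05)); /* ~8.60 */
        printf("980 Ti: %.2f TFLOPS\n", tflops(2816, 1.00)); /* ~5.63 */
        return 0;
    }

    Same formula, very different 1080p results; that's the point.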

    Now, if you can provide something that's not just 'because professionals said so', we can actually have a proper discussion.

  3. #23
    There is no discussion to be had; you're claiming a 6 TF GPU means nothing.

    The discussion ended the second you said that (your first post).

  4. #24
    Remilia
    Because it is pointless: TFLOPS measures compute throughput, not gaming performance. It's not surprising that you don't know that. Since you prefer Nvidia, you should know by now that TFLOPS never equates to game performance, unless of course you're uninformed about specs, which apparently you aren't, given you quoted the 980 Ti. Just because you have no idea doesn't make it not so.

  5. #25
    Quote Originally Posted by Tyresias View Post
    4K is not even remotely close to being the standard. For gaming, at least, the standard is transitioning from 1080p to 1440p for most high-end single-card builds. If you want 4K in current demanding games, that's a 4-5k rig with two top-of-the-line GPUs if you want anywhere close to a smooth 60 fps at max settings, minus AA, which is pointless at that res. People who own a fully 4K-capable rig are an extreme minority. If you mean playable at 4K, then probably half to most current games would run on a GTX 960/970 or AMD R9 370-390 with lowered to very low settings.

    I do agree that if you have to change consoles that often, you might as well say fuck it and go PC
    The monitors are going to be the most expensive part of the rig (assuming you get more than one; otherwise a decent 4K monitor will run the same as a 1080). However, with 4K you don't need to turn all the bells and whistles up... that's the benefit of a higher resolution. Yes, you "can" do that, but you don't HAVE to... so 4K isn't a 4-5k rig. At worst, with everything in there, you're looking at 2-3k (which is still expensive, not saying it isn't, but that's also assuming a completely fresh build; the benefit is that if you've already built a computer, your hardware aside from motherboard/CPU/GPU/memory/monitor will just transfer over, and yes, I am fully aware the rest of the computer is pretty cheap in comparison to those parts). However, I said that 4K wasn't the standard, so I am not sure if your post is arguing with me or not...
