  1. #101
    Quote Originally Posted by Haidaes View Post
    Since some user keeps bringing up the speed of light (and making a fool of himself) I figured it might be the time to point something out:

    The speed of light is a bit below 300,000 km/s, that is 300 km/ms, which means in a country like Germany you need about 2 middling datacenters to supply >80% of the population with "light speed internet" in 1 ms. Now the reality is obviously that copper is still used a lot along the way, which only has an electromagnetic wave propagation speed of about 2/3 c; that leaves us with 200 km/ms in our trusty (shielded) twisted pair cable (not actually what's used in most places for long distance).

    Since most of us don't tend to be superheroes like the Flash, let's be lazy and say we can get away with 400 km per 2 ms and reach a bit further instead. With a return trip we lose a whopping 4 ms per 400 km of distance. For the people in the US: a highly populated state like California is about 1,200 km long; that once again is 2 datacenters for 4 ms of delay, or 8 ms for 1 datacenter, and the people in the neighboring states to the east might benefit from it as well.
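    That back-of-the-envelope math is easy to sanity-check; a minimal sketch (the 400 km distance and the 2/3 c copper figure are the same rough assumptions as above):

```python
# One-way and round-trip propagation delay over a given distance,
# at a given fraction of the speed of light in vacuum.

C_KM_PER_MS = 299_792.458 / 1000  # speed of light: ~300 km/ms

def propagation_delay_ms(distance_km, velocity_factor=1.0):
    """One-way propagation delay in milliseconds.

    velocity_factor: fraction of c the medium supports
    (~1.0 for vacuum, ~2/3 for typical copper or fiber).
    """
    return distance_km / (C_KM_PER_MS * velocity_factor)

def round_trip_ms(distance_km, velocity_factor=1.0):
    return 2 * propagation_delay_ms(distance_km, velocity_factor)

# 400 km through copper at ~2/3 c: ~2 ms each way, ~4 ms round trip
print(round(propagation_delay_ms(400, 2/3), 1))  # ~2.0
print(round(round_trip_ms(400, 2/3), 1))         # ~4.0
```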

    Now that we have cleared up that the speed of light is not the issue and poor old c can keep cruising through vacuum at max speed, we can start talking about the things that actually are the issue here: EE, computer science and infrastructure spending.

    The places where we actually piss away latency are our routers at home, the backbones of our providers, all the little switches between us and the datacenter, obviously the time it takes to compute the data they send to you, and also the fact that lots of differently routed packets bog down systems, since at the end of the day they still only pass data sequentially, bit by bit, and have to share that capacity among so many users per channel.

    As I wrote in an earlier post, some of these things can be addressed: packet priority in the US is already no longer the same for everyone, and since Google owns a decent chunk of the infrastructure they can prioritize their packets, which counters congestion problems to some degree. Another one is money; obviously Google won't have a decent datacenter in every rinky-dink state with 200k inhabitants/potential customers. Taking a look at the map though, it doesn't seem as bad as one might expect; reaching the majority of players is still feasible. As some of the testers have shown, the reality is it can work for a decent chunk of people, no matter how bad the current service actually is from a cost-benefit point of view for the user. Are they there yet? For most people, clearly not. Is it possible? Sure. Just today I read that Twitch wants into the business as well, so as said before, this is merely the beginning of the end.

    - - - Updated - - -



    Yea, especially since their launch lineup is mostly old things that you often see in sales. For the early adopters they surely could have done way better as far as offering a decent deal goes.
    Lol, I can't speak for everyone, but the point wasn't that it's literally the speed of light; it's that almost any input latency on top of what is already generated by our monitors and other hardware is too much for any competitive title like an FPS or fighting game, where you're trying to get everything as close to 1:1 as you can between your inputs and what you see and feel. You can feel the difference between new and old monitors, for example, and that is sitting right in front of your face. With current internet speeds and the hardware we play games with, it's essentially impossible for most people to both use Stadia and maintain the needed performance for those types of games, because it isn't just sending and receiving the packets but also rendering the game afterward, plus the time it takes you as a human to react to what you're seeing and feeling after that. It all adds up in a world where games can already feel bad just from having the wrong monitor, before you start pushing your inputs through a vastly larger network.
    Last edited by Erolian; 2019-11-21 at 08:33 PM.

  2. #102
    Immortal Stormspark's Avatar
    7+ Year Old Account
    Join Date
    Jun 2014
    Location
    Columbus OH
    Posts
    7,953
    Quote Originally Posted by Strangebrew View Post
    hahahahahaha, god i love how any fucking retard who bought this garbage is getting shat on all over by google too.
    Honestly, my feeling on the subject: anyone who bought into this deserves to be scammed, for being gullible. "Streaming". LOL. I play games; I don't interact with videos of games that I don't even own. I will never touch a game that doesn't run locally on my hardware. Servers for online features (like MMOs)? Yes, I've done those before. But even with those, the client and graphics are local; the network stuff is just data so that lots of players can be in a shared world.

    Hopefully this thing dies and gets buried quickly, and costs Google a ton of money. And hopefully everyone that bought into it gets burned hard and all the money they put into it wasted. That will stop others from trying this purely idiotic idea.

  3. #103
    Quote Originally Posted by GreenJesus View Post
    I thought originally it was just a service/server you could stream games from that you already owned. I didn't think you actually had to buy the games from them... lol, DOA.
    I don't know how you expect your own games to stream from their hardware, but yeah, they needed to combine this with the subscription game services that are increasingly coming up, where you get a big collection of games, including new ones, with the subscription.

    Paying extra and retail price for games you won't even be able to use if you cancel is hilarious.
    It ignores such insignificant forces as time, entropy, and death

  4. #104
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Elim Garak View Post
    The places where we actually piss away latency are our routers at home, the backbones of our providers, all the little switches between us and the datacenter, obviously the time it takes to compute the data they send to you, and also the fact that lots of differently routed packets bog down systems, since at the end of the day they still only pass data sequentially, bit by bit, and have to share that capacity among so many users per channel.
    It's still a speed of light... err speed of electricity issue as you're still dependent on that speed to operate routers and other hops to move data. Those devices aren't going anywhere ever. We can increase throughput but latency will always be roughly the same.
    As I wrote in an earlier post, some of these things can be addressed: packet priority in the US is already no longer the same for everyone, and since Google owns a decent chunk of the infrastructure they can prioritize their packets, which counters congestion problems to some degree.
    Net Neutrality is back in some states and that would be against the rules. I don't think enough QoS will solve anything.
    Another one is money; obviously Google won't have a decent datacenter in every rinky-dink state with 200k inhabitants/potential customers. Taking a look at the map though, it doesn't seem as bad as one might expect; reaching the majority of players is still feasible. As some of the testers have shown, the reality is it can work for a decent chunk of people, no matter how bad the current service actually is from a cost-benefit point of view for the user. Are they there yet? For most people, clearly not. Is it possible? Sure. Just today I read that Twitch wants into the business as well, so as said before, this is merely the beginning of the end.
    In the best controlled conditions it doesn't look like this has a leg to stand on. As it stands right now Stadia doesn't recommend WiFi and doesn't recommend using Netflix while using Stadia. The infrastructure may get better, and home WiFi may improve but the latency will always be there.

  5. #105
    Quote Originally Posted by Vash The Stampede View Post
    It's still a speed of light... err speed of electricity issue as you're still dependent on that speed to operate routers and other hops to move data. Those devices aren't going anywhere ever. We can increase throughput but latency will always be roughly the same.

    Net Neutrality is back in some states and that would be against the rules. I don't think enough QoS will solve anything.

    In the best controlled conditions it doesn't look like this has a leg to stand on. As it stands right now Stadia doesn't recommend WiFi and doesn't recommend using Netflix while using Stadia. The infrastructure may get better, and home WiFi may improve but the latency will always be there.
    Not sure why you attribute this to Elim Garak since you clearly quoted me here. And no, it is not a speed of light issue. While the speed of light is also the speed limit of information propagation, it is not what is holding you back here. The switching losses/delays and calculation times are not governed by the speed of light on the scale we are talking about; what hampers us here are things like impedances, charge changes in semiconductor gates, the way we format data and the way we build our infrastructure (not everyone having a direct connection to everyone else, etc.).

    With that out of the way, you are right that the service is currently unattractive in cases where you share your internet connection with others in the house, don't have a 100+ Mbit connection, or are in general a player of a certain niche of titles and overly competitive. I have yet to see anyone pull out some CS basics for their tests and try messing with their router settings to bump the QoS for the Chromecast's MAC, but that would certainly be interesting to note with regard to maximizing your performance in households with multiple users. Heck, even Gbit office connections might see changes here with regard to the varying input lag spikes.
    Hooking up your devices via a cable is also something I personally don't mind, since that has always been the preferable way to get a good connection. It certainly doesn't make this more appealing to the soy-latte-slurping crowd on phones 3 floors down from where the WLAN router is installed. But then kids grow up playing Fortnite on their phones and shift millions (billions?) in MTX money; I frankly consider playing a shooter on a phone to be beneath me, but that hasn't stopped others so far.
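    For what it's worth, on a Linux-based router the QoS bump mentioned above might look roughly like this. It's only a sketch: the interface name (eth0) and the Chromecast's static IP are placeholder assumptions, and since iptables can only match source MACs, pinning the device to a reserved IP stands in for matching its MAC:

```shell
# Mark all downstream traffic destined for the Chromecast
# (assumed to sit at the reserved address 192.168.1.50).
iptables -t mangle -A POSTROUTING -d 192.168.1.50 -j MARK --set-mark 10

# Attach a 3-band priority qdisc to the LAN interface and steer
# marked traffic into the highest-priority band.
tc qdisc add dev eth0 root handle 1: prio
tc filter add dev eth0 parent 1: protocol ip handle 10 fw classid 1:1
```

    This only reorders traffic inside your own house, of course; it does nothing about congestion upstream of your router.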

    Digital Foundry put up a good video where they tested the system and gave by far one of the most unbiased reviews of the tech itself (probably because they actually got their code), while also clearly pointing out the failures. They measured an average of 45 ms of additional latency compared to consoles, which already seem to have a fairly major amount of input lag (70-120 ms). That is something the overwhelming majority of players won't notice, but it certainly won't appeal to everyone. Another major issue is clearly that the input lag is variable, for some testers apparently more than others. They will need to start ironing out these flaws quickly and help people maximize the performance, which makes this launch all the more laughable, as people pay to be beta testers and don't even get a good value proposition on old games (which most of the titles just are). If you want to complain about something, then complain about that, because that is where this product fails by far the most (that and the bear claw clips inside the controller which make the modular design kind of pointless again... that one just hurts me from a professional PoV, screws exist you wankers!).
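    Stacking those figures into an end-to-end budget is trivial; the two inputs below are just the Digital Foundry averages quoted above, nothing more:

```python
# Rough input-to-photon latency budget for Stadia, using the quoted
# figures: consoles at roughly 70-120 ms, Stadia adding ~45 ms on top.

console_input_lag_ms = (70, 120)  # typical console range (per the review)
stadia_overhead_ms = 45           # average extra latency measured

stadia_total = tuple(base + stadia_overhead_ms
                     for base in console_input_lag_ms)
print(stadia_total)  # (115, 165)
```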
    Last edited by Cosmic Janitor; 2019-11-22 at 06:16 AM.
    You are welcome, Metzen. I hope you won't fuck up my underground expansion idea.

  6. #106
    Cloud gaming is amazing... in the future. ISPs are the bottleneck, and unless 4K video becomes mainstream they aren't going to have sufficient demand to increase data caps and speeds.

  7. #107
    Quote Originally Posted by Deferionus View Post
    I played WoW with 600 ms and got to 2200 arena rating with it. When I moved to a mid-sized city and got cable internet for the first time and only had 190 ms, I was able to get to 2700 and felt like a god with how responsive the internet was. Now I have fiber internet and play games at usually ~30 ms, but I don't do arenas anymore lol.

    I can ping 8.8.8.8 and I only have 9 ms response times. This cloud gaming thing would probably work for me, but I spend 2 grand building PCs so I'm not really a part of their target market.
    The big difference between WoW latency and cloud gaming latency is that it's a one-way trip, not a round trip. Your game is still rendered locally; you're only sending inputs to the server. You don't have to send them and then wait for your animation to happen after the server receives said input and sends the image back to you. This is why online games rendered locally don't have terrible latency like cloud gaming.

  8. #108
    Banned Video Games's Avatar
    15+ Year Old Account
    Join Date
    Mar 2009
    Location
    Portland (send help)
    Posts
    16,130
    Lag might be a big problem, but I still think getting scammed by also needing to buy games is the biggest hurdle it actually has. Imagine if you had to buy the SNES games on Switch on top of the sub fee to play them. People would riot.

  9. #109
    The Unstoppable Force Elim Garak's Avatar
    10+ Year Old Account
    Join Date
    Apr 2011
    Location
    DS9
    Posts
    20,297
    Quote Originally Posted by Tech614 View Post
    The big difference between WoW latency and cloud gaming latency is that it's a one-way trip, not a round trip. Your game is still rendered locally; you're only sending inputs to the server. You don't have to send them and then wait for your animation to happen after the server receives said input and sends the image back to you. This is why online games rendered locally don't have terrible latency like cloud gaming.
    You are talking about perceived latency here. The physical latency is the same, be it a local client or a cloud one: the input/feedback has to travel to the server/cloud and back. With clients, you just might experience delayed lag correction, a rollback; with clouds, you just experience a pause.


    As for the speed of light: a direct perfect cable around the world would have a data travel time of roughly 133 ms. So if you are on the opposite side of the planet from the datacenter, your ping will never exceed ~133 ms, if it's just a direct perfect cable. Quite playable. But I'm pretty sure you will have a datacenter quite a bit closer than 20,000 km away, and if it's just 2,000 km away you are looking at a ~13 ms ping with a direct perfect cable. So when it comes to Earth, the speed of light is not that significant a contributor to lag. It's the cable materials and infrastructure.
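    For anyone who wants to check those figures, a quick sketch (Earth's circumference is approximate, and "perfect cable" means propagation at c; real fiber manages roughly 2/3 of that):

```python
# Round-trip propagation time (ping floor) over a one-way distance.

C_KM_PER_S = 299_792          # speed of light in vacuum, km/s
EARTH_CIRCUMFERENCE_KM = 40_075

def ping_ms(one_way_km, velocity_factor=1.0):
    """Round-trip propagation time in ms for a one-way distance."""
    return 2 * one_way_km / (C_KM_PER_S * velocity_factor) * 1000

# Datacenter on the exact opposite side of the planet (~20,000 km away):
print(round(ping_ms(EARTH_CIRCUMFERENCE_KM / 2)))  # ~134 ms

# Datacenter 2,000 km away:
print(round(ping_ms(2000)))  # ~13 ms

# Real fiber carries light at roughly 2/3 c, so the floor rises:
print(round(ping_ms(2000, velocity_factor=2/3)))  # ~20 ms
```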
    All right, gentleperchildren, let's review. The year is 2024 - that's two-zero-two-four, as in the 21st Century's perfect vision - and I am sorry to say the world has become a pussy-whipped, Brady Bunch version of itself, run by a bunch of still-masked clots ridden infertile senile sissies who want the Last Ukrainian to die so they can get on with the War on China, with some middle-eastern genocide on the side

  10. #110
    Quote Originally Posted by Elim Garak View Post
    The physical latency is the same be it a local client or a cloud one.
    Except it's not. That's not how a round trip vs. a one-way trip works, my dude; not even gonna read the rest of this. It's either semantics or wrong, so either way not really worth it.

    There is a reason online games work for almost anything while cloud gaming only works for specific genres. In WoW, when you're moving left, you see your character start moving left at the native latency of your hardware. In a cloud-rendered game you have to wait for your input to travel to the service, which then sends the image back to you with your character finally moving left.
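    The distinction being argued here can be made concrete with a toy model; every number below is an illustrative assumption, not a measurement:

```python
# Perceived delay between pressing a key and seeing your own character
# move, local rendering vs. cloud rendering. All figures illustrative.

one_way_ms = 25         # client <-> server network, one direction
local_pipeline_ms = 30  # input sampling + render + display locally
encode_decode_ms = 20   # cloud video encode + client decode

# Local client: your character moves as soon as your own hardware
# renders the next frame; the server round trip only affects what
# other players see (and lag correction).
local_perceived = local_pipeline_ms

# Cloud: the input must reach the server, the frame must be rendered,
# encoded, sent back, and decoded before you see anything at all.
cloud_perceived = (one_way_ms + local_pipeline_ms
                   + encode_decode_ms + one_way_ms)

print(local_perceived)  # 30
print(cloud_perceived)  # 100
```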
    Last edited by Tech614; 2019-11-22 at 08:58 AM.

  11. #111
    Please wait Temp name's Avatar
    10+ Year Old Account
    Join Date
    Mar 2012
    Location
    Under construction
    Posts
    14,631
    Quote Originally Posted by Vash The Stampede View Post
    eventually ARM along with a number of other system on chip manufacturers will jump into the PC market.
    I'm not going to argue with everything else you said, I think you're right, but this?
    No. ARM won't make a splash in the PC scene. They and Microsoft tried with the original Surface tablets running Windows RT. Almost NOTHING got ported. Hell, the platform is dead now.

    Games are still barely getting ported between OS's, and you're expecting an entirely new CPU architecture to come in?

    Of course they could make x86-based CPUs, but then they'd have to pay royalty fees to both Intel and AMD to make them, and I very much doubt they'd be able to charge enough to make their money back when they're new, unknown, and operating at a super small scale in x86.

  12. #112
    Quote Originally Posted by Tech614 View Post
    Except it's not. That's not how a round trip vs. a one-way trip works, my dude; not even gonna read the rest of this. It's either semantics or wrong, so either way not really worth it.

    There is a reason online games work for almost anything while cloud gaming only works for specific genres. In WoW, when you're moving left, you see your character start moving left at the native latency of your hardware. In a cloud-rendered game you have to wait for your input to travel to the service, which then sends the image back to you with your character finally moving left.
    What he means is that while the screen you see in a cloud-rendered game indeed suffers from a lot more lag, your location updates on the server just as quickly as it would if you were playing on local hardware, since in both cases the signal from your controller has to travel to the server to be processed.

    You can see this in games like WoW when you're talking to someone while travelling with them and they see you in a different spot than you see yourself.
    It ignores such insignificant forces as time, entropy, and death

  13. #113
    Quote Originally Posted by ldev View Post
    100k car for 5 years and selling for 35k after 5 years.

    So buying outright: after 5 years you'll sell for 35k and have 35k left after spending 100k and owning a 100k car for 5 years.

    Invest 90k with, let's go on the low safe side, 10% annual return. After 5 years I will end up with 115k.
    100k car with 10k downpayment and I will have to return it after 5 years, monthly payment is 1075. So after 5 years I will pay: 10k + 1075*60 = 74500. And got 115k from investments. 115k - 74500 = 40.5k left. 40.5k left after lease is 5.5k more than 35k left when buying outright.

    I invest safely, so even for me leasing a 100k car is 5k CHEAPER than outright buying it. And some of my coworkers have 17% annual return (after losses, obv) on their investments. So after 5 years they will end up with 134k. With 17% annual return investments, that's... 24 fucking thousand eur saved on leasing a car vs buying one.

    Please do tell me more how leasing is stupid, because I don't see it.
    If your 100k car is selling for 35k after 5 years, you must really fuck it up, cause my 55k truck resold for 42k after 6 years. Lol, leasing is a horrendous decision unless you are the type who wants a new car every year or two. Leasing is a very bad financial decision; nearly everyone in investing could tell you as much. Just like making car payments in general and buying new cars constantly, these are all very bad decisions.

    Your 100k-to-35k car example is so horrendously bad it is laughable. Cars depreciate like CRAZY, but above certain values that rate steeply falls off. No one is getting 100k cars down to 35k 5 years later unless they are putting 25k miles on it a year, which, if you were, would destroy your lease. No one in their right mind would advocate that leasing is SMARTER. It might work better for people who constantly want the newest thing, but there is no doubt it is a worse financial decision.

  14. #114
    Immortal Stormspark's Avatar
    7+ Year Old Account
    Join Date
    Jun 2014
    Location
    Columbus OH
    Posts
    7,953
    Quote Originally Posted by Gorsameth View Post
    What he means is that while the screen you see in a cloud-rendered game indeed suffers from a lot more lag, your location updates on the server just as quickly as it would if you were playing on local hardware, since in both cases the signal from your controller has to travel to the server to be processed.

    You can see this in games like WoW when you're talking to someone while travelling with them and they see you in a different spot than you see yourself.
    Yep... and keep in mind, even in games like WoW, the client/graphics are rendered locally on your computer. The network is only used to pass data about abilities used and other players/NPCs/mobs. And even in that situation the latency is definitely noticeable; it's just that MMOs like that are designed with it in mind.

  15. #115
    Many platforms are offering all their games "free to play" at the cost of a sub (Xbox Live, Origin, etc.), and you want me to pay a sub plus buy games at €60 each too? Bitch please, this is even worse than a fusion of EA + Activision + Ubisoft.
    You think you do, but you don't ©
    Rogues are fine ©
    We're pretty happy with rogues ©
    Haste will fix it ©

  16. #116
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Temp name View Post
    I'm not going to argue with everything else you said, I think you're right, but this?
    No. ARM won't make a splash in the PC scene. They and Microsoft tried with the original Surface tablets running Windows RT. Almost NOTHING got ported. Hell, the platform is dead now.

    Games are still barely getting ported between OS's, and you're expecting an entirely new CPU architecture to come in?
    This is where Linux comes in. Microsoft had a hard time because it tried to push ARM users into the Windows app store for their software, which was really bad considering they didn't give you an option to run older x86-based applications on ARM through some sort of emulation. Linux doesn't have this problem, since most of the code is open source and therefore easy to port over to ARM. Even where it isn't, Linux has QEMU and other methods to run x86 code on ARM... very slowly. There's a reason you run Debian on the Raspberry Pi and not Windows.

    It obviously won't happen soon, just like cloud gaming won't happen soon, but eventually cheaper ARM-based CPUs or even SoCs will be made for PC and will allow gamers to get cheaper hardware than x86 offers. Even still, there are a few GPU companies that could push their way into PC gaming and force lower prices; think of Mali and PowerVR graphics coming to PC. Apple kicked out PowerVR in favor of the graphics they now make themselves for their iOS devices. This is the direction the computer industry will go, because AMD/Intel/Nvidia are living off the server market right now, which will pay much more for computer hardware than gamers will. Future gaming PCs might look a lot like the Raspberry Pi, but bigger and much more powerful.
    Of course they could make x86-based CPUs, but then they'd have to pay royalty fees to both Intel and AMD to make them, and I very much doubt they'd be able to charge enough to make their money back when they're new, unknown, and operating at a super small scale in x86.
    You have to pay royalty fees for ARM as well. The only CPU architecture that doesn't ask for a fee is RISC-V. Eventually Intel will have to open up x86 to licensing so that x86 doesn't become irrelevant. That's what IBM had to do once Apple dropped PowerPC, but it was too little, too late. It might be the same situation for Intel and x86.
    Last edited by Vash The Stampede; 2019-11-22 at 08:25 PM.

  17. #117
    Quote Originally Posted by Tech614 View Post
    The big difference between WoW latency and cloud gaming latency is that it's a one-way trip, not a round trip. Your game is still rendered locally; you're only sending inputs to the server. You don't have to send them and then wait for your animation to happen after the server receives said input and sends the image back to you. This is why online games rendered locally don't have terrible latency like cloud gaming.
    I never claimed it would be better than a local connection. I work at an ISP / MSP so I'm familiar with what you are talking about. Something like an FPS would also be a bad application of cloud gaming. However if you start looking at turn based games, tactical games, games that are not as sensitive it would be a more seamless experience I'd imagine.

  18. #118
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,865
    Quote Originally Posted by Deferionus View Post
    I never claimed it would be better than a local connection. I work at an ISP / MSP so I'm familiar with what you are talking about. Something like an FPS would also be a bad application of cloud gaming. However if you start looking at turn based games, tactical games, games that are not as sensitive it would be a more seamless experience I'd imagine.
    Basically this, as I mentioned before that "speed of light" troll started getting all militant on me.

    The guy simply does not understand that the "speed of light" is not the issue; the issue is that half of the infrastructure we feed on is a rotting wreck, passing through shitty ancient equipment tucked away in a dozen PoPs that traffic-shape the shit out of everything.

    On top of that, once cloud gaming inevitably picks up, more intense titles will be developed with cloud gaming in mind, with various ways to reduce and offset the additional lag it introduces, such as partially running the game locally and using cloud hardware to draw secondary (but most demanding) stuff like scenery and so on.

    I already gave an example of the obvious stuff: the HUD and player characters can be drawn and controlled locally, while NPCs, scenery and extra eye candy like shadows, lighting and so on are done remotely and streamed. This, for example, may be enough for most games except FPS, and could make them available in 4K even on business laptops with integrated graphics, which are more than enough for just a HUD and a couple of characters drawn locally.
    Last edited by Gaidax; 2019-11-23 at 12:45 AM.

  19. #119
    Quote Originally Posted by Gaidax View Post
    I already gave an example of the obvious stuff: the HUD and player characters can be drawn and controlled locally, while NPCs, scenery and extra eye candy like shadows, lighting and so on are done remotely and streamed. This, for example, may be enough for most games except FPS, and could make them available in 4K even on business laptops with integrated graphics, which are more than enough for just a HUD and a couple of characters drawn locally.
    Not sure about that. Rendering a HUD locally and overlaying it would be easy, true, but also utterly pointless, as it costs you next to nothing to render anyway. Rendering a character locally, though, would be way more complicated, because you can't just overlay it later on the scenery; that is not how (modern) 3D rendering works. Just think about cases where part of your character is obstructed by the scenery: you'd still need to communicate with the server to exchange the exact location, the pose, the effects, etc. At that point you can just let the server render all of it, as you'd gain nothing here.

    I could see some frame timing and decompression optimization potential, but splitting the rendering seems rather doubtful to me. For FPS this might be a bit different, but I doubt you would gain much by doing so, unless you cover 2/3 of your screen with your weapon and save some rendering time that way.
    You are welcome, Metzen. I hope you won't fuck up my underground expansion idea.

  20. #120
    Quote Originally Posted by BeepBoo View Post
    You keep the car longer than 5 years and get to the phase where you don't make payments any more. I know. Shocking that a car can last you 10, 20 years WITHOUT a single major repair when nicely maintained and not crashed. Now do your dumb plot with all that money you're not wasting on payments after the 5 year mark. Also, a 5 year loan is retarded. Buy a cheaper car and pay it off sooner.
    > Also, a 5 year loan is retarded.

    I agree. Facelift comes out after 3.

    lol
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english
