Page 2 of 3
  1. #21
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by Dukenukemx View Post
    As opposed to this. Much better use of screen real estate AMIRIGHT?
    http://cdn2.knowyourmobile.com/sites...?itok=ClC_y66g
    This tablet in itself, without those stupid buttons, is already bad; the bezels are ridiculously enormous and you could very well put buttons on them.

    This is an example of good screen/body ratio.

    And your logic doesn't make sense, if you want buttons you can buy a hand-held console and be happy. Tablets are tablets.

  2. #22
    Fluffy Kitten Remilia's Avatar
    10+ Year Old Account
    Join Date
    Apr 2011
    Location
    Avatar: Momoco
    Posts
    15,160
    Quote Originally Posted by Pterodactylus View Post
    Intel passed on making chips for the iPhone - dumb asses.
    I don't think it'd matter when Apple was planning to design their own CPUs anyways.

    - - - Updated - - -

    Intel tried to push into mobile but with limited success; it's one of the growing (albeit slowing) markets now. Atom was a spawn of that effort, despite being utter ass. They failed, however, and are pursuing more profitable areas. Consumer desktops are at a standstill and going down (-15% last year), with chips down by 4%, which is why they're pursuing more server-oriented markets like Knights Landing supercomputer contracts and data centers. They're even making absurdly huge ~690mm² (iirc) silicon at (relatively) lower TDP than Nvidia's offering.
    With more emphasis on off site (cloud) computing and resources for consumer, it's where the current growing market is at and being focused on.

    http://anandtech.com/show/10262/inte...ce-adjustments

  3. #23
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Artorius View Post
    This tablet in itself without those stupid buttons is already bad, the bezels are ridiculously enormous and you could very well put buttons at them.

    This is an example of good screen/body ratio.

    And your logic doesn't make sense, if you want buttons you can buy a hand-held console and be happy. Tablets are tablets.
    Ok, now put your thumbs on it. The point of this is, without a good solution for gaming input on mobile, there's no future for faster, better CPUs. PC needs more demanding applications. Specifically games, as games have always driven development. My belief is that PC gaming needs better tools. Better tools to make 3D models and create animations, cause these things are extremely time consuming. Mobile devices need better input. How you go about that is up for debate. Otherwise games like Candy Crush aren't going to engage people to want to buy faster, better Systems on Chips. I would even argue the server market is a dead end, cause I don't think people are going nuts over cloud. Everyone who talks about the cloud is always a business that wants to make a cloud service to sell. Not sure if anyone is interested in buying a cloud service.

  4. #24
    Quote Originally Posted by Dukenukemx View Post
    So are invisible walls. It's still bad game design.
    That's only bad game design for games trying to simulate reality, which many developers do not bother with (and for good reason). There will always be limits to video games and how many realistic features and checks you can implement. AI, for example, would be a supreme pain just with variable dialogue alone. You complain that wooden doors are indestructible, when in every game there will always be objects and walls that are indestructible.

    We are gradually reaching new lengths in applying realism to games, but IMO we should just accept that games are what they are. Games. Not everything has to be a simulation.

  5. #25
    Deleted
    Quote Originally Posted by kail View Post
    Games. Not everything has to be a simulation.
    Church.

    A game developer's first priority is to entertain. Realism isn't fun, by definition.

  6. #26
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by Dukenukemx View Post
    Ok, now put your thumbs on it.
    Okay?

    Pls Duke.
    The point of this is, without a good solution for gaming input on mobile, there's no future for faster, better CPUs.
    Gaming hardly matters; the strongest CPUs aren't in the consumer market. The reason Intel isn't trying harder to improve the performance of consumer chips is that their competition screwed up in the past, and they're in a place that hardly needs improvement. The bottleneck is almost never the CPU nowadays.
    PC needs more demanding applications. Specifically games, as games have always driven development.
    PCs have demanding applications, plenty of them. Gaming only drives development of graphics cards, and it's arguably true that other, more important things have stolen part of that, since GPUs nowadays can be used for compute too.
    My belief is that PC gaming needs better tools. Better tools to make 3D models, and create animations, cause these things are extremely time consuming.
    Problem is not the tools, they just don't bother with doing extremely heavy games because there isn't a large enough market for it. Consoles are what drive those "AAA" games and consoles can't run what you want developers to make.
    Mobile devices need better input. How you go about that is up for debate. Otherwise games like Candy Crush aren't going to engage people to want to buy faster, better Systems on Chips.
    Honestly, people aren't only buying faster phones because they want to play games. In fact it doesn't even matter for Android games, because mid-tier phones can already max out the games at a constant 60fps just fine. One of the reasons Samsung and Qualcomm push their GPU game forward is that driving 221,184,000 pixels per second without stuttering is not an easy task for a GPU of that caliber. Everything you do on Android comes through the screen; if you try to scroll something and the phone does jerky animations, you won't have a nice experience. The actual difference between high-end mobile GPUs nowadays is which one consumes less energy to do the same thing, because efficiency actually matters on mobile since you're running on batteries.
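    (That pixel figure checks out, for what it's worth — a quick sanity check, assuming it refers to a 2560×1440 WQHD panel refreshed at 60 Hz, which is what the number implies:)

```python
# Pixels a phone GPU must composite every second for a smooth UI,
# assuming a 2560x1440 (WQHD) panel refreshed at 60 Hz.
width, height, refresh_hz = 2560, 1440, 60
pixels_per_second = width * height * refresh_hz
print(pixels_per_second)  # 221184000
```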
    I would even argue the server market is a dead end, cause I don't think people are going nuts over cloud. Everyone who talks about the cloud is always businesses who want to make a could service to sell. Not sure if anyone is interested in a cloud service.
    What? You understand that the entire Internet and its databases are basically servers, right? Unless you plan on shutting off the Internet, there will always be a huge market for servers.

  7. #27
    Yeah, but one server can run a hundred websites, one 24-core physical machine can run multiple virtual servers, and the whole thing can work for a decade. It's not a dead end, but it's not like everyone buys a million server CPUs per year.
    Last edited by haxartus; 2016-04-22 at 03:48 PM.

  8. #28
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Artorius View Post
    I meant on the screen like you're playing a game, not reading a webpage. The point is that you're wasting screen space with your fingers.
    PCs have demanding applications, plenty of them. Gaming only drives development of graphic's cards and it's arguably true that other more important things have stolen part of it, since GPUs nowadays can be used at computing too.
    But the CPU industry was also driven by gaming, just not recently. There were more copies of Doom than there were of Windows back in the day.
    Honestly, people aren't only buying faster phones because they want to play games. In fact it doesn't even matter for Android games, because mid-tier phones can already max out the games at a constant 60fps just fine. One of the reasons Samsung and Qualcomm push their GPU game forward is that driving 221,184,000 pixels per second without stuttering is not an easy task for a GPU of that caliber. Everything you do on Android comes through the screen; if you try to scroll something and the phone does jerky animations, you won't have a nice experience. The actual difference between high-end mobile GPUs nowadays is which one consumes less energy to do the same thing, because efficiency actually matters on mobile since you're running on batteries.
    Which is why I haven't bought a high end phone for a while.
    What? You understand that the entire internet and databases are basically servers right? Unless you plan on shutting off the Internet, there will always have a huge market for servers.
    But do you need the latest Intel Xeon CPU to run your web server? If it already meets demand, then why upgrade? The whole "Internet of Things" is a term coined to get more people to buy junk connected to the internet, just like the word "cloud" - something people can do themselves.


  9. #29
    Fluffy Kitten Remilia's Avatar
    10+ Year Old Account
    Join Date
    Apr 2011
    Location
    Avatar: Momoco
    Posts
    15,160
    In terms of cloud - http://anandtech.com/show/10271/micr...g-cloud-growth
    Quote Originally Posted by haxartus View Post
    Yeah, but one server can run a hundred websites, one 24-core physical machine can run multiple virtual servers, and the whole thing can work for a decade. It's not a dead end, but it's not like everyone buys a million server CPUs per year.
    If all servers did were light-duty work, then that'd make sense, but they're not limited to just that. Not all servers are even for public use. Render farms, internal servers for companies, schools, governments, databases, games, whatever. China just signed an agreement with AMD for Opteron CPUs based on Zen, for example.

    Hell Facebook just shed a shit ton of Sandy Bridge Xeon CPUs onto Ebay months ago.

    - - - Updated - - -

    Quote Originally Posted by Dukenukemx View Post
    But do you need the latest Intel Xeon CPU to run your web server? If it already meets demands then why upgrade? The whole Internet of Things is a term coined up to get more people to buy junk connected to the internet, just like how the word cloud. Something that people can do themselves.

    https://www.youtube.com/watch?v=86aGW2azWUU
    You're giving way too much credit to the general public.
    The first video talked about it spanning months, and a lot of homework to be done JUST for that.

  10. #30
    Deleted
    Quote Originally Posted by haxartus View Post
    Yeah but one server can run a 100 websites and one 24 core physical machine can run multiple virtual servers, and the whole thing can work for a decade.
    Right now, I could use a CPU with 100 times more power than is available, simply because video transcoding at 4K or 8K is so demanding that even my multi-CPU, almost-latest-Xeon machine runs out of juice.
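    (To put rough numbers on why that's so heavy: each jump in resolution roughly quadruples the pixels the encoder has to crunch per frame. A back-of-the-envelope sketch, assuming the standard 1080p/UHD frame sizes:)

```python
# Raw pixels per frame at common video resolutions; encoder work
# scales at least linearly with this, so 8K is brutal on a CPU.
resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h} pixels/frame ({w * h // base}x 1080p)")
# 4K is 4x the pixels of 1080p, and 8K is 16x.
```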

    And no, GPU transcoding is not an option because of decreased quality and poor support for HEVC.

  11. #31
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,865
    Quote Originally Posted by Dukenukemx View Post
    Good luck with that, Intel. What's funny is that the slump the PC industry is facing is partially due to Intel's lack of innovation. Why is the 6600K not really faster than a 4960K, as far as IPC goes?
    You do not understand much. Nobody, besides some 3% of people, gives a damn that Skylake is barely faster than, let's say, Haswell, because what is popular and needed is efficiency, not brute force, due to mobile.

    It does not take much to understand that desktop PCs are a dying breed; people switch to lean laptops for their day-to-day needs, in addition to phones and other gadgets. That trashcan you have under your desk is literally a dinosaur that will be in a museum 10-15 years from now.

    What Intel did innovate on in the last years - and you seem not to care about - is the introduction of fast and snappy laptops that last many, many hours and are very light and thin. If you compare a same-price laptop from 2012 to a modern Ultrabook, performance-wise there is no difference, but the said Ultrabook will be light as a feather and as thin as it gets.

    These layoffs are probably, as they say, targeting the branches they want to move away from, namely the desktop PC, which is really in a weird situation now: many of the tasks it used to be the way to go for are now handled by laptops, while server equipment is a whole other branch of requirements and needs, so desktops aren't compatible with that either. In short, nobody besides heavy gamers and some specific business needs really requires a desktop nowadays.

  12. #32
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Gaidax View Post
    It does not take much to understand that Desktop PCs are a dying breed, people switch to lean laptops for their day-to-day needs in addition to phones and other gadgets. That trashcan you have under your desk is literally a dinosaur that will be in museum 10-15 years from now.
    I don't believe the desktop PC will go away. I believe it'll take another form, like Steam Boxes or the Apple trash can.
    What Intel did innovate in the last years and you seem not to be caring about - is introduction of fast and snappy laptops that last many many hours and are very light and thin. If you compare same price laptop from 2012 to modern Ultrabook - performance-wise there is no difference, but the said ultrabook will be light as a feather and as thin as it gets.
    Except that laptops are going to run into a brick wall with heat and power. We can't keep shrinking the size of transistors in chips, and at some point we'll reach our limit. Unless we have better methods to dissipate heat and better batteries, the laptop will stop evolving. Tablets will hit this limit sooner, if they haven't already.

    Also, if Intel cared about laptops, then why don't they use CPUs with Iris Pro graphics? Seems like only Apple gets those chips.
    In short, nobody besides heavy gamers and some specific business needs really needs desktops nowadays.
    This is true, but this has always been true. The difference now is that laptops are cheap, and more than powerful enough to do basic tasks. For the average person there's no need to get a desktop. But this means those people don't need to upgrade either. I got a 2010 laptop that still does what I want just fine. When I'm out and about I rarely have enough free time to play games, but if I did then I would just use my home desktop computer to stream a game to my laptop. As much as I could go out and buy an expensive laptop to game on, the prices don't reflect my level of giving a damn. Maybe it's just me, but I think PC hardware in general is overpriced. Especially laptops.

  13. #33
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,865
    Quote Originally Posted by Dukenukemx View Post
    Also if Intel cared about laptops then why don't they use CPUS with Iris Pro Graphics? Seems like only Apple gets those chips.
    They do use them; anyone is free to buy these, they're just expensive, which is why nobody buys them. Dell, for example, uses them in some of their models - the XPS 13 (2016) has Iris as an option, and I think MSI also has some model with it.

    The issue is merely price, not that Intel somehow withholds those chips; for the price difference, most vendors simply go for a dGPU instead, which is the better choice for anything besides making your laptop as slim as possible.

    - - - Updated - - -

    Quote Originally Posted by Dukenukemx View Post
    Except that laptops are going to run into a brick wall with heat and power. We can't keep shrinking the size of transistors in chips, and at some point we'll reach our limit. Unless we have better methods to dissipate heat and better batteries, the laptop will stop evolving. Tablets will hit this limit sooner, if they haven't already.
    This point is still far away, and it is much better to invest in this instead of in immobile cans. If Intel stops pushing mobile for even one second, it will be eaten by ARM fast, which is the one real threat to them now.

    But really, the main point is that the desktop market is simply being slaughtered by mobile, and it is not going to get better, because, well - there is no need for desktops really...
    Last edited by Gaidax; 2016-04-23 at 09:57 PM.

  14. #34
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Gaidax View Post
    This point is still far away, and it is much better to invest in this instead of in immobile cans. If Intel stops pushing mobile for even one second, it will be eaten by ARM fast, which is the one real threat to them now.

    But really, the main point is that the desktop market is simply being slaughtered by mobile, and it is not going to get better, because, well - there is no need for desktops really...
    Nvidia tried to invest in the mobile market, and it kinda didn't work out - unless you look at self-driving cars, where I think Nvidia made it big. But even so, the mobile market is in the same situation. Don't know about you, but I have no reason to go buy a new tablet; not that I use the one I have. I just updated it to a newer version of Android.

  15. #35
    Fluffy Kitten Remilia's Avatar
    10+ Year Old Account
    Join Date
    Apr 2011
    Location
    Avatar: Momoco
    Posts
    15,160
    Nvidia failed in the mobile market because their Tegra chips were shit; they were power hungry and underperforming compared to the competition (Qualcomm) at the time.
    Again, you are not everyone. Mobile was bound to slow down. Not everyone upgrades to every new shiny device that comes out, but it's still one of the markets that appeals to the general public.

  16. #36
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    I don't expect people to dump a tablet for a desktop PC, but I also don't expect people to dump a tablet for a newer and faster tablet. It's just like the PC, in that once you have a tablet, you have no reason to buy another. It's gotten to the point where quad-core ARM chips have 2 faster cores and 2 slower but more efficient cores. There's no real demand for better performance, and therefore no demand for new tablets.

    - - - Updated - - -

    To emphasize my point, here's an article about how smartphones have reached their peak. For Intel to go into that market just means they're moving to a market with a problem similar to PC's. This could change if actual innovation were put into devices. I still remember Firefox's Seabird concept, and that would be pretty awesome to have. VR could also ignite PC gaming, and therefore the CPU market. But until VR headsets are cheaper, and Apple/Samsung begin to think outside the box, the market just won't expand.


  17. #37
    The Unstoppable Force Gaidax's Avatar
    10+ Year Old Account
    Join Date
    Sep 2013
    Location
    Israel
    Posts
    20,865
    Yes, mobile is also slowing down, simply because handsets and tablets are already more than capable of driving the content people usually consume in this format, but it's definitely healthier than desktops, which are on their way to the graveyard.

    Smartphone performance is through the roof already, and things usually slow down because of outside issues such as reception and connection quality, not because the device can't drive the applications fast enough. What should be attacked now is efficiency: better power tech, storage and connectivity. SoC performance? I'd say I'm at the point where I don't really care about a couple of tens of milliseconds more to open some app, but I do care that my phone can't survive the day if I actually use it a lot, and my next purchase will basically be a phone with the same performance as my current one but double or more the battery life.


    Besides that there is still a lot of space to improve with laptops and there is whole huge and lucrative IoT market waiting for grabs.

    As for Intel entering a saturated market, it's all a matter of the product they can deliver; if their solution is exceptional in price/performance/efficiency, people will flock to it in no time. All it will take is a mess-up by Qualcomm like the one with the 810, plus being around with a superior solution, to get the ball rolling. The above example with Nvidia happened simply because the product they introduced was not very good compared to alternatives for day-to-day tasks; it was also pretty expensive.
    Last edited by Gaidax; 2016-04-24 at 02:52 PM.

  18. #38
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Gaidax View Post
    Besides that there is still a lot of space to improve with laptops and there is whole huge and lucrative IoT market waiting for grabs.
    I don't see laptops improving when so many people are going crazy for ultra thin. With so little space for cooling, these devices will overheat and throttle. They already do, especially the Macs.
    As for Intel entering a saturated market, it's all a matter of the product they can deliver; if their solution is exceptional in price/performance/efficiency, people will flock to it in no time. All it will take is a mess-up by Qualcomm like the one with the 810, plus being around with a superior solution, to get the ball rolling. The above example with Nvidia happened simply because the product they introduced was not very good compared to alternatives for day-to-day tasks; it was also pretty expensive.
    The key problem Intel faces is customizability. Anyone can make an ARM SoC, because you can simply go to ARM, buy their designs, and manufacture them however you want. The Samsung Exynos is built exactly from ARM designs. Qualcomm and Apple both have tweaked designs, with their own GPUs. You are free to pick and choose how you want your SoCs made.

    Intel does not do this. They don't even share designs with AMD. Unless Intel decides to do what ARM does, they can't expand. And this is with x86, which is in a very powerful position because so much software is written for it.

  19. #39
    The Lightbringer Artorius's Avatar
    10+ Year Old Account
    Join Date
    Dec 2012
    Location
    Natal, Brazil
    Posts
    3,781
    Quote Originally Posted by Dukenukemx View Post
    I don't see laptops improving when so many people are going crazy for ultra thin. With so little space for cooling, these devices will overheat and throttle. They already do, especially the Macs.
    That's the number one priority for devices that are supposed to be portable and hassle-free. You don't want to put something heavy in your backpack and walk all day with it, and you also don't want something that isn't thin, because more often than not you need to put other things inside your bag. You're thinking too narrowly about this; the problem isn't the lack of space for cooling.

    Look at the Surface Book, for example. Microsoft eliminated the throttling problem (well, it does still power throttle, because the power supply can't deliver enough power when both the CPU and the GPU are trying to work at 100%, but that isn't because of high temperature) simply by putting half of the hardware in the keyboard half and the other half behind the screen. It's a genius design that nobody tried before, and it works magically well.

    The key problem Intel faces is customizability. Anyone can make an ARM SoC, because you can simply go to ARM, buy their designs, and manufacture them however you want. The Samsung Exynos is built exactly from ARM designs.
    The current Exynos, the 8890, actually has a custom design for its big cores, but still uses Cortex-A53 for the small ones - this, plus the Mali GPUs. The 7420 and 5433 had A57 and A53 cores, which are both ARM designs, but the magic isn't in that. The Snapdragon 810 had exactly the same configuration of cores (4xA53 and 4xA57) but performs ~40% worse than the 7420. And besides, SoCs are, like their name suggests, "systems on chip"; the CPU isn't everything inside a SoC, and the architecture plays a big role in performance. AMD and Intel both make x86 CPUs, but that doesn't make them equal; there are a lot of other factors.

    Qualcomm and Apple both have tweaked designs, with their own GPUs. You are free to pick and choose how you want your SoCs made.

    Intel does not do this. They don't even share designs with AMD. Unless Intel decides to do what ARM does, they can't expand. And this is with x86, which is in a very powerful position because so much software is written for it.
    ARM and Intel are different companies: ARM makes designs and sells them; Intel makes designs, has the fabs, makes the CPUs, and sells them. The companies operate in different ways; their business models aren't comparable.

    Intel has little to gain from ARM's strategy by now. It might have been a smart choice in the past, due to the ridiculously small operating cost of a design-only company, but Intel today is a company that owns what, 80% of the CPU market? They can continue doing what they want, and they'll be safe unless MS and Apple decide to ditch x86 in favor of ARM - and they won't in the near future, because Intel is a close partner to both.

  20. #40
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by Artorius View Post
    Look at the Surface Book, for example. Microsoft eliminated the throttling problem (well, it does still power throttle, because the power supply can't deliver enough power when both the CPU and the GPU are trying to work at 100%, but that isn't because of high temperature) simply by putting half of the hardware in the keyboard half and the other half behind the screen. It's a genius design that nobody tried before, and it works magically well.
    As far as I know, it does throttle. If it can't receive enough power to operate at max, which I haven't heard of, then that just proves my other point.



    ARM and Intel are different companies: ARM makes designs and sells them; Intel makes designs, has the fabs, makes the CPUs, and sells them. The companies operate in different ways; their business models aren't comparable.
    IBM with PowerPC did the same thing as Intel, until they changed their business model to do exactly what ARM was doing. That's how we got PowerPC in the 360, PS3, GameCube, Wii, and Wii U. Sadly, PowerPC doesn't seem popular anymore.
    Intel has little to win with ARM's strategy by now, it would've been a smart choice in the past maybe, due to the ridiculously small operating cost of a design only company. But Intel today is a company that owns what? 80% of the CPU market? They can continue doing what they want, and they'll be safe unless MS and Apple decide to ditch X86 in favor of ARM and they won't in the new future because Intel is a close partner for both.
    AMD plans to produce server and desktop ARM CPUs, which could be a problem for Intel. They have 80% of the market, but it's a market share they could lose. Instead of worrying about keeping what they have, they're more concerned with other markets. They shouldn't be - unless they plan to pull the same crap they did with AMD in the past and use questionable business practices.
