  1. #61
    Quote Originally Posted by Vash The Stampede View Post
    Nope, not even close. I must have spent a week setting up Linux Mint 20 on a laptop I'm giving my nephew, and it shouldn't need a week. The problem is that Windows software doesn't play nice in Linux, which is the same problem Mac OS X users run into. For each Windows game I've installed, it took me a few days to get it working properly. Minecraft thankfully just works, because the website has a simple installer. Roblox, though... still doesn't work, and my nephew is addicted to that game. I also spent a good deal of time making Linux Mint 20 look like Windows 10. I have to give myself a lot of credit, because you'll have a hard time telling it isn't Windows. The start button looks like Windows, the sounds the system makes are from Windows, and even the boot and shutdown animations are from Windows. I want him to feel like he's using Windows.

    If I were installing Windows I would have been done in a few hours. No problems installing games and Roblox would actually work. Why would anyone go through the trouble of installing Linux just to have less compatibility with applications and jump through hoops? These are problems that I hope can be fixed eventually.
    Out of curiosity, why are you giving your nephew a machine running Mint when he doesn't seem to want or need a Linux machine, and would want to be using a Windows machine?

    It kind of feels like you're making a point that using a Linux kit is a mistake because you'd rather use a Windows one. Which... well... obviously. If you want to use Windows, you want your machine to look and function like Windows, you want to run Windows apps and you want to run Windows games, then yeah, you're better off using Windows.

  2. #62
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by jellmoo View Post
    Out of curiosity, why are you giving your nephew a machine running Mint when he doesn't seem to want or need a Linux machine, and would want to be using a Windows machine?
    Like the MacBook Air, the machine I gave my nephew is old, and there are reasons to use Linux over Windows in some situations. It has a B960 that doesn't get OpenGL support on Windows 10, which is fine for most applications but not for others. Linux has excellent OpenGL support, and I do get better performance on Linux than on Windows 10 in that regard. Also, no viruses: I don't want to be technical support for this machine if he decides to download random things off the internet and infect it. There's also the Windows 10 update problem: I could disable updates, but that just makes the machine more vulnerable to infection, and if I don't disable auto update, it'll force the laptop to reboot, which can be a problem.

    I thought about putting Linux on the MacBook Air, but I don't get any particular advantages on that machine. OpenGL runs fine enough, and Roblox and Minecraft both work. Apple doesn't force updates on you. You don't have a virus problem like on Windows. Getting Windows games working on it is worse than on Linux, but she only plays Minecraft and Roblox, so it isn't a problem.
    It kind of feels like you're making a point that using a Linux kit is a mistake because you'd rather use a Windows one. Which... well... obviously. If you want to use Windows, you want your machine to look and function like Windows, you want to run Windows apps and you want to run Windows games, then yeah, you're better off using Windows.
    Well yeah, which is why I still use Windows on my main machine. I'm a power user and therefore expect more from my machines than normal people do, but I also spend a lot more time on them. Linux Mint 20 has more problems with dependencies than Mint 19. Mint 20 blocks snapd for ethical reasons, which just makes my life harder. Because I have to have bleeding-edge video drivers, the Oibaf PPA will break some graphics once in a while. Things obviously haven't been getting better. One day it might be as easy and simple as using Windows, but it's not there yet. Even though Valve is 100% committed to Linux and has been working on Wine for years, Wine still sucks. Sucks hard.

  3. #63
    Quote Originally Posted by Vash The Stampede View Post
    The A14X is not coupled with enough storage for AAA gaming.

    Just like that, Apple is at Nvidia 3080+ performance? Okey dokey smokey.

    I'm not mentioning Linux, just others here who want to use it against me.

    So how's Cyberpunk 2077 running on the Apple M1?

    Most debunkers are using apples-to-apples hardware comparisons. You can't do that now with Apple's M1, so we are free to compare it against anything, including AMD's Ryzen.

    This is how I know you don't read my posts. Nothing I've suggested in this thread says that I think Linux is better for most people. I've even linked to Phoronix.com, where I posted an angry rant about the current state of Linux and how it's gone backwards lately.

    Doesn't look like it here. It loses 50% to 75% performance compared to native.

    Umm... 1.5TB isn't enough? This isn't a Cyberpunk forum, this is a WoW forum. Scroll back up and look at my post regarding the next M1X and M2 processors on the horizon.

    My statement is: in its price class, you cannot build better-performing hardware than the M1 Mini right now, with the current prices of GPUs.

  4. #64
    Quote Originally Posted by Vash The Stampede View Post
    Like the MacBook Air, the machine I gave my nephew is old, and there are reasons to use Linux over Windows in some situations. It has a B960 that doesn't get OpenGL support on Windows 10, which is fine for most applications but not for others. Linux has excellent OpenGL support, and I do get better performance on Linux than on Windows 10 in that regard. Also, no viruses: I don't want to be technical support for this machine if he decides to download random things off the internet and infect it. There's also the Windows 10 update problem: I could disable updates, but that just makes the machine more vulnerable to infection, and if I don't disable auto update, it'll force the laptop to reboot, which can be a problem.
    Absolutely fair enough. Even if Linux doesn't end up being his cup of tea, I do believe it's beneficial for people to learn some of it. That being said, even if viruses aren't a worry, you may very well become tech support should he try and mess around a little too much. For fun, leave the admin password on a sticky note one day and see how much trouble he gets into.

    I thought about putting Linux on the MacBook Air, but I don't get any particular advantages on that machine. OpenGL runs fine enough, and Roblox and Minecraft both work. Apple doesn't force updates on you. You don't have a virus problem like on Windows. Getting Windows games working on it is worse than on Linux, but she only plays Minecraft and Roblox, so it isn't a problem.
    My experience with Linux on a MacBook Pro some years back was actually a lot more solid than I thought it would be. It was one of the laptops I've had the fewest out-of-the-gate issues with; the standardized parts really do help. But it's really solving a problem that doesn't exist. The average consumer isn't getting much in the way of benefit outside of perhaps a small performance boost and stability (if so desired).

    Well yeah, which is why I still use Windows on my main machine. I'm a power user and therefore expect more from my machines than normal people do, but I also spend a lot more time on them. Linux Mint 20 has more problems with dependencies than Mint 19. Mint 20 blocks snapd for ethical reasons, which just makes my life harder. Because I have to have bleeding-edge video drivers, the Oibaf PPA will break some graphics once in a while. Things obviously haven't been getting better. One day it might be as easy and simple as using Windows, but it's not there yet. Even though Valve is 100% committed to Linux and has been working on Wine for years, Wine still sucks. Sucks hard.
    I'm not terribly familiar with the latest version of Mint; the last one I tried was, I believe, 18, and even then it wasn't more than a quick spin. I'm the opposite of wanting bleeding edge with Linux. I want rock solid stability. My main machine dual boots Windows and Debian. Debian is for work, Windows is for play. I simply don't try to do things like gaming on Linux. As you mentioned, WINE sucks balls. It's gotten better, Lutris is definitely a help, but the best case scenario is usually along the lines of "the game works perfectly. Well, except for..." and that's pretty telling.

    But I look at an OS as similar to picking the best tool for the job. If Windows is a wrench, maybe Mac OS is a screwdriver, and Linux (as much of a catchall as it is) is a pair of pliers. No one tool suits every single need, and sometimes while you can use one, it isn't ideal. I think for the lion's share of users, the OS actually doesn't matter a whole lot. If your experience is "I use an app to accomplish some task. Then I use another app to accomplish a different task. And so on..." then unless your task is so specific that it needs a super specific app, it doesn't terribly matter. For most people, it's rarely all that specific. But for some, it absolutely is. If you're a video editor trained in Final Cut, you're gonna use a Mac. If you're a video editor trained in Premiere, you aren't touching an M1 Mac with a ten foot pole. You'll want the best Windows machine you can get. If you are a VFX compositor, you won't go anywhere near a Windows machine; you'll hope that there's a Linux machine set up for you, but a Mac will do in a pinch if needed.

    If you want to do dedicated gaming, you absolutely want a Windows machine. Anything else is just flat out inferior. But if you just want to do *some* gaming, then a Mac or Linux kit will do. It's just not the best choice.

  5. #65
    Old God Vash The Stampede's Avatar
    10+ Year Old Account
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,939
    Quote Originally Posted by SigmaShift View Post
    Umm... 1.5TB isn't enough?
    How much is 1.5TB?
    This isn't a Cyberpunk forum, this is a WoW forum.
    Fair enough but I'm responding to the Apple M1's ability to game in general.
    Scroll back up and look at my post regarding the next M1X and M2 processors on the horizon.
    Release them first, then we'll talk. Speculating on stuff that isn't released, where we have zero clue about performance, is a waste of time.
    My statement is: in its price class, you cannot build better-performing hardware than the M1 Mini right now, with the current prices of GPUs.
    Price class or power class? I can easily find laptops that will outperform the M1 for a better price, but not in battery life or weight at the same time. And even if I could find laptops that outperform the Apple M1 devices in every aspect, the typical response is that build quality is better on Apple. I couldn't win even if I were correct.

    Quote Originally Posted by jellmoo View Post
    Absolutely fair enough. Even if Linux doesn't end up being his cup of tea, I do believe it's beneficial for people to learn some of it. That being said, even if viruses aren't a worry, you may very well become tech support should he try and mess around a little too much. For fun, leave the admin password on a sticky note one day and see how much trouble he gets into.
    He won't know it's not Windows and he won't need the password unless he tries to do something he shouldn't.
    My experience with Linux on a MacBook Pro some years back was actually a lot more solid than I thought it would be. It was one of the laptops I've had the fewest out-of-the-gate issues with; the standardized parts really do help. But it's really solving a problem that doesn't exist. The average consumer isn't getting much in the way of benefit outside of perhaps a small performance boost and stability (if so desired).
    The thing here is you gotta know the limitations of each platform. Windows vs Mac vs Linux all have their pluses and minuses, and you have to accept them if that's your platform of choice. I cannot in good conscience tell someone to use Linux. At the same time, I can't recommend Mac OS X either. Unless the user knows exactly what they want out of a Mac, I couldn't recommend it. The fact is Windows does handle a lot of things rather well, with no real downsides other than Microsoft watching you. As long as your hardware is supported, Windows 10 is the way to go. I mean actively supported, meaning your graphics card got a driver update this year, not like 5 years ago.
    As you mentioned, WINE sucks balls. It's gotten better, Lutris is definitely a help, but the best case scenario is usually along the lines of "the game works perfectly. Well, except for..." and that's pretty telling.
    The problem with Lutris is that a lot of games work right out of the box in Steam, and Lutris just lets Steam handle those. Which is fine if you have the Steam version, but obviously not everyone has the Steam version. There are also far too many Wine versions, because there are far too many patches needed to get games working on Linux. Mac, though, has no Proton, which makes getting Windows games to work on Mac much worse than on Linux. On Linux I can just download Proton-GE, place it in the /opt folder, run Wine off that, and it'll play games that require Proton. No Vulkan support means no DXVK, which means good luck. Valve is responsible for MoltenVK, but there's just too much going on that needs Vulkan to make Proton work on Mac. Apple deprecated OpenGL, which means no WineD3D. There's also no equivalent of the direct-to-kernel syscall support that was implemented in Linux, which is why some games with DRM don't work: they make direct calls to the kernel. This wasn't a big problem for Mac users because they had Boot Camp, but that's not available on the Apple M1 and probably never will be. Parallels would be a perfectly good solution if you don't mind the performance loss, because Parallels is a virtual machine, and virtual machines lose a lot more performance.
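
    For anyone curious what that Proton-GE step looks like in practice, here is a minimal Python sketch of the "download it and drop it somewhere you can run Wine from" workflow. It assumes the GloriousEggroll proton-ge-custom GitHub releases ship a .tar.gz asset and that /opt is writable; the repo name, asset layout and paths are assumptions for illustration, not something spelled out in the post.

        #!/usr/bin/env python3
        # Sketch: fetch the latest Proton-GE release and unpack it under /opt,
        # mirroring the "place it in the /opt folder and run Wine off that" idea.
        # Assumed repo/asset layout; run as a user that can write to /opt.
        import json
        import tarfile
        import urllib.request
        from pathlib import Path

        API = "https://api.github.com/repos/GloriousEggroll/proton-ge-custom/releases/latest"
        DEST = Path("/opt")

        def latest_tarball_url() -> str:
            # Ask the GitHub API for the newest release and pick its tarball asset.
            with urllib.request.urlopen(API) as resp:
                release = json.load(resp)
            for asset in release["assets"]:
                if asset["name"].endswith(".tar.gz"):
                    return asset["browser_download_url"]
            raise RuntimeError("no .tar.gz asset found in the latest release")

        def install(url: str) -> None:
            archive = DEST / url.rsplit("/", 1)[-1]
            urllib.request.urlretrieve(url, archive)   # download the tarball
            with tarfile.open(archive) as tar:
                tar.extractall(DEST)                   # unpacks e.g. /opt/GE-Proton.../
            archive.unlink()                           # tidy up the archive

        if __name__ == "__main__":
            install(latest_tarball_url())
            # The unpacked tree contains files/bin/wine, which is what the post
            # means by "run Wine off that" for non-Steam games.

    (If you want Steam itself to offer the build as a compatibility tool instead of running the Wine binary directly, the usual destination is ~/.steam/root/compatibilitytools.d/ rather than /opt.)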

    Valve is really behind most of the work done on Linux gaming for the past 5 years. Valve is also responsible for MoltenVK on Mac, but to me it's clear that Valve kinda gave up on Mac. They could port Proton if they wanted to, and they could forgo Vulkan for a Metal D3D wrapper. Getting games to work on Linux is super difficult, but getting them to work on Mac OS X through Wine is impossible. If Apple supported Vulkan, that could turn things around, and they really should.

  6. #66
    Scarab Lord bergmann620's Avatar
    10+ Year Old Account
    Join Date
    Dec 2010
    Location
    Stow, Ohio
    Posts
    4,402
    So...

    I haven't posted here in what feels like a decade, but...

    I bought the M1 MacBook Air version with the 8-Core GPU. I pre-ordered it for $1,299.

    I like hardware. My work laptop is a 13" Surface Book 2. My home daily driver when I bought the Air was an Alienware 17 R4, the maxed-out one with the GTX 1080 and 32GB RAM. I've only owned older hardware on the Mac side: a couple of 8+ year old MacBook Airs and an 11-year-old iMac. I also have a 2017 Surface Pro and a 2020 iPad Pro 12.9".

    I work as Director of IT for a small manufacturer.

    My daily workload primarily consists of tons of e-mail and browsing, creating basic presentations, monitoring our hardware assets, productivity software, and some low-rent content creation (photo and video).

    My (PC) gaming primarily consists of WoW, Heroes of the Storm, and some occasional Forza.

    I'll start with the obvious:

    The MBA, in any configuration, past or present, is not intended as a gaming machine. On my AW laptop, I could run WoW at 4K at ~75 FPS with most things maxed. On the MBA, I'm running at ~1600p at ~50 FPS with most things medium to high.

    HotS runs pretty well on the Air, but it's certainly not maxed out at 150 FPS like on the AW.

    I haven't tried, but I doubt I can fire up Forza unless I'm streaming from the One X.

    Comparing it to the two Intel "Thin and Light" machines I have access to, it is significantly faster as a gaming machine for the games that run on it.

    M1 destroys Intel at basically every level with regard to iGPU.

    There are two hiccups with the 1st-gen M1: exporting to a display is tricky if you want to connect at 4K60 (I haven't gotten that working yet), and there is no support for eGPUs.

    As a productivity machine...

    Unless there is a specific piece of software that is 100% incompatible, or you need Clydesdale-like workstation power, this thing absolutely spanks ASS. It is faster at every single task I've thrown at it. Faster on cold boot and resume, faster opening apps, faster running apps, faster transfer speeds off my USB-C external drive where I keep my Lightroom catalogue, faster rendering image edits, faster rendering 4K video.

    I'm sure there are some software compatibility issues somewhere, but I have not come across one in the month+ I've been using it. And this is moving from a primarily PC workflow. This thing combines the speed and immediacy of working on an iPad Pro with the much greater software selection of MacOS.

    I have not opened my Surface Book in at least 10 days.

    Also... I had a rare in-office day last week. Forgot my charger. Got home after a 10-hour day at 38%. My SB has solid battery life, but would have died in the late afternoon.

    When you combine all of the above with the ecosystem edge (instant hotspot connections to your phone, texts/calls on the laptop/iPad/Watch/phone, the laptop running iOS apps, ease of AirPods use, etc.), every PC device between ~$699 and $1,500 that is not gaming-dedicated should be feeling the "I'm in danger" GIF.

    There's something I've always said about the iPhone vs Android debate:

    iPhone tells you what is going to suck about your phone, whereas Android lets you decide what is going to suck about your phone.

    It's also true in the laptop / cheap desktop realm. In either case, 99% of the folks who aren't buying the absolute cheapest hardware available are better off letting Apple make that decision for them, especially with the M1 out.

  7. #67
    Quote Originally Posted by zmuci View Post
    I've seen a couple of YouTube videos where it runs really smoothly, but mostly in the air and in low-population places. I would love to see its performance in a raid or an epic battleground.
    It runs well. Set to 50% render scale and no AA, it's great. I doubt you will get this performance from ANY ultrabook or mGPU on the market.

  8. #68
    Bloodsail Admiral m4xc4v413r4's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Home
    Posts
    1,075
    Quote Originally Posted by loadedaxe View Post
    I play on mine at 1440p. I really don't pay attention to my FPS as I do not lag, but then again I have mine set on preset 5. When I get home and play tonight I will, and I'll repost and show my settings; I did have to play with them, but I don't remember what all I had to do. I also play SC2, and in maps with 4 players I have to run on low; with 2 players I can run on medium. It's not awesome, but you can play and have a decent experience.

    There are many videos on the M1 Mini.

    Here are a few.



    Damn, that guy in the first video is the definition of cringe to me... He tries so hard to make it look like the system he just spent a ton of money on is doing great running WoW, but he puts everything that actually uses any performance to the minimum and attempts to excuse that with the typical "I don't really like it / it's not necessary" bullshit.

    Over €800 (not counting any peripherals, obviously) to get like 30 FPS at setting 10 in an empty area looking at the ground... Jesus, people are delusional.
    Like, if a computer does what you need it to do, it's fine, I have no problem with that, but don't try to "sell it" as some amazing thing while showing that it clearly is not.

  9. #69
    Scarab Lord bergmann620's Avatar
    10+ Year Old Account
    Join Date
    Dec 2010
    Location
    Stow, Ohio
    Posts
    4,402
    Quote Originally Posted by m4xc4v413r4 View Post
    Damn, that guy in the first video is the definition of cringe to me... He tries so hard to make it look like the system he just spent a ton of money on is doing great running WoW, but he puts everything that actually uses any performance to the minimum and attempts to excuse that with the typical "I don't really like it / it's not necessary" bullshit.

    Over €800 (not counting any peripherals, obviously) to get like 30 FPS at setting 10 in an empty area looking at the ground... Jesus, people are delusional.
    Like, if a computer does what you need it to do, it's fine, I have no problem with that, but don't try to "sell it" as some amazing thing while showing that it clearly is not.
    Spent a ton of money on? The Mini is less than the cost of an enthusiast GPU and power supply.

    I'm not in the guy's mind, so I don't know about what settings he really cares about, but I know lots of people who turn specific effects off to get an edge.

    And it is amazing, just not in the way you're thinking. It's amazing that you can get a solid WoW experience out of a box smaller than a brownie tin, and at that price point. It's not amazing compared to a $1,200 gaming tower, but it's competitive with a lot of stuff in its price range, and I can't find anything close in that form factor.

  10. #70
    Quote Originally Posted by m4xc4v413r4 View Post
    Damn, that guy in the first video is the definition of cringe to me... He tries so hard to make it look like the system he just spent a ton of money on is doing great running WoW, but he puts everything that actually uses any performance to the minimum and attempts to excuse that with the typical "I don't really like it / it's not necessary" bullshit.

    Over €800 (not counting any peripherals, obviously) to get like 30 FPS at setting 10 in an empty area looking at the ground... Jesus, people are delusional.
    Like, if a computer does what you need it to do, it's fine, I have no problem with that, but don't try to "sell it" as some amazing thing while showing that it clearly is not.
    Is the Mac Mini amazing as a gaming machine? Of course not. Nobody in their right mind should be thinking "hey, the best $700 I can spend for a gaming machine is a Mac Mini!"

    But all accounts seem to indicate that the M1 Macs are amazing machines "at what they are built for". There is more to computing than just gaming. When it comes to things like productivity, coding and content creation, these machines seem to be a phenomenal choice. People are finding that you absolutely can game on them, but that certainly doesn't mean they are an ideal choice for gaming. Anyone who is saying so is either twisting the narrative to suit their needs or flat out lying.

  11. #71
    Setting 10 is pointless anyway. It crushes framerates for no perceptible visual gain.

    Setting 7 is the spot where it really matters. And it'll run at ~60 FPS (with the usual predictable dips) at those settings, with mostly medium-to-high GPU-bound things (textures, water, etc.), which is fine.

    High-end gaming it is not, but as I and others have already said many times: it's fine. It's not a gaming machine. But you can do some light gaming on it.

  12. #72
    Oh my. I just got my M1 and it's melting my 10700K PC and both i9 MacBook Pros.

    This almost seems like the end of CISC x86 CPUs! How the fuck is anyone going to compete with 5-watt M1 CPUs that melt 125-watt competition?

    Again, Apple takes something that exists and perfects it. Colour me impressed, again.

  13. #73
    Quote Originally Posted by nocturnus View Post
    Oh my. I just got my M1 and it's melting my 10700K PC and both i9 MacBook Pros.
    For... what? Browsing the web, watching videos, etc.? My $250 Chromebook here does all those things just as fast as my PC or MBP, because those don't require real horsepower.

    This almost seems like the end of CISC x86 CPUs! How the fuck is anyone going to compete with 5-watt M1 CPUs that melt 125-watt competition?
    Because they don't. My 6-core, no-hyperthreading Core i5 8600K @ 5GHz absolutely takes the M1 out and makes it bite the curb in any benchmark you care to name (and the M1 is not 5W; the GPU portion alone is up to 12W when not thermally constrained in the Air). And it uses more like 70W under torture load. A 10900K will absolutely obliterate an M1.

    And that's not exactly a super modern chip (considering the 10600K has HT and higher clocks for less energy, and Rocket Lake is shaping up to be 18-25% IPC gains for the same power).

    And no one seriously gives one single fuck about power consumption on the desktop. Wow, the M1 is a ~24W chip under full torture load (both GPU and CPU cores maxed out). EAT THAT, INTEL! Except... that's totally fucking irrelevant to a desktop. If I can get the same or better performance, I don't care if it uses more power.

    In the mobile space, Intel's Tiger Lake already performs EXTREMELY competitively with the M1, with a 4-core/8-thread, 24W part that comes in JUST shy of the M1's performance while running at extremely limited clocks (that 4.x GHz boost on the Tiger Lake parts looks impressive until you realize they stay there for under 10 seconds; the max sustained all-core on most of those machines is somewhere in the 2.5GHz range, LOWER sustained clocks than the M1... and yet it still comes within 15% of the performance). And 8-core TL parts are on their way in just a month or two.

    Yeah, the GPU in the M1 is quite a bit better than the Xe graphics in the Tiger Lake parts... but so what? It's like lipstick on a flying bull pig. No one using an ultraportable gives a single fuck about GPU performance. None. No use case for an ultraportable does work that requires GPU muscle. And Intel can simply increase the number of Xe cores on the chips in the future (when they released Tiger Lake, the M1 wasn't really on anyone's radar) if for some reason that's even necessary.

    And AMD's upcoming Ryzen 5000 mobile chips appear to be poised to punch the M1 right in the throat: better performance from six-core parts (with no SMT) that pull 20W and have Navi GPU cores. It's far from "over".

    Again, Apple takes something that exists and perfects it. Colour me impressed, again.
    It's far from perfect. (The unified memory and total inability to upgrade one goddamn thing is a pretty serious fucking problem.)

    It's very good. It's (currently) better than the competition in a very narrow set of parameters that really don't matter to 90%+ of the market. Even if it merely performed "as good", it would still be a no-brainer for Apple, because it increases their profit margin and reduces their reliance on outside dev streams and developments.

    It's a great start. Future AS ARM Macs may have quite a bit more muscle (I wouldn't actually expect that, given that RISC scales EXTREMELY poorly in clock speed vis-a-vis power consumption), at least in terms of performance with multi-core-aware applications (because there's very little stopping Apple from just gluing 20 of these cores down and calling it a day). But it will ALWAYS lag in single-core performance unless they want to go super hot on these things. Even POWER chips, in their 10th, soon to be 11th, generation have serious trouble going past 3.2-3.5GHz at manageable power consumption, AND it requires shutting off their SMT, which is the entire POINT of POWER chips (SMT8). Apple has provided not one single iota of evidence they've overcome that RISC limitation.
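
    For context on the clock-versus-power trade-off being argued here, the standard CMOS dynamic-power relation is the usual back-of-the-envelope sketch; note it applies to any core, RISC or CISC alike, and isn't specific to Apple:

        P_{\mathrm{dyn}} \approx \alpha \, C \, V^{2} f,
        \qquad V \text{ rises roughly with } f \;\Rightarrow\; P_{\mathrm{dyn}} \sim f^{3}\ \text{(roughly)}

    where alpha is the activity factor, C the switched capacitance, V the supply voltage and f the clock. So pushing sustained clocks up by around 30% costs roughly double the dynamic power, which is the wall every vendor hits regardless of instruction set.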

    Again, for anything that is heavily multithreaded, Apple can just throw down MOAR COARZ! For anything that isn't, though, Intel and AMD (and CISC) are far from dead. Single core still matters A LOT. Tons of things simply CANNOT be multithreaded extensively.

    Can you expect an M2 or whatever in an iMac with like 12 performance cores and 4-8 low-power cores and like 24 GPU cores that does "very well" vis-a-vis an 11700K or Ryzen 5700? Sure. But it's not going to just "smoke" them. Not at reasonable prices, at least. I mean, sure, if you crammed like 128 GPU cores in there and like 64 performance cores... it'll smoke Intel or AMD consumer chips. But then you should be comparing it to, say, Threadripper or Xeon Gold or EPYC.

    Again, for those in the back: M1/Apple Silicon ARM is very good, especially for a freshman effort. I'm not saying otherwise.

    But it is not perfect, it's not for everyone, and it isn't some sea change. They launched when both Intel and AMD were between major chip versions, and in particular when Intel is still finding its way out of a blind alley it got itself stuck in (much like AMD before the Ryzen turnaround). The stuff coming out in response from both vendors appears to be EXTREMELY competitive (higher-end "2nd gen" Tiger Lake parts, Ryzen 5000 Mobile).

    Honestly, that's great. More competition only helps.

    But it ain't perfect, it's not hookers and sunshine, and it's not the second coming.

    It's just... good.

  14. #74
    Quote Originally Posted by Kagthul View Post
    Yeah, the GPU in the M1 is quite a bit better than the Xe graphics in the Tiger Lake parts... but so what? It's like lipstick on a flying bull pig. No one using an ultraportable gives a single fuck about GPU performance. None. No use case for an ultraportable does work that requires GPU muscle. And Intel can simply increase the number of Xe cores on the chips in the future (when they released Tiger Lake, the M1 wasn't really on anyone's radar) if for some reason that's even necessary.
    Just want to cherry-pick one quick thing: the above isn't entirely true. GPU performance can be important for content creators on the go. While great CPU performance is usually more important for things like video editing, aspects of software like Premiere Pro (Lumetri springs to mind) can be GPU intensive. For somebody covering an event and making a video away from their main machine, having solid capability there is a definite boon.

    It is absolutely a niche use case, no doubt, but it is there. Now, whether a different machine that is more easily upgraded and contains a more respectable amount of RAM would be better for the task regardless is another topic altogether.

  15. #75
    Quote Originally Posted by jellmoo View Post
    Just want to cherry-pick one quick thing: the above isn't entirely true. GPU performance can be important for content creators on the go. While great CPU performance is usually more important for things like video editing, aspects of software like Premiere Pro (Lumetri springs to mind) can be GPU intensive. For somebody covering an event and making a video away from their main machine, having solid capability there is a definite boon.

    It is absolutely a niche use case, no doubt, but it is there. Now, whether a different machine that is more easily upgraded and contains a more respectable amount of RAM would be better for the task regardless is another topic altogether.
    I'll stand by my "no one using an ultraportable".

    Content creators on the go use pro-level machines if they care at all about actually getting work done. Which is why I firmly expect the larger upcoming AS/ARM MacBook Pros to have more powerful chips across the board (more performance cores, more GPU cores) for precisely that reason. And that will put them up against mobile "work" laptops like the Razer Blade Pro, etc., which are similar in size, weight and features to the larger MBPs, and have discrete GPUs that can and will compete well with whatever they stuff into a larger MBP, I'd be willing to bet.

    Battery life would still be a clear spot where Apple's going to win, though. But most traveling professionals I know (I know a ton of photogs due to being involved in the convention industry) are either always at spots where they can plug in (convention centers) or, if they are out on a shoot, have a generator (Honda's 1000W and 1500W ultra-quiet models are very popular) because they also have to run lights, etc. So I'm not sure it's a 100% selling point.

    The 13" MBP is actually a really odd duck. It doesn't perform enough better than the MBA to even justify existing, IMO. Its basically... a MBA with more ports, and the Touchbar that few people care about (and no option not to have it). They should have just had a 3rd MBA SKU with more ports and called it a day, and had ALL the MBPs have a 6 or 8 Performance Core + larger GPU variant of the M1 to at least justify their existence.

  16. #76
    Quote Originally Posted by Kagthul View Post
    I'll stand by my "no one using an ultraportable".

    Content creators on the go use pro-level machines if they care at all about actually getting work done. Which is why I firmly expect the larger upcoming AS/ARM MacBook Pros to have more powerful chips across the board (more performance cores, more GPU cores) for precisely that reason. And that will put them up against mobile "work" laptops like the Razer Blade Pro, etc., which are similar in size, weight and features to the larger MBPs, and have discrete GPUs that can and will compete well with whatever they stuff into a larger MBP, I'd be willing to bet.

    Battery life would still be a clear spot where Apple's going to win, though. But most traveling professionals I know (I know a ton of photogs due to being involved in the convention industry) are either always at spots where they can plug in (convention centers) or, if they are out on a shoot, have a generator (Honda's 1000W and 1500W ultra-quiet models are very popular) because they also have to run lights, etc. So I'm not sure it's a 100% selling point.

    The 13" MBP is actually a really odd duck. It doesn't perform enough better than the MBA to even justify existing, IMO. It's basically... an MBA with more ports and the Touch Bar that few people care about (and no option not to have it). They should have just had a 3rd MBA SKU with more ports and called it a day, and had ALL the MBPs get a 6- or 8-performance-core + larger-GPU variant of the M1 to at least justify their existence.
    I absolutely do not disagree with what you're saying. But, just pointing out that right now, content creators are using those machines. M1 Macs are filling that role with content creators seeking mobility. There absolutely will be better options when the beefier Pros come out, and a lot of them will switch, but as of right now, they are seeing use.

    And yeah, the current Pro has the benefit of battery life and better cooling, but the Air performs so close that it may as well not matter. It is very much an odd duck. But for content creators that use, say, Final Cut, the M1s are a compelling choice for when they are on the go.

  17. #77
    Quote Originally Posted by jellmoo View Post
    I absolutely do not disagree with what you're saying. But, just pointing out that right now, content creators are using those machines. M1 Macs are filling that role with content creators seeking mobility. There absolutely will be better options when the beefier Pros come out, and a lot of them will switch, but as of right now, they are seeing use.

    And yeah, the current Pro has the benefit of battery life and better cooling, but the Air performs so close that it may as well not matter. It is very much an odd duck. But for content creators that use, say, Final Cut, the M1s are a compelling choice for when they are on the go.
    No no no, YouTube/TikTok content creators are not content creators. If you're not using two Titans with Mocha and After Effects to produce Avengers-level visuals, you're NOT a content creator.

  18. #78
    I got my M1 today and it feels on par with an i9-10900K + RTX 3070 combo, and for 15W that's amazing. 120+ FPS in WoW.

  19. #79
    Quote Originally Posted by Kagthul View Post
    For... what? Browsing the web, watching videos, etc.? My $250 Chromebook here does all those things just as fast as my PC or MBP, because those don't require real horsepower.
    For compiling (and it doesn't even support Java properly yet), video editing, and basically everything that requires raw CPU power. It's outperforming basically every high-end CPU there is, without the software even having been properly optimized yet.

    But hey, don't take my word for it, you meme:









    And the other 100 videos of jaw-dropped people.

    Fact of the matter is that this chip is amazing and pretty much a revolutionary step. I predict it's the first step away from the horribly inefficient CISC toward RISC, and it's about fucking time.

    I'm sure Intel's kicking itself even harder than when they refused to collaborate on the iPhone.

  20. #80
    Bloodsail Admiral m4xc4v413r4's Avatar
    10+ Year Old Account
    Join Date
    Aug 2009
    Location
    Home
    Posts
    1,075
    Quote Originally Posted by bergmann620 View Post
    Spent a ton of money on? The Mini is less than the cost of an enthusiast GPU and power supply.

    I'm not in the guy's mind, so I don't know about what settings he really cares about, but I know lots of people who turn specific effects off to get an edge.

    And it is amazing, just not in the way you're thinking. It's amazing that you can get a solid WoW experience out of a box smaller than a brownie tin, and at that price point. It's not amazing compared to a $1,200 gaming tower, but it's competitive with a lot of stuff in its price range, and I can't find anything close in that form factor.
    For the same cost you get better performance. The cost of an "enthusiast GPU" isn't the objective, so why are you even trying to make that stupid comparison? And about the settings: the guy is doing a benchmark. Nobody cares what settings he likes to turn down; he can do that for himself, not while he's trying to show off his computer by benchmarking a game. Everything else you said just reads like you didn't even read my comment, so go read it first instead of skipping everything after the first sentence.

    Quote Originally Posted by jellmoo View Post
    Is the Mac Mini amazing as a gaming machine? Of course not. Nobody in their right mind should be thinking "hey, the best $700 I can spend for a gaming machine is a Mac Mini!"

    But all accounts seem to indicate that the M1 Macs are amazing machines "at what they are built for". There is more to computing than just gaming. When it comes to things like productivity, coding and content creation, these machines seem to be a phenomenal choice. People are finding that you absolutely can game on them, but that certainly doesn't mean they are an ideal choice for gaming. Anyone who is saying so is either twisting the narrative to suit their needs or flat out lying.
    Read my entire post; I'm not sure what part of it contradicts the machine being good for something that isn't gaming. The last part is pretty clear about that.

    You two couldn't stop yourselves from showing that you're clearly Apple fanboys. You raged at the very sight of a fact attacking your favorite brand and didn't even realize that I never said what you're trying to attack me with; in fact, I clearly stated that the comment was about gaming, since the video was about benchmarking a game. Whether the machine is good for anything else is completely irrelevant; no one is talking about "something else". The entire thread and the video I commented on are about WoW performance on these machines. End of discussion.
