  1. #21
    I have mine on preset level 5 for open-world questing with shadows on Fair. It averages about 50 FPS. If I drop the render scale to 1080p it gains about 5-7 FPS. Addons may make a difference; I haven't tested without them.

    In raids and BGs you have to be on preset 2 to get an acceptable experience, and even then it lags in some areas.

    I have the M1 Mini for one reason: work. Can it play? Yes. Is it awesome for gaming? No. If the OP is buying the M1 Mini for work needs, price, portability, etc., then whether it's a good buy is up to him. If I were spending ~$700-1000 to play WoW on, I would definitely do something different.

    "Beware of false knowledge; it is more dangerous than ignorance."





  2. #22
    Pandaren Monk Zoibert the Bear's Avatar
    Join Date
    Nov 2010
    Location
    Basque Country, Spain
    Posts
    1,928
    Wait, you guys play at sub-165 FPS at ultra graphics @1440p? Must feel like 2001.

  3. #23
    Old God Vash The Stampede's Avatar
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,865
    Quote Originally Posted by Thoriangun View Post
    No

    Have you seen Final Cut performance vs Adobe's piece-of-shit garbage? I hope they suffer in real use.

    Final Cut on M1 destroys Adobe's (may they burn in hell) garbage.

    If anything, Apple has now highlighted what their desktop software can do vs others at its full potential. Final Cut was already better than Adobe's garbage on Intel. This is not a wake-up call to Intel or AMD/Nvidia; it's a wake-up call to every creative software suite out there, and to how they need to be updated properly to be properly utilised.
    Ok... so we're comparing Adobe on Intel to Final Cut on Apple M1? Why should I care? Apple software works really well on Apple hardware; should I be surprised? My point is that, as of right now, the ecosystem for the M1 is very limited. macOS was already limited, but now it'll be even more limited. OpenGL isn't up to date on Apple, and you have to use the Metal API to get anything modern on it. How many developers do you think will bother to target an ARM-based Mac with the Metal API on a product that has about 16% of the desktop market share? While Apple does make good CPUs, they aren't a GPU manufacturer. Apple broke up with Imagination in 2017 and poached some of their engineers to produce Apple-designed GPUs. They couldn't do it themselves, so they basically rehired Imagination and paid for some of their patents. The M1 can only play Shadowlands at medium settings, so how will they compete with AMD and Nvidia? Intel is now entering the GPU market and has some big talent working on it as well.

    If you wanna use the Apple M1 for Final Cut Pro then sure, go ahead. Don't expect to game on it, though. Don't expect a lot of old Mac x86 or Windows x86 apps to run fast, or at all, on the Apple M1. Don't ask why Cyberpunk 2077 doesn't work on the Apple M1. Don't ask why your $1300 MacBook with 256GB can only install one game, and you can't upgrade the storage either. The Apple cross-compiled app is just a fat binary, which means Apple wanted developers to build for both ARM and x86 a few years ago. Not everyone does that. Not on Linux, not on Android, and not on Mac. You are a beta tester, and you'll eventually find out what you'll have to go through to get other apps working on it. Like this guy.

  4. #24
    Quote Originally Posted by Vash The Stampede View Post
    My point is that as of right now the ecosystem for the M1 is very limited.
    Not really. Almost everything that worked before works now. Very few things simply fail to run, and most run about as fast as they used to: while they aren't running as fast as they would as native ARM binaries, the M1 is already quite a bit faster than the chips it replaced (in the Air particularly, the Intel chips were thermally and power limited to hell), so the experience is still better.

    MacOSX was already limited but now it'll be even more limited.
    You keep using that word but never provide any examples. While there are definitely areas where macOS is more limited (gaming!), as a work/daily-driver/professional's machine it's just as capable as Windows and more capable than Linux and ChromeOS.

    OpenGL isn't up to date on Apple, and you have to use Metal API to get anything modern on it. How many developers you think will bother to utilize ARM based Mac with Metal API on a product that has like 16% of the desktop market share?
    Just as many as did when Macs ran on Motorola CPUs and PowerPC CPUs. Or more, because Apple's slice is quite a bit larger now than it was then. And while that's "16% of the desktop market" (which inexplicably counts laptops...), it's more like 35% of the laptop market.

    While Apple does make good CPUs, they aren't a GPU manufacturer. Apple broke up with Imagination in 2017 and poached some of their engineers to produce Apple-designed GPUs. They couldn't do it themselves, so they basically rehired Imagination and paid for some of their patents. The M1 can only play Shadowlands at medium settings, so how will they compete with AMD and Nvidia? Intel is now entering the GPU market and has some big talent working on it as well.
    Errr... OK? Big corporation buys the tech they need to do it in-house; news at 11. Apple wasn't a chipmaker either, until they bought out and hired the people to do it. And actually, there's some pretty compelling evidence that their workstation chips will have 80+ GPU cores of the type in the M1, clocked significantly higher. Now, we have no idea how that will actually perform, but to say it's never going to happen is just stupid.

    Now, will those GPUs be good for gaming? Not terribly likely, because Apple doesn't care. But you keep shifting back and forth, willy-nilly, between looking at this solely as "Macs suck at gaming" (which anyone with a brain in here knew already) and "Macs suck in general and are less capable as basic computers". One does not support the other.

    If you wanna use the Apple M1 for Final Cut Pro then sure, go ahead. Don't expect to game on it though. Don't expect there to be a lot of old Mac X86 or Windows x86 apps that'll run fast or at all on the Apple M1.
    Almost everything runs just as fast as it did before because of how thermally constrained the MacBook Air and Pro were prior to this. So... what? And you think they're not going to have more thermal headroom (and room for the power draw to kick up clocks) in an iMac (ARM) or Mac Pro (ARM)? Again: gaming? No, because Apple doesn't care to support it well enough. But general performance? The number of x86-64 apps that don't just fire right up with no problems is quite low. (And your later link is hilarious; he's trying to run an emulator through an emulator to run an emulator? WTFM8. Talk about a niche problem.) Same as it was when Rosetta 1 was a thing and PowerPC apps ran just fine on Intel.

    Don't ask why Cyberpunk 2077 doesn't work on the Apple M1. Don't ask why your $1300 Macbook with 256GB can only install one game on it,
    So you're aware, you had the price wrong this whole time. The cheapest ARM Mac is $999, and $899 if you're a student or can reasonably pass yourself off as one (they rarely check). I'm not saying the 256GB of storage you can't upgrade is a great thing, 'cause it isn't, but it's also the norm in other PC ultralights. LG Gram? You're not upgrading the guts in that. You're comparing a 3lb MBA to a 7lb chonktastic plastic fuckbrick of a PC "laptop". They aren't comparable products. You can't upgrade most ultraportables, regardless of manufacturer.

    and you can't upgrade the storage either. The Apple cross compiler is just a Fat Binary, which means Apple wanted developers to build for both ARM and X86 a few years ago. Not everyone does that. Not on Linux,
    LOLWUT? Linux devs don't develop RISC binaries? Are you mentally damaged? Linux runs on an order of magnitude more RISC machines than x86 machines. Almost the entirety of Big Iron is POWER.

    Android doesn't develop RISC binaries? Considering that almost every Android phone (and the iPhone) is ARM-based... what are you even blubbering about? The number of phones that run on x86 chips is microscopic (though they do exist; my son's old ASUS ZenFone was Intel). You don't even have to recompile Android apps; it's all translated at the system level (which is what Rosetta 2 is doing). There is no app that will run on an ARM phone that won't run identically on that ZenFone. Not one.

    not on Android, and not on Mac. You are a beta tester and you'll find out eventually what crap you'll have to go through to get other apps working on it. Like this guy.
    Since Apple's already done one of these transitions, and I lived through it without really even noticing...

    Don't think you're on the right track here.

    Now, again, I'm not saying the M1 and ARM Macs are the second coming. Merely that they are very good, and that Apple can survive just fine on a separate architecture from everyone else, as they did for 20+ years running on Motorola's 68000-series chips and then IBM/Motorola's PowerPC chips.

    Will gaming be a thing on Mac? No more than it ever was. Boutique porting houses will still port games they feel will make enough money, and some AAA devs will still port to ARM Mac (because it's really not that hard, and Apple's tooling does a pretty good job of translating into Metal, and is even better with Vulkan, not that hardly anything uses Vulkan) if they feel they can make a buck on Mac versions.

    But it will still be third tier gaming (after PCs and Consoles) just like it always was.

    But for productivity/work/using your machine for things that aren't just gaming, it'll be fine. Not amazing, not better than sliced bread, but also not shit or a dumpster fire. It'll be... fine.
    Last edited by Kagthul; 2021-01-05 at 08:06 AM.

  5. #25
    Quote Originally Posted by kukkamies View Post
    Your post looks so emotional instead of thought through, hehe. It could help to calm down a bit, write better sentences, and think about the post.
    See, that fits your agenda. Mine is simpler: not a native speaker, and my native language is very different from English, hehe.
    I have to say, though I normally have Vash and idev on ignore because they rarely post anything worthwhile, this was worth hitting "view post" to watch them spew back and forth.
    This is vegan levels of sadness: HOW WILL THEY KNOW I IGNORE THEM?! I. MUST. TELL. THEM. That's not how ignoring fucking works. And even more sadness: why did you even bother ignoring if you still suck at it and still view posts? That's a goddamn extra click. Jesus.


    Just screenshot your ignore list and put it in your sig, Jesus, man; easier way to jerk.
    Last edited by ldev; 2021-01-05 at 09:15 AM.
    My nickname is "LDEV", not "idev". (both font clarification and ez bait)

    yall im smh @ ur simplified english

  6. #26
    Quote Originally Posted by Zoibert the Bear View Post
    Wait, you guys play at sub-165 FPS at ultra graphics @1440p? Must feel like 2001.
    Not sure what you're using; my 10900K @ 5.3 GHz + RAM @ 4600 MHz + RTX 3080 gets me:
    1080p, preset 7: 444 FPS average (397 FPS 0.1% low) in dungeons

    https://www.3dmark.com/spy/16080271

    And I will still see sub-60 FPS in some situations.
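    For anyone wondering what a "0.1% FPS" figure actually measures: it's the average FPS over the slowest 0.1% of frames, which is why it sits well below the average even on a rig like this. A minimal sketch of how benchmarking tools derive it, using made-up frame-time data rather than anything measured on the machine above:

    ```python
    def fps_stats(frame_times_ms, pct=0.1):
        """Return (average FPS, pct% low FPS) from a list of frame times in ms."""
        # Average FPS = 1000 ms / mean frame time
        avg = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
        # "0.1% low": average FPS over the slowest 0.1% of frames
        worst = sorted(frame_times_ms, reverse=True)
        n = max(1, int(len(worst) * pct / 100))
        low = 1000.0 / (sum(worst[:n]) / n)
        return avg, low

    # Synthetic data: mostly ~2.25 ms frames with a handful of slower spikes
    frames = [2.25] * 9990 + [2.52] * 10
    avg_fps, low_fps = fps_stats(frames)
    print(round(avg_fps), round(low_fps))  # → 444 397
    ```

    The spikes in the tail are what you feel as stutter, which is why the percentile lows matter more than the average in raids and BGs.
    
    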

  7. #27
    Quote Originally Posted by ldev View Post
    See, that fits your agenda. Mine is simpler: not a native speaker, and my native language is very different from English, hehe.

    This is vegan levels of sadness: HOW WILL THEY KNOW I IGNORE THEM?! I. MUST. TELL. THEM. That's not how ignoring fucking works. And even more sadness: why did you even bother ignoring if you still suck at it and still view posts? That's a goddamn extra click. Jesus.


    Just screenshot your ignore list and put it in your sig, Jesus, man; easier way to jerk.
    ???
    What agenda?
    The second quote you replied to isn't even posted by me but by someone else?

    You're truly mad.

  8. #28
    Quote Originally Posted by kukkamies View Post
    ???
    What agenda?
    The second quote you replied to isn't even posted by me but by someone else?

    You're truly mad.
    Second quote: not yours, so not for you. Madness, truly, I know, right? It's okay, you didn't understand this simple concept; of course you didn't understand what was meant by "agenda". Time to move on.
    Last edited by ldev; 2021-01-05 at 11:32 AM.

  9. #29
    Quote Originally Posted by ldev View Post
    Second quote: not yours, so not for you. Madness, truly, I know, right? It's okay, you didn't understand this simple concept; of course you didn't understand what was meant by "agenda". Time to move on.
    Yep, it's the only post I have in this whole thread. I don't see the agenda. Sounds like you can't see it yourself either, since you're unable to say what it is?

  10. #30
    Better performance than other integrated graphics, worse than a PC with a dedicated GPU. If you're looking to buy this machine primarily for gaming, don't: your $699-$799 could be much more effectively spent on the PC side of things. Even if the form factor is an issue, Intel is making some pretty decent gaming NUCs with mobile GPUs now. If this is going to be a work machine and you want to know if you can do some casual gaming on the side, sure: you can run at 1080p medium settings and get acceptable FPS in open world or 5-mans.

  11. #31
    Quote Originally Posted by Vash The Stampede View Post
    Ok... so we're comparing Adobe on Intel to Final Cut on Apple M1? Why should I care? Apple software works really well on Apple hardware; should I be surprised? My point is that, as of right now, the ecosystem for the M1 is very limited. macOS was already limited, but now it'll be even more limited. OpenGL isn't up to date on Apple, and you have to use the Metal API to get anything modern on it. How many developers do you think will bother to target an ARM-based Mac with the Metal API on a product that has about 16% of the desktop market share? While Apple does make good CPUs, they aren't a GPU manufacturer. Apple broke up with Imagination in 2017 and poached some of their engineers to produce Apple-designed GPUs. They couldn't do it themselves, so they basically rehired Imagination and paid for some of their patents. The M1 can only play Shadowlands at medium settings, so how will they compete with AMD and Nvidia? Intel is now entering the GPU market and has some big talent working on it as well.

    If you wanna use the Apple M1 for Final Cut Pro then sure, go ahead. Don't expect to game on it, though. Don't expect a lot of old Mac x86 or Windows x86 apps to run fast, or at all, on the Apple M1. Don't ask why Cyberpunk 2077 doesn't work on the Apple M1. Don't ask why your $1300 MacBook with 256GB can only install one game, and you can't upgrade the storage either. The Apple cross-compiled app is just a fat binary, which means Apple wanted developers to build for both ARM and x86 a few years ago. Not everyone does that. Not on Linux, not on Android, and not on Mac. You are a beta tester, and you'll eventually find out what you'll have to go through to get other apps working on it. Like this guy.
    Not gonna lie, but if you own an Apple product, why care about most non-Apple software?

    You shouldn't be buying Apple hardware to run non-Apple software, and some titles that have been ported haven't been ported properly, 'cos that onus is entirely on the devs, even if the userbase for that software is low.

    Your argument against the M1 is pointless; you should be taking your pot shots at Intel and Adobe, who have held back the entire industry. This is the reason Apple took it on themselves to create this chip.

    It is all Intel's and Adobe's fault.
    Last edited by Thoriangun; 2021-01-05 at 08:26 PM.

  12. #32
    Quote Originally Posted by Lucitity View Post
    Better performance than other integrated graphics, worse than a PC with a dedicated GPU. If you're looking to buy this machine primarily for gaming, don't: your $699-$799 could be much more effectively spent on the PC side of things. Even if the form factor is an issue, Intel is making some pretty decent gaming NUCs with mobile GPUs now. If this is going to be a work machine and you want to know if you can do some casual gaming on the side, sure: you can run at 1080p medium settings and get acceptable FPS in open world or 5-mans.
    Considering a 1650 Super is $320 now....

    No, my main gaming rig is an i9-10900K with a 3090, and I also have a 5950X with a 3080. I also have a 10-core iMac with a 16GB 5700 XT; I'm just looking to buy this Mac mini to tinker around with.

  13. #33
    Quote Originally Posted by SigmaShift View Post
    Considering a 1650 Super is $320 now....

    No, my main gaming rig is an i9-10900K with a 3090, and I also have a 5950X with a 3080. I also have a 10-core iMac with a 16GB 5700 XT; I'm just looking to buy this Mac mini to tinker around with.
    Then get it! That's mainly why I did. I have a 2020 iMac with the 10700K; I added my own RAM to 64GB, and it has the RX 5500 XT. I love the Intel iMac. I use it at my main house, but I have two homes, so the mini is perfect for when I'm at the other house on weekends: it takes up no space, which allows for a small desk (it's only a two-bedroom house), and I can actually play on it and get some work done. I have it on an LG 32QN600-B monitor and it works perfectly. It's a fun machine, pretty beasty for its size, and runs LP and FCP really well.

    The only issue I have is that sometimes my Bluetooth mouse and keyboard (Logitech G613 and G603) disconnect and reconnect. It's an issue Apple is supposedly working on. Other than that, I'm pretty impressed with it, especially for its size.


  14. #34
    Old God Vash The Stampede's Avatar
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,865
    Quote Originally Posted by Kagthul View Post
    Not really. Almost everything that worked before works now. Very few things simply fail to run, and most run about as fast as they used to: while they aren't running as fast as they would as native ARM binaries, the M1 is already quite a bit faster than the chips it replaced (in the Air particularly, the Intel chips were thermally and power limited to hell), so the experience is still better.
    It largely works, but you will run into problems. Gaming, though, is probably going to be more of an issue; but then again, why are you buying a Mac to game?
    You keep using that word but never provide any examples. While there are definitely areas where macOS is more limited (gaming!), as a work/daily-driver/professional's machine it's just as capable as Windows and more capable than Linux and ChromeOS.
    As a work/professional machine, sure, but not for gaming. Even for work/professional use I question those who choose a Mac. Also, the Mac is so capable that Apple uses Linux for their servers. Microsoft does as well, but...
    Just as many as did when Macs ran on Motorola CPUs and PowerPC CPUs. Or more, because Apple's slice is quite a bit larger now than it was then. And while that's "16% of the desktop market" (which inexplicably counts laptops...), it's more like 35% of the laptop market.
    It's bigger now because Apple went Intel, which allowed for better Windows application compatibility, like Boot Camp. If that doesn't strike your fancy, then you have Parallels. This was a bigger issue for PowerPC because it wasn't x86. The software library on x86 Windows is so massive that even Microsoft can't get people onto ARM Windows. Since Apple insists on Metal instead of Vulkan, developers have another hurdle to overcome for software development on Apple hardware. There's no reason to create Metal except to lock people into the Apple ecosystem, and that has not traditionally worked well.
    Errr... OK? Big corporation buys the tech they need to do it in-house; news at 11. Apple wasn't a chipmaker either, until they bought out and hired the people to do it. And actually, there's some pretty compelling evidence that their workstation chips will have 80+ GPU cores of the type in the M1, clocked significantly higher. Now, we have no idea how that will actually perform, but to say it's never going to happen is just stupid.
    My point is that the CPU market is not a big deal, since anyone can buy an ARM license and make an SoC. GPUs are different, and Apple clearly doesn't have the engineers to make it happen. Sure, Apple may have 80 or 100 GPU cores, but that doesn't mean it'll be competitive against AMD, Nvidia, and now Intel. This worked well in the mobile market, but this is not the mobile market.
    Now, will those GPUs be good for gaming? Not terribly likely, because Apple doesn't care. But you keep shifting back and forth, willy-nilly, between looking at this solely as "Macs suck at gaming" (which anyone with a brain in here knew already) and "Macs suck in general and are less capable as basic computers". One does not support the other.
    This thread is about playing WoW on the M1, so... yeah, gaming. The M1 obviously does fine in general computing while producing little heat and consuming little power. Do I think an ARM-based MacBook is worth the transition from x86 to ARM from a general-computing perspective? Nope. While most x86 software runs without problems, you only need one app to give you a headache. Migrating a software library from one popular CPU architecture like x86 to ARM is going to take some time. If your time isn't worth money, then sure, go ahead. If you never, NEVER, need to run Windows x86 apps reliably, then go ahead.
    The number of x86-64 apps that don't just fire right up with no problems is quite low.
    You think so? You should ask these software developers what trouble they're having with it. You know, the people who do professional work. I'm not saying these problems won't eventually be fixed, but you are the beta tester. First-generation hardware with first-generation software will be problematic at some point.


    (And your later link is hilarious; he's trying to run an emulator through an emulator to run an emulator? WTFM8. Talk about a niche problem.) Same as it was when Rosetta 1 was a thing and PowerPC apps ran just fine on Intel.
    Someone who bought an M1 is trying to run Citra, and since I run Citra, I found his post. Citra is open source, so it can be ported to the Apple M1. People are also trying to get Yuzu working on the M1. They are doing it.

    So you're aware, you had the price wrong this whole time. The cheapest ARM Mac is $999, and $899 if you're a student or can reasonably pass yourself off as one (they rarely check). I'm not saying the 256GB of storage you can't upgrade is a great thing, 'cause it isn't, but it's also the norm in other PC ultralights.
    Apparently it's $899 regardless. Last I checked, it was $1300 for the cheapest M1 MacBook. A lot of the stuff Apple does is the norm on Android devices, but that doesn't mean you should buy those devices.
    LG Gram? You're not upgrading the guts in that. You're comparing a 3lb MBA to a 7lb chonktastic plastic fuckbrick of a PC "laptop". They aren't comparable products. You can't upgrade most ultraportables, regardless of manufacturer.
    Don't buy those laptops?
    LOLWUT? Linux devs don't develop RISC binaries? Are you mentally damaged? Linux runs on an order of magnitude more RISC machines than x86 machines. Almost the entirety of Big Iron is POWER.

    Android doesn't develop RISC binaries? Considering that almost every Android phone (and the iPhone) is ARM-based... what are you even blubbering about? The number of phones that run on x86 chips is microscopic (though they do exist; my son's old ASUS ZenFone was Intel). You don't even have to recompile Android apps; it's all translated at the system level (which is what Rosetta 2 is doing). There is no app that will run on an ARM phone that won't run identically on that ZenFone. Not one.
    You're not reading what I said correctly. I said not everyone makes a fat binary, because it's twice the work of a regular binary and twice the size. Yes, Linux has ARM, and I guess RISC-V, if that's what you're referring to when you say RISC. Linux is also different because it has been ported to everything and nearly all the software is open source. Is that true on Mac? On Android? Not every Android app, like PUBG, has a working x86 Android version. Linux doesn't need a fat binary because it has repositories.
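    For reference, the "fat binary" being argued about here is just a small header that glues one slice per architecture into a single file. A toy sketch of the documented Mach-O fat header layout, parsed over synthetic bytes rather than a real binary:

    ```python
    import struct

    FAT_MAGIC = 0xCAFEBABE          # big-endian magic of a universal ("fat") binary
    CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

    def fat_archs(blob):
        """Return the architecture names listed in a Mach-O fat header."""
        magic, nfat = struct.unpack_from(">II", blob, 0)
        if magic != FAT_MAGIC:
            return []               # thin (single-architecture) binary
        archs = []
        for i in range(nfat):
            # Each fat_arch entry: cputype, cpusubtype, offset, size, align
            cputype, _sub, _off, _size, _align = struct.unpack_from(">5I", blob, 8 + 20 * i)
            archs.append(CPU_NAMES.get(cputype, hex(cputype)))
        return archs

    # Synthetic header for a two-slice x86_64 + arm64 binary
    hdr = struct.pack(">II", FAT_MAGIC, 2)
    hdr += struct.pack(">5I", 0x01000007, 3, 0x4000, 100, 14)   # x86_64 slice
    hdr += struct.pack(">5I", 0x0100000C, 0, 0x8000, 100, 14)   # arm64 slice
    print(fat_archs(hdr))  # → ['x86_64', 'arm64']
    ```

    On an actual Mac, `lipo -archs <binary>` reports the same information, and a universal binary is produced by compiling with multiple `-arch` flags (e.g. `clang -arch x86_64 -arch arm64`), which is the "twice the work, twice the size" cost being described.
    
    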

  15. #35
    Incorrect; the cheapest M1 Mac is $699, which is the Mini, and it is also the highest-performing one to boot.

  16. #36
    Quote Originally Posted by Vash The Stampede View Post
    Apparently it's $899 regardless. Last I checked it was $1300 for the cheapest M1 Macbook.
    No, it wasn't. It was never $1300. The prices have not changed since launch.

    A lot of stuff that Apple does is the norm on Android devices, but it doesn't mean you should buy those devices.

    Don't buy those laptops?
    The incoherence and ignorance you espouse is sometimes hard to comprehend. People buy those laptops because they don't want to carry Chonkasaurus Rex the Plastic Behemoth in their bag all day. You can't compare an ultraportable to a big plastic block. They are two separate machines targeted at two entirely separate audiences. Most professionals aren't going to buy a chonkin' gaming laptop that gets 3 hours of battery life as their work machine. They're going to buy a Gram, a MacBook, or another ultraportable. The exceptions are a fairly small number of people who need as much oomph on the go as they can get (photogs doing edits on the spot, etc.), who don't make up a huge number of people (but are a crowd that will spend the moolah).

    You're not reading what I said correctly.
    Fault of the author, then. If you can't write a sentence that makes sense, the reader is not to blame. I can only read it the way you wrote it.

    I said not everyone makes a Fat Binary because it's twice the work as a regular binary and twice the size. Yes Linux has ARM, and I guess RISC-V if that's what you're referring to when you say RISC.
    .... ARM. Advanced RISC Machine. Herp a derp derp derp.

    Linux is also different because it has been ported to everything and nearly all the software is open source. Is it on Mac? Is it on Android? Not all Android apps like PUBG have a x86 Android working version. Linux doesn't need a Fat Binary because it has repositories.
    ..../sadtrombone.

    Android is Linux. Womp womp. And my son played PUBG Mobile just fine on his ZenFone, so... yeah.

  17. #37
    Quote Originally Posted by Kagthul View Post
    No, it wasn't. It was never $1300. The prices have not changed since launch.



    The incoherence and ignorance you espouse is sometimes hard to comprehend. People buy those laptops because they don't want to carry Chonkasaurus Rex the Plastic Behemoth in their bag all day. You can't compare an ultraportable to a big plastic block. They are two separate machines targeted at two entirely separate audiences. Most professionals aren't going to buy a chonkin' gaming laptop that gets 3 hours of battery life as their work machine. They're going to buy a Gram, a MacBook, or another ultraportable. The exceptions are a fairly small number of people who need as much oomph on the go as they can get (photogs doing edits on the spot, etc.), who don't make up a huge number of people (but are a crowd that will spend the moolah).



    Fault of the author, then. If you can't write a sentence that makes sense, the reader is not to blame. I can only read it the way you wrote it.



    .... ARM. Advanced RISC Machine. Herp a derp derp derp.



    ..../sadtrombone.

    Android is Linux. Womp womp. And my son played PUBG Mobile just fine on his ZenFone, so... yeah.
    Bro, you made me roll out of my chair.
    @Vash The Stampede
    I try not to get into it with people on the internet, so don't take this harshly. Your videos are not the norm. The first one has problems with niche hardware that 99% of the population doesn't use, and he should have checked the vendor's website to see if it was compatible.

    The second: minor issues, some of which have been resolved. The Bluetooth thing: his fix is what I also had to do. Mine still has some issues, but not nearly as many as when I first used it, and my issues get corrected by flipping the on/off switch on my keyboard or mouse. Logitech is also working on some new drivers for the M1.

    You are right about one thing: being an early adopter of any new tech can be a pain. But honestly, you just sound like you hate Apple, and that's ok. But as for stating that pros should not use them: iJustine is a pro and she uses one, and so does Marques Brownlee; those are just a couple of successful YouTubers. There are many nationwide companies and professionals that use Macs; here's some proof for you, and that's just after a 30-second Google search. (The company I work for uses Macs, iPhones, and iPad Pros.)

    And playing games on a Mac... you say it's bad. I play games on my 2020 Intel iMac with no issues at all, and if something isn't made for macOS, I reboot into Boot Camp and all is well. So whether buying a Mac for gaming is bad depends on the Mac you buy, just like any prebuilt Windows PC. Now, does the M1 Mini beast at gaming? No, but you can play, and if you tweak the settings instead of using presets, you can get it to work pretty well.

    Bottom line: if you want people to take you seriously, maybe do some research or at least provide some anecdotal evidence, instead of just being a Windows/Linux fan. I do not know that you are one, but you sure sound like it.

    Me, I am neither. I have a Windows PC as well, a Ryzen 3600 with a 5700 XT; they both have their positives and negatives. Don't get me started on Windows, with its crappy search and its updates that are still broken... let's see, for how many years? And don't get me started on Linux! I love playing around with Linux, but it has more issues than Windows for the average user.

    They all have their own problems; to say no one should buy any of them is ridiculous. Everyone has their own needs and wants, even if you disagree.


  18. #38
    Old God Vash The Stampede's Avatar
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,865
    Quote Originally Posted by Kagthul View Post
    No, it wasn't. It was never $1300. The prices have not changed since launch.
    I went to the Apple website just now and it says $999. It was $899 earlier today, and a few weeks ago I saw it for $1300. I have no idea why I'm seeing different prices. Also, their website shows the M1 MacBook playing some Sonic racing game with the phrase "8-Core GPU. Plays Hard. Works Wonders." They are advertising gaming on these.
    People buy those laptops because they don't want to carry Chonkasaurus Rex the Plastic Behemoth in their bag all day. You can't compare an ultraportable to a big plastic block. They are two separate machines targeted at two entirely separate audiences. Most professionals aren't going to buy a chonkin' gaming laptop that gets 3 hours of battery life as their work machine. They're going to buy a Gram, a MacBook, or another ultraportable. The exceptions are a fairly small number of people who need as much oomph on the go as they can get (photogs doing edits on the spot, etc.), who don't make up a huge number of people (but are a crowd that will spend the moolah).
    I consider people who buy Apple products idiots; I don't have a better explanation for why they buy these products. Also, just because a laptop is made of metal doesn't mean it's better. Take it from someone who fixes these things, as I'm fixing one right now. The MacBook Air I'm fixing is dented all over, and there's not a damn thing I can do about that. Most dents are cosmetic, but one is preventing the lid from closing all the way, which is a problem for keeping it asleep. A bit of hammering later and it mostly closes now. I've had MacBooks so badly bent up that the insides were still good but the housing had to be thrown away. Plastic has its benefits in that it shrugs off most impacts. Massive impacts will crack it, but I've plastic-welded those back together.

    You can still buy an x86 MacBook, and that's what I would recommend if you're still in the market for a Mac. You won't lose anything significant buying one of those. This Lenovo X1 Carbon is lighter than the new MacBooks, has a removable SSD, and has twice the storage for the same price. This Gateway destroys the new MacBook in performance and is only 1 lb heavier and 0.3" thicker.
    Fault of the author, then. If you can't write a sentence that makes sense, the reader is not to blame. I can only read it the way you wrote it.
    You heard what you want to hear. I can't help that.
    .... ARM. Advanced RISC Machine. Herp a derp derp derp.
    Fault of the author...
    Android is Linux. Womp Womp.
    Yes it is but it doesn't have repositories like traditional Linux distros. Librem 5 phones are a different story.
    And my son played the PUBG mobile game just fine on his Zenfone. so.. yeah.
    I was trying to get an Android VM working on Linux and was told that PUBG for Android doesn't work on x86 because there's no x86 binary for it. I'll have to try again.
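    For what it's worth, the "no x86 binary" claim is easy to check yourself: an APK is just a zip archive, and any native libraries it ships live under lib/<abi>/. A rough sketch in Python (the fake APK and the libUE4.so filename below are made up for illustration; on a real APK you'd pass the file's bytes in):

    ```python
    import io
    import zipfile

    def supported_abis(apk_bytes: bytes) -> set:
        """Return the CPU ABIs an APK ships native libraries for."""
        abis = set()
        with zipfile.ZipFile(io.BytesIO(apk_bytes)) as apk:
            for name in apk.namelist():
                parts = name.split("/")
                # Native libs are stored as lib/<abi>/<name>.so inside the APK.
                if len(parts) >= 3 and parts[0] == "lib":
                    abis.add(parts[1])
        return abis

    # Demo: a fake APK that only ships an arm64 native library.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as apk:
        apk.writestr("lib/arm64-v8a/libUE4.so", b"")
        apk.writestr("classes.dex", b"")

    print(supported_abis(buf.getvalue()))  # {'arm64-v8a'} -- no x86/x86_64 build
    ```

    If the set has no "x86" or "x86_64" entry, the game's native code simply can't run on an x86 VM without ARM translation.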

    - - - Updated - - -

    Quote Originally Posted by loadedaxe View Post
    Bro, you made me roll out of my chair.
    @Vash The Stampede
    I try not to get into it with people on the internet, so don't take this harshly. Your videos are not the norm. The first one has problems with niche hardware that 99% of the population does not use, and he should have checked the vendor's website to see if it was compatible.
    You say that but then you say...
    The second shows minor issues, some of which have been resolved. The Bluetooth thing: his fix is what I also had to do. Mine still has some issues, but nowhere near as many as when I first used it, and mine get corrected by flipping the on/off switch on my keyboard or mouse. Logitech is also working on new drivers for the M1.
    See, I use Linux, and I'm niche, but I know that Linux sucks for anyone who isn't willing to deal with problems. I'm willing to deal with the problems that come with using Linux and I accept them, but most people won't. Certainly nobody who needs to do anything professional will use Linux. Then again, I didn't pay money to be the beta tester that is Linux.
    You are right about one thing: being an early adopter of any new tech can be a pain. But honestly, you just sound like you hate Apple, and that's ok.
    Yea, and I have my reasons. Both technical and moral.
    But you also state that pros should not use them. Well, iJustine is a pro, and she uses one, and so does Marques Brownlee; those are just a few successful YouTubers.
    Don't let others think for you. Lots of people went out never wearing a mask and were successful. GradeAUnderA used an old Samsung Galaxy S3 phone to record audio and Microsoft Paint to do his art, and he used to be successful... until he got depression and stopped uploading. My point is that what you use doesn't matter; what you do with it does.
    And playing games on a Mac...you say is bad. I play games on my 2020 Intel iMac with no issues at all, and if its not made for macOS, I reboot into bootcamp and all is well. So to say buying a Mac for gaming is bad, that all depends on the Mac you buy, just like any prebuilt Windows PC.
    You're not proving me wrong here.
    Now, does the M1 Mini beast at gaming? No, but you can play, and if you tweak individual settings instead of using the presets you can get it to work pretty well.
    You can use an old Pentium 4 to game; it just depends on the game. Shadowlands is based on a game from 2004, so the requirements aren't very high. Cyberpunk 2077 doesn't work on the Apple M1 products unless you use Google Stadia, and no amount of settings changes will fix that. Cyberpunk 2077 may never run on the Apple M1 due to the lack of VK_VALVE_mutable_descriptor_type, which is needed on Linux to get the game running. Mac also uses Wine like Linux does, so this may be unfixable. Maybe if someone messes with MoltenVK, then maybe. If this were an x86 Mac, you'd just boot into Boot Camp and run Windows.
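    If anyone wants to verify that for themselves: `vulkaninfo` (from the Vulkan SDK; it also works against MoltenVK on macOS) prints every extension the driver advertises, one per line. A minimal sketch of scanning that output in Python — the sample text below is invented for illustration, not real MoltenVK output; on a real machine you'd feed in the actual `vulkaninfo` stdout:

    ```python
    # Sample of vulkaninfo-style "Device Extensions" output (made up for the demo).
    SAMPLE_VULKANINFO = """\
    Device Extensions: count = 3
            VK_KHR_swapchain                 : extension revision 70
            VK_EXT_descriptor_indexing       : extension revision 2
            VK_KHR_maintenance3              : extension revision 1
    """

    def has_extension(vulkaninfo_output: str, name: str) -> bool:
        """Check whether an extension name appears in vulkaninfo-style output."""
        # Extension names appear one per line, before the ':' column.
        return any(line.strip().split(":")[0].strip() == name
                   for line in vulkaninfo_output.splitlines())

    print(has_extension(SAMPLE_VULKANINFO, "VK_VALVE_mutable_descriptor_type"))  # False
    print(has_extension(SAMPLE_VULKANINFO, "VK_KHR_swapchain"))                  # True
    ```

    If the driver never reports the extension, a translation layer like DXVK can't rely on it, no matter what settings you change in the game.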
    Bottom line, if you want people to take you seriously, maybe do some research or provide at least some anecdotal evidence instead of being a Windows/Linux fan. I do not know that you are, but you sure do sound like it.
    I don't expect to be taken seriously, I expect to be correct. Up to you what to do with my advice, which can be in the trash can for all I care.
    Last edited by Vash The Stampede; 2021-01-06 at 03:04 AM.

  19. #39
    @Vash The Stampede.

    I did not have to prove you wrong; you were never right. You see, disliking something and then advising everyone not to use it makes your advice wrong in itself. I dislike Windows, but it has its uses and I do use it; that doesn't mean I like it, nor does it mean I shouldn't recommend it. I would, for the person who would use it knowing its pros and cons. Giving sound advice is only achieved when you do so without bias.

    It's like the guy who pulls up next to me in a Chevy truck and revs his engine because I am in a Ford truck. He thinks he is cute and tough, takes off when the light turns green just to get to the next light a few seconds faster than me, because I drive with common sense. What he doesn't realize is that I don't dislike Chevy, Toyota, Nissan, or Dodge. I didn't base my decision on brand; they all have their own problems. I just got a better deal on the F-150 and saved money. (Traded in a Dodge for it.)

    Being brand biased is a lose-lose situation. You do seem angry with Apple. I wonder if you're as angry with MS... oh wait, problems with MS are to be expected, so everyone just ignores them, because DirectX is the only API game devs use, right?

    Or is it that you hate Apple because you worked on someone's Mac that they beat the shit out of, and you can't get the lid to close, and you think plastic is better because you can plasti-weld it back together? It's not the machine's fault the user is careless, but in your eyes that's Apple's fault? It doesn't matter if it's plastic or aluminum; the user is at fault, not the manufacturer.

    I will cease here. Your mind is made up, and that's ok; that's your right as a human.

    If you go through the thread and read all the posts, not just the ones quoting you, you will see the OP's posts and realize he already owns a Mac. He wants to play with the M1; that's all that matters, definitely not your opinion.

    "Beware of false knowledge; it is more dangerous than ignorance."





  20. #40
    Old God Vash The Stampede's Avatar
    Join Date
    Sep 2010
    Location
    Better part of NJ
    Posts
    10,865
    Quote Originally Posted by loadedaxe View Post
    @Vash The Stampede.
    Being brand biased is a lose lose situation. You do seem angry with Apple. I wonder if your as angry with MS......oh wait, problems with MS are to be expected so everyone just ignores it, because DirectX is the only API game devs use right.
    No, I'm fully aware of how much Windows sucks, which is why I use Linux, so I can improve it to the point where it can replace Windows.
    Or is it you hate Apple because you worked on someone's Mac that beat the shit out of it and you cant get the lid to close and you think plastic is better because you can plasti-weld it back, its not the machines fault the user is careless, but in your eyes that's Apple fault? Doesn't matter if its plastic or aluminum, the user is at fault, not the manufacturer.
    I fix laptops, so I have an idea of how stupid Apple's designs are. Apple is certainly not alone in stupid designs; there are some that are worse. It's always something with Apple products, though, and most Apple users just can't accept that. The 2007 or 2008 MacBooks had the housing glued together, and of course the glue would fail and the screen would fall off. Apple fixed it in 2009 or 2010 and did nothing for those who owned the older models. Apple's Nvidia GPU issues. That whole fiasco with the butterfly keyboard. More recently, Apple didn't connect the cooling fan to the heatsink on the Intel MacBook Air, which is odd considering they did it correctly on the M1-powered MacBook Pro with a heat pipe. Imagine if Apple sabotaged their Intel Macs to promote their M1s?

    Are the M1 MacBooks bad? No, they're pretty good so far. They don't get hot, they don't even need a fan, and they have good battery life. But they are a mistake to buy, for the reasons I've given before.
