> A Chromebook's ceiling is made of web browser, and the things you run into are not the edges of computing but the edges of a product category designed to save you from yourself.
I'm in the same boat as the author; I cut my teeth on a hand-me-down 2005 eMac, then a hand-me-down 2008 MacBook, before finally getting my own 2011 iMac. I think this is overly harsh on Chromebooks given they belong to the cheaper end of the market - you can still put Linux on them and go for gold, you're just going to hit resource limits earlier.
I think when you're younger and building an aptitude for computers, it's the limitations of what you have that drive an off-the-shelf challenge: doing what you can with what you've got. That can range from trying to play the same video games as your friends (love what /r/lowendgaming does), to working around usage restrictions (e.g. locked-down school-issued laptops), to running professional tooling (very slowly) just like the author.
When IT caught my interest, I did all of the above - on Mac, Windows and Linux, on completely garbage machines. The MacBook Neo is an awesome machine for its cost/value, but I don't think it's hugely special in the respect described, beyond making more power available at a more accessible price point.
I went through a two-year period where I didn't have a decent job and couldn't afford a computer of any kind for myself. I ended up spending some time volunteering for a local non-profit, and they gave me an "old laptop" they had in storage. This was in ~2005.
It was a Sony Vaio, and the only thing I really remember about the hardware/specs is that it had a physical scroll widget under the touchpad on the edge of the case. Software-wise, it was running some relatively locked down version of Windows. I installed Arch on it and used it to rebuild and manage the non-profit's website.
The other thing that I remember from it is that it was my entrance into using the terminal as my primary interface - the first place I used Vim regularly, and the first time I'd installed tmux. One day I was trying to test a dropdown or something on their website, and discovered that my touchpad didn't work. It turned out to have been broken by an Arch update, which wasn't terribly surprising. What was surprising is that once I'd traced down the issue and corrected it, I realized that it had been broken for almost two weeks. I'd used that computer every day and hadn't needed to use a mouse even once.
Man, I got a computer engineering degree in 2015 with a $200 Chromebook chrooted into Debian. And I worked professionally for years on an 8 GB MacBook Air. The Neo is definitely something younger me would be interested in.
C dev wasn't an issue back in the 1 GB or 256 MB or 16 MB days either. You just didn't have a Chrome tab open that by itself is eating 345 MB just to show a simple tutorial page.
Are there even any x86 Chromebooks left at that price point? They're the only ones still capable of chrooting into Linux; ARM Chromebooks remain locked up.
Looking at Best Buy in the Chromebook category, the first one that comes up is $150, with an Intel N4500.
I don't know if this is particularly current or what, or if it's easy to set up to run another OS or whatever, but it meets your price and architecture criteria.
When my mates at school had the aero glass effect on the new Windows, my ancient hand-me-down laptop wouldn't even try to run it. It could however run Compiz somewhat if it was persuaded very hard!
That's basically the reason I learned Linux initially, and those hours debugging video driver issues would serve me well later on.
I've owned and used the CR-48 prototype Chromebook model, which very well did have a developer mode and a third kernel option built right in. Ran Ubuntu on it with no issues. This has been possible since before the device family was officially available for purchase.
The school thing is different, but also hardly unique. A school-issued MacBook is often similarly locked down and unusable as a dev machine, because the student lacks permissions to install anything the school deems dangerous.
It was possible on the Acer model I got when it first came out, but it was still useless. A switch that wiped the whole thing back to defaults was needed to open a terminal and from there a shell script could install Ubuntu.
It still ran the unmodifiable ChromeOS kernel, with no updates and without some of the modules I'd like. And then the screen died.
It was junk. The EeePC was cheaper, lasted much longer and had Debian out of the box.
Author of the post here. You nailed it here; I used Chromebook as the example in my post since the one I used in high school was locked down to basically a kiosk. Couldn't even open dev tools, much less root it. Such a wild departure from the eMacs I used in my elementary school's library where I could set bonkers `defaults write` commands and customize every aspect of my account.
If I got a Chromebook as a personal machine as a kid, I probably would've rooted it and seen what I could do, but growing up, the beauty of the Mac (in that Snow Leopard era) was progressive disclosure. I could start on the happy path and have a perfectly stable machine, then customize the behaviors through the terminal, see what it does, mess with the system files, see what breaks, revert it, then go back to using iMovie like normal.
In my (admittedly limited) time using a rooted Chromebook, it's much more like a switch flip. You go from mandatory water wings directly into getting pushed into the ocean and Google shouting "Good luck!!"
Yeah, very little of this is still true of the period in which the Neo or modern Chromebooks exist.
If the school is managing these Macs, including laptops sent home with a student, then unless it's for a specific purpose, they aren't allowing you to modify files, you probably aren't allowed to open a terminal or system settings, and you definitely aren't disabling SIP. You might not even be able to access the open internet if they've hard-configured it into a VPN. No different from a managed Chromebook.
Likewise, even older and lower-end unmanaged Chromebooks can enable a full Linux environment that runs a terminal in a browser tab. Doing so doesn't require root or developer mode, and it doesn't change or sacrifice any of the rest of the ChromeOS environment (for which your core assertion, that an unmanaged Mac is a computer and an unmanaged Chromebook is a thin-client appliance, still fundamentally holds). You can install Blender and have it running in a window by about 1 minute into watching a YouTube video titled "How To Download Or Update ANY VERSION Of Blender On Chromebook".
Gaining root on a Chromebook is mostly just a prerequisite to modifying things specific to ChromeOS, but the easier to access, more featureful, and safer LDE is still an entire operating system that you can tinker with, screw up, overload, blow up, and reset to zero, all without losing the happy path of opening up Canva (or, more likely, CapCut on their phone/iPad) and editing videos or whatever.
You don't have to root them to do cool shit anymore. They have a full Linux (Debian based) environment you can enable with a single toggle in the settings. Any GUI apps you install via apt get their icons dropped in the system tray and their windows are rendered via Wayland.
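For what it's worth, the flow once that toggle is on looks something like this. This is a sketch of standard Debian packaging inside Crostini; GIMP is just an arbitrary example app, not anything specific to ChromeOS:

```shell
# Inside the Crostini terminal (a Debian container), GUI apps install
# the same way as on any Debian box; GIMP here is an arbitrary example.
sudo apt update
sudo apt install -y gimp

# ChromeOS picks up the app's .desktop entry automatically, so the icon
# appears in the launcher and the window is rendered via Wayland.
gimp &
```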
macOS or Windows can be similarly locked down. In the schools, the school locks it down. In many companies, there are management tools like Jamf, Intune, or NinjaOne that lock down laptops, desktops, tablets, and cell phones a little or completely.
A kid looking for the best bang for the carefully saved buck would buy a used machine off eBay for less than $599, and get something more capable. An M1 MBP 2021 with 16 GB would cost about this much; an M1 MacBook Air or an M1 Mac Mini with 16 GB would cost half as much. A ton of beefy, perfectly Linux-ready used laptops can be had for under $350.
So this is only for the kids who are obsessed not just with computers, but brand-new computers. Which is a different demographic.
Look, macOS has certainly rotted over the past few years, but the primary reason I use it is that it's still a hundred times nicer to use than Windows (which is also regressing, for worse reasons: shoving in AI and ads instead of benign neglect).
It's still the best desktop UNIX experience, especially since cheap PC laptops (and until very recently expensive ones) almost always have horrible build quality. It's also within only the last few years that PC trackpads came anywhere near the trackpads on Apple machines. Sometimes what you call a "tax" is literally some of us wanting quality.
macOS is the best desktop UNIX for one simple reason: the ⌘ key. The fact that 99% of your GUI keybindings use a key that your CLI tooling cannot use eliminates conflicts and means that you don't have to remember things like "Copy is ^C in Chrome but ^⇧C in the terminal".
Using Linux with Toshy to get the best of both worlds wrt keybindings. Linux and KDE are amazing nowadays... I don't miss macOS but would be hating Linux without Mac-style keybindings.
Yeah, I use Kinto (which seems to be what Toshy is originally based on). A recent Ubuntu update broke it though, and I accidentally deleted my config file while trying to fix it, so maybe now's a good time to try out Toshy. Looks like Toshy creates a python virtualenv instead of relying on system packages, which should make it a little more resilient to system package changes.
Fedora is the best OS humanity has ever made. No exaggeration. There has to be a best, and it's Fedora.
Linux gets a bad reputation because 20-ish years ago Ubuntu sent out free CDs and became the dominant OS. Ubuntu/Mint is part of the Debian family: outdated Linux. They call outdated Linux 'stable', but it's not stable like a table; it's software-version frozen. Bugs that are fixed today won't get those fixes for 2 years. Not to mention, a new mouse you buy from Amazon, an Nvidia card, or a web video player won't work due to the outdated nature of these distros. (And yes, I know you can do surgery to update it, but no one likes that)
Fedora is not Arch. Fedora is the consumer grade Red Hat.
> Linux gets a bad reputation because 20-ish years ago Ubuntu sent out free CDs and became the dominant OS.
I've been an Ubuntu user for 20 years, and RedHat and Suse prior to that. Ubuntu just worked. Debian had packages for everything, including from 3rd party vendors. It lets me focus on my work, and not worry about the OS, or compiling packages, or finding installers. When I had issues (rare), the large user base meant that someone had already figured out a solution to the problem.
The flavor of Linux doesn't matter so much in my opinion.
Debian stable isn't that much more version-locked than CentOS or RHEL. Debian also has the Testing tier, which is semi-rolling. Or you could use Unstable (Sid). Or if you're brave, you could use Experimental.
Ubuntu, Mint, PopOS, and others with Debian as an upstream are not Debian. They build their own packages on their own schedules.
Fedora is not "consumer grade Red Hat". It's the fast-moving upstream of RHEL, much like Debian Testing is upstream of Debian Stable.
The main reason Linux got a bad reputation was the tribalism of people going off half-cocked talking about their personal preferences without actually working with the alternatives and starting this sort of holy war diatribe.
i've made my entire career digging deep into linux - i've been what some people would call a "power user" for about 25 years, and a professional for 15. i spent over a decade distrohopping, tweaking, tinkering and customizing every distro from Corel to Mandrake to Mandriva to Debian to Slackware to Ubuntu to Gentoo to Arch to Void, and everything in between, plus the BSDs. i've been a sysadmin, network admin, devops engineer, yadda yadda yadda.
i have never once successfully installed fedora. probably just hardware stuff, but as often as i've wanted to try it and openSUSE, they have never booted post-install for me, on machines where i've successfully installed Debian and OpenBSD. go figure. i know i'm an outlier here. maybe it's just bad luck.
but reading your post, it sounds like a club i don't want to be a part of. linux is linux. distros don't matter. you can get nearly anything to work if you spend enough time on it. GUI OS installers that fail are not worth my time.
Fedora Silverblue is better, and Cosmic Desktop looks good as a DE with every release (upcoming 44). For an isolated and rollbackable option, your only choices are Silverblue and, the hard way, Guix. If you use Nonguix with Guix you're on your own, but I'd only use a nonfree kernel in an emergency (the wireless adapter somehow gets broken and the alternative is to boot the OS with proprietary firmware in order to buy a new one). And in that case I would blacklist every proprietary firmware except the wireless ones.
And, yes, I have an overlaid Linux-Libre kernel in Silverblue.
Glad to see someone else care so much about software freedom. Guix is great (though my ideal system would be Debian with a Shepherd init, FHS, and Guix for non-root package management).
> For some isolated and rollbackable option, your only options are Silverblue and Guix for the hard way.
How about Qubes OS? Also the parent never said anything about isolation and roll-backs. Nobody mentioned Silverblue except you. The discussion is about ordinary users, not hackers.
Silverblue is supposed to be for normies. Rollbacks are for when you screw everything up.
But honestly I did not like Silverblue. I had a 13-year-old gaming computer I installed it on, and I couldn't get the ancient GPU drivers installed due to the way things are containerized. This would have been a few commands otherwise.
Maybe it's fine for Chromebook-like things. I might have picked a bad test case.
I'm developing on a $270 refurbished Dell, which has an i7 and 16 gigs of RAM. The Apple processor might be competitive, but the rest of the machine is not. 600 dollars is fine and not unreasonable, but there is certainly an Apple tax.
> You are literally just paying the Apple tax that they deliberately choose
And when I go to the grocery store, I am paying the Safeway tax that they deliberately choose, and when I go to the gas station I am paying the Exxon tax that they deliberately choose, and so on.
When I was sixteen I got one of the earlier digital HD cameras (Canon VIXIA HF100) and Sony Vegas Movie Studio for my birthday. It was a neat camera and I liked Vegas, and I was grateful that my parents got them for me, but an issue that I had with it was that my computer wasn't nearly powerful enough to edit the video. Even setting the preview to the lowest quality settings, I was lucky to get 2fps with the 1080i video.
I still made it work. I got pretty good at reading the waveform preview, and was able to use that to figure out where to do cuts. I would apply effects and walk through frame by frame with the arrow keys to see how it looked. It usually took all night (and sometimes a bit of the next day) to render videos into 1080i, but it would render and the resulting videos would be fine.
Eventually I got a job and saved up and bought a decent CPU and GPU and editing got 10x easier, but I still kind of look back on the time of me having to make my shitty computer work with a certain degree of fondness. When you have a decent job with decent money you can buy the equipment you need to do most tasks, but there's sort of a purity in doing a task when you don't really have the equipment you need.
8-bit. 16KiB of RAM. BASIC as the programming language. 640x256 resolution in 8 colours.
I could make that thing sing in an hour. It was hard to get it to do much, but then the difficulty was the fun thing.
By the time we got to the early 2000s and I could buy something with more RAM, CPU and storage than I could ever reasonably max out for the problems I was interested in at the time, I lost something.
Working within constraints teaches you something, I think. Doing more with less makes you appreciate the "more" you eventually end up with. You develop intuitions and instincts and whole skillsets that others never had to develop. You get an advantage.
I don't think we should be going back to 8-bit days any time soon. But in the context of this post, I want novices to try to build software on an A18 chip. I want learners to be curious enough to build a small word game (Hangman will do at first, but the A18 will let them push way, way past that, into the limits of something that starts to feel hard all of a sudden), to develop the intuition of writing code on a system that isn't quite big enough for their ideas. It'll make them thirsty for more, and better at using it when they get it.
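As a concrete starting point, the Hangman core really is tiny. A minimal sketch in Python; the word list, the six-miss limit, and the function names are all arbitrary choices for illustration, not anything from the post:

```python
import random

WORDS = ["apple", "banana", "cherry"]  # arbitrary example word list


def mask(word: str, guessed: set[str]) -> str:
    """Show guessed letters, hide the rest behind underscores."""
    return "".join(c if c in guessed else "_" for c in word)


def play(word: str, guesses: list[str], max_misses: int = 6) -> bool:
    """Replay a sequence of guesses; True if the word was solved in time."""
    guessed: set[str] = set()
    misses = 0
    for g in guesses:
        guessed.add(g)
        if g not in word:
            misses += 1
            if misses >= max_misses:
                return False  # out of wrong guesses
        if mask(word, guessed) == word:
            return True  # every letter revealed
    return False


if __name__ == "__main__":
    word = random.choice(WORDS)
    print(f"Word to solve: {mask(word, set())}")
```

From there, the "push way past that" part might mean scoring a whole dictionary, adding a letter-frequency guesser, or bolting on a GUI, which is exactly where the hardware starts to matter.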
> Working within constraints teaches you something, I think.
It absolutely does. But every system has constraints; even when provided with massive resources, humans tend to try things that exceed those resources, as evidenced by Parkinson's Law of data https://en.wikipedia.org/wiki/Parkinson%27s_law
It was worse than you remember. You could have 640x256 in monochrome, or 320x256 with 4 colours, or 160x256 with 16 colours (which IIRC was actually 8 distinct colours plus 8 flashing versions of them).
The game Elite did something extremely evil and clever: it was actually able to switch between modes partway through each frame, so that it could display higher-resolution wireframe graphics in the upper part of the screen and lower-resolution more-colourful stuff for the radar/status display further down.
I hear you, having learned programming on a machine even more constrained than the BBC Micro. But learners today are more likely to say "Siri, build me a Hangman app."
I'm waiting for somebody to come and tell us about the time they punched cards by hand, one hole at a time, and then threw coal in the furnace to have the cards interpreted by a steam-powered computer.
I had a similar experience but with design software (which I pirated at the time since I just didn't have the money to buy stuff from Adobe).
I'd install Photoshop and Illustrator on my shitty computer I put together from spare parts my dad didn't have the use of anymore from his business computers. It was horribly slow, but I kinda made it work slowly.
The thing is, I think this is what made me think a bit differently: since everything was slowed down and took more time than I wanted it to, I had to make deliberate decisions about what to add or edit. I still work the same way today, to a point, but that's because I'm faster and more experienced, and computers have gotten more performant (and because I can afford better devices, sure).
When I look at my half-brother and his teenage generation I wonder if they can still have such an experience. The personal devices have gotten better and faster, most things are really convenient and you sometimes even don't have to think a lot to do something also because they're cheap to do... they probably won't have the experience of "grinding it out" just for the sake of producing something they like...maybe sports is the closest...no idea, but have been thinking about this quite a lot recently...
I have a typical yuppie software job with decent pay, so generally I will buy the right tools for a job now instead of trying to make do with whatever I can scrape together. I'm not that busy of a person, but I certainly have more obligations than I did when I was sixteen, and now sometimes it really is worth it to spend an extra grand on something than to spend a week hacking together something from my existing stuff.
Still, I look back at the hours I spent making terrible YouTube videos with my terrible computer really fondly. I was proud of myself for making things work, I was proud of the little workarounds I found.
I think it's the same reason I love reading about classic computing (80's-90's era). Computers in the 80's were objectively terrible compared to anything we have now, and people still figured out how to squeeze every little bit of juice possible to make really awesome games and programs. The Commodore 64 and Amiga demos are fun to play around with because people will figure out the coolest tricks to make these computers do things that they have no business doing. I mean, the fact that Bad Apple has been "ported" to pretty much everything is something I cannot stop being fascinated by. [1] [2] [3] [4]
It's a great example of going the extra mile due to external limitations. I bet you developed skills and intuitions you wouldn't have if you started with great hardware from the get go.
I don't think this is about the MacBook Neo, and I don't think the comments need to devolve into a Mac vs. Linux argument. It's simply an ode to that kid pushing hardware to the limits, and learning so much along the way.
What I feel a bit sad about is that I was that kid. Growing up in a 3rd world country, running games that I didn't own on hardware that ought not run them, debugging why those games didn't work, rooting my phone and installing custom OSes just for the heck of it. Man, I had so much time to tinker.
Now I have amazing gaming hardware but I barely touch games. When I do, it's on Steam. I've swapped out the endless tinkerability of android with the vanilla 'it just works'-ness of the iphone. That curiosity took me far, but I seem to have lost it along the way.
(Author of the post here) The post was inspired by the Neo and provoked by a certain YouTuber's review of it, but yeah, it's about the Neo in the same way that The Old Man and the Sea is about fishing.
I wrote about the Mac in general since that's what I know, but I imagine if I grew up in the Windows world and liked Windows more, I would have a similar experience with my dad's old ThinkPad or something.
You perfectly encapsulated how I felt as a kid pushing my computer to its limit just to learn and try new things. I didn't have a Mac, but the experience was identical.
> I've swapped out the endless tinkerability of android with the vanilla 'it just works'-ness of the iphone. That curiosity took me far, but I seem to have lost it along the way.
I feel this, and on the whole I've done the same thing. I'm deep in the Apple ecosystem because it all just works together without me having to tinker with it. I think this is mostly a reaction to now doing that stuff professionally - 4 days a week, whether I feel like it or not, I'm required to make computers do things they couldn't do before I started.
When I get to the end of the work day, or out of bed on a Sunday morning, I might get the urge to tinker with things but I refuse to have tinkering with things to make them work be a requirement for my rest time. Leisure tinkering must be on my terms, because if I'm forced to tinker with something just to do what I really wanted to do that's not tinkering, that's the thing fucking with me, and I will swear profusely at it throughout.
I didn't grow up in a 3rd world country but had the same experience, bar running games I don't own. Not everyone in the west had parents that wanted to just spend thousands on hardware that seemed to be obsolete next year, or any means of making that money. And I've never stopped using sub-par hardware, to this day I enjoy squeezing every drop of performance from cheap pre-owned stuff.
Most of us learned a lot that way, trying to squeeze and make something work out of nothing. That's why we understand much more than kids today. In the end that is the reason I still optimize stuff in my corporate company and I have a pretty awesome job, so it's a good path.
Mostly same story. Tinkered for hours with Windows 3.1 floppy disks. Reinstalling OSes all the time because I'd break stuff or I'd just want a fresh slate. I loved pushing the boundaries. In my 30s I slowed down with the tinkering because of life (kids, work). I thought I lost the ability to tinker. But recently at 42, I bought a MacBook for the sole purpose of tinkering on the couch at the end of the day (basically, after being on a computer the whole day, I didn't want to be in the office anymore). And slowly, it's coming back. I'm playing with new things, learning about neural networks, learning about software-defined radio, installing tons of random libraries and tools to test that out. It's coming back. Keep pushing on it and hopefully it returns for you too!
I'd agree it is about the Neo in the sense that the device and the talk around it obviously triggered this post.
I don't think the author is exactly bashing the Chromebook. I read it as the author praising an open ecosystem where you have flexibility and the choice to "take off the guard rails" and go where the device was not originally made to take you.
There's an argument to be made that it is ironic that Apple is the example of this, but to me that shows why I _still_ like MacOS, when all the other variants (iPadOS and iOS) are entirely locked down.
it's so wild that we're in a place now where the utterly brainwashed can say this with a straight face:
> There's an argument to be made that it is ironic that Apple is the example of this, but to me that shows why I _still_ like MacOS, when all the other variants (iPadOS and iOS) are entirely locked down.
He's looking back to a time when they were still special. When every keynote brought out a new, interesting product, feature, OS enhancement, etc. Back in the Steve Jobs era, it was still worth tuning in every year to see what was new.
The whole article is about how Apple is still special just like when he was a kid.
But anyway, I find it funny that the author implies that if a kid gets a MacBook Neo they will explore all the possibilities to use and customize it, but somehow the same kid won't try to push a Chromebook to its limit. It 100% matches my stereotype of how Apple fans view machines.
Yeah, you'd find out about it all in the newspapers a few hours later, and none of it was "clap to yourself in an empty room" impressive anyway. I was around back then and I didn't feel the need to act like a drug addict whenever Steve Jobs opened his mouth.
> The kid who tries to run Blender on a Chromebook doesn't learn that his machine can't handle it. He learns that Google decided he's not allowed to.
Or they learn to enable developer mode, unlock the bootloader, and install Linux, or use the officially supported Crostini, or so on. There's like 3 different ways to run Linux desktop apps on a modern Chromebook.
The MacBooks don't have an officially supported path to unlocking the bootloader (edit: yes, I'm aware of Asahi Linux, which lives on the edge of what Apple allows) and installing your own OS. The Chromebooks do. I don't think that comparison plays as favorably as you think.
The same Asahi developers also wrote about how Apple didn't document anything and, especially, how Apple never talked about this in public. Apple being Apple, if they had cared for a single second about this, they would have called it Bootcamp 2.
Honestly I'm pretty convinced that this "open" bootloader was just there to avoid criticism and bad press from specialized outlets when they presented the M1, because, for once, they needed specialized outlets to benchmark the M1 performance and not have anything bad to say about anything else.
They constantly break everything year after year without documenting any change, which effectively makes Asahi unusable on anything recent.
I'm betting that they are just patiently waiting for Asahi to die by being several years too late (which is already the case), then announcing "The most secure Mac ever," silently releasing with a closed bootloader when nobody, especially the press, will care anymore.
Don't get me wrong, I love Asahi and I even have it installed on my M2 Air; the project is doing incredible quality work. But I don't believe it will last long. Hope I'm wrong, though.
For them to call it Bootcamp 2 (a "product" per se), they'd have had to have another OS they could actually demo installing. Otherwise "Bootcamp 2" is just a mysterious empty chooser window.
But at the time there was nothing, because Apple Silicon wasn't a platform anyone but them was targeting, because they had just created it.
So they built the infrastructure, and then waited for someone to actually start taking advantage of it, before bothering to acknowledge it.
And because that "someone" isn't a bigcorp (i.e. Microsoft) wanting to do a co-marketing push, but just FOSS people gradually building something but never quite "launching" a 1.0 of it, Apple just "acknowledged" it quietly, at developer conferences, exposing it only via developer-centric CLI tooling, rather than with the sort of polished UI experience they would need if Microsoft was trying to convince Joe Excel User to dual-boot Windows on their Apple Silicon MBP.
> announce "The most secure Mac ever" silently releasing with closed bootloader
That's extremely unlikely to happen, as Apple's hardware and OS developers build Macs and macOS (and all the other hardware + OSes) using Macs and macOS. And those engineers (and engineers working at Apple's hardware and accessory manufacturing partners) will always need to be able to diddle around with the kernel and extensions "in anger" without needing to go through a three-day-turnaround code-signing process.
There's a whole proprietary, distributed kernel development and QC flow for macOS, that looks a lot like the Linux one (i.e. with all the same bigcorps involved making sure their stuff works), but all happening behind closed doors. But all the same stuff still needs to happen regardless, to ensure that buggy drivers don't ship. Thus macOS kernel development mode being just one reboot-and-toggle away.
> And because that "someone" isn't a bigcorp (i.e. Microsoft) wanting to do a co-marketing push, but just FOSS people gradually building something but never quite "launching" a 1.0 of it, Apple just "acknowledged" it quietly, at developer conferences, exposing it only via developer-centric CLI tooling, rather than with the sort of polished UI experience they would need if Microsoft was trying to convince Joe Excel User to dual-boot Windows on their Apple Silicon MBP.
It's also important to remember that Microsoft was in the middle of their Qualcomm exclusivity deal at the time of the M1's release, and thus Windows for ARM wasn't available on anything other than a few select devices or unofficial use of Insider builds.
That deal didn't actually expire until 2024[1], at which point Windows for ARM finally started to be sold in an official capacity with stable builds widely available.
It's entirely possible, though unconfirmed, that Apple was intentionally leaving the door open for "Boot Camp 2", and Microsoft simply never took them up on the offer, either because they were stuck in a deal made prior to the M1's release that prevented it, or because they no longer saw a financial benefit to being able to sell Windows to Mac users (possibly since Windows license sales are effectively a rounding error to Microsoft at this point; they make way more off of subscription services and/or Office, all of which are already available on macOS without having to dual-boot Windows).
> possibly since Windows license sales are effectively a rounding error to Microsoft at this point; they make way more off of subscription services and/or Office, all of which are already available on macOS without having to dual-boot Windows
AFAICT, the way Microsoft wants things to work is that "Windows" is the native fat-client platform / SDK that ISVs are supposed to use/target when building fat-client apps that interact with (i.e. generate spend on) Azure-based backend systems. The #1 way Microsoft makes money at this point isn't from direct consumer or even volume-licensed subscriptions; it's from providing paid backend infra to dev shops who had long since locked themselves into the Microsoft/Windows development ecosystem, and who therefore saw Azure as the only valid cloud backend to integrate with when "cloud-enabling" their software (and/or, where the compliance story of integrating their previously native-and-local-syncing software with Azure was 100x simpler than integrating with any other cloud, due to Azure+Windows being able to act as a trusted principal-agent pair that can enforce policy-based security via a shared "cloud domain" identity [Entra ID] baked right into the OS ACL layer).
Until recently, though, Microsoft thought of the Windows "platform" the same way Apple thinks of the Mac "platform": that "Windows"-the-platform-SDK was the same thing as Windows-the-OS. Which necessarily meant that consumers must be pushed with all conceivable effort toward using Windows-the-OS on their machines, so that these dev shops who had targeted Windows-the-SDK could reach them with their software (so that those dev shops would in turn spend more on Azure.)
But I think this equivalence is going away!
From what I've seen of discussions in various Microsoft-aligned sources recently, it feels to me like some part of what Windows 12 may mean by calling itself a "modular OS", is that Microsoft may be establishing some kind of very clean boundary layer between Windows-the-OS and Windows-the-platform/SDK.
---
What would that look like? I don't know for sure, but here's some spitballing:
Well, picture Mono, but as a complete UWP projection, shipping with all the native libraries that are built into Windows.
Or, if you'd prefer, picture Wine/Proton; but rather than black-box-reverse-engineered equivalents to Windows DLLs, it is all the DLLs that come with Windows. Except, now rebuilt from the ground up so that they compile against NTOS or Mach or Linux syscalls.
Basically, "the complete Windows platform" as a JVM-like runtime you "get for free" when installing Windows-the-OS, but can install on top of macOS and Linux. (Probably in various runtime profiles, as with Embedded vs Server vs Desktop JREs. You don't need D3D on your server.)
This would be likely to take 100% of the wind out of the sails of the Wine/Proton projects overnight. And maybe kill Mono itself, too. After all, why bother with half-assed third-party implementations of the Windows Platform, when you can just install the "real" Windows Platform, and get guaranteed bug-for-bug compatibility with existing Windows software (relying on the same databases of app shims and fixes Windows-the-OS has shipped with for ~forever)?
SteamOS would be reduced to "a Linux distro that preinstalls the Windows Platform." ReactOS might or might not (depends on stubbornness) be reduced to "a clean-room-implemented NTOSKRNL-compatible base OS, that preinstalls the Windows target of the Windows Platform."
Wine/Proton themselves would, if they even bothered to keep going, end up rebranded as "an alternative ground-up Windows Platform runtime." (If the official "Windows Platform runtime" was then open-sourced, then likely Wine/Proton would fully fade into obscurity, as anyone who wanted to maintain their own libre Windows Platform runtime would just start by forking Microsoft's. Very similar to the situation with OpenJDK.)
---
In any case, regardless of how they do it, any move in this direction would make it blindingly obvious why Microsoft wouldn't care about enabling something like a "Boot Camp 2" feature on Macs any more: they no longer care if you install Windows-the-OS on a Mac; they rather want you to install the Windows Platform runtime under macOS. And then they'll have you as a consumer of Windows Platform products all the same.
(Actually, even better, as they'll have you far more of the time. In the Boot Camp business strategy, any time you spend booted into macOS puts you out of Microsoft's reach, save for the few first-party apps Microsoft has ported to macOS + sells on the App Store. In the Windows Platform business strategy, meanwhile, you can be running arbitrary Windows Platform apps on your Mac 100% of the time you're using it!)
Switching to developer mode is very likely something he won't be doing nor allowed to do on the Chromebook his parents bought him or the school assigned him.
Some kids undoubtedly get there, exactly as you say. That's not at all the same experience as opening a device that has a MUCH bigger sandbox to begin with and lets them start exploring with boundless applications from the beginning.
The bootloader kids get my deep respect. I think I'd rather give my kid a Neo to begin with.
Asahi Linux effectively only supports the M1 and M2 chips, so even a modern MacBook Air won't work, and even on "supported devices" you can't use Thunderbolt or a USB-C display yet.
These use an A-series chip, and even supporting new M-chip revisions has been enough of an undertaking that I wouldn't really expect these to get Asahi Linux anytime soon.
And Apple can lock down the bootloader to be closer to the iPad/iPhone at any time with no notice, and based on their past actions, it would be quite in line with their character to do so.
The Asahi folks have also demonstrated M3 support, though without GPU-accelerated graphics (the M3 GPU is very different from the M1 or M2). Much of the effort is currently on getting the existing components upstream.
I'm confused. Isn't coreutils just a small subset of even macOS's current zsh builtins? What do you prefer about systemd over launchd? defaults seems like a convenient way to manage settings. Is it confusing for people coming from other operating systems?
`grep -P` is kinda annoying. GNU grep has Perl-compatible regexes, and BSD grep does not. You're reaching for `perl` or installing `ggrep` the moment you need a lookbehind.
It is made by Apple, yes. It's not very bad; it even has big-font support inherited from the VT100 series, and a lot of style settings in the menu bar. It's not iTerm2, but it's way better than what Windows offers (not just the console; the newer Windows Terminal isn't as good either, IMO).
Asahi only supports M1 and M2 series Macs currently. The Neo uses an A18 Pro, which was only ever in an iPhone before. I wouldn't count on Asahi coming to these soon.
Reverse engineered documentation is very often preferable to the internal kind, which is not necessarily accurate. So either way, the Asahi folks are doing valuable work.
Maybe some of these agentic AI superstars can point their 100x engineering chops at this. That would impress me, but I'm not going to hold my breath.
I really don't get this comment section. You get a Macbook then you have a perfectly usable machine which will run all the mainstream software you ask of it, and then you get natively compiled well supported developer tooling, no VM required. The best argument for Chromebooks is that you can throw away ChromeOS and install Linux or use Linux in a VM. These are not even close to the same.
I think folks want to hate Apple more than they want to admit that Chromebooks kinda suck.
There's an entire Linux distro (Asahi) for MacBooks. Apple has never released a Mac with a locked bootloader.
And macOS frankly provides a far better Unix experience than ChromeOS, in my experience, having actually used both (including for development, though only for a short time on ChromeOS because it was horrible).
Apple did not lock the bootloader, but they do not provide documentation for their products.
What would have been a trivial porting work with documentation, becomes extremely time-consuming and hard work without documentation.
That is why Asahi Linux lags by several years with the support for Apple computers, and it is unlikely that this lag time will ever be reduced. Even for the old Apple computers the hardware support is only partial, so such computers are never as useful for running Linux as AMD/Intel based computers.
> Or they learn to enable developer mode, unlock the bootloader, and install Linux, or use the officially supported Crostini, or so on. There's like 3 different ways to run Linux desktop apps on a modern Chromebook.
Oh so all our hypothetical child has to do to discover what computers can actually do is completely rebuild one's software from scratch with no prior knowledge.
Next you'll tell me F1 drivers in their teens just have to LS swap a Saturn SC2 and book time at a track.
I used to be the cool tech guy in school because I memorized the tutorial to jailbreak iPhone or to cheat in games with a memory editor. You know, stuff like "when you see this screen, click that icon", "find row 5 and change the second value to 0", or "open terminal, copy paste this command and hit enter". I don't think I learned anything useful from those.
You learned that such things are even possible, and you learned that other people saw you as the cool tech guy just because you took time to memorise that stuff.
Well, sure. Maybe you're the kid in the article who opened Xcode and Blender and Final Cut, but it didn't click for you. Of course not everything is for everyone, but it doesn't prove exploring the limits like that is a bad thing.
And these days, you can ask your favourite LLM for step by step advice, and you can even give it shaky phone camera shots of the error message on your screen.
I've been using computers since 1991 (I'm 42, from 1984), and to be honest this stuff is getting harder and more confusing, not easier. Mostly because it keeps changing, and not based on any logic towards improvement. Sure I'm good at getting my questions and problems solved now, especially with AI, but I don't believe I have the ingrained mastery I felt after a while with computers in the 90s.
?? I installed Omarchy on an old MBP simply by inserting the USB stick into a USB port and holding a key combo during boot. Didn't have to unlock anything.
> He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called "Projects" with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes. He is going to open GarageBand and make something that is not a song. He is going to take screenshots of fonts he likes and put them in a folder called "cool fonts" and not know why.
"Mrs. Jonson, the results are back. Your son has autism."
Autism is a quite strong diagnosis at the end of a spectrum. Not every tech-loving introvert is autistic. That's the kind of arrogant attitude that marginalizes nerds and on a forum like HN people really ought to know better.
As somebody who identified a lot with what's in that article I can say that I haven't just made peace with having been "different" but I love it and wouldn't want it any different comparing my life today with that of the arrogant non-nerds who made fun of us back in school.
The colloquial use of the word "autism" carries with it a specific connotation and mind image. That primarily negative stereotype is being reinforced by the joke by way of it being delivered as a medical diagnosis ("the results are back").
Your parent comment is arguing against perpetuating the wrong negative connotations and lack of understanding of autism.
Not to say the original author was doing it maliciously, I don't think they were.
(yes, I know we're all on the spectrum somewhere, but a diagnosis is defined as it impacting your life severely, and I think many people would say that I have many autistic traits that are negative).
I mean i did this sort of thing a ton and am not on the spectrum. Idk when being "quirky" meant you are autistic, seems like it was within the last 7-10 years though.
> He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes.
This hits home. Not because I did it as a kid; I'm a bit old for that. But because I've done this exact thing two or three times. You stare and know, just know, that somewhere in this byzantine interface there is the raw power to do lots of cool 3D stuff. But damn. It's quite an interface.
> That is not a bug in how he's using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
Yeah. For me it was an old, beat-up 286 that I couldn't get anyone to upgrade and and loving devotion to MS-DOS, old EGA Sierra games, TSR programs, TUIs, GeoWorks, and just not being able to get enough of it.
When I finally saved up enough to buy a 486 motherboard, I installed Linux because it seemed cool (and was cool) and never looked back. But that 286 sparked my obsession with computers that has influenced almost every aspect of my life.
I still love to revive old hardware and push it beyond its limits. Mostly because I think it's fun, but also because it's dirt cheap or free. Back then I made an old GPS system play Monkey Island, or MP3s, or read e-books. Reinstalled lots of old Android phones and tablets. Made photo frames out of them. Made webcams out of them. Transformed old laptops into Chromebooks. Made lots of old NAS devices work again. Stuff like that.
I got a cracked copy of 3ds Max at a LAN party (back when it was "Discreet 3dsmax"), and immediately dragged dozens of cubes, spheres, and cones into the scene.
Then I closed it for a year. Opened it up again one day, followed a box-modeling tutorial (from the documentation PDF linked in the Help menu!), and I was hooked. Spline modeling, rigging, walk cycles, texturing, lighting experiments, every spare minute for the rest of high school.
I still remember the whole-body panic of accidentally turning on "adaptive degradation", which replaced all meshes with their bounding cubes when rotating the viewport camera, and thinking I had broken my video card.
This article is a strange combination of defending the macbook neo from stupid attacks, and making similarly stupid attacks on the chromebook, with no self-awareness (unless there's some level of irony I'm missing here, which, come to think of it, might well be the case).
Chromebooks have a Linux VM where you can install anything, including GUI apps, and doing that is much more straightforward than installing something from the web on a Mac. Download, right-click, install on Linux. No scary warnings. No need to go to System Settings.
I think this is entirely fair - this laptop imo could satisfy 90+% of Apple's userbase (including me, I used to do dev work on my M1 Air I bought out of curiosity) - it's a fully featured computer, that's incredibly well made, to the point that I don't think anything compares favorably in the Windows land at any price point. Likewise you'd have to go for the premium segment if you wanted to get a machine with similar single-thread perf (which is the perf that matters the most for most people)
It's not locked down in any way, this is a fully featured machine, unlike Chromebooks, which in my experience don't cost a cent less than equivalently specced Windows laptops. Due to this I never considered buying those.
I'm not even a Mac fan, I just think they make nice computers and their OS has the least amount of downsides atm.
This is why I skip tech youtube videos with MacOS. The users have the reality distortion field.
I always wonder what the world would be like in a battle between Google and M$ rather than M$ and Apple. Obviously less advertisement, more focus on function and less form.
As a kid, I grew up before laptops became hugely popular, so instinctively my thought was, of course a MacBook (indeed any MacBook) is not the computer for a kid; the Mac mini or an iMac is. The author started with a 2006 Core 2 Duo iMac in the kitchen and he should know this. The simplest and easiest parental supervision that doesn't involve any software is to have a desktop machine in the living room, within the peripheral vision of parents. Watching some videos that you shouldn't be watching? Dad comes and tells you why not. Want to bring the computer to your bedroom and eschew sleep? Physically impossible. Playing a game for an hour? Dad comes and tells you it's enough. But learning about formulas in Excel? He comes and offers to answer your questions.
The Neo seems kind of nice but I don't really see how it's more significant than "a nice low end computer." The article reads like its fire from Olympus but a nicer screen and trackpad is only incrementally better than what was available in Chromebooks and cheap PCs.
Personally I think the Steam Machine will have a better chance to sneak a general computing device into the home of someone not looking for one. The Neo gives me hope on the price point.
For the past decade or so, many children had no access to real computers. Before covid, many households either only had school-issued chromebooks, or only smartphones. With covid causing a rise in remote schooling, many families got laptops, but again often only locked-down chromebooks.
There are adults nowadays who do their taxes on their phone, cut videos on their phone, and edit spreadsheets on their phone.
And while smartphones and chromebooks are great at accomplishing your desired tasks, they offer no opportunities for growth. You can't change and play around with the system, become a power user, modify your system, look behind the curtain, and gain real understanding.
It's an interesting paradox: the more we made computing accessible, the less we got out of it.
When a PC was expected to boot to an OS and not much else, we had all the freedom - by necessity - to tinker and learn. Hardware was barely enough for most day-to-day usage, so we upgraded relatively frequently and got to know the physical innards as well.
This is all so streamlined today that even computers can be smartphones with "apps", or even just a browser that gets you to google slides and everything else (or the MS equivalents). It was probably a necessity that, as computers became infrastructure, they would become simplified, so 90% of the population can indeed file their tax return online (and the remaining 10% have their younger family members do it).
This also means that people nowadays simply don't know that they can walk into any second hand store and get a $200 PC with a warranty that'll be much more productive than any smartphone if they have the knowledge to use it properly. But was there really a loss? These are, for the most part, people that would not have been able to hop on the internet wagon if it'd relied on maintaining a linux distro at all. That's regarding adults; children now do indeed grow up with walled systems for the most part, and that might be a loss.
I didn't really read it as a specific advert for this computer, but rather a nostalgic defense of cheap starter PCs in general. It gave me some hope for the future.
I'd say the low end is closer to a Raspberry Pi or perhaps a used old Thinkpad. A $600 machine with good single-core performance is only low end if you ignore everything outside the Apple product lines.
> trackpad is only incrementally better than what was available in Chromebooks and cheap PCs
Did you use a touchpad of an old cheap PC? Apple would not dare to use one comparable to that in their wildest nightmares.
One also has to consider that Apple remains an "aspirational" kind of computer. The things bemoaned by HNers due to Apple having something of the status of a luxury brand delivering a premium computing experience are also desirable to huge numbers of people in the world seeking to improve their status and lot in life. It's very easy for us in the west to overlook that there are a couple of billion people in the world earning $300-400 a month. So there are a billion kids out there who would perhaps be lusting after this machine instead of struggling along with a very recycled and half-decrepit laptop. There are also huge numbers of people in the west who live paycheck to paycheck, so having an actual machine at this price point that will deliver years of faultless computing will probably make a big difference. At least I get the aspirational tone the OP is arguing for - a kid completely learning the edges and maxing out their machine will likely produce better results and better educational outcomes than one given a top-of-the-range MBP or Windows desktop supercomputer.
The build quality and usability on Mac laptops is something else. I've yet to see even 2k€+ laptops that people typically get for their jobs that aren't a pain to use without a mouse and monitor. Whereas I'm sitting here in front of my MacBook and not touching the mouse next to it most of the time.
That's definitely valuable, but not for a child in my opinion, it's the type of luxury equivalent to a Mercedes over a Renault. Perfectly defensible but, just like a Mercedes is hardly a starter car, I don't think an MBP is that fit for a starter PC. It's also mostly useless if you're not traveling for work regularly.
That said, does any of that even matter any more? People were learning Blender, programming and whatever else 15 years ago on low to mid range machines already. The equivalently priced - or dirt cheap second hand - machines of today are multiple times more capable at everything. Stick Linux and a $5 mouse in it and you're 90% of the way to a macbook pro in terms of user experience.
That's to say, I agree with the core of the article: kids will make the most out of the least. But I disagree that this particular laptop is a necessity or a boon for that. If anything, it's a hindrance for being a mac.
1. This is the most optimistic, inspirational thing I've read in months :)
2. Are there kids like that still?? I would love to think so. None of the kids in my circle of parents are. There is one teenager who's going into computer science because they are smart and love math, which is great, but they never built or explored or been curious about anything on their computer as far as I can tell. There is a big ecosystem of wish fulfilment and instant gratification, and I think (right) limitations like the author insinuated are part of the allure.
To 2., yes - you just have to look in the right places. You're sure to find them in middle/high school or university robotics teams, for example.
When computing was niche, you really only got into it if you had a real interest in it (I assume - I wasn't around back then). That's changed, but it doesn't mean that that category doesn't exist anymore. If anything, it's probably way larger in absolute terms, if a smaller proportion of the people who work on software in general.
I thought it was inspirational as well. When he described making a folder called "Projects" it reminded me of that feeling I had after graduating from a TI-99/4A to a PC. The possibilities were endless.
Different computers definitely give me different feelings when I use them. Some inspire creativity and the desire to make something meaningful or beautiful, others feel like machines made for work or for play. All of the boundaries are fuzzy but, for me, computers definitely have an emotional valence.
> A Chromebook's ceiling is made of web browser, and the things you run into are not the edges of computing but the edges of a product category designed to save you from yourself. The kid who tries to run Blender on a Chromebook doesn't learn that his machine can't handle it. He learns that Google decided he's not allowed to.
As someone who lived on a Chromebook for fun because it was a cheap way to get a browser machine that also had Linux access, I don't really get this. You can run Blender on a Chromebook as soon as you turn on the Linux container. It will run even better if you install Linux on it after a quick firmware flash.
If it's locked down by a school, that's not really the Chromebook's fault; schools are gonna lock down MacBook Neos via management policy the exact same way.
He's right. I built a hackintosh from a PowerMac G4 motherboard I bought off of eBay with my saved-up babysitting money when I was 12 or 13 because I was absolutely desperate to have a machine I could edit movies with, I couldn't afford a real Mac, and I read on the internet somewhere that this was the cheapest way to get one. I knew lots of older brothers who were "into" computers (all of them for gaming) that thought I was an idiot, because building my own Mac made everything ten times harder. I didn't care. I was obsessed.
This is a $599 computer with purpose-built architecture for (barely) running (small, underpowered, near-useless) LLMs. There are children saving pennies for this machine that will do great, horrifying, dangerous things with these computers. I can't wait to see the results.
I remember this period of my own life. I had taken over my father's old 486 and spent my days and evenings trying to learn the basics of programming in C. I was making silly text based games, dreaming I'd one day be creating the game of my dreams. I also modded games by opening every content file and trying to figure out what they did and how I could modify them. I was still years from realizing game development was a career and not just a hobby.
I had replaced all the Windows sounds and cursors to customize the system so it looked and sounded like a Sci Fi system. I even patched the boot screen to be a humorous screen of "MS Broken Windows". It also was quite broken from messing with system files I didn't understand.
Because most people don't know that the boot screen and even the shut-down ("Safe to Shut Down Windows") screens were simple BMPs, they got shit scared when you "hacked" the computer to show different messages/pictures. (Always back up and keep a renamed copy of the BMP, just in case.)
I second that! This is also how I feel about Raspberry Pis. There's so much they can't do, and yet in a way they can do everything. It's not the power of the machine, its about how much control you have or how close you can get to the metal. At least that way you learn about why you need more powerful hardware.
It's one click to set up a Debian environment on a Chromebook. Same on an Android phone. You can learn plenty from that. Once you've learned the limits of what you can learn within that environment, it's not difficult to then unlock the bootloader and learn even more.
To be honest, anytime I see someone recommending how easy it is to install Debian I always feel like they're some relic from the nineties. Kids likely won't follow any advice starting with "install Debian".
They will if they are at all "technically curious" and bump-up against the limits of "ChromeOS" and running software they want to execute. A couple quick searches will find them some instructions, and boom - after a week or so, they are running Debian, their own Minecraft server, Blender (poorly), or whatever had prompted them to look for alternatives.
Never underestimate the time investment and frugality of a "technically curious" young person... Myself, I would have been a happy end-user, loading/playing games, running software - except, I bought a cheap modem - with physical IRQ jumpers - and no documentation - and its default jumper settings conflicted with my mouse in Windows. If it hadn't been for that cheap/frugal purchase and then having to invest the time to troubleshoot, I wouldn't have become "technical" and moved on to greater and greater challenges and learning experiences. Most people would have just returned it and got an external modem instead, or given up on even the possibility of connecting to BBSs...
What is fundamentally different from the late 80's/early 90's, is now there is a tremendous wealth of knowledge on the internet to actually facilitate that troubleshooting type of learning activity. Is that better? Well - there will always be a "known solution", but what I find many people do now, is follow whatever the first set of instructions they find, treating them like a "magical spell", without knowing/learning "why/how"... [And if the first set of instructions doesn't work, the majority just "give up"]
Overall - in my experience, the percentage of people who are truly "technically curious" is about the same as it ever was - single-digits... It ultimately depends on whether or not their interests/passions/blockers align with being forced to go "beyond" their comfort zone.
Yeah, that really resonated; the author captured something about the way kids explore.
It brought back memories of when I first started using a Unix time share at university, and exhaustively read all the man pages. Didn't know why, just wanted to discover everything.
> They have very little interest in what you might become because of one.
Love the spirit of the post.
As a high school dropout, with a GED, I've spent my entire adult life looking up noses. I chose a career jammed to bursting with sheepskins, because I really enjoy doing tech. Not because I wanted to make money, or because I wanted to be a big shot.
My first ever program was in the 1970s, some time. It was a Heathkit programmable calculator. My first ever "serious" program was machine code, typed into a 6800-based STD card, nailed to a piece of wood, with a hex keypad and an 8-digit LED display. My first personal computer was a VIC-20, with 3KB of RAM. My first Apple computer was a Mac Plus, with 4MB of RAM and an external 20MB SCSI hard drive.
Learning on limited resources helps us to become frugal and efficient. It also helps us to become tough as hell. Some of the best engineers I ever worked with, had rough backgrounds.
These days, I use a pretty maxed-out Mini, and an LG ultrawide screen. I'm spoiled.
> He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it
10 year old me identifies with this so much.
I managed to get the computer to display 256 colours instead of the 16 it had been set up with. Everyone was impressed and this meant I was now allowed to take the computer apart and put it back together again without anyone being scared.
> He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called "Projects" with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes.
>Somewhere a kid is saving up for this. He has read every review. Watched the introduction video four or five times. Looked up every spec, every benchmark, every footnote. He has probably walked into an Apple Store and interrogated an employee about it ad nauseam. He knows the consensus. He knows it's probably not the right tool for everything he wants to do.
Anywhere in the world, the kind of kid that does all this and installs Blender on it is WAY more likely to save up for any janky terrible half-working PC laptop with a bit better specs (memory in particular), or a desktop computer if possible, because A. games B. Linux C. piracy and more software D. he does not care about it being Apple or "just working", in the words of the author himself. I don't know how the US kids in particular feel about this since the reality distortion field is so strong, but anywhere else it's like this.
Depends on the kid's hobby and purpose of the computer. I've always known that MacBooks are better for music making (especially driver hassle, audio signal reliability, etc.). Could I afford a MacBook as my first laptop? No. Did I buy a second-hand MacBook as soon as I could afford it, and have been happy with it since? Yes. As a teenager, I would've loved to buy a new MacBook for 500/600€.
I'm sure most other applications are less Mac-optimized, though (software development, 3D/graphics editing, gaming, ...).
That's very true, and it's a great tragedy. Kids get skilled in troubleshooting awful computers instead of getting skilled in creative things which actually have future value.
As for piracy, it's just as easy on a Mac, and MacOS has more quality software than any other platform - unless you're talking about software used in factories and such.
But how many kids actually "save up" for a computer vs being given one by their parents or getting a hand-me-down from relatives? I would suspect that many parents would be more than happy to buy a Macbook for their kid if they showed that kind of interest.
I think you're making it sound like they're forced to build Linux from scratch while walking knee deep in the snow, uphill both ways. That's way too detached from reality and certainly it's not a "tragedy". Kids are not doing specialized things that only have future value, they tinker with everything. Usually on what's available, yes.
>and MacOS has more quality software than any other platform
This is simply untrue, and not something a tinkerer cares about on a general-purpose machine anyway (with my niece and son as n=2).
Windows has also always required a lot of tinkering and troubleshooting to make things work in a pleasant way.
But for Linux, the creative software simply isn't there in many cases for a kid to start learning. Unless it's programming, which is not everybody's talent.
A kid tinkering with any kind of creative software learns and absorbs important skills which they can build on later if they want to. These things are much more valuable than system troubleshooting or becoming skilled in a game.
For young people out there, it is better to build a desktop rig instead of a laptop. You can't change much in a laptop, apart from some legacy laptops, and definitely not a Mac. Parents should help their kids build a 16GB Linux rig. It's going to be more expensive than $599 nowadays due to the price of everything, but still, it is very expandable, and the kid can earn $$ and decide which parts he wants to upgrade.
BTW a laptop is definitely the only choice if the kid wants mobility, though.
Sometimes I feel privileged for being in the generation that learnt to program BASIC on a C64 when it was the coolest thing around at the time. Being that much closer to the metal is a whole different experience of learning what a computer is and can do.
Is that even possible now? Probably not. Years ago I tried to get my kids interested in playing with their own Raspberry Pi when they came out, that they could do whatever they wanted with on the side, to little effect. Not even the idea of setting one up as their own Minecraft server (they were heavily into it at the time) piqued their interest. Oh well.
Most children of every generation don't care about those things. Most of the few who cared about the C64 just used it to play games. You're in the minority who got interested in the C64, and the minority within that minority who was also interested in BASIC. It's good you tried with your kids, but the odds were against you.
Meanwhile, some other kid in your area probably got scolded for installing F-Droid. Oh well...
I totally admire Raspberry Pi and their attempt to give kids a gateway into cheap computing - made by the generation who started on those BASIC machines. But I've always found it to be a radically different experience on a Raspberry Pi, given it boots into a full desktop and has endless things to do, compared to the empty, terror-filled void that is a blinking BASIC cursor with nothing else on the screen except some arcane copyright message. Loading a game from tape and experiencing the 5-minute cacophony of that noise was also a surreal and tedious experience for the nippers of the 1980s. It made you really want it in a way that machines since can't deliver.
Plenty of great tools for kids to start making games with if they're interested in it! Personally, I think running something on a Raspberry Pi isn't very interesting or inviting as a first thing to play with. Creating a game in Roblox, designing an outfit in Roblox, or building a game within Minecraft is more interesting. And people build crazy stuff in Scratch.
But also, not every kid is interested in that anyway.
I tried to learn to program as a kid too. It didn't take, couldn't get past the hello world/simple program stage interest wise. I just wanted to go right to making games. Closest I got was messing with configs and skins and some map making. Took until later in college when I started programming "for real."
Talking about staring at interfaces: I got my first Pentium computer when I was 7, in a village in Pakistan. I spent all day fooling around and accidentally stumbled upon QuickBASIC. Having nothing to do, I learnt how to code, because the help menu listed all the commands and the interpreter gave errors when I did something wrong.
With a clear feedback loop and the insane motivation of a child, I learnt to make games/software in BASIC, which ended up defining my life.
Sometimes we overthink it, all a child needs is a safe environment to fool around and letting them be obsessed about things.
> This computer is for the kid who doesn't have a margin to optimize. Who can't wait for the right tool to materialize. Who is going to take what's available and push it until it breaks and learn something permanent from the breaking.
That kid will be much better off with a used laptop and Linux or BSD.
I started college with a white G3 iBook. By the end of freshman year I had installed Yellow Dog Linux, then Suse, Mandriva and eventually Gentoo.
Now, 20+ years later all my home computers are running Linux (Debian though), and my kids grew up using Linux.
But I'm going to send my teenager to college with Windows or a Mac. They're going to be 1200 miles away, and they're going to need to get support for their computer and I won't be there.
Yes, I like Linux 1000x better than Windows or Mac, but Linux demands a different relationship with the admin. This kid hasn't wanted that relationship with tech, and will rely on friends to help get Office or Zoom or whatever installed.
I'm still deciding between Mac and Windows now. I'll probably end up getting a quality used business laptop from FB marketplace, but the Neo is interesting too.
Cheap computers with hardware constraints have been around for decades. Now Apple ships one with pretty damn good performance, and they've invented "cheap computers with hardware constraints." HA!
My first computer was a Commodore 64 I found in a pile of trash a few years after they came out. My first PC was a 33MHz Cyrix machine I bought off my first college roommate. Now there are some real hardware constraints!
But yeah, necessity is the mother of invention. No doubt about it. Just not seeing how a $600 polished and performant laptop fits that bill ;)
The kid's parents want to be able to monitor their kid. The kid's parents want to be able to drag the machine to a local store and have the people there fix it.
The kid's parents - and the kid - all have iPhones, so it's familiar.
The kid's school requires Windows or Mac for their WiFi and won't let the kid use Linux because they don't trust it.
There's plenty of reasons why Linux isn't the answer in the current climate.
Depends very much on whether the kid's interests lean toward doing computer stuff, or doing stuff with computers.
And they can do the former in a VM anyway. Install Linux, or a BSD, and go. With the bonus that you can experiment fearlessly because you've got snapshots and the worst-case for experimentation still leaves you with an entirely functioning laptop. Or use a cheap VPS, remotely.
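The "experiment fearlessly because you've got snapshots" workflow can be sketched concretely. This is just an illustration, assuming a qcow2 disk image managed with qemu-img; the image path and snapshot name below are made up for the example.

```python
# Sketch of the snapshot-before-you-break-it workflow, using qemu-img
# internal snapshots on a qcow2 VM disk. Paths/names are hypothetical.

def snapshot_cmd(disk, action, name=None):
    """Build a qemu-img snapshot command line.

    action: 'create', 'apply', 'delete', or 'list'.
    """
    flags = {"create": "-c", "apply": "-a", "delete": "-d", "list": "-l"}
    cmd = ["qemu-img", "snapshot", flags[action]]
    if action != "list":
        if name is None:
            raise ValueError("snapshot name required")
        cmd.append(name)
    return cmd + [disk]

# Checkpoint before a risky experiment; roll back if it breaks:
before = snapshot_cmd("linux-lab.qcow2", "create", "pre-experiment")
rollback = snapshot_cmd("linux-lab.qcow2", "apply", "pre-experiment")
```

Run the `before` command, break things with abandon inside the guest, and the `rollback` command restores the disk to the checkpoint - the worst case really is a fully working laptop.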
Most schools don't let you use chargers due to fire and tripping hazards. The MacBook's strength is that you can use it on battery for the entire day. Most alternatives fail at this.
They would still be better purchasing a used MacBook as you can find those at similar prices and they (assuming they aren't Intel) will have the long battery life and more.
Very much so if you don't care about gaming. x86-64 emulation has already been great, and 99% of popular apps have native ARM64 versions. The only exception for me, for a long time, was Discord. I used to use an unofficial wrapper called "Legcord" instead. But now even Discord has a native ARM64 Windows build. I mostly use my laptop for software development + browsing.
I haven't tried gaming, but I feel like it'll suck for almost anything that's not natively ARM64. Steam doesn't have an ARM64 based client yet, AFAIK.
Your question is essentially "why do Electron apps exist?" and the full answer would be quite long.
The most important one is that an app's lifecycle can differ from the web browser's. You don't always keep a web browser open, but you might want to keep Discord open regardless of what you do with the browser. That kind of lifecycle management can be tedious and frustrating for a regular user.
Discord's electron app has many features that its web app doesn't such as "Minimize to system tray", "Run at startup", "Game/media detection", "In-game overlays" etc.
Even PWAs can't have most of these features, so that's why we have to deploy an entire browser suite per app nowadays.
I grew up tinkering with linux since 2.0.x days up until 2.4.x lamenting on how Linux needs to take over Microsoft. Then sudden (legal) blindness hit, and computers were becoming difficult to use. Took a punt on a powerbook 12" at the time, and learned the accessibility features and never looked back. I do 90% of things in a terminal on a mac today, but Windows / Linux is the real limited OS for me due to lack of accessibility features rendering them unusable.
Don't downvote; it certainly did for me. My first computer was a 2009 MBP 13-inch, as I was Apple-obsessed like the person in the parent article. Time passed and I realized I really didn't like either Windows or Mac, and for the past 10 years I've been Linux-only. It really does happen, even if rarely.
Good on you for rising up to the ranks of Linux/BSD.
You just need to recognize that not everybody aspires to be competent with lower-levels of hardware and software that Apple makes that much more difficult. Most Apple users are content to use apps written by others and that is as far as their interest goes.
An analogy is the car market. Most people don't care how the car works, etc. They just want to get to places. If you only need to drive to the shops and do minimal errands, you don't even need a truck - a sedan will do just fine. Same with computers, lots of different market segments with distinct needs and expectations.
> You just need to recognize that not everybody aspires to be competent with lower-levels of hardware and software
You don't really need that to use Linux.
People should stop copy/pasting urban myths or stories from the late 90's. We are in 2026, and one can perfectly well buy a laptop with Linux preinstalled and full support, then just find the apps they need in an "app store", which in this case is just a frontend for Flatpak and the package manager. Picking up an app from GNOME Software is no different from installing an app from the Play/Apple/Microsoft store.
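The "app store is just a frontend" point can be made concrete: a one-click install in a software center corresponds to a single flatpak command underneath. A minimal sketch; the app ID below (Blender's on Flathub) is just an illustrative example.

```python
# What a Software-center install roughly reduces to under the hood:
# a single `flatpak install` against a configured remote.

def install_cmd(app_id, remote="flathub"):
    """Build the CLI equivalent of a one-click app-store install."""
    return ["flatpak", "install", "-y", remote, app_id]

cmd = install_cmd("org.blender.Blender")
# On a machine with flatpak set up, run it with:
#   subprocess.run(cmd, check=True)
```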
Yep, everyone has their preference. A lot of us have done both. I've run multiple distros. I've played with low-level software. I have used and continue to use open source tools in places.
And I prefer my Mac to this day as my main machine.
Consumer user or Linux hacker is a false dichotomy people sometimes like to try to slot people into (not accusing you GianFabien).
My first computer was a Compaq my parents got during that peak era of home PC mass adoption in the late 90s. I immediately played a ton of games, got on AOL, learned VBScript, C++, HTML, etc.
This was such a natural and common thing that I never even questioned if others were having a different experience with computers. This sounds crazy now, but it felt as if everyone was either going to learn to program or already had, not as a career choice but as an essential form of literacy. I mean even the calculators were programmable!
To me, Macs were just "the boring computers" we had at school and what my grandparents bought. They seemed locked down and weird like an appliance. I have no idea what my life would be like now if I had grown up in a different time and with a Mac.
This isn't to hate on Macs, but to tell the story of the dominance of Microsoft at the time and how much culture shifted towards more "dumb" consumerism. By the time the first iPod came out I realized the adults had no interest in any of this more progressive future. Then the iPhone and Windows Vista confirmed it.
I installed Ubuntu on the ThinkPad I had in high school and never really looked back. To this day, I am still baffled by the obsessions people have with AI "replacing jobs" and Apple devices as status symbols. I think those people miss the point entirely and worry about their incomplete worldview being passed down to younger generations. What I see is the masses refusing to participate and technofeudalists taking advantage of them.
Interesting read! I love to see this spirit. I grew up with a different - but similar - experience. Only, as an 80s and 90s kid, computers were nothing but limitations. Even when my dad built a machine with a 133MHz Cyrix chip, only a year later it was outdated by computers with literally double the computing power.
That Cyrix machine was already miles ahead of the 386 that was handed down to me to play text-based games on and learn DOS through hard knocks. I remember leafing through old hard drives that had 10MB of capacity and realizing they had no value despite not being that old.
Later in college, I had the confidence to build my own first desktop with parts cobbled together from sketchy resellers. An Athlon, single-core, 1GHz. Man, that thing could fly.
I loved every word of this article. I went back in time and felt it all again with my Pentium III 500MHz, trying to hack everything, like changing the Windows 98 SE splash screen shown while loading, just to feel I did it!
These things were the ones that led me to this passion and that today, with LLMs and almost only business applications to develop for the sake of living, feel like the magic is finally fading.
God I miss the old times! (I'm 36 yo but I feel like 70 in this specific moment!)
I'm sorry to say that those kids are a lot fewer and farther between than they were even 15 years ago, and much, much fewer than 30+ years ago.
When I was working my most recent corporate job (as a people manager, natch) there were new hires even in 2019 that had never owned a computer that wasn't a phone, and just used whatever laptop or other system was supplied by their school or (now) work. This experience blackpilled me a little, I will say.
Why do you think they're fewer and farther between? It's almost certainly at least partially because of Chromebooks and the save-the-user-from-themself design philosophy.
I would say it's primarily because their phone (android or iphone) does nearly everything they would ever want a "computer" to do, and so they never had a hook into real curiosity about computing. This will probably be exactly the same whether the school supplies a Macbook Neo or a Chromebook.
Hackers tinkered with other things (e.g. the telephone) before pervasive computing.
I imagine the young hackers of today just find other things with which to tinker.
I've seen kids build some amazing things in Minecraft. Is this really all that different from modifying the source to a game you copied-by-hand from BYTE magazine?
I'm a computer nerd who started on DOS on IIRC a 286, all the Windowses (OK not quite all, though my first job did expose me to a couple NT versions and 2K in addition to nearly-all the home versions I upgraded through on my own) and various Linuces (Gentoo as my primary OS, even, for like... five years, LOL)
These days there are two things I use my "real" non-server computers for in my personal life:
1) Media piracy, only because I've been too lazy to set up a headless torrent system on my media server w/ VPN connection (why does the media server exist? Again, purely for piracy, remove that use case and it'd be gone as a waste of electricity and space the very next day). I could fix this in a Saturday and remove this entire use case, improving my process significantly at the same time, just haven't yet.
2) Video games. Really hoping that new Steam console doesn't end up being sky-high expensive, because I'm excited about "consolizing" this use case and ditching my last "real" PC tower (other than one server that I go months and months without directly messing-with; and that "real" PC tower is already running Bazzite to make it Steamdeck-like, it's just janky as fuck because it's Linux on frankensteined hardware so of course it's janky as fuck, so I'm still eager to swap it for something designed and with support for that actual use case).
Everything actually-important happens on iOS, and usually on a phone.
In my personal life, I've concluded that I just have no idea what more-useful-than-the-time-it-takes stuff I'd even do with a "real" computer, despite spending absolute fuckloads of my time screwing around with them from ages like 7-30. What I learned in that time was "how to computer" to a pretty advanced level, and luckily that per se has paid the bills and then some, but in all that time the actually-directly-useful-to-me stuff I've done with it has amounted to very little.
To me, personally, I've realized PCs are a solution looking for a problem, and that so rarely does my trying to bridge that gap result in an actual net-benefit (usually, not even close) that I've mostly stopped trying. That impulse got me where I am, can't complain, that "wasted" time over literal decades gained me a bunch of skills that are (it turns out) almost entirely useless to me personally but that others are willing to pay for, great, but I find myself a computer nerd who doesn't actually know WTF to do with a "real" computer. I just use my damn phone.
I suppose I'd still find them a lot more useful if iPads and iPhones didn't exist and so I needed to do banking and reading and such on some other computery device, but those do exist, so... yeah, not a lot of motivation to even own a "real" PC anymore, to include a MacBook; I doubt I'll replace my M1 Air when it finally dies.
To sum up, as a lifelong computer dork, I don't even know why I need a "real" computer any more, let alone why anyone else does.
You can't even see this whole comment on an iPhone, let alone the immediate contextual thread that is necessary for framing it.
That sounds flippant, but it's the visceral reaction I have to trying to do everything on iOS (iPads are a little better than iPhone in this regard, but still have the "everything through a keyhole" feeling).
That said, tiny viewports are not really the main problem, since obviously a modern iPhone has far more resolution than almost any monitor available in 1990, or even 2000. It's more that most exploration and creation is not doable on an iPhone, by design.
My original point, though, was that seeing new grad software developers who have never had any non-externally-directed interaction with a computer made me realize why we have to show them how to use the shell in a terminal, and why they seemed to have no particular curiosity about any software thing not directly related to the task that they're doing. For new grads who aren't curious, finding that something they're doing doesn't work due to architectural issues or some nonobvious combination of bugs prompts neither asking for help nor a deep dive into what the problem could be. Asking for help is basically cheating, to them, and they've never before encountered real problems that weren't explained in the text, handled by their group partner, or trivially stack-exchangable (remember, this realization was 2019, and therefore before common LLM usage). At standup after standup, they report making progress on working through the moderately complex ticket they picked up, but in fact, nothing useful is happening. Sometimes people like this can be shown the way, and then become functioning and capable developers and troubleshooters. Sometimes not.
I liked this not because it's a good story. It is, but that's beside the point. I liked this because it's my story. Not literally so, but the shape of it is. He's struck a nerve at the heart of growing up eager and curious and seeing a computer as a pathway to your dreams.
I've never had a post hit me with nostalgia as hard as this one. Thanks to the author for capturing what it felt like to be a stupid little kid with a weird old computer so well.
This is true but also not at all the point of a review. Some tools are better suited for some tasks; reviews help those with the privilege of choice find the best ones for them. Otherwise you'd have a review of a hammer saying "this is a great tool for driving screws if you're not afraid to get cREaTive with it!" Folks who need to make do with what they have already know about their constraints.
I took the article as talking about the difference between reviews that say "this computer is not going to be great at X" and the reviews that say "this machine is only good for office tasks or Y". The gatekeeping tone.
It can do most anything. It may not be amazing, but people get by. And they may be OK with that.
I saw tons of comments in the original post about the Neo from people who talked about how they used extremely old hand-me-down/used laptops to learn to start programming and fall in love with computers.
I was just watching a video from ETA PRIME who tests lots of small computers to see how good they are for gaming.
He was playing RoboCop on it, and it ran pretty well. 45-ish FPS. It was using 11 gigs of RAM at the time. So it was obviously in swap.
It reminds me of the iPhone 5C when I had a 5S: a beautiful, colorful breath of fresh air that I wish I had, but my needs are so much greater. If I weren't an engineer who needed a high-specced MacBook Pro, I'd go with it.
Bingo! IMO, laptops are best used as thin clients and you do the heavy lifting on servers or a box in a closet somewhere.
I've been migrating my workflow to this approach, and I'm an embedded dev! My closet does have hardware strewn about, but once you set it up so you don't have to touch wires, it's super convenient.
My one gripe with MacBook Airs before the M4 was support for only one external monitor. But the M4 fixed this.
This is how I mostly use my Windows PC: remote access from my Android tablet via ssh and rdp. My gripe is entirely different: Microsoft has turned to crap.
RDP: Every time a native notification pops up, I get disconnected (usually the notification is about something I've been doing, such as starting a self-hosted server or running winget via UniGetUI). It randomly disconnects when I've been using it for more than a few minutes, even when there isn't a notification.
All of this so far seems to be limited to Android's rdp client (the Windows app). For Windows built-in RDP client, my issue is that there's no way to make it resize the desktop like vmconnect does when you resize the window (and no way to proxy vmconnect connections easily for home use--I do not want to enable WinRM for the full system and figure out how to secure it, I just want a single PC on the LAN to be able to access a single VM conveniently, preferably able to log in as different users)
But there are issues with ssh (and likely WinRM/PS remoting, though I haven't used it) as well: on Linux you elevate with sudo, but on Windows there's apparently no CLI equivalent; ssh sessions run elevated (though apparently you can change this; I make do with connecting to a running psmux session that's not elevated). So far as I know, there's no way to elevate without the GUI being involved (admittedly I haven't looked since I started using ssh with Windows).
Linux? Connecting to Linux works perfectly. I can use xrdp or ssh or vnc or X11 forwarding over ssh or [other] and they work perfectly. I used to use x2go before Wayland, and despite the pain of actually getting it working, even that worked better; XDMCP required some amount of setup, but it was awesome (too bad there's nothing that efficient with Wayland); xpra looks great, but either it didn't exist or I was unaware of it at the time.
The only issues with Linux remoting are, again, Windows-related (it's seemingly impossible to get vmconnect enhanced session to work properly with Linux at all on Fedora 43; the things I've found online don't seem to work for me).
Apple has turned computers into luxury items. They are not simply computers, there is some status and image projection involved.
As a hacker and tinkerer I could never justify the cost of purchasing one of their machines, but I see people around me trying to sell their old Apple machines and phones at absurd prices (I do live in a peripheral economy and Apple stuff is even more expensive here).
So it makes sense for Apple to segment its market like that. It makes sense for their audience too.
When I shop for a car, I find the same issue: most analyses have very little to do with the car's technical attributes and there's a lot of gibberish about design and lifestyle.
Outstanding article; I'm glad you put these thoughts into words and published them because I've felt similarly this week since I've had time and reason to reminisce on my 2010 MacBook. I had AutoCAD on that poor little computer, working at the pace it could handle.
John Gruber used it for a day and found it was actually totally adequate, or better, for his actual daily work.
> But just using the Neo, without any consideration that it's memory limited, I haven't noticed a single hitch. I'm not quitting apps I otherwise wouldn't quit, or closing Safari tabs I wouldn't otherwise close. I'm just working — with an even dozen apps open as I type this sentence — and everything feels snappy.
I think people are assuming it's going to be a worse experience than it actually is. I don't know how it does it with 8GB of RAM either, but apparently it does; I suppose my guess would be that the SSD and bus are fast enough that swapping on app change is no longer so disruptive? (I don't know whether improvements in virtual memory swap logic could also matter; this is not my area.)
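One way to actually see the swapping being speculated about: macOS reports swap usage via `sysctl vm.swapusage`. A small sketch that parses that output; the sample string below is illustrative, not taken from a real Neo - run the sysctl yourself for real numbers.

```python
# Parse the output of `sysctl vm.swapusage` on macOS, which looks like:
#   vm.swapusage: total = 2048.00M  used = 1312.50M  free = 735.50M  (encrypted)
import re

def parse_swapusage(line):
    """Extract total/used/free megabytes from sysctl vm.swapusage output."""
    pairs = re.findall(r"(total|used|free) = ([\d.]+)M", line)
    return {key: float(mb) for key, mb in pairs}

sample = "vm.swapusage: total = 2048.00M  used = 1312.50M  free = 735.50M  (encrypted)"
usage = parse_swapusage(sample)
# usage["used"] is the number that climbs as an 8GB machine juggles a dozen apps
```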
My first computer was a hand-me-down Compaq LTE laptop, several times removed from the original owner, with a 700MB hard drive and Windows 95 a decade after those were leading-edge specs. It had only Word and Access, of all things, and little room for more.
But it was mine, I tinkered with it forever, learned databases enough to turn Access into a basic quasi-Excel for my needs, cataloged things that really didn't need to be tabulated, and generally learned as much as that little machine would let me.
That was a limited computer, one that couldn't possibly have let me do what I needed to do when I hit university. But it got me started, taught me to tinker, and I'm pretty sure pushed me to learn more than a state-of-the-art computer of the time would have.
And so I do wonder, at times, if it's the nostalgic look back at early computing that makes people inclined to say "my god that would have been an amazing computer to start out with" when you look at an entry-level computer. I'm inclined, even, to say man that's going to be an epic $100 computer on the second-hand market in a half decade or less.
When at the same time, it's actually a solid machine for more of us than we geeks, with our inflated expectations of computers, would like to accept. That, too, is pretty cool.
In high school we had a G3 Mac that we got Final Cut Pro on. It took forever to render a minute or two of video. But having a nonlinear editing system that took forever to do anything was way better than having nothing at all.
Edit: for true self embarrassment purposes you can see some of the films we made here:
I had no personal computer for years except what only served as my Plex Server until I took it down.
I bought a 16GB M2 MacBook Air after I was Amazoned to work on a side contract when I was between jobs. I used it for four weeks and the only things I ran on it were VSCode, Safari and Zoom. I would have been fine with the MacBook Neo. Right now, with a job, it's about the same - we use GSuite in a browser.
When I was 6, I got a Commodore PLUS/4, which was Commodore's unsuccessful attempt at a business-oriented 8-bit computer. I think my folks wanted to give me a Commodore 64, but things happen. Since there were not nearly as many games on that thing, I learned to program early. It had a built-in assembler/disassembler (shift-reset would reset the computer without wiping the memory - just the first byte of the program memory), so I learned how to reverse engineer assembly code before I was 10. This arguably "wrong tool" shaped my whole life, and was maybe the most important thing that happened to me. If I had gotten a C64, I could have easily turned into someone else.
You learn the most from failure and the least from success. Likewise, you grow the most from pushing through the limits and the least from living in abundance. You can do pretty much anything on a full spec Mac Studio, but so can anyone else with a lot of money. But if you push through the limit of a MacBook Neo, you just did something no one else was able to do. And that is awesome.
Yes, exactly this. I'm using a 10 year old elitebook folio g1. It's about 2 pounds and does what I need it to do. Available with a 4k screen if that's your thing. Does not feel sluggish. Given that I'm not gaming, video editing, or doing local LLMs (and I think there is a big chunk of the population in that camp), I feel like I am missing out on nearly zero.
(And I'm not trying to say anything is special about the laptop I'm using. I adore using trackpoint (so much that I brought my own trackpoint keyboard in to work to use there) so would gladly trade for an old thinkpad if what I had didn't already do what I need it to do).
In the middle of a gaming session one stops thinking about graphics once it's reached a certain level of fidelity, and that level is far below RTX. Not worth the money, especially today.
This reminds me of learning Linux and CLI commands because an old netbook with like 512MB of RAM was struggling with modern GUIs and web browsers. So I used a minimal i3 installation and interacted mostly through a terminal.
"I edited SystemVersion.plist to make the āAbout this Macā window say it was running Mac OS 69, which is the s*x number, which is very funny."
If I'd see that on my kids' computer it would fill me with pride.
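The prank in that quote can be sketched with Python's standard-library plistlib. This works on a scratch copy; on a real modern Mac the file lives at /System/Library/CoreServices/SystemVersion.plist and System Integrity Protection makes it read-only, which is part of why the trick belongs to older macOS. The keys and values below are illustrative.

```python
# Recreate the SystemVersion.plist edit on a throwaway copy.
import os
import plistlib
import tempfile

# Scratch stand-in for /System/Library/CoreServices/SystemVersion.plist
path = os.path.join(tempfile.mkdtemp(), "SystemVersion.plist")
with open(path, "wb") as f:
    plistlib.dump({"ProductName": "Mac OS X", "ProductVersion": "10.6.8"}, f)

# The actual prank: bump the user-visible version to the funny number.
with open(path, "rb") as f:
    info = plistlib.load(f)
info["ProductVersion"] = "69"
with open(path, "wb") as f:
    plistlib.dump(info, f)
```

After this, anything reading the plist (as "About this Mac" once did) reports version 69.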
I do miss the days when a (Linux) computer was like this for me. I say Linux because I had a similar obsession with FOSS and what it meant in a broad sense. But it doesn't matter; before that I decompiled some program to make the text on the Windows START button different. Reinstalled Win 2000 about every week, often after f-ing it up. Before that, I changed some lines in DOS's autoexec.bat so it would ask for a password (which was just two input parameters readable in the autoexec.bat, but that is some mighty fine security (through obscurity) in a normie family).
This article struck a nerve. There's something about the curiosity of tinkering around in a computer. It's the most powerful technology humankind has built. It's versatile. It lets you break it. It's a bicycle for the mind, as Steve Jobs would say.
May all the hackers out there, old and young, discover the beauty of the personal computer.
There are absolutely $599 laptops available from companies that aren't Apple but they've all made major compromises to get to that price in ways that people will notice.
The cooling will be terrible so that every 30 seconds the fans kick in at full speed, thermal throttling takes hold, and then it decides all is fine 30 seconds after that so you're working on a machine that's constantly cycling fan noise. The trackpad will suck, meaning you need to have a mouse precariously balanced somewhere anytime you're using it. It'll have some irritating BIOS feature you can't work out how to turn off that flashes a giant icon on the screen whenever you hit the caps-lock button. The casing will be plastic and snap where the screen hinges are screwed in after a year of light use. It may even be a Chromebook, making it a glorified tablet with an attached keyboard and (terrible) trackpad.
The thing Apple has done here isn't to release a $599 laptop, it's to release a $599 laptop where the compromises are ones the average user buying a $599 laptop isn't going to notice every day, while not compromising on the things they do notice. It's made of metal, the trackpad feels good to use, the keyboard is pleasant, and the battery is large enough that you're not carrying around a charger that weighs as much as the laptop itself.
> compromises the average user buying a $599 laptop isn't going to notice everyday
Oh my god yes. I've read far too many discussions that completely overlook this aspect.
So many people get so fixated on meaningless labels such as "smartphone CPU" (meaning: it's bad) and things to pick on like "ew, no HDMI or Ethernet", as if each were a life-saving thought-terminator that preserves the worldview in which absolutely nothing under the Apple brand can be in any way good.
As for buying new, I have no idea, as I think I have never spent that much on a computer; I always buy second hand. €400 has always been my max. I honestly don't know why I should spend more: the brand-new laptop the company I work for gave me doesn't run significantly faster than my 7-year-old laptop, and this has pretty much always been the case over the last 25 years.
I looked at laptops around that price range in Australia, and they're all awful in one way or another. Either they're Chromebooks (yuck) or come with an ad-infested OS (gross) and/or with serious compromises in quality and finish.
While doing my Bachelor's degree, my dad gave me an old ThinkPad running Linux. It was a horrible experience for preparing PowerPoints, papers, etc. But I still have that command-line muscle memory and an eye for spotting errors, which really helped me in my career. In my final year I bought myself a MacBook because I had earned real money doing a consulting internship. But the Unix muscle memory stayed, and I found working with IDEs so wasteful. In my first years on the job I still rejected Word and Excel and did everything in groff and awk.
I 100% agree, having put up with an i5-4210U and a GeForce 840M for twelve years. Making stupid shit, ditching Windows for Linux to gain any usability. Editing videos at too-high resolutions. I taught myself to be patient with electronics, and that's something I couldn't have learned any other way.
Same. I got a crazy old Ubuntu desktop when I was 9 or something. It couldn't run any games, and that's why I learned Python. I wouldn't be who I am today if I had a machine capable of running Minecraft at the time…
> Nobody starts in the right place. You don't begin with the correct tool and work sensibly within its constraints until you organically graduate to a more capable one. That is not how obsession works. Obsession works by taking whatever is available and pressing on it until it either breaks or reveals something. The machine's limits become a map of the territory.
This is why you should grow your own trees and wait a million years and melt the silicon into nvidia gpus and install linux.
learn linux.
only sanfran idealists dreaming of a world that destroys them use apple products.
A key point that TFA misses (probably for the sake of storytelling) is that, unlike the 2006 iMac the author fondly remembers, the MacBook Neo is not a hand-me-down computer.
It is not the proverbial gift horse. You are paying fresh dollars for it. So it is only reasonable to have some baseline expectations of getting value out of it.
Also, an important point of the MacBook Neo criticism is that because of its cut-down features, a Neo may never graduate to a "hand-me-down computer", but instead head straight to the e-waste pile.
ARM Macs are too new for us to know how the reuse/hand-me-down/legacy-support world will shake out for them. There'll be signs when the first M1 machines get axed, but for now, I have no clue.
Apple wants to give me $250 trade-in for my M1 Air with 16GB, but it seems to be worth $500+ on the open market, so yeah, still above hand-me-down territory. It feels as good as the day I bought it, and literally the only reason I'm considering replacing it is that 2 of our 3 laptops now have MagSafe and I'd like to start distributing those chargers around the house. So tempting to just swap for a used M2 for a couple hundred dollars, but the chore of moving to a new computer is holding me back.
This post reminds me of a 14-year-old boy reading the MS-DOS manual to figure out what AUTOEXEC.BAT and CONFIG.SYS did, at his younger brother's football games.
Dunno about Xcode, but if you put a Go compiler on there, I doubt it will compile or run slowly. Some dependencies may require C, but you can mostly avoid that.
> He is going to open GarageBand and make something that is not a song.
As a kid I just loved playing around in Reason (1 or 2?) and making strange sounds and just flipping the racks around and trying to connect the cables in any jack possible, to see what happens.
I would open one of the demo songs, to 'learn' how these racks were wired, and then experimented.
My first computer was a Palm m100 with 2 MB of RAM. No persistent storage. (I once lost all my data when changing batteries a bit too slowly.)
I'm optimistic that kids will continue ignoring adults telling them what is and isn't "reasonable" with this hardware and that software, and just have fun!
How? I grew up with Windows and learned decent skills on that, probably as much as I would have on a Mac. The current mobile-era stuff has stripped away a lot of control and grit in the name of making things 'more accessible'.
Windows would do just fine. But the state of cheap Windows laptops is abysmal, and Windows as a product is in the doghouse lately because... well, I honestly don't know why Microsoft is doing what they're doing, but from the outside they certainly do appear to want to ruin Windows.
Chromebooks themselves can actually be great machines for hacking (in the traditional sense, not the modern security/jailbreaking sense). E.g. https://support.google.com/chromebook/answer/9145439?hl=en is arguably better than a direct typical Linux install because it's an isolated environment which won't break the main function of the device as you tinker.
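For anyone curious, once the Linux environment (Crostini) is enabled from Settings, you land in an ordinary Debian shell. A typical first session might look something like the sketch below (the package choices are just illustrative examples, not anything Google prescribes):

```shell
# Inside the Crostini terminal: a plain Debian container.
# Update the package index and pull in a basic toolchain.
sudo apt update
sudo apt install -y build-essential git python3

# Sanity-check that the compiler is available.
gcc --version

# The nice part for tinkering: if you break this container,
# you can delete and recreate it from Settings without
# affecting ChromeOS itself.
```

This isolation is exactly the "won't break the main function of the device" property mentioned above: the container is disposable, while the ChromeOS side stays untouched.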
As the page notes though, the real problem for kids is the devices are of course locked down:
> Important: If you use your Chromebook at work or school, you might not be able to use Linux. For more information, contact your administrator.
The more common "modern" definition popularized in the ~90s has dropped the non-malicious meaning regardless which side of the "did hacker originally include both usages or not" debate one sits on. That doesn't mean the original definition ever went away though of course!
They don't have an open source kernel. You can't recompile the kernel or build your own device drivers. I'm not sure what you mean by "learning about computers", but I personally find being able to peek into the kernel source code to be more educational than anything in the mac ecosystem.
The hardware here is incredible, but it's crippled by not adequately supporting Linux, BSD, or any other properly open-source kernel you can compile and install yourself. A good learning environment doesn't put up immovable barriers like "you need a kernel signed by Apple"; it lets you push barriers away when you're ready, like "Are you sure you want to turn off Secure Boot, or install your own Secure Boot keys?"
The parent commenter said "learning about computers". Most "professional developers" don't learn about computers, they learn enough react to get a paycheck, but don't have an insatiable curiosity about how the whole computer works (i.e. the "hacker spirit").
Professional developers are not what this thread is about. It's about curious kids, about hackers, and that group does peek at kernel source code (as well as everything else).
I'm fairly confident that the Venn diagram of (a) nine-year-olds who are playing with a computer and (b) people who claim that access to kernel source code is a prerequisite to "learning about computers" is two circles that are barely touching.
It's something you never need to look at, until suddenly you do, and then it's invaluable. Any time you format some data for another system and get a cryptic error code back, the source code is the quickest way to an answer.
> You can't recompile the kernel or build your own device drivers.
I just don't think this is what, like, nine-year-olds are looking for in a computer.
In any case, at least it's good that they're starting with macOS over Windows! Puts them on a good path to understanding that POSIX is the One True Paradigm, and therefore makes them much more likely to compile their own kernel in the future.
My curiosity for all things computer-related was boundless, and I did eventually tinker with Linux, but only because I'd been exposed to a *nix-style command line from the comfort of an OS X desktop first.
By then I had started messing around with code but had only built toys and extremely basic tools, and would've been lucky to write a moderately functional desktop program using high-level libraries (which didn't happen for several more years).
Writing drivers or poking around in kernel code was so far beyond the scope of capabilities at that point that you would've had better luck teaching your dog how to knit. I don't think I could've had any chance at doing these things until at least my mid-20s.
> Writing drivers or poking around in kernel code was so far beyond the scope of capabilities at that point that you would've had better luck teaching your dog how to knit.
I get the feeling a whole bunch of teenagers have written drivers to cheat in Fortnite/whatever other game - with that being said, probably not at 9 years old.
Yes, though SSDs that can sustain 1.5 GB/s and an OS that transparently compresses memory before swapping yield a much better experience than Win95 swapping.
I actually gave a de-gendered draft of the essay to some friends a few days ago and heard that it landed with a thud. The essay is largely written about myself, in the third person, retrospectively, so removing the pronoun made the autobiographical thread harder to follow. I switched it to "he" to make that clearer.
I think if I used "she" it would've made the "That kid was me" transition harder ā it would either involve some gymnastics to make it make sense or it would introduce a reading I didn't intend.
Yes, I thought that was probably the case, but it stuck out to me. Sorry, I didn't mean to come across as chiding you; I think the basic point about who is and isn't "allowed" to think of themselves as a computer-person is a really interesting one, and also that you can't address this topic properly without addressing gender (and race).
But seriously, if you go to school today they will make a point to put you on the slowest, weakest Chromebook available because they're terrified that you're going to play Krunker.
The result is you become an adult and you'll never buy a Chromebook. It's the same way that being bullied on the schoolbus means you become an adult who'll never ride public transit.
will sell you a desktop computer for around $150 (e.g. four of them for the price of a Neo) which will put an enterprising young person on a much better path to learn about computers than the Neo ever will.
Now you might say I'm being Orwellian, but I really think Swift/Xcode/iOS is "slavery" and the web platform is "freedom". I mean, dev tools are fully competitive for the web platform; I can write my application once and run it on desktop and phone and tablet and VR headset and game consoles and all sorts of things I don't even know about. I never have to ask permission if I want to deploy an app or update it. I don't have to pay anyone 30% of my revenue.
For this kind of experience I would recommend just buying a ThinkPad T480, which you can get for $200, and installing a Linux distro like Linux Mint, then something more challenging like Arch Linux.
But it is AI! Or, at least, it's been run through it. (Staccato sentences; "Not X. Not Y. Z...") It's a shame for a personal reflection. It's hard to imagine what the (I'm guessing) Claude-isms add to what would otherwise have been a nice, unmolested personal essay.
What's also a shame is claiming that a single language feature supports the foregone conclusion that the writing has been 'molested'. It's hard to imagine how constructive this comment could have been with the minimum of effort it takes to learn that the author has written this way consistently since at least 2021, before the first public release of ChatGPT.
It's also hard to imagine how difficult you must enjoy being, when you could have offered a kind clarification but instead dove into some obituary-style takedown.
Author here; it's all me. I ran it through Claude before publishing to spot-check grammar/typos and it caught a few syntax things, but this is just my writing style.
Here's a satire piece I wrote in the summer of 2021. Tonally very different, but you can pick up on my voice between it and my essay yesterday: https://samhenri.gold/blog/20210803-localhost/
>What Apple put inside the Neo is the complete behavioral contract of the Mac.
I remember seeing and using my first PowerBook 160 and being blown away that it had the "complete behavioral contract of the Mac" that I knew at the time. Even the 16 shades of gray on the screen made it more luxurious than a classic black-and-white Mac.
And the "What's on Your Powerbook" ads, with Todd Rundgren and Rev. Don Doll, SJ.
>Todd also co-developed graphic tablet software with a music theme for Apple in a technology venture in the late eighties. With Dave Levine, he designed and developed a screensaver product called Flowfazer (see example of one of the screensavers below), with the strapline "Music for the Eye." They introduced it at MacWorld thinking they would publish it themselves, but found there was already well-funded competition with Berkeley Systems Flying Toasters and were forced to abandon the project.
Depending on your process, there is nothing wrong with starting with this tool (the Neo) first. It's a classic dilemma. For your first tool, buy the cheapest one possible that gets the job done. Once the tool becomes understood, its limits reached, its place in the process discovered, then buy the most expensive one you can afford.
A little dramatic in tone, but I loved it all the same. I really do remember what it felt like to work on a "machine" as a kid. The family Dell (lol) hit all sorts of walls, but I learned a lot.
I don't get the folks referring to this as a "Chromebook killer". Chromebooks start at around US$150 new. The MacBook Neo is 4 times the price at US$599. There are premium Chromebooks like the Chromebook Plus line that are more in the Neo price range, but those aren't the ones being bought for schools and such. Doesn't make the Neo a bad thing, of course, I think it's a solid basic laptop from the reviews.
I think for kids in particular, it's important to remember that the educational discount brings it down to US$500.
That's not exactly nothing but that's a pretty reasonable amount for a non-crap laptop.
I like the sentiment expressed here, but on this note, I think there are other dangers to consider when listening to early reviewers:
- Reviewers do get early access, and often are receiving units AND doing their tests, writing their scripts, recording, and editing their videos before regular users can even possibly get a system shipped. At best this rushes them into missing details (e.g., few reviewers noticed that the MacBook Pro 14" M5 keyboard is different hardware than what you got on the M4 Pro, because so much content is rushed).
- Reviewers are almost never experts on what street prices look like, because they are focused on reviewing and getting content out ASAP. They are not spending time monitoring pricing, and only a few exceptional channels do so.
- Companies with the best marketing machines, like Apple, absolutely groom the review ecosystem without even needing to tell reviewers what to do directly. It's a competitive landscape of self-made YouTubers who are susceptible to positive reinforcement from the industry; i.e., companies don't have to tell reviewers to censor themselves, they can instead use positive reinforcement to select which reviewers get the best access and privileges.
Now, about the computer itself: related to the way the author of this article talks about the MacBook Neo, about the role of a cheap computer as a way to just have a working machine that can get some stuff done: this is the kind of thinking that should likely steer you AWAY from the MacBook Neo that initially looked so exciting.
If you're considering a ~$500-750 computer, well, not only should you be checking the used market, but also, actually look at the competition to this thing.
The reactions I've seen from regular people seem to be, basically, "wow, Apple pulled off an incredible feat, they've disrupted the computer market again!"
Well, let's pump the brakes. First off, realize the Neo is making a lot of the same trade-offs that budget laptops have been doing for years. They aren't even giving you a backlit keyboard! The lower model cuts out biometric auth! There's no haptic trackpad, which used to be a major differentiator for Apple! It comes with a tiny slow charger! The battery life is actually not that good under load/bright screen because the battery is tiny! The CPU is old/slower/low power biased! These are all the classic cheap laptop tradeoffs that give PC manufacturers a LOT of room to actually compete really well against the Neo.
On top of that, almost every cheapo Windows laptop on the market is going to deliver to you a computer with at least a replaceable SSD. Usually RAM is soldered but it's not impossible to find that as something you can upgrade as well even on consumer-ish stuff that isn't just an old ThinkPad.
Actually spend the time to jump on some retailer websites like Best Buy and take a look at what the street prices look like.
There are multiple computers on there that make way more sense for someone budget constrained than a MacBook Neo.
My two favorites, one at a lower price and one at a higher price:
Lenovo Yoga 7 2-in-1 2K OLED touchscreen laptop, AMD Ryzen AI 5 340 (2025): 16GB memory, 512GB SSD, $679. This is a proper mid-range laptop, not just some cheap bottom-of-the-barrel model in the lineup. Gaining an OLED touchscreen, double the RAM, and the same storage as the highest Neo model, at the same price, is just great all around. I'm pretty sure these get very respectable battery life as well.
Lenovo IdeaPad Slim 3x 15.3" touchscreen, Snapdragon X, 16GB memory, 256GB storage, $549. With this model, you get a lot of the same ARM benefits Apple is giving you. Sure, Windows on ARM is not the polished native experience a Mac is, but we are just talking about a cheap laptop that works, and, generally, everything you want to do in Windows will work on an ARM system. Once again, you're getting double the RAM, which is important, and you gain a touchscreen, a numpad, and possibly even better battery life than the Neo's.
Another option is the HP OmniBook X Flip 2-in-1, a little less of a good value than the above, but it's another 16GB/512GB option that slides under $700.
You make some great points here. Here's one of the places I'm coming from that seems to be aligned with the author of this.
I find macOS a superior OS for doing computer work compared to all the alternatives. It still sucks for a lot of reasons, but to my taste it generally sucks less. I'm a web dev, so I host a lot of crap on Linux, and I'm pretty confident using it as a desktop. But for the general day-to-day experience I find macOS superior.
There are plenty of people in similar boats, and this is the most affordable machine (new, not used) that lets someone use macOS.
For a lot of people with budget limits I'd point them to used MacBook Air models rather than the Neo, but having this as a new model is a really nice option for some people.
Also, you can call the Neo's CPU slow, but its benchmarks run circles around anything you find at its price. Those machines have more RAM and storage, but the Neo will likely provide a more responsive experience than anything in its price range.
I do agree on refurb/used rather than the Neo. The best low-ish cost computer Apple is selling right now is probably their refurbished $750 MacBook Air M4 with 16GB RAM/256GB storage.
The only place I'll push back on this is that the Ryzen AI 5 340 is faster at multicore than the A18 Pro. Slightly slower single-core, and a much slower iGPU.
However, that means that to compete with the MacBook Neo more completely, including the integrated GPU, all you have to do is go up one CPU SKU to the Ryzen AI 7 350: you further increase your multi-core performance lead while completely closing the iGPU gap by doubling your GPU performance.
That same Yoga laptop is offered in a configuration with extra storage (16GB RAM/1TB SSD/Ryzen AI 7 350) for $800.
That... really is only $100 more than the 512GB configuration of the MacBook Neo, if we aren't tossing in the education-store pricing.
Perhaps it's more of a MacBook Air competitor at that price range. Stretching up to $800 is a lot...but you do also get a lot for that stretch.
All of the computers you listed have an inferior CPU, inferior battery life, inferior performance, inferior build quality, and inferior software for most people's use cases. I know we all love Linux here, but a lot of creative (or school, or work) apps that people use don't support Linux, so people must choose between macOS and Windows.
All of the "cons" you list for the Neo apply doubly if not more for the alternatives you provided. Not to mention the cheap plastic build quality, poor OEM support, horrible screens, etc.
The Ryzen AI 5 340 has a higher multicore benchmark score than the A18 Pro. If you go up to $800 you get the Ryzen AI 7 350, which matches or beats the A18 Pro in graphics and is pretty much on par in single-core performance, and that SKU comes configured with 16GB/1TB. If you spend $100 less on the high-end Neo with 512GB, you get half the storage, lose a lot of I/O, get a worse screen, and have no replaceable SSD.
USB 3 5Gbps and USB 2 as your only ports are pathetic. Competing systems have more throughput and other conveniences like microSD readers, HDMI, and USB-A.
Inferior battery life? Care to send me test data to back that up? Because the Neo is not a star at battery life for medium-intensity tasks. It has the smallest battery of any Mac. Early reviews note that screen brightness and higher-intensity workloads quickly deplete the battery. It comes with a very slow charging brick. I guarantee the physical battery in that Yoga is much larger than what you get in the Neo.
Inferior software: highly subjective. There are over 900 million PC gamers on this planet who can't play PC games on their MacBook Neo. Windows objectively runs more applications than the Mac. Plenty of people I know prefer Windows over Mac.
Cheap build quality: again, the Neo has no haptic trackpad, so it's not that different from a typical Windows PC.
Poor OEM support: Lenovo sells parts, Apple doesn't.
Horrible screens: the Neo has the worst screen of any Mac, while the Yoga has a touchscreen OLED. Have you seen the Neo screen in person?
In my opinion, this article is a straw-man argument, and the author appears to completely misinterpret "This is not the computer for you."
Such a statement needs to be understood in the relevant context. It's not intended to discourage kids from buying a Mac! Rather, it's intended to rebut critics who are already Mac owners and who scoff at the MacBook Neo's technical specs, such as the RAM. The computer is indeed not for them: people who can already afford a MacBook Pro, for example. The point of "This is not the computer for you" is the opposite of how the author characterized it: the point is that the MacBook Neo and its specs are actually fine for the people who are going to buy one.
For some strange reason, the author has invented an imaginary opponent to be offended by. We're supposed to cheer for the kids here, and I see that many people have fallen for it, but the whole schtick falls completely flat for me. The kids were never endangered or discouraged by reviews of the MacBook Neo.
I don't know why you're downvoted. No matter how many feel-good anecdotes the author tacks onto their article, to me the premise appears to be a straw man. It would have been entirely possible to make pretty much all the same points about just getting a used ThinkPad, or anything really.
My first laptop as a kid was a passed-down business Toshiba that was to be scrapped. I bought a soldering iron to fabricate a serial dongle in order to reset the BIOS password that was locking it down, and then installed Xubuntu on it. Guess young me shoulda gotten a MacBook instead, to inspire the true spirit of freedom and exploration?
It's an old and persuasive myth of the Apple community that of course it's not about the tool, but what you do with it creatively. Still, they never fail to mention how the tool being an Apple is important in one way or another. I just don't get it.
Thank you thank you thank you for writing this. This made me smile and feel like shedding a tear. This is exactly how I felt when I watched the Dave2D video of the MacBook Neo, and other "reviews" that miss the entire point. This captures the point. The other reviews capture specs. This captures the emotion. This captures the reality.
" I faked being sick to watch WWDC 2011 ā Steve Jobsā last keynote ā and clapped alone in my room when the audience clapped, and rebuilt his slides in Keynote afterward because I wanted to understand how heād made them feel that way."
You have many years of experience creating accounts to post like this, as well as to break HN's rules in countless other ways, so I assume the question is rhetorical.
Because: new user, ostensibly bad faith, throwing out short, clichéd statements with no clear intent to start a meaningful discussion.
Seemingly effortless comments yelling into the void aren't worth starting a conversation with. Not the kind of comments that belong, or are wanted, on this platform.
I'd say this comment fails the "Be kind. Don't be snarky" test. Wouldn't you? If we're appealing to the rules to justify our actions it puts a bigger burden on ourselves to make sure we're perfectly in line with them too :)
Man, I got a computer engineering degree in 2015 with a $200 Chromebook chrooted into Debian. And I worked professionally for years on an 8GB MacBook Air. The Neo is definitely something younger me would have been interested in.
The problem is, like all Apple stuff, it's just needlessly limiting and has few advantages over the alternatives.
Did the same for my freshman year of uni on a $99 Chromebook. Java and C dev on 4GB of RAM wasn't an issue.
That said, I quickly upgraded to a four-year-old used ThinkPad, and that was a huge difference.
C dev wasn't an issue back in the 1 GB or 256 MB or 16 MB days either. You just didn't have a Chrome tab open that by itself was eating 345 MB just to show a simple tutorial page.
> $200 Chromebook chrooted into Debian
Are there even any x86 Chromebooks left at that price point? They're the only ones still capable of chrooting into Linux. ARM Chromebooks remain locked up.
Looking at Best Buy, in the Chromebook category, the first one that comes up is $150, with an Intel N4500.
I don't know how current this is, or how easy it is to set up to run another OS, but it meets your price and architecture criteria.
https://www.bestbuy.com/product/hp-14-chromebook-intel-celer...
ARM Chromebooks run the Debian containers just fine. It's just a settings toggle to enable, and you don't even need dev mode.
There were a bunch of Intel Atom ones, IIRC. I got my degree with a used Eee PC with one of those.
As did I. The most unbelievable part is that we used that tiny keyboard.
An Eee PC? Oh, I used to dream of having an Eee PC... I got my degree using an old C64; had to manually encrypt packets with pen and paper to use HTTPS.
https://www.youtube.com/watch?v=VKHFZBUTA4k
I mean, never mind younger us. I have an M5 MBP and even I am tempted by a Neo for travelling.
There is truly no space in my device repertoire for a Neo and I can say that with confidence because of how much time I've spent trying to find one.
"Finding Neo"
When my mates at school had the Aero glass effect on the new Windows, my ancient hand-me-down laptop wouldn't even try to run it. It could, however, run Compiz somewhat, if persuaded very hard!
That's basically the reason I learned Linux initially, and those hours debugging video driver issues would serve me well later on.
When Chromebooks originally came out, that was not an option. And almost all school-issued computers will not let you do this.
I've owned and used the CR-48 prototype Chromebook, which very much did have a developer mode and a third kernel slot built right in. Ran Ubuntu on it with no issues. This was possible since before the device family was officially available for purchase.
The school thing is different, but also hardly unique. A school-issued MacBook is often similarly locked down and unusable as a dev machine, since the student lacks permission to install anything the school deems dangerous.
It was possible on the Acer model I got when it first came out, but it was still useless. A switch that wiped the whole thing back to defaults was needed to open a terminal, and from there a shell script could install Ubuntu. It still ran the unmodifiable ChromeOS kernel, with no updates and without some of the modules I'd have liked. And then the screen died.
It was junk. The Eee PC was cheaper, lasted much longer, and had Debian out of the box.
That includes school-issued Macs, so I don't see how that's an argument against Chromebooks.
Author of the post here. You nailed it; I used the Chromebook as the example in my post since the one I used in high school was locked down to basically a kiosk. Couldn't even open dev tools, much less root it. Such a wild departure from the eMacs I used in my elementary school's library, where I could set bonkers `defaults write` commands and customize every aspect of my account.
If I'd gotten a Chromebook as a personal machine as a kid, I probably would've rooted it and seen what I could do, but growing up, the beauty of the Mac (in that Snow Leopard era) was progressive disclosure. I could start on the happy path and have a perfectly stable machine, then customize behaviors through the terminal, see what they did, mess with the system files, see what broke, revert it, then go back to using iMovie like normal.
In my (admittedly limited) time using a rooted Chromebook, it's much more like a switch flip. You go from mandatory water wings directly into getting pushed into the ocean and Google shouting "Good luck!!"
Yeah, very little of this is still true in the era of the Neo or modern Chromebooks.
If the school is managing these Macs, including laptops sent home with a student, then unless it's for a specific purpose they aren't allowing you to modify files, you probably aren't allowed to open a terminal or system settings, and you definitely aren't disabling SIP. You might not even be able to access the open internet if they've hard-configured it into a VPN. No different from a managed Chromebook.
Likewise, even older and lower-end unmanaged Chromebooks can enable a full Linux environment that runs a terminal in a browser tab. Doing so doesn't require root or developer mode, and it doesn't change or sacrifice any of the rest of the ChromeOS environment (for which your core assertion, that an unmanaged Mac is a computer and an unmanaged Chromebook is a thin-client appliance, still fundamentally holds). You can install Blender and have it running in a window by about 1 minute into watching a YouTube video titled "How To Download Or Update ANY VERSION Of Blender On Chromebook".
Gaining root on a Chromebook is mostly just a prerequisite to modifying things specific to ChromeOS, but the easier-to-access, more featureful, and safer Linux development environment is still an entire operating system that you can tinker with, screw up, overload, blow up, and reset to zero, all without losing the happy path of opening up Canva (or, more likely, CapCut on their phone/iPad) and editing videos or whatever.
You don't have to root them to do cool shit anymore. They have a full Linux (Debian based) environment you can enable with a single toggle in the settings. Any GUI apps you install via apt get their icons dropped in the system tray and their windows are rendered via Wayland.
macOS or Windows can be similarly locked down. In schools, the school locks it down. In many companies, management tools like JAMF, Intune, or NinjaOne lock down laptops, desktops, tablets, and cell phones a little or completely.
>the one I used in high school was locked down to basically a kiosk.
The Macbook Neo will be no different if the school is actually managing them properly.
A kid looking for the best bang for the carefully saved buck would buy a used machine off eBay for less than $599, and a more capable one at that. An M1 MBP from 2021 with 16GB costs about that much; an M1 Macbook Air or an M1 Mac Mini with 16GB costs half as much. A ton of beefy, perfectly Linux-ready used laptops can be had for under $350.
So this is only for the kids who are obsessed not just with computers, but brand-new computers. Which is a different demographic.
It's also for parents who want to give their kids a "real computer" without breaking the bank.
My 12-year-old wanted a laptop to build mods for games. I got her an M1 Macbook with 8GB - $425 from Walmart, refurbished.
My 17-year-old wanted a laptop for college, but wasn't sure what she needed or wanted yet. I gave her my 2020 M1 MBP.
If either of those situations arose today, I'd get them a Neo.
> The Macbook Neo is an awesome machine for it's cost/value
Uh... if you need to compile for iOS, sure.
But outside of that, no, it's not.
You are literally just paying the Apple tax that they deliberately choose.
Look, MacOS has certainly rotted over the past few years, but the primary reason I use it is because it's still a hundred times nicer to use than Windows (which is also regressing for worse reasons - shoving in AI and ads instead of benign neglect).
It's still the best desktop UNIX experience, especially since cheap PC laptops (and until very recently expensive ones) almost always have horrible build quality. It's also within only the last few years that PC trackpads came anywhere near the trackpads on Apple machines. Sometimes what you call a "tax" is literally some of us wanting quality.
macOS is the best desktop UNIX for one simple reason: the ⌘ key. The fact that 99% of your GUI keybindings use a key that your CLI tooling cannot use eliminates conflicts and means that you don't have to remember things like "Copy is ^C in Chrome but ^⇧C in the terminal".
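The conflict being described can be sketched with a toy example (the shortcut strings here are just the common defaults, nothing taken from the comment itself):

```python
# Hypothetical illustration: on Linux, Ctrl is shared between GUI shortcuts and
# terminal control sequences, so "copy" depends on which window has focus.
linux_copy = {"browser": "Ctrl+C", "terminal": "Ctrl+Shift+C"}  # Ctrl+C sends SIGINT in a terminal

# On macOS, Cmd is reserved for GUI bindings and unused by CLI tooling,
# so Cmd+C means "copy" everywhere while Ctrl+C stays free to send SIGINT.
macos_copy = {"browser": "Cmd+C", "terminal": "Cmd+C"}

print(len(set(linux_copy.values())))  # 2 bindings to remember
print(len(set(macos_copy.values())))  # 1
```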
Using Linux with Toshy to get the best of both worlds wrt keybindings. Linux and KDE are amazing nowadays... I don't miss macOS but would be hating Linux without Mac-style keybindings.
Yeah, I use Kinto (which seems to be what Toshy was originally based on). A recent Ubuntu update broke it, though, and I accidentally deleted my config file while trying to fix it, so maybe now's a good time to try out Toshy. Looks like Toshy creates a Python virtualenv instead of relying on system packages, which should make it a little more resilient to system package changes.
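For anyone curious why the venv approach is more resilient, here's a rough stdlib-only sketch (not Toshy's actual installer; the `toshy-style-venv-` location is hypothetical):

```python
# Rough sketch: the stdlib `venv` module creates an isolated interpreter prefix
# with its own site-packages, so packages installed there keep working even
# when the distro's system Python packages change underneath.
import os
import subprocess
import sys
import tempfile
import venv

target = tempfile.mkdtemp(prefix="toshy-style-venv-")  # hypothetical install location
venv.create(target, with_pip=False)

# The new environment is self-describing (pyvenv.cfg) and its interpreter
# no longer reports the system installation as its prefix.
vpy = os.path.join(target, "bin", "python")
prefix = subprocess.run([vpy, "-c", "import sys; print(sys.prefix)"],
                        capture_output=True, text=True).stdout.strip()
print(os.path.exists(os.path.join(target, "pyvenv.cfg")))  # True
print(prefix != sys.base_prefix)                           # True: isolated from the system prefix
```

Apps installed into that prefix only break if the venv's own base interpreter is removed, not when apt swaps out system-level Python libraries.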
Oh man, you NEED to use Fedora.
Fedora is the best OS humanity has ever made. No exaggeration. There needs to be a best, and it's Fedora.
Linux gets a bad reputation because 20-ish years ago Ubuntu sent out free CDs and became the dominant distro. Ubuntu/Mint are part of the Debian family: outdated Linux. They call outdated Linux 'stable', but it's not stable like a table; it's software-version frozen. Bugs that are fixed today won't get those fixes for 2 years. Not to mention, a new mouse you buy from Amazon, an Nvidia card, or a web video player won't work due to the outdated nature of these distros. (And yes, I know you can do surgery to update it, but no one likes that)
Fedora is not Arch. Fedora is the consumer grade Red Hat.
> Linux gets a bad reputation because 20-ish years ago Ubuntu sent out free CDs and became the dominant OS.
I've been an Ubuntu user for 20 years, and RedHat and Suse prior to that. Ubuntu just worked. Debian had packages for everything, including from 3rd party vendors. It lets me focus on my work, and not worry about the OS, or compiling packages, or finding installers. When I had issues (rare), the large user base meant that someone had already figured out a solution to the problem.
The flavor of Linux doesn't matter so much in my opinion.
Debian Stable isn't that much more version-locked than CentOS or RHEL. Debian also has the Testing tier, which is semi-rolling. Or you could use Unstable. Or, if you're brave, you could use Experimental.
Ubuntu, Mint, PopOS, and others with Debian as an upstream are not Debian. They build their own packages on their own schedules.
Fedora is not "consumer grade Red Hat". It's the fast-moving upstream of RHEL, much like Debian Testing is upstream of Debian Stable.
The main reason Linux got a bad reputation was the tribalism of people going off half-cocked talking about their personal preferences without actually working with the alternatives and starting this sort of holy war diatribe.
i've made my entire career digging deep into linux - i've been what some people would call a "power user" for about 25 years, and a professional for 15. i spent over a decade distrohopping, tweaking, tinkering and customizing every distro from Corel to Mandrake to Mandriva to Debian to Slackware to Ubuntu to Gentoo to Arch to Void, and everything in between, plus the BSDs. i've been a sysadmin, network admin, devops engineer, yadda yadda yadda.
i have never once successfully installed fedora. probably just hardware stuff, but as often as i've wanted to try it and opensuse, they have never booted post-install for me. on machines i've successfully installed Debian and openBSD. go figure. i know i'm an outlier here. maybe it's just bad luck.
but reading your post, it sounds like a club i don't want to be a part of. linux is linux. distros don't matter. you can get nearly anything to work if you spend enough time on it. GUI OS installers that fail are not worth my time.
Fedora Silverblue is better, and Cosmic Desktop looks good as a DE with every release (upcoming 44). For an isolated and rollbackable option, your only choices are Silverblue and, the hard way, Guix. If you use Nonguix with Guix you're on your own, but I'd only use a nonfree kernel in an emergency (the wireless adapter somehow gets broken and the alternative is to boot the OS with proprietary firmware in order to buy a new one). And in that case I would blacklist every proprietary firmware blob except the wireless ones.
And, yes, I have an overlaid Linux-libre kernel in Silverblue.
Glad to see someone else care so much about software freedom. Guix is great (though my ideal system would be Debian with a Shepherd init, an FHS layout, and Guix for non-root package management)
> For some isolated and rollbackable option, your only options are Silverblue and Guix for the hard way.
How about Qubes OS? Also the parent never said anything about isolation and roll-backs. Nobody mentioned Silverblue except you. The discussion is about ordinary users, not hackers.
Silverblue is supposed to be for normies. Rollbacks are for when you screw everything up.
But honestly I did not like Silverblue. I had a 13-year-old gaming computer I installed it on, and I couldn't get the ancient GPU drivers installed due to the way things are containerized. This would have been a few commands otherwise.
Maybe it's fine for Chromebook-like things. I might have picked a bad test case.
Given the list of alternatives you provided, I'm inclined to disagree with you.
I'm developing on a $270 refurbished Dell, which has an i7 and 16 gigs of RAM. The Apple processor might be competitive, but the rest of the machine is not. 600 dollars is fine and not unreasonable, but there is certainly an Apple tax.
> You are literally just paying the Apple tax that they deliberately choose
And when I go to the grocery store, I am paying the Safeway tax that they deliberately choose, and when I go to the gas station I am paying the Exxon tax that they deliberately choose, and so on.
What amount of that $600 cost do you reckon is the Apple tax? I'm curious what comparables you see, and how much they cost.
When I was sixteen I got one of the earlier digital HD cameras (Canon VIXIA HF100) and Sony Vegas Movie Studio for my birthday. It was a neat camera and I liked Vegas, and I was grateful that my parents got them for me, but an issue that I had with it was that my computer wasn't nearly powerful enough to edit the video. Even setting the preview to the lowest quality settings, I was lucky to get 2fps with the 1080i video.
I still made it work. I got pretty good at reading the waveform preview, and was able to use that to figure out where to do cuts. I would apply effects and walk through frame by frame with the arrow keys to see how it looked. It usually took all night (and sometimes a bit of the next day) to render videos into 1080i, but it would render and the resulting videos would be fine.
Eventually I got a job and saved up and bought a decent CPU and GPU and editing got 10x easier, but I still kind of look back on the time of making my shitty computer work with a certain degree of fondness. When you have a decent job with decent money you can buy the equipment you need for most tasks, but there's sort of a purity in doing a task when you really don't have the equipment you need.
I learned to code on my school's BBC Micro. [0]
8-bit. 16KiB of RAM. BASIC as the programming language. 640x256 resolution in 8 colours.
I could make that thing sing in an hour. It was hard to get it to do much, but then the difficulty was the fun thing.
By the time we got to the early 2000s and I could buy something with more RAM, CPU and storage than I could ever reasonably max out for the problems I was interested in at the time, I lost something.
Working within constraints teaches you something, I think. Doing more with less makes you appreciate the "more" you eventually end up with. You develop intuitions and instincts and whole skillsets that others never had to develop. You get an advantage.
I don't think we should be going back to 8-bit days any time soon, but in the context of this post, I want novices to try and build software on an A18 chip, I want learners to be curious enough to build a small word game (Hangman will do at first, but the A18 will let them push way, way past that into the limits of something that starts to feel hard all of a sudden), to develop the intuition of writing code on a system that isn't quite big enough for their ideas. It'll make them thirsty for more, and better at using it when they get it.
[0] https://en.wikipedia.org/wiki/BBC_Micro
> Working within constraints teaches you something, I think.
It absolutely does. But every system has constraints; even when provided with massive resources, humans tend to try things that exceed those resources, as evidenced by Parkinson's Law of data https://en.wikipedia.org/wiki/Parkinson%27s_law
It was worse than you remember. You could have 640x256 in monochrome, or 320x256 with 4 colours, or 160x256 with 16 colours (which IIRC was actually 8 distinct colours plus 8 flashing versions of them).
The game Elite did something extremely evil and clever: it was actually able to switch between modes partway through each frame, so that it could display higher-resolution wireframe graphics in the upper part of the screen and lower-resolution more-colourful stuff for the radar/status display further down.
AlexandertheOk's documentary on Elite and the BBC Micro: https://www.youtube.com/watch?v=lC4YLMLar5I
I hear you, having learned programming on a machine even more constrained than the BBC Micro. But learners today are more likely to say "Siri, build me a Hangman app."
I'm waiting for somebody to come and tell us about the time they punched cards by hand, one hole at a time, and then threw coal in the furnace to have the cards interpreted by a steam-powered computer.
Is this close enough? It's from 1969; I wonder what became of them:
"Tomorrow's World: Nellie the School Computer, 15 February 1969 - BBC"
https://www.youtube.com/watch?v=f1DtY42xEOI
Do you have a substantive argument against any points made by parent?
I had a similar experience but with design software (which I pirated at the time since I just didn't have the money to buy stuff from Adobe).
I'd install Photoshop and Illustrator on my shitty computer I put together from spare parts my dad didn't have a use for anymore from his business computers. It was horribly slow, but I kinda made it work, slowly.
The thing is, I think this is what made me think a bit differently: since everything was slowed down and took more time than I wanted it to, I had to make deliberate decisions on what to add or edit. I still work the same way today, to a point, but that's because I'm both faster and more experienced, and computers have gotten more performant (and because I can afford better devices, sure).
When I look at my half-brother and his teenage generation, I wonder if they can still have such an experience. Personal devices have gotten better and faster, most things are really convenient, and you sometimes don't even have to think much to do something, also because it's cheap to do... they probably won't have the experience of "grinding it out" just for the sake of producing something they like... maybe sports is the closest... no idea, but I have been thinking about this quite a lot recently...
At some point the limitations can flip around.
when you're young, time is infinite, money is scarce.
Older, and time seems to take over. The limitations are - when can you free up the time? Is relaxing allowed?
Oh no argument on that.
I have a typical yuppie software job with decent pay, so generally I will buy the right tools for a job now instead of trying to make do with whatever I can scrape together. I'm not that busy of a person, but I certainly have more obligations than I did when I was sixteen, and now sometimes it really is worth spending an extra grand on something rather than spending a week hacking together something from my existing stuff.
Still, I look back at the hours I spent making terrible YouTube videos with my terrible computer really fondly. I was proud of myself for making things work, I was proud of the little workarounds I found.
I think it's the same reason I love reading about classic computing (the '80s-'90s era). Computers in the '80s were objectively terrible compared to anything we have now, and people still figured out how to squeeze every little bit of juice possible to make really awesome games and programs. The Commodore 64 and Amiga demos are fun to play around with because people will figure out the coolest tricks to make these computers do things that they have no business doing. I mean, the fact that Bad Apple has been "ported" to pretty much everything is something I cannot stop being fascinated by. [1] [2] [3] [4]
[1] https://youtu.be/2vPe452cegU
[2] https://youtu.be/qRdGhHEoj3o
[3] https://youtu.be/OsDy-4L6-tQ
[4] https://youtu.be/Ko9ZA50X71s
It probably depends where you live. When I was young, time was infinite and money was scarce. Now they're both the limit.
It's a great example of going the extra mile due to external limitations. I bet you developed skills and intuitions you wouldn't have if you started with great hardware from the get go.
I don't think this is about the macbook neo. I don't think the comments need to devolve into a mac vs. linux argument. It's simply an ode to that kid pushing hardware to the limits, and learning so much along the way.
What I feel a bit sad about is, I was that kid. Growing up in a 3rd world country, running games that I didn't own on hardware that ought not to run them, debugging why those games didn't work, rooting my phone and installing custom OSes just for the heck of it. Man, I had so much time to tinker.
Now I have amazing gaming hardware but I barely touch games. When I do, it's on Steam. I've swapped out the endless tinkerability of Android with the vanilla 'it just works'-ness of the iPhone. That curiosity took me far, but I seem to have lost it along the way.
(Author of the post here) The post was inspired by the Neo and provoked by a certain YouTuber's review of it, but yeah, it's about the Neo in the same way that The Old Man and the Sea is about fishing.
I wrote about the Mac in general since that's what I know, but I imagine if I'd grown up in the Windows world and liked Windows more, I would have had a similar experience with my dad's old ThinkPad or something.
You perfectly encapsulated how I felt as a kid pushing my computer to its limit just to learn and try new things. I didn't have a Mac, but the experience was identical.
> I've swapped out the endless tinkerability of Android with the vanilla 'it just works'-ness of the iPhone. That curiosity took me far, but I seem to have lost it along the way.
I feel this, and on the whole I've done the same thing. I'm deep in the Apple ecosystem because it all just works together without me having to tinker with it. I think this is mostly a reaction to now doing that stuff professionally - 4 days a week, whether I feel like it or not, I'm required to make computers do things they couldn't do before I started.
When I get to the end of the work day, or out of bed on a Sunday morning, I might get the urge to tinker with things but I refuse to have tinkering with things to make them work be a requirement for my rest time. Leisure tinkering must be on my terms, because if I'm forced to tinker with something just to do what I really wanted to do that's not tinkering, that's the thing fucking with me, and I will swear profusely at it throughout.
I didn't grow up in a 3rd world country but had the same experience, bar running games I didn't own. Not everyone in the west had parents that wanted to just spend thousands on hardware that seemed to be obsolete the next year, or any means of making that money. And I've never stopped using sub-par hardware; to this day I enjoy squeezing every drop of performance from cheap pre-owned stuff.
Most of us learned a lot that way, trying to squeeze what we had and make something work out of nothing. That's why we understand much more than kids today. In the end, that's the reason I still optimize stuff at my corporate company, and I have a pretty awesome job, so it's a good path.
Mostly the same story. Tinkered for hours with Windows 3.1 floppy disks. Reinstalling OSes all the time because I'd break stuff or I'd just want a fresh slate. I loved pushing the boundaries. In my 30s I slowed down with the tinkering because of life (kids, work). I thought I had lost the ability to tinker. But recently, at 42, I bought a MacBook for the sole purpose of tinkering on the couch at the end of the day (basically, after being on a computer the whole day, I didn't want to be in the office anymore). And slowly, it's coming back. I'm playing with new things, learning about neural networks, learning about software-defined radio, installing tons of random libraries and tools to test them out. It's coming back. Keep pushing on it and hopefully it returns for you too!
> I don't think this is about the macbook neo.
It shouldn't be, except that the author chose to make every single paragraph about Mac, Apple ecosystem and bashing Chromebook.
I'd agree it is about the Neo in the sense that the device and the talk around it obviously triggered this post.
I don't think the author is exactly bashing the Chromebook. I read it as the author praising an open ecosystem where you have flexibility and the choice to "take off the guard rails" and go where the device was not originally made to take you.
There's an argument to be made that it is ironic that Apple is the example of this, but to me that shows why I _still_ like MacOS, when all the other variants (iPadOS and iOS) are entirely locked down.
it's so wild that we're in a place now where the utterly brainwashed can say this with a straight face:
> There's an argument to be made that it is ironic that Apple is the example of this, but to me that shows why I _still_ like MacOS, when all the other variants (iPadOS and iOS) are entirely locked down.
And faking being sick so he could clap at Apple marketing events. He kinda lost me there.
He's looking back to a time when they were still special. When every keynote brought out a new, interesting product, feature, OS enhancement, etc. Back in the Steve Jobs era, it was still worth tuning in every year to see what was new.
The whole article is about how Apple is still special just like when he was a kid.
But anyway, I find it funny that author implies if a kid gets a MacBook Neo they will explore all the possibilities to use and customize it, but somehow the same kid won't try to push a Chromebook to its limit. It 100% matches my stereotype of how Apple fans view machines.
Yeah, you'd find out about it all in the newspapers a few hours later, and none of it was "clap to yourself in an empty room" impressive anyway. I was around back then and I didn't feel the need to act like a drug addict whenever Steve Jobs opened his mouth.
Were you a kid then too? :)
Lighten up. He was a kid and he knows how silly that was.
> The kid who tries to run Blender on a Chromebook doesn't learn that his machine can't handle it. He learns that Google decided he's not allowed to.
Or they learn to enable developer mode, unlock the bootloader, and install Linux, or use the officially supported Crostini, or so on. There's like 3 different ways to run Linux desktop apps on a modern Chromebook.
The Macbooks don't have an officially supported path to unlocking the bootloader (edit: yes, I'm aware of Asahi Linux, which lives on the edge of what Apple allows) and installing your own OS. The Chromebooks do. I don't think that comparison plays as favorably as you think.
The bootloader isn't locked. Asahi's developers have written about how Apple specifically built support for third-party OSes into the bootloader.
The same Asahi developers also wrote about how Apple didn't document anything and, especially, how Apple never talked about this in public. Apple being Apple, if they had cared a single second about this, they would have called it Bootcamp 2.
Honestly I'm pretty convinced that this "open" bootloader was just there to avoid criticism and bad press from specialized outlets when they presented the M1, because, for once, they needed specialized outlets to benchmark the M1's performance and not have anything bad to say about anything else.
They constantly break everything year after year without documenting any change, which effectively makes Asahi unusable on anything recent.
I'm betting that they are just patiently waiting for Asahi to die by being several years too late (which is already the case), to announce "The most secure Mac ever" silently releasing with closed bootloader, when nobody, and especially the press, will care anymore.
Don't get me wrong, I love Asahi and I even have it installed on my M2 Air; the project is doing incredible-quality work. But I don't believe it will last long. Hope I'm wrong, though.
For them to call it Bootcamp 2 (a "product" per se), they'd have had to have another OS they could actually demo installing. Otherwise "Bootcamp 2" is just a mysterious empty chooser window.
But at the time there was nothing, because Apple Silicon wasn't a platform anyone but them was targeting, because they had just created it.
So they built the infrastructure, and then waited for someone to actually start taking advantage of it, before bothering to acknowledge it.
And because that "someone" isn't a bigcorp (i.e. Microsoft) wanting to do a co-marketing push, but just FOSS people gradually building something but never quite "launching" a 1.0 of it, Apple just "acknowledged" it quietly, at developer conferences, exposing it only via developer-centric CLI tooling, rather than with the sort of polished UI experience they would need if Microsoft was trying to convince Joe Excel User to dual-boot Windows on their Apple Silicon MBP.
> announce "The most secure Mac ever" silently releasing with closed bootloader
That's extremely unlikely to happen, as Apple's hardware and OS developers build Macs and macOS (and all the other hardware + OSes) using Macs and macOS. And those engineers (and engineers working at Apple's hardware and accessory manufacturing partners) will always need to be able to diddle around with the kernel and extensions "in anger" without needing to go through a three-day-turnaround code-signing process.
There's a whole proprietary, distributed kernel development and QC flow for macOS, that looks a lot like the Linux one (i.e. with all the same bigcorps involved making sure their stuff works), but all happening behind closed doors. But all the same stuff still needs to happen regardless, to ensure that buggy drivers don't ship. Thus macOS kernel development mode being just one reboot-and-toggle away.
> And because that "someone" isn't a bigcorp (i.e. Microsoft) wanting to do a co-marketing push, but just FOSS people gradually building something but never quite "launching" a 1.0 of it, Apple just "acknowledged" it quietly, at developer conferences, exposing it only via developer-centric CLI tooling, rather than with the sort of polished UI experience they would need if Microsoft was trying to convince Joe Excel User to dual-boot Windows on their Apple Silicon MBP.
It's also important to remember that Microsoft was in the middle of their Qualcomm exclusivity deal at the time of the M1's release, and thus Windows for ARM wasn't available on anything other than a few select devices or unofficial use of Insider builds.
That deal didn't actually expire until 2024[1], at which point Windows for ARM finally started to be sold in an official capacity with stable builds widely available.
It's entirely possible, though unconfirmed, that Apple was intentionally leaving the door open for "Boot Camp 2", and Microsoft simply never took them up on the offer, either because they were stuck in a deal made prior to the M1's release that prevented it, or because they no longer saw a financial benefit to being able to sell Windows to Mac users (possibly since Windows license sales are effectively a rounding error to Microsoft at this point; they make way more off of subscription services and/or Office, all of which are already available on macOS without having to dual-boot Windows).
[1]: https://www.tomshardware.com/pc-components/cpus/windows-on-a...
> possibly since Windows license sales are effectively a rounding error to Microsoft at this point; they make way more off of subscription services and/or Office, all of which are already available on macOS without having to dual-boot Windows
AFAICT, the way Microsoft wants things to work, is that "Windows" is the native fat-client platform / SDK that ISVs are supposed to use/target when building fat-client apps that interact with (i.e. generate spend on) Azure-based backend systems. The #1 way Microsoft makes money at this point isn't from direct consumer or even volume-licensed subscriptions; it's from providing paid backend infra to dev shops who had long since locked themselves into the Microsoft/Windows development ecosystem, and who therefore saw Azure as the only valid cloud backend to integrate with when "cloud-enabling" their software (and/or, where the compliance story of integrating their previously native-and-local-syncing software with Azure, was 100x simpler than with integrating with any other cloud, due to Azure+Windows being able to act as a trusted principal-agent pair that can enforce policy-based security via a shared "cloud domain" identity [Entra ID] baked right into the OS ACL layer.)
Until recently, though, Microsoft thought of the Windows "platform" the same way Apple do of the Mac "platform": that "Windows"-the-platform-SDK was the same thing as Windows-the-OS. Which necessarily meant that consumers must be pushed with all conceivable effort toward using Windows-the-OS on their machines, so that these dev shops who had targeted Windows-the-SDK could reach them with their software (so that those dev shops would in turn spend more on Azure.)
But I think this equivalence is going away!
From what I've seen of discussions in various Microsoft-aligned sources recently, it feels to me like some part of what Windows 12 may mean by calling itself a "modular OS", is that Microsoft may be establishing some kind of very clean boundary layer between Windows-the-OS and Windows-the-platform/SDK.
---
What would that look like? I don't know for sure, but here's some spitballing:
Well, picture Mono, but as a complete UWP projection, shipping with all the native libraries that are built into Windows.
Or, if you'd prefer, picture Wine/Proton; but rather than black-box-reverse-engineered equivalents to Windows DLLs, it is all the DLLs that come with Windows. Except, now rebuilt from the ground up so that they compile against NTOS or Mach or Linux syscalls.
Basically, "the complete Windows platform" as a JVM-like runtime you "get for free" when installing Windows-the-OS, but can install on top of macOS and Linux. (Probably in various runtime profiles, as with Embedded vs Server vs Desktop JREs. You don't need D3D on your server.)
This would be likely to take 100% of the wind out of the sails of the Wine/Proton projects overnight. And maybe kill Mono itself, too. After all, why bother with half-assed third-party implementations of the Windows Platform, when you can just install the "real" Windows Platform, and get guaranteed bug-for-bug compatibility with existing Windows software (relying on the same databases of app shims and fixes Windows-the-OS has shipped with for ~forever)?
SteamOS would be reduced to "a Linux distro that preinstalls the Windows Platform." ReactOS might or might not (depends on stubbornness) be reduced to "a clean-room-implemented NTOSKRNL-compatible base OS, that preinstalls the Windows target of the Windows Platform."
Wine/Proton themselves would, if they even bothered to keep going, end up rebranded as "an alternative ground-up Windows Platform runtime." (If the official "Windows Platform runtime" was then open-sourced, then likely Wine/Proton would fully fade into obscurity, as anyone who wanted to maintain their own libre Windows Platform runtime would just start by forking Microsoft's. Very similar to the situation with OpenJDK.)
---
In any case, regardless of how they do it, any move in this direction would make it blindingly obvious why Microsoft wouldn't care about enabling something like a "Boot Camp 2" feature on Macs any more: they no longer care if you install Windows-the-OS on a Mac; they rather want you to install the Windows Platform runtime under macOS. And then they'll have you as a consumer of Windows Platform products all the same.
(Actually, even better, as they'll have you far more of the time. In the Boot Camp business strategy, any time you spend booted into macOS puts you out of Microsoft's reach, save for the few first-party apps Microsoft has ported to macOS + sells on the App Store. In the Windows Platform business strategy, meanwhile, you can be running arbitrary Windows Platform apps on your Mac 100% of the time you're using it!)
To be clear, "Apple" is a group, not a unified thing with one will.
That doesn't mean that the engineers will necessarily ship something more flexible than what the PMs asked for. Often not.
But sometimes they will.
Apple's legal department will kill it once someone tells them the project is a handy tool for patent trolls to mine for infringements.
> announce « The most secure Mac ever » silently releasing with closed bootloader
Is that gonna be before or after the iPhone with no USB port?
Switching to developer mode is very likely something he won't be doing, nor be allowed to do, on the Chromebook his parents bought him or the school assigned him.
Will a managed MacBook allow the installation of random native apps, either?
Though let's be realistic, here: $600 is much more than the typical school-assigned Chromebook.
It's $500 for a kid, a full-time student.
Yes
Some kids undoubtedly get there, exactly as you say. That's not at all the same experience as opening a device that has a MUCH bigger sandbox to begin with and lets them start exploring with boundless applications from the beginning.
The bootloader kids get my deep respect. I think I'd rather give my kid a Neo to begin with.
You can't install a different OS on these? Are they different from the M series? Because those have Asahi Linux.
Asahi Linux effectively only supports the M1 and M2 chips, so even a modern MacBook Air won't work, and even on "supported devices" you can't use Thunderbolt or a USB-C display yet.
These use the A-series chip, and even supporting new M-chip revisions has been enough of an undertaking that I wouldn't really expect these to get Asahi Linux anytime soon.
And Apple can lock down the bootloader to be closer to the iPad/iPhone at any time with no notice, and based on their past actions, it would be quite in line with their character to do so.
By "past actions", do you mean doing extra work to make the bootloader support other operating systems? https://asahilinux.org/docs/platform/introduction/
The Asahi folks have also demonstrated M3 support, though without GPU-accelerated graphics (the M3 GPU is very different from the M1 or M2). Much of the effort is currently on getting the existing components upstream.
Surprisingly enough, you don't need Linux to learn about computers. You know that Macs have a terminal?
The default Mac terminal environment is the Weetabix of UNIX-likes. You need GNU coreutils to do pretty much anything.
My first unix was ultrix. I'll take the default mac stuff over that any day of the week.
I'm confused. Isn't coreutils just a small subset of even macOS's current zsh's builtins? What do you prefer about systemd over launchd? defaults seems like a convenient way to manage settings. Is it confusing for people coming from other operating systems?
Name one thing lacking in the utilities included with macOS (which come from BSD).
`grep -P` is kinda annoying: GNU grep has Perl-compatible regexes, and BSD grep does not. You're reaching for `perl` or installing `ggrep` the moment you need a lookbehind.
BSD grep is the pure grep version though. Perl regex is unnecessary bloat.
Is it still shipping with that ancient bash, the awful Iterm and without a package manager? I haven't used OSX for a while.
No. Zsh is now standard, though an old version of bash is still included as an option. Apple hates GPLv3; that's why they moved away from bash.
The terminal app is not iTerm, but Apple's own Terminal.app.
And no, there's no built-in package manager, but there are Homebrew and MacPorts.
I didn't know it was a homemade terminal; it's just that it looked old and abandoned compared to your average Linux distribution.
It is made by Apple, yes. It's not very bad; it even has big-font support inherited from the VT100 series, and a lot of style settings in the menu bar. It's not iTerm2, but it's way better than what Windows offers (not just the console; the newer Windows Terminal isn't as good either, IMO).
The overwhelming majority of UNIX-like software isn't designed for BSD runtimes, to name one.
Throughout history the overwhelming majority of unix-like software was designed to work only on the particular flavor of unix used by its author.
So exactly what "UNIX-like software" will kids be missing that prevents them from learning about computers?
The overwhelming majority of UNIX-like software is available in the package managers right now for major BSDs.
I ask for a specific example, and you respond with more generalities.
Aside from the BSD software, the Mac software, and all the software that's actually POSIX-compliant (on purpose or by accident).
Asahi only supports M1 and M2 series Macs currently. The Neo uses an A18 Pro, which was only ever in an iPhone before. I wouldn't count on Asahi coming to these soon.
I see no reason they couldn't.
But we know there are lots of other models that they're already working on. We don't know how similar or different it is from an OS perspective.
The reason is the lack of documentation from Apple.
Reverse engineering needs a lot of time and hard work, which may not be worthwhile.
Sometimes someone does this work, and everyone may benefit from it, but you should never count on this happening, unless you do the work yourself.
Reverse engineered documentation is very often preferable to the internal kind, which is not necessarily accurate. So either way, the Asahi folks are doing valuable work.
Maybe some of these agentic AI superstars can point their 100x engineering chops at this. That would impress me, but I'm not going to hold my breath.
This is an argument, but it's also fundamentally comparing a computer that works out of the box to one that doesn't.
I really don't get this comment section. You get a Macbook then you have a perfectly usable machine which will run all the mainstream software you ask of it, and then you get natively compiled well supported developer tooling, no VM required. The best argument for Chromebooks is that you can throw away ChromeOS and install Linux or use Linux in a VM. These are not even close to the same.
I think folks want to hate Apple more than they want to admit that Chromebooks kinda suck.
I was sadly too dumb in high school to figure out how to get Linux running on my Chromebook.
There's an entire Linux distro (Asahi) for MacBooks. Apple has never released a Mac with a locked bootloader.
And macOS frankly provides a far better Unix experience than ChromeOS, in my experience, having actually used both (including for development, though only for a short time on ChromeOS because it was horrible).
Apple did not lock the bootloader, but they do not provide documentation for their products.
What would have been trivial porting work with documentation becomes extremely time-consuming and hard work without it.
That is why Asahi Linux lags several years behind in supporting Apple computers, and it is unlikely that this lag will ever shrink. Even for the old Apple computers the hardware support is only partial, so such computers are never as useful for running Linux as AMD/Intel-based computers.
> Or they learn to enable developer mode, unlock the bootloader, and install Linux, or use the officially supported Crostini, or so on. There's like 3 different ways to run Linux desktop apps on a modern Chromebook.
Oh, so all our hypothetical child has to do to discover what computers can actually do is completely rebuild the machine's software from scratch with no prior knowledge.
Next you'll tell me F1 drivers in their teens just have to LS swap a Saturn SC2 and book time at a track.
It's really not that hard. Someone who can follow a tutorial can do it.
5 seconds of googling will get you an answer to "install blender on a Chromebook"
I used to be the cool tech guy in school because I memorized the tutorial to jailbreak iPhone or to cheat in games with a memory editor. You know, stuff like "when you see this screen, click that icon", "find row 5 and change the second value to 0", or "open terminal, copy paste this command and hit enter". I don't think I learned anything useful from those.
You learned that such things are even possible, and you learned that other people saw you as the cool tech guy just because you took time to memorise that stuff.
Well, sure. Maybe you're the kid in the article who opened Xcode and Blender and Final Cut, but it didn't click for you. Of course not everything is for everyone, but it doesn't prove exploring the limits like that is a bad thing.
And these days, you can ask your favourite LLM for step by step advice, and you can even give it shaky phone camera shots of the error message on your screen.
> It's really not that hard.
Of course not. I could do it in a coma. I've also been using computers since 2004, and you're probably similar.
I've been using computers since 1991 (I'm 42, from 1984), and to be honest this stuff is getting harder and more confusing, not easier. Mostly because it keeps changing, and not based on any logic towards improvement. Sure I'm good at getting my questions and problems solved now, especially with AI, but I don't believe I have the ingrained mastery I felt after a while with computers in the 90s.
?? I installed Omarchy on an old MBP simply by inserting the USB stick into a USB port and holding a key combo during boot. Didn't have to unlock anything.
Try it on a new one.
> He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called "Projects" with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes. He is going to open GarageBand and make something that is not a song. He is going to take screenshots of fonts he likes and put them in a folder called "cool fonts" and not know why.
"Mrs. Jonson, the results are back. Your son has autism."
Autism is a quite strong diagnosis at the end of a spectrum. Not every tech-loving introvert is autistic. That's the kind of arrogant attitude that marginalizes nerds and on a forum like HN people really ought to know better.
As somebody who identified a lot with what's in that article I can say that I haven't just made peace with having been "different" but I love it and wouldn't want it any different comparing my life today with that of the arrogant non-nerds who made fun of us back in school.
Would you give it a rest? It's being used here as a colloquial term that's distinct from an actual medical diagnosis.
The colloquial use of the word "autism" carries with it a specific connotation and mental image. That primarily negative stereotype is being reinforced by the joke by way of it being delivered as a medical diagnosis ("the results are back").
Your parent comment is arguing against perpetuating the wrong negative connotations and lack of understanding of autism.
Not to say the original author was doing it maliciously; I don't think they were.
And before I'm lynched: I was that kid too, and I agree with the author. It was just funnily written.
I was that kid too, and I probably have Autism
(yes, I know we're all on the spectrum somewhere, but a diagnosis is defined as it impacting your life severely, and I think many people would say that I have many autistic traits that are negative).
This is also the sort of obsession that follows ADHD. Source: me
I mean, I did this sort of thing a ton and am not on the spectrum. Idk when being "quirky" started meaning you're autistic; seems like it was within the last 7-10 years though.
> He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes.
This hits home. Not because I did it as a kid; I'm a bit old for that. But because I've done this exact thing two or three times. You stare and know, just know, that somewhere in this byzantine interface there is the raw power to do lots of cool 3D stuff. But damn. It's quite an interface.
> That is not a bug in how he's using the computer. That is the entire mechanism by which a kid becomes a developer. Or a designer. Or a filmmaker. Or whatever it is that comes after spending thousands of hours alone in a room with a machine that was never quite right for what you were asking of it.
Yeah. For me it was an old, beat-up 286 that I couldn't get anyone to upgrade, and a loving devotion to MS-DOS, old EGA Sierra games, TSR programs, TUIs, GeoWorks, and just not being able to get enough of it.
When I finally saved up enough to buy a 486 motherboard, I installed Linux because it seemed cool (and was cool) and never looked back. But that 286 sparked my obsession with computers that has influenced almost every aspect of my life.
I still love to revive old hardware and push it beyond its limits. Mostly because I think it's fun, but also because it's dirt cheap or free. Back then I made an old GPS system play Monkey Island, play MP3s, or read e-books. I reinstalled lots of old Android phones and tablets, made photo frames out of them, made webcams out of them, transformed old laptops into Chromebooks, and made lots of old NAS devices work again. Stuff like that.
I got a cracked copy of 3ds Max at a LAN party (back when it was "Discreet 3dsmax"), and immediately dragged dozens of cubes, spheres, and cones into the scene.
Then I closed it for a year. Opened it up again one day, followed a box-modeling tutorial (from the documentation PDF linked in the Help menu!), and I was hooked. Spline modeling, rigging, walk cycles, texturing, lighting experiments, every spare minute for the rest of high school.
I still remember the whole-body panic of accidentally turning on "adaptive degradation", which replaced all meshes with their bounding cubes when rotating the viewport camera, and thinking I had broken my video card.
This article is a strange combination of defending the macbook neo from stupid attacks, and making similarly stupid attacks on the chromebook, with no self-awareness (unless there's some level of irony I'm missing here, which, come to think of it, might well be the case).
Chromebooks have a Linux VM where you can install anything, including GUI apps, and doing that is much more straightforward than installing something from the web on a Mac. Download, right-click, install on Linux. No scary warnings. No need to go to System Settings.
I think this is entirely fair - this laptop imo could satisfy 90+% of Apple's userbase (including me, I used to do dev work on my M1 Air I bought out of curiosity) - it's a fully featured computer, that's incredibly well made, to the point that I don't think anything compares favorably in the Windows land at any price point. Likewise you'd have to go for the premium segment if you wanted to get a machine with similar single-thread perf (which is the perf that matters the most for most people)
It's not locked down in any way, this is a fully featured machine, unlike Chromebooks, which in my experience don't cost a cent less than equivalently specced Windows laptops. Due to this I never considered buying those.
I'm not even a Mac fan, I just think they make nice computers and their OS has the least amount of downsides atm.
This is why I skip tech youtube videos with MacOS. The users have the reality distortion field.
I always wonder what the world would be like in a battle between Google and M$ rather than M$ and Apple. Obviously less advertisement, more focus on function and less form.
As a kid, I grew up before laptops became hugely popular, so instinctively my thought was: of course a MacBook (indeed any MacBook) is not the computer for a kid; the Mac mini or an iMac is. The author started with a 2006 Core 2 Duo iMac in the kitchen, and he should know this. The simplest and easiest parental supervision that doesn't involve any software is to have a desktop machine in the living room, within the peripheral vision of parents. Watching some videos that you shouldn't be watching? Dad comes and tells you why not. Want to bring the computer to your bedroom and eschew sleep? Physically impossible. Playing a game for an hour? Dad comes and tells you it's enough. But learning about formulas in Excel? He comes and offers to answer your questions.
The Neo seems kind of nice, but I don't really see how it's more significant than "a nice low-end computer." The article reads like it's fire from Olympus, but a nicer screen and trackpad are only incrementally better than what was available in Chromebooks and cheap PCs.
Personally I think the Steam Machine will have a better chance of sneaking a general computing device into the home of someone not looking for it. The Neo gives me hope on price point.
For the past decade or so, many children had no access to real computers. Before covid, many households either only had school-issued chromebooks, or only smartphones. With covid causing a rise in remote schooling, many families got laptops, but again often only locked-down chromebooks.
There are adults nowadays who do their taxes on their phone, cut videos on their phone, and edit spreadsheets on their phone.
And while smartphones and chromebooks are great at accomplishing your desired tasks, they offer no opportunities for growth. You can't change and play around with the system, become a power user, modify your system, look behind the curtain, and gain real understanding.
There's an excellent blogpost on this topic, "The Slow Death of the Power User": https://fireborn.mataroa.blog/blog/the-slow-death-of-the-pow...
It's an interesting paradox: the more we made computing accessible, the less we got out of it.
When a PC was expected to boot to an OS and not much else, we had all the freedom - by necessity - to tinker and learn. Hardware was barely enough for most day-to-day usage, so we upgraded relatively frequently and got to know the physical innards as well.
This is all so streamlined today that even computers can be smartphones with "apps", or even just a browser that gets you to google slides and everything else (or the MS equivalents). It was probably a necessity that, as computers became infrastructure, they would become simplified, so 90% of the population can indeed file their tax return online (and the remaining 10% have their younger family members do it).
This also means that people nowadays simply don't know that they can walk into any second hand store and get a $200 PC with a warranty that'll be much more productive than any smartphone if they have the knowledge to use it properly. But was there really a loss? These are, for the most part, people that would not have been able to hop on the internet wagon if it'd relied on maintaining a linux distro at all. That's regarding adults; children now do indeed grow up with walled systems for the most part, and that might be a loss.
In the past, it was hard to start using computers, but once you did, the journey from user to expert and developer was smooth sailing.
Now it's much easier to start using a computer, but going beyond that has become so much harder.
I didn't really read it as a specific advert for this computer, but rather a nostalgic defense of cheap starter PCs in general. It gave me some hope for the future.
I'd say the low end is closer to a Raspberry Pi or perhaps a used old Thinkpad. A $600 machine with good single-core performance is only low end if you ignore everything outside the Apple product lines.
> trackpad is only incrementally better than what was available in Chromebooks and cheap PCs
Did you use a touchpad of an old cheap PC? Apple would not dare to use one comparable to that in their wildest nightmares.
One also has to consider that Apple remains an "aspirational" kind of computer. The things bemoaned by HNers due to Apple having something of the status of a luxury brand delivering a premium computing experience are also desirable to huge numbers of people in the world seeking to improve their status and lot in life. It's very easy for us in the West to overlook that there are a couple of billion people in the world earning $300-400 a month. So there are a billion kids out there who would perhaps be lusting after this machine instead of struggling along with a very recycled and half-decrepit laptop. There are also huge numbers of people in the West who live paycheck to paycheck, so having an actual machine at this price point that will deliver years of faultless computing will probably make a big difference. At least I get the aspirational tone the OP is arguing for: a kid completely learning the edges and maxing out their machine will likely produce better results and better educational outcomes than one given a top-of-the-range MBP or Windows desktop supercomputer.
you're commenting on an ad for Apple
of course it will praise the product like it's golden, turning disadvantages into "that's actually the good part"
It must be a drag going through life as a cynic.
The idea that constraints aid in creativity is not new.
The build quality and usability of Mac laptops is something else; I've yet to see even 2k€+ laptops, the kind people typically get for their jobs, that aren't a pain to use without a mouse and monitor. Whereas I'm sitting here in front of my MacBook and not touching the mouse next to it most of the time.
That's definitely valuable, but not for a child, in my opinion; it's the type of luxury equivalent to a Mercedes over a Renault. Perfectly defensible, but just like a Mercedes is hardly a starter car, I don't think an MBP is fit for a starter PC. It's also mostly useless if you're not traveling for work regularly.
That said, does any of that even matter any more? People were learning Blender, programming and whatever else 15 years ago on low to mid range machines already. The equivalently priced - or dirt cheap second hand - machines of today are multiple times more capable at everything. Stick Linux and a $5 mouse in it and you're 90% of the way to a macbook pro in terms of user experience.
That's to say, I agree with the core of the article: kids will make the most out of the least. But I disagree that this particular laptop is a necessity or a boon for that. If anything, it's a hindrance for being a mac.
1. This is the most optimistic, inspirational thing I've read in months :)
2. Are there kids like that still?? I would love to think so. None of the kids in my circle of parents are. There is one teenager who's going into computer science because they're smart and love math, which is great, but they've never built or explored or been curious about anything on their computer as far as I can tell. There is a big ecosystem of wish fulfilment and instant gratification, and I think the (right) limitations the author insinuated are part of the allure.
To 2., yes - you just have to look in the right places. You're sure to find them in middle/high school or university robotics teams, for example.
When computing was niche, you really only got into it if you had a real interest in it (I assume; I wasn't around back then). That's changed, but it doesn't mean that that category doesn't exist anymore. If anything, it's probably way larger in absolute terms, even if it's a smaller proportion of the people who work on software in general.
I thought it was inspirational as well. When he described making a folder called "Projects" it reminded me of that feeling I had after graduating from a TI-99/4A to a PC. The possibilities were endless.
Different computers definitely give me different feelings when I use them. Some inspire creativity and the desire to make something meaningful or beautiful, others feel like machines made for work or for play. All of the boundaries are fuzzy but, for me, computers definitely have an emotional valence.
> A Chromebook's ceiling is made of web browser, and the things you run into are not the edges of computing but the edges of a product category designed to save you from yourself. The kid who tries to run Blender on a Chromebook doesn't learn that his machine can't handle it. He learns that Google decided he's not allowed to.
As someone who lived on a Chromebook for fun because it was a cheap way to get a browser machine that also had Linux access, I don't really get this. You can run Blender on a Chromebook as soon as you turn on the Linux container. It will run even better if you install Linux on it after a quick firmware flash.
If it's locked down by a school, that's not really the Chromebook's fault; schools are gonna lock down MacBook Neos via management policy the exact same way.
He's right. I built a hackintosh from a PowerMac G4 motherboard I bought off of eBay with my saved-up babysitting money when I was 12 or 13, because I was absolutely desperate to have a machine I could edit movies with, I couldn't afford a real Mac, and I read on the internet somewhere that this was the cheapest way to get one. I knew lots of older brothers who were "into" computers (all of them for gaming) who thought I was an idiot, because building my own Mac made everything ten times harder. I didn't care. I was obsessed.
This is a $599 computer with purpose-built architecture for (barely) running (small, underpowered, near-useless) LLMs. There are children saving pennies for this machine who will do great, horrifying, dangerous things with these computers. I can't wait to see the results.
I remember this period of my own life. I had taken over my father's old 486 and spent my days and evenings trying to learn the basics of programming in C. I was making silly text based games, dreaming I'd one day be creating the game of my dreams. I also modded games by opening every content file and trying to figure out what they did and how I could modify them. I was still years from realizing game development was a career and not just a hobby.
I had replaced all the Windows sounds and cursors to customize the system so it looked and sounded like a Sci Fi system. I even patched the boot screen to be a humorous screen of "MS Broken Windows". It also was quite broken from messing with system files I didn't understand.
It was a magical period and I learned so much.
Because most people don't know that the boot screen and even the shut-down ("Safe to Shut Down Windows") screens were simple BMPs, they get shit-scared when you "hacked" the computer to show different messages/pictures. (Always back up and keep a renamed copy of the BMP, just in case.)
I appreciate the article and agree. If you have a desire to learn computers, just get your hands on whatever you can and learn.
I second that! This is also how I feel about Raspberry Pis. There's so much they can't do, and yet in a way they can do everything. It's not about the power of the machine; it's about how much control you have and how close you can get to the metal. At least that way you learn why you need more powerful hardware.
Chromebooks and phones teach nothing.
It's one click to set up a Debian environment on a Chromebook. Same on an Android phone. You can learn plenty from that. Once you've learned the limits of what you can learn within that environment, it's not difficult to then unlock the bootloader and learn even more.
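For what it's worth, the "one click" really does drop you into a stock Debian container (Crostini's terminal). A rough sketch of what a curious kid might type next, assuming the defaults (package availability and versions vary by ChromeOS release, so treat this as illustrative, not gospel):

```shell
# Inside the Crostini terminal it's ordinary Debian: apt works,
# and Blender is in the regular Debian repositories.
sudo apt update
sudo apt install -y blender
blender --version   # confirm the container can launch it
```

From there the usual Debian learning path (man pages, dpkg, systemd units inside the container) is all available without unlocking anything.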
To be honest, any time I see someone recommending how easy it is to install Debian, I always feel like they're some relic from the nineties. Kids likely won't follow any advice starting with "install Debian".
They will if they are at all "technically curious" and bump-up against the limits of "ChromeOS" and running software they want to execute. A couple quick searches will find them some instructions, and boom - after a week or so, they are running Debian, their own Minecraft server, Blender (poorly), or whatever had prompted them to look for alternatives.
Never underestimate the time investment and frugality of a "technically curious" young person... Myself, I would have been a happy end-user, loading/playing games, running software - except I bought a cheap modem - with physical IRQ jumpers and no documentation - and its default jumper settings conflicted with my mouse in Windows. If it hadn't been for that cheap/frugal purchase, and then having to invest the time to troubleshoot, I wouldn't have become "technical" and moved on to greater and greater challenges and learning experiences. Most people would have just returned it and gotten an external modem instead, or given up on even the possibility of connecting to BBSs...
What is fundamentally different from the late '80s/early '90s is that now there is a tremendous wealth of knowledge on the internet to actually facilitate that troubleshooting type of learning. Is that better? Well, there will always be a "known solution", but what I find many people do now is follow the first set of instructions they find, treating them like a magic spell, without knowing or learning why and how... [And if the first set of instructions doesn't work, the majority just give up.]
Overall - in my experience, the percentage of people who are truly "technically curious" is about the same as it ever was - single-digits... It ultimately depends on whether or not their interests/passions/blockers align with being forced to go "beyond" their comfort zone.
Yeah, that really resonated; the author captured something about the way kids explore.
It brought back memories of when I first started using a Unix time share at university, and exhaustively read all the man pages. Didn't know why, just wanted to discover everything.
Except if it's an Apple device.
> They have very little interest in what you might become because of one.
Love the spirit of the post.
As a high school dropout, with a GED, I've spent my entire adult life looking up noses. I chose a career jammed to bursting with sheepskins, because I really enjoy doing tech. Not because I wanted to make money, or because I wanted to be a big shot.
My first ever program, was in the 1970s, some time. It was a Heathkit programmable calculator. My first ever āseriousā program, was Machine Code, typed into a 6800-based STD card, nailed to a piece of wood, with a hex keypad, and an 8-digit LED display. My first personal computer, was a VIC-20, with 3KB of RAM. My first Apple computer was a Mac Plus, with 4MB of RAM, and an external 20MB SCSI hard drive.
Learning on limited resources helps us to become frugal and efficient. It also helps us to become tough as hell. Some of the best engineers I ever worked with, had rough backgrounds.
These days, I use a pretty maxed-out Mini, and an LG ultrawide screen. Iām spoiled.
AHH an apple acolyte. so not a real computer user then
Yeah… I've been hearing that, along with "Apple's a dead company; give it up," for much of my life…
> He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it
10 year old me identifies with this so much.
I managed to get the computer to display 256 colours instead of the 16 it had been set up with. Everyone was impressed and this meant I was now allowed to take the computer apart and put it back together again without anyone being scared.
> He is going to go through System Settings, panel by panel, and adjust everything he can adjust just to see how he likes it. He is going to make a folder called "Projects" with nothing in it. He is going to download Blender because someone on Reddit said it was free, and then stare at the interface for forty-five minutes.
Brilliant. Thank you for that.
>Somewhere a kid is saving up for this. He has read every review. Watched the introduction video four or five times. Looked up every spec, every benchmark, every footnote. He has probably walked into an Apple Store and interrogated an employee about it ad nauseam. He knows the consensus. He knows it's probably not the right tool for everything he wants to do.
Anywhere in the world, the kind of kid that does all this and installs Blender on it is WAY more likely to save up for any janky terrible half-working PC laptop with a bit better specs (memory in particular), or a desktop computer if possible, because A. games B. Linux C. piracy and more software D. he does not care about it being Apple or "just working", in the words of the author himself. I don't know how the US kids in particular feel about this since the reality distortion field is so strong, but anywhere else it's like this.
Depends on the kid's hobby and purpose of the computer. I've always known that MacBooks are better for music making (especially driver hassle, audio signal reliability, etc.). Could I afford a MacBook as my first laptop? No. Did I buy a second-hand MacBook as soon as I could afford it, and have been happy with it since? Yes. As a teenager, I would've loved to buy a new MacBook for 500/600€.
I'm sure most other applications are less Mac-optimized, though (software development, 3D/graphics editing, gaming, …).
That's very true, and it's a great tragedy. Kids get skilled in troubleshooting awful computers instead of getting skilled in creative things which actually have future value.
As for piracy, it's just as easy on a Mac, and MacOS has more quality software than any other platform - unless you're talking about software used in factories and such.
But how many kids actually "save up" for a computer vs being given one by their parents or getting a hand-me-down from relatives? I would suspect that many parents would be more than happy to buy a Macbook for their kid if they showed that kind of interest.
I think you're making it sound like they're forced to build Linux from scratch while walking knee deep in the snow, uphill both ways. That's way too detached from reality and certainly it's not a "tragedy". Kids are not doing specialized things that only have future value, they tinker with everything. Usually on what's available, yes.
>and MacOS has more quality software than any other platform
This is simply untrue, and not something a tinkerer cares about on a general-purpose machine anyway (with my niece and son as n=2).
Windows has also always required a lot of tinkering and troubleshooting to make things work in a pleasant way.
But for Linux, the creative software simply isn't there in many cases for a kid to start learning. Unless it's programming, which is not everybody's talent.
A kid tinkering with any kind of creative software learns and absorbs important skills which they can build on later if they want to. These things are much more valuable than system troubleshooting or becoming skilled in a game.
Blender, Audacity, Ardour, Inkscape, GIMP, Kdenlive, Puredata (programming, but visual), Krita.
Are these not creative software? Perhaps not industry standard, but what is industry going to look like in a couple of decades anyway?
Half of those are not good enough to inspire creativity in a child, because they are so cumbersome to use. The rest are good as far as I know.
You can't really compare Audacity to Garage Band or GIMP to Affinity (which is now free).
For young people out there, it is better to build a desktop rig instead of buying a laptop. You can't change much in a laptop, aside from some legacy models, and definitely not a Mac. Parents should help their kids build a 16GB Linux rig. It's going to cost more than $599 nowadays due to the price of everything, but it is very expandable, and the kid can earn $$ and decide which parts he wants to upgrade.
BTW a laptop is definitely the only choice if the kid wants mobility, though.
Sometimes I feel privileged for being in the generation that learnt to program BASIC on a C64 when it was the coolest thing around at the time. Being that much closer to the metal is a whole different experience of learning what a computer is and can do.
Is that even possible now? Probably not. Years ago I tried to get my kids interested in playing with their own Raspberry Pi when they came out, that they could do whatever they wanted with on the side, to little effect. Not even the idea of setting one up as their own Minecraft server (they were heavily into it at the time) piqued their interest. Oh well.
Most children of every generation don't care about those things. Most of the few who cared about the C64 just used it to play games. You are in the minority who got interested in the C64, and the minority within that minority who was also interested in BASIC. It's good you tried with your kids, but the odds were against you.
Meanwhile, some other kid in your area probably got scolded for installing F-Droid. Oh well...
I totally admire Raspberry Pi and their attempt to give kids a gateway into cheap computing - made by people of the generation who started on those BASIC machines. But I've always found it to be a radically different experience on a Raspberry Pi, given it boots into a full desktop and has endless things to do, compared to the empty terror-filled void that is a blinking BASIC cursor with nothing else on the screen except for some arcane copyright message. Loading a game from tape and experiencing the 5-minute cacophony of that noise was also a surreal and tedious experience for the nippers of the 1980s. It made you really want it in a way that machines since can't deliver.
Plenty of great tools for kids to start making games with if they're interested in it! Personally, I think running something on a Raspberry Pi isn't very interesting or inviting as a first thing to play with. Creating a game in Roblox, designing an outfit in Roblox, or building a game within Minecraft is more interesting. And people build crazy stuff in Scratch.
But also, not every kid is interested in that anyway.
They used Scratch at school, but you might be right about the Pi just not being inviting.
I tried to learn to program as a kid too. It didn't take, couldn't get past the hello world/simple program stage interest wise. I just wanted to go right to making games. Closest I got was messing with configs and skins and some map making. Took until later in college when I started programming "for real."
Talking about staring at interfaces: I got my first Pentium computer when I was 7, in a village in Pakistan. I spent all day fooling around and accidentally stumbled upon QuickBASIC. Having nothing else to do, I learnt how to code, because the help menu listed all the commands and the interpreter gave errors when I did something wrong.
With a clear feedback loop and the insane motivation of a child, I learnt to make games/software in BASIC, which ended up defining my life.
Sometimes we overthink it; all a child needs is a safe environment to fool around in and the freedom to be obsessed about things.
I was robbed of that safe environment. Gosh... it hurts me to think about what could have been.
> This computer is for the kid who doesn't have a margin to optimize. Who can't wait for the right tool to materialize. Who is going to take what's available and push it until it breaks and learn something permanent from the breaking.
That kid will be much better off with a used laptop and Linux or BSD.
I started college with a white G3 iBook. By the end of freshman year I had installed Yellow Dog Linux, then Suse, Mandriva and eventually Gentoo.
Now, 20+ years later, all my home computers are running Linux (Debian though), and my kids grew up using Linux.
But I'm going to send my teenager to college with Windows or a Mac. They're going to be 1200 miles away, and they're going to need to get support for their computer and I won't be there.
Yes, I like Linux 1000x better than Windows or Mac, but Linux demands a different relationship with the admin. This kid hasn't wanted that relationship with tech, and will rely on friends to help get Office or Zoom or whatever installed.
I'm still deciding between Mac and Windows now. I'll probably end up getting a quality used business laptop from FB marketplace, but the Neo is interesting too.
It's both a great post and a very silly one.
Cheap computers with hardware constraints have been around for decades. Now Apple ships one with pretty damn good performance, and they've invented "cheap computers with hardware constraints." HA!
My first computer was a Commodore 64 I found in a pile of trash a few years after they came out. My first PC was a 33MHz Cyrix Instead I bought off my first college roommate. Now there are some real hardware constraints!
But yeah, necessity is the mother of invention. No doubt about it. Just not seeing how a $600 polished and performant laptop fits that bill ;)
The kid's parents want to be able to monitor their kid. The kid's parents want to be able to drag the machine to a local store and have the people there fix it.
The kid's parents - and the kid - all have iPhones, so it's familiar.
The kid's school requires Windows or Mac for their WiFi and won't let the kid use Linux because they don't trust it.
There's plenty of reasons why Linux isn't the answer in the current climate.
Depends very much on whether the kid's interests lean toward doing computer stuff, or doing stuff with computers.
And they can do the former in a VM anyway. Install Linux, or a BSD, and go. With the bonus that you can experiment fearlessly because you've got snapshots and the worst-case for experimentation still leaves you with an entirely functioning laptop. Or use a cheap VPS, remotely.
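The snapshot-and-roll-back workflow described above can be sketched roughly like this with QEMU (a sketch, not a recommendation of any particular hypervisor; the disk image and ISO names are placeholders):

```shell
# Sketch of the fearless-experimentation loop with QEMU.
# "tinker.qcow2" and "some-distro.iso" are placeholder names.
qemu-img create -f qcow2 tinker.qcow2 20G            # create a virtual disk
qemu-system-x86_64 -m 4G -hda tinker.qcow2 \
  -cdrom some-distro.iso                             # install a distro into it

# Take a named snapshot before experimenting...
qemu-img snapshot -c before-experiment tinker.qcow2
# ...and if the experiment bricks the guest, roll back:
qemu-img snapshot -a before-experiment tinker.qcow2

# Or boot with -snapshot so no writes touch the disk image at all:
qemu-system-x86_64 -m 4G -hda tinker.qcow2 -snapshot
```

GUI tools like UTM or VirtualBox expose the same idea as a "snapshots" button; the worst case really is just rolling back and trying again.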
Most schools don't let you use chargers due to fire and tripping hazards. The MacBook's strength is that you can use it on battery for the entire day. Most alternatives fail at this.
They would still be better purchasing a used MacBook as you can find those at similar prices and they (assuming they aren't Intel) will have the long battery life and more.
ARM PC laptops are on par with Macbooks in terms of battery life nowadays.
Is ARM Windows usable these days?
Very much so if you don't care about gaming. x86-64 emulation has already been great, and 99% of popular apps have native ARM64 versions. The only exception for me, for a long time, was Discord; I used to use an unofficial wrapper called "Legcord" instead. But now even Discord has a native ARM64 Windows build. I mostly use my laptop for software development + browsing.
I haven't tried gaming, but I feel like it'll suck for almost anything that's not natively ARM64. Steam doesn't have an ARM64 based client yet, AFAIK.
Isn't Discord just a web app? Why would you need an ARM64 specific version in the first place when you have a browser?
Your question is essentially "why do Electron apps exist?" and the full answer would be quite long.
The most important one is that an app's lifecycle can be different than a web browser. You don't always keep a web browser open, but you might want to keep Discord open regardless of what you do with the web browser. That kind of lifecycle management can be tedious and frustrating for a regular user.
Discord's electron app has many features that its web app doesn't such as "Minimize to system tray", "Run at startup", "Game/media detection", "In-game overlays" etc.
Even PWAs can't have most of these features, so that's why we have to deploy an entire browser suite per app nowadays.
I've been an Apple fan boi since the Apple II in my room. 44 years later, 15 in software engineering, and I'm still very happy with Apple.
> That kid will be much better off with a used laptop and Linux or BSD.
True, and suffering through the limitations of the Apple platform will show the kid why Linux is better.
I grew up tinkering with linux since 2.0.x days up until 2.4.x lamenting on how Linux needs to take over Microsoft. Then sudden (legal) blindness hit, and computers were becoming difficult to use. Took a punt on a powerbook 12" at the time, and learned the accessibility features and never looked back. I do 90% of things in a terminal on a mac today, but Windows / Linux is the real limited OS for me due to lack of accessibility features rendering them unusable.
Don't downvote; it certainly did for me. My first computer was a 13-inch MBP from 2009, as I was Apple-obsessed like the person in the parent article. Time passed and I realized I really didn't like either Windows or Mac, and for the past 10 years I've been Linux-only. It really does happen, even if rarely.
Good on you for rising up to the ranks of Linux/BSD.
You just need to recognize that not everybody aspires to be competent with the lower levels of hardware and software, which Apple makes that much more difficult. Most Apple users are content to use apps written by others, and that is as far as their interest goes.
An analogy is the car market. Most people don't care how the car works, etc. They just want to get to places. If you only need to drive to the shops and do minimal errands, you don't even need a truck - a sedan will do just fine. Same with computers, lots of different market segments with distinct needs and expectations.
> You just need to recognize that not everybody aspires to be competent with lower-levels of hardware and software
You don't really need that to use Linux.
People should stop copy/pasting urban myths or stories from the late 90's. We are in 2026, and one can perfectly well buy a laptop preinstalled with Linux, with full support, and just find the apps they need from an "app store" which in this case is just a frontend for the flatpak and package managers. Picking an app from GNOME Software is no different than installing an app from the Play/Apple/Microsoft store.
Yep, everyone has their preference. A lot of us have done both. I've run multiple distros. I've played with low-level software. I have used and continue to use open source tools in places.
And I prefer my Mac to this day as my main machine.
Consumer user or Linux hacker is a false dichotomy people sometimes like to try to slot people into (not accusing you GianFabien).
I think this'd be a good comment if it weren't for the superiority complex :/
My first computer was a Compaq my parents got during that peak era of home PC mass adoption in the late 90s. I immediately played a ton of games, got on AOL, learned VBScript, C++, HTML, etc.
This was such a natural and common thing that I never even questioned if others were having a different experience with computers. This sounds crazy now, but it felt as if everyone was either going to learn to program or already had, not as a career choice but as an essential form of literacy. I mean even the calculators were programmable!
To me, Macs were just "the boring computers" we had at school and what my grandparents bought. They seemed locked down and weird like an appliance. I have no idea what my life would be like now if I had grown up in a different time and with a Mac.
This isn't to hate on Macs, but to tell the story of the dominance of Microsoft at the time and how much culture shifted towards more "dumb" consumerism. By the time the first iPod came out I realized the adults had no interest in any of this more progressive future. Then the iPhone and Windows Vista confirmed it.
I installed Ubuntu on the ThinkPad I had in high school and never really looked back. To this day, I am still baffled by the obsessions people have with AI "replacing jobs" and Apple devices as status symbols. I think those people miss the point entirely, and I worry about their incomplete worldview being passed down to younger generations. What I see is the masses refusing to participate and technofeudalists taking advantage of them.
Unless said kid ports Asahi Linux...
Interesting read! I love to see this spirit. I grew up with a different - but similar experience. Only, as an 80s and 90s kid, computers were nothing but limitations. Even when my dad built a machine with a 133MHz Cyrix chip, already a year later, it was outdated by computers with literally double the computing power.
That Cyrix machine was already miles ahead of the 386 that was handed down to me to play text based games on and learn dos through hard knocks. I remember leafing through old hard drives that had 10mb of capacity and realizing they had no value despite not being that old.
Later in college, I had the confidence to build my own first desktop with parts cobbled together from sketchy resellers. Athlon A1 single core 1ghz. Man that thing could fly.
I loved every word of this article. It took me back in time to my Pentium III 500MHz, trying to hack everything out, like changing the Windows 98 SE splash screen shown while loading, just to feel I did it!
Those things led me to this passion, and today, with LLMs and almost nothing but business applications to develop for the sake of a living, it feels like the magic is finally fading.
God, I miss the old times! (I'm 36 but I feel like 70 in this specific moment!)
I'm sorry to say that those kids are a lot fewer and farther between than they were even 15 years ago, and much, much fewer than 30+ years ago.
When I was working my most recent corporate job (as a people manager, natch) there were new hires even in 2019 that had never owned a computer that wasn't a phone, and just used whatever laptop or other system was supplied by their school or (now) work. This experience blackpilled me a little, I will say.
Why do you think they're fewer and farther between? It's almost certainly at least partially because of Chromebooks and the save-the-user-from-themself design philosophy.
I would say it's primarily because their phone (android or iphone) does nearly everything they would ever want a "computer" to do, and so they never had a hook into real curiosity about computing. This will probably be exactly the same whether the school supplies a Macbook Neo or a Chromebook.
Hackers tinkered with other things (e.g. the telephone) before pervasive computing.
I imagine the young hackers of today just find other things with which to tinker.
I've seen kids build some amazing things in Minecraft. Is this really all that different from modifying the source to a game you copied-by-hand from BYTE magazine?
It's the same epidemic. I don't think the author has a problem specifically with Chromebooks.
I'm a computer nerd who started on DOS on, IIRC, a 286, then all the Windowses (OK, not quite all, though my first job did expose me to a couple of NT versions and 2K in addition to nearly all the home versions I upgraded through on my own) and various Linuces (Gentoo as my primary OS, even, for like... five years, LOL).
These days there are two things I use my "real" non-server computers for in my personal life:
1) Media piracy, only because I've been too lazy to set up a headless torrent system on my media server w/ VPN connection (why does the media server exist? Again, purely for piracy, remove that use case and it'd be gone as a waste of electricity and space the very next day). I could fix this in a Saturday and remove this entire use case, improving my process significantly at the same time, just haven't yet.
2) Video games. Really hoping that new Steam console doesn't end up being sky-high expensive, because I'm excited about "consolizing" this use case and ditching my last "real" PC tower (other than one server that I go months and months without directly messing-with; and that "real" PC tower is already running Bazzite to make it Steamdeck-like, it's just janky as fuck because it's Linux on frankensteined hardware so of course it's janky as fuck, so I'm still eager to swap it for something designed and with support for that actual use case).
Everything actually-important happens on iOS, and usually on a phone.
In my personal life, I've concluded that I just have no idea what more-useful-than-the-time-it-takes stuff I'd even do with a "real" computer, despite spending absolute fuckloads of my time screwing around with them from ages like 7-30. What I learned in that time was "how to computer" to a pretty advanced level, and luckily that per se has paid the bills and then some, but in all that time the actually-directly-useful-to-me stuff I've done with it has amounted to very little.
To me, personally, I've realized PCs are a solution looking for a problem, and that so rarely does my trying to bridge that gap result in an actual net-benefit (usually, not even close) that I've mostly stopped trying. That impulse got me where I am, can't complain, that "wasted" time over literal decades gained me a bunch of skills that are (it turns out) almost entirely useless to me personally but that others are willing to pay for, great, but I find myself a computer nerd who doesn't actually know WTF to do with a "real" computer. I just use my damn phone.
I suppose I'd still find them a lot more useful if iPads and iPhones didn't exist and so I needed to do banking and reading and such on some other computery device, but those do exist, so... yeah, not a lot of motivation to even own a "real" PC anymore, to include a MacBook - I doubt I'll replace my M1 Air when it finally dies.
To sum up, as a lifelong computer dork, I don't even know why I need a "real" computer any more, let alone why anyone else does.
You can't even see this whole comment on an iPhone, let alone the immediate contextual thread that is necessary for framing it.
That sounds flippant, but it's the visceral reaction I have to trying to do everything on iOS (iPads are a little better than iPhone in this regard, but still have the "everything through a keyhole" feeling).
That said, tiny viewports are not really the main problem, since obviously a modern iPhone has far more resolution than almost any monitor available in 1990, or even 2000. It's more that most exploration and creation is not doable on an iPhone, by design.
My original point, though, was that seeing new grad software developers who have never had any non-externally-directed interaction with a computer made me realize why we have to show them how to use the shell in a terminal, and why they seemed to have no particular curiosity about any software thing not directly related to the task that they're doing. For new grads who aren't curious, finding that something they're doing doesn't work due to architectural issues or some nonobvious combination of bugs prompts neither asking for help nor a deep dive into what the problem could be. Asking for help is basically cheating, to them, and they've never before encountered real problems that weren't explained in the text, handled by their group partner, or trivially stack-exchangable (remember, this realization was 2019, and therefore before common LLM usage). At standup after standup, they report making progress on working through the moderately complex ticket they picked up, but in fact, nothing useful is happening. Sometimes people like this can be shown the way, and then become functioning and capable developers and troubleshooters. Sometimes not.
Anyway, /old-man-rant.
I liked this not because it's a good story. It is, but that's beside the point. I liked this because it's my story. Not literally so, but the shape of it is. He's struck a nerve at the heart of growing up eager and curious and seeing a computer as a pathway to your dreams.
I've never had a post hit me with nostalgia as hard as this one. Thanks to the author for capturing what it felt like to be a stupid little kid with a weird old computer so well.
This is true but also not at all the point of a review. Some tools are better suited for some tasks - reviews help those with the privilege of choice find the best ones for them. Otherwise you'd have a review of a hammer saying "this is a great tool for driving screws if you're not afraid to get cREaTive with it!" Folks who need to make do with what they have already know about their constraints.
I took the article as talking about the difference between reviews that say "this computer is not going to be great at X" and the reviews that say "this machine is only good for office tasks or Y". The gatekeeping tone.
It can do most anything. It may not be amazing, but people get by. And they may be OK with it.
I saw tons of comments in the original post about the Neo from people who talked about how they used extremely old hand-me-down/used laptops to learn to start programming and fall in love with computers.
I was just watching a video from ETA PRIME, who tests lots of small computers to see how good they are for gaming.
He was playing RoboCop on it, and it ran pretty well: 45-ish FPS. It was using 11 gigs of RAM at the time, so it was obviously in swap.
Is that ideal? No. But it works.
I'd bet a solid 25% of the people nitpicking the Neo would've called it a breath of fresh air if it wasn't made by Apple.
I don't want one; it doesn't do what I need. But I can definitely see the use cases, especially at that price point.
It reminds me of the iPhone 5C when I had a 5S: a beautiful, colorful breath of fresh air that I wished I had, but my needs were so much greater. But if I wasn't an engineer who needed a high-specced MacBook Pro, I'd go with it.
I have a high specced MacBook Pro, but honestly I mostly use it for vscode tunnels to my actual dev machine.
So the only real benefit compared to my MacBook Air is that the screen is a bit nicer, and I can keep more Firefox tabs open, because it has more RAM.
Bingo! IMO, laptops are best used as thin clients and you do the heavy lifting on servers or a box in a closet somewhere.
I've been migrating my workflow to this approach, and I'm an embedded dev! My closet does have hardware strewn about, but once you set it up so that you don't have to touch wires, it's super convenient.
My one gripe with MacBook Airs up to the M4 was support for only one external monitor. But the M4 fixed this.
This is how I mostly use my Windows PC: remote access from my Android tablet via ssh and rdp. My gripe is entirely different: Microsoft has turned to crap.
RDP: Every time a native notification pops up, I get disconnected (usually the notification is about something I've been doing, such as starting a self-hosted server or running winget via unigetui). It randomly disconnects when I've been using it for more than a few minutes, even when there isn't a notification. All of this so far seems to be limited to Android's rdp client (the Windows app). For Windows built-in RDP client, my issue is that there's no way to make it resize the desktop like vmconnect does when you resize the window (and no way to proxy vmconnect connections easily for home use--I do not want to enable WinRM for the full system and figure out how to secure it, I just want a single PC on the LAN to be able to access a single VM conveniently, preferably able to log in as different users)
But there are issues with ssh (and likely WinRM/PS remoting, though I haven't used it) as well: with Linux you need to use sudo, but with Windows there's apparently no CLI equivalent; ssh runs elevated (though apparently you can change this; I make do with connecting to a running psmux session that's not elevated). So far as I know, there's no way to elevate without the GUI being involved (admittedly I haven't looked since I started using ssh with Windows).
Linux? Connecting to Linux works perfectly. I can use xrdp or ssh or vnc or forwarding x11 over ssh or [other] and they work perfectly. I used to use x2go before Wayland, and despite the pain of actually getting it working, even that worked better; XDMCP required some amount of setup, but it was awesome (too bad there's nothing that efficient with Wayland); xpra looks great, but either didn't exist or I was unaware at the time. The only issues with Linux remoting are, again, Windows-related (it's seemingly impossible to get vmconnect enhanced session to work properly with Linux at all on Fedora 43; the things I've found online don't seem to work for me).
I'm not sure about this being the best approach. It just works for me sometimes.
I thought the single external monitor support was only with the M1. Wild it went on after that
Apple has turned computers into luxury items. They are not simply computers, there is some status and image projection involved.
As a hacker and tinkerer I could never justify the cost of purchasing one of their machines, but I see people around me trying to sell their old Apple machines and phones at absurd prices (I do live in a peripheral economy and Apple stuff is even more expensive here).
So it makes sense for Apple to segment its market like that. It makes sense for their audience too.
When I shop for a car, I find the same issue: most analyses have very little to do with the car's technical attributes and there's a lot of gibberish about design and lifestyle.
> most analyses have very little to do with the car's technical attributes and there's a lot of gibberish about design and lifestyle
Which is what makes a used Lexus so compelling. Toyota engineering under the hood, "Luxury" design on top.
Outstanding article; I'm glad you put these thoughts into words and published them because I've felt similarly this week since I've had time and reason to reminisce on my 2010 MacBook. I had AutoCAD on that poor little computer, working at the pace it could handle.
John Gruber used it for a day and found it was actually totally adequate, or better, for his actual daily work.
> But just using the Neo, without any consideration that it's memory limited, I haven't noticed a single hitch. I'm not quitting apps I otherwise wouldn't quit, or closing Safari tabs I wouldn't otherwise close. I'm just working - with an even dozen apps open as I type this sentence - and everything feels snappy.
https://daringfireball.net/2026/03/the_macbook_neo
I think people are assuming it's going to be a worse experience than it actually is. I don't know how it does it with 8GB of RAM either, but apparently it does; I suppose my guess would be that the SSD and bus are fast enough that swapping on app change is no longer so disruptive? (I don't know if improvements in virtual memory swap logic could also be a thing that matters or not; this is not my area.)
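For the curious, macOS does expose this behavior directly; these standard macOS commands (a sketch of how one might check, not a benchmark) show whether a machine is actually leaning on swap and the memory compressor while you work:

```shell
# Sketch: observing memory pressure and swap on macOS.
sysctl vm.swapusage      # prints total / used / free swap space
vm_stat 5                # page-ins/page-outs and compressor activity every 5 s
memory_pressure          # summary of the system-wide memory pressure level
```

If pageouts stay near zero while a dozen apps are open, the compressor is absorbing the pressure before swap ever gets involved, which would be consistent with the "no hitches" experience quoted above.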
My first computer was a hand-me-down Compaq LTE laptop, several times removed from the original owner, with a 700MB hard drive and Windows 95 a decade after those were leading-edge specs. It had only Word and Access, of all things, and little room for more.
But it was mine, I tinkered with it forever, learned databases enough to turn Access into a basic quasi-Excel for my needs, cataloged things that really didn't need to be tabulated, and generally learned as much as that little machine would let me.
That was a limited computer, one that couldn't possibly have let me do what I needed to do when I hit university. But it got me started, taught me to tinker, and I'm pretty sure pushed me to learn more than a state-of-the-art-for-the-time computer would have.
And so I do wonder, at times, if it's the nostalgic look back at early computing that makes people inclined to say "my god that would have been an amazing computer to start out with" when you look at an entry-level computer. I'm inclined, even, to say man that's going to be an epic $100 computer on the second-hand market in a half decade or less.
When at the same time, it's actually a solid machine for more of us than we'd like to accept, including us geeks with our inflated expectations of computers. That, too, is pretty cool.
In high school we had a G3 Mac that we got Final Cut Pro on. It took forever to render a minute or two of video. But having a nonlinear editing system that took forever to do anything was way better than not having anything at all.
Edit: for true self embarrassment purposes you can see some of the films we made here:
https://youtu.be/FRQv7VUauWs?t=447&si=lCu3rp28XfKWN5ch
And also here:
https://youtu.be/ytKIG802baw?t=615&si=FJI8Cm9yYaXKMZdS
I put the start times at my movies. But there are others as well. lol.
For years I had no personal computer except the machine that served only as my Plex server, until I took it down.
I bought a 16GB M2 MacBook Air after I was Amazoned to work on a side contract when I was between jobs. I used it for four weeks and the only things I ran on it were VSCode, Safari, and Zoom. I would have been fine with the MacBook Neo. Right now, with a job, it's about the same: we use GSuite in a browser.
When I was 6, I got a Commodore PLUS/4, which was Commodore's unsuccessful attempt at a business-oriented 8-bit computer. I think my folks wanted to give me a Commodore 64, but things happen. Since there were not nearly as many games on that thing, I learned to program early. It had a built-in assembler/disassembler (shift-reset would reset the computer without wiping the memory - just the first byte of the program memory), so I learned how to reverse engineer assembly code before I was 10. This arguably "wrong tool" shaped my whole life, and was maybe the most important thing that happened to me. If I had gotten a C64, I could easily have turned into someone else.
My daily driver has a CPU less than half as powerful and a motherboard that doesn't even support DDR4 RAM.
> He is going to make a folder called "Projects" with nothing in it.
I'm still this person today :)
You learn the most from failure and the least from success. Likewise, you grow the most from pushing through the limits and the least from living in abundance. You can do pretty much anything on a full spec Mac Studio, but so can anyone else with a lot of money. But if you push through the limit of a MacBook Neo, you just did something no one else was able to do. And that is awesome.
I absolutely loved reading this, bring back memories. I was that kid too.
Me too… I sold everything to buy a used 512k Mac with a 400k disk drive and an ImageWriter printer… The joy of that time!
Main machine is a $50 Thinkpad.
I haven't used a computer more recent than 2016. As far as I can tell, the only thing I'm missing is AAA gaming (RTX looks cool), and local LLMs.
I did a bunch of game jams on it. Even won one! (Of course, even 2010s hardware is overkill for 2d games :)
I also did some basic video editing on it but it was a bit slow to render.
I won't say I'm not missing out (I'm certainly looking forward to an upgrade!), but you can get surprisingly far with surprisingly little.
Yes, exactly this. I'm using a 10-year-old EliteBook Folio G1. It's about 2 pounds and does what I need it to do. Available with a 4K screen if that's your thing. Does not feel sluggish. Given that I'm not gaming, video editing, or doing local LLMs (and I think there is a big chunk of the population in that camp), I feel like I am missing out on nearly zero.
(And I'm not trying to say anything is special about the laptop I'm using. I adore using trackpoint (so much that I brought my own trackpoint keyboard in to work to use there) so would gladly trade for an old thinkpad if what I had didn't already do what I need it to do).
In the middle of a gaming session one stops thinking about graphics once it's reached a certain level of fidelity, and that level is far below RTX. Not worth the money, especially today.
This reminds me of learning Linux and CLI commands because an old netbook with something like 512 MB of RAM was struggling with modern GUIs and web browsers. So I used a minimal i3 installation and interacted mostly with a terminal.
"I edited SystemVersion.plist to make the āAbout this Macā window say it was running Mac OS 69, which is the s*x number, which is very funny."
If I saw that on my kids' computer it would fill me with pride.
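For the curious, the file behind that prank is just a small property list. Here's a toy sketch using Python's stdlib `plistlib` against a local stand-in file (on modern macOS the real `/System/Library/CoreServices/SystemVersion.plist` is SIP-protected, so this only illustrates the format, not a working prank):

```python
import plistlib
import tempfile
from pathlib import Path

# A toy stand-in for SystemVersion.plist, written to a temp directory.
path = Path(tempfile.mkdtemp()) / "SystemVersion.plist"
version = {"ProductName": "Mac OS X", "ProductVersion": "10.6.8"}
path.write_bytes(plistlib.dumps(version))

# Read it back, bump the version to the joke value, and save.
data = plistlib.loads(path.read_bytes())
data["ProductVersion"] = "69"
path.write_bytes(plistlib.dumps(data))

print(plistlib.loads(path.read_bytes())["ProductVersion"])  # → 69
```

The "About This Mac" window reads its version string from that plist, which is why the one-line edit was enough for the joke.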
I do miss the days that a (Linux) computer was like this for me. I say Linux because I had a similar obsession with FOSS and what it meant in a broad sense. But it doesn't matter; before that I decompiled some program to make the text on the Windows START button different. Reinstalled Win 2000 about every week, often after f-ing it up. Before that I changed some lines in DOS' autoexec.bat so it would ask for a password (which was just 2 input parameters readable in the autoexec.bat, but that is some mighty fine security (through obscurity) in a normie family).
This article struck a nerve. There's something about the curiosity of tinkering around in a computer. It's the most powerful technology humankind has built. It's versatile. It lets you break it. It's a bicycle for the mind, as Steve Jobs would say.
May all the hackers out there, old and young, discover the beauty of the personal computer.
What's so special about this machine? Is there no comparable $599 Windows or Linux laptop for kids to buy?
There are absolutely $599 laptops available from companies that aren't Apple, but they've all made major compromises to get to that price in ways that people will notice.
The cooling will be terrible so that every 30 seconds the fans kick in at full speed, thermal throttling takes hold, and then it decides all is fine 30 seconds after that so you're working on a machine that's constantly cycling fan noise. The trackpad will suck, meaning you need to have a mouse precariously balanced somewhere anytime you're using it. It'll have some irritating BIOS feature you can't work out how to turn off that flashes a giant icon on the screen whenever you hit the caps-lock button. The casing will be plastic and snap where the screen hinges are screwed in after a year of light use. It may even be a Chromebook, making it a glorified tablet with an attached keyboard and (terrible) trackpad.
The thing Apple have done here isn't to release a $599 laptop, it's to release a $599 laptop where the compromises are ones the average user buying a $599 laptop isn't going to notice every day, while not compromising on the things they do notice. It's made of metal, the trackpad feels good to use, the keyboard is pleasant, and the battery is large enough that you're not carrying around a charger that weighs as much as the laptop itself.
> compromises the average user buying a $599 laptop isn't going to notice every day
Oh my god, yes. I've read way too many discussions that completely overlook this aspect.
So many people get fixated on meaningless labels such as "smartphone CPU" (meaning it's bad) and things to pick on like "ew, no HDMI or Ethernet," as if each were a life-saving thought-terminator preserving the worldview in which absolutely nothing under the Apple brand can be in any way good.
As for new, I have no idea, but I think I have never spent that much on a computer, as I always buy second hand. €400 has always been my max. I honestly don't know why I should spend more; the brand-new laptop the company I work for gave me doesn't run significantly faster than my 7-year-old laptop, and this has pretty much always been the case over the last 25 years.
I looked at laptops around that price range in Australia, and they're all awful in one way or another. Either they're Chromebooks (yuck) or come with an ad-infested OS (gross) and/or with serious compromises in quality and finish.
When doing my Bachelor degree my dad gave me an old ThinkPad running Linux. It was a horrible experience for preparing PowerPoints, papers, etc. But I still have that command-line muscle memory and an eye to spot errors, which really helped me in my career. In my final year I bought myself a MacBook because I earned real money doing a consulting internship. But the Unix muscle memory stayed, and I found working with IDEs so wasteful. In my first years at my job I still rejected Word and Excel, doing everything in groff and awk.
I 100% agree, having put up with an i5-4210U and a GeForce 840M for twelve years. Making stupid shit, ditching Windows for Linux to gain any usability, editing videos in too-high resolutions. I taught myself patience with electronics, and it's the thing I couldn't have learned any other way.
> window say it was running Mac OS 69, which is the s*x number
bro its your blog just say sex
Same. I got a crazy old Ubuntu desktop when I was 9 or something. It couldn't run any games, and that's why I learned Python. I wouldn't be who I am today if I had had a machine capable of running Minecraft at the time…
> Nobody starts in the right place. You don't begin with the correct tool and work sensibly within its constraints until you organically graduate to a more capable one. That is not how obsession works. Obsession works by taking whatever is available and pressing on it until it either breaks or reveals something. The machine's limits become a map of the territory.
This is why you should grow your own trees and wait a million years and melt the silicon into nvidia gpus and install linux.
learn linux.
only sanfran idealists dreaming of a world that destroys them use apple products.
A key point that TFA misses (probably for the sake of story-telling) is that, unlike the 2006 iMac the author fondly remembers, the MacBook Neo is not a hand-me-down computer.
It is not the proverbial gift horse. You are paying fresh $ for it. So, it is only reasonable to have some baseline expectations on redeeming value from it.
Also, an important point of the MacBook Neo criticism is that because of its cut-down features, a Neo may never graduate to a "hand-me-down computer", but instead head straight to the e-waste pile.
ARM Macs are too new for us to know how the reuse/hand-me-down/legacy-support world will shake out for them. There'll be signs when the first M1 machines get axed, but for now, I have no clue.
Apple wants to give me $250 trade-in for my M1 Air with 16GB, but it seems to be worth $500+ on the open market, so yeah, still above hand-me-down territory. It feels as good as the day I bought it, and literally the only reason I'm considering replacing it is now we have 2/3 laptops with magsafe and I'd like to start distributing those chargers around the house. So tempting to just swap for a used M2 for a couple hundred dollars, but the chore of moving to a new computer is holding me back.
This post reminds me of a 14-year-old boy reading the MS-DOS manual to figure out what AUTOEXEC.BAT and CONFIG.SYS did, at his younger brother's football games.
Dunno about Xcode, but if you put a Go compiler on there, I doubt it will compile or run slowly. Some dependencies may require C, but you could mostly avoid that.
> He is going to open GarageBand and make something that is not a song.
As a kid I just loved playing around in Reason (1 or 2?) and making strange sounds and just flipping the racks around and trying to connect the cables in any jack possible, to see what happens.
I would open one of the demo songs, to 'learn' how these racks were wired, and then experimented.
Good old memories.
Kids nowadays need those type of experiences too.
There are also some humorous pieces on this site.
My first computer was a Palm m100 with 2 MB of RAM. No persistent storage. (I once lost all my data when changing batteries a bit too slowly.)
I'm optimistic that kids will continue ignoring adults telling them what is and isn't "reasonable" with this hardware and that software, and just have fun!
I hope they sell so many of these, because the Mac ecosystem is just better for learning about computers than what most young people use daily.
How? I grew up with Windows and learned decent skills on that, probably as much as I would have on a Mac. The current mobile-era stuff has taken a lot of the control and grit away, for the sake of making things 'more accessible'.
Windows would do just fine. But the state of cheap Windows laptops is abysmal, and Windows as a product is in the doghouse lately because... well, I honestly don't know why Microsoft is doing what they're doing, but from the outside they certainly do appear to want to ruin Windows.
Has Windows actually been screwed up, or do people just not like changes in their operating system?
These days it would be an iPad though.
Or a chromebook which is probably worse.
Chromebooks themselves can actually be great machines for hacking (in the traditional sense, not the modern security/jailbreaking sense). E.g. https://support.google.com/chromebook/answer/9145439?hl=en is arguably better than a direct typical Linux install because it's an isolated environment which won't break the main function of the device as you tinker.
As the page notes though, the real problem for kids is the devices are of course locked down:
> Important: If you use your Chromebook at work or school, you might not be able to use Linux. For more information, contact your administrator.
I mostly agree. Just one thing:
> (in the traditional sense, not the modern security/jailbreaking sense)
As far as I can tell, the two senses have pretty much always existed side by side. Nothing traditional vs modern about it.
The more common "modern" definition popularized in the ~90s has dropped the non-malicious meaning regardless which side of the "did hacker originally include both usages or not" debate one sits on. That doesn't mean the original definition ever went away though of course!
They don't have an open source kernel. You can't recompile the kernel or build your own device drivers. I'm not sure what you mean by "learning about computers", but I personally find being able to peek into the kernel source code to be more educational than anything in the mac ecosystem.
The hardware here is incredible, but it's crippled by not adequately supporting Linux, BSD, or any other properly open-source kernel you can compile and install yourself. A good learning environment doesn't put up immovable barriers like "you need a kernel signed by Apple"; it lets you push away barriers when you're ready, like "Are you sure you want to turn off Secure Boot, or install your own Secure Boot keys?"
I'd bet 99% of professional developers have never peeked at kernel source code or built their own device drivers.
The parent commenter said "learning about computers". Most "professional developers" don't learn about computers, they learn enough react to get a paycheck, but don't have an insatiable curiosity about how the whole computer works (i.e. the "hacker spirit").
Professional developers are not what this thread is about. It's about curious kids, about hackers, and that group does peek at kernel source code (as well as everything else).
I'm fairly confident that the Venn diagram of (a) nine-year-olds that are playing with a computer and (b) people who claim that access to kernel source code is a prerequisite to "learning about computers" is two circles that are barely touching.
It's something you never need to look at, until suddenly you do. Any time you format some data for another system and get a cryptic error code back, being able to read the source becomes invaluable.
> You can't recompile the kernel or build your own device drivers.
I just don't think this is what, like, nine-year-olds are looking for in a computer.
In any case, at least it's good that they're starting with macOS over Windows! Puts them on a good path to understanding that POSIX is the One True Paradigm, and therefore makes them much more likely to compile their own kernel in the future.
I was that kid, and can confirm.
My curiosity for all things computer-related was boundless. I eventually tinkered with Linux, but only because I'd been exposed to a *nix-style command line from the comfort of an OS X desktop first.
By then I had started messing around with code but had only built toys and extremely basic tools, and would've been lucky to write a moderately functional desktop program using high-level libraries (which didn't happen for several more years).
Writing drivers or poking around in kernel code was so far beyond my capabilities at that point that you would've had better luck teaching your dog how to knit. I don't think I could've had any chance at doing these things until at least my mid-20s.
> Writing drivers or poking around in kernel code was so far beyond my capabilities at that point that you would've had better luck teaching your dog how to knit.
I get the feeling a whole bunch of teenagers have written drivers to cheat in Fortnite/whatever other game - with that being said, probably not at 9 years old.
> They don't have an open source kernel
Yes they do, in fact: it's called Darwin XNU
https://github.com/apple-oss-distributions/xnu
Eh, qemu runs just fine, so you can peek at Linux kernel code (and recompile and experiment with it) on the Mac just as much as you can on Linux.
> than what most young people use daily.
Most people are using Windows or phones, where that isn't an option.
Yeah you can root or change the OS but that seems outside the spirit of the comment to me.
These are Macs. They run Xcode and you can develop apps for your iPhone for free with one.
Yeah you need to pay to distribute, but a computer to do it has never been cheaper.
Not enough memory -> can't do it.
Not enough CPU -> can do it, but it's slow.
(Ubuntu with the OOM killer - could do it, but when it filled half of memory, it was killed.)
Not enough memory is sometimes just a slowdown these days, with ssd and swap.
Swap has existed since Win95 btw
Yes, though SSDs that can sustain 1.5 GB/s and an OS that transparently compresses memory before swapping yield a much better experience than Win95 swapping.
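The compression point is easy to demonstrate: idle-app memory pages are often highly redundant (zeroed regions, repeated structures), so they shrink dramatically before ever touching the SSD. A toy illustration with Python's stdlib `zlib` (macOS's actual in-kernel compressor is a different, faster algorithm; this is only about the ratios redundant pages can hit):

```python
import zlib

# Build a fake 4 KiB "page": mostly zeros plus a repeated heap structure,
# which is roughly what idle application memory tends to look like.
page = (b"\x00" * 3072 + b"some repeated heap structure " * 36)[:4096]

compressed = zlib.compress(page)
ratio = len(page) / len(compressed)
print(f"{len(page)} bytes -> {len(compressed)} bytes ({ratio:.0f}x smaller)")
```

A page that compresses several-fold can stay resident in a compressed pool instead of being written out at all, which is part of why small-RAM machines feel less swappy than their spec sheets suggest.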
Yes, but if your bar is "still works but slower" you don't need that.
For me, not enough memory is mostly -> close some damn browser tabs.
If we're talking about Ubuntu, ZRAM is still a thing.
Of course. I set it up on my Void Linux and Arch Linux.
I fear this kind of experimentation will soon be killed by AI.
It's an interesting slip, given the premise of the article, that this hypothetical child is assumed to be a boy.
I actually gave a de-gendered draft of the essay to some friends a few days ago and heard that it landed with a thud. The essay is largely written about myself, in the third person, retrospectively, so removing the pronoun made the autobiographical thread harder to follow. I switched it to "he" to make that clearer.
I think if I used "she" it would've made the "That kid was me" transition harder: it would either involve some gymnastics to make it make sense, or it would introduce a reading I didn't intend.
Yes, I thought that was probably the case, but it stuck out to me. Sorry, I didn't mean to come across as chiding you; I think the basic point about who is and isn't "allowed" to think of themselves as a computer-person is a really interesting one, and also that you can't address this topic properly without addressing gender (and race).
"A Chromebook doesnāt teach you that"
But seriously, if you go to school today they will make a point to put you on the slowest, weakest Chromebook available because they're terrified that you're going to play Krunker.
The result is you become an adult and you'll never buy a Chromebook. It's the same way that being bullied on the schoolbus means you become an adult who'll never ride public transit.
https://ithacareuse.org/
will sell you a desktop computer for around $150 (e.g. four of them for the price of a Neo) which will put an enterprising young person on a much better path to learn about computers than the Neo ever will.
Now you might say I'm being Orwellian, but I really think Swift/Xcode/iOS is "slavery" and the web platform is "freedom". I mean, dev tools are fully competitive for the web platform; I can write my application once and run it on desktop and phone and tablet and VR headset and game consoles and all sorts of things I don't even know about. I never have to ask permission if I want to deploy an app or update it. I don't have to pay anyone 30% of my revenue.
Used thinkpad or mini pc seems more practical
For this kind of experience I would recommend just buying a ThinkPad T480 for $200, installing a Linux distro like Linux Mint, and then trying something more challenging like Arch Linux.
I like the cut of your jib! If you lived around the corner, we'd have a coffee.
You nailed it.
Is the poop deck really what I think it is?
https://www.youtube.com/watch?v=O8DB60Wj9Cg
If one doesn't know how to play the violin, they can make a Stradivarius sound like crap.
The hacker is strong in this one. Keep it up.
Learning to make use of limited resources is truly rewarding.
This was lovely, I appreciated reading it
You can run blender on a Chromebook using the Linux environment.
I like how these days you have to say things like: "fuck-ass system modification" just to prove you're not AI.
But it is AI! Or, at least, it's been run through it. (Staccato sentences; Not X. Not Y. Z...) It's a shame for a personal reflection. It's hard to imagine what the (I'm guessing) Claude-isms add that improve what would otherwise have been a nice unmolested personal essay.
What's also a shame is claiming that a single language feature supports a foregone conclusion that the writing has been 'molested'. It's hard to imagine how constructive this comment could have been with the minimum of effort to learn that the author has written this way consistently since at least 2021, before the first public release of ChatGPT.
It's also hard to imagine how difficult you must enjoy being, when you could have offered a kind clarification but instead dove into some obituary style takedown.
Really? In that case I retract the statement and will ponder what AI has done to my ability to assess this kind of writing!
Author here, it's all me. I ran it through Claude before publishing to spot-check grammar/typos and it caught a few syntax things, but this is just my writing style.
Here's a satire piece I wrote in the summer of 2021. Tonally very different, but you can pick up on my voice between it and my essay yesterday: https://samhenri.gold/blog/20210803-localhost/
>What Apple put inside the Neo is the complete behavioral contract of the Mac.
I remember seeing and using my first powerbook 160 and being blown away that it had the "complete behavior contract of the Mac" that I knew at the time. Even the 16 shades of gray screen made it more luxurious than a classic black-and-white Mac.
And the "What's on Your Powerbook" ads, with Todd Rundgren and Rev. Don Doll, SJ.
https://alumni.creighton.edu/news-events/news/father-don-dol...
Todd> Flowfazer, the screen saver I codeveloped [with David Levine]
Fernando Perdomo - Dreaming in Stereo Suite (FlowFazer Video)
https://www.youtube.com/watch?v=3Z4X4FmIhIw
http://www.trconnection.com/trconn.php/article=grokware.art/...
https://www.independent.co.uk/incoming/pop-for-the-people-by...
https://rocknrollwithme.substack.com/p/todd-rundgren-as-a-pr...
>Todd also co-developed graphic tablet software with a music theme for Apple in a technology venture in the late eighties. With Dave Levine, he designed and developed a screensaver product called Flowfazer (see example of one of the screensavers below), with the strapline "Music for the Eye." They introduced it at MacWorld thinking they would publish it themselves, but found there was already well-funded competition with Berkeley Systems' Flying Toasters and were forced to abandon the project.
I used to run POV-Ray on my pathetic AMD 386 clone. A simple render took 20 hours and yet I did it.
Can you use an Apple notebook without an Apple account?
yes.
Depending on your process, there is nothing wrong with starting with this tool (the Neo) first. It's a classic dilemma: for your first tool, buy the cheapest one possible that gets the job done. Once the tool becomes understood, its limits reached, its place in the process discovered, then buy the most expensive one you can afford.
The Neo is the right first tool for many people.
A little dramatic in tone, but I loved it all the same. I really do remember what it felt like to work on a "machine" as a kid. The family Dell, lol. Hit all sorts of walls but learned a lot.
I don't get the folks referring to this as a "Chromebook killer". Chromebooks start at around US$150 new. The MacBook Neo is 4 times the price at US$599. There are premium Chromebooks like the Chromebook Plus line that are more in the Neo price range, but those aren't the ones being bought for schools and such. Doesn't make the Neo a bad thing, of course, I think it's a solid basic laptop from the reviews.
I think for kids in particular, it's important to remember that the educational discount brings it down to US$500. That's not exactly nothing, but it's a pretty reasonable amount for a non-crap laptop.
I used non-discounted consumer prices for both. Education discounts for both the Neo and Chromebooks will bring them down further.
I like the sentiment expressed here, but on this note, I think there are other dangers to consider when listening to early reviewers:
- Reviewers do get early access, and are often receiving units AND doing their tests, writing their script, recording, and editing their videos before regular users can even possibly get a system shipped. At best this rushes them into missing details (e.g., few reviewers noticed that the MacBook Pro 14" M5 keyboard is different hardware than what you got on the M4 Pro, because so much content is rushed).
- Reviewers are almost never experts on what street prices look like because they are focused on reviewing, getting content out ASAP. They are not spending time monitoring pricing with only a few exceptional channels doing so.
- The best marketing machine companies like Apple absolutely groom the review ecosystem without even needing to tell reviewers what to do directly. It's a competitive landscape of self-made YouTubers who are susceptible to positive reinforcement from the industry. i.e., companies don't have to tell reviewers to censor themselves, they can instead use positive reinforcement to select which reviewers are getting the best access and privileges.
Now, about the computer itself: regarding the way the author of this article talks about the MacBook Neo, about the role of a cheap computer as simply a working machine that can get some stuff done, this is the kind of thinking that should likely steer you AWAY from the MacBook Neo that initially looked so exciting.
If you're considering a ~$500-750 computer, well, not only should you be checking the used market, but also, actually look at the competition to this thing.
The reactions I've seen from regular people seem to be, basically, "wow, Apple pulled off an incredible feat, they've disrupted the computer market again!"
Well, let's pump the brakes. First off, realize the Neo is making a lot of the same trade-offs that budget laptops have been doing for years. They aren't even giving you a backlit keyboard! The lower model cuts out biometric auth! There's no haptic trackpad, which used to be a major differentiator for Apple! It comes with a tiny slow charger! The battery life is actually not that good under load/bright screen because the battery is tiny! The CPU is old/slower/low power biased! These are all the classic cheap laptop tradeoffs that give PC manufacturers a LOT of room to actually compete really well against the Neo.
On top of that, almost every cheapo Windows laptop on the market is going to deliver to you a computer with at least a replaceable SSD. Usually RAM is soldered but it's not impossible to find that as something you can upgrade as well even on consumer-ish stuff that isn't just an old ThinkPad.
Actually spend the time to jump on some retailer websites like Best Buy and take a look at what the street prices look like.
There are multiple computers on there that make way more sense for someone budget constrained than a MacBook Neo.
My two favorites, one at a lower price and one at a higher price:
Lenovo Yoga 7 2-in-1 2K OLED Touchscreen Laptop, AMD Ryzen AI 5 340 2025 - 16GB memory, 512GB SSD, $679. This is a proper mid-range laptop and not just some cheap bottom of the barrel model in the lineup. To gain an OLED touchscreen, double the RAM, and the same storage as the highest Neo model at the same price, this is just great all around. I'm pretty sure these get very respectable battery life as well.
Lenovo IdeaPad Slim 3x 15.3" touchscreen Snapdragon X, 16GB memory, 256GB storage, $549. With this model, you get a lot of the same ARM benefits that Apple is giving you. Sure, Windows on ARM is not as polished a native experience as on a Mac, but we are just talking about a cheap laptop that works and, generally, everything you want to do in Windows will work on an ARM system. Once again, you're getting doubled RAM, which is important, and you're going to gain a touchscreen, numpad, and possibly even beat out the Neo's battery life.
Another option is the HP OmniBook X Flip 2-in-1, a little less of a good value than the above, but it's another 16GB/512GB option that slides under $700.
You make some great points here. Here's one of the places I'm coming from that seems to be aligned with the author of this.
I find macOS to be a superior OS for doing computer work than all the alternatives. It still sucks for a lot of reasons, but to my taste it generally sucks less. I'm a web dev, so I host a lot of crap on Linux, and I'm pretty confident using it as a desktop. But for the general day-to-day experience I find macOS superior.
There's plenty of people in similar boats, and this is the most affordable machine (new, not used) that lets someone get to use macOS.
For a lot of people with budget limits I'd point them to used MacBook Air models rather than the Neo, but having this as a new model is a really nice option for some people.
Also, you can call the Neo CPU slow, but its benchmarks run circles around anything you find in its price range. Those machines have more RAM and storage, but the Neo will likely provide a more responsive experience than any of them.
I do agree on refurb/used rather than the Neo. The best low-ish cost computer Apple is selling right now is probably their refurbished $750 MacBook Air M4 with 16GB RAM/256GB storage.
The only way I'll push back on this is the Ryzen 5 AI 340 is faster at multicore than the A18 Pro. Slower single core by a slight amount, and much slower iGPU.
However, that means to compete with the MacBook Neo more completely including integrated GPU, all you have to do is go up one CPU SKU to the Ryzen 7 AI 350 and you're further increasing your multi-core performance lead as well as completely closing the iGPU gap by doubling your GPU performance.
That same Yoga laptop is offered in this configuration including extra storage (16GB RAM/1TB SSD/Ryzen 5 AI 350) for $800
That...really is only $100 more than the 512GB configuration of the MacBook Neo if we aren't tossing in the education store pricing.
Perhaps it's more of a MacBook Air competitor at that price range. Stretching up to $800 is a lot...but you do also get a lot for that stretch.
All of the computers you listed have an inferior CPU, inferior battery life, inferior performance, inferior build quality, and inferior software for most people's use cases. I know we all love Linux here, but a lot of creative (or school, or work) apps that people use don't support Linux, so people must choose between macOS and Windows.
All of the "cons" you list for the Neo apply doubly if not more for the alternatives you provided. Not to mention the cheap plastic build quality, poor OEM support, horrible screens, etc.
Let's go through these:
The Ryzen 5 AI 340 has a higher multicore benchmark score than the A18 Pro. If you go up to $800 you get the Ryzen 7 AI 350, which matches or beats the A18 Pro in graphics, is pretty much on par in single-core performance, and comes configured with 16GB/1TB. Spend $100 less on the high-end Neo with 512GB and you get half the storage, lose a lot of I/O, get a worse screen, and have no replaceable SSD.
USB 3 5Gbps and USB 2 as your only ports are pathetic. Competing systems have more throughput and other conveniences like microSD readers, HDMI, and USB-A.
Inferior battery life: care to send me test data to back that up? The Neo is not a star at battery life for medium-intensity tasks. It has the smallest battery of any Mac, and early reviews note that screen brightness and higher-intensity workloads quickly deplete it. It also ships with a very slow charging brick. I guarantee you the battery in that Yoga is physically much larger than the one in the Neo.
Inferior software: highly subjective. There are over 900 million PC gamers on this planet who can't play PC games on their MacBook Neo. Windows objectively runs more applications than Mac. Plenty of people I know prefer Windows over Mac.
Cheap build quality: again, the Neo has no haptic trackpad, so it's not that different from a typical Windows PC.
Poor OEM support: Lenovo sells parts, Apple doesn't.
Horrible screens: the Neo has the worst screen in any Mac, the Yoga laptop has a touchscreen OLED. Have you seen the Neo screen in person?
In my opinion, this article looks like a straw man argument, and the author appears to completely misinterpret "This is not the computer for you."
Such a statement needs to be understood in the relevant context. It's not intended to discourage kids from buying a Mac! Rather, it's intended to rebut critics who are already Mac owners and who scoff at the MacBook Neo technical specs, such as RAM. The computer is indeed not for them, people who can already afford a MacBook Pro, for example. The point of "This is not the computer for you" is the opposite of how the author characterized it: the point is that the MacBook Neo and its specs are actually fine for the people who are going to buy one.
For some strange reason, the author has invented an imaginary opponent to become offended by. We're supposed to cheer for the kids here, and I see that many people have fallen for it, but the whole schtick falls completely flat for me. The kids were never endangered or discouraged by the reviews of the MacBook Neo.
I don't know why you're downvoted. No matter how many feel-good anecdotes the author tacks onto their article, the premise appears to me to be a strawman. It would have been entirely possible to make pretty much all the same points about just getting a used ThinkPad, or anything really.
My first laptop as a kid was a passed-down business Toshiba that was to be scrapped. I then bought a soldering iron to fabricate a serial dongle in order to reset the BIOS password that was locking it down, and then installed Xubuntu on it. Guess young me shoulda gotten a Macbook instead to inspire the true spirit of freedom and exploration?
It's an old and persuasive myth of the Apple community that of course it's not about the tool, but what you do with it creatively. Still, they never fail to mention how the tool being an Apple is important in one way or another. I just don't get it.
Thank you thank you thank you for writing this. This made me smile and feel like shedding a tear. This is exactly how I felt when I watched the Dave2D video of the MacBook Neo, and other "reviews" that miss the entire point. This captures the point. The other reviews capture specs. This captures the emotion. This captures the reality.
"I faked being sick to watch WWDC 2011 – Steve Jobs' last keynote – and clapped alone in my room when the audience clapped, and rebuilt his slides in Keynote afterward because I wanted to understand how he'd made them feel that way."
jesus christ that's grim
[flagged]
You have many years of experience creating accounts to post like this, as well as to break HN's rules in countless other ways, so I assume the question is rhetorical.
Because it's a new user, ostensibly in bad faith, throwing out short clichéd statements with no clear intent to start a meaningful discussion.
Seemingly effortless comments yelled into the void, not worth starting a conversation with. Not the kind of comments that belong or are wanted on this platform.
I flag uninteresting comments, such as yours.
I'd say this comment fails the "Be kind. Don't be snarky" test. Wouldn't you? If we're appealing to the rules to justify our actions it puts a bigger burden on ourselves to make sure we're perfectly in line with them too :)
For that we have downvotes, not flags.
True. But this comment broke guidelines as well. (https://news.ycombinator.com/newsguidelines.html)
Moderation here sucks and doesn't really work well imho.