I think if Richard Stallman had no qualms about proprietary software, he would have remained in the Lisp machine world, either working for Symbolics or Lisp Machines, Inc., or perhaps starting his own thing. Stallman was a Lisp hacker before starting GNU, and even when deciding on cloning Unix instead of creating a free Lisp-based OS, one of his first projects was GNU Emacs.
An interesting thought experiment is what Stallman would’ve done in that alternate timeline in the late 1980s and early 1990s when Lisp machines were killed off by advances in commodity hardware and compiler technology, the end of the Cold War (the US military was a large customer of Lisp machines), and the AI winter (Lisp used to be synonymous with AI).
So you're saying time travelers came back in time and caused the end of the Lisp machine architecture, solely to prevent an AI singularity years before we could possibly cope with it.
Assuming this post is real (it’s a screenshot, not a link), I wonder if Rob Pike has retired from Google?
I share these sentiments. I’m not opposed to large language models per se, but I’m growing increasingly resentful of the power that Big Tech companies have over computing and the broader economy, and how personal computing is being threatened by increased lockdowns and higher component prices. We’re beyond the days of “the computer for the rest of us,” “think different,” and “don’t be evil.” It’s now a naked grab for money and power.
Apologies for not having a proper archive. I'm not at a computer and I wasn't able to archive the page through my phone. Not sure if that's my issue or Mastodon's
It's a non-default choice by the user to require login to view. It's quite rare to find users who do that, but if I were Rob Pike I'd seriously consider doing it too.
A platform that allows text to be locked behind a login is, in my opinion, garbage. This is done for the same reason Threads blocks all access without a login, and mostly Twitter too. It's to force account creation, collection of user data, and support increased monetization. Any user helping to further that is naive at best.
I have no problem with gating interaction behind a login for obvious reasons, but blocking viewing is completely childish. Whether or not I agree with what they are saying here (and, to be clear, I fully agree with the post), it just seems like they only want an echo chamber to see their thoughts.
>This is done for the same reason Threads blocks all access without a login, and mostly Twitter too. It's to force account creation, collection of user data, and support increased monetization.
I worked at Bluesky when the decision to add this setting was made, and your assessment of why it was added is wrong.
The historical reason it was added is because early on the site had no public web interface at all. And by the time it was being added, there was a lot of concern from the users who misunderstood the nature of the app (despite warnings when signing up that all data is public) and who were worried that suddenly having a low-friction way to view their accounts would invite a wave of harassment. The team was very torn on this but decided to add the user-controlled ability to add this barrier, off by default.
Obviously, on a public network, this is still not a real gate (as I showed earlier, you can still see content through any alternative apps). This is why the setting is called "Discourage apps from showing my account to logged-out users" and it has a disclaimer:
>Bluesky is an open and public network. This setting only limits the visibility of your content on the Bluesky app and website, and other apps may not respect this setting. Your content may still be shown to logged-out users by other apps and websites.
Still, in practice, many users found this setting helpful to limit waves of harassment if a post of theirs escaped containment, and the setting was kept.
It's a non-default setting. So no. I am not sure what you disagree with exactly? We can call out BlueSky when they over-reach, but this is simply not it.
The setting is mostly cosmetic and only affects the Bluesky official app and web interface. People do find this setting helpful for curbing external waves of harassment (less motivated people just won't bother making an account), but the data is public and is available on the AT protocol: https://pdsls.dev/at://robpike.io/app.bsky.feed.post/3matwg6...
So nothing is stopping LLMs from training on that data per se.
That's assuming that AI companies are gathering data in a smart way. The entire MusicBrainz database can be downloaded for free, but AI scrapers still attempt to scrape it one HTML page at a time, which often leads to the service having errors and/or slowdowns.
Yea, that's true. I'm just saying that if someone wants to put in a modicum of effort, the AT ecosystem is highly scrapable by design. In fact, apps themselves (like Bluesky) are essentially scrapers.
The Bluesky app respects Rob's setting (which is off by default) to not show his posts to logged out users, but fundamentally the protocol is for public data, so you can access it.
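As a concrete illustration of that openness, any record can be fetched over plain HTTPS via the standard AT protocol `com.atproto.repo.getRecord` XRPC method on Bluesky's public AppView. A minimal sketch (the handle and record key below are placeholders, not Rob's actual post):

```shell
# Build a login-free XRPC URL for fetching a post record over the AT
# protocol. The repo handle and record key here are placeholders.
repo="example.bsky.social"
rkey="3abcdefghij"
echo "https://public.api.bsky.app/xrpc/com.atproto.repo.getRecord?repo=${repo}&collection=app.bsky.feed.post&rkey=${rkey}"
```

Fetching that URL with curl returns the raw post record as JSON, regardless of the author's logged-out visibility setting in the official app.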
the potential future of the AT protocol is the main idea i thought made it differentiate itself... also twitter locking users out if they don't have an account, and bluesky not doing so... but i guess that's no longer true?
I just don't understand that choice for either platform. Isn't the intent the biggest reach possible? Locking potential viewers out is such a direct contradiction of that.
edit: seems it's a user's choice to force login to view a post, which changes my mind significantly on whether it's a bad platform decision.
It's a setting on BlueSky that the user can enable for their own account, and for people of prominence who don't feel like dealing with drive-by trolls all day, I think it's very reasonable. One is a money grab, and the other is giving power to the user.
(You won't be able to read replies, or browse to the user's post feed, but you can at least see individual tweets. I still wrap links with s/x/fxtwitter/ though since it tends to be a better preview in e.g. discord.)
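The s/x/fxtwitter/ wrap mentioned above is just a host substitution; a minimal sed sketch (the user and status ID are made up for illustration):

```shell
# Rewrite the x.com host to fxtwitter.com so chat clients like Discord
# render a richer embed. The status URL below is a made-up example.
url="https://x.com/someuser/status/1234567890"
printf '%s\n' "$url" | sed 's#://x\.com/#://fxtwitter.com/#'
# prints https://fxtwitter.com/someuser/status/1234567890
```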
For bluesky, it seems to be a user choice thing, and a step between full-public and only-followers.
I'll (genuinely happily) change my opinion on this when it's possible to do twitter-like microblogging via ATproto without needing any infra from Bluesky the company. I hear there are independent implementations being built, so hopefully that will be soon.
The agent that generated the email didn't get another agent to proofread it? Failing to add a space between the full stop and the next letter is one of those things that triggers the proofreader chip in my skull.
I remember a time when users had a great deal more control over their computers. Big tech companies are the ones who used their power to take that control away. You, my friend, are the insincere one.
If you’re young enough not to remember a time before forced automatic updates that break things, locked devices unable to run software other than that blessed by megacorps, etc. it would do you well to seek out a history lesson.
For some context, this is a long-time Googler whose feats include major contributions to Go and co-creating UTF-8.
To call him the Oppenheimer of Gemini would be overly dramatic. But he definitely had access to the Manhattan project.
>What power do big tech companies have and why do you have a problem with
Do you want the gist of the last 20 years or so, or are you just being rhetorical? I'm sure there will be much literature over time that will dissect such a question down to its atoms. Whether it be a cautionary tale or a retrospective of how a part of society fell? Well, we still have time to write that story.
Rob Pike is not a 'Googler' by birth or fame or identity. He was at Bell Labs and was on the team that created Unix, led the team creating Plan 9, co-created UTF-8, and did a bunch more - all long before Google existed. He was a legend before he deigned to join them and lend them his credibility.
I know where they make money, but calling them an advertising company is just a jab. Ha ha, but that doesn't describe Google, like them or not.
I wonder where AT&T made profits and where, like any business, they broke even or had loss leaders. IIRC consumer telephone service was not profitable.
By this logic there is no corporation or entity that provides anything other than basic food, shelter, and medical care that could be criticized - they're all just providing something you don't need and don't have access to without them, right?
Just to note: these companies control infrastructure (cloud, app stores, platforms, hardware certification, etc.). That’s a form of structural power, independent of whether the services are useful. People can disagree about how concerning that is, but it’s not accurate to say there’s no power dynamic here.
Aftermarket control, for one. You buy an Android/iPhone or Mac/Windows device and get a "free" OS along with it. Then, your attention subsidizes the device through advertising, bundled services and cartel-style anti-competitive price fixing. OEMs have no motivation not to harm the market in this way, and users aren't entitled to a solution besides deluding themselves into thinking the grass really is greener on the other side.
What power did Microsoft wield against Netscape? They could alter the deal, and make Netscape pray it wasn't altered further.
Umm, are you being serious? Just look at the tech company titans in this photo of the Trump inauguration - they are literally a stand-in for Putin's oligarchs at this point.
As a fan of older Macs, I didn't know there were any 64-bit PowerPC chips made after the Power Mac G5 and the Cell processors used in the Xbox 360 and PlayStation 3.
This is cool; it would be fun to play with a modern, hobbyist 64-bit PowerPC board. I will be keeping an eye on this project!
This board is based on an NXP QorIQ SoC, which is designed for networking hardware and not really intended for general-purpose computers. It is, to my knowledge, and has been for years, the only game in town if you need to be compatible with the PowerPC ISA (IBM POWER processors, while part of the same lineage, cannot run PowerPC code).
3. Aldus PageMaker, which predates Photoshop 1.0. It jumpstarted the desktop publishing revolution, and it also arguably saved the Apple Macintosh, which suffered from slow sales after its introduction, leading to Steve Jobs’ exit from the company.
4. Apple HyperCard, which further made the Mac stand out from its competitors.
5. Netscape Navigator. Yes, Mosaic predates Netscape Navigator, but it was Netscape Navigator that was many non-technical users’ first web browser in the mid-1990s.
6. The Apple iLife suite (e.g., iPhoto, iMovie, GarageBand), which set the Mac apart in the 2000s by offering very easy to use applications for handling digital media.
7. The entire NeXT ecosystem of applications and developer tools from the late 1980s and early 1990s. There are so many, but highlights include Interface Builder and Lotus Improv. NeXT didn’t do very well in the marketplace, but it maintained a niche audience well into the late 1990s due to its software ecosystem and its developer tools.
8. Microsoft Visual Basic was revolutionary for making it easy to write quick-and-dirty GUI applications for Windows.
I miss the days when personal computers were simply tools, akin to pencils and handheld calculators. I remember the days of Macintosh System 7 and Windows 95. No upselling services. No automatic updates. No nagging. You turned your computer on, executed programs, and that was it.
On the Windows side, things started going downhill starting with the Windows XP era, and on the Mac the annoyances began sometime in the mid-2010s.
It seems Microsoft, Apple, and other companies realized that they’re leaving money on the table by not exploiting their platforms. Thus, they’re no longer selling simple tools, but rather they are selling us services.
Yes, there are good Linux distributions that don’t annoy me, and the BSDs never nag me, but the problem with switching to these platforms is that I still need Microsoft Office and other proprietary software tools that are not available outside “Big Tech.” There are other matters that make switching away from Windows and macOS challenging, such as hardware support and laptop battery life.
Easy answer to your last point: work machine and non-work machine. If I'm working for a company and the company needs MS Office, they will give me a machine with MS Office. I will treat that machine like a radioactive zone. Full hazmat suit. Not a shred of personal interaction with that machine. It exists only to do work on and that's that. The company can take care of keeping it up to date, and the company's IT department can do the bending over the table on my behalf as MS approaches with dildos marked "Copilot" or "Recall" or "Cortana" or "React Native Start Menu" or "OneDrive" or whatever.
Meanwhile, my personal machine continues to be Linux.
This is what I'm doing at my work now. I'm lucky enough to have two computers, a desktop PC that runs Linux, and a laptop with Windows 11. I do not use that laptop unless I have to deal with xlsx, pptx or docx files. Life is so much better.
A variation I've done occasionally is to run the Microsoft Windows software in a VM on my Linux laptop.
When I last had the MS office suite inflicted upon me, a couple years ago, I was able to run it in a Web browser on Linux.
It's important to remember, though, that these measures probably won't work long-term.
Historically, MS will tend to shamelessly do whatever underhanded things they can get away with at that point in time. The only exception being when they are playing a long con, in which case they will pretend to play nice, until some threshold of lock-in (or re-lock-in) is achieved, and only then mask-off, with no sense of shame. (It's usually not originating bottom-up from the ICs, and I know some nice people from there, but upper corporate is totally like that, demonstrating it again and again, for decades.)
Also, a company requiring to run Microsoft software is probably also a bad place to work in other regards.
> Also, a company requiring to run Microsoft software is probably also a bad place to work in other regards.
My current employer is so great that I have casually mentioned that I might stay until I retire a bunch of times since joining. I've never said that about any other job. We have Word because there are industry requirements that it meets in terms of formatting legal documents. Can other apps supplant it? Possibly, but no one is spending the time and money to find out and it's not my decision to make.
I understand the motivation of the statement, but it's a fallacy.
You just described an exceptionally good place to work (because how many places would an employee casually mention, a bunch of times, that they might stay until retirement?).
Congrats on finding that situation, but I don't think it's evidence that my statement is a fallacy.
> Historically, MS will tend to shamelessly do whatever underhanded things they can get away with at that point in time. The only exception being when they are playing a long con, in which case they will pretend to play nice, until some threshold of lock-in (or re-lock-in) is achieved, and only then mask-off, with no sense of shame.
The Windows 10 bait-and-switch to Windows 11.
Hundreds of millions of PC users worldwide on old hardware and old Windows OSes were offered Win10 as a free upgrade, with the promise that Win10 would be the final Windows edition.
Later, though, M$ announced Win11, which would work only on new hardware (the TPM 2.0 requirement), and Win10 is no longer supported for personal use (except via some complicated ways to get an extension for Win10 updates). And not only is Win11 buggy and full of ads, its performance is also bad.
Well, the good thing is that such shenanigans are pushing PC users to migrate to Linux.
Valve saw the writing on the wall when Windows 8 was released. Their investment made Linux more feasible for the average user.
This makes me wonder how much better the world would be if corporations didn't have to answer to shareholders. Valve isn't publicly traded, Microsoft is.
> Also, a company requiring to run Microsoft software is probably also a bad place to work in other regards.
Microsoft being shitty notwithstanding…I think you don’t really grasp just how prevalent Microsoft is in the business world - it is not the indicator you think it is.
Too true... even then, there are some MS things I actually like... VS Code and C# at the top of the list. I also like a lot of things in MS Office over the alternatives in practice. LibreOffice is just annoying to me every time I use it, and I use it regularly; OnlyOffice has been less reliable still. And I still don't consider any of the Visio alternatives close to equal, despite regularly using them as well.
That said, I emphatically despise a lot of the decision making behind Windows and a lot of MS products... I really wish it was managed/governed more by technical influences than business/financial ones in practice. You can see where a lot of the lines are drawn, and it's a bit fascinating.
I have a new laptop arriving shortly with enough RAM and storage that - me being a historically "Windows as primary OS" kind of person, and with the enshittification of adding Copilot to everything and turning Windows 11 into an "agentic" OS - my installation will be Linux-first, with Windows run via LKVM (hopefully with proper pass-through for TPM + GPU).
Yes - I have "noodled" with Linux in VM's and Raspberry Pi's - but it has never been my primary OS.
> Also, a company requiring to run Microsoft software is probably also a bad place to work in other regards.
This seems like an over generalization, though I agree with your other points. Microsoft is not a good company, but are any of the big tech behemoths?
I could buy the argument that requiring Windows for devs might be a red flag, unless said company is making Windows software or games, but there are plenty of valid reasons to standardize on Windows & Microsoft 365 across the office, especially in very large companies. Even if a company issues Macs, they are still probably on M365 unless they are in Silicon Valley or a startup using Google Workspace.
Consumers aren't Microsoft's customer, and to be honest, I get the vibe that Microsoft would prefer to stop selling and catering to consumers/personal users entirely for Windows. Windows in an enterprise, properly reined in by a competent IT department, isn't too bad. Windows gives IT and the business a lot of tools that you would otherwise have to build yourself, which, for a non-tech company or a company where software isn't the revenue-generating product, has a lot of appeal.
The distaste everyone feels for it is because Windows isn't built for the end user anymore; it's built for the person signing the checks at the company, who usually has different needs. Doesn't mean it's a bad product (although, it's not great), just that you, the user, aren't who it's designed for.
Despite Microsoft's behavior and all of Windows' flaws, when properly managed and controlled in an enterprise, it's not so bad, and there's still a ton of software out there that is Windows only.
Where I work now is pretty much like that. Windows on end-user endpoints, Linux everywhere else.
I like this in theory but as someone who travels often with my work laptop, it's nice to be able to use the same hardware for personal use as carrying a second computer is impractical regarding carry weight and packing.
Apple used to allow installing a second copy of MacOS without it being subject to the work profile - completely isolated from the work partition (because you could ignore the "set up work profile" prompts after installation).
I would simply restart my MacBook into the personal install after work & on weekends.
Apple have recently updated the MacOS installer to be always online, so I can no longer install a separate MacOS partition without a work profile.
I ended up buying an ROG Ally but it's honestly not that portable. The power brick is almost the same size as the handheld and it occupies about as much space as a laptop in my carry on.
When I travel for work, I take my work laptop and an iPad in a keyboard case. It’s under 2lbs (0.9kg), it can charge off the same brick as my phone or even pass through charge off my work laptop itself, and keeps me connected to my personal digital life without having to put anything personal on the work machine. It also never raises an eye with security if you have a laptop + iPad.
Usually, the iPad apps are "good enough" (in some ways, they are actually better for travel, as they are designed with features like offline downloads), but if they are not, a "real" computer is only a Tailscale connection to my home network away, over VNC.
Edit: specifically, the iPad + laptop combo never raises an eye at customs. Inside the USA, I've taken as many as 3 laptops on a work trip before, and I cannot express how much the TSA does not care. On the other hand, when you go through customs in another country, they can be a bit ornery (i.e. suspect you of trying to avoid import tax), so I never want to take more than one laptop through a customs barrier.
p.s. if you want to game in your downtime, such trips are an awesome time to break out the emulator and retro games; an iPad has more than enough power for this, and SNES / d-pad type games work great with a keyboard case as a controller (or you can just bring a real controller).
All of these gaming laptops really do suck. I feel like these days you're better off having a small form factor PC or just remoting into your machine from far away.
I never understood the point/market for gaming laptops. They seem popular enough for OEMs to still keep them around, but in almost every way you are much better off with a desktop if you need that much compute.
They can't be used on battery; the discrete GPU will chew through your battery in minutes. They are heavy, loud, hot.
Tried one for a while a long time ago, hated it. I never wanted to bring it anywhere, it was so heavy and bulky, so I figured what's the point in having a laptop if I never want to take it with me.
Got a powerful desktop for gaming now, and my portable device is either my iPad or a MacBook Air, and I can just remote into my desktop anytime I need.
When I was younger these types of machines were great for me. I usually used them at home, sometimes in my bedroom (aka office) and sometimes in the living room (group games, playing music, just watching TV with the roommates). I would also occasionally take them to school or to other people's houses (projects, LAN parties).
So it was used primarily like a desktop, and as my only system having power was useful. But the fact that I could put it in my backpack and transport it was super valuable.
Now I do have a more portable laptop and a full desktop setup. But at the time that wasn't the best option.
I actually ended up buying a travel router and 60% of my gaming was done by remoting into my ROG Ally from my work laptop (they didn't block Steam). The remaining 40% of gaming was done plugged into a TV + controller.
For normal browsing I would use RDP - though it would be amazing if Apple supported some kind of DisplayPort input on the MacBook so it could be used as a screen for an external device.
I've been considering selling my Ally and buying a mini PC with a half decent APU as I seldom use it as a handheld.
I'm still using my M1 Air for personal use... though I opted for 16GB and 1TB storage. I'll WireGuard+SSH to my desktop as needed... remote editing in VS Code is nice AF.
Two laptops is easier than you’d think if you have the right bag.
My work laptop is so locked down I cannot do anything personal on it, so when I go into the office I always carry two laptops, and the personal one is an old, thick, heavy dinosaur; it's got to be at least five pounds. However, with a good bag that has a (non-padded) belt and sternum strap, it is not difficult. The belt carries most of the load and my shoulders don't hurt; they hardly feel anything.
I deliberately park in the farthest spot at the other side of campus (about a half mile, and up four flights in the garage) to get in exercise steps with the heavy pack.
It's good exercise, but I absolutely need a belt and sternum strap to do it. Wouldn't dream of trying that with only shoulder straps.
Heh - going on 20+ years, my "running joke" is that if the only exercise I truly get is lugging my laptop(s) around (sometimes as many as 3, depending on client load) plus "kit" (Kobo eReader, cables, power bricks - although if it's an ongoing thing, I leave those onsite or rely on docks - a powerbank, occasionally an active gimbal, occasionally an HT radio and its gear), then at least one of them might as well be extremely heavy...
Haven't seen many "laptop-focused" backpacks that have both belts and sternum straps, would love any recommendations.
Tell that to airport check-in staff, haha. A laptop and charger are around 3kg, and there's only so much clothing I can take out of my suitcase and wear to make it past check-in.
But I hear you. It's annoying that I can't reuse perfectly good hardware, but it's fine - we make do.
The added scrutiny at some border crossings can be problematic too. Explaining to the inspectors at the Turkey/Bulgaria border why I had two phones and two laptops (and disabusing them of the suspicion that I was smuggling electronics to friends/family) through language barriers was a pain.
I do tell that to airport check-in staff :-) I just take both laptops out. I only do carry-ons and no checked bags and am able to stuff everything needed into one mid-sized tac pack.
> I deliberately park in the farthest spot at the other side of campus (about a half mile, and up four flights in the garage) to get in exercise steps with the heavy pack.
As a side note, this is an excellent habit. Sadly, I've noticed people only discover that avoiding effort is not always the best strategy once their muscle mass decreases. Adding elements of strength exercise to your daily routine can be more effective than going to the gym, for various reasons.
If you aren't into gaming at all, you might consider a smaller MacBook Air for personal use... mine is mostly relegated to occasional use unless I'm traveling, where it's mostly email/web use. Small, light, fits my needs, and can charge via the same USB adapter I carry for my phone anyway. I have a rather heavy laptop bag, so the difference between one laptop and two plus the portable display isn't that big.
I can do work on the computer running BSD/Linux, save it in a text-only format, transfer it to the work computer then import into Excel, PowerPoint or Word
It's been over 20 years since I had a home computer running Windows (and well over 30 since I've used a mouse)
I think the GP comment is evidence that Microsoft can get away with what it is doing. Even people who can use Linux or BSD will not stop using Windows at home no matter how obnoxious it becomes
There is a substantial difference between complaining and actually taking action and the company seems to recognise that
Same. Work provides the idiot box. I give it its own segmented network too, cause work spyware and all... then run a personal workstation with linux next door to it.
The problem with Linux is that there is no legitimate place to direct your rage at. It is free, nobody owes you anything and every installation is different. When Windows is awful, virtually everyone is being sympathetic. When Linux is awful, there is a genre of people that made using Linux an integral part of their identity, that will explain to you how your frustrations are really your own personal failures.
I'm slowly moving away from the Apple ecosystem, and this is what I rather like about Linux. I find it obviates the anger — there's no specific entity making decisions that make my user experience worse. If something's annoying me, it's quite likely to be my own fault.
You could argue that, with Windows there is a legitimate place to direct your rage at, but the action of directing your rage does not actually have any effect on improving your experience. With Win and Mac, no one cares, because they already have their customers locked in and tight, they will accept any experience degradation. With Linux, you are not a customer so no customer complaints, but still arguably much better support.
If you come at it like a sinner asking for penance, the enlightened may come to guide, but that's not what I'm talking about. If you come to rage, these same people will become inquisitors. Rage isn't all about solving a problem, it's about catharsis. It's not so much about technical support, it's about emotional support. A bad design decision (like the GNOME desktop redesign) is not a technical problem. It's not a bug, it's a feature.
Agreed. And also, if there's something you don't like or a project going in a direction you don't agree with, there is virtually guaranteed to be other people out there that feel the same that are building something different
> When Linux is awful, there is a genre of people that made using Linux an integral part of their identity, that will explain to you how your frustrations are really your own personal failures.
There are also people who often claim that their installation of Linux crashes after every single update, that their favourite decade-old commodity hardware still doesn't work out of the box on Linux, etc., etc.
The truth is somewhere in between, and it's a lot closer to the positive experience these days compared to the old days.
> When Linux is awful, there is a genre of people that made using Linux an integral part of their identity, that will explain to you how your frustrations are really your own personal failures.
On the one hand, yes, this is not a nice thing to have happen. The frustrations shouldn't happen to begin with, and then people shouldn't be using the reverse Uno card on you just for that.
On the other hand, Linux has a lot fewer of these frustrations (in my experience), and a lot of frustrations are being fixed with time, since you're likely not the only one who is frustrated by it.
On the third hand, the situation being shit for obvious human reasons (not enough dev time, disagreements about the way forward), as is the case with Linux development, is a much, much nicer thing to have your problems caused by than the source of Windows being shit: someone wasn't happy with their dashboard this morning and decided to make that your problem today.
You can always buy someone to direct your rage at, though, if you are a business wanting to deploy Linux. Red Hat, SUSE, and Canonical will all happily sell you support contracts and guarantees.
Idk. My main frustration with Linux has nothing to do with the OS itself. Linux is pretty good actually. My main frustration has to do with software that doesn't run on Linux that I have to use occasionally. So things that force me not to use Linux. But that has gotten much better over the years.
And meanwhile my Windows and MacOS experience has gotten much worse. So I feel pretty good with using Linux as my daily driver for the past 6 years.
I raged a lot when my Arch machine would break after an update and I'd have to do config file surgery on a machine that no longer wanted to boot into a graphical desktop. I've never had that sort of thing happen on Mac or Windows.
Well, that's definitely on you. Arch do warn people to actually read the changelogs if you're going to update/upgrade everything. Whenever I've hit a problem with an Arch machine (I think it's only twice), it was written quite clearly in the update notes along with the fix.
It's actually surprising just how stable Arch Linux can be considering that it's typically using the newest code for everything. If you really want Arch and stability, maybe using something like SteamOS would be better - Arch, but designed to be stable.
> Well, that's definitely on you. Arch do warn people to actually read the changelogs if you're going to update/upgrade everything.
"There’s no point in acting surprised about it. All the planning charts and demolition orders have been on display at your local planning department in Alpha Centauri for 50 of your Earth years, so you’ve had plenty of time to lodge any formal complaint and it’s far too late to start making a fuss about it now."
It's been a while since I used Arch (apart from my Steam Deck, but that's a bit different as it's curated and has a read-only root filesystem by default), so I've had a look around and I think I meant reading the "Latest News" at https://archlinux.org/
e.g.
> NVIDIA 590 driver drops Pascal and lower support; main packages switch to Open Kernel Modules
> 2025-12-20
> With the update to driver version 590, the NVIDIA driver no longer supports Pascal (GTX 10xx) GPUs or older. We will replace the nvidia package with nvidia-open, nvidia-dkms with nvidia-open-dkms, and nvidia-lts with nvidia-lts-open.
> Impact: Updating the NVIDIA packages on systems with Pascal, Maxwell, or older cards will fail to load the driver, which may result in a broken graphical environment.
> Intervention required for Pascal/older users: Users with GTX 10xx series and older cards must switch to the legacy proprietary branch to maintain support:
> Uninstall the official nvidia, nvidia-lts, or nvidia-dkms packages.
> Install nvidia-580xx-dkms from the AUR
> Users with Turing (20xx and GTX 1650 series) and newer GPUs will automatically transition to the open kernel modules on upgrade and require no manual intervention.
Personally, I used to just run an upgrade and then go look for known problems if pacman threw an error. Of course, the recommendation is to have a good backup before running the upgrade and just roll it back if it has issues (then read the notes).
> Before upgrading, users are expected to visit the Arch Linux home page to check the latest news, or alternatively subscribe to the RSS feed or the arch-announce mailing list. When updates require out-of-the-ordinary user intervention (more than what can be handled simply by following the instructions given by pacman), an appropriate news post will be made.
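For anyone who wants to automate that news check before a `-Syu`, here's a rough sketch (not an official Arch tool) that parses the news RSS feed with the Python standard library. The feed URL is my assumption of the one the quote refers to, and the inline sample stands in for an actual fetch:

```python
import xml.etree.ElementTree as ET

# Inline sample standing in for a fetch of https://archlinux.org/feeds/news/
# (a real script would download the feed with urllib.request first).
SAMPLE_RSS = """<?xml version="1.0" encoding="utf-8"?>
<rss version="2.0"><channel>
  <item><title>NVIDIA 590 driver drops Pascal and lower support</title>
        <pubDate>Sat, 20 Dec 2025 00:00:00 +0000</pubDate></item>
  <item><title>Older announcement</title>
        <pubDate>Mon, 01 Dec 2025 00:00:00 +0000</pubDate></item>
</channel></rss>"""

def latest_news_titles(rss_text, limit=3):
    """Return up to `limit` news titles from an Arch-news-style RSS document."""
    root = ET.fromstring(rss_text)
    return [item.findtext("title") for item in root.iter("item")][:limit]

if __name__ == "__main__":
    for title in latest_news_titles(SAMPLE_RSS):
        print(title)
```

Skimming the titles before upgrading is exactly the manual step the wiki asks for; this just puts it in the same terminal session as the upgrade.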
Yeah, I stopped using it myself as I didn't really need a bleeding-edge system. It's actually surprising just how reliable Arch is - I think if you want to run it on a production system, you don't bother doing a system upgrade without testing it first.
I do like the Arch wiki though - probably the best source of information on Linux tools etc.
sudo pacman -Syu -> Secure Boot config broken, OS won't boot (Manjaro this summer, with some Intel firmware update).
No HDMI sound on nvidia for some distros until recently.
Getting the Wifi to work ootb on Mint is not always easy.
That's your problem right there. EndeavourOS is also a beginner-friendly Arch derivative but less breaky.
> Wifi to work ootb
I definitely feel you on that one, it's just the luck of the draw sometimes... If you haven't considered it, in some laptops the wifi module is a replaceable mPCIe or M.2 module, and if that's the case, more compatible replacements shouldn't be hard to find for cheap or salvaged from broken laptops.
I'm using Debian, and when working for a client that requires Windows, I work in VirtualBox with Windows Server 2022 as my desktop OS.
It works really well (running mainly Visual Studio) and licenses are pretty cheap. But the best part is that there are no ads and none of the other Windows 11 Copilot nonsense.
> Windows 95. No upselling services. No automatic updates
Even Windows 95 came bundled with MSN on the desktop which had a paid monthly fee to access. And its lack of automatic updates was a real problem, as you had to manually find the service packs and security patches. The automatic updates in Windows XP were vastly more convenient.
Automatic updates are needed for security. The only era when you didn't need them was pre-Internet. They're not something we want to get rid of.
> Automatic updates are needed for security. The only era when you didn't need them was pre-Internet. They're not something we want to get rid of.
That was true right up until companies started routinely pushing updates that broke things, removed useful features, added user hostile features, or even outright ads. If I have to give up automatic security updates to not have my software get worse on me over time, I will gladly do so. I would rather have security updates and not have the user-hostile stuff, but we seem to be unable to get that, so the next best thing would be no automatic updates at all.
Installing Internet Explorer 4 on Windows 95 opened up the first version of Windows Update, when it started as a web app with some custom ActiveX plugins. Windows 98 was the first time Windows Update had a bundled link in the OS, and shortly after Windows 98 introduced a "Critical Update" notification that would prompt users to open Windows Update.
Automatic updates arrived in Windows ME.
It's interesting the timeframes on Windows are often earlier than you think they are. Admittedly, a lot of users skipped Windows ME and its strange reputation, so Windows XP may have been their first time seeing automatic updates.
I know you won't believe me, and my precious karma score may suffer by stating reality: you don't NEED security updates. A properly hardened server with no patches will outlive cobbled together trash library patch over garbage code pasted from ai vibing script kiddies. Would you shake your head in disbelief if I told you 'security patches' are the fix delivered by a dealer to quell your shivers?
Give me functionality updates, cumulative service packs, and the just-after-BBS days, when an exploit discovered in your software meant it was used by no one, anywhere, because we no longer trusted your coding or your 'fix'.
Nobody's talking about "properly hardened servers" here. We're talking about the OS used on desktops and laptops by everyday consumers, connecting to the Internet across a wide variety of Wi-Fi access points.
Do you not see the constant stream of zero-day exploits coming out for consumer operating systems? Do you think those don't need to be fixed?
I'm genuinely curious -- I've never come across anyone with your perspective before, so I'm struggling to understand where it's coming from.
Usually I post and forget, but your reasonable reply prompted some effort on my part.
I live life so that at any moment, if the modern services of society (food, internet, power, shelter, entertainment, transport, personal defense) ended and I was forced to use what I had access to, my quality of life would persist. Besides physical considerations (hydroponics, solar, guns, hardened vehicles), I maintain non-volatile backups of the same software I use daily - vanilla (unpatched) OSes from XP to 11, current and older browsers, non-SSL-based content and servers, games, music, movies, hoards of older hardware in a cage that may withstand an EMP... never tested it. Anything computer related I have works from a bare-metal install with no internet connection, period.
I use the same retail desktop, laptop, wifi, cellular, and WAN hardware used by most consumers, but only if I can reset and initialize it offline, and can use the built-in firewall to restrict outgoing connections to a single-executable, single-port whitelist, including on my phone. Which means no nags, no updates, no new features, no removed features, no app stores, no federated OS logins, no new terms of service, and no telemetry, unless I choose to connect that program to the internet and the program is flexible enough to use a single port.
Zero-day exploits won't work on my Android 11 S9 with no Play Services, a deny-all firewall, and a non-standard Chrome build. In-app browser updates don't work until I manually install the binary; most AI features are broken by default, even on my Win11 laptop.
It's not an easy life. But if you insist that software and hardware do what you wish, your actions should back that. My actions probably more than most. I pass on a decent amount of IT gigs because they require app tracking or that I use their monitoring software, or vpn... but everything I have I KNOW I control now and until it stops working and I buy two more identical and grossly obsolete replacements.
Thank you for explaining! I didn't see this until now. I'm incredibly impressed, that's wild. It does sound like a lot of work. I see why someone like you doesn't need the updates, so thank you for that enlightenment. I think the average consumer still does, of course, but you make a great point that there are ways to protect yourself if you have the expertise and are willing to put in the work.
The internet was a big part of it. Most home users did not have internet access in the System 7 days. When it came out in 1991 no country had more than 1% of its population with internet access. By the time Windows 95 came out around 10% of US users had internet access.
It wasn't until 2001 that the US reached 50% of users having internet access.
Without internet there wasn't really a good way to distribute updates to most users.
As a developer in that era working at a company that made software for PCs and Macs it was great. It meant that the way most users would get our software was buying it on floppy disk (or later CD) from a retail software store like CompUSA or Egghead.
We'd only make more money from someone who bought our software if that software made a good enough impression that they bought more of our software. We'd lose money if any software went out with enough bugs or a confusing enough interface or a poorly enough written manual that a lot of people made a lot of calls to our toll free tech support.
This was great because it largely aligned what developers wanted to do (write a feature complete program with a great UI and no bugs) and what management wanted (happy users who do not call tech support).
With internet giving us the ability to push updates at almost zero cost and as often as we want people who release incomplete programs early and add the missing parts in updates are going to outcompete people who don't release until the program is complete and nearly bug free.
Once you get there it is not much of a leap to decide that what you are really selling is not software to do X but rather the service of providing software to do X. Customers subscribe to that service and you continuously improve its ability to do X.
> It meant that the way most users would get our software was buying it on floppy disk (or later CD) from a retail software store like CompUSA or Egghead.
On the topic of Windows, this is why Microsoft's commitment to backwards compatibility was and is such a huge deal.
It wasn't so easy to just update your software if Windows ever made breaking changes, and your users would, rightly, be pretty ticked off if suddenly what they bought no longer works because they upgraded from Win 95 to 98, or 98 to XP.
You had confidence that you could buy a program once, and it'll just happily continue to run for the foreseeable future.
This also made businesses happy. If you liked a particular version of a software product, you bought it, ran it on Windows, and could rest easy knowing it'll just continue to work through version upgrades of the OS.
I stopped using Windows over 15 years ago and moved to Ubuntu, which was already running all the servers. Unfortunately Ubuntu decided to do the same garbage, trying to shove their Pro crap down my throat, made it impossible to remove (by making it a desktop requirement), and resorted to the game of trying to re-enable it during updates.
I finally moved everything to just Debian itself that never nags me and just works with everything I need, including games (thanks to steam)
The only time I boot a Win10 VM is to compile apps for Windows; otherwise it has zero use or need anymore.
I supported Ubuntu when they started but gave up on them after they sent people's local file searches to third parties so they could push amazon ads. They're totally corrupted as far as I can tell.
I too remember the days when every unpatched Windows PC was a member of a botnet. Perhaps less fondly than you.
And thankfully this was before a time when everyone’s computers and phones had access to their bank accounts, credit cards, and before email was the gateway to virtually your entire life.
Most of your account's comments in the 13 days since it was registered have been flamebait, fulminating or trolllish, and are being flagged by other community members. Please stop this style of commenting or we'll have to ban the account. HN is only a place where people want to participate because others make the effort to raise the standards. For accounts that are dragging the standards down, sooner or later we have to do what most of the community expects of us, which is to uphold the guidelines and ban accounts that continue to post in this style.
Please spare us the defiant strutting. This is a discussion about operating system upgrades. Whether I "like" what you're posting or not hasn't even crossed my mind. The guidelines apply regardless of your position. This place has happily existed for nearly two decades as a place where anything can be disagreed about, precisely because we have guidelines that keep people focused on curious conversation rather than flamewar and personal attack. You're welcome to go elsewhere if our ways are not to your taste.
I remember installing plain Windows XP at a time when Service Pack 3 had already been released. Since I had only recently gotten cable internet, it didn’t cross my mind to disconnect the network cable, and my PC got owned almost immediately. IIRC, some dialog just popped up as an artifact of a successful penetration, right after the network connection was established - before I even managed to insert the SP3 CD. So it was pretty bad for a while.
> According to the researchers, an unpatched Windows PC connected to the Internet will last for only about 20 minutes before it's compromised by malware, on average. That figure is down from around 40 minutes, the group's estimate in 2003.
This was from two decades ago, and cursory searching suggests the average lifetime of an unpatched system is even lower now.
The real problem was pre-Windows XP. Anyway, just because you failed your assignment doesn’t mean it wasn’t a real problem. You should probably trust actual IT administrators over your experience as a college student.
I miss when I felt that personal computers were a new wave of democratized capital, a kind of affordable factory for individual owners to use pursuing their own autonomy and power... and not just for programmers.
I underestimated the economic forces trying to turn them into devices for enforcing the interests of a large company onto the owner and turning the owner into a renter.
Windows XP sold for $200 in 2001. In 2025, that's $364[1]. If we can find enough people willing to pay $364 for an OS that values privacy and doesn't push needless upgrades, that'll be a start. But XP itself was probably priced based on the belief that people would be upgrading in a few years to Windows Vista. So we might need more than that.
[1] - According to minneapolisfed.org, which uses the official economist-approved inflation rates. Not that I'm implying that there's anything wrong with that. I have all of the orthodox beliefs about inflation that a good citizen should have.
Windows 11 Pro is still $200 [1]. Of course most people don't pay anything directly as it's bundled with their PC and they won't think to question why that is.
> Windows XP sold for $200 in 2001. In 2025, that's $364
I assume you used the overall CPI rate rather than the software rate, but using the software CPI it's more like $58, and that seems like an easier sell (for the user, maybe not the developer).
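The arithmetic either way is just scaling a price by a ratio of index values. A quick sketch, with made-up index numbers chosen only to reproduce the two figures in this thread (the real BLS series would be looked up per year):

```python
def adjust_price(price, index_then, index_now):
    """Scale a historical price by the ratio of two price-index values."""
    return price * index_now / index_then

# Overall CPI: the thread's $200 (2001) -> ~$364 (2025) implies a
# cumulative factor of about 1.82. Index values here are illustrative,
# not official BLS figures.
overall = adjust_price(200, 100.0, 182.0)   # ~364

# A falling category index (like software) deflates instead of inflating:
software = adjust_price(200, 100.0, 29.0)   # ~58
```

Same formula both times; the only question is which index you believe applies to an operating system.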
Hardware support isn't all that bad anymore. Certainly better than it was when I started using Linux.
It isn't perfect. You'll probably have a better experience with AMD than Nvidia GPUs, most fingerprint readers probably won't work, and newly released hardware might not have drivers for a few months, but most stuff just works.
> I remember the days of Macintosh System 7 and Windows 95. No upselling services. No automatic updates. No nagging. You turned your computer on, executed programs, and that was it.
I 'member the days of Win 98, Win ME and Win XP... made good money cleaning up malware - browser toolbars, dialers, god knows what - from computers. Some came from the hellholes that were Java, ActiveX or Flash, some came from browser drive-by exploits served from advertising networks, but others just came from computers that were attached directly to the Internet from their modems.
And I also 'member Windows being prone to crashes, particularly graphics drivers, until Windows 7 revamped the entire driver model.
Oh, and (unrelated) I also 'member websites you could use to root a fair amount of Android and Apple phones.
All of that is gone now, it has gotten so, so much better thanks to a variety of protection mechanisms.
Security and upselling are orthogonal; I can make a secure operating system that doesn’t notify the user of OneDrive, iCloud, and other services.
Things get more nuanced when we talk about other types of notifications and about whether updates should be automatic or always require a user’s explicit consent. I personally believe that a key tenet of personal computing is that the owner of the computer, not the hardware or software vendor, should have full control over the hardware and software on the computer. This control is undermined when systems are designed in ways to give users less control. There may be legitimate security benefits to mandatory automatic updates, for example, but there are risks, such as buggy updates leading to broken installations or even lost data, and there’s also having to deal with unwanted UI/UX changes.
As a power user, developer, and researcher, I want control over my computing environment. Unfortunately Windows and macOS have been trending toward more paternalism, more nagging, and more upselling. Thankfully Linux exists, but at the cost of needing to switch away from convenient proprietary software tools like Microsoft Office. I can do without Word or Excel, but PowerPoint is what keeps me on Office (I’ve tried LibreOffice and the Beamer LaTeX template). I’m also concerned about hardware getting increasingly locked down, which will hurt Linux.
I had the same reading, it sounded like Windows is worse now than Windows 95, which would be a hot take indeed. But it seems the intent was purely on these nagging aspects which have definitely gotten worse.
It might be easier to swallow the message focusing on Windows 8+ when it really jumped the shark. Windows 7 was a pretty good OS holistically I think even if there are aspects lost compared to the pure simplicity of those really old ones.
I know it goes against the grain here, but so what. It's the user's prerogative to do with their device what they wish. Nag for security updates, sure. But automatic updates of anything are user hostile and should be abolished. Especially when those automatic updates remove features or introduce a shit ton of new bugs.
Problem is the history of people failing to patch causing widespread Internet outages, such as via SQL Slammer; a SQL Server patch had been available for six months to protect against the vulnerability. Microsoft learned the lesson that users, even the “professional” ones who should know better, fail to patch, which brings us to the current automated patch situation.
> It's the user's prerogative to do with their device what they wish.
The problem is, users are still part of the Internet. And historically, users haven't taken care about update nags, that's how we ended up with giant ass botnets.
The size of the botnets and the raw bandwidth they have access to now is staggering. (DDoS, "residential proxies", "anti-censorship VPNs", etc. All just compromised residential devices.)
Can you provide some details on the reasons for needing MS Office? I'm genuinely curious. What does LibreOffice do differently that makes it a problem for you to use? Personally my only complaint is the performance of LO, which could be better.
I'm not GP but I do know it's rare to open an existing .docx in LibreOffice and have it look right; who knows what it looks like in Word after I've edited and saved it. It's fine creating new documents, and Excel/Calc is better than Word (inherent in being more structured I suppose), but it's not a drop-in replacement. I've used web Office365 when necessary though, not Windows.
> I do know it's rare to open an existing .docx in LibreOffice and have it look right; who knows what it looks like in Word after I've edited and saved it.
This is not true except as hyperbole. Most .docx files open and edit quite well in LibreOffice Writer, and they look right.
However, you still have a point. There are always some cases when the compatibility is not good, and the only way to use said docx files would be in MS Word.
I submit most people had better luck than you and stayed silent. The fringe cases are the most vocal ones in their complaints. Of course, this is not implying that your experience is not true or less valid.
Excel I'd argue is the primary reason for most in the business world.
What LibreOffice (and Sheets, to a lesser extent) misses is that Excel isn't just a spreadsheet app. It's a general-purpose programming environment for non-devs (although, at a certain point, you could argue they are effectively programming even if they don't see it that way).
Yeah, there are better solutions. At a certain level of complexity, you probably shouldn't be using Excel and should switch to Python+some SQL database, but there's something to be said about the visual environment Excel provides.
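As a toy illustration of that graduation path (table name and data invented), the classic SUMIF-per-region spreadsheet pattern becomes a GROUP BY once the sheet moves into SQLite:

```python
import sqlite3

# A "sheet" of sales rows, loaded into an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("east", 100.0), ("west", 250.0), ("east", 50.0)],
)

def total_by_region(conn):
    """Equivalent of =SUMIF(A:A, region, B:B) for every region at once."""
    rows = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    )
    return dict(rows)

print(total_by_region(conn))  # {'east': 150.0, 'west': 250.0}
```

The trade-off is exactly the one mentioned above: the query is more robust than a column of formulas, but you lose the live visual grid that makes Excel approachable in the first place.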
> It's a general-purpose programming environment for non-devs
Google Sheets' programmability is way better than Excel's (as of the last time I used Excel), with direct support for Python, which Gemini can write just fine. It's a bit fiddly in places, I'll admit, but Google Sheets is definitely a programming environment.
Do not connect it to the internet. Problem solved.
Basically anything in a social network needs to learn to defend itself against threats. Make computer a hermit, and it can go without updates for a long time.
(Oh, but you don't like that? Well, Microsoft doesn't like getting in the news for some worldwide botnet of all Windows 10 machines. I bet they'll figure this out sooner or later.)
Microsoft Office somewhat works in the browser. Certainly good enough for me, although 99% of my use is: upload a document to OneDrive, open it in the web version of MS Office, export to PDF, and then read it with standard tools.
> Microsoft Office Online works fine on Linux. In fact, it’s superior to native MS Office in terms of stability.
It may work for your case - good. Many companies have custom VBA macros that run on their Excel sheets to fetch or validate data. Try to use a document like that in the online Office and you will understand why most Office users can't easily migrate.
What kills me is there seems to be no option for accounting that is acceptable to CPAs besides being held captive paying whatever QuickBooks' cloud demands. It's not like double-entry accounting has changed much in 500 years. There are bank integrations and service contracts (notably, Apple Card wasn't willing to pay licensing fees for the QuickBooks file format, so you simply couldn't synchronize your accounts with your spending, instead falling back to manual import), but they would not make investors happy by merely offering bank connection services.
(God forbid banks be required by law to offer a web connector that allows you to request your own data. A workaround I've tried is to have my bank send me an email alert on every transaction over a penny, so at least I have a record, but never got around to setting up an auto import from my inbox)
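That inbox auto-import is mostly a regex job. A hypothetical sketch - the alert wording and regex are invented, since every bank formats these differently:

```python
import re
from decimal import Decimal

# Hypothetical alert wording; a real bank's emails would need their own regex.
ALERT_RE = re.compile(
    r"A charge of \$(?P<amount>[\d,]+\.\d{2}) was made at (?P<merchant>.+?)\."
)

def parse_alert(body):
    """Extract (amount, merchant) from one alert email body, or None."""
    m = ALERT_RE.search(body)
    if not m:
        return None
    # Decimal avoids the float rounding you never want in accounting data.
    return Decimal(m.group("amount").replace(",", "")), m.group("merchant")

sample = "Alert: A charge of $1,234.56 was made at ACME HARDWARE."
print(parse_alert(sample))
```

From there it's one IMAP fetch loop away from a CSV your accounting software can import, which is a sad substitute for the bank just offering an API.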
I've heard that many times, but the 3 accounting firms I've worked with for my business didn't care what accounting software I used. They were all happy to work with Gnucash so long as I could provide the needed reports, all of which were pre-configured in Gnucash. Two were small firms, but one was part of a major national accounting firm/franchise.
> I miss the days when personal computers were simply tools, akin to pencils and handheld calculators.
> System 7 and Windows 95
If Windows 95 was the complexity level of a pencil to you, Win 10/11 is merely a color pencil. You should be fine getting rid of the nagging and adapting it to your needs, it hasn't become 10x or 100x more complex, merely incrementally more.
> Microsoft [...] not exploiting their platforms.
That's a phrase I didn't expect. What part of Microsoft do you feel was leaving money on the table, given they were sued by basically the whole globe for their business practices?
Is it dead because people don’t want the desktop, or is it dead because Big Tech won’t invest in the desktop beyond what’s necessary for their business?
Whether intentional or not, it seems like the trend is increasingly locked-down devices running locked-down software, and I’m also disturbed by the prospect of Big Tech gobbling up hardware (see the RAM shortage, for example), making it unaffordable for regular people, and then renting this hardware back to us in the form of cloud services.
Desktop is all about collaboration and interaction with other apps. The ideal of every contemporary SaaS is that you can never download your "files" so you stay locked in.
Digital sovereignty is woefully undertaught. Things like FOSS software, cryptography and its many uses, digital rights management, ownership rights, right to repair, etc. We are turning computers into monkey-friendly appliance devices, when we should be molding tiny humans into digitally sovereign supermonkeys on advanced universal computation devices. How does anyone graduate high school not having heard of the Diffie-Hellman Key Exchange? In my ideal, that would not happen. Students taught properly in digital sovereignty should be extremely difficult to surveil or control digitally with almost any kind of local, network, or service lock. This is a good thing, we want digital barbarian warriors and not digital slaves.
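Since Diffie-Hellman came up: the whole classroom version fits in a few lines. A toy sketch using modular exponentiation - the prime is what I believe to be the well-known 1024-bit MODP group from RFC 2409, but treat it as an assumption; real use needs vetted parameters and a KDF over the shared secret:

```python
import secrets

# 1024-bit MODP prime (believed to be RFC 2409 Oakley group 2) and generator.
P = int(
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD129024E088A67CC74"
    "020BBEA63B139B22514A08798E3404DDEF9519B3CD3A431B302B0A6DF25F1437"
    "4FE1356D6D51C245E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE65381FFFFFFFFFFFFFFFF",
    16,
)
G = 2

def keypair():
    """Pick a random private exponent and derive the public value g^x mod p."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

a_priv, a_pub = keypair()   # Alice
b_priv, b_pub = keypair()   # Bob

# Each side combines its own private key with the other's public value:
shared_a = pow(b_pub, a_priv, P)
shared_b = pow(a_pub, b_priv, P)
assert shared_a == shared_b  # both arrive at the same secret
```

The punchline worth teaching in high school is the last two lines: two parties who only ever exchanged public values end up with the same secret, and an eavesdropper who saw everything on the wire does not.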
But outside of that, I doubt many users who actually do stuff (as opposed to just ingesting content) will abandon the desktop, and on other platforms, like the Mac, the UI isn't getting worse.
I enjoyed this talk, and I want to learn more about the concept of “learning loops” for interface design.
Personally, I wish there were a champion of desktop usability like how Apple was in the 1980s and 1990s. I feel that Microsoft, Apple, and Google lost the plot in the 2010s due to two factors: (1) the rise of mobile and Web computing, and (2) the realization that software platforms are excellent platforms for milking users for cash via pushing ads and services upon a captive audience. To elaborate on the first point, UI elements from mobile and Web computing have been applied to desktops even when they are not effective, probably to save development costs, and probably since mobile and Web UI elements are seen as “modern” compared to an “old-fashioned” desktop. The result is a degraded desktop experience in 2025 compared to 2009 when Windows 7 and Snow Leopard were released. It’s hamburger windows, title bars becoming toolbars (making it harder to identify areas to drag windows), hidden scroll bars, and memory-hungry Electron apps galore, plus pushy notifications, nag screens, and ads for services.
I don’t foresee any innovation from Microsoft, Apple, or Google in desktop computing that doesn’t have strings attached for monetization purposes.
The open-source world is better positioned to make productive desktops, but without coordinated efforts, it seems like herding cats, and it seems that one must cobble together a system instead of having a system that works as coherently as the Mac or Windows.
With that said, I won’t be too negative. KDE and GNOME are consistent when sticking to Qt/GTK applications, respectively, and there are good desktop Linux distributions out there.
It's because companies are no longer run by engineers. The MBAs and accountants are in charge and they couldn't care less about making good products.
At Microsoft, Satya Nadella has an engineering background, but it seems like he didn't spend much time as an engineer before getting an MBA and playing the management advancement game.
Our industry isn't what it used to be and I'm not sure it ever could.
I feel a major shift happened in the 2010s. The tech industry became less about making the world a better place through technology, and more about how best to leverage power to make as much money as possible, making the world a better place be damned.
This also came at a time when tech went from being considered a nerdy obsession to tech being a prestigious career choice much like how law and medicine are viewed.
Tech went from being a sideshow to the main show. The problem is once tech became the main show, this attracts the money- and career-driven rather than the ones passionate about technology. It’s bad enough working with mercenary coworkers, but when mercenaries become managers and executives, they are now the boss, and if the passionate don’t meet their bosses’ expectations, they are fired.
I left the industry and I am now a tenure-track community college professor, though I do research during my winter and summer breaks. I think there are still niches where a deep love for computing without being overly concerned about “stock line go up” metrics can still lead to good products and sustainable, if small, businesses.
In the 80s and 90s there was much more idealism than now. There were also more low hanging fruit to develop software that makes people’s lives better. There was also less investor money floating around so it was more important to appeal to end users. To me it seems tech has devolved into a big money making scheme with only the minimum necessary actual technology and innovation.
This is not true for the vast majority of people making these things. At some point, most businesses go from “make money or die” to financial security: “make line go up forever for no reason”.
I bet the vast majority of people making things also want cutting edge healthcare for themselves and loved ones, for their whole life, which is equivalent to make money or die.
I would agree that it was different, but I also think this may be history viewed through rose-tinted glasses somewhat.
> There were also more low hanging fruit to develop software that makes people’s lives better.
In principle, maybe. In practice, you had to pay for everything. Open source or free software was not widely available. So, the profit motive was there. The conditions didn’t exist yet for the profit model we have today to really take off, or for the appreciation of it to exist. Still, if there’s a lot of low-hanging fruit, that means the maturity of software was generally lower, so it’s a bit like pining for the days when people lived on the farm.
> There was also less investor money floating around so it was more important to appeal to end users.
I’m not so sure this appeal was so important (and investors do care about appeal!). If you had market dominance like Microsoft did, you could rest on your laurels quite a bit (and that they did). The software ecosystem you needed to use also determined your choices for you.
> To me it seems tech has devolved into a big money making scheme with only the minimum necessary actual technology and innovation.
As I said earlier, the profit motive was always there. It was just expressed differently. But I will grant you that the image is different. In a way, the mask has been dropped. When facebook was new, no one thought of it as a vulgar engine for monetizing people either (I even recall offending a Facebook employee years ago when I mentioned this, what should frankly have been obvious), but it was just that. It was all just that, because the basic blueprint of the revenue model was there from day one.
As a private individual, you didn't actually have to pay for anything once you got an Internet connection. Most countries never even tried enforcing copyright laws against small fish. DRM was barely a thing and was easily broken within days by l33t teenagers.
Things like hypertext, search, email, early social networks (chat networks connecting disparate people), and, finally, the paperless office. Images and video corrupted everything once they became the things that addicted eyeballs.
I think you may be looking at history through rose-tinted glasses. Sure, social media today is not the same, so the comparison isn’t quite sensible, but IRC was an unpleasant place full of petty egos and nasty people.
A trope in the first season of HBO’s Silicon Valley is literally every company other than the main characters professing their mission statement to be “Making the world a better place through (technobabble)”
The subtle running joke was that while the main characters' technobabble was fake, every other background SV startup was “Making the world a better place through Paxos-based distributed consensus” and other real-world serious tech.
I have heard a big factor is that a lot of the newer devs don’t really use a desktop OS outside of work. So for them, developing a desktop OS is more of an abstract project, like me developing software for medical devices that I never use myself.
People who got into software development not because they enjoy working with computers, but rather because it pays well. Outside of work, they're the same as any other casual who's got a phone as their primary computing device.
Also people who now have other commitments, such as family, or became tired of computers over their career and don't want to fiddle with them outside of work anymore. I feel like an outlier in my office; even the nerdiest of my developer colleagues sold his PC in favor of a Steam Deck and phones.
> any other casual who's got a phone as their primary computing device.
I tried to use my phone as a "computing device", but I can mostly only use it as a toy. Working with text and files on a phone is... how to say it nicely... interesting.
I feel the same way, but if you're a person who doesn't deal with files outside of work (photos are in the photo app, notes in the notes app), and doesn't deal with text beyond messaging and short notes, having those things be easier to work with is a bit like selling fridges in northern Canada.
Somehow this just made me think of VS Code. The native Windows build has the top command bar; one of my Linux VMs does not... It saves an entire line of text, as if I cared about that much vertical space on a modern screen.
On the other hand, the Vivaldi build I'm trying on my phone has this stupidly thick bar at the bottom on Android, with essentially bookmarks, back, home, forward, and tabs buttons... taking up significantly more visual space...
Some job positions are so competitive to get that a candidate with good data structures and algorithms skills but who hasn’t seen a specific LeetCode problem before and needs to solve it on the spot may lose out to a candidate who “grinded LeetCode.” It’s kind of like how a good student still needs to prep for standardized tests.
I wonder if part of the problem is the lack of color in these examples? I remember Microsoft Office 97 and 2000, which had icons in their menus (albeit only for a few actions, not for every action). However, those icons were colored and appeared visually distinct from each other.
Yesterday I booted my 350MHz Power Mac G4 for the first time in 13 years. I booted into Mac OS 9.2.2. I remember the Apple menu having icons for every item. Once again, though, every icon was in color.
And the loss of skeuomorphism. As much as designers chide it, skeuomorphic interfaces are, when done well, a massive improvement in usability compared to flat/monochrome ones, both for new and experienced users.
It's not really visual "clutter", the shadows / pseudo-3d elements help the brain distinguish between different types of elements, providing contextual information.
Yesyesyes, this here. Icons need colors: the smaller they are, the more they need them. Otherwise, they might as well be gray blobs. Peripheral vision works with colors, but it doesn’t do finer details.
rant:
But in the end, user interfaces are mostly “dead” anyway. No more structure, no more colors, no more icons. Everything is a flat sea of labels and boxes (or sometimes even just lines) floating(!) around. And no two user interfaces use the same style, even from the same vendor.
I believe Microsoft Office 97 for Windows was the first time I saw icons next to menu items. Office 97 had highly customizable menus and toolbars. Each menu item and toolbar item could be thought of as an action with an icon and a label, and that action could be placed in either a menu or a toolbar. Not every menu item had an icon associated with it. Additionally, each icon was colored and was clearly distinct.
Office 97 went pretty overboard on customization. It could be awesome if you knew what you were doing, but I saw countless examples where somebody had accidentally changed something and got stuck. Deleted the File menu? Tough luck!
This is definitely where I would place this pattern - MS Office 97’s customizable toolbars necessitated this model where every single thing you could do in the application had an icon.
It then got copied into Visual Studio, where making all of the thousands of things you could do and put into custom toolbars or menus have visually meaningful icons was clearly an impossible task, but it didn’t stop Microsoft trying.
I assume Adobe, with their toolbar-centric application suite, participated in the same UI cycle.
By the time of Office 2007 Microsoft were backing off the completely customizable toolbar model with their new ‘Ribbon’ model, which was icon-heavy, but much more deliberately so.
I still regard Office '97 as the best UI it ever had. I spent a lot of time inside it, including a couple of years at a bank reconciling corporate actions before I got my first programming job. The ribbon version was awful in comparison.
Also, the Excel Labs formula editor. But it needs a way to tell it "I know I have too many cells! Just let me trace over the 100 nearest rows."
The old scripting language can still be handy if you can keep people from opening the online version of Excel. Especially if you have a certain debugger addin[1]. Excel's JavaScript features are of limited use, if you're offline.
I keep wishing for a spreadsheet to implement all its scripting and formulas in something like Forth behind the scenes, so that every time a competitor announces n-more functions, we can just be like "Oh, really?" and add it.
[1] Related to waterfowl of the plasticised yellow variety. I'm not sure I can mention the name in a post anymore, since ages ago when I tried multiple times to post a properly-referenced (overly-hyperlinked?) message while my connection was very flaky. Note to self: should probably mail dang about this, some day.
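The Forth-like wish above can be sketched as a toy (this is purely hypothetical and not how any real spreadsheet works): a tiny stack-based core where every user-visible "function" is just a definition built from a few primitive words, so matching a competitor's new function means adding one definition, not touching the engine.

```python
# Hypothetical sketch: a minimal Forth-style core for spreadsheet functions.
# All names here (PRIMITIVES, DEFINITIONS, run) are made up for illustration.

def _div(stack):
    # "/" needs explicit pop order: second-from-top divided by top.
    b = stack.pop()
    stack.append(stack.pop() / b)

PRIMITIVES = {
    "+":   lambda s: s.append(s.pop() + s.pop()),
    "*":   lambda s: s.append(s.pop() * s.pop()),
    "/":   _div,
    "dup": lambda s: s.append(s[-1]),
}

# "Spreadsheet functions" defined purely in terms of core words (or each other).
DEFINITIONS = {
    "SQUARE":   ["dup", "*"],
    "DOUBLE":   ["2", "*"],
    "QUAD":     ["DOUBLE", "DOUBLE"],   # definitions can build on definitions
    "AVERAGE2": ["+", "2", "/"],        # two-argument average
}

def run(tokens, stack, defs=DEFINITIONS):
    """Evaluate a token list against a stack; unknown tokens push as numbers."""
    for tok in tokens:
        if tok in PRIMITIVES:
            PRIMITIVES[tok](stack)
        elif tok in defs:
            run(defs[tok], stack, defs)  # naive inline expansion, no recursion guard
        else:
            stack.append(float(tok))
    return stack

print(run("3 4 AVERAGE2".split(), []))  # [3.5]
print(run("5 QUAD".split(), []))        # [20.0]
```

The point of the design is that the evaluator never changes; "n more functions" is just n more entries in the definitions table.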