Mozilla begs courts to allow Google search deal for Firefox to continue

The moment a lot of us have been fearing may soon be upon us. Among the various remedies proposed by the United States Department of Justice to address Google’s monopoly abuse is a ban on Google spending money to become the default search engine on other devices, platforms, or applications.

“We strongly urge the Court to consider remedies that improve search competition without harming independent browsers and browser engines,” a Mozilla spokesperson tells PCMag. Mozilla points to a key but less eye-catching proposal from the DOJ to regulate Google’s search business, which a judge ruled a monopoly in August. In their recommendations, federal prosecutors urged the court to ban Google from offering “something of value” to third-party companies to make Google the default search engine over their software or devices.
↫ Michael Kan at PCMag

Obviously, Mozilla is urging the courts to reconsider this remedy, because it would instantly cut more than 80% of Mozilla’s revenue. As I’ve been saying for years now, the reason Firefox seems to be getting worse is that Mozilla is desperately trying to find other sources of revenue, and they seem to think advertising is their best bet – even going so far as to work together with Facebook. Imagine how much more invasive and user-hostile these attempts are going to get if Mozilla suddenly loses 80% of its revenue. For so, so many years now I’ve been warning everyone about just how fragile the future of Firefox was, and every one of my worries and predictions has become reality. If Mozilla now loses 80% of its funding, which of the platforms Firefox officially supports do you think will feel the sting of inevitable budget cuts, scope reductions, and even more layoffs first?
The future of Firefox, especially on Linux, is hanging by a thread, and with everyone lulled into complacency by Chrome and its many shady skins, nobody in the Linux community seems to have done anything to prepare for this near inevitability. With no proper, fully-featured replacements in the works, Linux distributions, especially ones with strict open source requirements, will most likely be forced to ship with de-Googled Chromium variants by default once Firefox becomes incompatible with such requirements. And no matter how much you take Google out of Chromium, it’s still effectively a Google product, leaving most Linux users entirely at the whim of big tech for the most important application they have. We’re about to enter a very, very messy time for browsing on Linux.

Leaving big tech behind: Murena’s /e/OS on the Fairphone 5

There are so many ecological, environmental, and climate problems and disasters taking place all over the world that it’s sometimes hard to see the burning forests through the charred tree stumps. As at best middle-income individuals living in this corporate line-must-go-up hellscape, there’s only so much we can do to turn back the rising tides of fascism and leave at least a semblance of a livable world for our children and grandchildren. Of course, the most elementary thing we can do is not vote for science-denying death cults who believe everything is some non-existent entity’s grand plan, but other than that, what’s really our impact if we drive a little less or use paper straws, when some wealthy robber baron flying his private jet to Florida to kiss the gaudy gold ring and signal his obedience does more damage to our world in one flight than we do in a year of driving to our underpaid, expendable jobs? Income, financial, health, and other circumstances allowing, all we can do are the little things that make ourselves feel better, usually in areas in which we are knowledgeable. In technology, it might seem like there’s not a whole lot we can do, but there are actually quite a few steps we can take. One of the biggest things you, as an individual knowledgeable about and interested in tech, can do to give the elite and ruling class the finger is to move away from big tech, their products, and their services – no more Apple, Amazon, Microsoft, Google, or Meta. This is often a long, tedious, and difficult process, as most of us will discover we rely on a lot more big tech products than we initially thought. It’s like an onion that looks shiny and tasty on the outside, but is rotting from the inside – the more layers you peel away, the dirtier and nastier it gets. Also, you start crying. I’ve been in the process of eradicating as much big tech as possible from my life for a long time now.
For four or five years now, all my desktop and laptop PCs have run Linux, from my dual-Xeon workstation to my high-end gaming PC (ignore that spare-parts PC that runs Windows just for League of Legends. That stupid game is my guilty pleasure and I will not give it up), from my XPS 13 laptop to my little Home Assistant thin client. I’ve never ordered a single thing from Amazon and have no Prime subscription or whatever it is, so that one was a freebie. Apple I banished from my life long ago, so that’s another freebie. Sadly, that other device most of us carry with us remained solidly in the big tech camp, as I’ve been using an Android phone for a long time, filled to the brim with Google products, applications, and services. There really isn’t a viable alternative to the Android and iOS duopoly. Or is there? Well, in a roundabout way, there is an alternative to iOS and Google’s Android. You can’t do much to take the Apple out of an iPhone, but there’s a lot you can do to take the Google out of an Android phone. Unless or until an independent third platform ever manages to take serious hold – godspeed, our saviour – de-Googled Android, as it’s called, is your best bet at having a fully functional, modern smartphone that’s as free from big tech as you want it to be, without leaving you with a barely usable, barebones experience. While you can install a de-Googled ROM yourself, as there are countless to choose from, this is not an option for everyone, since not everyone has the skills, time, and/or supported devices to do so.

Murena, Fairphone, and sustainable mining

This is where Murena comes in. Murena is a French company – founded by Gaël Duval, of Mandrake Linux fame – that develops /e/OS, a de-Googled Android using microG (which Murena also supports financially), which it makes available for anyone to install on supported devices, while also selling various devices with /e/OS preinstalled.
Murena goes one step further, however, by also offering something called Murena Workspace – a branded Nextcloud offering that works seamlessly with /e/OS. In other words, if you buy an /e/OS smartphone from Murena, you get the complete package of smartphone, mobile operating system, and cloud services that’s very similar to buying a regular Android phone or an iPhone. To help me test this complete package of smartphone, de-Googled Android, and cloud services, Murena loaned me a Fairphone 5 with /e/OS preinstalled, and while this article mostly focuses on the /e/OS experience, we should first talk a little bit about the relationship between Murena and Fairphone. Murena and Fairphone are partners, and Murena has been selling /e/OS Fairphones for a while now. Most of us will be familiar with Fairphone – it’s a Dutch company focused on designing and selling smartphones and related accessories that are user-repairable and long-lasting, while also trying everything within their power to give full insight into their supply chain. This is important, because every smartphone contains quite a few materials that are unsustainably mined. Many mines are destructive to the environment, have horrible working conditions, or even stoop as low as employing children. Even companies priding themselves on being environmentally responsible and sustainable, like Apple, are guilty of partaking in and propping up such mining endeavours. As consumers, there isn’t much we can do – the network of supply chains involved in making a smartphone is incredibly complex and opaque, and there’s basically nothing normal people can do to really, fully know on whose underpaid or even underage shoulders their smartphone is built. This holiday season, Murena and Fairphone are collaborating on exactly this issue of the conditions in the mines used to acquire the metals and minerals in our phones.
Instead of offering big discounts (that barely eat into margins and often follow sharp price increases right before the holidays), Murena and Fairphone will donate

Managing third-party packages in 9front

Every now and then, news from the club I’m too cool to join, the Plan 9/9front community, pierces the veil of coolness and enters our normal world. This time, someone accidentally made a package manager for 9front.

I’ve been growing tired of manually handling random software, so I decided to find a simple way to automate the process and ended up making a sort of “package manager” for 9front. It’s really just a set of shell scripts that act as a frontend for git and keep a simple database of package names and URLs. Running the pkginit script will ask for a location to store the source files for installed packages (/sys/pkg by default) which will then be created if non-existent. And that’s it! No, really. Now you can provide a URL for a git repository to pkg/add.
↫ Kelly “bubstance” Glenn

As I somehow expected from 9front, it’s quite a simple and elegant system. I’m not sure how well it would handle more complex package operations, but I doubt many 9front setups are complex to begin with, so this may be just enough to take some of the tedium out of managing software on 9front, as the author originally intended. One day I will be cool enough to use 9front. I just have to stay cool.
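The actual scripts are written in rc, 9front’s shell, and I haven’t read them – but the shape described above is easy to picture: a flat text file of name/URL pairs, with git doing all the heavy lifting. Here’s a rough sketch of that bookkeeping in Python, with invented names and paths; this is an illustration of the idea, not the real pkg scripts:

```python
# Illustrative model of the 9front "pkg" idea -- not the real rc scripts.
# The entire package "database" is a flat text file of "name url" pairs;
# installing a package would just mean cloning its git repository into
# a source tree (/sys/pkg/src on 9front; /tmp here for illustration).
import os

PKGROOT = "/tmp/pkgroot"                  # stand-in for /sys/pkg
DB = os.path.join(PKGROOT, "db")

def pkg_init():
    """Create the package root and an empty database if missing."""
    os.makedirs(os.path.join(PKGROOT, "src"), exist_ok=True)
    open(DB, "a").close()

def pkg_add(name, url):
    """Record a package; the real script would then clone `url`."""
    with open(DB, "a") as db:
        db.write(f"{name} {url}\n")

def pkg_url(name):
    """Look up where a package's source lives."""
    with open(DB) as db:
        for line in db:
            n, url = line.split(maxsplit=1)
            if n == name:
                return url.strip()
    return None

pkg_init()
pkg_add("hello", "git://example.org/hello")
print(pkg_url("hello"))                   # git://example.org/hello
```

The point being how little state a package manager actually needs when git already handles fetching, updating, and history.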

Microsoft Word is using you to train “AI”

The author of this article, Dr. Casey Lawrence, mentions that the opt-out checkbox is hard to find, and they aren’t kidding. On Windows, here’s the full snaking path you have to take through Word’s settings to get to it: File > Options > Trust Center > Trust Center Settings > Privacy Options > Privacy Settings > Optional Connected Experiences > uncheck the box “Turn on optional connected experiences”. That is absolutely bananas. No normal person is ever going to find this checkbox. Anyway, remember how the “AI” believers kept saying “hey, it’s on the internet so scraping your stuff and violating your copyright is totally legal you guys!”? Well, what about when you’re using Word, installed on your own PC, to write private documents containing, say, sensitive health information? Or detailed plans about your company’s competitor to Azure or Microsoft Office? Or correspondence with lawyers about an antitrust lawsuit against Microsoft? Or a report on Microsoft’s illegal activity you’re trying to report as a whistleblower? Is that stuff fair game for the gobbledygook generators too? This “AI” nonsense has to stop. How is any of this even remotely legal?

Using (only) a Linux terminal for my personal computing in 2024

A month and a bit ago, I wondered if I could cope with a terminal-only computer. The only way to really find out was to give it a go. My goal was to see what it was like to use a terminal-only computer for my personal computing for two weeks, and more if I fancied it.
↫ Neil’s blog

I tried to do this too, once. Once. Doing everything from the terminal just isn’t viable for me, mostly because I didn’t grow up with it. Our family’s first computer ran MS-DOS (with a Windows 3.1 installation we never used), and I’m pretty sure the experience of using MS-DOS as my first CLI ruined me for life. My mental model for computing didn’t start forming properly until Windows 95 came out, and as such, computing is inherently graphical for me. No matter how many amazing CLI and TUI applications are out there – and there are many, many amazing ones – my brain just isn’t compatible with them. There are a few tasks I prefer doing on the command line, like updating my computers or editing system files with Nano, but for everything else I’m just faster and more comfortable with a graphical user interface. This comes down to not knowing most commands by heart, and often not even knowing the options and flags for the most basic of commands, meaning even very basic operations that people comfortable with the command line perform without thinking take me ages. I’m glad any modern Linux distribution – I use Fedora KDE on all my computers – offers both paths for almost anything you could do on your computer, and unless I specifically opt to do so, I literally – literally literally – never have to touch the command line.

MaXX Interactive Desktop springs back to life with new release and updated roadmap

I had to dive into our archive all the way back to 2017 to find the last reference to the MaXX Interactive Desktop, and it seems this wasn’t exactly my fault – the project has been on hiatus since 2020, and is only now coming back to life, as MaXXdesktop v2.2.0 (nickname Octane) Alpha-1 has been released, alongside a promising and ambitious roadmap for the future of the project. For the uninitiated – MaXX is a Linux reimplementation of the IRIX Interactive Desktop with some modernisations and other niceties to make it work properly on modern Linux (and FreeBSD) machines. MaXX has a unique history in that its creator and lead developer, Eric Masson, managed to secure a special license agreement with SGI way back in 2005, under which he was allowed to recreate, from scratch, the IRIX Interactive Desktop on Linux, including the use of SGI’s trademarks and IRIX’ unique look and feel. It’s important to note that he did not get access to any code – he was only allowed to reverse-engineer and recreate it, and because some of the code falls under this license agreement and some doesn’t, MaXX is not entirely open source; parts of it are, but not all of it. Any new code written that doesn’t fall under the license agreement is released as open source though, and the goal is to, over time, make everything open source. And as you can tell from this v2.2.0 screenshot, MaXX looks stunning even at 4K. This new alpha version contains the first changes to adopt the freedesktop.org application specifications, a new Exposé-like window overview, tweaks to the modernised version of the IRIX look and feel (the classic one is also included as an option), desktop notifications, performance improvements, various modernisations to the window manager, and so, so much more. For the final release of 2.2.0 and later releases, more changes are planned, like brand new configuration and system management panels, a quick search tool, a new file manager, and a ton more. 
MaXX runs on RHEL/Rocky, Ubuntu, and probably other Linux distributions, as well as FreeBSD, and is entirely free.

The rare POWER Indigo 2

This is a Silicon Graphics workstation from 1995. Specifically, it is a ‘Teal’ Indigo 2 (as opposed to a ‘Purple’ Indigo 2, which came later). Ordinarily that’s rare enough – these things were about £30,000 brand new. A close look at the case badge, though, marks this out as a ‘Teal’ POWER Indigo 2 – where instead of the usual MIPS R4600 or R4400SC CPU modules, we have the rare, unusual, expensive and short-lived MIPS R8000 module.
↫ Jonathan Pallant

It’s rare these days to find an article about exotic hardware with this many detailed photographs – most people just default to making videos now. Even if the actual contents of the article aren’t interesting, this is some real good hardware pornography, and I salute the author for taking the time to both take and publish these photos in a traditional way. That being said, what makes this particular SGI Indigo 2 so special?

The R8000 is not a CPU in the traditional sense. It is a processor, but that processor is comprised of many individual chips, some of which you can see and some of which are hidden under the heatsink. The MIPS R8000 was apparently an attempt to wrestle back the Floating-Point crown from rivals. Some accounts report that at 75 MHz, it has around ten times the double-precision floating point throughput of an equivalent Pentium. However, code had to be specially optimised to take best advantage of it and most code wasn’t. It lasted on the market for around 18 months, before being replaced by the MIPS R10K in the ‘Purple’ Indigo 2.
↫ Jonathan Pallant

And here we see the first little bits of writing on the wall for the future of all the architectures trying to combat the rising tide of x86. SGI’s MIPS, Sun’s SPARC, HP’s PA-RISC, and other processors would stumble along for a few more years after this R8000 module came on the market, but looking back, all of these companies knew which way the wind was blowing, and many of them would sign onto Intel’s Itanium effort.
Itanium would fail spectacularly, but the cat was out of the bag, and SGI, Sun, and HP would all be making standard Xeon and Opteron workstations within a few years. Absolutely amazing to see a machine and module this rare so lovingly looked after.

Introduction to Bismuth VM

This is the first post in what will hopefully become a series of posts about a virtual machine I’m developing as a hobby project called Bismuth. This post will touch on some of the design fundamentals and goals, with future posts going into more detail on each. But to explain how I got here I first have to tell you about Bismuth, the kernel.
↫ Eniko Fox

It’s not every day that a developer of an awesome video game details a project they’re working on that also happens to be excellent material for OSNews. Eniko Fox, one of the developers of the recently released Kitsune Tails, has also been working on an operating system and virtual machine in her spare time, and has recently been detailing the experience in, well, more detail. This is the first article in the series, and a few days ago she published the second part, about memory safety in the VM. The first article goes into the origins of the project, as well as the design goals for the virtual machine. It started out as an operating systems development side project, but once it was time to develop things like the MMU and virtual memory mapping, Fox started wondering if programs couldn’t simply run inside a virtual machine atop the kernel instead. This is how the actual Bismuth virtual machine was conceived. Fox wants the virtual machine to care about memory safety, and that’s what the second article goes into. Since the VM is written in C, which is anything but memory-safe, she’s opting to implement a form of sandboxing – which also happens to be the point in the development story where my limited knowledge starts to fail me and things get a little too complicated for me. I can’t even internalise how links work in Markdown, after all (square or regular brackets first? Also, Markdown sucks as a writing tool, but that’s a story for another time). For those of you more capable than me – so, basically, most of you – Fox’s series is a great one to follow along with as she further develops the Bismuth VM.
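I don’t know what Bismuth’s sandboxing actually looks like internally, but the general idea behind this style of VM memory safety – guest code never touches host pointers, and every guest load or store goes through a bounds-checked accessor into the VM’s private arena – can be sketched. This is an illustrative Python toy, not Bismuth’s C implementation:

```python
# Toy illustration of VM memory sandboxing -- not Bismuth's design.
# Guest "memory" is a private bytearray; guest code can only reach it
# through accessors that reject any out-of-bounds address, so a buggy
# or hostile guest cannot read or write outside its own arena.
class VmMemoryError(Exception):
    pass

class GuestMemory:
    def __init__(self, size):
        self.mem = bytearray(size)

    def _check(self, addr, width):
        if addr < 0 or addr + width > len(self.mem):
            raise VmMemoryError(f"access at {addr:#x} escapes sandbox")

    def load32(self, addr):
        self._check(addr, 4)
        return int.from_bytes(self.mem[addr:addr + 4], "little")

    def store32(self, addr, value):
        self._check(addr, 4)
        self.mem[addr:addr + 4] = (value & 0xFFFFFFFF).to_bytes(4, "little")

mem = GuestMemory(4096)
mem.store32(16, 0xDEADBEEF)
print(hex(mem.load32(16)))        # 0xdeadbeef
try:
    mem.load32(4096)              # one byte past the end: rejected
except VmMemoryError:
    print("out-of-bounds access blocked")
```

In a real C VM the check is the same shape, just on the hot path of the interpreter’s load/store opcodes, which is presumably where the interesting performance and safety trade-offs she writes about come in.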

What’s in a Steam Deck kernel anyway?

Valve, entirely going against the popular definition of Vendor, is still actively working on improving and maintaining the kernel for their Steam Deck hardware. Let’s see what they’re up to in this 6.8 cycle.
↫ Samuel Dionne-Riel

Just a quick look at what, exactly, Valve does with the Steam Deck Linux kernel – nothing more, nothing less. It’s nice to have simple, straightforward posts sometimes.

Linux to lose support for Apple and IBM’s failed PowerPC Common Hardware Reference Platform

Ah, the Common Hardware Reference Platform, IBM and Apple’s ill-fated attempt at taking on the PC market with a reference PowerPC platform anybody could build and expand upon while remaining (mostly) compatible with one another. Sadly, like so many other things Apple was trying to do before Steve Jobs returned, it never took off, and even Apple itself never implemented CHRP in any meaningful way. Only a few random IBM and Motorola computers ever fully implemented it, Apple didn’t get any further than basic CHRP support in Mac OS 8, and some PowerPC Macs were based on CHRP without actually being compatible with it. We’re roughly three decades down the line now, and pretty much everyone except weird nerds like us has forgotten CHRP was ever even a thing, but Linux has continued to support it all this time. This support, too, is now coming to an end, as Michael Ellerman has informed the Linux kernel community that they’re thinking of getting rid of it. Only a very small number of CHRP machines are supported by Linux – the IBM B50, bplan/Genesi’s Pegasos/Pegasos2 boards, the Total Impact briQ, and maybe some Motorola machines – and that’s it. Ellerman notes that these machines seem to have zero active users, and anyone wanting to bring CHRP support back can always go back in the git history. CHRP is one of the many, many footnotes in computing history, and with so few machines out there that supported it, and so few machines Linux’s CHRP support could even be used for, it makes perfect sense to remove it from the kernel, while obviously keeping it in git’s history in case anyone wants to work with their hardware in the future. Still, it’s always fun to see references to such old, obscure hardware and platforms in 2024, even if it’s technically sad news.

Microsoft pushes full-screen ads for Copilot+ PCs on Windows 10 users

Windows 10’s free, guaranteed security updates stop in October 2025, less than a year from now. Windows 10 users with supported PCs have been offered the Windows 11 upgrade plenty of times before. But now Microsoft is apparently making a fresh push to get users to upgrade, sending them full-screen reminders recommending they buy new computers.
↫ Andrew Cunningham at Ars Technica

That deadline sure feels like it’s breathing down Microsoft’s neck. Most Windows users are still using Windows 10, and all of those hundreds of millions (billions?) of computers will become unsupported less than a year from now, which is going to be a major headache for Microsoft once the unaddressed security issues start piling up. CrowdStrike is fresh in Microsoft’s mind, and the company made a ton of promises about changing its security culture and implementing new features and best practices to stop it from ever happening again. Those are going to be some very tough promises to keep when the majority of Windows users are no longer getting any support. The obvious solution here is to accept the fact that if people haven’t upgraded to Windows 11 by now, they’re not going to until forced to do so, because their computer breaks or becomes too slow and Windows 11 comes preinstalled on their new one. No amount of annoying full-screen ads interrupting people’s work or pleasure is going to get people to buy a new PC just for some half-baked “AI” nonsense or whatever – in fact, it might just put even more people off from upgrading in the first place. Microsoft needs to face the music and simply extend the end-of-support deadline for Windows 10. Not doing so is massively irresponsible to a level rarely seen even from big tech, and if they refuse, I strongly believe authorities should get involved and force the company to extend the deadline. You simply cannot leave this many users with insecure, unmaintained operating systems that they rely on every day to get their work done.

OpenVMS V9.2-3 released

VMS Software, the company migrating OpenVMS to x86 (well, virtualised x86, at least), has announced the release of OpenVMS 9.2-3, which brings with it a number of new features and changes. It won’t surprise you to hear that many of the changes concern virtualisation and enterprise networking, like adding passthrough support for Fibre Channel when running OpenVMS in VMware, a new VGA/keyboard-based guest console, automatic configuration of TCP/IP and OpenSSH during installation, improved performance for virtualised network interfaces on VMware and KVM, and much more. Gaining access to OpenVMS requires requesting a community license, after which OpenVMS will be delivered in the form of a preinstalled virtual disk image, complete with a number of development tools.

How Analyzing Turnover Rates Contributes to Success in Real Estate Farming

Targeted area marketing is a vital strategy for agents looking to build long-term success in a specific area. But what makes a farm area ideal? While various factors contribute to farming success, analyzing turnover rates stands as one of the most critical. Turnover rates represent the percentage of homes sold within a set time frame, usually annually. Understanding these rates provides invaluable insight into market activity, helping real estate agents identify potential opportunities, assess the competition, and craft effective real estate farming marketing strategies. This allows agents to focus their efforts where they will yield the best results, maximizing their return on investment.

Understanding Turnover Rates

In real estate or targeted area farming, these rates help agents gauge how often homes are bought and sold in a particular area. Why is this important? It reflects the potential for new business. A higher turnover means there are more opportunities to sell homes, making the area a prime spot for farming. If an area has a low turnover, it could be a sign of low market activity, indicating fewer potential clients and, therefore, fewer opportunities for success. By tracking these rates, agents can determine whether a neighborhood is ripe for engagement or if another area may provide more fruitful prospects. This insight allows agents to allocate resources more wisely and avoid wasting time in stagnant markets.

Targeting the Right Market Segment with Turnover Insights

Another benefit of analyzing these rates is the ability to identify market segments that align with an agent’s strengths. Real estate or targeted area farming is about much more than just selling homes. It’s about cultivating relationships and establishing a reputation. Areas with higher rates may offer more clients to serve, but they also bring more competition.
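The metric itself is simple arithmetic: homes sold in the period divided by total homes in the farm area. A quick sketch, where the function name and numbers are invented for illustration:

```python
def turnover_rate(homes_sold, total_homes):
    """Turnover of a farm area over a period (usually a year), as a percentage."""
    if total_homes == 0:
        raise ValueError("farm area has no homes")
    return 100.0 * homes_sold / total_homes

# A 900-home neighborhood where 54 homes sold in the past year:
print(turnover_rate(54, 900))   # 6.0
```

An agent comparing candidate farm areas would compute this for each neighborhood over the same period and rank them.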
By analyzing specific segments within the turnover data, such as luxury homes or starter homes, agents can determine which niches they are best suited to serve. This targeted approach not only helps agents stand out among competitors but also ensures a higher rate of success by focusing on growing their client base. It also enables agents to create specialized marketing strategies tailored to the unique needs of each segment.

Using Postcards for Targeted Marketing

Postcards remain one of the most effective tools for real estate marketing, especially when paired with insights from turnover rates. Why are postcards so powerful? They provide a tangible, personal touch that other forms of marketing may lack. By analyzing turnover rates, agents can identify homes that are likely to be listed soon and send postcards to these potential sellers. These postcards can include market updates, personalized messages, or offers for a free home evaluation. By targeting properties in areas with high turnover, agents increase the chances of reaching the right people at the right time, maximizing the impact of each marketing effort.

How Turnover Rates Guide Marketing and Sales Strategies

Turnover rates also play a significant role in shaping marketing and sales strategies. How can agents leverage this data for optimal results? When turnover rates are high, agents can implement aggressive marketing campaigns to reach potential sellers. These strategies might include direct mailers, door hangers, and community events that increase the agent’s visibility. Conversely, in areas with lower turnover rates, agents may focus on building relationships with residents and networking with other professionals. In these cases, patience and persistence are essential, as cultivating long-term trust can eventually lead to a higher likelihood of listing opportunities. By adapting marketing strategies based on turnover trends, agents are better equipped to capture opportunities as they arise.
Turnover Rates as a Predictor of Future Market Conditions

In real estate, market conditions can change rapidly. Analyzing turnover rates not only helps agents assess the current state of the market but also serves as a valuable predictor of future trends. By tracking turnover data over time, agents can identify patterns, such as seasonal fluctuations or long-term market shifts. This foresight is invaluable when planning marketing and sales strategies. For instance, if turnover rates rise in the winter months, agents may want to adjust their campaigns accordingly, capitalizing on the expected surge in market activity. By understanding and anticipating these trends, agents can proactively position themselves to take advantage of shifting market conditions.

Using Turnover Data to Build Stronger Relationships with Clients

Beyond sales and marketing strategies, turnover rates can also help real estate agents build stronger relationships with clients. By staying in tune with an area’s turnover rates, agents can anticipate when homeowners are more likely to be thinking about selling or buying. This insight allows agents to reach out proactively, offering valuable market advice and personalized guidance. Establishing this rapport not only leads to more immediate business opportunities but also positions agents as trusted advisors. It fosters loyalty and long-term relationships, which are critical in real estate, where repeat business and referrals can often be the key to success. Through consistent follow-up and personalized communication, agents can nurture these relationships, ensuring they remain top-of-mind when clients are ready to move.

Building Long-Term Success with Data-Driven Insights

Ultimately, real estate or targeted area farming is about more than just immediate sales; it’s about building a sustainable business over time.
By consistently analyzing turnover rates and incorporating these insights into their business strategy, agents can position themselves as trusted experts in those areas. It’s about creating a balance between marketing efforts and relationship building. A high turnover rate can provide an influx of clients, but it’s the agent’s ability to nurture these relationships and provide value that will ensure long-term success. With a clear understanding of turnover trends, agents can remain ahead of the curve and continue to thrive in competitive markets. By aligning their strategies with data-driven insights, agents lay a solid foundation for future growth. Analyzing turnover rates is crucial for real estate agents seeking success in real estate farming. It offers insight into the potential of an area, guides marketing strategies, and helps agents target the right clients.

“Why I stopped using OpenBSD”

I’ve linked to quite a few posts by OpenBSD developer Solène Rapenne on OSNews, mostly about her work for and knowledge of OpenBSD. However, she recently posted about her decision to leave the OpenBSD team, and it mostly comes down to the fact that she hasn’t been using OpenBSD for a while now, due to a myriad of problems she’s been encountering. Posts like these are generally not that fun to link to, and I’ve been debating this one for a few days now, but I think highlighting such problems, especially when detailed by a now-former OpenBSD developer, is an important thing to do. Hardware compatibility is an issue because OpenBSD has no Bluetooth support, its gamepad support is fractured and limited, and most of all, battery life and heat are a major problem, as Solène notes that “OpenBSD draws more power than alternatives, by a good margin”. For her devops work, she also needs to run a lot of software in virtual machines, and this seems to be a big problem on OpenBSD, as performance in this area is limited. Lastly, OpenBSD seems to have stability issues, crashing a lot for her, and while this in and of itself is a big problem already, it’s compounded by the fact that OpenBSD’s file system is quite outdated, and most crashes will lead to corrupted or lost files, since the file system doesn’t have any features to mitigate this. I went through a similar, but obviously much shorter and far less well-informed, experience with OpenBSD myself. It’s such a neat, understandable, and well-thought-out operating system, but its limitations are obvious, and they will start to bother you sooner or later if you’re trying to use it as a general purpose operating system. While this is entirely understandable, because OpenBSD’s main goal is not the desktop, it still sucks, because everything else about the operating system is so damn nice and welcoming.
Solène found her alternative in Linux and Qubes OS: I moved from OpenBSD to Qubes OS for almost everything (except playing video games) on which I run Fedora virtual machines (approximately 20 VM simultaneously in average). This provides me better security than OpenBSD could provide me as I am able to separate every context into different spaces, this is absolutely hardcore for most users, but I just can’t go back to a traditional system after this. ↫ Solène Rapenne She lists quite a few Linux features she particularly likes and why, such as cgroups, systemd, modern file systems like Btrfs and ZFS, SELinux, and more. It’s quite rare to see someone of her calibre so openly list the shortcomings of a system she clearly otherwise loves and put a lot of effort into, and move to what is generally viewed with some disdain within the community she came from. It also highlights that the issues with running OpenBSD as a general purpose operating system are not confined to less experienced users such as myself, but extend to extremely experienced and knowledgeable people like actual OpenBSD developers. I’m definitely not advocating for OpenBSD to change course or make a hard pivot to becoming a desktop operating system, but I do think that even within the confines of a server operating system there’s room for at least things like a much improved and faster file system that provides the modern features server users expect, too.

Windows 365 Link: a thin client from Microsoft

One of my favourite devices that never took off in the home is the thin client. Whenever I look at a fully functional Sun Microsystems thin client setup, with Sun Rays, a Solaris server, and the smartcards instantly loading up your desktop the moment you slide one into the Ray’s slot, my mind wanders to the future we could’ve had in our homes – a powerful, expandable, capable server in the basement, running every family member’s software, and thin clients all throughout the house where family members can plug their smartcard in to load up their stuff. This is the future they took from us. Well, not entirely. They took this future, made it infinitely worse by replacing that big server in our basement with massive datacentres far away from us in the “cloud”, and threw it back in our faces as a shittier inevitability we all have to deal with. The fact this model relies on subscriptions is, of course, entirely coincidental, and not at all the main driving force behind taking our software away from us and hiding it in fortress-like datacentres. So anyway, Microsoft is launching a thin client that connects to a Windows VM running in the cloud. They took the perfection Sun gave us, shoved it down their throats, regurgitated it like a cow, and are now presenting it to us as the new shiny. It’s called the Windows 365 Link, and it connects to, as the name implies, Windows 365. Here’s part of the enterprise marketing speak: Today, as users take advantage of virtualization offerings delivered on an array of devices, they can face complex sign-in processes, peripheral incompatibility, and latency issues. Windows 365 Link helps address these issues, particularly in shared workspace scenarios. It’s compact, lightweight, and designed to maximize productivity with its highly responsive performance. It takes seconds to boot and instantly wakes from sleep, allowing users to quickly get started or pick up where they left off on their Cloud PC.
With dual 4K monitor support, four USB ports, an Ethernet port, Wi-Fi 6E, and Bluetooth 5.3, Windows 365 Link offers seamless connectivity with both wired and wireless peripherals. ↫ Anthony Smith at the Windows IT Pro Blog This is just a thin client, but worse, since it seemingly can only connect to Microsoft’s “cloud”, without the ability to connect to a server on-premises, which is a very common use case. In fact, you can’t even use another vendor’s tooling, so if you want to switch from Windows 365 to some other provider later down the line, you seemingly can’t – unless there are some BIOS switches or whatever you can flip. At the very least, Microsoft intends for other vendors to also make Link devices, so perhaps competition will bring the price down to a more manageable level than $349. Unless an enterprise environment is already so deep into the Microsoft ecosystem that it doesn’t rely on things like Citrix or any of the other countless providers of similar services, why would you buy thousands of these for your employees, only to lock your entire company into Windows 365? I’m no IT manager, obviously, so perhaps I’m way off base here, but this thing seems like a hard sell when there are so, so many alternative services, and so many thin client devices to choose from that can use any of those services.

FLTK 1.4.0 brings Wayland support

FLTK 1.4.0 has been released. This new version of the Fast Light Toolkit contains some major improvements, such as Wayland support on both Linux and FreeBSD. X11 and Wayland are both supported by default, and applications using FLTK will launch using Wayland if available, and otherwise fall back to starting with X11. This new release also brings HiDPI support on Linux and Windows, and improves said support on macOS. Those are the headline features, but there are more changes here, of course, as well as the usual round of bugfixes. Right after the release of 1.4.0, a quick bugfix release, version 1.4.0-1, was published to address an issue in 1.4.0 – a build error in a single test program on Windows when using Visual Studio. Not exactly a major bug, but great to see the team fix it so rapidly.
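The Wayland-first fallback described above can be sketched in a few lines. To be clear, this is purely an illustration of the selection logic – FLTK is a C++ library and performs this check internally – and the assumption that the decision hinges on the usual `WAYLAND_DISPLAY` and `DISPLAY` environment variables is mine:

```python
def pick_backend(env):
    """Illustrative sketch (not FLTK's real code) of the backend
    selection FLTK 1.4 is described as doing: prefer Wayland when
    a compositor is available, otherwise fall back to X11."""
    if env.get("WAYLAND_DISPLAY"):   # a Wayland compositor is running
        return "wayland"
    if env.get("DISPLAY"):           # an X11 server is available
        return "x11"
    return None                      # headless: no display at all

# A session exposing both servers prefers Wayland:
print(pick_backend({"WAYLAND_DISPLAY": "wayland-0", "DISPLAY": ":0"}))  # wayland
# A plain X11 session falls back:
print(pick_backend({"DISPLAY": ":0"}))  # x11
```

The nice part of this behaviour, whatever the exact mechanism, is that a single binary keeps working as distributions migrate their default sessions from X11 to Wayland.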

Why did Windows 95 setup use three operating systems?

Way back in April of this year, I linked to a question and answer about why some parts of the Windows 98 installer looked older than the other parts. It turns out that in between the MS-DOS (the blue part) and Windows 98 parts of the installation process, the installer boots into a small version of Windows 3.1. Raymond Chen posted an article detailing this process for Windows 95, and why, exactly, Microsoft had to resort to splitting the installer between MS-DOS, Windows 3.1, and Windows 95. The answer is, as always, backwards compatibility. Since Windows 95 could be installed from MS-DOS, Windows 3.1, and Windows 95 (to fix an existing installation), the installer needed to be able to work on all three. The easiest solution would be to write the installer as an MS-DOS program, since that works on all three of these starting points, but that would mean an ugly installer, even though Windows 95 was supposed to be most people’s first experience with a graphical user interface. This is why Microsoft ended up with the tiered installation process – to support all possible starting points in the most graphical way possible. Chen also mentions another fun fact that is somewhat related to this: the first version of Excel for Windows was shipped with a version of the Windows 2.1 runtime, so that even people without Windows could still run Excel. Even back then, Microsoft took backwards compatibility seriously, and made sure people who hadn’t upgraded from MS-DOS to Windows 2.x yet – meaning, everyone – could still enjoy the spreadsheet lifestyle. I say we pass some EU law forcing Microsoft to bring this back. The next version of Excel should contain whatever is needed to run it on MS-DOS. Make it happen, Brussels.

DOJ will push Google to sell Chrome to break search monopoly

Speaking of Google, the United States Department of Justice is pushing for Google to sell off Chrome. Top Justice Department antitrust officials have decided to ask a judge to force Alphabet Inc.’s Google to sell off its Chrome browser in what would be a historic crackdown on one of the biggest tech companies in the world. The department will ask the judge, who ruled in August that Google illegally monopolized the search market, to require measures related to artificial intelligence and its Android smartphone operating system, according to people familiar with the plans. ↫ Leah Nylen and Josh Sisco Let’s take a look at the history and current state of independent browsers, shall we? Netscape is obviously dead, Firefox is hanging on by a thread (which is inconspicuously shaped like a giant sack of money from Google), Opera is dead (its shady Chrome skin doesn’t count), Brave is cryptotrash run by a homophobe, and Vivaldi, while an actually good and capable Chrome skin with a ton of fun features, still isn’t profitable, so who knows how long they’ll last. As an independent company, Chrome wouldn’t survive. It seems the DoJ understands this, too, because they’re clearly using the words “sell off”, which would indicate selling Chrome to someone else instead of just spinning it off into a separate company. But who has both the cash and the interest in buying Chrome, without also being a terrible tech company with terrible business incentives that might make Chrome even more terrible than it already is? Through Chrome, Google has sucked all the air out of whatever was left of the browser market back when they first announced the browser. An independent Chrome won’t survive, and Chrome in anyone else’s hands might have the potential to be even worse. 
A final option out of left field would be turning Chrome and Chromium into a truly independent foundation or something, without a profit motive, focused solely on developing the Chromium engine, but that, too, would be easily abused by financial interests. I think the most likely outcome is one none of us want: absolutely nothing will happen. There’s a new administration coming to Washington, and if the recent proposed picks for government positions are anything to go by, America will be incredibly lucky if they get someone smarter than a disemboweled frog on a stick to run the DoJ. More likely than not, Google’s lawyers will walk all over whatever’s left of the DoJ after 20 January, or Pichai will simply kiss some more gaudy gold rings to make the case go away.

Google is reportedly killing Chrome OS in favour of Android

Mishaal Rahman, who has a history of being right about Google and Android-related matters, is reporting that Google is intending to standardise its consumer operating system efforts onto a single platform: Android. To better compete with the iPad as well as manage engineering resources more effectively, Google wants to unify its operating system efforts. Instead of merging Android and Chrome OS into a new operating system like rumors suggested in the past, however, a source told me that Google is instead working on fully migrating Chrome OS over to Android. While we don’t know what this means for the Chrome OS or Chromebook brands, we did hear that Google wants future “Chromebooks” to ship with the Android OS in the future. That’s why I believe that Google’s rumored new Pixel Laptop will run a new version of desktop Android as opposed to the Chrome OS that you’re likely familiar with. ↫ Mishaal Rahman at Android Authority The fact both Chrome OS and Android exist, and are competing with each other in some segments – most notably tablets – hasn’t done either operating system any favours. I doubt many people even know Chrome OS tablets are a thing, and I doubt many people would say Android tablets are an objectively better choice than an iPad. I personally definitely prefer Android on tablets over iOS on tablets, but I fully recognise that for 95% of tablet buyers, the iPad is the better, and often also more affordable, choice. Google has been struggling with Android on tablets for about as long as they’ve existed, and now it seems that the company is going to focus all of its efforts on just Android, leaving Chrome OS to slowly be consumed and replaced by it. 
In June, Google already announced it was going to replace both the kernel and several subsystems in Chrome OS with their Android counterparts, and now it’s also building a new version of Chrome for Android with extension support – to match Chrome on Chrome OS – as well as a terminal application for Android that gives access to a local Linux virtual machine, much like what’s available on Chrome OS. As mentioned, laptops running Android will also be making an entrance, including a Pixel laptop straight from Google. The next big update for Android 15 contains a ton of new proper windowing features, and there’s more coming: improved keyboard and mouse support, as well as external monitors, virtual desktops, and a lot more. As anyone who has ever attempted to run Android on a desktop or laptop knows, there’s definitely a ton of work Google needs to do to make Android palatable to consumers on that front. Of course, this being Google, any of these rumours or plans could change at any time without any sense of logic behind it, as managers fulfil their quotas, get promoted, or leave the company.

iOS 18.1 will reboot iPhones to a locked state after 72 hours of inactivity

In recent weeks, law enforcement in the United States discovered, to their dismay, that iPhones were automatically rebooting themselves after a few days of inactivity, thereby denying them access to the contents of these phones. After a lot of speculation online, Jiska Classen dove into this story to find out what was going on, and through reverse-engineering discovered that this was a new security feature built by Apple as part of iOS 18.1, to further make stolen iPhones useless to both thieves and law enforcement officers. It’s a rather clever feature. The Secure Enclave Processor inside the iPhone keeps track of when the phone was last unlocked, and if that period exceeds 72 hours, the SEP will inform a kernel module. This kernel module will then, in turn, tell the phone to gracefully reboot, meaning no data is lost in this process. If the phone for whatever reason does not reboot and remains powered on, the module will assume the phone’s been tampered with somehow and kernel-panic. Interestingly, if the reboot takes place properly, an analytics report stating how long the phone was not unlocked will be sent to Apple. The reason this is such a powerful feature is that a locked iPhone is entirely useless to anyone who doesn’t have the right code or biometrics to unlock it. Everything on the device is encrypted, and only properly unlocking it will decrypt the phone’s contents; in fact, a locked phone can’t even join a Wi-Fi network, because the stored passwords are encrypted (and I’m assuming a locked phone doesn’t provide access to any methods of joining an open network either). When you have a SIM card without a PIN code, the iPhone will connect to the cellular network, but any notifications or calls coming in will effectively be empty, since incoming phone numbers can’t be linked to any of the still-encrypted contacts, and while the phone can tell it’s received notifications, it can’t show you any of their contents.
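The reported reboot flow – track the last unlock, request a graceful reboot after 72 hours, panic if that reboot never happens – can be modelled as a small state machine. Everything below is my own illustrative sketch of the behaviour as reverse-engineered and reported; the class, the method names, and the grace period are invented, since Apple’s actual SEP and kernel code are not public:

```python
INACTIVITY_LIMIT = 72 * 60 * 60  # 72 hours, per the reported iOS 18.1 behaviour

class InactivityRebootMonitor:
    """Toy model of the reported flow: the SEP tracks the last unlock,
    asks for a graceful reboot once 72 hours pass, and panics if that
    reboot never happens (tampering is assumed)."""

    def __init__(self, now=0):
        self.last_unlock = now
        self.reboot_requested_at = None

    def on_unlock(self, now):
        # Any successful unlock resets the clock
        self.last_unlock = now
        self.reboot_requested_at = None

    def tick(self, now, grace=300):
        # 'grace' is a hypothetical window for the reboot to complete
        if self.reboot_requested_at is None:
            if now - self.last_unlock > INACTIVITY_LIMIT:
                self.reboot_requested_at = now
                return "request_graceful_reboot"
            return "idle"
        if now - self.reboot_requested_at > grace:
            return "kernel_panic"  # reboot didn't happen: assume tampering
        return "waiting_for_reboot"

m = InactivityRebootMonitor(now=0)
print(m.tick(now=60 * 60))                 # idle
print(m.tick(now=INACTIVITY_LIMIT + 1))    # request_graceful_reboot
print(m.tick(now=INACTIVITY_LIMIT + 1000)) # kernel_panic
```

The key design point the sketch captures is that the graceful reboot is the happy path, and the kernel panic is a deliberate fail-closed fallback rather than a crash.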
A thief who’s now holding this phone can’t do much with it if it locks itself like this after a few days, and law enforcement won’t be able to access the phone either. This is a big deal in places where arrests based purely on skin colour or ethnicity or whatever are common, like in the United States (and in Europe too, just to a far lesser degree), or in places where people have to fear the authorities for other reasons, like in totalitarian dictatorships such as Russia, China, or Iran, where any hint of dissent can land you in a harsh prison. Apple is always at the forefront with features such as these, with Google and Android drunkenly stumbling into the open door a year later with copies that take ages to propagate through the Android user base. I’m legitimately thankful for Apple raising awareness of the need for features such as these – even if they’re too cowardly to enable them in places like China – as it’s quite clear a lot more people need to start caring about these things, with recent developments and all.