They tried to keep it from prying eyes, but several people noticed it anyway: Google made a pretty significant policy change regarding the use of fingerprinting by advertisers. Until now, Google did not allow advertisers to use digital fingerprinting; that prohibition is being dropped. Google really tried to bury this change. The main support article explaining the reasoning behind it is intentionally obtuse and nebulous, and doesn’t even link to the actual policy changes being implemented – those are found in a separate document. Google doesn’t highlight its changes there either, so you have to compare the two versions of the policy yourself. Google claims this change is necessary because of “advances in privacy-enhancing technologies (PETs) such as on-device processing, trusted execution environments, and secure multi-party computation” and “the rise of new ad-supported devices and platforms”. What I think this word salad means is that users are regaining a modicum of privacy thanks to specific privacy-preserving features in certain operating systems and on certain devices, and that the use of dedicated, siloed streaming services – which are harder for Google and advertisers to track – is increasing. In other words, Google is relaxing its rules on fingerprinting because we’re all getting more conscious about privacy. In any event, the advice remains the same: use ad-blockers, preferably at the network level. Install adblocking software and extensions, set up a Pi-hole, or turn on any adblocking features in your router (my Ubiquiti router has it built in, and it works like a charm). Remember: your device, your rules. If you don’t want to see ads, you don’t have to.
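To make that advice a little more concrete: a Pi-hole is, at its core, a DNS resolver that answers queries for known ad and tracking domains with a dead address, so ad requests die before they ever leave your network. Here’s a minimal sketch of the mechanism in dnsmasq syntax, the resolver Pi-hole builds on – the two domains are purely illustrative, where real blocklists run to hundreds of thousands of entries:

```
# Answer queries for these domains (and all their subdomains) with 0.0.0.0,
# so the ad content they serve can never be fetched.
address=/doubleclick.net/0.0.0.0
address=/googlesyndication.com/0.0.0.0
```

Browser extensions and router-level blockers do essentially the same thing, just at different points in the chain.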
In recent months, we’ve talked twice about FM Towns, Fujitsu’s PC platform aimed solely at the Japanese market. It was almost entirely Japanese-only, which makes it difficult to explore for those of us who don’t speak the language. There’s an effort underway to recreate it as open source, which will most likely take a while, but in the meantime, a part of the FM TOWNS Technical Databook, written by Noriaki Chiba, has been translated from Japanese into English by Joe Groff. From the book’s preface: That is why the author wrote this book, to serve as an essential manual for enthusiasts, systems developers, and software developers. Typical computer manuals do not adequately cover technical specifications, so users tend to have a hard time understanding the hardware. We have an opportunity to thoroughly break through this barrier, and with this new hardware architecture being a milestone in the FM series, it feels like the perfect time to try. Hardware manuals up to this point have typically only explained the consequences of the hardware design without explaining its fundamentals. By contrast, this book describes the hardware design of the TOWNS from its foundations. Since even expert systems developers can feel like amateurs when working with devices outside of their repertoire, this book focuses on explaining those foundations. This is especially necessary for the FM TOWNS, since it features so many new devices, including a 80386 CPU and a CD-ROM drive. ↫ Noriaki Chiba This handbook goes into great detail about the inner workings of the hardware, and chapter II, which hasn’t been translated yet, also dives deep into the machine’s BIOS, from its first revisions to all the additional features added on top as time progressed. This book, as well as its translation, will be invaluable to people trying to use Towns OS today, to developers working on emulators for the platform, and to anyone who fits somewhere in between. It seems this translation was done entirely in Groff’s free time as a side project, which is commendable. We’re looking at about 65,000 words in the target language, of a highly technical nature, all translated for free because someone decided it was worth it. Sending this over to a translation agency would most likely cost well over €10,000. Of course, that price would include additional editing and proofreading by parties other than the initial translator(s), but that’s not strictly needed for a passion project like this. Excellent, valuable work.
When someone says you’re biased against them because you object to their stated goal of removing you from society, they’re not actually asking for fairness — they’re demanding complicity. It’s the political equivalent of asking why the gazelle seems to have such a negative view of lions. Think about the underlying logic here: I’m biased because I don’t give equal weight to both sides of a debate about my fundamental rights. I’m biased because I notice patterns in political movements that explicitly state their intentions regarding people like me. I’m biased because I take them at their word when they tell me what they plan to do. ↫ Joan Westenberg OSNews and I will always stand for the right of transgender people to exist, and to enjoy the exact same rights and privileges as any other member of society. This is non-negotiable.
Enlightenment 0.27.0 has been released, and we’ve got some highly informative release notes. This is the latest release of Enlightenment. This has a lot of fixes mostly with some new features. ↫ Carsten Haitzler That’s it. That’s the release notes. Digging into the commit history between 0.26.0 and 0.27.0 gives some more information: a lot of work has been done on the CPU frequency applet (hopefully including making it work properly on FreeBSD), a slew of translations have been updated, there’s been some RandR work, and a ton of other small changes have landed. Do any of you use Enlightenment on a daily basis? I’m actually intrigued by this release, as it’s the first one in two years, and aside from historical usage decades ago – like many of us, I assume – I haven’t really spent any time with its current incarnation.
Two years ago, Twitch streamer albrot discovered a bug in the code for crossing rivers. One of the options is to “wait to see if conditions improve”; waiting a day will consume food but not recalculate any health conditions, granting your party immortality. From this conceit the Oregon Trail Time Machine was born: a multi-day livestream of the game as the party waits for conditions to improve at the final Snake River crossing until the year 10000, to see if the withered travellers can make it to the ruins of ancient Oregon. The first attempt ended in tragedy; no matter what albrot tried, the party would succumb to disease and die almost immediately. A couple of days before New Year’s Eve 2025, albrot reached out and asked if I knew anything about Apple II hacking. ↫ Scott Percival It may have required some reverse engineering and hackery, but yes, you can reach the ruins of Oregon in the year 16120.
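The game’s actual code is 6502-era Apple II material, so purely as an illustration of the class of bug being described, here’s a small C sketch in which one menu branch advances time without ever running the daily health update – hypothetical names and numbers throughout:

```c
#include <stdio.h>

typedef struct { int day, food, health; } party_t;

/* Hypothetical daily penalty model; the real game's is more involved. */
static int disease_penalty(const party_t *p) { return p->food < 0 ? 3 : 1; }

static void travel_day(party_t *p) {
    p->day++;
    p->food -= 5;
    p->health -= disease_penalty(p);   /* health is recalculated */
}

static void wait_day(party_t *p) {
    p->day++;
    p->food -= 5;
    /* BUG: the health recalculation is skipped in this branch,
     * so waiting consumes food but can never kill anyone. */
}

int main(void) {
    party_t p = { .day = 0, .food = 100, .health = 10 };
    travel_day(&p);                    /* one normal day: health drops */
    for (int i = 0; i < 3000; i++)
        wait_day(&p);                  /* ~8 years of waiting: health frozen */
    printf("day %d, food %d, health %d\n", p.day, p.food, p.health);
    return 0;
}
```

One branch updates the full state, the other forgets a field – every sufficiently old program has one of these lurking somewhere.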
Mastodon, the only remaining social network that isn’t a fascist hellhole like Twitter or Facebook, is changing its legal and operational foundation to a proper European non-profit. Simply, we are going to transfer ownership of key Mastodon ecosystem and platform components (including name and copyrights, among other assets) to a new non-profit organization, affirming the intent that Mastodon should not be owned or controlled by a single individual. It also means a different role for Eugen, Mastodon’s current CEO. Handing off the overall Mastodon management will free him up to focus on product strategy where his original passion lies and he gains the most satisfaction. ↫ Official Mastodon blog Eugen Rochko has always been clear and steadfast about Mastodon not being for sale and not accepting any outside investments despite countless offers, and after eight years of both creating and running Mastodon, it makes perfect sense to move the network and its assets to a proper European non-profit. Mastodon’s control over the entire federated ActivityPub network – the Fediverse – is actually quite limited, so it’s not like the network depends on Mastodon, but there’s no denying it’s the most well-known part of the Fediverse. The Fediverse is the only social network on which OSNews is actively present (and myself, too, for that matter). By “actively present” I only mean I’m keeping an eye on any possible replies; the feed itself consists exclusively of links to our stories as soon as they’re published, and that’s it. Everything else you might encounter on social media is either legacy cruft we haven’t deleted yet, or something a third party set up that we don’t control. RSS makes it easy for people to set up third-party, unaffiliated accounts on any social medium posting links to our stories, and that’s entirely fine, of course. However, corporate social media controlled by the irrational whims of delusional billionaires with totalitarian tendencies is not something we want to be a part of, so aside from visiting OSNews.com and using our RSS feeds, the only other official way to follow OSNews is on Mastodon.
It’s hard to see how to move forward from here. I think the best bet would be for people to rally around a new community-driven infrastructure. This would likely require a fork of WordPress, though, and that’s going to be messy. The current open source version of WordPress relies on the sites and services Mullenweg controls. Joost de Valk, the original creator of an extremely popular SEO plugin, wrote a blog post with some thoughts on the matter. I’m hoping that more prominent people in the community step up like this, and that some way forward can be found. Update: Moments after posting this, I was pointed to a story on TechCrunch about Mullenweg deactivating the WordPress.org accounts of users planning a “fork”. This after he previously promoted (though in a slightly mocking way) the idea of forking open source software. In both cases, the people he mentioned weren’t actually planning forks, but musing about future ways forward for WordPress. Mullenweg framed the account deactivations as giving people the push they need to get started. Remember that WordPress.org accounts are required to submit themes, plugins, or core code to the WordPress project. These recent events really make it seem like you’re no longer welcome to contribute to WordPress if you question Matt Mullenweg. ↫ Gavin Anderegg I haven’t wasted a single word on the ongoing WordPress drama yet, but the longer Matt Mullenweg, Automattic’s CEO and thus owner of WordPress, keeps losing his mind, the harder it becomes to ignore the matter. OSNews runs, after all, on WordPress – self-hosted, at least, so not on Mullenweg’s WordPress.com – and if things keep going the way they are, I simply don’t know if WordPress remains a viable, safe, and future-proof CMS for OSNews. I haven’t discussed this particular situation with OSNews owner David Adams yet, mostly since he’s quite hands-off in the day-to-day operations and has more than enough other matters to take care of, but I think the time has come to start planning for a potential worst-case scenario in which Mullenweg takes even more of whatever he’s taking and WordPress implodes entirely. Remember – even if you self-host WordPress outside of Automattic, several core infrastructure parts of WordPress still run through Automattic, so we’re still dependent on what Mullenweg does or doesn’t do. I have no answers, announcements, or even plans at this point, but if you or your company depend on WordPress, you might want to start thinking about where to go from here.
One of the innovations that the V7 Bourne shell introduced was built in shell wildcard globbing, which is to say expanding things like *, ?, and so on. Of course Unix had shell wildcards well before V7, but in V6 and earlier, the shell didn’t implement globbing itself; instead this was delegated to an external program, /etc/glob (this affects things like looking into the history of Unix shell wildcards, because you have to know to look at the glob source, not the shell). ↫ Chris Siebenmann I never knew expanding wildcards in UNIX shells was once done by a separate program, but if you stop and think about the original UNIX philosophy, it kind of makes sense. On a slightly related note, I’m currently very deep into setting up, playing with, and actively using HP-UX 11i v1 on the HP c8000 I was able to buy thanks to countless donations from you all, OSNews readers, and one of the things I want to get working is email in dtmail, the CDE email program. However, dtmail is old, and wants you to do email the UNIX way: instead of dtmail retrieving and sending email itself, it expects other programs to handle those tasks for it. In other words, to set up and use dtmail (instead of relying on a 2010 port of Thunderbird), I’ll have to learn how to set up things like sendmail, fetchmail, or alternatives to those tools. Those programs will in turn deliver the emails to a local mailbox for dtmail to work with. Configuring these tools could very well be above my paygrade, but I’ll do my best to try and get it working – I think it’s more authentic to use something like dtmail than a random Thunderbird port. In any event, this, too, feels very UNIX-y, much like delegating wildcard expansion to a separate program. What this also shows is that the “UNIX philosophy” was subject to erosion from the very beginning, and really isn’t a modern phenomenon like many people seem to imply. I doubt many of the people complaining about the demise of the UNIX philosophy today even knew wildcard expansion used to be done by a separate program.
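To get a feel for how that delegation worked, here’s a rough modern imitation of the idea behind /etc/glob – not the original V6 code, which among other differences complained “No match” when nothing matched – written against POSIX glob(3): expand the wildcard arguments, then exec the real command with the expanded list.

```c
/* glob-ish: expand wildcard arguments, then run the given command.
 * Usage: ./glob-ish ls '*.c' '*.h'
 * A sketch of the concept only; V6's /etc/glob long predates glob(3). */
#include <glob.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, char *argv[]) {
    if (argc < 3) {
        fprintf(stderr, "usage: %s command pattern...\n", argv[0]);
        return 2;
    }
    glob_t g = {0};
    g.gl_offs = 1;                          /* reserve slot 0 for the command */
    int flags = GLOB_DOOFFS | GLOB_NOCHECK; /* pass unmatched patterns through */
    for (int i = 2; i < argc; i++) {
        glob(argv[i], flags, NULL, &g);
        flags |= GLOB_APPEND;               /* accumulate across patterns */
    }
    g.gl_pathv[0] = argv[1];                /* command name into reserved slot */
    execvp(argv[1], g.gl_pathv);            /* replace ourselves with command */
    perror(argv[1]);
    return 127;
}
```

As for the dtmail plumbing, a minimal ~/.fetchmailrc along these lines is roughly where I’d start – assuming a fetchmail port is available for HP-UX, and with the hostname, account, and delivery agent below being nothing more than placeholders (fetchmail also insists the file be chmod 600):

```
# Poll an IMAP server and hand each message to a local delivery agent,
# which drops it into the local mailbox dtmail reads.
poll imap.example.com
    protocol imap
    username "thom"
    password "not-my-real-password"
    ssl
    mda "/usr/bin/procmail -d %T"
```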
Many moons ago, around the time when Andreas formally resigned from being Serenity’s BDFL, I decided that I want to get involved in the project more seriously. Looking at it from a perspective of “what do I not like about this (codebase)”, the first thing that came to mind was that it runs HERE points at QEMU and not THERE points at real hardware. Obvious oversight, let’s fix it. ↫ sdomi There’s no way for me to summarise this cursed saga, so just follow the lovely link and read it. It’s a meandering story of complexity, but eventually, a corrupted graphical session appeared. Now the real work starts.
Don’t you just love it when companies get together under the thin guise of open source to promote their own interests? Today Google is pleased to announce our partnership with The Linux Foundation and the launch of the Supporters of Chromium-based Browsers. The goal of this initiative is to foster a sustainable environment of open-source contributions towards the health of the Chromium ecosystem and financially support a community of developers who want to contribute to the project, encouraging widespread support and continued technological progress for Chromium embedders. The Supporters of Chromium-based Browsers fund will be managed by the Linux Foundation, following their long established practices for open governance, prioritizing transparency, inclusivity, and community-driven development. We’re thrilled to have Meta, Microsoft, and Opera on-board as the initial members to pledge their support. ↫ Shruthi Sreekanta on the Chromium blog First, there’s absolutely no way around the fact that this entire effort is designed to counter some of the antitrust actions against Google, including a possible forced divestment of Chrome. By setting up an additional fund atop the Chromium organisation, placed under the management of the Linux Foundation, Google creates the veneer of more independence for Chromium than there really is. In reality, Chromium is very much a Google-led project, with 94% of code contributions coming from Google. And with the Linux Foundation itself being very much a corporate affair, of which Google is a member, one has to wonder just how much it means that the Linux Foundation is managing this new fund. Second, the initial members of this fund don’t exactly instill confidence in the fund’s morals and values. We’ve got Google, the largest online advertising company in the world. Then there’s Facebook, another major online advertising company, followed by Microsoft, which, among other business ventures, is also a major online advertising company. Lastly we have Opera, an NFT and cryptoscammer making money through predatory loans in poor countries. It’s a veritable who’s who of some of the companies you least want near anything related to your browsing experience. I highly doubt a transparently self-serving effort like this is going to convince any judge or antitrust regulator to back down. It’s clear this fund is designed almost exclusively for optics, with an obvious bias towards online advertising companies intent on making the internet worse, rather than towards the companies and people trying to make it better.
VLC media player, the popular open-source software developed by nonprofit VideoLAN, has topped 6 billion downloads worldwide and teased an AI-powered subtitle system. The new feature automatically generates real-time subtitles — which can then also be translated in many languages — for any video using open-source AI models that run locally on users’ devices, eliminating the need for internet connectivity or cloud services, VideoLAN demoed at CES. ↫ Manish Singh at TechCrunch VLC is choosing to throw users who rely on subtitles for accessibility or translation reasons under the bus. Using speech-to-text and even “AI” as a starting point for a proper accessibility expert or translator is fine, and can greatly reduce the workload. However, as anyone who works with STT and “AI” translation software knows, their output is highly variable and wildly unreliable, especially once English isn’t involved. Dumping the raw output of these tools onto people who rely on closed captions and subtitles to even be able to view videos is not only lazy, it’s deeply irresponsible, and it demonstrates a complete lack of respect and understanding. I was a translator for almost 15 years, with two university degrees on the subject to show for it. This is obviously a subject close to my heart, and the complete and utter lack of respect and understanding from Silicon Valley and the wider technology world for proper localisation and translation has been a thorn in my side for decades. We all know about bad translations, but it goes much deeper than that – with Silicon Valley’s utter disregard for multilingual people drawing most of my ire. Despite about 60 million people in the US alone using both English and Spanish daily, software still almost universally assumes you speak only one language at all times, often forcing fresh installs for something as simple as changing a single application’s language, or not even allowing autocorrect on a touch keyboard to work with multiple languages simultaneously. I can’t even imagine how bad things are for people who, for instance, require closed captions for accessibility reasons. Imagine just how bad the “AI”-translated Croatian closed captions on an Italian video are going to be – that’s two levels of “AI” brainrot between the source and the Croatian user. It seems subtitles and closed captions are going to be the next area where technology companies slash costs, without realising – or, more likely, without giving a shit – that this will hurt users who require accessibility or translations more than anything. Seeing even an open source project like VLC jump onto this bandwagon is disheartening, but not entirely unexpected – the hype bubble is inescapable, and a lot more respected projects are going to throw their users under the bus before this bubble pops. …wait a second. Why is VLC at CES in the first place?
On Monday at CES 2025, Nvidia unveiled a desktop computer called Project DIGITS. The machine uses Nvidia’s latest “Blackwell” AI chip and will cost $3,000. It contains a new central processor, or CPU, which Nvidia and MediaTek worked to create. Responding to an analyst’s question during an investor presentation, Huang said Nvidia tapped MediaTek to co-design an energy-efficient CPU that could be sold more widely. “Now they could provide that to us, and they could keep that for themselves and serve the market. And so it was a great win-win,” Huang said. Previously, Reuters reported that Nvidia was working on a CPU for personal computers to challenge the consumer and business computer market dominance of Intel, Advanced Micro Devices and Qualcomm. ↫ Stephen Nellis at Reuters I’ve long wondered why NVIDIA wasn’t entering the general purpose processor market in a more substantial way than it did a few years ago with the Tegra, especially now that ARM has cemented itself as an architecture choice for more than just mobile devices. Much like Intel, AMD, and now Qualcomm, NVIDIA could easily deliver the whole package to laptop, tablet, and desktop makers: processor, chipset, and GPU, glued together, of course, with special NVIDIA magic that other companies opting to use NVIDIA GPUs won’t get. There’s a lot of money to be made there, and it’s the move that could help NVIDIA survive the inevitable crash of the “AI” wave it’s currently riding, which has pushed the company to become one of the most valuable companies in the world. I’m also sure OEMs would love nothing more than to have more than just Qualcomm to choose from for ARM laptops and desktops, if only to aid in bringing costs down through competition, and to potentially offer ARM devices with the same kind of powerful GPUs currently mostly reserved for x86 machines. I’m personally always for more competition, but this time with the asterisk that NVIDIA really doesn’t need to get any bigger than it already is. The company has a long history of screwing over consumers, and I doubt that would change if it also conquered a chunky slice of the general purpose processor market.
So we all know about twisted-pair ethernet, huh? I get a little frustrated with a lot of histories of the topic, like the recent neil breen^w^wserial port video, because they often fail to address some obvious questions about the origin of twisted-pair network cabling. Well, I will fail to answer these as well, because the reality is that these answers have proven very difficult to track down. ↫ J. B. Crawford Nailing down an accurate history of the various standards, ideas, concepts, and implementations of Ethernet and other, by now dead, networking technologies is hard because of their age, and because their story is entangled with the even longer history of telephone wiring. The reasoning behind some of the choices made by engineers over the past century and more of telephone technology isn’t always clear, and can be very difficult to retrace. Crawford dives into some seriously old and fun history here, trying to piece together the origins of twisted pair as best he can. It’s a great read, as all of his writings are.
Hey there! In this book, we’re going to build a small operating system from scratch, step by step. You might get intimidated when you hear OS or kernel development, but the basic functions of an OS (especially the kernel) are surprisingly simple. Even Linux, which is often cited as a huge open-source project, was only 8,413 lines in version 0.01. Today’s Linux kernel is overwhelmingly large, but it started with a tiny codebase, just like your hobby project. We’ll implement basic context switching, paging, user mode, a command-line shell, a disk device driver, and file read/write operations in C. Sounds like a lot; however, it’s only 1,000 lines of code! ↫ Seiya Nuta It’s exactly what it says on the tin.
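The book implements all of this inside its own kernel; purely as a userspace illustration of what “context switching” boils down to – saving one set of registers and stack, restoring another – here’s a minimal sketch using POSIX ucontext, which is not the book’s code but shows the flavour of the mechanism:

```c
#include <stdio.h>
#include <ucontext.h>

static ucontext_t main_ctx, task_ctx;
static char task_stack[64 * 1024];      /* the task's private stack */

static void task(void) {
    puts("task: running on my own stack");
    swapcontext(&task_ctx, &main_ctx);  /* save self, switch back to main */
    puts("task: resumed where I left off");
}

int main(void) {
    getcontext(&task_ctx);
    task_ctx.uc_stack.ss_sp = task_stack;
    task_ctx.uc_stack.ss_size = sizeof task_stack;
    task_ctx.uc_link = &main_ctx;       /* where to go when task() returns */
    makecontext(&task_ctx, task, 0);

    puts("main: switching to task");
    swapcontext(&main_ctx, &task_ctx);  /* save main, run task */
    puts("main: back in main, resuming task");
    swapcontext(&main_ctx, &task_ctx);  /* resume task after its yield */
    puts("main: done");
    return 0;
}
```

A kernel does the same dance with raw registers and per-process kernel stacks, minus the convenient libc wrapper.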
We’ve all had a good seven years to figure out why our interconnected devices refused to work properly with the HDMI 2.1 specification. The HDMI Forum announced at CES today that it’s time to start considering new headaches. HDMI 2.2 will require new cables for full compatibility, but it has the same physical connectors. Tiny QR codes are suggested to help with that, however. The new specification is named HDMI 2.2, but compatible cables will carry an “Ultra96” marker to indicate that they can carry 96Gbps, double the 48Gbps of HDMI 2.1b. The Forum anticipates this will result in higher resolutions and refresh rates and a “next-gen HDMI Fixed Rate Link.” The Forum cited “AR/VR/MR, spatial reality, and light field displays” as benefiting from increased bandwidth, along with medical imaging and machine vision. ↫ Kevin Purdy at Ars Technica I’m sure this will not pose any problems whatsoever, and that no shady no-name manufacturers will abuse this situation at all. DisplayPort is the better standard and connector anyway. No, I will not be taking questions.
NESFab is a new programming language for creating NES games. Designed with 8-bit limitations in mind, the language is more ergonomic to use than C, while also producing faster assembly code. It’s easy to get started with, and has a useful set of libraries for making your first — or hundredth — NES game. ↫ NESFab website NESFab has some smart features developers of NES games will certainly appreciate, most notably automatic bank switching. Instead of requiring you to do this manually, NESFab will automatically carve your code and data up into banks, which are switched in and out of memory when needed. There’s also an optional map editor, which makes it very easy to create additional levels for your game. All in all, a very cool project I hadn’t heard of, which also claims to produce faster code than other compilers. If you’ve ever considered making an NES game, NESFab might be a tool to consider.
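For those who have never written for the NES: the CPU can only see 32K of cartridge PRG-ROM at a time, so anything bigger has to be swapped in and out through mapper hardware on the cartridge. What NESFab automates is roughly the following, sketched here by hand in cc65-flavoured C for a UxROM-style mapper – an illustration of the concept, not actual NESFab output:

```c
#include <stdint.h>

/* On UxROM boards, writing anywhere in $8000-$FFFF selects which 16K
 * PRG bank appears in the $8000-$BFFF window; $C000-$FFFF stays fixed. */
#define BANK_SELECT (*(volatile uint8_t *)0x8000)

/* Real hardware has a catch: bus conflicts mean the written value must
 * match the ROM byte at that address, usually solved with a small
 * identity table in ROM. Glossed over here for brevity. */
static void set_prg_bank(uint8_t bank) {
    BANK_SELECT = bank;
}

void enter_level_two(void) {
    set_prg_bank(2);    /* map in the bank holding level two's code/data */
    /* ...then call into it, making sure the caller itself lives in the
     * fixed bank so it doesn't vanish mid-switch. */
}
```

Deciding which code and data land in which bank, and making sure callers never switch themselves out from under their own feet, is exactly the bookkeeping NESFab does for you.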
An OPO (compiled OPL) interpreter written in Lua and Swift, based on the Psion Series 5 era format (ie ER5, prior to the Quartz 6.x changes). It lets you run Psion 5 programs written in OPL on any iOS device, subject to the limitations described below. ↫ OpoLua GitHub page If you’re pining for a Psion Series 5, but don’t want to deal with the hassle of owning and maintaining a real one, here’s a solution for iOS users. Incredibly neat, but with one limitation: only pure OPL programs work. Any program that also contains native ARM code will not work.
Dell has announced it’s rebranding literally its entire product line, so mainstays like XPS, Latitude, and Inspiron are going away. They’re replacing all of these old brands with Dell, Dell Pro, and Dell Pro Max, and within each of these, there will be three tiers: Base, Plus, and Premium. Of course, the reason is “AI”. The AI PC market is quickly evolving. Silicon innovation is at its strongest and everyone from IT decision makers to professionals and everyday users are looking at on-device AI to help drive productivity and creativity. To make finding the right AI PC easy for customers, we’ve introduced three simple product categories to focus on core customer needs – Dell (designed for play, school and work), Dell Pro (designed for professional-grade productivity) and Dell Pro Max (designed for maximum performance). We’ve also made it easy to distinguish products within each of the new product categories. We have a consistent approach to tiering that lets customers pinpoint the exact device for their specific needs. Above and beyond the starting point (Base), there’s a Plus tier that offers the most scalable performance and a Premium tier that delivers the ultimate in mobility and design. ↫ Kevin Terwilliger on Dell’s blog Setting aside the nonsensical reasoning behind the rebrand, I do actually kind of dig the simplicity here. This is a simple, straightforward set of brand names and tiers that pretty much anyone can understand. That being said, the issue with Dell in particular is that once you go to their website to actually buy one of their machines, the clarity abruptly ends and it gets confusing fast. I hope these new brand names and tiers will untangle some of that mess to make it easier to find what you need, but I’m skeptical. My XPS 13 from 2017 is really starting to show its age, and considering how happy I’ve been with it over the years, its current Dell equivalent would be a top contender (assuming I had the finances). I wonder if the Linux support on current Dell laptops has improved since my XPS 13 was new.
Over 60% of Windows users are still on Windows 10, while only about 35% – a share that’s falling, no less – have opted for Windows 11. As we’ve talked about many times before, this is a major issue going into 2025, since Windows 10’s support will end in October of this year, meaning hundreds of millions of people all over the world will suddenly be running an operating system that no longer receives security updates. Most of those people don’t want to, or cannot, upgrade to Windows 11, meaning Microsoft is leaving 60% of its Windows customer base out to dry. I’m sure this will go down just fine with regulators and governments the world over. Microsoft has tried everything, and it’s clear desperation is setting in, because the company just declared 2025 “The year of the Windows 11 PC refresh”, stating that Windows 11 is the best way to get all the “AI” stuff people are clearly clamoring for. All of the innovation arriving on new Windows 11 PCs is coming at an important time. We recently confirmed that after providing 10 years of updates and support, Windows 10 will reach the end of its lifecycle on Oct. 14, 2025. After this date, Windows 10 PCs will no longer receive security or feature updates, and our focus is on helping customers stay protected by moving to modern new PCs running Windows 11. Whether the current PC needs a refresh, or it has security vulnerabilities that require the latest hardware-backed protection, now is the time to move forward with a new Windows 11 PC. ↫ Some overpaid executive at Microsoft What makes this so incredibly aggravating and deeply tone-deaf is that for most of the people affected, “upgrading” to Windows 11 simply isn’t a realistic option. Their current PC is most likely performing just fine, but the steep and strict hardware requirements prohibit them from installing Windows 11. Buying an entirely new PC is often not needed from a performance perspective, and for many, many people it’s simply unaffordable. In case you haven’t noticed, things aren’t exactly going great, financially, for a lot of people out there; in the US alone, 70-80% of people live paycheck to paycheck, and they’re certainly not going to “move forward with a new Windows 11 PC” for nebulous and often regressive “benefits” like “AI”. The fact that Microsoft seems to think all of those hundreds of millions of people not only want to buy a new PC to get “AI” features, but can also afford it like it’s no big deal, shows a real disconnect between the halls of Microsoft’s headquarters and the wider world. Microsoft’s utter lack of a grasp on the financial realities of so many individuals and families today is shocking, at best, and downright offensive, at worst. I guess if you live in a world where you can casually bribe a president-elect with one million dollars, buying a new computer feels like buying a bag of potatoes.
The more than two decades since Half-Life 2‘s release have been filled with plenty of rumors and hints about Half-Life 3, ranging from the official-ish to the thin to the downright misleading. As we head into 2025, though, we’re approaching something close to a critical mass of rumors and leaks suggesting that Half-Life 3 is really in the works this time, and could be officially announced in the coming months. ↫ Kyle Orland at Ars Technica We should all be skeptical of anything related to Half-Life 3, but there’s no denying something’s buzzing. The one reason why I personally think a Half-Life 3 might be happening is the imminent launch of SteamOS for generic PCs, possibly accompanied by prebuilt SteamOS PCs and consoles and third-party Steam Decks. It makes perfect sense for Valve to have such a launch accompanied by the release of Half-Life 3, similar to how Half-Life 2 accompanied the launch of Steam. We’ll have to wait and see. It will be hard to fulfill all the crazy expectations, though.