It’s been nearly a year since a faulty CrowdStrike update took down 8.5 million Windows-based machines around the world, and Microsoft wants to ensure such a problem never happens again. After holding a summit with security vendors last year, Microsoft is poised to release a private preview of Windows changes that will move antivirus (AV) and endpoint detection and response (EDR) apps out of the Windows kernel.
↫ Tom Warren at The Verge
After the CrowdStrike incident, one of the first things Microsoft hinted at was moving antivirus and EDR applications out of the kernel, building an entirely new framework for these applications instead. The company has been working together with several large security vendors on these new frameworks and APIs, and it’s now finally ready to show off this work to the outside world. Instead of designing the new frameworks and APIs in-house and just dumping them on the security vendors, Microsoft asked the vendors to send detailed documentation describing how they want these frameworks and APIs to work.
This first preview of the new implementation will be private, and will allow security vendors to request changes and additional features. Microsoft states it will take a few iterations before it’s ready for general availability, and on top of that, security software is only the first focus of this new effort. It turns out Microsoft wants to move more stuff out of the kernel, with anti-cheat software – more accurately described as rootkits, like Riot’s Vanguard – being an obvious next target.
Perhaps this effort could have some beneficial side effects for gaming on Linux – which you should be doing anyway, since Windows games often seem to perform better on Linux than they do on Windows itself.
Games do perform better on Linux in general – on my machine anyway. They do require more system RAM though.
There are a few pain points, but it’s been getting better FAST.
– Compatibility isn’t actually much of a pain point any more. Many games – especially older ones – work better on Linux than they do on modern Windows versions. On Linux they often just work, or take very little effort to get working, while it can be impossible to get them working properly on Windows 11.
– An exception to this is rootkits – I mean, anti-cheat/anti-tamper software – but that is mostly driven by company policy at this point. It’s not a technical problem.
– The other exception is mods. They are slightly more challenging to set up, because the paths to the games are harder to find under Wine. That’s mostly just a matter of learning how Wine and Proton lay things out though (see the sketch after this list).
– HDR is still a pain point. It has never worked better than it does now, but it’s all very new, and system-wide configuration tools don’t yet exist. There are also various bugs when switching between gamescope (SteamOS-style game mode) and desktop mode, and you really have to be in game mode to get the best performance and experience.
– The AMD/HDMI situation really sucks. There’s no technical reason we can’t have 120Hz HDR output at 4K with VRR – except that someone at the HDMI Forum is on a power trip and won’t let AMD (or Intel on Battlemage, from what I understand) enable the damned feature, because of a subsystem – content protection – that no one wants or asked for.
– nvidia on Linux is still a giant pain in the ass.
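About those mod paths: here’s a minimal sketch of where Proton usually puts each game’s “C: drive”. It assumes the default Steam library location (~/.local/share/Steam) and doesn’t scan extra libraries listed in libraryfolders.vdf, so treat it as a starting point rather than gospel.

```python
#!/usr/bin/env python3
# Rough sketch: list the Proton prefixes in the default Steam library so you
# can find the "C: drive" a given game sees (handy for dropping mods in).
# Assumes ~/.local/share/Steam; extra libraries from libraryfolders.vdf are
# not scanned here.
from pathlib import Path

compatdata = Path.home() / ".local/share/Steam/steamapps/compatdata"

if compatdata.is_dir():
    for appdir in sorted(compatdata.iterdir()):
        drive_c = appdir / "pfx" / "drive_c"
        if drive_c.is_dir():
            # The directory name is the Steam app ID; match it to a game via
            # steamapps/appmanifest_<id>.acf or steamdb.info.
            print(f"app {appdir.name}: {drive_c}")
```

The game installs themselves live under steamapps/common/&lt;GameName&gt;/, so mods that patch game files go there, while anything the game writes to Documents or AppData ends up inside that drive_c prefix.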
That is an incredibly short list. Gaming on Linux has never been better!
There’s even Heroic Launcher, which makes Epic, GOG, and Amazon games easy to install and add to Steam.
If curious, my system:
– AMD Ryzen 5700X
– 32GB DDR4 RAM
– MSI B450 Gaming Plus mobo
– AMD Radeon RX 6800 (16GB VRAM)
CaptainN-,
Yes, Linux has an advantage, since basically all games run in a well-designed API translation layer. This means older games like Fallout – which require dozens of patches on modern Windows – “just work”.
On the other hand, there is still a lot of work needed to get games running stably, and not everything is perfect. There is a plethora of translation layers – OpenGL-on-Vulkan, DXVK, VKD3D – not to mention their analogues on the Mac.
When it does work, it can even work better than on Windows. It’s not only about stability: Proton, for example, has very good shader caching, which fixes issues like the stuttering that plagues many Windows games (e.g. Elden Ring).
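If you want to see that caching at work, here is a quick sketch that just totals up the on-disk shader cache per game. It assumes the default Steam library path, and the layout under shadercache/ (DXVK state caches, Fossilize pipeline blobs) may vary between Proton versions.

```python
#!/usr/bin/env python3
# Quick sketch: report how much on-disk shader cache Steam/Proton has built
# per game. Assumes the default library at ~/.local/share/Steam; the layout
# under shadercache/ may differ between Proton/driver versions.
from pathlib import Path

cache_root = Path.home() / ".local/share/Steam/steamapps/shadercache"

def dir_size(path: Path) -> int:
    # Sum the sizes of all regular files under the given directory.
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())

if cache_root.is_dir():
    for appdir in sorted(cache_root.iterdir()):
        if appdir.is_dir():
            print(f"app {appdir.name}: {dir_size(appdir) / 1e6:.1f} MB")
```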
It can even get crazy, like a Raspberry Pi running classic Source Engine titles:
https://www.reddit.com/r/raspberry_pi/comments/1fxonk4/gaming_via_x86_steam_with_box86_box64_and_proton/
Yet if a game does not work, it just does not work. Some are even being sabotaged by publishers. (A lot of EA games retroactively received that dreaded EA launcher. That was a terrible move, and it broke things on SteamOS.)
(Side note: I had much less luck with Heroic Launcher myself. That was before I gave up on my Steam Deck and moved to a ROG Ally + Windows – yes, I had to give up.)
sukru,
Yeah, the kids have been buying many Windows games that work fine on Linux, although I still hit plenty of titles that don’t. We talked about Stray not working before, but it works now. Some games I’ve tried are still a no-go. A few years ago I tried to show my kids a minigolf game I played as a kid on Win98. I have the original CD, but couldn’t get it to work on Linux (or even Windows, for that matter).
https://archive.org/details/3DUltraMinigolfDeluxe_201809
Although I wonder…reactos? Haha.
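For reference, the usual first attempt with a title that old is a fresh 32-bit Wine prefix and running the installer inside it – roughly the sketch below, with placeholder prefix and installer paths – though in my case nothing I tried worked for this particular game.

```python
#!/usr/bin/env python3
# Sketch of the usual first attempt for a late-90s title: create a fresh
# 32-bit Wine prefix and run the installer inside it. The prefix name and
# installer path are placeholders; no guarantee this works for any given game.
import os
import subprocess

env = dict(
    os.environ,
    WINEARCH="win32",                                   # force a 32-bit prefix
    WINEPREFIX=os.path.expanduser("~/.wine-minigolf"),  # placeholder name
)

# Create/update the prefix; "winecfg" can then set the reported Windows
# version (e.g. Windows 98) inside it.
subprocess.run(["wineboot", "-u"], env=env, check=True)

# Run the installer from the mounted CD (path is just an example).
subprocess.run(["wine", "/media/cdrom/SETUP.EXE"], env=env, check=True)
```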
I gave up trying to use a Raspberry Pi as a gaming system. The performance is low but playable for native games, but I found the ARM barrier just too painful for running Steam/Windows games. It stopped being a fun project for the kids and was turning into time-sucking work. I figured if nothing else I’d use the RPi as a remote Steam client for an x86 box in a different room, but that software turned out to be extremely buggy, with joysticks not working correctly in many titles.
I ended up just buying a 30′ HDMI cable for the TV so they could game on it that way, which at least works. But Samsung’s smart TV features get in the way: you can’t simply select the HDMI input, it has to be configured first, and since the computer doesn’t do CEC the setup has to time out before finishing. And Samsung made another stupid decision to automatically delete the HDMI link when the computer is unplugged, which means the whole process has to be repeated. The dumb TV just worked. Argh, why does new technology have to be this frustrating!?
If you’re just looking for a living room gaming machine, we use Atari VCS 800s on all three of our TVs. They run an AMD SoC that’s roughly like what you’d find in a Steam Deck, so you can run pretty much anything as long as you can live with 720p. My wife’s playing Witcher 3 right now at a solid 60 FPS.
You can find a VCS for about $100 — about the same as a Pi plus its add-ons — and upgrade the RAM, SSD, etc. too.
We boot ours off external SSDs ’cause Atari’s own OS/games are actually pretty good and we didn’t want to wipe them off the internal storage.
Brainworm,
Interesting product. New, it seems to go for $200 including accessories and $160 without; even used prices are up there. I’ll need to research it, but the Atari VCS’s PC mode gets a lot of bonus points from me. Thank you for suggesting it.
My problem with so many console appliances, and smart TVs in general, is the restrictive ecosystems. Not only do I hate this on account of FOSS values, but I am at my wits’ end dealing with proprietary manufacturers telling us what we can and cannot do – like a smart TV where the manufacturer picks what services you can have. “I can play Netflix, but I can’t play my children’s recitals.” F**k the tech industry.
I did build a Kodi system before, but it wasn’t the home run I wanted, and my wife ended up getting a Roku instead because it was easier. Unfortunately that thing is a prison, and neither it nor the TV can play local content or stream PC games. Then there was the RPi I talked about earlier. I’ve been such a passionate advocate for independence and openness, yet I’ve failed in my own home. Oh, the shame.
About the HDMI situation, the only solution I can think of is AMD releasing an optional proprietary software module that’s just enough to satisfy the HDMI people (Nvidia’s proprietary Linux drivers can do HDMI 2.1). Or doing what Intel did: output DisplayPort and convert it to HDMI internally in hardware.
As I said over at Slashdot: FINALLY! I hate how third-party AVs casually help themselves into the kernel (via a kernel driver that patches the kernel) as if we were still running 32-bit XP and PatchGuard weren’t a thing. Microsoft leaving holes in PatchGuard for AV vendors to use is, and has always been, downright evil.
In my opinion, no third-party code should be in the kernel (ring 0); if third parties want some kernel-level functionality, it should be offered by the kernel itself.
Seconding this. Vendors pushing shit into the kernel space is just asking for trouble. Good on Microsoft for fixing this.
Of course that doesn’t solve the problem, exactly – next thing you know you’ll have vendors requiring users to disable PatchGuard (or whatever other security systems) in order to run their software.
If I remember right, this happened relatively recently with some set of Mac-specific eGPUs; the company behind them couldn’t be arsed to produce a signed driver and so just required that users disable basically every system related to driver security.
If a third-party AV vendor does this, nobody will buy their product, especially considering Windows Defender exists (and has gotten really good lately, ignore the memes). And Microsoft won’t let OEMs ship laptops with PatchGuard disabled. Just because some vendor of a niche video-editing accelerator card does it, it doesn’t mean third-party AV vendors can.
Also, Apple Silicon Macs don’t support eGPUs or discrete GPUs in general, so who was this vendor of Mac-specific eGPUs?
If I remember right, this was back in 2018: Sonnet initially made a stink about a macOS update “removing support” for their products, when those products actually depended on users running an unsigned driver. It was covered on OSNews and in several other places at the time.
The point is that third party vendors will happily compromise their users’ security to save a nickel, and then point the finger at OS manufacturers when their own bad practices cause problems for users. I’d fully expect a company like CrowdStrike to play chicken with MS on this instead of spending actual money to fix their product.
Given the date, I’m pretty sure it was more of an “Apple will never sign GPU drivers for 3rd parties ever again” issue. If you wanted to add a real GPU, you would have to do some shady stuff to get it working. Nvidia probably could have sued Apple to force them to sign their GPU drivers, but the writing was on the wall for using Macs for anything 3D already, so no money to be had there.
Sonnet eGPU enclosures for Macs are niche products; their users will do anything to get the product up and running, and will be thankful there is at least one vendor out there that fills their niche. Third-party consumer AVs for Windows are a crowded market that’s shrinking as more and more people realize that Windows Defender is much better than the memes imply, so any third-party consumer AV vendor that forces users to jump through such hoops is doomed to commercial irrelevance.
CrowdStrike will be interesting to watch, since they are a near-monopoly in the enterprise sector, but even then, telling customers to disable OS security can’t be a good look.
kurkosdr,
Disabling security across the board adds unnecessary risk, but very often this is the operating system’s fault. When it comes to securing the kernel and authorizing drivers, it’s bad design if the operating system forces a binary on/off choice across the board. Owners should be allowed to selectively approve specific vendors. It’s neither the owner’s nor the third-party vendor’s fault if security is weakened on account of the OS failing to provide fine-grained controls.