There’s more coming to Windows 11 at some point during this year, and three of the upcoming features are of particular interest to the type of people who read OSNews. First, Windows is finally getting support for more archive file formats.
Microsoft has finally added native support for more archive formats, allowing you to open tar, 7-zip, rar, gz, and other files. In addition, Windows 11 users will benefit from improved compression performance when zipping files.
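For context, tar and gz are long-standing open formats with a simple two-layer structure: a gzip stream wrapping a tar container. A rough illustration using Python's standard library (shown purely as a reference implementation, not what Windows itself uses):

```python
import io
import tarfile

# Build a small .tar.gz in memory, then read it back -- the same
# two-layer structure (gzip compression around a tar container)
# that Explorer has to unpack when you open one of these files.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    data = b"hello from inside the archive"
    info = tarfile.TarInfo(name="readme.txt")
    info.size = len(data)
    tar.addfile(info, io.BytesIO(data))

buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    names = tar.getnames()
    content = tar.extractfile("readme.txt").read()

print(names)    # ['readme.txt']
print(content)  # b'hello from inside the archive'
```

(7z and rar are more involved, with solid compression and proprietary bits, which is presumably why they took longer to arrive.)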
You’ll soon also be able to force quit applications straight from the taskbar, instead of having to open Task Manager, and as we noted not too long ago, ungrouped taskbar buttons are also making a comeback – among other things.
Content adaptive brightness control also sounds intriguing. I’d appreciate an hour more on my battery.
That’s nice, some actual good changes in Windows for once.
I’ll still probably go with 7z at work in any case. The Explorer interface for compressing and uncompressing files isn’t very good, especially in Windows 11. And that’s being kind.
Needs LHA! Granted, due to the Amiga’s file system capabilities, extracting some of the WHDLoad stuff on a Windows machine will cause filenames to be truncated…
leech,
I don’t use these any more, but I’ve kept them as a backup in case I ever needed them again.
There were tons of weird archive formats in the shareware & bbs days, haha. I developed my own archive format for a game I was writing ages ago. Fun times 🙂
We have all these articles for new Windows features, but the most important one is missing:
App Isolation
https://github.com/microsoft/win32-app-isolation
It seems like they are offering containerized (Flatpak-style) application support for Windows. The plan seems to be enabling this for end users in default Windows installs, instead of leaving it as an obscure developer package.
(Some details: https://www.tomshardware.com/news/windows-11-moment-3-update-isolated-x32-apps-no-rar-support-yet)
sukru,
There was a tool that did something like this on Windows a decade or so ago, but I’m drawing a blank on what it was called. It ran applications in their own isolated environment and it was quite easy to use.
Isolation is frequently overlooked in the design of desktop operating systems, which have tended to focus on user isolation but not application isolation. Giving users the ability to isolate applications helps improve security. However, these mandatory feature updates always give me pause. It’s a thin line between an OS that empowers owners by giving us the tools to secure our systems and an OS that uses those tools to unilaterally force policies without owner permission and with no owner override. Mandatory feature updates are user hostile.
As long as these isolation features remain under owner control in the future, I think it’s useful functionality. But hypothetically if they were to make application sandboxing mandatory in the future without user permission, then it starts to look a whole lot more like IOS.
Alfman,
Yes, a forced “upgrade” would not be ideal.
Mac land is already moving in that direction. Not only iOS, but even the regular macOS desktop will give you prompts like “Terminal would like to access files in your Desktop folder” when you do a simple “ls”.
Sandboxie?
darkhog,
Ah yes, do you use it?
https://sandboxie-plus.com/Sandboxie/
I really haven’t used it in ages after migrating to linux.
I wonder how isolation will work with the registry. As a former Windows app developer who abused the registry heavily (as did everyone else, and my stupid manager made me), debugging anything involving the registry might become even more challenging. Are you in the default registry or the application’s segmented view of the registry?
Bill Shooter of Bul,
Sounds familiar! I was writing software at an internship. One application I wrote needed a small database to keep track of customer products & labels. Anyway, my manager instructed me to save this database in the registry instead of in a file that users could copy/move to different computers. I didn’t question his instructions, but ugh, that felt dirty. Registry settings are one thing, but who saves actual data in the registry? It’s a pain to work with that way.
I don’t have windows 11 to see how well the isolation works, so I cannot really pass judgement on it for the moment.
But the segmentation you describe does sound kind of reminiscent of the dichotomy microsoft created with the system32 and syswow64 resource forks in both the file system and the registry. The incompatible views created much confusion for users: “Ahh, why does the program say file not found?!? I’m looking right now and the file is right there!!”
I kind of rebelled and made a variable called squigglybub in the registry as a protest. I thought no customer would ever see it or have to modify it, but then something broke whatever it controlled, and the only way to fix it was to tell the customer to turn squigglybub off. I actually got reprimanded by my boss at the time for not being professional in my naming choice, which I thought was kind of absurd. The variable was for a very, very dumb thing he wanted me to do that wasn’t in any way secure, but he insisted it be done. Something like marking that the license had been validated. You know, the license for the application which we never ever charged any money for. I think he understood that the dumb name for an absurd feature was my way of revenge.
They use something called “registry virtualization”:
https://youtu.be/w6VwHGPz12w?t=1156
And this is not new. We have had this for application backward compatibility for a long while. (How else would older apps that want to write to Program Files, “abuse” the registry, etc., run cleanly on a modern OS?)
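The idea can be sketched as a toy model (heavily simplified; in the real OS the redirection happens in the kernel, and legacy writes to HKLM\Software land in a per-user VirtualStore key — key names here follow that convention but the rules are illustrative only):

```python
# Minimal model of registry virtualization: a legacy app that writes
# to a protected hive gets silently redirected to a per-user
# "virtual store", and its reads check the virtual store first, so
# the app still sees its own writes and believes nothing happened.
PROTECTED = "HKLM\\Software"
VIRTUAL_STORE = "HKCU\\Software\\Classes\\VirtualStore"

registry = {}  # flat dict standing in for the real registry

def reg_write(key, name, value, legacy_app=True):
    if legacy_app and key.startswith(PROTECTED):
        key = key.replace(PROTECTED, VIRTUAL_STORE, 1)  # redirect the write
    registry[(key, name)] = value

def reg_read(key, name, legacy_app=True):
    if legacy_app and key.startswith(PROTECTED):
        virtual = key.replace(PROTECTED, VIRTUAL_STORE, 1)
        if (virtual, name) in registry:
            return registry[(virtual, name)]  # virtualized copy wins
    return registry.get((key, name))

# The legacy app thinks it wrote to HKLM, but the protected hive
# was never touched -- exactly the "where did my changes go?"
# debugging confusion described above.
reg_write("HKLM\\Software\\OldApp", "licensed", "yes")
assert ("HKLM\\Software\\OldApp", "licensed") not in registry
assert reg_read("HKLM\\Software\\OldApp", "licensed") == "yes"
```

A tool inspecting the real hive and the app reading through the virtualized view genuinely see two different registries, which is the debugging hazard being discussed.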
It’s one thing if you know what’s going on and expect it. But there are times when under the hood remapping is extremely confusing and unexpected “why are none of my changes having any effect?!?” If you couldn’t tell, I’ve gotten hit by this on more than one occasion 🙂
I legitimately think the whole 32bit/64bit dichotomy was an unplanned mistake. I suspect what happened is extremely simple: rather than adding 64bit support to a working 32bit system, the developers started with a pure 64bit port of windows for their proof of concept. Only then did they consider the requirements for adding 32bit support back in, but by this point all of the 64bit resources had literally taken over the full paths of the 32bit resources (c:\windows\system32). This obviously is extremely dumb, as it would mean that 32bit software would see the wrong dlls since c:\windows\system32 got hijacked with 64bit dlls. Microsoft’s solution was the ugly hack that we know today: remapping resources back to where they were supposed to be (c:\windows\syswow64 to c:\windows\system32), and now applications end up with different views of the system depending on their architecture. It’s a big mess that would not have been technically justified if only they had planned for 32bit and 64bit to coexist in the first place.
To my knowledge no other operating system does that. Even the previous architectural transitions for windows itself didn’t do this and it was no problem at all for 16 bit applications and 32bit applications to coexist and see & share the same namespaces. Nobody complained about it, it just worked.
TLDR; The virtual remapping is a solution to a problem that microsoft created themselves.
We’ve analyzed this on osnews before; the 32/64 layout isn’t even used consistently. There are many 32bit binaries in the 64bit directories and vice versa. Now this confusing mess has to be kept around for legacy compatibility. This is just one of my gripes, oh well!
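The remapping being complained about here boils down to a path rewrite that depends on the caller’s architecture. A toy sketch (the real redirection lives in the WOW64 layer; “Sysnative” is the documented alias that lets a 32-bit process reach the real 64-bit System32):

```python
# Toy model of WOW64 file system redirection: a 32-bit process
# asking for System32 is transparently handed SysWOW64, while the
# special "Sysnative" alias escapes the redirection.
def resolve_path(path, process_is_64bit):
    p = path.lower()
    if process_is_64bit:
        return path  # 64-bit processes see the disk as-is
    if p.startswith("c:\\windows\\system32"):
        return "C:\\Windows\\SysWOW64" + path[len("c:\\windows\\system32"):]
    if p.startswith("c:\\windows\\sysnative"):
        return "C:\\Windows\\System32" + path[len("c:\\windows\\sysnative"):]
    return path

# Same request, two different files -- the source of the
# "the file is right there!" confusion quoted above.
print(resolve_path("C:\\Windows\\System32\\cmd.exe", True))   # C:\Windows\System32\cmd.exe
print(resolve_path("C:\\Windows\\System32\\cmd.exe", False))  # C:\Windows\SysWOW64\cmd.exe
print(resolve_path("C:\\Windows\\Sysnative\\cmd.exe", False)) # C:\Windows\System32\cmd.exe
```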
Alfman,
Linux had lib32 and lib64, and the dynamic loader (ld.so) picked the correct ones at runtime.
Windows also has a library split, but they tried to use the same names and same paths for their new DLLs. Hence all that confusion ensued:
https://www.ibm.com/support/pages/why-do-64-bit-dlls-go-system32-and-32-bit-dlls-syswow64-64-bit-windows
Windows 3.1 had System for 16 bit. Windows 95/NT had System32 for 32 bit. Windows XP 64 bit had SysWOW64 for 32 bit and System32 for 64 bit.
The tradeoff would have been losing drop-in source compatibility. But people had to recompile for 64 bit anyway, so having them fix hard-coded paths in the process would have been a clearly better choice.
sukru,
Exactly, that’s what windows should do too. No architecture FS remapping needed.
Well, yes, but there’s no reason for them to share the same DLL paths. Furthermore, it’s totally unnecessary and confusing to segregate program files. And in terms of segregating the registry, why? IMHO, upgrading an application from 32 to 64bit and continuing to use the same app settings is a feature. Not one of those cases justifies the need for new confusing namespaces.
Yes, that was a good approach and I wish microsoft had continued that and used “system64” for 64bit resources. This is what linux does and it works fine. It would be crystal clear to everyone where things are since windows isn’t lying to applications about the true path to programs. And you don’t get the “c:\windows\system32\xyz.dll doesn’t exist” anomalies.
Yes, I agree the onus should have been on 64bit software because the software needs to be ported & rebuilt for 64bit anyway. However, even then I don’t believe the compatibility argument holds water, because it is the loader that determines which path to use (using environment variables), not the application. Developers don’t specify the DLL path. Unless you’re loading DLLs dynamically, paths are generally not in the executable.
I really believe this was a byproduct of adding 32bit support to the 64bit port after the fact rather than a plan, because the plan is quite a bit worse than all the alternatives.
I will keep using WinRAR and 7-zip anyway. I had issues with corrupted files after unpacking with other tools than the “official” one, both in case of RAR and 7z formats. Whereas, after unpacking with the official one it always worked fine (with the same archive file, so it wasn’t a case of corrupted archive).
Not to mention the way Windows handles archives, compressed folders, is stupid to say the least.
//edit: Also your “remember me” checkbox on the login screen is broken, have to log in again every other week. And no, didn’t clear cookies in the meantime.
darkhog,
Do you have an example?
As I recall, the official unrar source code has been available since the DOS days and is the same code used by the official tools, so there really shouldn’t be corruption in 3rd party utilities using the same code.
https://github.com/pmachapman/unrar
I don’t follow the intricacies of rar, but just a guess is maybe an older version of unrar can fail to decompress an archive generated by a newer version of rar? Maybe we could try it and see what happens.
Yeah, but the RAR format has changed a couple of times since then, adding stuff like better compression and better encryption, stuff that third-party tools usually don’t handle correctly when unraring. As for an example, I’d give you one, but I’m pleading the fifth on this ;). Hope you understand and can figure out where the rar in question came from.
As for 7z, one of my friends made a game in RPG Maker and compressed it with 7z. Unfortunately, after unpacking it with WinRAR, some of the files were corrupted and the game didn’t work correctly (it crashed when entering a certain map). But then I was asked to download 7-Zip and unpack it with that (which I did), and this time (without changing anything about the archive, such as redownloading it) it worked correctly.
As an archive format RAR is still the superior choice.
It has solid archives with recovery record support.
7-Zip, too, has solid archives, but without recovery records they are just waiting to become useless after a random bit flip (re: bitrot, and yes, it happens often enough to be a nuisance).
Personal preference, but RAR also has a better SFX stub. The default user interface for self extracting archives is very well designed.
As for compression? That is secondary. Both offer really solid performance, and can utilize large amounts of RAM and parallel processing. RAR has some specialized algorithms for text, executable, and multimedia files, but I am not sure they are still relevant today.
The only downside is that RAR is not free.
WinRAR has been a 3 MB download and a 5-second install. Lastly, people have largely stopped using archivers. The young don’t even know what WinRAR is.
I don’t understand why so many websites are talking about that.