“The Singularity project (a research OS written in managed code) has produced several very useful results and opened new avenues for exploration in operating system design. Recently, MSR released a paper covering an operating-system research project that takes a new approach to building an OS stack from verifiable, type-safe managed code. The project makes novel use of Typed Assembly Language, which is exactly what it sounds like: assembly with types. The types are implemented as annotations and verified statically using the Boogie verification tool and the Z3 theorem prover; Boogie generates verification conditions that Z3 then proves statically, and Boogie is also a language used to build program verifiers for other languages. As with Singularity, the Bartok C# compiler is used, but this time it generates TAL. The entire OS stack is verifiably type safe (the Nucleus is essentially the Verve HAL) and all objects are garbage collected. Unlike Singularity, it does not employ the SIP model of process isolation; instead, the entire operating system is type safe and statically proven as such using world-class theorem provers.” Channel9 has a video interview with one of the developers behind this MSR project. Source code to Verve is available.
A lot of people complain that OSNews is too broad, too much of a tech blog and no longer an “OS site”. However, projects like Verve and Singularity really do look like the future of operating systems; so, if you care as much about OSes as you say you do, you ought to at least look at this stuff.
Moving to an OS like Verve would be a great step forward in terms of security (pretty much all root-elevation kernel hacks would be gone), stability (the kernel isn’t very likely to crash if it’s formally verified), and, to a small extent, performance (no context-switch overhead, cheaper IPC). Of course, Verve is not a mature system, and integrating real 3D graphics drivers, for example, might be quite a project – but the idea and the future potential are there.
The kernel may not crash, but it also may not do what is expected.
Because of course the software runs into the problem of: who verifies the verification rules?
Also, I just thought of several ways a fully verified kernel can still crash: all sorts of hardware interaction.
Say a driver orders a device to DMA network packets into a RAM block, but somehow frees that block before shutting down the DMA engine; the block then gets reused for loadable driver code, and WHAM – kernel crash the next time a network packet is received.
First off, I’m not bashing this project; I’m bashing the pipe-dream perfect world they want to show.
Since when did verification mean anything, especially in commercial waters? You have MS-verified drivers, and they usually work worse than non-verified ones. You have all sorts of verifications elsewhere, and to what end? PR fluff that helps them sell more.
Unfortunately, coders don’t really get to decide on their product; management and PR do, which often leads to shipping completely bogus, half-assed projects.
I think I’ll live to have a meaningful conversation with gingerbread people long before I see something like completely secure and trustworthy software.
And like the other poster said… who polices the police?
I think you’ve confused ‘verified’ with ‘certified’.
You are mixing your terms. Microsoft “certifies” drivers, meaning nothing more than “trust us, it works – because we looked at it… and we’re Microsoft”. “Verified”, as it is used by the developers on this project, means the code has passed a specific set of rigorous, mathematically proven checks for type safety that guarantee it does not do certain explicitly defined “bad” things. It does not mean the code is perfect and contains no bugs, but it does mean it does not perform certain unsafe operations. Verified != bug-free.
The point is not to “police” anything. You are taking what amounts to a very interesting but currently unrealistic experiment and trying to project its effect onto your worldview of what’s wrong with software development… that ain’t the point.
Verification at this level is simply a tool. It is very useful because it acts as insurance against human error (assuming the fundamentals of the type checking are sound). It is also highly specialized and not applicable to any “normal” software development practice currently in use – they essentially had to build their own little software development ecosystem in order to test it…
The point is: if you can build something that can type-check code at the assembly level, and you can build a runtime/compiler/JIT that can reliably annotate the assembly code it generates with type annotations for such a tool to work with, then you can build verification tools that determine with a great deal of accuracy whether said code is doing something wonky you don’t want it doing. That’s it. It’s useful, but it is no panacea.
On a side note… I often get the general impression that a lot of programmers get riled up about all this “new age” stuff from Channel9 and lash out against it simply because Microsoft is backing it. Watch the video. If you are a programmer and can’t relate to the guy being interviewed, please turn in your geek card – he is obviously intelligent, motivated, and cares about what he is doing. I couldn’t care less what company he works for, and I don’t think he cares much either. He is in it for the work – this is what research is supposed to be, and I applaud Microsoft for funding it.
Anyone who bashes MS for supposedly poor coding quality and the like leads me to automatically distrust their opinion. You can legitimately complain about MS marketing decisions, (sometimes) unethical business practices, and the way they treat users, but it is just plain wrong to speak as though underpaid undergrads do all the programming at MS. They’ve made some amazing stuff, and they’ve had to balance a great many constraints in their software, including user- and hardware-related ones. Given those constraints, they’ve done a great job. Apple, for example, just says “screw the users and developers” and will release new APIs or CPU architectures every decade or less. And Linux? Don’t make me laugh. If OSS is truly supposed to be more stable, more robust, and more featureful, then aside from some flagship projects, it has failed.
lol, now help me with this one. Besides the fact that my first sentence says I’m not bashing the project, I’m bashing the OP’s utopian view.
Where did I bash MS coding quality? Read what I wrote again.
I don’t think anyone would say that Microsoft doesn’t have some really talented people, and I agree they have produced some amazing software over the years. I think the main problem with Microsoft is their attitude of releasing software early and fixing it later. Take Exchange 2007: I think it’s a great email server with some incredible features, but when it first arrived there was a bug on the RTM disc which prevented it from installing correctly – I had to go online and download a hotfix for the install routine!
1. Certification and verification are the same thing: a solution examined under certain circumstances and against parameters set by the people/mechanisms that control quality.
2. Tests can be incomplete. In fact, in the real world, they are always incomplete; there is simply no way to predict every possibility.
Point taken. Maybe I went overboard there, but I said I was not disputing the project; I was disputing the utopian view of the parent poster.
As I said, I disputed the utopian view of the original poster, which would only be possible if exactly the problems I named were solved.
And this is exactly the kind of realistically set goal, with a real-world view of the problem, that I can respect and agree with. It doesn’t claim to heal cancer, solve all problems and, hey, make your computer run bug-free.
??? Hey, I’m an avid Linux developer/maintainer, but all my coding is done in either Vala or Mono/.NET, and apart from a few serious downfalls (relative to the rest of its quality) in some .NET parts, hats off to MS for bringing it.
As far as my job goes, I don’t feel the slightest remorse about suggesting an MS solution when it fits better, although you won’t find any MS solution in my house… ever. I simply lost the will to go nuts over their lack of quality control (again, don’t hold me to it – I used MS products in their worst times, and 2000/XP were the last versions of their OS I ran. I don’t know about now, nor do I want to know whether it got better; I’m not interested. Linux does the job for me now, and it also did that job in the 90s).
As far as solutions go, I’m all for the new, although I do have to admit I have my doubts about a garbage-collected kernel – GC has never performed that well. Then again, using it as a sandbox/virtualization solution to sandbox various operations… hell yeah.
P.S. If you plan on biting with “how could I know about the certified-driver or OS state”… I still have to maintain customers’ machines, which are 50% Windows / 50% Linux, and the problems only show up on the Windows side. Funny: maintenance interventions on the Windows machines are needed on a regular basis, while the Linux ones simply run unless there is a hardware problem.
Sorry, that really was not directed at you – although it admittedly was in my reply to you… so sorry. That was a general flip-off at the usual “this can’t be worth a damn, Microsoft is behind it” type of comments this article will invariably receive. Call it premature conflagration…
I have my nits to pick with Microsoft as well. I certainly don’t think they’re perfect – far from it. I just think it is stupid to ignore/deride research they funded just because it is them who funded it.
It’s not the same, not as it is being used by this project. Mathematical verification means you have a proof backing your logic at every level, from the bottom up. You are building a construct, starting from nothing, where every component you add is proven mathematically sound – you have to prove out the design. That of course doesn’t mean the implementation is perfect and contains no flaws, but it does mean the model is logically sound – there are no unknown corner cases; all possible permutations have been factored in.
Hence why the scope of what exactly is being verified is so limited. There is simply no way to predict every possibility – but with a thorough understanding of how a compiler/runtime/JIT generates code, you can with absolute certainty identify code which performs certain unsafe operations. That is verification.
Certification is a form of insurance based on reputability; verification is a form of insurance based on mathematical certainty.
Not to be nitpicking ;)
One part of your answer excludes the other.
I don’t really know what you mean by that… Maybe I’m not being clear. The verification software is mathematically proven. Again, that doesn’t mean it has no bugs, but it is logically consistent, and all permutations of input have been factored into its design. It is not possible to give it input that produces an unknown output.
The software it is verifying, on the other hand, is not. The verification software can be given a block of code and, based on the assumption that the code is annotated correctly, it can with absolute certainty make certain claims about it – one such claim, for example, being “this block of code does not cast an int to a pointer”.
That doesn’t mean the code being tested is bug-free, but it DOES mean that it does not cast an int to a pointer. There is absolutely no way it could, because the verification software has been mathematically proven to catch any input that would represent a cast of an int to a pointer.
Sure, the verifier could have a bug in it – you still have to do due diligence. But the point is that very few software products are proven out this way. This is a way to apply the benefits of proven code to other code – and it doesn’t require a runtime to do so (unlike, for example, the way .NET does it).
is kinda excluded and impossible with
If you read my comments… I only bash the “high horse” attitude that this will solve world hunger. I do not dispute the quality or originality of the project.
P.S. You seem to have a lot of love for it – almost as if you were involved in its development.
I was just trying to explain myself. I don’t take your comments as bashing anything – if I sound like I’m trying to rebut your posts, that is not my intention. I still don’t see how my two statements are mutually exclusive, but I’ll agree to disagree at this point.
Not at all, I just appreciate the effort involved. I work in the medical industry actually, primarily doing web related development – but I dabble in a lot of different technologies and have a keen interest in language design.
Oberon
SPIN (Modula-3)
Inferno
House
Are just three examples of operating systems where the systems language used was typesafe with garbage collector.
This is true, however, only for certain unusually large values of “three”.