About a month ago, I posted an OSnews Asks item asking for details on how interrupts work on various architectures. Since then, I’ve been reading the manuals and comments, and have extracted what I found to be a summary of the specifics of each architecture. I then wrote a first attempt at a portable interrupt handling model based on this data for my pet OS. Now I’m contributing this back to OSnews, so that these resources get more exposure for those who are interested.
Nice to see real OS topics back on OSnews!
A paper I wrote in 1975 for my degree compared the interrupt mechanisms of a PDP-11 and a Varian minicomputer (it ran Vortex as the OS).
I later spent 20 years working at DEC. I wrote a number of device drivers for PDP-11/RSX-11, VMS & Ultrix.
Can you do OS schedulers next? Please include the many abortive attempts that MS has made at creating a fully pre-emptive scheduler.
dbolgheroni,
“Nice to see real OS topics back on OSnews!”
I concur! Instead of striving to be just another news outlet, strive to be a destination with unique content for the technically minded.
It’s the article series on OS dev that first caught my eye, but frankly I’m thinking that many articles here are not worthy of the motto “OSNews is Exploring the Future of Computing”. Too much news aggregation IMHO.
“I later spent 20 years working at DEC. I wrote a number of device drivers for PDP-11/RSX-11, VMS & Ultrix.”
I’m jealous of those who’ve been able to make a career in hardcore computer science. I studied very hard to understand the low-level ins and outs of the x86 platform. I’m proficient and enjoy this stuff a lot, but career-wise all my skills have been wasted on web development (HTML/JavaScript). The market for low-level developers has vanished in the US; companies prefer to outsource rather than hire residents.
I feel I would be much better off if I had started 15 years earlier.
Heh, I hesitated for some time between physics and CS when entering university. Physics finally won because I didn’t feel that the areas of CS which I like would get me a job…
And since my brother chose CS, I could steal his books, like I did with Modern Operating Systems a bit more than a year ago…
I had a similar question. Mine was worse — I was choosing between Phy, CS and Maths. I was so general and so much in doubt that I asked around. And no one in my extended acquaintance network knew enough about those 3 to give a good answer. Finally, I just went with my phy teacher’s view: that if you look around at the people doing phy and the people doing maths, the people doing phy resemble human beings more…
Anyway, is it possible for you to disclose your uni / course / year of study here? I’m currently studying Theo Phy at UCL, and I really feel that the choice is well made — concurrency? That is nothing after you deal with Quantum and GR inconsistencies!
Haha. Now, after that kind of racist comment above, I will need to add that CS recursion ideas are plenty useful in phys — anything discrete-maths related has direct use in QM, and even if it were not so, our problems with modelling in CS are hugely limiting our theoretical progress in all areas of phys. We need CS guys, a lot more CS guys, in phys to help us with the ridiculous programs we make. The general populace does not understand this, but programming knowledge should be part of the high school curriculum…
So, something more related to your work here — keep it up. I obviously have no time to read through your stuff now, but I am very much interested. It is just that I have books piling up to the roof right now, and that is on top of exam prep. (Stop nagging at me, SR, QM, Classical Therm, Stat Therm, Math Methods, Solid State, Geometric Algebra, Feynman…) ((GR is missing, if you had not already realised.))
To think of it, the interrupting and scheduling business (with memory management just as closely linked) is almost the most important and the most difficult part of OS design (the “almost” is there just in case). Please do remember that implementation is most important — portability tends to be an afterthought here because performance really matters. Although, of course, the people who manage to find simultaneous eigenstates tend to do better than those who go directly for performance. Nobody would care about your interrupt handling mechanism / scheduler if it is really slow, even if it is massively portable. I am sure you are already aware of this, so I am more talking to myself than to you.
(Note to self: need to make lots of money and coordinate and fund a massive project on OS design. Of course, on top of massive books…)
xiaokj,
“our problems with modelling in CS are hugely limiting our theoretical progress in all areas of phys. We need CS guys, a lot more CS guys, in phys to help us with the ridiculous programs we make.”
I am curious, could you elaborate?
I’m guessing you’re talking about superposition?
Believe it or not, I’ve always fantasized about creating a simulator for quantum physics, if only to help me understand phenomena like quantum tunneling.
I’m reminded of a particle simulator I made many years ago to model the flow of “air” particles around an airframe. I wanted to demonstrate the Bernoulli principle at the particle level; it was slow, but very cool.
Today, I’d try to optimize it with SSE vector support and make it scale on clustered servers. I’m guessing that’s been done though.
It is very funny how you replied to my comment while I was busy making a huge reply to your other comment.
Anyway, I was not talking about superposition or any fancy quantum stuff. I am talking about the poor state of logic composition of the people on my course. I am still an undergraduate, so it is maybe unfair, but the Mathematica course I had models superbly simple stuff: particles interacting with Morse potentials and so on.
The problem I saw was many-fold. Firstly, there are people on my course who, when given the plot of a quantum wavefunction in one dimension, are incapable of telling me what the general features mean. In quantum mechanics, the wavefunction has a lot of information in it, but some features are immediately recognisable. If it oscillates, it means that the particle has enough energy to move around in that region, for example, and clearly they are not there yet. I could understand if they were normal physicists, but we theoretical ones need to be able to talk about such stuff — it is in our job scope. It is not easy stuff, but the teachers are really doing a bad job by not even hinting at such things to the students.
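To be concrete about the “oscillates” remark, this is just the standard reading of the one-dimensional time-independent Schroedinger equation, nothing exotic:

    \frac{d^2\psi}{dx^2} = \frac{2m}{\hbar^2}\,\bigl(V(x) - E\bigr)\,\psi

Wherever E > V(x), the curvature opposes \psi, so the wavefunction oscillates: that is the classically allowed region, where the particle has enough energy to move around. Wherever E < V(x), it grows or decays exponentially instead, and the decaying tail is exactly tunnelling.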
Secondly, they were unable to really write code at all. In this sense, my lecturer is awesome, but the code beginners make can really make me cringe. The code would have cruft lying around doing vestigial stuff, almost as if it were an organic mass that evolved its way there with no plan whatsoever, and these parts are clearly rotting in our faces but are “irremovable” because “they don’t harm us” — a statement which I can very easily prove wrong when they ask me for help. Some friends have the logic, but are not able to express it, and that is at least still fine. They can reason about it, but the implementation is not so stellar, which is to be expected of beginners.
However, I had the horror of witnessing, when helping around, the “code” of a certain project. It had no code whatsoever — there were just axioms and calculator values. There was no logic anywhere, be it within the computer or between the keyboard and chair. The project was very specific, and it was clear within 2 minutes of Google and Wikipedia that the project was going in a certain direction, while the student was completely unaware, and claimed to be able to get 100% of the marks, when a substantial percentage of the work was directly from the professor and my own. And this is the future generation of theoretical physicists that we want to depend upon.
I feel so old ranting about this.
Anyway, on the more troublesome ones that may interest you, fluids are still pertinent. They just won’t behave. Our computers cannot yet deal with 6.02 × 10^23 particles times 3 dimensions times the timespan. Stochastic methods like Monte Carlo go a long way, but are still fraught with difficulties. Also, in the big bang formation of nuclear elements, the physical instabilities are the entirety of the interesting phenomena, and we have only just begun to be able to model them in 3D in a way that finally allows the stars to explode. Assuming spherical symmetry just doesn’t cut it. And then, on top of that, we need to distinguish between numerical instabilities, physical instabilities and statistical randomness. Not very funny when you are working on this.
Compared with these real problems, a simulator for quantum physics sounds really simple. We have lots of them, actually; it is just that they are all too slow and memory intensive. Of course, they help you understand quantum a bit, and I am sure you should be encouraged to try some. However, do note that theoretical methods are very much more respectable than just numerically forcing a solution, because you can fit in random functions and get very different answers.

There is also a field of study on this problem, in fact, and the basic premise is just as easy to state. Numerically, if you try to Taylor expand solutions, there should be and usually are solutions to any problem of physical interest, even if they only work for very small regions at a time. However, they may converge to nonsense — even in quantum worlds you just don’t care about particles infinitely far away! Theoretical methods can improve on them by looking for stationary solutions that have niceties built in, like falling off like a Gaussian, which is essential. Then, building numerical solutions on top of that is economical, and in some cases important exact solutions even happen. Of course, the exponential itself is only as exact as we can evaluate it, and will never be exactly evaluable. Nonetheless, there is definite use for theoretical methods even in the context of numerical approximations, and people just don’t get that the theoretical part of maths is a requirement, not an afterthought.

Another method of huge use is variational methods, under the umbrella of the calculus of variations. The amazing method of differentiating with respect to dy/dx allows us to find solutions that are extremely close to the required solutions — because they try to make deviations worth nothing, small errors automatically disappear, hence it is very efficient to make calculations of that form. However, using different test functions to forcefully fit an answer can give completely different, but all useful, answers. It is appalling but helpful in a bizarre way.
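For those wondering why “small errors automatically disappear”, the variational principle makes it quantitative. For any trial wavefunction \psi,

    E[\psi] = \frac{\langle\psi|H|\psi\rangle}{\langle\psi|\psi\rangle} \ge E_0,
    \qquad
    \psi = \psi_0 + \epsilon\,\delta\psi \;\Rightarrow\; E[\psi] = E_0 + O(\epsilon^2)

so a first-order error in the trial function costs only a second-order error in the energy. That is also precisely why wildly different test functions can all return usefully accurate energies while disagreeing about everything else.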
Given how little we really know about systems in general, modelling air particles one by one in the slow manner would be fruitful, except that it is so slow we might as well construct the Earth and calculate why the answer is 42… In fact, there are established papers that deal with this, and the images of individual particles interacting, forming droplets of water and so on are very amazing, and clearly deserve their place in journals.
In fact, there is no reason why you would restrict yourself to gases. If you study statistical thermodynamics, you would be kind of shocked to find that the quantum particle is really incredible because there is an identity crisis that is really non-trivial: your solids and liquids, when basic quantum statistics are applied to them, are nothing but gases! Things like surface tension are understood even in terms of classical thermodynamics, a field that is clearly neglected in modern times. People just don’t realise how incredible the framework of thermodynamics is — it is the mother theory of all of physics!
Okay, I really should stop. The character count is dwindling and this is OSNews, not Physics News. We can take this discussion elsewhere if you would like this to continue. (Or if other forum posters like this too, we can start a new trend in OSNews itself… haha.)
PS: @Neolander: if you are willing to talk, but are hesitant to divulge online, you can PM or email me. I forgot to mention this earlier.
Try those working on (super)strings or loop quantum gravity. Judging by our subatomic physics teachers working there, it also seems that prolonged exposure to the LHC may cause weird effects on the human mind, although nothing this impressive.
Université Paris Diderot – P7 (France) / Fundamental Physics / Master 1 (4th year of university).
As the course is mainly taught in French, it’s unlikely that you’ll find an English description, so here are the contents, starting from the L3 (3rd) year, where things begin to get interesting:
–L3, semester 1–
* Quantum Mechanics (the basics: bras and kets, wavefunctions, the Schroedinger equation, quantum wells and tunnelling, spin basics and Rabi oscillations, a bit of entanglement…)
* Experimental physics (the MacGyver course: you make experiments and write a small report + PowerPoint oral presentation around a well-known but weird physical phenomenon. The catch: your tools are not worth much, so you’ve got to learn about improvisation)
* Mathematics 1 (A bit of Hilbert spaces, but the most interesting part is that about Laurent series and how to integrate functions which expose poles and similar annoyances)
* Physical optics (’nuff said. An interesting introductory course on lasers, though)
* Lagrangians and Special Relativity
* Optional course which I’ve chosen: Introduction to lab research (spend some time with a researcher who tells us about his field; in my case it was nonlinear acoustics, quite interesting).
–L3, semester 2–
* Statistical physics
* Waves and vibrations (many, many calculations based on the Lagrangians introduced earlier)
* Mathematics 2 (don’t remember much, but it was mostly about distributions)
* Hydrodynamics
* Numerical physics (teaching people who don’t know yet how to program in C, for more experienced guys like me it’s about writing a physics simulation)
* Optional course: Introduction to nanosciences (an interesting panorama of nano-physics, and an introductory course on the modern theories of conduction and magnetism).
–M1, semester 1–
* Advanced quantum mechanics (angular momenta and the hydrogen atom, time-independent and time-dependent perturbation theory, quantum scattering, an introduction to the density operator, and an electromagnetic part about the dipolar approximation which I didn’t understand well)
* Condensed matter physics
* Subatomic physics (atomic physics and particle physics. Nothing about Gordon Freeman, though. Meh.)
* Thermodynamics of matter states (mostly about getting used to thermodynamic reasoning again, plus phase transitions. Also a very interesting part about superconductivity which I sadly didn’t understand at all)
* Optional courses: Signal processing, Project in CM physics (playing with the interaction of light with phonons in matter)
–M1, semester 2–
* Transport phenomena in matter (Out of equilibrium thermodynamics. Enough said.)
* Optional course: Semiconductor devices (p-n junction theory, diodes, transistors, laser diodes, photodiodes and photovoltaic panels… what I think I want to spend my life working on)
* Optional course: Introduction to photonics (take these silly scalar optics and make them interesting by introducing multiple weird phenomena which depend on the vector nature of the electromagnetic field)
* Optional course: Nanomaterials (a course that’s mostly about the theory and practice of modern microscopy (AFM, STM, SEM, TEM…), and a tiny bit of material growth)
* Optional course: Colloids and Interfaces (an extended and comprehensive study of surface phenomena, from the Van der Waals interaction to wetting and the stability of colloidal suspensions)
Hooray for programming courses in primary school or high school! On the other hand, I’d say that the limits of computing power only restrain progress in some areas of theoretical physics.
You don’t need a computer to play with questions such as what a quantum observation is, or whether all of physics can be unified behind a single basic theory playing a role akin to that of Newton’s laws… Only a bunch of philosophers, physicists and mathematicians, a big table, lots of spaghetti, and a missing fork, just to drive those who are fond of CS theory mad.
Well, I’m playing with some ideas to create a very clean but reasonably fast low-level design. As my project targets desktops, laptops, and the future evolutions of large-screen general-purpose personal computing, I concentrate on perceived performance rather than “actual” performance (anytime I see the UI of my laptop or phone slow down just because there’s a heavy background task around, I want to kill some scheduler designers. If this computer is fast enough to display a fast UI when idle, it is fast enough to display a fast UI under load), but I know that raw number-crunching power will also be needed at some point.
Good luck with that. Myself, I sometimes fear that once I leave studies for “actual” work, I’ll miss some of this spare time which I currently have.
Wow, that is a huge list you have just plunked on me!
Nonetheless, from an undergrad’s POV, I won’t understand most of that as of now, except the parts I have already touched upon.
For QM, there is no reason whatsoever why it takes lecturers so long to get to bras and kets, when they are so much clearer than whatever nonsense they write on the board, and the bras and kets explain overlap integrals almost by definition. Wavefunctions and Schroedinger and quantum wells and so on should be so easy — I am learning them here in 2nd year, and I am also reading on the Heisenberg picture on top of the Schroedinger picture, which is also more natural to the classical physicist. The rest, I have yet to touch.
Laurent series, at least the restricted part I’ve learnt so far, are so easy that they should not be in a Masters course. They clearly need to be in undergrad. Lagrangians too, but I know that the Lagrangian in SR was pretty much undefined for almost the whole of the last century. I don’t know if the picture has improved since, but we know we are in quite troublesome waters.
Angular momenta and the hydrogen atom are in the advanced QM course? Woe be to you! That stuff is really fundamental, and really easy to grasp! You must be pretty bored in there. Can’t help you with the dipolar part, though, but looking at it from the classical viewpoint might help.
Superconductivity and phase transitions should be about how superconductivity is really very special. It behaves as if there were a phase transition, so you expect abrupt changes, with a latent heat to boot. However, superconductivity has no latent heat, so it is not the usual first-order phase transition of melting and boiling that we are familiar with. The lack of latent heat means that it just switches instantly from superconducting to not, instead of having a reaction time for you to appreciate. Getting it from a theoretical viewpoint is also next to impossible — our theories will say it is a 3rd-order transition, when it is clearly 2nd order from the experimental viewpoint, simply because of impurities. How it was experimentally discovered was also very incredible in itself.
Irreversible Thermodynamics? That is wonderful stuff! Wonderfully challenging!
Your options sound very interesting indeed. Good choices. I see you like your experiments too; as a theoretician, I am a rare one who also likes experiments. I don’t understand how they managed to squeeze experiments out of a theoretical course anyway. It is very difficult to learn about numerical methods with no idea of what the physical system is doing, and the subject itself is very interesting.
I agree with your assessment that there is stuff that can be done without computers at all. It is not plausible for a real result in chaos, for example, to drop out of computer simulations. Such stuff needs to be done by hand. The theory of dynamical systems, the Lagrangian and friends of many-body systems, is a field where nothing works except brute-force maths.
In fact, for the spaghetti problem, there are many useful solutions of real genius. In the tradition of Unix and daemons, I strongly prefer to simply have a daemon oversee it, although that may be more than overkill. What I like about the daemon approach is the realisation that some parallelism problems are just non-existent if there is a server governing the resource. Using compare-and-exchange operations to really efficiently achieve the solution from there makes it really powerful, general and efficient, on top of minimal if it is consistently used (which means it would then become a reusable programming pattern).
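To illustrate the compare-and-exchange pattern I mean, here is a minimal C11 sketch, nothing more than the textbook retry loop; the “resource” is just a counter here, and everything is illustrative rather than taken from any real OS:

    #include <stdatomic.h>
    #include <stdint.h>

    /* Conceptually the daemon owns this state; clients either hand it
       work, or, as in this toy version, update it lock-free themselves. */
    static _Atomic uint64_t shared_state;

    /* Read, compute, try to commit. If another thread raced us, the
       compare-exchange fails, `old` is refreshed with the current value,
       and we simply recompute. No locks, no blocking. */
    uint64_t update(uint64_t delta)
    {
        uint64_t old = atomic_load(&shared_state);
        uint64_t new_val;
        do {
            new_val = old + delta;  /* any pure function of `old` works */
        } while (!atomic_compare_exchange_weak(&shared_state, &old, new_val));
        return new_val;
    }

The nice property is exactly the one I claimed: since only one committed value exists at a time, there is no inconsistent intermediate state for anyone to observe.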
I prefer UI responsiveness too. I don’t know why we care so much about server responsiveness — the programmers themselves will become more productive if we had UI centrism for power users — I don’t see why compiling cannot be done behind the scenes of coding, for example. Why must the computer throw everything down and stop responding when compiling huge packages?
And I didn’t mention the two internships yet
More seriously, sorry about the big pack of text, I just thought that if you asked about the course, you were likely more interested in the contents than in the name. The contents were not available in English as far as I can tell, so… I made them available my way.
Just a question… Which year of university does “graduate” designate in the UK & US? I think it’s the 4th year, but am not sure about that…
Around here, we began with bras and kets and only then introduced wavefunctions as u(x) = ⟨u|x|u⟩/⟨u|u⟩. I must agree that I find this approach more intuitive to play with, although wavefunctions are better for calculations…
The problem which we have with QM in L3 is that students come from various places with different physics programs, due to the interestingly convoluted way the French education system works. Some of my friends have had a QM course in L2, myself I haven’t. We need to make sure that the course is accessible to everyone, even though it means doing a bit less than in other countries…
If only I hadn’t neglected electromagnetism so much in the first two years! ^^ I’m paying the price for it every single day…
What made this course almost totally incomprehensible was that the teacher did not take a qualitative, clean approach like your comment above, but rather chose to do a lot of calculations and made us struggle with dozens of blackboards full of equations with only a few comments to explain what exactly he was doing…
I agree that superconductivity is a fascinating problem, though. Can you imagine if we managed to understand what we’re doing with high-Tc superconductors well enough to find a general theory of high-temperature superconductivity, and used it to make superconductors which work in a regular freezer, or even at room temperature?
Indeed. A *very boring* beginning imo, but recently, as we reach the last courses of this semester, we’ve had an introduction to Brownian motion and the microscopic causes of diffusion phenomena… Pretty yummy stuff!
Yeah, I have a tendency to think applied, even when I play with rather theoretical/fundamental things like that bistable and self-pulsing laser in my L3 internship… I don’t like things which I can’t see a use for, like theories about what happened in the first seconds of our universe or the LHC’s experiments, although I know enough about the history of physics to understand that such obscure research is vital for long-term evolution.
You are pretty lucky to be at ease with both theory and experiments. Around here, most people are at ease with one or the other.
Indeed, phasing out experiments altogether sounds very weird in a natural science like physics; I don’t understand why they did that. Around here, my course is one of the most theoretical ones, and as you can see there’s still room for experiments, even though you can do more or less depending on what you like…
Yeah, I’d hardly imagine solving complex problems in hydrodynamics and acoustics analytically, but who knows… When there are lots of bodies around, we can get some interesting results from statistical physics… Just look at the theories of gases: we just need the interaction potential between two particles to extract a satisfactory equation of state (see below). So who knows… Maybe tomorrow, some breakthrough in mathematics and theoretical physics will make us able to solve Navier-Stokes analytically without thousands of simplifications! ^^
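(The concrete statement behind “we just need the interaction potential” is the virial expansion, a standard result: the pair potential u(r) alone fixes the leading correction to the ideal gas law,

    \frac{p}{k_B T} = n + B_2(T)\,n^2 + \dots,
    \qquad
    B_2(T) = -2\pi \int_0^\infty \Bigl(e^{-u(r)/k_B T} - 1\Bigr)\, r^2\, dr

so knowing u(r) between two particles already buys an equation of state good to second order in the density.)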
UK undergrad courses are usually 3 years; some have a 4th year, which is usually the Masters course. US universities make the course longer, but because of that, they can have much more flexibility. Physics is one course where there are so many things to learn that 3 years seems ridiculous.
Having the wavefunction to calculate with is not very useful, because in practice we are more or less guessing the thing. The problem is that, if you learn the integral way, it is kind of hard to immediately see that what you have is a 3N-dimensional integral (for N particles), where you are not interested in the integral itself, only in the stuff you get out of it. Once you understand what the left bra and the right ket are about, you can very easily see the big picture and understand what is going on.
Also, you might want to learn what it means to have ⟨x|A|u⟩ and ⟨x|u⟩, which, though small and subtle, makes everything else make sense. It will take less than 5 minutes. Go on.
Try reading a few other books then. In English, there is the holy trio of Pippard, Kittel and Callen for those who are good with the maths. Reading them creates a strongly intuitive and yet also mathematical view into the subject.
We have trouble with the “high temperature” superconductivity of ceramics, at slightly higher temperatures than the well-understood metallic Cooper-pairing superconductivity. Room temperature, or anything warm enough for household use, is out of the picture for now, although it would be well worth a Nobel prize. Heck, even explaining the ceramic one should deserve a Nobel prize.
At the base, though, we know that any superconductivity is a very simple case of bosonic Bose–Einstein statistics at work. That part is extremely easy to understand from statistical thermodynamics and quantum mechanics. If you have access to the Feynman lectures, read the first few parts of the 3rd volume to see how quantum statistics work; it just makes so much sense. All it comes down to is the swap operator, which is similar to the parity operator. Who would have thought that the indistinguishability of elementary particles would have such huge implications everywhere!
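The swap argument fits in one line. Exchanging two identical particles twice must do nothing, so the exchange operator squares to the identity and its eigenvalues are ±1:

    P\,\psi(1,2) = \psi(2,1), \qquad P^2 = 1 \;\Rightarrow\; P\,\psi = \pm\,\psi

The symmetric (+1) case gives bosons, which are perfectly happy to pile into a single state; hence Bose–Einstein condensation, and superconductivity once the electrons bind into Cooper pairs. The antisymmetric (−1) case gives fermions and Pauli exclusion.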
What? After so long, you just touched Brownian motion? Sigh.
I think it has a lot to do with pioneers like Heisenberg. He had an oral exam under Wien, who was so appalled by his answers on experimental work that he voted to fail him. There is a big following of people who only care about the maths and want to do physics.
The problem I see is that, fine if you want to do physics by maths, but at least you need to be able to tell me why you believe in what you believe in. Basically, the whole of logic depends upon the strength of the axioms, and the inclusion of any one falsehood can have catastrophic consequences. There are huge amounts of physical data from which to extract laws, but not all laws are of the same weightage. The way to distinguish which is which can only come from the experimental faith.
The funny thing about education these days is exactly what is feared: the widespread teaching of facts may as well be preaching faith in the scientific establishment, when the entire point of the scientific method is to disregard authority. Faced with questions about scientific truths, people tend to use the same kind of reasoning reminiscent of religious arguments (sorry for the incitement; it is solely my viewpoint, and I mean it. Hehe). And then society does not understand why we have trouble converting more sheeple to the scientific faith. Clearly, nature is at work again — it does not matter if you do not yet know your flaws; nature will punish you silently as you practise them.
Faith is best induced by repeatedly diminishing the object of faith, and witnessing its relentless triumphant recovery. That we do not already resign to this method of teaching clearly shows how humans are: History shows that humans do not learn from History.
Hence why I do not accept teaching without experiments, and I do not accept teaching without proofs, at all. Especially at the high-school level, where there is considerable noise about teaching for “usefulness in the workplace” nonsense, when it is clear that you cannot be useful in the workplace if you cannot even form basic logical thought processes. Either you know why you can put your life on the line believing your beliefs, or just go jump off a cliff.
I don’t understand why there are people in the rich world who don’t like experiments. Where I come from, experiments are only done to exam standards, and it is always the same few experiments, and that was so boring! I managed to survive that and still like experiments, and you guys get to design and play with your experiments here…
Are we talking about those “cuprate” superconductors which work at the temperature of liquid nitrogen, or something at a lower temperature but still not within the realm of BCS?
Well, we’ve been playing with some kinds of (semi-)random motion like Drude’s model of conductivity for some years, but only this year have I seen the theory behind Brownian motion being cleanly and relatively deeply taken care of.
I think that when two theories explain experiments equally well, theoretical cleanness has a role to play, too. We could, if we wanted, introduce thousands of additions to Newton’s and Maxwell’s laws until they explain the same things as quantum mechanics, but QM wins by Occam’s principle: although extremely weird from a philosophical point of view, this theory is coherent with experiments and only needs a small number of hypotheses to work…
Yes.
I see I see. That is fine then.
Now, I don’t have much clue as to how the 2 topics linked up, but assuming we want to complexify Classical Mechanics to explain quantum phenomena, you might be surprised that it is actually not possible from the start. There are countless mathematical proofs that our notion of the continuum in the physical sense is actually not the same continuum as is described mathematically, because mathematical oddities like Banach–Tarski do not happen in real life, and that would then lead to the same results as Quantum Mechanics.
Not to mention Planck’s Law, which is the best-fitted law of the universe (plot the CMB on the wall, and the width of the pen drawing the fit of Planck’s Law will be bigger than the experimental errors). No matter what, Planck’s Law dictates that, even if QM is wrong, the universe can only be more bizarre, rather than less bizarre, than QM.
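For reference, the law in question, in its spectral radiance form:

    B_\nu(T) = \frac{2h\nu^3}{c^2}\,\frac{1}{e^{h\nu/k_B T} - 1}

The CMB fits this with T ≈ 2.725 K so well that the measured deviations really are smaller than the drawn width of the curve.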
i.e., we have ridiculously strong proof that we are correct, or else we would have thrown away the awful business of QM long ago.
Now, back on the topic as described here, I was lamenting the people who want to do physics via the mathematical route without regard for the absurdity of the assumptions they make. When we assume Newton’s laws to hold true, we are making an assumption that is not only well tested, it is plausible and understandable. However, just blindly taking the assumption of the existence of Entropy or the Lagrangian is not the same — we have no way to justify it except its immense agreement with experimental data. This leaves the believability part unfulfilled.

A much better route is to show that such an assumption is inevitable. For the example of Entropy, we can work out from the 1st law and other empirically “obvious” laws, plus the ideal gas law, that Entropy must exist, and then use Carnot’s argument to show that it is inevitable that Entropy be well defined and even well behaved. Then, it is more plausible to assume that Entropy exists, and we can then prove the 1st law and all the other “obvious” laws from the assumed properties of Entropy.

The same argument is better made in Lagrangian form, because the Feynman path-integral formulation of QM is even more manifestly relativistic than our trying to fit relativity into the SE — because of the path integral, it deals with local variations, which is manifestly relativistic. (Feynman talks about the 3 different views of physics: one is forces and fields, one is potential fields and one is Lagrangians. The three are the same but have very different characteristics and we cannot throw any away — except for forces, which do not have a physical meaning in QM except as statistical averages (Ehrenfest theorem).)
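(The Carnot step, in symbols, for anyone following along; this is just the classic Clausius argument. Composing reversible Carnot cycles shows that around any reversible cycle

    \oint \frac{\delta Q_{\mathrm{rev}}}{T} = 0

so the integral of \delta Q_{\mathrm{rev}}/T between two states is path-independent, which is exactly the statement that a state function S with dS = \delta Q_{\mathrm{rev}}/T exists.)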
On the other hand, it is clear that being mathematically proficient is a crucial prerequisite to helping out in physics. It is exceedingly implausible that we will have another Faraday any time soon (and it is getting less plausible by the day). Not that that is bad — we want more, not fewer, people to be proficient in maths. It is just that physics is a big island formed by understandable theories linking up empirical laws expressed in mathematics, and trying to emphasise the mathematics without the logic holding it up is just abstruse.
Neolander,
“…I concentrate on perceived performance rather than ‘actual’ performance (anytime I see the UI of my laptop or phone slow down just because there’s a heavy background task around…”
The problem is that all too often devs combine user interface threads with blocking network (or CPU-bound) calls. This is a poor design from the start, and I see all too many apps which do this. It’s a widespread practice for *nix apps, where blocking calls are the norm.
“I was considering something like pop-up threads, in order to maximize driver performance, with mutexes and semaphores…This model has the advantage that it’s theoretically beautiful. I mean… One interrupt, one thread, isn’t that clean and scalable?”
In comments on one of the prior articles, I mentioned the benefits of using an asynchronous model instead of a threaded one. Message queues allow batches of operations to be handled together with no context-switching overhead at all. This is how mainframes achieve such high performance. Microthreading defeats all that. Even when threads are light, the real penalty is that threads each need their own stack, which easily overflows the cache, and by their nature they tend to invalidate the cache frequently between cores.
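As a rough C sketch of the kind of batched consumer I have in mind (all names here are invented for illustration, not any real OS’s API):

    #include <stddef.h>
    #include <stdint.h>

    struct msg_queue;                 /* opaque queue handle, hypothetical */

    /* Event record filled in by a tiny interrupt stub. */
    struct irq_event {
        int      line;                /* which interrupt line fired        */
        uint32_t status;              /* device status latched at irq time */
    };

    /* Hypothetical: blocks until at least one event is queued, then
       returns up to `max` of them in one call. */
    size_t mq_receive_batch(struct msg_queue *q, struct irq_event *out, size_t max);
    void handle_event(const struct irq_event *ev);   /* driver-specific */

    /* The driver thread drains whatever accumulated since it last ran,
       paying one wakeup for a whole batch of interrupts instead of one
       context switch per interrupt. */
    void driver_loop(struct msg_queue *q)
    {
        struct irq_event batch[64];
        for (;;) {
            size_t n = mq_receive_batch(q, batch, 64);
            for (size_t i = 0; i < n; i++)
                handle_event(&batch[i]);
        }
    }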
“I can think about working on less rewarding things which require more work, like NIC support or accelerated graphics…GPU drivers, especially the open-source ones, are one of the greatest sources of computer crashes ever created.”
I wouldn’t touch GPU drivers at all unless I was paid to. However, I would have thought some form of network support is required, no? If not, what is your OS going to do?
Even support for just the NE2K or RTL8139 chipsets would allow it to work under a virtual machine.
Which performance are we talking about? The batch scheduling algorithms of mainframes achieve high number-crunching power at the cost of terrible reactivity. What I’m fearing is that by leaving the event dispatching job to third-party processes, we’d favor poor designs where every single task is handled within the event dispatcher, like those which you mentioned earlier… Unix devs are everywhere in the computing world.
Neolander,
“The batch scheduling algorithms of mainframes achieve high number-crunching power at the cost of terrible reactivity.”
That’s a fair point, however you could theoretically have a dispatcher with soft/hard realtime constraints.
“What I’m fearing is that by leaving the event dispatching job to third-party processes, we’d favor poor designs where every single task is handled within the event dispatcher, like those which you mentioned earlier.”
I’m not sure if you’re agreeing or disagreeing with me?
“Yes, but which virtual machine? Bochs? Qemu? VirtualBox? One of the multiple VMwares? VirtualPC? Xen?”
Bochs/Qemu/Kvm/Xen all stem from a shared codebase.
The qemu/kvm documentation lists support for i82551, i82557b, i82559er, ne2k_pci, ne2k_isa, pcnet, rtl8139, e1000, smc91c111, lance and mcf_fec. So at a minimum, all of these VMs can be covered by an RTL8139 driver.
Virtual PC uses an esoteric Intel 21140.
VMWare supports “AMD 79C970 PCnet32” and “e1000/Intel 82545EM”
So, I *think* e1000 would get you everything except for virtualpc.
I understand that networking isn’t the first thing one needs to work on, but if you’re not headed in that direction, then it’s difficult to see what this OS is going to offer?
I think most of the innovation to come in kernels is going to revolve around networking. Take for instance supporting clusters natively within the OS. Or automatically migrating processes between servers. Or distributed file systems. And so on.
“As sample applications, I think about things which better showcase the OS underneath than web browsers, which re-implement everything their own way for the sake of cross-platform consistency.”
I agree, while running a web browser is cool, it would not be innovative.
The question remains, what would you want to showcase?
I agree that devs do way too many things within event dispatcher threads. This is why, in my opinion, they should not have to write the content of those threads by hand; standard OS mechanisms should do the dispatching job for them.
VBox included? It’s one of the things which I’ve completely broken on my Fedora install, so I can’t check which network cards it does support…
Anyway, this sounds indeed like the best choice once I get into networking and look for a first piece of hardware to implement, especially if the specs of this card are freely available on the net… Something like SoundBlaster compatibles, HD Audio, and AC97 for sound.
I can’t pretend to offer a desktop OS usable on an everyday basis without networking or sound, so I’ll have to support them at some point, they are just not my top priority now. GPU rendering would also be a nice (although optional) trick to have in some long-term future.
But on desktops, for demo purposes, there’s some amount of software which don’t need any of that:
-The desktop environment
-Programming tools
-Raster 2D graphics (and some light vector graphics like diagram drawing, electronic circuitry design…)
-Raytracers (because POV FTW and could already render wonders on my Pentium II)
-Anything whose main purpose is to handle fonts, simple gradients, and some images (Word processors, LaTeX, DTP software, more generally office suites…)
-Educational software
-Physics simulations, data plotting and analysis (à la Origin/QtiPlot), and more generally software which does maths (including the generalist ones: Octave and Scilab on the numerical side, Maxima on the analytical side)
-Basic games, and some specific gaming niches with simple graphics and no sound (e.g. roguelikes).
Yup, there’s a lot of potential for networking-friendly kernels in the server area. However, I target the desktop, so these are not my main concerns atm.
“I can’t pretend to offer a desktop OS usable on an everyday basis without networking or sound…”
“Yup, there’s a lot of potential for networking-friendly kernels in the server area. However, I target the desktop, so these are not my main concerns atm.”
I personally use servers everyday with no sound/display/keyboard at all and only network access. I understand now that you want your OS to be more “user facing”.
However, I still think if your goal is to work on user facing technologies, then writing a kernel seems to be an awfully tedious way to go about it, at least to me.
You’re not alone. However, I do think that many problems with today’s desktop OSs stem from low-level issues. Poor reactivity is a mix of synchronous architectures, poorly written homemade event dispatchers, and scheduling algorithms that are built for computational speed rather than reactivity. Poor security is the consequence of a security model that’s unsuitable for desktop use from the start, giving way too much power to software instead of using a sandboxing/capability approach. I have yet to investigate what causes very long boot times, but I bet that’s because we try to do way too much before displaying a login prompt, like rediscovering hardware which we already know about. Lots of research shows that current driver architectures are the core cause of countless crashes. And so on…
Prove that some crazy ideas are doable in practice. For the first release at least, which would be a tech demo more than an OS that’s usable on a daily basis. Like I said, I could work on the proprietary mess later. As sample applications, I think about things which better showcase the OS underneath than web browsers, which re-implement everything their own way for the sake of cross-platform consistency.
Yes, but which virtual machine? Bochs? Qemu? VirtualBox? One of the multiple VMwares? VirtualPC? Xen?
My problem with claiming support for nonstandard hardware like NICs is that I can’t guarantee to users that my software will run everywhere; it’s a trial-and-error game. So better to work first on things which will.
“Heh, I hesitated for some time between physics and CS when entering university. Physics finally won because I didn’t feel that the areas of CS which I like would get me a job…”
I liked physics too; the calculus way of thinking answered many of my questions about all sorts of problems. With it we can derive formulas from scratch. No one else around me appreciates the intellectual power of that, but as a physicist I am sure that you do.
“And since my brother chose CS, I could steal his books, like I did with Modern Operating Systems a bit more than a year ago…”
As much as I enjoyed the topics, I kind of regret the CS degree for two reasons:
1. The job market totally collapsed (at least relative to what it was when I entered university).
2. It wasn’t really challenging enough and I got bored with lame/useless classwork. I would have learned much more in a field where I had less personal experience going in (such as physics or engineering).
If you are unable to impress people with calculus, it may simply be that you are missing some crucial tools to do so. I have always been able to wow people by using a very little bit of history to talk about it. Most of the time, I wait until the people I talk to are frustrated by the lack of certainty in their respective fields. Then I tell them that, contrary to their experience, the maths that we do, we are completely and extremely sure about. They become intrigued, and some even dispute it. That is where I bring in the history: calculus was born an outcast, doomed to be hand-wavy and impure. However, Cauchy, who studied the error terms and limits, realised that within the formal structure he created, he was able to always push the error to zero (barring un-physical generalised functions (which, ironically, are a physical thing…)). This means that calculus had been rigorously correct all along and we had been mistreating it. Its absolute truth and beauty means that all the physicist’s methods, which are ridiculously crude, are justified and proven to always work.
On the other topic, I can kind of understand (1), but (2) should be a slight nag that is more than remediable. I am sure that concurrency is not properly dealt with. Numerical algorithms and their stability and consistency are a field all their own (one where, I would argue, even Java’s design goes wrong). OS design is still springing surprises at every corner, and so are language and compiler design. None of these topics are trivial, nor well integrated into any computer science course, I would presume. In fact, reading Joel on Software and Paul Graham and Eric Raymond, Lisp and the like are not taught at all, and they are fundamental topics. There are extremely many great books, from Feynman even. MIT’s old computer science series on Scheme is widely praised. I am sure that, if you read outside what you are given, you will be rewarded. Network theory is another topic of endless repercussions. Or maybe concussions. User interface stuff, including the pros and critiques of The Art of Unix Programming, both carry important insights of a subtle nature that are difficult to grasp.
Otherwise, you can look into some other stuff. Quaternion methods in computer graphics seem nice, but I am reading on better methods than that. Applying methods from statistical thermodynamics, statistical fluid dynamics, finite element methods and other stochastic methods of computation has still not been exhausted, and in there huge cabinets of literature abound, so you are sure to find gems.
xiaokj,
Just to be clear, I really didn’t mean to imply the CS field was boring. However, the CS classes leading to my degree were, because they were too elementary and didn’t go in depth enough. We covered things at a very abstract level, with lots of hand waving, and never really got our hands into working internals.
I learned more writing a tiny OS from scratch on my own than I did in operating systems classes, which is why I question now in retrospect the decision to take CS classes.
I got your point from the start — it is much more than just too elementary! I had a very funny conversation with a professor from the US, and he was complaining about how the level of teaching in UK universities is really low and exam-oriented, and the course only 3 years, leading to almost no content being properly taught. It is very funny because, all the while, we had been thinking that the US is of low standard, whereas he was complaining that UK < US. Bizarre but true.
There is no substitute for self-motivated learning from books, and there never should be. In this view, there is no reason to doubt why you chose CS — being forced to learn stuff when it may seem stupid is useful, because you never know when you will really need it until you need it. For me, I might not have wanted to learn statistical thermodynamics until I had completed classical thermodynamics, but I now realise I can do both together and get to grips with both at once. I am sure that, in the job market, being certified to be good at something, with a certain minimum known, is good. At MIT, Lewin would guarantee you that any physics / engineering student must have seen the Tacoma Narrows bridge at least once. That is partially the worth of paying to get to university these days. Sad but true.
Also, sometimes, just knowing about something is enough. I tend to attend maths lectures not to absorb anything of immediate use, but just to understand that there are methods and angles of attack that I don’t know. There is a lot of stuff that has been solved, just that the solutions are not communicated. Maxwell and Einstein and Hilbert did a lot of their famous stuff by simply applying other people’s work in different ways. From Cantor’s arguments, Godel and Turing could modify and talk about non-trivial stuff in the most amazing way!
So, I stand uncorrected — I knew your point all along, and I was just telling you how to induce more interest in your own course. Just go and read more. Remember, also, the root cause of all this dumbing down — it is circular! Assuming that people need knowledge dumbed down for them furthers the need to dumb down, and it clearly insults everyone’s intelligence. Same thing with political correctness and so on. Things are getting really, downright, dangerous.
Not true! Just do a search on Dice for linux +kernel and you’ll get a tonne of job postings in the US for companies looking for low-level Linux-type work. With the surge in Linux OSs (Android, WebOS etc.) there’s plenty of opportunity.
mahoney,
“Not true! Just do a search on Dice for linux +kernel and you’ll get a tonne of job postings in the US for companies looking for low-level Linux-type work. With the surge in Linux OSs (Android, WebOS etc.) there’s plenty of opportunity.”
Haha, you clearly haven’t spent a minute in the Dice discussion boards.
I’ve tried Dice, but there are no opportunities in my area whatsoever, much less ones which I am a match for.
Can’t guarantee anything at the moment; my to-do list is too large atm to start something else…
-Hobby OS-deving 4 (getting started with x86)
-Bada review 3 (bundled software, maybe part 1 if it becomes too large)
-Analyzing all the feedback on those two articles (wow, a dozen new comments to review on my blog in one day or so!), trying to expand with more architectures, as many have pointed out that there are some major players missing (Sparc, PPC…).
And that’s without taking into account my offline life.
Thanks for possibly adding it to the list.
It is certainly interesting how the thread has diverged like it has.
I am not a CS grad. Not a computer grad at all. I’m a Control Engineer by profession & qualification. Interrupt handling & latency are important in RTOS environments, especially where sub-millisecond responses are required. I got my grounding in this area working on flight simulators (the real things used by airlines & the military) in the late 1960s/early 1970s.
No Sparc? I have about 5 32-bit SPARCstations and 2 Ultras just chomping at the bit to run hobby operating systems 🙂
That aside, Sparc would make a neat hobby computer project using cheap FPGAs… like the Nexys2 board or something similar (don’t forget they have a high gate count version)… maybe even the new Atlys board, it looks really nice.
If you want to maximize portability, then the answer is to keep the interrupt handler code very brief, possibly doing nothing more than dispatching a message to the driver using OS-standard message queues (hopefully prioritized) — see the sketch after the list below.
This serves multiple purposes:
1. Drivers interface to interrupt callbacks via message queues which are standardized across architectures, so the drivers themselves are more portable.
2. The logic for acknowledging and/or combining and/or sharing interrupts is handled at the OS level for each architecture, and need not be repeated within each driver.
3. The OS can change the priority of IO independently from hardware interrupts.
4. In the future, it may be easier to virtualize the operating system into a user process if “interrupts” are handled in message queues rather than actual hardware interrupts.
5. The message queue abstraction would theoretically enable us to run and test drivers in user space (this may or may not be desirable for a production system, however this is desirable for debugging a driver).
6. Depending on overall OS design & architecture, the message queues solve many concurrency issues. Instead of requiring drivers to handle interrupts at any moment, they can simply read the message queue when they can (in series or in parallel as needed).
7. Avoids the need for each driver to implement its own queue internally when it is unable to handle the interrupts immediately (blocked semaphores etc.)
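To give an idea of how brief the handler itself can be under this scheme, here is a sketch reusing the invented names from my batching example earlier (device_read_status, pic_acknowledge, mq_post and driver_queue_for are all illustrative stand-ins, not real calls):

    uint32_t device_read_status(int line);          /* hypothetical */
    void pic_acknowledge(int line);                 /* hypothetical */
    struct msg_queue *driver_queue_for(int line);   /* hypothetical */
    void mq_post(struct msg_queue *q, const struct irq_event *ev);

    /* The entire per-architecture interrupt handler: latch the device
       status, acknowledge the controller, post a message, return. All
       real driver logic runs later, outside interrupt context. */
    void irq_stub(int line)
    {
        struct irq_event ev = {
            .line   = line,
            .status = device_read_status(line),  /* latch before the ack */
        };
        pic_acknowledge(line);                   /* EOI to the controller */
        mq_post(driver_queue_for(line), &ev);    /* wakes a driver thread */
    }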
If performance is absolutely crucial, the drivers can receive interrupts directly, however then the drivers have to handle concurrency issues, which makes them much more complicated.
…
To be truly viable though, your OS is going to need drivers which you cannot personally develop (due to resources). I would try to see if there is a way that you could compile existing linux drivers into your OS.
This is particularly important for network cards. Less important but desirable are sound cards, video cards and USB devices.
Of course at that point, I’m not sure if your OS can be that much “better” than linux while being source compatible with its drivers?
It’s probably overkill for a hobbyist OS, but it just sucks that your OS will not work for most of us due to drivers.
Interesting notes about the message queue concept. I was considering something like pop-up threads, in order to maximize driver performance, with mutexes and semaphores here and there as bottlenecks when mutual exclusion is truly needed (with good deadlock prevention policies, obviously).
This model has the advantage that it’s theoretically beautiful. I mean… One interrupt, one thread, isn’t that clean and scalable? But maybe it could indeed make drivers hard to write in some circumstances. Got to think about this. A rough contrast with your sketch follows.
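To compare like with like, the pop-up flavour would be something like this (names equally made up, and kthread_spawn is a purely hypothetical primitive):

    void pic_acknowledge(int line);                  /* hypothetical */
    typedef void (*irq_handler_t)(void *);
    irq_handler_t driver_handler_for(int line);      /* hypothetical */
    int kthread_spawn(irq_handler_t entry, void *arg);

    /* Pop-up model: acknowledge, then spawn (or unpark from a pool) a
       fresh kernel thread running the driver's handler. Each event gets
       its own stack and may block on mutexes freely; clean, but it
       costs a stack and a scheduling decision per interrupt. */
    void irq_stub_popup(int line)
    {
        pic_acknowledge(line);
        kthread_spawn(driver_handler_for(line), (void *)(intptr_t)line);
    }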
Yup. My codebase is not POSIX, is based on objects, and is a microkernel, so it’s likely to be pretty hard to do this, but implementing things like ACPI from scratch is a whole life’s task and I think I have more interesting things to work on.
I plan to start with a “proof of concept” release which is fully based on widely documented standards for hardware (PS/2 mouse and keyboard interface, VESA…). At this point, progress will be made for everyone. After that, if this OS surprises me by gaining lots of traction, I can think about working on less rewarding things which require more work, like NIC support or accelerated graphics.
GPU support, in particular, would be an interesting crash test for the “bulletproof” side of this project, considering that nowadays’ GPU drivers, especially the open-source ones, are one of the greatest sources of computer crashes ever created. HW manufacturers wanted to look smart by going to proprietary convoluted specs and unneeded effort duplication, and now we’re all paying the price…
Cool discussion guys. Mixes it up a bit.
I missed your question the first time around; if you’re not already familiar with Chris Lattner’s (of the LLVM project) OS Resource Center, it’s a wealth of info and he seems committed to keeping it up and running.
http://www.nondot.org/sabre/os/articles