“Luca Barbieri made a rather significant commit today that adds a state tracker dubbed ‘d3d1x’, which implements the Direct3D 10/11 COM API in Gallium3D. Luca says this is just the initial version, but it’s already working and can run a few DirectX 10/11 texturing demos on Linux at the moment. This is not a matter of simply translating the Direct3D calls and converting them to OpenGL like how Wine currently handles it, but is natively implemented within Gallium3D and TGSI to speak directly to the underlying graphics driver and hardware. Thanks to Gallium3D’s architecture, this Direct3D support essentially becomes ‘free’ to all Linux drivers with little to no work required.”
Now this, this is sweet.
Sounds like it’ll mean fewer and fewer games get made in OpenGL if this gets good…
If D3D is kept current on Linux, is that a problem? OpenGL doesn’t lead in anything anymore, and only plays catch-up to Direct3D.
Both nVidia and AMD design their graphics chips around the next Direct3D spec, and after the chips are finalized, the OpenGL spec is updated.
I would argue that OpenGL and DirectX cater to two entirely different markets. When there was going to be a massive overhaul of OpenGL, who kicked up a stink about it? The CAD industry wanted to keep the status quo, while the gaming industry wanted a whole heap of new features and new ways of doing things. The result is an API designed by a committee trying to cater to a variety of different needs, versus DirectX, which is controlled by a benevolent dictator who makes decisions the market then follows. The benefit of the latter over the former is uniformity and a set direction that isn’t swayed by protests from those sitting in the cheap seats.
With that being said, OpenGL has made a huge leap forward over the last couple of years – it hasn’t been given the massive changes some were hoping for, but it certainly hasn’t been standing still when you consider what has appeared in OpenGL 4.x.
Microsoft… Benevolent? Are you serious?
As for OpenGL playing catch-up? On what planet? It’s the other way around.
On this planet. New OpenGL versions regularly add features already in Direct3D.
I would suggest you read the following blog posts. You may reconsider your point of view afterwards…
http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-D…
http://blog.wolfire.com/2010/08/OpenGL-update
Also take note that WebGL has the potential to make Flash history. There will be no need to install a buggy Flash plugin on Linux when websites use jQuery, when video sites use the HTML5 video tag, and when online games use WebGL. For instance, look at this:
http://playwebgl.com/games/quake-2-webgl/
Very interesting read. Who would have known that game devs know so much about graphics APIs, instead of just choosing some framework and taking the graphics API that comes with it?
A couple of issues with the linked article.
First, half the article is FUD, not technology – four-year-old FUD, no less. He talks about the time when Microsoft was originally planning to disable direct hardware access to graphics outside of Direct3D. That was a long time ago.
The PS3 uses its own API for 3D, and while Sony does provide OpenGL ES 2.0, it is poorly supported and offers low performance. I don’t believe Nintendo supports OpenGL on the Wii, though their native API is OpenGL-like.
While OpenGL is cross-platform, he misrepresents OpenGL ES as part of it. OpenGL ES has historically been a separate API, not just a subset as it is commonly described; compatibility with it only arrived recently, in OpenGL 4.1.
His stats showing XP use as dominant are outdated; the graph he links to now shows 56% of Steam users with a DirectX 10-class GPU running Windows 7/Vista. That’s not his fault, though. He does, however, overstate the importance of XP’s user base and ignore the fact that it is shrinking.
His initial claims about OpenGL being a significantly faster API are later refuted by himself when he says “Microsoft has worked hard on DirectX 10 and 11, and they’re now about as fast as OpenGL…”
Microsoft has indeed done a ton of work on bringing Direct3D execution up to speed.
While tessellation may have been available as an extension to OpenGL for three years, that isn’t the same as being part of the spec. Microsoft wisely removed the ability for hardware manufacturers to add custom extensions. Software developers no longer have to target D3D + vendor extensions for full capability, as they do with OpenGL. Having to support various extensions from different hardware vendors kinda defeats the purpose of a universal API.
Tessellation has also been quite possible through the standard Direct3D 10 pipeline. Only now is it a standard feature.
I must also add that tessellation was available as a Direct3D 6/7 extension on ATI’s Radeon II graphics chip (yes, the successor to the original Radeon). Nobody used it, and they dropped the feature from the hardware.
They mention Blizzard developing and releasing games simultaneously for Windows and MacOS. This is intellectual dishonesty on the part of the author: I’m sure he knows that Blizzard uses Direct3D for the Windows versions.
I wonder if he even read the John Carmack quote he included: “It’s still OpenGL, although we obviously use a D3D-ish API [on the Xbox 360], and CG on the PS3. It’s interesting how little of the technology cares what API you’re using and what generation of the technology you’re on.”
From the quote, the author concludes, “If you can hit every platform using OpenGL, why shoot yourself in the foot by relying on DirectX?” Did he not see that Mr. Carmack said he used the Xbox API on the Xbox, and Cg on the PS3? That certainly isn’t hitting every platform with OpenGL.
And, while we’re quoting John Carmack, since it is a popular thing to do:
Don’t get me wrong. I have nothing against OpenGL. I think these OpenGL vs Direct3D wars are stupid. There is enough room on the planet for more than one graphics API. I am just opposed to the if-its-MS-its-bad attitude that much of these debates are based on.
“I am just opposed to the if-its-MS-its-bad attitude that much of these debates are based on.”
What about the if-its-single-platform-its-bad attitude?
Isn’t that a valid argument on a website talking about operating systemSSSSS?
Good point. The single-platform-is-bad attitude is valid, and I would tend to agree.
However, the article you linked to is clearly in the bad-because-it’s-Microsoft vein, and a significant portion of it misrepresents the cross-platform nature of OpenGL.
Prior to OpenGL 4.1, your OpenGL ES code wouldn’t work in OpenGL implementations, and vice versa. He also implies that part of Blizzard’s success as a multi-platform gaming developer is due to OpenGL’s cross-platform nature. However, it has little to do with OpenGL, as they only use OpenGL code for MacOS X. Their Windows games are coded using D3D.
In John Carmack’s comments about OpenGL, his statements about using the native toolkits for the console versions of Rage were completely ignored, and his statement was still used to support the cross platform benefits of OpenGL. Also, I do believe the quote I found from Mr. Carmack may be from the same interview (I don’t have flash, so I can’t watch the video). If so, the author is committing blatant intellectual dishonesty.
Also, the author completely misses the key benefit of OpenGL (and its key weakness, depending on your requirements): its API stability. Old OpenGL code will continue to work as a first-class citizen for a long while, not just as a separate code path kept for compatibility’s sake.
OpenGL still leads in being open *and* multiplatform.
Plus I see no Direct3D based games for the smartphones, but more and more OpenGL ES ones…
You forget that non-native games for OS X and Linux run terribly slowly compared to a native client, should one exist.
Also, why should anyone have to buy a new, slower OS just for the graphics stack when their current OS still works fine and still gets security updates? Sure, there’s 64-bit memory addressing, but most gamers are not on the upgrade treadmill; only the enthusiast crowd is, and they make up less than 1% of the market.
The vast majority are still running IGPs or the low-end card that came with their OEM box. There are also many who build an upper-mid-range box once every 5 years or so; buying a new machine is when they bother to upgrade their Windows.
…because actually implementing a full up-to-date D3D layer which people can use would be nearly impossible, due to…
-Copyright infringement
-MS’s ability to break the API every year, requiring the Gallium3D people to start over
-Speed (Wine recompiles programs first, but if I understand correctly this is about real-time translation, a.k.a. the main reason old interpreters deeply suck)
MS can’t easily just change the API. A huge amount of third party software depends on it, and it is tied fairly closely to existing (and future) hardware.
You must realize that graphics hardware is designed around the requirements of whatever the next version of DirectX is, and DirectX is tied fairly closely to the hardware. It isn’t developed in a vacuum; hardware manufacturers are deeply involved in the spec, and if Microsoft were to start changing things willy-nilly just to irk a handful of Linux users, AMD and nVidia would probably start supporting OpenGL more. Microsoft surely wants that far less than it minds DirectX existing on Linux.
As for speed, no interpretation is involved.
IIRC it works like this: the API is written as a state tracker on Gallium, and Gallium is in turn implemented for the specific hardware. Since DirectX is being implemented via Gallium3D, it is a fully native API.
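As a rough illustration of that layering (all class and function names here are hypothetical, just to show the shape of the design – this is not actual Gallium code):

```python
# Hypothetical sketch of Gallium3D-style layering: state trackers present
# an API (OpenGL, Direct3D) on top of one common internal interface, and
# each hardware driver implements only that interface.

class GalliumPipe:
    """The common internal driver interface (stands in for Gallium's pipe)."""
    def draw(self, vertices):
        raise NotImplementedError

class RadeonPipe(GalliumPipe):
    def draw(self, vertices):
        return f"radeon drew {len(vertices)} vertices"

class NouveauPipe(GalliumPipe):
    def draw(self, vertices):
        return f"nouveau drew {len(vertices)} vertices"

class D3DStateTracker:
    """Presents a Direct3D-flavoured API on top of any GalliumPipe."""
    def __init__(self, pipe):
        self.pipe = pipe
    def DrawPrimitive(self, vertices):
        return self.pipe.draw(vertices)

class GLStateTracker:
    """Presents an OpenGL-flavoured API on top of the same pipe."""
    def __init__(self, pipe):
        self.pipe = pipe
    def glDrawArrays(self, vertices):
        return self.pipe.draw(vertices)

# Both APIs reach the same native driver; neither is translated into the other.
d3d = D3DStateTracker(RadeonPipe())
gl = GLStateTracker(RadeonPipe())
print(d3d.DrawPrimitive([1, 2, 3]))   # radeon drew 3 vertices
print(gl.glDrawArrays([1, 2, 3]))     # radeon drew 3 vertices
```

The point of the sketch: adding a new API is one new state tracker, and every driver that implements the pipe interface gets it for free.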
Edited 2010-09-22 06:12 UTC
That is not true. If it were true, all cards would provide a common hardware interface, and Windows would need only one generic “DirectX device” driver, like for a USB mouse, rather than several drivers.
Designing hardware for common requirements is not at all the same as designing hardware to a common interface.
Example: Direct3D 11 needs feature X.
ATI implements feature X with implementation Y.
nVidia also implements feature X, but believes implementation Z is better suited to the rest of its graphics pipeline.
DirectX is agnostic to the actual implementation, as long as the feature returns the same results. This allows hardware manufacturers to tune their hardware in different ways.
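The implementation-agnostic point can be sketched with a deliberately trivial stand-in for “feature X” (nothing here is real D3D; it just shows a spec that fixes results, not internals):

```python
# Two hypothetical vendors implement the same "feature X" (here: summing
# 1..n) with completely different internals. The spec only constrains the
# observable result, so both are conforming.

def feature_x_vendor_a(n):
    total = 0
    for i in range(1, n + 1):   # straightforward iterative implementation
        total += i
    return total

def feature_x_vendor_b(n):
    return n * (n + 1) // 2     # closed-form implementation, same contract

# The "conformance test" only compares results, never internals:
for n in (0, 1, 10, 1000):
    assert feature_x_vendor_a(n) == feature_x_vendor_b(n)
print(feature_x_vendor_b(10))   # 55
```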
The software in question is a “thin layer over the Gallium 3D driver layer”. In turn, the Gallium3D driver is a part of the Linux kernel (which includes hardware drivers either directly or as kernel loadable modules). None of this is in any way a copy of any Microsoft software. There is no “copy and paste” here, this is a new implementation of the Direct3D API. Therefore, no copyright infringement can occur.
MS won’t break the API unless they wish to break games.
How so?
It is a software layer interfacing via an API with client programs on one side, and with different physical hardware cards on the other via a common intermediate shader language. I can’t see how that makes it a speed-challenged interpreter.
http://www.phoronix.com/forums/showpost.php?p=148860&postcount=21
http://en.wikipedia.org/wiki/Gallium3D
In the sense of it being a layer between the graphics API and the operating system, this Direct3D state tracker for Gallium3D is no different in principle from Direct3D software from Microsoft sitting between the Direct3D API and the operating system (in that case Windows). Indeed, it isn’t really any different from an OpenGL state tracker, apart from presenting a different API to client programs.
This new Gallium3D state tracker implementing a Direct3D API on Linux is equivalent functionality to, but entirely different code from, the Direct3D API driver software for Windows.
http://www.phoronix.com/forums/showpost.php?p=148848&postcount=13
I hope this information helps.
Edited 2010-09-22 07:07 UTC
Can’t Microsoft patent some mechanism used in Direct3D, e.g. tessellation? Is it really only the implementation that can be patented?
Indeed. Owning the proprietary spec, they can, however, make the API much more complex at each release, meaning that Linux players won’t be able to play the latest games for some time. Or they can tie it to Windows-specific technology like .net as much as possible. Kinda the same problem as reimplementing the Flash spec.
Sorry, I meant the part of the G3D project which works on reimplementation of D3D
Can’t Microsoft patent some mechanism used in Direct3D, e.g. tessellation? Is it really only the implementation that can be patented?
Tessellation is too general a method to be patented. It’s not even a specific algorithm, it’s just the act of doing something. Think of building a house: you can patent a specific way of building a specific kind of house, but you can’t patent the whole idea of building a house. So no, Microsoft can’t patent it.
Indeed. Owning the proprietary spec, they can, however, make the API much more complex at each release, meaning that Linux players won’t be able to play the latest games for some time.
Of course that’s true, but D3D is aimed at the whole gaming and 3D graphics industry, so if Microsoft were to add unnecessarily complex stuff to it, no one would use it and D3D would lose a lot of its popularity. Microsoft can’t just go and make D3D more complex like that. Besides, it would most likely also slow it down, so that’s yet another no-go.
Besides, even if DX12 were incompatible with DX11, that doesn’t mean the Linux folks would break anything: they’d just keep the existing DX11 layer and add a DX12 one alongside it, not replace it, and thus you’d be able to run both. But again, Microsoft would be insanely stupid to go that route. They’d just annoy billions of their users, and not only gamers but also high-end enterprise users.
Or they can tie it to Windows-specific technology like .net as much as possible. Kinda the same problem as reimplementing the Flash spec.
Tying it to something like .NET would be a better move from Microsoft’s point of view, but again, what benefit would that bring to DirectX developers or users? Would it be enough of a benefit to be worthwhile, or would it just slow DirectX down and add hurdles that turn developers away instead?
Stuff that’s complex to implement is not necessarily complex to use. Take malloc() as an example: incredibly handy and easy to use, but that certainly does not mean that memory management is a trivial thing to implement.
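As a toy illustration of that malloc() point (a hypothetical first-fit arena – nowhere near a real allocator, but with the same one-call interface hiding the bookkeeping):

```python
# A deliberately tiny allocator: callers see one simple malloc()-style
# call, while the implementation tracks a free list of (offset, length)
# holes in a fixed arena. Real allocators are vastly more involved;
# the interface isn't.

class ToyArena:
    def __init__(self, size):
        self.size = size
        self.free = [(0, size)]          # one big hole to start with

    def malloc(self, n):
        for i, (off, length) in enumerate(self.free):
            if length >= n:              # first-fit search
                if length == n:
                    del self.free[i]     # hole consumed exactly
                else:
                    self.free[i] = (off + n, length - n)
                return off               # "pointer" into the arena
        return None                      # out of memory

a = ToyArena(64)
p = a.malloc(16)
q = a.malloc(16)
print(p, q)   # 0 16
```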
What could prevent Microsoft from implementing things like this in D3D? It could take them some time, but they could keep the spec available only to trusted commercial partners during that time and only unveil it when the new release of D3D has reached beta quality…
Except with Vista, where the abuse was noticed, Microsoft and Intel have constantly made people buy more and more powerful hardware for tasks they could do on a PIII, and people never complained. Without the Windows monopoly, computers for office work could cost half of what they do now. Also, do you really think that Crysis being bloated prevented many players from buying it?
Indeed, they could, that’s what Microsoft did with DX10 on Vista.
Actually, many people love .net and C#. I don’t use it myself so I can’t tell why, but I’m sure that someone who uses it daily, like nt_jerkface, can tell you why that technology is so great. If Microsoft did it right, developers could actually enjoy the thing instead of seeing it as an unnecessary and anticompetitive hurdle.
Stuff that’s complex to implement is not necessarily complex to use. Take malloc() as an example: incredibly handy and easy to use, but that certainly does not mean that memory management is a trivial thing to implement.
Not necessarily, but likely. And I just don’t really know how they could come up with something that’s really useful yet so damn hard to implement that even people who have written a whole 3D acceleration stack and drivers from scratch wouldn’t be able to implement it.
Actually, many people love .net and C#. I don’t use it myself so I can’t tell why, but I’m sure that someone who uses it daily, like nt_jerkface, can tell you why that technology is so great. If Microsoft did it right, developers could actually enjoy the thing instead of seeing it as an unnecessary and anticompetitive hurdle.
Indeed, many people do love .NET and C#. But DirectX is supposed to be usable even from plain C/C++ applications, not just .NET applications, so tacking .NET on as a requirement for DirectX would not really be of any benefit per se. DirectX is perfectly usable in .NET applications as-is, so the only ways Microsoft could make .NET a requirement would be to either write DirectX itself in .NET or allow only .NET applications to use it. And that still wouldn’t change a thing.
Don’t know if it’s even likely; most low-level OS tasks are about putting a simple interface on top of a complex mechanism…
About things that are nice to use and hard to implement: well, let’s imagine something which automatically shades an object’s texture in order to make the object look like it has more vertices than it does (I think this is pretty much how bump mapping works). A simple interface would be to give two 3D models, one used for display work and one used for lighting. Hardware functionality is unchanged; all is done using shader languages. This is simple for devs and hardware manufacturers, but making the whole thing less compute-hungry on the implementation side would be quite hard.
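A minimal sketch of the lighting side of that idea (closer to normal mapping than classic bump mapping; the vectors are made-up values): lighting is evaluated against a detailed per-pixel normal while the geometry stays coarse, so a flat surface can look dented.

```python
# Toy diffuse (Lambertian) shading: the surface is geometrically flat,
# but a perturbed "normal map" normal changes the lighting result.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir):
    """Diffuse intensity: max(0, N . L) for unit-length vectors."""
    return max(0.0, dot(normal, light_dir))

light = (0.0, 0.0, 1.0)            # light shining straight down +z
flat_normal = (0.0, 0.0, 1.0)      # geometric normal of a flat quad
bumped_normal = (0.6, 0.0, 0.8)    # unit-length normal fetched per-pixel

print(lambert(flat_normal, light))     # 1.0  (fully lit)
print(lambert(bumped_normal, light))   # 0.8  (shaded like a slope)
```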
Well, I don’t know enough about .Net to go into technical discussions, but I’m sure it has some handy features that are likeable for game devs. Making all the examples in the Direct3D docs use these features could be a good start. It would be advertised as “making the DirectX docs cleaner, with code that’s easier to understand”. I can see some benefit for devs there.
Edited 2010-09-22 12:01 UTC
I disappear for 2 days and my name still appears in a thread related to .net. Hmmm.
Anyways the main problem with tying DX to .net is that all the main 3D engines are written in C++. These codebases are massive and MS would risk a major backlash if they tied their next console to .net. There is no way they would risk ceding those developers to Sony or Nintendo just to prevent a few Linux game ports.
But what they can do is tie their own game development environment to .net as they partially do with XNA. So while DX stays neutral the productivity increase of using .net offsets the financial benefit of targeting alternative platforms.
Codebases aside, there is no way that they would force people to use .net. A lot of games, especially from non-indie/bigger developers, would never use .net. It just does not provide the control over hardware necessary to optimize code (can you even put inline assembler in .net?).
It certainly is more elegant and easier to program in than C++, and thus is good for those who don’t have such strict performance requirements. And it is true that bad C++ code can run slower than .net, but I don’t think that game devs with budgets of millions would hire bad coders.
Wait, I have an issue here… Isn’t .Net as tied to C# as Cocoa is tied to Objective-C? I mean, can’t people use libraries from the .Net framework in C++ code? Why would tying DirectX to .Net result in forcing people to code in C#? C# is the preferred way of interacting with .Net, but AFAIK it’s not the sole way.
Edited 2010-09-23 13:20 UTC
Yes but then you lose the main benefits of .net.
Yes but if they tied it to .net they would make it a complete waste of time to bother with anything else.
The two main .net languages are C# and VB.net, and people who try to use VB.net past light-use scenarios can easily run into documentation limitations – not so much from MS, but from the .net world, which is built around C#. Note that it isn’t an either/or proposition: MS could tie in benefits for .net developers but leave the option of unmanaged calls.
You are right to be concerned about piggybacking on an MS API. Mono is frozen at .net 2.0 with some 3.5 extensions because of WPF, and it never attracted the number of developers they were hoping for. It is mostly used for in-house apps that have to be ported to Linux or OSX.
Improving the sound stack, Wine integration, and building some type of app store would do more to attract game developers. But what Linux and OSX really need is for web frameworks to improve so the OS is no longer an issue. Adobe supposedly has some major 3D additions for Flash in the works, so it might not be long before light 3D games such as The Sims are delivered in the browser. The heavy 3D is headed for consoles anyway; the under-25 crowd is pretty much ditching the desktop.
First, thank you for these explanations about .Net. It’s sad that this technology is tied to one specific OS. The more I read about it, the more it looks like the spiritual child of that Delphi wonder which I discovered and enjoyed so much when I was a child (it was also single-platform, which is why I had to leave it).
Hmmm, I’d rather say streamlining it. Currently there are too many APIs handling audio data on the free desktop, but together they form something rather nice and powerful once you set aside missteps like PulseAudio.
Integration with what? The desktop? Like the ability to copy-paste data from a Wine app, widget-look consistency, and things like that?
If you’re talking about monetizing apps in repositories, I think that Canonical are working on that one.
Well, web frameworks are not the only option. Local interpreted code can also do the trick: if Python and Java get bigger, the issue gets smaller. Also, I’m not sure that light 3D games are a real showstopper. To me, the real issue is with pro-oriented tools using non-standard file formats: Office, SolidWorks, Photoshop and Illustrator, things like that.
Say under 20 and we agree, but my personal experience is that the 20-25 crowd are still heavy desktop gamers. (Because there are still people who think that using a gamepad to play an FPS/RTS is pure heresy, and things like that.)
Edited 2010-09-23 18:12 UTC
http://lin-app.com/
The OS is not an issue at all for Linux and OSX. After all, as an example, out of the 2 million new pieces of malware apparently identified this year, 99% of it will be malware for Windows.
http://www.esecurityplanet.com/headlines/article.php/3903376/More-T…
I certainly know which OS is the issue IMO.
Edited 2010-09-24 02:23 UTC
That’s just a list of links, not an app store.
LOL that is not what I was talking about. Linux and OSX need a good 3D web platform to reduce porting costs for developers.
It is a list of links to commercial Apps for Linux, that you can buy, like in a store.
There are even games to be bought, and prices are listed. How about that.
Edited 2010-09-24 11:58 UTC
Well, I can see where this could be lacking…
-Inconsistent interfaces, since billing depends on the website you go to
-Requires the indie dev to make and maintain a shiny website for their game, instead of a simple developer’s page/blog as usual
-Insecure billing for the user, as you have to trust many third parties rather than one
-Complex billing for those who don’t have a credit card.
Indie developers are in the best position to use .net because they are less likely to need that level of optimization. It’s high-end console games that need to squeeze performance out of the system. Mobile games can use managed code, as we have seen with Java.
With .net you can’t do inline ASM directly but you can call unmanaged dlls.
For years I have seen claims of “you won’t be able to do that with managed code” and then a year later someone does.
Doesn’t really say anything about the performance of .net or managed code, since it basically only executes the game logic. I could likely write simple ‘indie’-style games (platformers, shoot-’em-ups, etc.) in interpreted BASIC and get great performance on today’s machines, as long as the graphics are hardware-accelerated. Hell, this JavaScript/HTML5 game runs great even on my oldest machine (P4, 2.4GHz) using Firefox 3.6.10 (which supposedly has slow JavaScript): http://www.phoboslab.org/biolab/
That cute little game took up half my cpu.
I would hate to see what this XNA game would do to the CPU if it were written in JavaScript:
http://www.youtube.com/watch?v=uc_bNFG3IQ4&feature=related
Whilst it may seem like Microsoft did this just to force people to upgrade, they had valid technical reasons to require Vista.
First, DX9 was pretty much an evolution of DX7; DX10 was quite a bit different architecturally and did not have backwards compatibility.
Second, DX10 required recent GPUs.
Third, in order to reduce maintenance they decided to target only Vista’s new driver model instead of maintaining two different versions of DX10. I don’t know how the GPU manufacturers felt about this, seeing as they would have been the ones having to develop pre-Vista drivers exposing DX10, which had no uptake from games or applications.
D3D merely exposes hardware features in a consistent way.
I know you listed it only as an example, but hardware tessellation specifically existed on the Radeon 2, and was available through ATI’s extensions to DirectX 6 (maybe 7?) and OpenGL 1.2. It was later removed because it was rarely used. The concept of tessellation has been implemented for a long time. Note that the computer-generated landscape for the Genesis sequence in “ST II: The Wrath of Khan” used a form of iterative tessellation plus fractals, and I’d bet tessellation algorithms have changed little since then.
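For what it’s worth, the iterative-tessellation-plus-fractals idea can be sketched as one-dimensional midpoint displacement (a toy version with made-up parameters, certainly not the actual Genesis-sequence code): each pass subdivides the line and nudges the new midpoints, so detail grows with every tessellation step.

```python
import random

# Midpoint displacement: repeatedly split each segment at its midpoint
# and displace that midpoint by a random amount that halves each pass.

def midpoint_displace(heights, passes, roughness, rng):
    for p in range(passes):
        out = []
        amp = roughness / (2 ** p)            # shrink displacement each pass
        for a, b in zip(heights, heights[1:]):
            mid = (a + b) / 2 + rng.uniform(-amp, amp)
            out += [a, mid]                   # keep left point, add midpoint
        out.append(heights[-1])               # keep the final endpoint
        heights = out
    return heights

terrain = midpoint_displace([0.0, 0.0], passes=4, roughness=1.0,
                            rng=random.Random(42))
print(len(terrain))   # 17 points from 2, after four subdivision passes
```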
As for making it more complex at each release, that would be counterproductive. It would make it harder to implement for their hardware partners (nVidia, AMD, and Intel) and harder for programmers to use, encouraging a switch to OpenGL. If anything, Microsoft would embrace a quality port of DirectX, because it would further strengthen their influence on hardware makers as well as graphics developers.
Instead of leveraging DirectX to kill Linux, they’ll use DirectX on Linux to encourage more Windows development. This avoids bad PR, makes them appear more open, and attracts developers, all while avoiding damage to their own product.
I think you’re just getting hung up on the fact that it is a Microsoft technology, and thus inherently evil.
So hardware vendors fully control D3D features in the end, and Microsoft can’t use its control of the D3D spec in any way? They can’t block some hardware features, or make them easier to implement in one specific way? That sounds strange. How is the association of software and hardware vendors around D3D and OpenGL a partnership, then? Why don’t the software vendors tell the hardware vendors to write the spec themselves?
There isn’t a single kind of complexity, again. As malloc() shows, it’s possible to create features that are complex to implement for OS vendors but require little to no work from hardware manufacturers and programmers. Consider that memory allocation does not even require memory protection or linear address translation to work; those hardware features just make it safer and more efficient. Can’t MS introduce *that* kind of complexity in D3D safely?
Again, how is such influence useful if D3D is simply a mirror of hardware capability? What would be the benefit for them?
Also, could you explain how helping DirectX get ported to other OSs encourages Windows development? To me, it seems like Microsoft losing an exclusive advantage over its competitors.
Not exactly. I’m more generally cautious about big companies. Being companies, their first goal is to make money in a market of limited size, no matter how, which is a proven source of morally questionable actions. Being big, they have much power, and such power has great abuse potential. My (maybe paranoid) position is hence that one should look for every possible way things can go wrong when they are around, before admitting that everything is actually okay.
Then, of course, there are some companies which I *like* more than others, due to positive or negative experience with them, which means that I’m more or less cautious about them. As an example, I’m much more cautious about Microsoft and Apple than I am about Nokia, Samsung, or Adobe, because I’ve seen more crap coming from the former than the latter.
Edited 2010-09-22 09:37 UTC
If a feature is embedded in hardware, then anyone who legally purchases the hardware has a license to use the features embedded within it.
There is no requirement that a particular brand of software (from another company) must be used to access the features of the card. The purchase contract for the card is a contract between the card manufacturer and the buyer of the card.
For a third-party software vendor to try to interfere in that contract between hardware vendor and hardware purchaser would be an act of tortious interference, would it not?
http://en.wikipedia.org/wiki/Tortious_interference
For the hardware vendor to require that the card purchaser must use only the products of a third party would be a case of product bundling, would it not?
http://en.wikipedia.org/wiki/Product_bundling
Surely you wouldn’t suggest that Microsoft would indulge in illegal practices such as these?
OpenGL is an API that is available on every platform. On Windows it has abysmal performance, but everywhere else it is fine.
Until now, Direct3D was a Windows-only API.
If a software developer wrote code to the Direct3D API, their code was chained to Windows.
OTOH, if a software developer wrote code to the OpenGL API, then their code can run on a vast array of platforms with only a re-compile. It perhaps wouldn’t run that well on Windows, but that is a Windows problem, not an OpenGL problem.
Now, however, if Gallium3D becomes a commonly available driver for Linux, then code written to the Direct3D API would suddenly become far easier to port to non-Windows platforms via a relatively painless re-compile.
This expansion of the market into which developers could feasibly sell their software would make it more viable for them to write software to the Direct3D API.
Why did that get modded down?
I’m not kidding here or pulling anyone’s leg.
http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-D…
It is not just me who points out facts like these. I do have backup.
You maintain that OpenGL performance on Windows is abysmal. The last OpenGL game I played on both Linux and Windows seemed fine; please see
http://www.anandtech.com/print/1509
I realize it is dated (2004), but this was the last “big” game that I remember playing on Linux (maybe Quake 4, but it should be the same engine). The link shows that the Windows version is consistently faster than the Linux one, sometimes by almost 2x. It would be surprising to find that performance has since dropped so far as to qualify as ‘abysmal’.
Do you have some data to show that performance on Windows is so bad?
Reading up on this matter, it appears that I have misunderstood. Microsoft produced at one time some previews of Vista that had abysmal OpenGL performance.
http://blog.wolfire.com/2010/01/Why-you-should-use-OpenGL-and-not-D…
This was a hot, hot topic at the time.
However, Microsoft apparently backed down:
Anyway, if Microsoft wanted so badly for everyone to believe that OpenGL performance on Vista, and subsequent Windows versions, would be abysmal, why should I not have believed them?
Indeed, the OpenGL performance provided by Microsoft for Vista was abysmal, and only action by hardware providers to create fast installable client drivers (ICDs) that restored native OpenGL support fixed the issue.
Who am I to spread a message contrary to what Microsoft marketing wanted to spread about OpenGL on Windows?
Edited 2010-09-22 13:57 UTC
Uhm, drivers that allow direct support and interface to your specific video-card hardware are faster than some generic (either translated to D3D or entirely in software) Microsoft provided drivers? Well, of course.
Just like the Mesa software renderer in Linux is slower than the NVidia/AMD binary blob drivers.
Also, the design that they were proposing there never made it to a final build, so only a small fraction of users (the beta-testers, likely) would have seen this effect.
Lastly, it’s telling that you believe things that Microsoft says, if they happen to coincide with the narrative you like most. Would you additionally repeat the Microsoft marketing that they have the fastest/most secure/easiest to use operating system?
Despite your attempt at sarcasm, Microsoft did actually release one or more builds of Vista with abysmal OpenGL performance compared to that available on XP. Why mess with it? Why cripple OpenGL and not DirectX? Only the provision by graphics hardware OEMs of installable client drivers (ICDs) that restored native OpenGL support fixed the issue.
Although Microsoft did provide Windows with abysmal OpenGL performance, they have never delivered any version of Windows which was “the fastest/most secure/easiest to use operating system”. Even Microsoft marketing have never had the chutzpah to claim that they had the most secure operating system; rather, they claimed that each new version of Windows was “the most secure Windows ever”.
Now that claim I can almost believe … despite the fact that rates of malware infections of Windows systems have only ever gone up.
http://www.finextra.com/community/fullblog.aspx?id=1385
http://www.windowsitpro.com/article/security/35-percent-of-infected…
http://www.zdnet.com/blog/security/report-48-of-22-million-scanned-…
http://www.crunchgear.com/2010/04/20/symantec-51-percent-of-all-mal…
http://www.gss.co.uk/news/article/7772/Windows_malware_dwarfs_other…
http://www.gss.co.uk/news/article/7789/Is_Stuxnet_the_%27best~*…
25%, 35%, 48%, 51% … Oh dear.
99% of 2 million pieces of new malware in 2010. Oh dear oh dear.
I never said that they didn’t release any builds with this design; I stated that it never made it to the final build, the one that the average end-user encounters. I don’t know how I could have been any more explicit about that.
My final point was one to accentuate your clear bias against Microsoft, which makes your arguments weak.
See how you dug around for those links to support the insecurity of Windows? If you had used that energy to validate your “abysmal performance” claim (based on a statement made about a pre-release of Vista, which never materialized), then your bias would be less obvious.
Direct3D and hardware mirror each other. They are developed in tandem. Direct3D is tailored to the hardware at the same time the hardware is tailored to Direct3D. There is give and take with all parties during development of the next version of the spec. This is how it used to be with OpenGL, before the ARB became so broken and OpenGL fell so far behind that it could only play catch-up with Direct3D.
*facepalms*
Okay, so, hardware vendors don’t control the spec. Hardware and Direct3D are developed in tandem.
Microsoft listens to software developers when they request feature X, and listens to hardware developers when they suggest that feature Y is feasible and useful. While Microsoft has the final say, it means nothing without participation from software developers and hardware developers.
The reason why hardware vendors don’t develop a spec in a vacuum, without feedback from software developers, is because the hardware vendors want to court software developers to target their hardware, thus increasing sales. Real innovation in graphics is a product of collaboration between all parties involved.
When a hardware vendor doesn’t pay attention to what software vendors really want, you get something like this:
http://en.wikipedia.org/wiki/Voodoo_5
Developers weren’t clamoring for motion-blur or even anti-aliasing yet. They wanted more than just a high fillrate. NVidia listened to what they wanted, and delivered Hardware transform & lighting, plus good 32-bit performance. ATI followed suit.
I won’t attempt to address your comments on added complexity, as I originally took them to be a suggestion of Microsoft adding artificial complexity just to make the lives of Linux D3D developers more difficult. Now, I’ve got no idea what you’re getting at.
And as for how Direct3D on Linux could encourage Windows development: well, OS/2 ran Windows 3.1 programs pretty well. As a result, instead of making OS/2 native apps, developers targeted the Windows API. As the Windows API evolved away from what OS/2 could run, developers moved with it.
Plus, if Linux-only developers start targeting Direct3D, they may start thinking, “well, hell, I can probably get my code to run on Windows pretty easily now.”
Of course, it also applies the opposite way, and everybody benefits.
You did not previously mention any patents.
Your quote: “..because actually implementing a full up-to-date D3D layer which people can use would be nearly impossible, due to… –Copyright infringement”
Methods are the subject of patents … which is to say that one can be awarded a patent on a method of doing something or other. In order for another party to avoid your patent, they would need to come up with a different method of doing that same thing.
Note that this does not mean that they cannot do that thing; it means only that they may not use the same method to do that thing. E.g. one company might get a patent for a headache tablet using Panadeine … but that would not prevent another company making a headache tablet where the active ingredient was ibuprofen.
Nor will the games themselves be written for the latest, more complex API release for some time.
Also, the objective of an API is to make it easier for programmers to interface to the hardware, in order to unleash the power of the hardware for people to enjoy.
Surely you aren’t suggesting that Microsoft should write deliberately obscured APIs for the sole purpose of making it harder for competition, rather than making it better for their customers?
I’m shocked, truly shocked!
Scandal! Surely you cannot believe Microsoft would be so evil?
Sorry, I meant the part of the G3D project which works on reimplementation of D3D
You mean this state tracker. Meh. Direct3D is just an API. It isn’t that hard to pass parameters between a calling program and the graphics hardware. If Microsoft made it wildly complicated, why would developers want to use the next DirectX 12? Why wouldn’t they just stick with DirectX 11?
BTW: Direct3D is only an API. Tessellation would be done in the hardware.
Now when I build my Linux machine, generally I put it together from parts … I buy a blank hard disk, a motherboard, CPU & case, optical drive, keyboard, screen, mouse, RAM and graphics card. I assemble it all together, I put a Linux liveCD into the optical drive, and away I go, ready to run in about 20 minutes after that.
I can generally get the functional equivalent of a Windows 7 machine up and running this way for about a third of the cost of a store-bought Windows 7 machine of similar specifications.
The point is this: note that I purchased the graphics card. I therefore already have a license to use the graphics card.
http://en.wikipedia.org/wiki/Implied_license
Therefore, because I have an implied license to use any Intellectual Property embodied within my legally-purchased graphics card, I may now use any tessellation features embedded within my graphics card.
I may even invoke such features of my graphics card via a Gallium3D state tracker software implementing the Direct3D API, if it so pleases me.
Wine doesn’t recompile everything. In a nutshell, it loads x86 binaries in the PE format into a Linux process and provides them with a reimplementation of the Windows API.
There are other reasons why things are slow. Direct3D is reimplemented by Wine on top of OpenGL, but there are various data format differences between the two APIs that require time-consuming buffer conversions on the CPU. Additionally, shaders have to be recompiled as OpenGL shaders.
This causes a lot of complexity and overhead that an implementation of Direct3D on top of Gallium doesn’t have to deal with.
I don’t think that they can simply do that. At the assembly language level, if, say, Windows uses INT 1 for something and Linux uses it for something else, running Windows programs without some kind of recompilation (I think the proper term is binary translation) would require changing the behavior of INT 1, something which non-privileged software like Wine clearly can’t do.
Same for loading PE, I don’t think that Wine can do that on its own if the Linux kernel does not help. It probably has to create an ELF copy of the program at some point.
AFAIK, that’s why the startup performance of Wine software is so poor, by the way. Wine does not cache the translated program; it re-does the translation each time.
I have some difficulty understanding how that can possibly work. How do things work at the hardware level? How are D3D and OpenGL normally interfaced with the hardware?
Same for loading PE, I don’t think that Wine can do that on its own if the Linux kernel does not help. It probably has to create an ELF copy of the program at some point.
Why would WINE need some special privileges to load up a file? The fact that it happens to be executable code doesn’t change in any way the fact that it’s a file; it can be read into memory. WINE just parses the PE header, sets up the memory accordingly, and reads the contents of the file there. There is nothing special about it.
AFAIK, that’s why the startup performance of Wine software is so poor, by the way. Wine does not cache the translated program; it re-does the translation each time.
Nope. WINE does not translate the code in any way.
Well, I see an issue with application software loading other programs all by itself: memory protection. How can things like DEP/NX be enforced if software is not loaded by the operating system?
Indeed, WINE does not work unless you give it permission to bypass DEP/NX. By default Linux doesn’t enforce such restrictions, but e.g. in Fedora you specifically need to allow WINE to execute code in arbitrary memory locations.
I’m pretty sure neither Windows nor Linux programs use INT to make system calls directly. There’s always a library or DLL of some sort sitting between the application and the kernel, which is what Wine reimplements.
As for loading a PE executable, the only help needed from the kernel is the ability to mess around with the layout of the process address space, mark pages as executable, etc., and there’s an API for that.
Yeah, I checked my reasoning and found out where I was wrong. INTs are only required for on-demand library loading; if it’s written in the PE headers that the program requires library X to be loaded, no INT is required in the program itself…
Then I’m puzzled: why in hell is Wine’s startup so slow? I thought I had finally found out u___u
Okay, so user-mode apps are provided with an API to manage memory protection themselves? I’m not comfortable with such a design decision, but it indeed makes things simpler for Wine.
When you load an application using Wine, you have to also load Wine itself. All those .dll files (or .so files – whatever) that Wine provides aren’t being used by any Linux app, so they have to be pulled off the disk the first time they’re used. It also needs to start some background processes to handle IPC between Windows programs.
Think of it this way – you have to boot up a copy of Windows before you can run any Windows applications.
Processes have (nearly) complete control over their own address space in Linux. That’s how the dynamic linker (which isn’t part of the kernel – it’s part of glibc) works. You can map the contents of a file to any address you want, as long as it’s not already in use.
It has no effect on other processes, and can not be used to bypass memory protection. The worst a process can do is crash itself.
Thanks, this is a good explanation.
Consider DEP/NX. As a general rule, memory regions should not be writable and executable at the same time by normal user processes, as that possibility is often exploited by malware and almost never used by normal software. However, loading a PE executable without some help from the kernel requires exactly that, as WereCarf pointed out.
One of the reasons for Wine’s slow startup is that on a real Windows system many of the core libraries would already be loaded and initialized during bootup, because other system components require them, whereas under Wine everything needs to be loaded.
And of course userland programs have the ability to manage memory protection to some degree; the program itself has a better idea of its own memory layout than the kernel does.
Patents and copyright are two totally different beasts; how on earth can they copy code when Microsoft has never made the source code open to the world? Now, for patents you may be correct, but many of these patents are so generic that OpenGL is also impacted by them – so whatever is implemented in OpenGL will also cross over to the DirectX world in terms of patents. IIRC, Microsoft owns several OpenGL patents that they bought off SGI many years ago.
APIs cannot be copyrighted or patented, AFAIK.
MS does not break the DirectX/D3D APIs every year. If they did, I wouldn’t be able to run old DX7 and earlier games on DX9. Heck, if they did I couldn’t even run yesteryear’s games this year. This is obviously not the case.
AFAIK, copyright law comes into play when a later work includes major elements of an earlier copyrighted work.
http://en.wikipedia.org/wiki/Derivative_work
So, in order for an implementation of Direct3D on another non-Windows platform to be in trouble with regard to copyright law, the authors of the new non-Windows Direct3D implementation would need to have copied (stolen) a major chunk of Microsoft’s copyrighted source code for Direct3D on Windows and pasted it verbatim into the new work.
So, as to the question of a Direct3D state_tracker for Gallium3D on Linux being a copyright violation in some obscure manner? Not a chance in hell.
Wine does NOT recompile programs. Wine is more like a PNG viewer for Windows executables: it opens them and executes their contents. Nothing more, nothing less. It does not emulate anything. It simply takes Windows calls and delivers them to Linux, then back to the program. There is no startup recompile when you run a Windows app in Wine.
Indeed. However, Wine does provide the equivalent of the Windows system DLLs. To run a Windows binary under Wine, Wine must first load itself with its DLLs (via ELF), and then load the Windows binary code into RAM (via Wine’s .exe loader). Wine then hands the program’s entry point over for scheduling by the Linux kernel, to start execution of the program.
When the Windows program makes a system call, Wine intercepts the call and does one of two things: Wine either directs the call to one of Wine’s DLLs, or Wine translates the parameters of the call into a Linux system call. In either case, once the system call has completed, Wine translates the results into whatever format is the expected (Windows) return value for that system call, and then returns to running the Windows executable binary.
Even though, as you say, Wine does not recompile any binaries, Wine nevertheless does have extra overhead that is not required for that same binary executable running under Windows on the same hardware. The heaviest extra overhead is during program startup (i.e. loading the program plus Wine itself into RAM).
PS: Note that Direct3D support in Wine would currently probably be provided via a Wine DLL. This new Gallium3D state tracker for the Direct3D API would probably allow Wine to use a “translated call to a Linux system call” method instead of a Wine DLL.
Called this back when they announced DX10 and MS had recently talked about not allowing new versions of DirectX on XP: http://www.osnews.com/permalink?153301
Interesting all the down-vote hate I got at the time… especially since “disagreeing” with a statement wasn’t supposed to be justification for down-votes on OSNews