Google announced today that it would enable WebGPU support in its Chrome browser by default starting in version 113, currently in beta. In development since 2017, WebGPU is a next-generation graphics API that aims to bring the benefits of low-overhead APIs like Microsoft’s Direct3D 12, Apple’s Metal, and Vulkan to web browsers and other apps.
WebGPU support has been available but off by default in Chrome for a while now, because the API wasn’t finalized and things could break from update to update. Google says that Mozilla and Apple will eventually support WebGPU in Firefox and Safari, and browsers like Microsoft Edge and Opera that rely on the Chromium browser engine can presumably choose to switch it on just as Google has.
Chrome 113 supports WebGPU on Windows, macOS, and ChromeOS to start, with “support for other platforms” like Linux and Android “coming later this year.” This browser version should roll out to all Chrome users sometime in May.
I’ve never really needed any advanced 3D rendering in my day-to-day browsing, but that might just be a case of the chicken and the egg.
It’s not just for 3D. Like Vulkan, WebGPU can be used as a more widely supported alternative to OpenCL or CUDA for GPU compute.
If you have something that uses GPU acceleration, like Photoshop filters, WebGPU makes it far more practical to implement in the browser.
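To make that concrete, here’s a rough sketch of what a small compute job looks like with the standard WebGPU API. It’s TypeScript plus a WGSL shader; the doubleOnGpu helper and the trivial “double every element” kernel are made-up placeholders standing in for a real filter, and the type declarations would come from the @webgpu/types package.

const wgsl = `
  @group(0) @binding(0) var<storage, read_write> data: array<f32>;

  @compute @workgroup_size(64)
  fn main(@builtin(global_invocation_id) id: vec3<u32>) {
    if (id.x < arrayLength(&data)) {
      data[id.x] = data[id.x] * 2.0;   // the "filter": double each element
    }
  }
`;

async function doubleOnGpu(input: Float32Array): Promise<Float32Array> {
  const adapter = await navigator.gpu?.requestAdapter();
  if (!adapter) throw new Error("WebGPU not available");
  const device = await adapter.requestDevice();

  // One storage buffer holding the work data, one staging buffer for readback.
  const storage = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.STORAGE | GPUBufferUsage.COPY_SRC | GPUBufferUsage.COPY_DST,
  });
  const staging = device.createBuffer({
    size: input.byteLength,
    usage: GPUBufferUsage.MAP_READ | GPUBufferUsage.COPY_DST,
  });
  device.queue.writeBuffer(storage, 0, input);

  const pipeline = device.createComputePipeline({
    layout: "auto",
    compute: { module: device.createShaderModule({ code: wgsl }), entryPoint: "main" },
  });
  const bindGroup = device.createBindGroup({
    layout: pipeline.getBindGroupLayout(0),
    entries: [{ binding: 0, resource: { buffer: storage } }],
  });

  // Record the compute pass, copy the result to the staging buffer, submit.
  const encoder = device.createCommandEncoder();
  const pass = encoder.beginComputePass();
  pass.setPipeline(pipeline);
  pass.setBindGroup(0, bindGroup);
  pass.dispatchWorkgroups(Math.ceil(input.length / 64));
  pass.end();
  encoder.copyBufferToBuffer(storage, 0, staging, 0, input.byteLength);
  device.queue.submit([encoder.finish()]);

  // Map the staging buffer and copy the results back to the CPU.
  await staging.mapAsync(GPUMapMode.READ);
  const result = new Float32Array(staging.getMappedRange().slice(0));
  staging.unmap();
  return result;
}

Calling doubleOnGpu(new Float32Array([1, 2, 3])) should resolve to [2, 4, 6] on any browser that exposes navigator.gpu. No OpenCL or CUDA toolchain involved; if the page can run JavaScript, it can dispatch this.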
ssokolow,
Yes, that’s what I’m reading too. In a way it’s neat and I say why not. But this adds yet another GPU API for 3D and compute, and a new shader language, to an already crowded field. It makes me wonder whether developers will want to embrace WebGPU in large numbers. I can’t comment on the WebGPU API itself since I haven’t used it, but in my experience browsers are not the most robust or efficient technology when it comes to processing larger data sets. One crashed tab can bring down the rest of the tabs with it. Browsers also have very limited facilities: no file system, no TCP sockets, no forking processes, no OS/hardware telemetry, limited I/O, etc. Of course these limitations are built in for safety, which is good for a browser, but it does make the browser a bad target for certain types of applications. I don’t think I would target compute applications at a browser, but maybe I’d use it in node.js.
How long before WebGPU gets abused by websites seeking to use GPU resources for themselves, like a WebGPU-accelerated bitcoin miner or password cracker, haha.
There were already JavaScript Bitcoin CPU miners back when that was worth doing, so the answer is already there: Run something like an ad-blocker or JavaScript blocker and name and shame any site that does it.
Didn’t The Pirate Bay get in trouble for running an in-browser CPU miner to try to supplement their donations? …or was that another torrent site?
ssokolow,
I know this existed in the past; that’s why I bring it up. GPU-based web miners could enable a resurgence. Maybe throw one into one of the many .io games that kids play for hours. They could get the ball rolling quickly with pay-per-click ads.
I don’t remember specific sites that used it.
I am not going to link to them, but a search pulled up several companies that let you “earn money with your website” by running their JS miners. Plugins for WordPress/Drupal/etc. already exist. These will have a fairly big incentive to adopt WebGPU, especially as it becomes enabled in browsers by default.
I wonder if Google has given any thought to giving users a clear way to monitor and block WebGPU? I doubt users will have any idea what’s going on. If abuse is prevalent, WebGPU could earn a reputation for being the next Flash. Hopefully not, but the risk of abuse seems plausible.
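For what it’s worth, the whole API hangs off a single navigator.gpu entry point, so detection (and crude blocking) is at least technically possible. This is only an illustration of what a user script or extension might do, not an actual Chrome setting:

// Detect whether the page can see WebGPU at all.
if ("gpu" in navigator) {
  console.log("WebGPU is exposed to this page");
}

// Crude kill switch: shadow the prototype getter with an own property
// before page scripts run (e.g. from a content script injected at
// document_start). Whether this survives every code path is untested.
Object.defineProperty(navigator, "gpu", { value: undefined });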
I’d be more concerned about what can be done with service workers. That sort of nonsense and the lack of any kind of per-site permissions UI is why I disable them globally in about:config with dom.serviceWorkers.enabled=false, to ensure that, in combination with extensions to control expiry of web storage, when I close a bleeping tab, it stops consuming resources.
Could it be used by websites to mine crypto? THAT is the question I want answered.
If the web is supposed to replace native OS-specific apps, it has to be able to do anything the native apps can do. For example, it has to do OpenGL to run OpenGL games:
http://xproger.info/projects/OpenLara/
Same for Vulkan games.
kurkosdr,
That’s sort of what WebGL did, for better or for worse. WebGPU goes in the opposite direction: build a new API from the ground up that works more naturally with JavaScript. The obvious negative is that a new standard clearly increases the risk of long-term fragmentation and more portability work for developers contending with dissimilar APIs that lack 1:1 feature parity.
Honestly my feelings about this are mixed. I can’t say I like OpenGL as an API. It’s been continually extended over the years, but I find it awkward, the many versions of the spec are confusing, and I don’t feel it’s aged well as a modern API. At the same time though it’s played an undeniably crucial role in standardization between diverse hardware and platforms that no other graphics API has. As a developer and user, it makes a lot of sense to support a universal standard and it seems unlikely to me that WebGPU will broadly replace OpenGL for software outside the browser. So I’m predicting the result will be using one API for the browser, and one or more for native software.
In theory, engines may get ported to both APIs, though I’m not sure how they’ll handle shaders. For example, think about a game world where resources, including OpenGL shaders, are downloaded on the fly from a server. Things could get more difficult if you have to support both OpenGL and WebGPU clients (see the rough sketch after the spec link below). I’ll need to read up more on this.
https://www.w3.org/TR/WGSL/
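Here’s a rough sketch of the duplication I mean, assuming nothing beyond the two shading languages themselves: the same do-nothing “solid red” fragment shader written once as GLSL ES for WebGL clients and once as WGSL for WebGPU clients. A server streaming shaders would have to ship both, or translate on the fly with a tool like Tint or naga.

// GLSL ES 3.00 version, for WebGL 2 clients.
const glslFragment = `#version 300 es
precision mediump float;
out vec4 color;
void main() {
  color = vec4(1.0, 0.0, 0.0, 1.0);   // solid red
}`;

// WGSL version of the same shader, for WebGPU clients.
const wgslFragment = `
@fragment
fn main() -> @location(0) vec4<f32> {
  return vec4<f32>(1.0, 0.0, 0.0, 1.0);   // solid red
}`;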
I said “one API for the browser”, but it’s premature to assume WebGPU completely replaces WebGL. They may both have to stick around to not break existing content. Obviously a lot of existing sources won’t get ported to WebGPU.
I don’t have equipment to test these, but I also came across the browser standards for VR.
https://en.wikipedia.org/wiki/WebXR
The API deals with both VR input and output. The framebuffer portions of the standard specify WebGL specifically; WebGPU hasn’t made it into the standard yet.
https://github.com/immersive-web/WebXR-WebGPU-Binding/blob/main/explainer.md
I wanted to check the status of even older 3D web technologies. To my surprise, VRML and X3D resources, which date back to the 90s, work today in Firefox, albeit via a script that renders the models with WebGL.
https://create3000.github.io/x_ite/
With the move to performance-per-watt everywhere, OpenGL is quickly becoming a niche for CAD applications and the like that absolutely need a high-level API, and low-level graphics APIs like Vulkan are becoming the new norm. The web can either follow or cede that area to native apps.
Eventually, when the minimum requirements of modern games reach the point where the lowest supported graphics card has Vulkan support, engines will ditch OpenGL like they ditched Glide. One less thing to maintain.
BTW another benefit of WebGPU is support for GPU compute tasks.
kurkosdr,
Between this and your original post, it’s unclear if you think OpenGL is important to support going forward or not.
Yes. The primary selling point for browsers has always been ease of deployment, and that theoretically applies here too. But very often compute applications are long-running background tasks that need to process job queues with access to a file system to save output. Many of these GPU compute daemons have an HTML browser interface, which is fine, but would you really want the compute daemon itself running in the browser? I don’t fancy that setup. Even if I could host compute applications from a browser, IMHO hosting through node.js would be a better choice.
If intensive GPU background tasks are allowed in the browser by default, it’s a potential Pandora’s box, as discussed in the thread with ssokolow above.
It’s important for browsers to keep supporting WebGL (and for OSes to keep supporting OpenGL) to maintain backwards compatibility; it’s the same reason Windows still supports DirectX 9.0c. But expect modern game engines to eventually drop support for OpenGL and DirectX 11 (like they dropped support for DirectX 9.0c). The web has to either follow along or cede that area to native apps.
Also, OpenGL will remain a thing in CAD and visualization where performance is not that important.
Keep in mind that background tasks can mean applications like Handbrake using GPU compute to accelerate conversion or a paid content service providing an HEVC or VVC decoder that uses GPU compute to decode the bitstream.
kurkosdr,
I understand what you are saying, but OpenGL still seems to be the best API to use if you want greater portability. It would be nice if everyone (Google, Microsoft, Apple, Mozilla, etc.) would just settle on one API to supersede OpenGL everywhere instead of fragmenting the landscape. This isn’t the way things work though, oh well.
I’d like to see comparative benchmarks between the technologies, I haven’t seen any yet.
Alfman, Vulkan was supposed to be that one API… Apple refused to implement it, and Microsoft decided to make a competitor that you have to use if you want to target the Xbox.
You also need both Vulkan and WebGPU for the same reason you need both native machine code and WebAssembly. Vulkan needs low-level primitives so it can be used to implement the WebGPU runtime, and WebGPU is designed so that it fundamentally lacks un-sandboxed primitives, so that only bugs in the runtime or underlying platform can be used to break out.
In effect, Vulkan is designed to support everything a given piece of hardware supports A.S.A.P., while WebGPU is designed to only support what has been audited for security at a given point in time.
ssokolow,
Yeah…sigh.
Ideally the hardware itself would offer foolproof security rather than software. Some enterprise GPUs offer a virtualization capability that could be pretty handy for web browsers too, but I’m not sure whether this is available on consumer-grade hardware. As long as isolation is maintained by the hardware (i.e. one graphics or compute context can’t affect any others), then in principle I’m ok with a software API that hands off security enforcement to hardware. But in practice, what’s the real risk of privilege escalation via GPU? I honestly don’t know.
Something else that hasn’t come up yet but deserves a mention: some beta games have accidentally been bricking GPU hardware.
“Another game blowing up NVIDIA GPUS?? Here we go again!”
https://www.youtube.com/watch?v=or7njUlYTLI
Affected titles will obviously update their code to avoid this. But the question remains: is it wise to enable WebGPU by default when it opens the opportunity for someone to craft a malicious payload that deliberately creates the conditions causing this catastrophic GPU failure? As the owner of hardware that’s affected by this, I am legitimately concerned.
I think it’s untenable to do anything except adopt a policy of saying that the hardware is defective and compelling the vendor to replace any that gets bricked before an OS-level patch can be deployed.
ssokolow,
I agree it shouldn’t happen, but the fact is it does, and none of them have issued recalls. I wish manufacturers were more forthcoming about the nature of the problem. Treating this as a warranty problem leaves consumers vulnerable after it expires. I also wonder if Google themselves would have any legal liability over making GPUs so easily remotely accessible to hackers without user consent. It’d sure make an interesting case.
This, again. If an API is involved as an intermediary between the app and your hardware, and commands sent to the API break the hardware, it’s a hardware or driver issue. But we all know the culprit here: hardware being pushed to its physical limits to extract a couple more FPS. We have cards hitting 90°C as if it’s no big deal. Ridiculous.
You can also amuse yourself with Dell’s and Nvidia’s unintentional humor below:
https://www.dell.com/community/Alienware-General-Read-Only/FurMark-GPU-and-Performance-Degradation/td-p/5625816
Blaming a perfectly valid application that is issuing perfectly valid OpenGL commands because your thermals suck. Nice work, Dell and Nvidia.
kurkosdr,
There are caveats, especially when you start to look at edge cases. For example, 9th–11th gen Intel CPUs won’t handle Prime95 pounding AVX-512 instructions. It’s an artificial benchmark designed to put immense stress on the CPU, and it’s too intense without underclocking. This benchmark failure affects the majority of those running high-end Intel CPUs. Practically all other applications work fine, though. So should everyone be underclocking or buying slower CPUs on account of Prime95? I can’t answer that; I’m just putting it out there that tuning for extreme loads like Prime95, which is designed to push the limits, will have negative tradeoffs for the average loads most people will be running.
I agree; at least ideally it should be able to handle stress tests without long-term degradation. But when you fabricate hardware there’s not a sudden jump from loads that are safe to loads that are “dangerous”; it’s a statistical gradient. We might compare it to driving a car: owners expect a car’s mechanics to just work, but a particularly aggressive driver makes a difference in wear and tear. Now we could argue the manufacturer is at fault and that the car should be built to higher specifications to handle very aggressive driving, but engineering for rare edge cases rather than the norm involves tradeoffs and might not be a good use of resources.
BTW, if you’ll recall, Intel “fixed” the Prime95 issue in Alder Lake CPUs by officially disabling AVX-512. They decided it was better to remove rarely used, high-intensity instructions than to compromise performance across the board. Was that the right choice? It depends who you ask…
My point isn’t to defend shoddy product manufacturing, but to highlight the complexity of the problem. Like you, I don’t like hearing a manufacturer say that intensive stress testing will wear hardware faster than it would otherwise, but it also seems harsh to criticize them for saying something that’s factually true.
If, in the future, browsers like Firefox and Chrome are able to use it on iOS, and it improves in a couple of other areas, then in my opinion developers will target it! Still, we must be aware that technologies such as Flash were killed off by companies like Apple and Google not only due to performance issues and a lack of control or standardization, but also to block easy access to third-party applications outside their respective stores. Hence it’s not in Apple’s interest for people to use their devices to visit some web page and play Fortnite or use TikTok without needing to install an application from their store, and without Apple taking a cut.
Here’s hoping those EU regulations force their hand.
Flash was killed because, well, yes, Apple never allowed it, but also because it was a festering pile of vulnerabilities. We cheered as Apple stabbed it repeatedly before setting it on fire. Back in the Macromedia days they had a source-sharing program that my company investigated, and the code was surprisingly compatible with ours, which was a pretty dang big red flag for anyone using it in a web browser. Anything that compatible with our festering pile of embedded junk, running on DOS and based on ANSI C, had no business running on a modern operating system interacting with unknown Flash files.