

Playing multimedia with Dillo

What if you want to use a web browser like Dillo, which lacks JavaScript support and can’t play audio or video inside the browser? Dillo doesn’t have the capability to play audio or video directly from the browser; however, it can easily offload this task to other programs. This page collects some examples of how to watch videos and listen to audio tracks or podcasts by using an external player program. In particular, we will cover mpv with yt-dlp, which supports YouTube and Bandcamp among many other sites. ↫ Dillo website The way Dillo handles this feels very UNIX-y, in that it will call an external program – mpv and yt-dlp, for instance – to play a YouTube video from an “Open in mpv” option in the right-click menu for a link. It’s nothing earth-shattering or revolutionary, of course, but I very much appreciate that Dillo bakes this functionality right in, allowing you to define any such actions and add them to the context menu.
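The hand-off itself can be as simple as invoking mpv with the link’s URL – mpv calls out to yt-dlp on its own for sites it recognizes. A minimal sketch of such a link action (the function name and the `--no-terminal` flag choice are illustrative, not Dillo’s actual plumbing):

```python
import subprocess

def open_in_mpv(url, dry_run=False):
    # mpv delegates to yt-dlp automatically for sites it recognizes,
    # so handing it the page URL directly is usually enough.
    cmd = ["mpv", "--no-terminal", url]
    if dry_run:
        return cmd  # let callers inspect the command without spawning mpv
    return subprocess.Popen(cmd)
```

Anything along these lines – a tiny wrapper the browser can call with the link as its argument – is all a context-menu action needs to be.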

Google, DuckDuckGo massively expand “AI” search results

Clearly, online search isn’t bad enough yet, so Google is intensifying its efforts to continue speedrunning the downfall of Google Search. They’ve announced they’re going to show even more “AI”-generated answers in Search results, to more people. Today, we’re sharing that we’ve launched Gemini 2.0 for AI Overviews in the U.S. to help with harder questions, starting with coding, advanced math and multimodal queries, with more on the way. With Gemini 2.0’s advanced capabilities, we provide faster and higher quality responses and show AI Overviews more often for these types of queries. Plus, we’re rolling out to more people: teens can now use AI Overviews, and you’ll no longer need to sign in to get access. ↫ Robby Stein On top of this, Google is also testing a new search mode where “AI” takes over the entire search experience. Instead of seeing the usual list of links, the entire page of “results” will be generated by “AI”. This feature, called “AI Mode”, is opt-in for now. You can opt in via Labs, but you do need to be a paying Google One AI Premium subscriber. I guess it’s only a matter of time before this “AI Mode” becomes the default on Google Search, because it allows Google to keep its users on Google.com, and this makes it easier to show them ads and block out competitors. We all know where this is going. But, I hear you say, I use DuckDuckGo! I don’t have to deal with any of this! Well, I’ve got some bad news for you, because DuckDuckGo, too, is greatly expanding its use of “AI” in search. DDG will provide free, anonymous access to various “AI” chatbots, deliver more “AI”-generated search results based on more sources (but still English-only), and more – all without needing to have an account. A few of these features were already available in beta, and are now becoming generally available. Props to DuckDuckGo for providing a ton of options to turn all of this stuff off, though.
They give users quite a bit of control over how often these “AI”-generated search results appear, and you can even turn them off completely. All the “AI” chatbot stuff is delegated to a separate website, and any link to it from the normal search results can be disabled, too. It’s entirely possible to have DuckDuckGo just show a list of regular search results, exactly as it should be. Let’s hope DDG can keep these values going, because if they, too, start pushing this “AI” nonsense without options to turn it off, I honestly have no idea where else to go.

What would happen if we didn’t use TCP or UDP?

At some point, I wondered—what if I sent a packet using a transport protocol that didn’t exist? Not TCP, not UDP, not even ICMP—something completely made up. Would the OS let it through? Would it get stopped before it even left my machine? Would routers ignore it, or would some middlebox kill it on sight? Could it actually move faster by slipping past common firewall rules? No idea. So I had to try. ↫ Hawzen Okay so the end result is that it’s technically possible to send a packet across the internet that isn’t TCP/UDP/ICMP, but you have to take that literally: one packet.
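For a feel of what “a made-up transport protocol” means at the byte level: the IPv4 header carries a one-byte protocol field, and values 253 and 254 are reserved for exactly this kind of experimentation (RFC 3692). A hedged sketch that hand-builds such a header – this is not Hawzen’s actual code, and actually sending the packet requires a raw socket and root privileges:

```python
import socket, struct

def ipv4_header(src, dst, proto, payload_len):
    # Build a minimal 20-byte IPv4 header by hand; proto is the
    # "transport protocol" field -- 253/254 are the experimental values.
    ver_ihl = (4 << 4) | 5          # IPv4, 5x 32-bit header words
    total_len = 20 + payload_len
    hdr = struct.pack("!BBHHHBBH4s4s",
                      ver_ihl, 0, total_len,
                      0x1234, 0,         # identification, flags/fragment
                      64, proto, 0,      # TTL, protocol, checksum placeholder
                      socket.inet_aton(src), socket.inet_aton(dst))
    # Internet checksum over the header, patched in at offset 10.
    s = sum(struct.unpack("!10H", hdr))
    while s >> 16:
        s = (s & 0xFFFF) + (s >> 16)
    return hdr[:10] + struct.pack("!H", ~s & 0xFFFF) + hdr[12:]

pkt = ipv4_header("192.0.2.1", "192.0.2.2", 253, 5) + b"hello"
# Sending it would look something like (root required):
#   s = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.IPPROTO_RAW)
#   s.sendto(pkt, ("192.0.2.2", 0))
```

The OS will happily emit this; whether anything past your first router forwards it is exactly the question the article answers.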

Chromium Ozone/Wayland: the last mile stretch

Let’s start with some context: the project consists of implementing, shipping and maintaining native Wayland support in the Chromium project. Our team at Igalia has been leading the effort since it was first merged upstream back in 2016. For more historical context, there are a few blog posts and this amazing talk, by my colleagues Antonio Gomes and Max Ihlenfeldt, presented at last year’s Web Engines Hackfest. Especially due to the Lacros project, progress on Linux desktop has been slower over the last few years. Fortunately, the scenario has changed since last year, when a new sponsor came up and made it possible to address most of the outstanding missing features and issues required to move Ozone Wayland to the finish line. ↫ Nick Yamane There’s still quite a bit of work left to do, but a lot of progress has been made. As usual, Nvidia setups are problematic, which is a recurring theme for pretty much anything Wayland-related. Aside from the usual Nvidia problems, a lot of work has been done on improving and fixing fractional scaling, adding support for the text-input-v3 protocol, reimplementing tab dragging using the proper Wayland protocol, and a lot more. They’re also working on session management, which is very welcome for Chrome/Chromium users as it will allow the browser to remember window positions properly between restarts. Work is also being done to get Chromium’s interactive UI tests infrastructure and code working with Wayland compositors, with a focus on GNOME/Mutter – no word on KDE’s KWin, though. I hope they get the last wrinkles worked out quickly. The most popular browser needs to support Wayland out of the box.

Dillo 3.2.0 released

We’ve got a new Dillo release for you this weekend! We added SVG support for math formulas and other simple SVG images by patching the nanosvg library. This is especially relevant for Wikipedia math articles. We also added optional support for WebP images via libwebp. You can use the new option ignore_image_formats to ignore image formats that you may not trust (libwebp had some CVEs recently). ↫ Dillo website This release also comes with some UI tweaks, like the ability to move the scrollbar to the left, use the scrollbar to go back and forward exactly one page, the ability to define custom link actions in the context menu, and more – including the usual bug fixes, of course. Once the pkgsrc bug on HP-UX I discovered and reported is fixed, Dillo is one of the first slightly more complex packages I intend to try and build on HP-UX 11.11.

Nepenthes: a dangerous tarpit to trap LLM crawlers

If you don’t want OpenAI’s, Apple’s, Google’s, or other companies’ crawlers sucking up the content on your website, there isn’t much you can do. They generally don’t care about the venerable robots.txt, and while people like Aaron Swartz were legally bullied into suicide for downloading scientific articles using a guest account, corporations are free to take whatever they want, permission or no. If corporations don’t respect us, why should we respect them? There are ways to fight back against these scrapers, and the latest is especially nasty in all the right ways. This is a tarpit intended to catch web crawlers. Specifically, it’s targeting crawlers that scrape data for LLMs – but really, like the plants it is named after, it’ll eat just about anything that finds its way inside. It works by generating an endless sequence of pages, each with dozens of links, that simply go back into the tarpit. Pages are randomly generated, but in a deterministic way, causing them to appear to be flat files that never change. Intentional delay is added to prevent crawlers from bogging down your server, in addition to wasting their time. Lastly, optional Markov-babble can be added to the pages, to give the crawlers something to scrape up and train their LLMs on, hopefully accelerating model collapse. ↫ ZADZMO.org You really have to know what you’re doing when you set up this tool. It is intentionally designed to cause harm to LLM web crawlers, but it makes no distinction between LLM crawlers and, say, search engine crawlers, so it will definitely get you removed from search results. On top of that, because Nepenthes is designed to feed LLM crawlers what they’re looking for, they’re going to love your servers and thus spike your CPU load constantly. I can’t stress enough that you should not be using this if you don’t know what you’re doing.
Setting it all up is fairly straightforward, but of note is that if you want to use the Markov generation feature, you’ll need to provide your own corpus for it to feed from. None is included, ensuring that every installation of Nepenthes is different and unique, since each user supplies their own corpus. You can use whatever texts you want, like Wikipedia articles, royalty-free books, open research corpora, and so on. Nepenthes will also provide you with statistics to see what cats you’ve dragged in. You can use Nepenthes defensively to prevent LLM crawlers from reaching your real content, while also collecting the IP ranges of the crawlers so you can start blocking them. If you’ve got enough bandwidth and horsepower, you can also opt to use Nepenthes offensively, and you can have some real fun with this. Let’s say you’ve got horsepower and bandwidth to burn, and just want to see these AI models burn. Nepenthes has what you need: Don’t make any attempt to block crawlers with the IP stats. Put the delay times as low as you are comfortable with. Train a big Markov corpus and leave the Markov module enabled, set the maximum babble size to something big. In short, let them suck down as much bullshit as they have diskspace for and choke on it. ↫ ZADZMO.org In a world where we can’t fight back against LLM crawlers in a sensible and respectful way, tools like these are exactly what we need. After all, the imbalance of power between us normal people and corporations is growing so insanely out of any and all proportions that we don’t have much choice but to attempt to burn it all down with more… destructive methods. I doubt this will do much to stop LLM crawlers from taking whatever they want without consent – as I’ve repeatedly said, Silicon Valley does not understand consent – but at least it’s joyfully cathartic.
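The “randomly generated, but in a deterministic way” trick boils down to seeding an RNG from the requested path, so the same URL always yields byte-identical “content” and looks like a static flat file. A toy sketch of the idea – the names and the tiny word list are invented, and this is not Nepenthes’ actual implementation:

```python
import hashlib, random

WORDS = ["crawler", "archive", "protocol", "kernel", "lattice", "nectar"]

def tarpit_page(path, n_links=12, n_words=40):
    # Derive the RNG seed from the path alone: identical requests get
    # identical pages, but every path leads to fresh links deeper in.
    seed = hashlib.sha256(path.encode()).hexdigest()
    rng = random.Random(seed)
    links = ["/%s/%s" % (rng.choice(WORDS), rng.choice(WORDS))
             for _ in range(n_links)]
    babble = " ".join(rng.choice(WORDS) for _ in range(n_words))
    return babble, links
```

Every one of those generated links points back into the same handler, which is what makes the maze endless.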

“The people should own the town square”

Mastodon, the only remaining social network that isn’t a fascist hellhole like Twitter or Facebook, is changing its legal and operational foundation to a proper European non-profit. Simply, we are going to transfer ownership of key Mastodon ecosystem and platform components (including name and copyrights, among other assets) to a new non-profit organization, affirming the intent that Mastodon should not be owned or controlled by a single individual. It also means a different role for Eugen, Mastodon’s current CEO. Handing off the overall Mastodon management will free him up to focus on product strategy, where his original passion lies and he gains the most satisfaction. ↫ Official Mastodon blog Eugen Rochko has always been clear and steadfast about Mastodon not being for sale and not accepting any outside investments despite countless offers, and after eight years of both creating and running Mastodon, it makes perfect sense to move the network and its assets to a proper European non-profit. Mastodon’s control over the entire federated ActivityPub network – the Fediverse – is actually quite limited, so it’s not like the network is dependent on Mastodon, but there’s no denying it’s the most well-known part of the Fediverse. The Fediverse is the only social network on which OSNews is actively present (and myself, too, for that matter). By “actively present” I only mean I’m keeping an eye on any possible replies; the feed itself consists exclusively of links to our stories as soon as they’re published, and that’s it. Everything else you might encounter on social media is either legacy cruft we haven’t deleted yet, or something a third party set up that we don’t control. RSS means it’s easy for people to set up third-party, unaffiliated accounts on any social medium posting links to our stories, and that’s entirely fine, of course.
However, corporate social media controlled by the irrational whims of delusional billionaires with totalitarian tendencies is not something we want to be a part of, so aside from visiting OSNews.com and using our RSS feeds, the only other official way to follow OSNews is on Mastodon.

Chromium’s influence on Chromium alternatives

I don’t think most people realize how Firefox and Safari depend on Google for more than “just” revenue from default search engine deals and prototyping new web platform features. Off the top of my head, Safari and Firefox use the following Chromium libraries: libwebrtc, libbrotli, libvpx, libwebp, some color management libraries, libjxl (Chromium may eventually contribute a Rust JPEG-XL implementation to Firefox; it’s a hard image format to implement!), much of Safari’s cryptography (from BoringSSL), Firefox’s 2D renderer (Skia)…the list goes on. Much of Firefox’s security overhaul in recent years (process isolation, site isolation, user namespace sandboxes, effort on building with ControlFlowIntegrity) is directly inspired by Chromium’s architecture. ↫ Rohan “Seirdy” Kumar Definitely an interesting angle on the browser debate I hadn’t really stopped to think about before. The argument is that while Chromium’s dominance is not exactly great, the other side of the coin is that non-Chromium browsers also make use of a lot of Chromium code all of us benefit from, and without Google doing that work, Mozilla would have to do it by themselves, and let’s face it, it’s not like they’re in a great position to do so. I’m not saying I buy the argument, but it’s an argument nonetheless. I honestly wouldn’t mind a slower development pace for the web, since I feel a lot of energy and development goes into things making the web worse, not better. Redirecting some of that development into things users of the web would benefit from seems like a win to me, and with the dominant web engine Chromium being run by an advertising company, we all know where their focus lies, and it ain’t on us as users. I’m still firmly on the side of less Chromium, please.

Google’s ad-blocking crackdown underway

Google has gotten a bad reputation as of late for being a bit overzealous when it comes to fighting ad blockers. Most recently, it’s been spotted automatically turning off popular ad blocking extension uBlock Origin for some Google Chrome users. To a degree, that makes sense—Google makes its money off ads. But with malicious ads and data trackers all over the internet these days, users have legitimate reasons to want to block them. The uBlock Origin controversy is just one facet of a debate that goes back years, and it’s not isolated: your favorite ad blocker will likely be affected next. Here are the best ways to keep blocking ads now that Google is cracking down on ad blockers. ↫ Michelle Ehrhardt at LifeHacker Here’s the cold and harsh reality: ad blocking will become ever more difficult as time goes on. Not only is Google obviously fighting it, other browser makers will most likely follow suit. Microsoft is an advertising company, so Edge will follow suit in dropping Manifest v2 support. Apple is an advertising company, and will do whatever they can to make at least their own ads appear. Mozilla is an advertising company, too, now, and will continue to erode their users’ trust in favour of nebulous nonsense like privacy-respecting advertising in cooperation with Facebook. The best way to block ads is to move to blocking at the network level. Get a cheap computer or Raspberry Pi, set up Pi-Hole, and enjoy some of the best adblocking you’re ever going to get. It’s definitely more involved than just installing a browser extension, but it also happens to be much harder for advertising companies to combat. If you’re feeling generous, set up Pi-Holes for your parents, friends, and relatives. It’s worth it to make their browsing experience faster, safer, and more pleasant. And once again I’d like to reiterate that I have zero issues with anyone blocking the ads on OSNews. Your computer, your rules. 
It’s not like display ads are particularly profitable anyway, so I’d much rather you support us through Patreon or a one-time donation through Ko-Fi, which is a more direct way of ensuring OSNews continues to exist. Also note that the OSNews Matrix room – think IRC, but more modern, and fully end-to-end encrypted – is now up and running and accessible to all OSNews Patreons as well.
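The principle behind network-level blocking is simple: answer DNS queries for known ad and tracker domains with an unroutable address, so the browser’s request dies before a single ad byte is fetched. A toy sketch of the idea – Pi-Hole itself is far more elaborate, and the blocklist entries here are made up:

```python
# Hypothetical blocklist; real deployments load hundreds of thousands
# of domains from curated lists.
BLOCKLIST = {"ads.example.net", "tracker.example.com"}

def resolve(name, upstream):
    # A Pi-Hole-style DNS sinkhole: blocked names get 0.0.0.0,
    # everything else is forwarded to the real upstream resolver.
    if name in BLOCKLIST:
        return "0.0.0.0"
    return upstream(name)
```

Because the blocking happens in the resolver every device on the network uses, there is no browser extension for an advertising company to disable.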

Internet Archive hacked and victim of DDoS attacks

Internet Archive’s “The Wayback Machine” has suffered a data breach after a threat actor compromised the website and stole a user authentication database containing 31 million unique records. News of the breach began circulating Wednesday afternoon after visitors to archive.org began seeing a JavaScript alert created by the hacker, stating that the Internet Archive was breached. “Have you ever felt like the Internet Archive runs on sticks and is constantly on the verge of suffering a catastrophic security breach? It just happened. See 31 million of you on HIBP!,” reads a JavaScript alert shown on the compromised archive.org site. ↫ Lawrence Abrams at Bleeping Computer To make matters worse, the Internet Archive was also suffering from waves of distributed denial-of-service attacks, forcing the IA to take down the site while strengthening its defences. It seems the attackers have no real motivation, other than the fact that they can, but it’s interesting, shall we say, that the Internet Archive has been under legal assault by big publishers for years now, too. I highly doubt the two are related in any way, but it’s an interesting note nonetheless. I’m still catching up on all the various tech news stories, but this one was hard to miss. A lot of people are rightfully angry and dismayed about this, since attacking the Internet Archive like this kind of feels like throwing Molotov cocktails at a local library – there’s literally not a single reason to do so, and the only people you’re going to hurt are underpaid librarians and chill people who just want to read some books. Whoever is behind this, they’re just assholes, no ifs and buts about it.

A Comprehensive Guide to Choosing the Right DevOps Managed Services

The world of software development is rapidly changing. More and more companies are adopting DevOps practices to improve collaboration, increase deployment frequency, and deliver higher-quality software. However, implementing DevOps can be challenging without the right people, processes, and tools. This is where DevOps managed services providers can help. Choosing the right DevOps partner is crucial to maximizing the benefits of DevOps at your organization. This comprehensive guide covers everything you need to know about selecting the best DevOps managed services provider for your needs. What are DevOps Managed Services? DevOps managed services provide ongoing management, support, and expertise to help organizations implement DevOps practices. A managed services provider (MSP) becomes an extension of your team, handling day-to-day DevOps tasks. This removes the burden of building in-house DevOps competency. It lets your engineers focus on delivering business value instead of struggling with new tools and processes. Benefits of Using DevOps Managed Services Here are some of the main reasons to leverage an MSP to assist your DevOps transformation: Accelerate Time-to-Market A mature MSP has developed accelerators and blueprints based on years of project experience. This allows them to rapidly stand up CI/CD pipelines, infrastructure, and other solutions. You’ll be able to deploy code faster. Increase Efficiency MSPs scale across clients, allowing them to create reusable frameworks, scripts, and integrations for data warehouse services, for example. By leveraging this pooled knowledge, you avoid “reinventing the wheel,” which lets your team get more done. Augment Internal Capabilities Most IT teams struggle to hire DevOps talent. Engaging an MSP gives you instant access to specialized skills like site reliability engineering (SRE), security hardening, and compliance automation. Gain Expertise Most companies are still learning DevOps.
An MSP provides advisory services based on what works well across its broad client base, helping you adopt best practices instead of making mistakes. Reduce Cost While the exact savings will vary, research shows DevOps and managed services can reduce costs through fewer defects, improved efficiency, and optimized infrastructure usage. Key Factors to Consider Choosing the right MSP gives you the greatest chance of success. However, evaluating providers can seem overwhelming, given the diversity of services available. Here are the five criteria to focus on: 1. DevOps Experience and Maturity Confirm that the provider has real-world expertise, specifically in DevOps engagements; you want confidence that they can guide your organization on the DevOps journey. Also, examine their internal DevOps maturity. An MSP that “walks the talk” by using DevOps practices in their operations is better positioned to help instill those disciplines in your teams. 2. People, Process, and Tools A quality MSP considers all three pillars of DevOps success: People – They have strong technical talent in place and provide training to address any skill gaps. Cultural change is considered part of any engagement. Process – They enforce proven frameworks for infrastructure management, CI/CD, metrics gathering, etc., but also customize them to your environment rather than taking a one-size-fits-all approach. Tools – They have preferred platforms and toolchains based on experience, but integrate well with your existing investments rather than demanding wholesale changes. Aligning an MSP across people, processes, and tools ensures a smooth partnership. 3. Delivery Model and Location Understand how the MSP prefers to deliver services. If you have on-site personnel, also consider geographic proximity. An MSP with a delivery center nearby can rotate staff more easily. Most MSPs are flexible and will align with what works best for a client. Be clear on communication and availability expectations upfront. 4.
Security and Compliance Expertise Today, DevOps and security should go hand-in-hand. Evaluate how much security knowledge the provider brings to the table. Not all clients require advanced security skills. However, given increasing regulatory demands, an MSP that offers broader experience can provide long-term value. 5. Cloud vs On-Premises Support Many DevOps initiatives – particularly when starting – focus on the public cloud, given cloud platforms’ automation capabilities. However, most enterprises take a hybrid approach, leveraging both on-premises and public cloud. Be clear about whether you need an MSP able to support cloud, on-premises, or both; the required mix should factor into provider selection. Engagement Models for DevOps Managed Services MSPs offer varying ways clients can procure their DevOps expertise: Staff Augmentation Add skilled DevOps consultants to your team for a fixed time period (typically 3-6 months). This works well to fill immediate talent gaps. Project Based Engage an MSP for a specific initiative, such as building a CI/CD pipeline for a business-critical application, with a clear scope and deliverables. Ongoing Managed Services Retain an MSP to provide ongoing DevOps support under a longer-term (1+ year) contract. These are more strategic partnerships, where MSP metrics and incentives align with client goals. Hybrid Approaches Blend staff augmentation, project work, and managed services. This provides flexibility to get quick wins while building long-term capabilities. Evaluate which model (or combination) suits your requirements and budget. Overview of Top Managed Service Providers The market for DevOps managed services features a wide range of global systems integrators, niche specialists, regional firms, and digital transformation agencies.
Here is a sampling of leading options across various categories: Langate, Accenture, Cognizant, Wipro, EPAM, Advanced Technology Consulting, and ClearScale. This sampling shows the diversity of options and demonstrates key commonalities, such as automation skills, CI/CD expertise, and experience driving cultural change. As you evaluate providers, develop a shortlist of 2-3 options that seem best aligned. Then validate them further through detailed discovery conversations and proposal walkthroughs. A Framework for Comparing Providers With so many aspects to examine, it helps to use a scorecard to track your assessment as you engage potential DevOps MSPs:

Criteria                               Weight   Provider 1   Provider 2   Provider 3
Years of Experience                    10%
Client References/Case Studies         15%
Delivery Locations                     10%
Cultural Change Methodology            15%
Security and Compliance Capabilities   10%
Public Cloud Skills                    15%
On-Premises Infrastructure Expertise   15%
Budget Fit                             10%
Total Score                            100%

Customize categories and weighting based on your priorities. Scoring forces clearer decisions compared to general impressions. Share the framework with stakeholders to build consensus on the

Servo gets tabbed browsing, Windows improvements, and more

If you’re reading this, you did a good job surviving another month, and that means we’ve got another monthly update from the Servo project, the Rust-based browser engine originally started by Mozilla. The major new feature this month is tabbed browsing in the Servo example browser, as well as extensive improvements for Servo on Windows. Servo-the-browser now has a redesigned toolbar and tabbed browsing! This includes a slick new tab page, taking advantage of a new API that lets Servo embedders register custom protocol handlers. ↫ Servo’s blog Servo now runs better on Windows, with keyboard navigation now fixed, --output to PNG also fixed, and fixes for some font- and GPU-related bugs, which were causing misaligned glyphs with incorrect colors on servo.org and duckduckgo.com, and corrupted images on wikipedia.org. Of course, that’s not all, as there’s also the usual massive list of improved standards support, new APIs, improvements to some of the developer tools (including massive improvements in Windows build times), and a huge number of fixed bugs.

Can you convert a video to pure CSS?

He regularly shares cool examples of fancy css animations. At the time of writing his focus has been on css scroll animations. I guess there are some new properties that allow playing a css animation based on the scroll position. Apple has been using this on their marketing pages, or so jhey says. The property seems pretty powerful. But how powerful? This got me thinking… Could it play a video as pure css? ↫ David Gerrells The answer is yes. This is so cursed, I love it – and he even turned it into an app so anyone can convert a video into CSS.

The journey of an internet packet: exploring networks with traceroute

The internet is a complex network of routers, switches, and computers, and when we try to connect to a server, our packets go through many routers before reaching the destination. If one of these routers is misconfigured or down, the packet can be dropped, and we can’t reach the destination. In this post, we will see how traceroute works, and how it can help us diagnose network problems. ↫ Sebastian Marines I’m sure most of us have used traceroute at some point in our lives, but I never once wondered how, exactly, it works. The internet – and networking in general – always feels like arcane magic to me, not truly understandable by mere mortals without years of dedicated study and practice. Even something as simple as managing a home router can be a confusing nightmare of abbreviations, terminology, and backwards compatibility hacks, so you can imagine how complex it gets when you leave your home network and start sending packets out into the wider world. This post does a great job of explaining exactly how traceroute works without overloading you with stuff you don’t need to know.
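The trick, in a nutshell: traceroute sends probes with TTL 1, 2, 3, and so on, and each router that decrements a probe’s TTL to zero reports itself back with an ICMP Time Exceeded message, revealing the path one hop at a time. A toy simulation of that logic – no real sockets, just the mechanism:

```python
def probe(route, ttl):
    # Forward one probe along the path: every router decrements the TTL
    # and drops the packet (reporting its own address) when it hits zero.
    for hop in route:
        ttl -= 1
        if ttl == 0:
            return hop  # "ICMP Time Exceeded" from this hop
    return None  # probe made it past every listed router

def traceroute_sim(route):
    # Increase TTL by one per probe until no router complains anymore.
    path, ttl = [], 1
    while True:
        hop = probe(route, ttl)
        if hop is None:
            break
        path.append(hop)
        ttl += 1
    return path
```

The real tool does the same thing with raw packets and timeouts; the hop-by-hop discovery is entirely a side effect of the TTL rule.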

Ethernet history deepdive: why do we have different frame types?

The history of Ethernet is fascinating. The reason why we have three different frame types is that DIX used the Ethernet II frame that is prevalent today, while IEEE intended to use a different frame format that could be used for different MAC layers, such as token bus, token ring, FDDI, and so on. The IEEE were also inspired by HDLC, and modeled their frame header more in alignment with the OSI reference model that had the concept of SAPs. When they discovered that the number of available SAPs weren’t enough, they made an addition to the 802 standard to support SNAP frames. In networks today, Ethernet II is dominant, but some control protocols may use LLC and/or SNAP frames. ↫ Daniel Dib I just smiled and nodded.
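The practical upshot is that the two bytes after the source MAC address disambiguate the frame types: values of 0x0600 and up are an EtherType (Ethernet II), smaller values are an 802.3 length field followed by an LLC header, and DSAP/SSAP 0xAA with control 0x03 marks a SNAP header. A small sketch of that dispatch:

```python
def classify_frame(frame):
    # Bytes 12-13 (after the destination and source MAC addresses):
    # >= 0x0600 means EtherType (Ethernet II); smaller values are an
    # 802.3 length field, with the LLC header starting at byte 14.
    etype = int.from_bytes(frame[12:14], "big")
    if etype >= 0x0600:
        return "Ethernet II"
    if frame[14:17] == b"\xaa\xaa\x03":   # DSAP/SSAP 0xAA, control 0x03
        return "802.3 + LLC/SNAP"
    return "802.3 + LLC"
```

This is exactly the ambiguity-resolution rule that lets all three frame types coexist on the same wire.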

Chrome iOS browser on Blink

Earlier this year, under pressure from the European Union, Apple was finally forced to open up iOS and allow alternative browser engines, at least in the EU. Up until then, Apple only allowed its own WebKit engine to run on iOS, meaning that even what seemed like third-party browsers – Chrome, Firefox, and so on – were all just Safari skins, running Apple’s WebKit underneath (with additional restrictions to make them perform worse than Safari). Even with other browser engines now being allowed on iOS in the EU, there are still hurdles, as Apple requires browser makers to maintain two different browsers, one for the EU, and another one for the rest of the world. It seems the Chromium community is already working on bringing the Chromium Blink browser engine to iOS, but there’s still a lot of work to be done. A blog post by the open source consultancy company Igalia digs into the details, since they are contributing to the effort. While they’ve got the basics covered, it’s far from completed or ready for release. We’ve briefly looked at the current status of the project so far, but many functionalities still need to be supported. For example, regarding UI features, functionalities such as printing preview, download, text selection, request desktop site, zoom text, translate, find in page, and touch events are not yet implemented or are not functioning correctly. Moreover, there are numerous failing or skipped tests in unit tests, browser tests, and web tests. Ensuring that these tests are enabled and passing should also be a key focus moving forward. ↫ Gyuyoung Weblog I don’t use iOS, nor do I intend to any time soon, but the coming availability of browser engines that compete with WebKit is going to be great for the web. I’ve heard from so many web developers that Safari on iOS is a bit of a nightmare to support, since without any competition on iOS it often stagnates and lags behind in supporting features other browsers already implemented.
With WebKit on iOS facing competition, that might change. Now, there’s a line of thought that all this will do is make Chrome even more dominant, but I don’t think that’s going to be an issue. Safari is still the default for most people, and changing defaults is not something most people will do, especially not the average iOS user. On top of that, this is only available in the EU, so I honestly don’t think we have to worry about this any time soon, but obviously, we do have to remain vigilant.

Verso: a browser using Servo

I regularly report on the progress made by the Servo project, the Rust-based browser engine that was spun out of Mozilla into its own project. Servo has its own reference browser implementation, too, but did you know there are already other browsers using Servo? Sure, it’s clearly a work-in-progress thing, and it’s missing just about every feature we’ve come to expect from a browser, but it’s cool nonetheless. Verso is a web browser built on top of Servo web engine. It’s still under development. We don’t accept any feature requests at the moment. But if you are interested, feel free to help test it. ↫ Verso GitHub page It runs on Linux, Windows, and macOS.

Servo enables parallel table layout

Another month, another chunk of progress for the Servo rendering engine. The biggest addition is enabling table rendering to be spread across CPU cores. Parallel table layout is now enabled, spreading the work for laying out rows and their columns over all available CPU cores. This change is a great example of the strengths of Rayon and the opportunistic parallelism in Servo’s layout engine. ↫ Servo blog On top of this, there are tons of improvements to the flexbox layout engine, support for generic font families like ‘sans-serif’ and ‘monospace’ has been added, and Servo now supports OpenHarmony, the operating system developed by Huawei. This month also saw a lot of work on the development tools.
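To illustrate why table layout parallelizes so well, here’s a rough, stdlib-only Rust sketch of the idea. Servo itself uses Rayon’s work-stealing thread pool for this; the sketch below stands in with `std::thread::scope`, and the `Row`/`Cell` types and functions are invented for illustration, not Servo’s actual API. The key observation is that sizing each row is independent work that can be fanned out across cores, while the final vertical positioning is a cheap sequential prefix sum.

```rust
use std::thread;

// A cell's intrinsic height; in a real engine this would come from laying
// out the cell's contents (text, nested blocks, and so on).
struct Cell {
    content_height: f32,
}

struct Row {
    cells: Vec<Cell>,
}

// Compute each row's height (the max of its cells) in parallel: rows don't
// depend on each other for sizing, so each one can go to its own thread.
// Servo uses Rayon for this; std::thread::scope is a stdlib stand-in.
fn layout_rows(rows: &[Row]) -> Vec<f32> {
    thread::scope(|s| {
        let handles: Vec<_> = rows
            .iter()
            .map(|row| {
                s.spawn(move || {
                    row.cells
                        .iter()
                        .map(|c| c.content_height)
                        .fold(0.0_f32, f32::max)
                })
            })
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).collect()
    })
}

// Positioning is inherently sequential: each row starts where the previous
// one ended, so the vertical offsets are a prefix sum of the heights.
fn row_offsets(heights: &[f32]) -> Vec<f32> {
    let mut y = 0.0;
    heights
        .iter()
        .map(|h| {
            let top = y;
            y += h;
            top
        })
        .collect()
}

fn main() {
    let rows = vec![
        Row { cells: vec![Cell { content_height: 20.0 }, Cell { content_height: 35.0 }] },
        Row { cells: vec![Cell { content_height: 15.0 }] },
    ];
    let heights = layout_rows(&rows);
    let offsets = row_offsets(&heights);
    println!("heights: {:?}, offsets: {:?}", heights, offsets);
}
```

The split mirrors what the Servo post describes: the expensive, per-row work runs concurrently, and only the trivial offset pass stays serial.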

OpenAI beta tests SearchGPT search engine

Normally I’m not that interested in reporting on news coming from OpenAI, but today is a little different – the company launched SearchGPT, a search engine that’s supposed to rival Google, but at the same time, they’re also kind of not launching a search engine that’s supposed to rival Google. What? We’re testing SearchGPT, a prototype of new search features designed to combine the strength of our AI models with information from the web to give you fast and timely answers with clear and relevant sources. We’re launching to a small group of users and publishers to get feedback. While this prototype is temporary, we plan to integrate the best of these features directly into ChatGPT in the future. If you’re interested in trying the prototype, sign up for the waitlist. ↫ OpenAI website Basically, before adding a more traditional web-search like feature set to ChatGPT, the company is first breaking them out into a separate, temporary product that users can test, before parts of it will be integrated into OpenAI’s main ChatGPT product. It’s an interesting approach, and with just how stupidly popular and hyped ChatGPT is, I’m sure they won’t have any issues assembling a large enough pool of testers. OpenAI claims SearchGPT will be different from, say, Google or AltaVista, by employing a conversation-style interface with real-time results from the web. Sources for search results will be clearly marked – good – and additional sources will be presented in a sidebar. True to the ChatGPT-style user interface, you can keep “talking” after hitting a result to refine your search further. I may perhaps betray my still relatively modest age, but do people really want to “talk” to a machine to search the web? Any time I’ve ever used one of these chatbot-style user interfaces – including ChatGPT – I find them cumbersome and frustrating, like they’re just adding an obtuse layer between me and the computer, and that I’d rather just be instructing the computer directly.
Why try to verbally massage a stupid autocomplete into finding a link to an article I remember from a few days ago, instead of just typing in a few quick keywords? I am more than willing to concede I’m just out of touch with what people really want, so maybe this really is the future of search. I hope I can always disable nonsense like this and just throw keywords at the problem.

“Majority of websites and mobile apps use dark patterns”

A global internet sweep that examined the websites and mobile apps of 642 traders has found that 75,7% of them employed at least one dark pattern, and 66,8% of them employed two or more dark patterns. Dark patterns are defined as practices commonly found in online user interfaces and that steer, deceive, coerce, or manipulate consumers into making choices that often are not in their best interests. ↫ International Consumer Protection and Enforcement Network Dark patterns are everywhere, and it’s virtually impossible to browse the web, use certain types of services, or install mobile applications without having to dodge and roll just to avoid all kinds of nonsense being thrown at you. It’s often not even the ads themselves that make the web unusable – the real problem is all the dark patterns tricking you into viewing ads, entering into a subscription, enabling notifications, or handing over your email address. This is why one of the absolute primary demands I have for the next version of OSNews is zero dark patterns. I don’t want any dialogs begging you to enable ads, no modal windows demanding you sign up for a newsletter, no popups asking you to enable notifications, and so on – none of that stuff. My golden standard is “your computer, your rules”, and that includes your right to use ad blockers or anything else to change the appearance or functioning of our website on your computer. It’d be great if dark patterns became illegal somehow, but it would be incredibly difficult to write any legislation that would properly cover these practices.