
General Development Archive

Go 2, here we come!

A major difference between Go 1 and Go 2 is who is going to influence the design and how decisions are made. Go 1 was a small team effort with modest outside influence; Go 2 will be much more community-driven. After almost 10 years of exposure, we have learned a lot about the language and libraries that we didn't know in the beginning, and that was only possible through feedback from the Go community.

The Go team is revealing some things about the future of the programming language.

Fortran is still a thing

In 2017, NASA announced a code optimization competition, only to cancel it shortly after. The rules were simple: there was a Navier-Stokes equation solver used to model aerodynamics, and, basically, whoever made it run fastest on the Pleiades supercomputer would win first prize.

There were a few caveats though. The applicant had to be a US citizen at least 18 years of age, and the code to optimize had to be in Fortran.

Running Windows software on ARM with Wine

The concept of running Wine on ARM devices isn't new. StarCraft (as well as Diablo 1 & 2) is playable on ARM through Wine thanks to the insanely hard work of the Lithuanian hacker notaz. But while working on my project I couldn't find anything but scattered bits of information, and sometimes nothing at all. So this guide will walk you through the steps required to execute Windows software with Wine on ARM devices running *nix. I specifically focus on a Raspberry Pi 3B+ running Raspbian, and here's a screenshot of Notepad++ running there.

Detailed article about running Windows software on ARM Linux using Wine, including how to recompile x86 Windows applications to ARM.

A look at the design of Lua

Lua is a scripting language developed at the Pontifical Catholic University of Rio de Janeiro (PUC-Rio) that has come to be the leading scripting language for video games worldwide. It is also used extensively in embedded devices like set-top boxes and TVs and in other applications like Adobe Photoshop Lightroom and Wikipedia. Its first version was released in 1993. The current version, Lua 5.3, was released in 2015.

Though mainly a procedural language, Lua lends itself to several other paradigms, including object-oriented programming, functional programming, and data-driven programming. It also offers good support for data description, in the style of JavaScript and JSON. Data description was indeed one of our main motivations for creating Lua, some years before the appearance of XML and JavaScript.

A minimal C64 Datasette program loader

The Commodore Datasette recording format is heavily optimized for data safety and can compensate for many typical issues of cassette tape, like incorrect speed, inconsistent speed (wow/flutter), and small as well as longer dropouts. This makes the format more complex and way less efficient than, for example, "Turbo Tape" or all other custom formats used by commercial games. Let's explore the format by writing a minimal tape loader for the C64, optimized for size, which can decode correct tapes, but does not support error correction.
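
For a taste of what the decoding involves - this is not the loader from the article, just a hedged C sketch with made-up pulse timings - the KERNAL format works in pulse pairs: a short-then-medium pair encodes a 0 bit and a medium-then-short pair encodes a 1 bit, so a decoder mostly needs to classify pulse lengths and read them two at a time.

    #include <stdio.h>

    /* Classify a pulse by its length in microseconds. The nominal
       lengths and thresholds below are illustrative only; real tapes
       drift, which is exactly why the format tolerates speed errors. */
    static char classify(int us) {
        if (us < 432) return 'S';   /* short pulse  */
        if (us < 592) return 'M';   /* medium pulse */
        return 'L';                 /* long pulse (byte marker) */
    }

    int main(void) {
        /* A made-up pulse stream encoding the bits 0, 1, 0. */
        int pulses[] = { 352, 512,   512, 352,   352, 512 };
        int n = (int)(sizeof pulses / sizeof pulses[0]);

        for (int i = 0; i + 1 < n; i += 2) {
            char a = classify(pulses[i]), b = classify(pulses[i + 1]);
            if      (a == 'S' && b == 'M') putchar('0');
            else if (a == 'M' && b == 'S') putchar('1');
            else                           putchar('?');  /* marker or error */
        }
        putchar('\n');   /* prints: 010 */
        return 0;
    }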

I'm no expert, but sometimes I wonder if modern computer classes and schools in general are on the right track by focusing solely on modern systems like Chromebooks and iPads. Wouldn't it be better to teach kids programming in BASIC, with limited resources, on, say, C64 emulators?

What’s the difference between an integer and a pointer?

In an assembly language we typically don't have to worry very much about the distinction between pointers and integers. Some instructions happen to generate addresses whereas others behave arithmetically, but underneath there's a single data type: bitvectors. At the opposite end of the PL spectrum, a high-level language won't offer opportunities for pointer/integer confusion because the abstractions are completely firewalled off from each other. Also, of course, a high-level language may choose not to expose anything that resembles a pointer.
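
C sits between those two extremes, which makes the distinction easy to demonstrate: a pointer's bits can round-trip through an integer type, but pointers and integers remain different things to the compiler. A small sketch:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        int x = 42;
        int *p = &x;

        /* View the pointer's bits as an integer. */
        uintptr_t bits = (uintptr_t)p;

        /* Arithmetic on the integer is always well-defined... */
        uintptr_t shifted = bits + 1;
        printf("bits = %ju, shifted = %ju\n",
               (uintmax_t)bits, (uintmax_t)shifted);

        /* ...and converting the unmodified value back is fine: */
        int *q = (int *)bits;
        printf("round-trip: %d\n", *q);

        /* But dereferencing (int *)shifted would be undefined
           behavior, even though the bit pattern "looks like" an
           address: an integer is not automatically a valid pointer. */
        return 0;
    }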

LLVM 7.0.0 released

The release contains the work on trunk up to SVN revision 338536 plus work on the release branch. It is the result of the community's work over the past six months, including: function multiversioning in Clang with the 'target' attribute for ELF-based x86/x86_64 targets; improved PCH support in clang-cl; preliminary DWARF v5 support; basic support for OpenMP 4.5 offloading to NVPTX; OpenCL C++ support; MSan, X-Ray, and libFuzzer support for FreeBSD; early UBSan, X-Ray, and libFuzzer support for OpenBSD; UBSan checks for implicit conversions; many long-tail compatibility issues fixed in lld, which is now production-ready for ELF, COFF, and MinGW; and the new tools llvm-exegesis, llvm-mca, and diagtool. And as usual, many optimizations, improved diagnostics, and bug fixes.

The release notes have all the details.
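
Of that list, function multiversioning is the easiest to picture. On an ELF x86-64 target, Clang 7 lets you define several versions of one function with the 'target' attribute and have the best one picked at load time. A minimal sketch - the function name and bodies are invented for illustration:

    #include <stdio.h>

    /* Baseline implementation; a "default" version is required. */
    __attribute__((target("default")))
    int kernel_kind(void) { return 0; }

    /* Chosen automatically on CPUs that support AVX2. */
    __attribute__((target("avx2")))
    int kernel_kind(void) { return 2; }

    int main(void) {
        /* The dynamic loader resolves the call to the best version. */
        printf("selected version: %d\n", kernel_kind());
        return 0;
    }

In real code the AVX2 body would contain hand-vectorized work; here the return values merely show which version got picked.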

On the road to pure Go X11 GUIs

And so I've placed a bet on Go. It is just as conceptually simple as C, sports a friendly BSD-style license, and already has its own parallel ecosystem. No stinky LLVM, in fact no traces of C at all! It's an overlooked revolution! I can follow symbols through packages however deep I want to and I always end up in Go or its assembly. Well, so long as nothing ugly uses Cgo.

Right, now that I've embraced the garbage collector, how do I make an interface that doesn't look like it dates back to the '80s? And can I avoid Cgo?

Learning BASIC like it’s 1983

Now, of course, I tell computers what to do for a living. All the same, I can't help feeling that I missed out on some fundamental insight afforded only to those who grew up programming simpler computers. What would it have been like to encounter computers for the first time in the early 1980s? How would that have been different from the experience of using a computer today?

This post is going to be a little different from the usual Two-Bit History post because I'm going to try to imagine an answer to these questions.

This is a great idea.

x86-64 assembly language programming with Ubuntu

The purpose of this text is to provide a reference for university-level assembly language and systems programming courses. Specifically, this text addresses the x86-64 instruction set for the popular x86-64 class of processors using the Ubuntu 64-bit Operating System (OS). While the provided code and various examples should work under any Linux-based 64-bit OS, they have only been tested under Ubuntu 14.04 LTS (64-bit).

Your light reading for the weekend.

Dart 2.0 released

Coming from Dart 1, there are two major developer-facing changes, the largest of which is a stronger type system, including runtime checks to help catch errors that would arise from mismatched or incorrectly labeled types. This type system, originally called "strong mode", has long been the default in Flutter. The other is an interesting quality-of-life change for Flutter developers, which allows creating an instance of a class without the "new" keyword. The goal of this change is to make Flutter code more readable, less clunky, and easier to type, but the principle applies to all Dart code.

The complete list of changes has all the details.

Hello world on z/OS

If you've followed any one of the amazing tutorials on how to set up a mainframe on a conventional personal computer, you've probably noticed they end with the login screen as if everything beyond that point will be intuitive and self-explanatory to newbies. I mean... That was my assumption going into this project. I'll figure it out. How hard could it be? Maybe it would take me a few hours. Maybe I'd have to Google some stuff... Read some documentation...

It took me over a week.

Over a week to figure out enough to compile and run a basic program.

Why Discord is sticking with React Native

Looking back at the past three years, React Native has proven to be extremely successful at Discord and helped drive our iOS user adoption from zero to millions!

More specifically, React Native has allowed us to reap the benefits of quickly leveraging reusable code across platforms, as well as develop a small and mighty team.

Meanwhile, we've learned to adapt to its inevitable pain points without sacrificing overall productivity.

We all complain a lot about these non-native, cross-platform frameworks, but it's only fair to also highlight the other side of the coin - in this case, the view from the developers of an incredibly popular application who need to easily support multiple platforms.

Small computer system supports large-scale multi-user APL

Another article from a very much bygone era - we're talking 1977 - and for sure this one's a bit over my head. I like being honest.

APL (A Programming Language) is an interactive language that allows access to the full power of a large computer while maintaining a user interface as friendly as a desktop calculator. APL is based on a notation developed by Dr. Kenneth Iverson of IBM Corporation over a decade ago, and has been growing in popularity in both the business and scientific communities. The popularity of APL stems from its powerful primitive operations and data structures, coupled with its ease of programming and debugging.

Most versions of APL to date have been on large and therefore expensive computers. Because of the expense involved in owning a computer large enough to run APL, most of the use of APL outside of IBM has been through commercial timesharing companies. The introduction of APL 3000 marks the first time a large-machine APL has been available on a small computer. APL 3000 is a combination of software for the HP 3000 Series II Computer System and a CRT terminal, the HP 2641A, that displays the special symbols used in APL.

Enjoy.

The land before binary

The IRS has a lot of mainframes. And as millions of Americans recently found out, many of them are quite old. So as I wandered about meeting different types of engineers and chatting about their day-to-day blockers, I started to learn much more about how these machines worked. It was a fascinating rabbit hole that exposed me to things like "decimal machines" and "2 out of 5 code". It revealed something to me that I had never considered:

Computers did not always use binary code.

Computers did not always use base 2. Computers did not always operate on just an on/off value. There were other things, different things that were tried and eventually abandoned. Some of them predating the advent of electronics itself.

Here's a little taste of some of those systems.
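
The "2 out of 5 code" mentioned above is a nice self-checking trick: each decimal digit is stored in five bits of which exactly two are set, and conveniently there are exactly ten such patterns - one per digit. Any single flipped bit breaks the two-bit count and is detected. A hedged C sketch (the digit-to-pattern assignment below is arbitrary; real machines used specific weightings):

    #include <stdio.h>

    /* The ten 5-bit patterns with exactly two bits set, assigned to
       the digits 0-9 in ascending order (an arbitrary choice). */
    static const unsigned two_of_five[10] = {
        0x03, 0x05, 0x06, 0x09, 0x0A,
        0x0C, 0x11, 0x12, 0x14, 0x18,
    };

    /* A codeword is valid iff exactly two of its five bits are set. */
    static int valid(unsigned v) {
        int bits = 0;
        for (int i = 0; i < 5; i++)
            bits += (v >> i) & 1;
        return bits == 2;
    }

    int main(void) {
        unsigned word = two_of_five[7];      /* encode the digit 7 */
        printf("7 -> 0x%02X, valid: %d\n", word, valid(word));

        unsigned corrupted = word ^ 0x04;    /* flip a single bit */
        printf("corrupted -> 0x%02X, valid: %d\n",
               corrupted, valid(corrupted));
        return 0;
    }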

I've often wondered why computers are binary to begin with, but it was one of those stray questions that sometimes pops up in your head while you're waiting for your coffee or while driving, only to then rapidly disappear.

I have an answer now, but I don't really understand any of this.

UTC is enough for everyone… Right?

Programming time, dates, timezones, recurring events, leap seconds... Everything is pretty terrible.

The common refrain in the industry is "Just use UTC! Just use UTC!" And that's correct... Sort of. But if you're stuck building software that deals with time, there's so much more to consider.

It's time... To talk about time.

This is one of the best articles - experiences? - I've ever read. It's funny, well-written, deeply informative, and covers everything from programming with time, to time and UI design, to time and accessibility. This is simply an amazing piece of work.
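
And if you want a two-minute taste of why "just use UTC" is only half the story: the same instant reads completely differently depending on the zone you render it in, which is exactly the information a bare UTC timestamp throws away. A small POSIX C sketch (the zone name is an arbitrary choice for the example):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        char buf[64];

        /* The instant itself, rendered in UTC. */
        struct tm utc;
        gmtime_r(&now, &utc);
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", &utc);
        printf("UTC:   %s\n", buf);

        /* The same instant as wall-clock time in one zone. */
        setenv("TZ", "America/New_York", 1);
        tzset();
        struct tm local;
        localtime_r(&now, &local);
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S %Z", &local);
        printf("Local: %s\n", buf);
        return 0;
    }

For anything user-facing or recurring - a 9 AM meeting, say - you have to keep the zone next to the instant, because offsets and rules change out from under you.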