National Instruments (NI) recently released a new version of its LabView test automation programming environment for the latest Apple Macintosh computers based on the Arm-based Apple M1 CPU/GPU SoC. At the same time, NI let its customers know that this release would be the last one for Apple Macintosh computers, sending a shock through some portion of the company’s customer base.
[…]LabView’s importance to test and measurement cannot be overstated. It was the first graphical programming language designed exclusively for test systems. The language has been continually expanded and improved for nearly 40 years and features more than 7000 software drivers for instruments from many vendors as well as support for custom, FPGA-based instruments. LabView supports many instrument interfaces starting with IEEE-488 and extending to MXI, PXI, USB, Ethernet, and probably a few more interfaces that don’t immediately come to mind.
It’s a shock to many and will likely punish higher-education students and a chunk of the scientific and research segment, which is still Mac-dominant.
According to the article, Macs were the only computer with a 32-bit OS in 1986!? What rubbish.
I think that was meant as part of the next sentence, where it commented on the graphics ability. You also have to keep in mind that Amigas just were not popular in the ’80s in the USA, where LabView was developed. Macs were much, much, much more popular. As a computer-obsessed kid, I didn’t hear of Amiga until well after it was dead.
Bill Shooter of Bul,
+1
This was my experience as well. As much as I was into computers, it was IBM PC/clones and, to a smaller extent, Macs. In principle we could have ordered other brands from a catalog had we known they existed. It wasn’t so much a choice to go Wintel as much as we were oblivious that other brands existed. Just like you, I only discovered alternatives like Amiga later. In hindsight Amiga had merit over the PCs we were using, but the point is moot after the fact, when it was already dead.
In France it was an Atari ST vs Amiga war.
When I read that article I assumed they meant a full 32-bit bus architecture, which is the critical part for a software product like LabView churning through huge quantities of floating-point math. And in that regard I think the mid-to-late-’80s iterations of the 68000 series were the first to use a full 32-bit bus, and as such delivered an order of magnitude better performance crunching numbers. The Macs became ubiquitous in labs and universities for that very reason.
No, they could not have been talking about “full 32-bit bus architecture” as the Macs of the time didn’t have that either. The first 68020-based Mac wasn’t released until 1987.
1986 / 1987 really?
While I concede many industries and organisations, including the one I work for, won’t blink at the loss of LabView for macOS, I’m very cognisant of the fact that the professional engineers who work in industry started off life as students or researchers somewhere. They learned and trained in environments where cash was king if you could get it, and a Mac plus a software suite like LabView was a big investment. Sure, many might no longer be bound to those beginnings, but still today some segments of R&D have to maintain standards, and it’s the references to those standards that contain the real value.
For this reason alone, as the developer of a tool that is used in such environments for very good reasons, I find NI’s decision bizarre!
It would be like Adobe coming out and saying no more Apple Photoshop.
I can confirm Macs were used with LabView for the benefit of the full 32-bit bus, because I was one of the industrial users; debating the exact year of it happening is rather imperious. I’m pretty sure there were also Mac-based add-on cards for high-speed data capture, signalling and analysis that I’m sure took full advantage of the 32-bit bus.
cpcf,
Imperious or not, it’s fun to know… I’m not familiar with most of these:
(sorry about bad links, had to remove to bypass wordpress)
en.wikipedia.org/wiki/Motorola_68020
@Alfman
I must admit, back then I was more electronics than code; the drivers and devices we used were coded by others, and there were certainly no great standards followed, as most solutions were task-specific. I haven’t read the link, but I’m not surprised you’d find wildly fluctuating results.
Much of what I did involved only 8- or 12-bit signals, and very rarely I recall a 14-bit ADC for fast signals, but I think the 14 bits were related to the machine architecture rather than a Nyquist requirement. I think BASF or Dupont used some weird 24-bit system for real-time control. 32-bit was used not because of a need for 32-bit resolution but because of the need to process so much 8- or 12-bit data very quickly, so the devs used clever encodings to leverage the full 32 bits, something like the packing sketch below. What this faster processing brought was less waste: control systems that could monitor and react faster. Back then it would have all been bespoke data structures; now I suppose you could pick some pre-existing format from an off-the-shelf protocol to do this all for you.
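For anyone curious, here’s a minimal C sketch of the kind of packing I mean, assuming two 12-bit samples plus an 8-bit channel/status tag per 32-bit word. The field layout and function names are invented for illustration, not what any particular card actually used.

#include <stdint.h>
#include <stdio.h>

/* Hypothetical example: squeeze two 12-bit ADC samples and an 8-bit tag
 * into one 32-bit word, the way bespoke capture code might have filled
 * a full bus word with narrow signals. Layout is made up for this sketch. */
static uint32_t pack_samples(uint16_t s0, uint16_t s1, uint8_t tag)
{
    return ((uint32_t)(s0 & 0x0FFF))            /* bits  0-11: sample 0 */
         | ((uint32_t)(s1 & 0x0FFF) << 12)      /* bits 12-23: sample 1 */
         | ((uint32_t)tag << 24);               /* bits 24-31: tag      */
}

static void unpack_samples(uint32_t w, uint16_t *s0, uint16_t *s1, uint8_t *tag)
{
    *s0  = (uint16_t)(w & 0x0FFF);
    *s1  = (uint16_t)((w >> 12) & 0x0FFF);
    *tag = (uint8_t)(w >> 24);
}

int main(void)
{
    uint32_t word = pack_samples(0x0ABC, 0x0123, 7);
    uint16_t a, b;
    uint8_t  t;
    unpack_samples(word, &a, &b, &t);
    printf("word=0x%08X s0=0x%03X s1=0x%03X tag=%u\n", word, a, b, t);
    return 0;
}

The point isn’t the exact layout, just that one 32-bit fetch moves two samples and their bookkeeping at once instead of wasting the upper bits.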
One of the downfalls of those early days was that many of the custom data capture solutions used proprietary encodings, some with signals being encoded and decoded in the add-on hardware. This meant that if the card failed you had no choice but to repair it or buy new just to keep using your data; few could afford the storage to re-encode a backup, which was often not an option anyway due to the volume. Storage wasn’t cheap back then like it is now, so bits were used sparingly; we used to have meeting after meeting discussing the merits of encoding an extra bit of data.
My memory fades. In some ways they were the good old days, in that you could get into the hardware/software design at component level, but in other ways not so much, because a few MB came in its own jukebox.