The Linux kernel’s floppy driver dates back to the kernel’s original release in 1991 and is still being maintained thirty years later, with the occasional fix.
Somewhat surprisingly, a patch touching the floppy code was sent in to the Linux kernel’s block subsystem ahead of the Linux 5.12 merge window.
Floppies are awesome and I’m sure there are tons of older machines out there – especially in corporate settings – that are still rocking a floppy drive for backwards compatibility reasons. Might as well keep the code up to snuff.
That’s nothing. People are paying $30+ USD for vinyl albums. We live in strange times.
Vinyl is for aesthetic value. A rack of albums would look fancier than a fingertip-sized SD card.
I would not buy one, though. I like my music sounding better (i.e., digital at CD quality).
Without getting into a heated debate about what ‘sounds better’, there is absolutely more to recorded music than digital’s faithful reproduction. The limitations of vinyl still mean that mastering engineers have to leave a little more dynamic range and can’t just brickwall the sound as is common on digital media.
It’s utterly subjective as to which sounds better. Note: I am not a vinyl snob and haven’t owned one since the 80s.
Yes, there would be “garbage in, garbage out” if the CD is not mastered properly, especially with the recent “loudness war” going on.
But if it is done properly, there would be no loss at all. Especially for my middle-aged ears, there is no further benefit beyond that standard format.
It was summarized nicely here: https://www.theverge.com/2015/11/6/9680140/chris-montgomery-digital-audio-hi-res-explainer
Recent loudness wars? LOL! By that do you mean the one that has been going on since the early 90’s, or the original that happened about 50 years before that?
Under-phil,
You are referring to the loudness wars, which are a real thing, but they’re not an inherent byproduct of the medium itself. They’re a byproduct of changing styles. Subtlety is no longer appreciated and audio levels have to be maxed out all the time. I agree older music seemed better to me, but that’s just style and not due to analog versus digital.
Obviously there have been plenty of low quality digital encodings especially during the early years of MP3, but digital recordings can be made to arbitrary precision and modern studio equipment is well past what humans can detect through proper A/B testing.
Given how common the “analog is better” myth is, I’m kind of surprised Mythbusters never took it on.
Not enough explosions.
Analog vs. digital music as a Mythbusters episode would make no sense at all. The quality of music, regardless of what specifically you’re referring to, is 100% subjective. Neither “analog is better” nor “digital is better” is a myth to be busted or confirmed. Your ears hear what they hear and you perceive that input in a completely personal way. A TV show telling you something contrary to what you experience will not change that.
friedchicken,
To be fair, that’s not the myth.
There are a lot of vinyl fans who believe it is superior to digital media because of better audio reproduction. That’s the myth. For example…
https://aestheticsforbirds.com/2019/11/25/spin-me-round-why-vinyl-is-better-than-digital/
He also covers other aspects of vinyl being better, such as artwork and presentation, tactile feel, etc., which can add subjective value around the experience of records and are not terribly controversial, unlike the audio claims. I’m referring to the myth that records imply superior audio qualities. The truth is digital encoding can reproduce the exact same audio qualities to our senses, such that there is no perceptible difference to our ears in A/B testing. Of course there can be differences in quality due to poor hardware and mastering differences (such as different equalization curves, etc.), but these differences are not intrinsic to digital media.
Granted, there may be non-auditory reasons for preferring records. But this “what you hear is different” claim can be tested objectively via A/B testing. We can even measure the placebo effect to account for perceived biases by conducting an experiment in such a way that the participant is tricked about what they believe is playing. For example, you could play the same audio every time while giving the appearance of switching between a CD and a record. If the participant consistently claims one is better, then you know with certainty that their judgement is not coming from the audio itself, but rather from which medium they believe to be playing.
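To make that concrete, here’s a minimal sketch (my own illustration with made-up numbers, not from any real study) of how you’d score such a blinded experiment in Python: the audio is identical on every trial, only the displayed label changes, and a simple binomial test checks whether the listener’s preference tracks the label rather than the sound.
[code]
from math import comb

def binomial_p_value(label_wins: int, trials: int) -> float:
    """One-sided probability of at least `label_wins` answers of
    "the record sounds better" if the listener were guessing (p = 0.5)."""
    return sum(comb(trials, k) for k in range(label_wins, trials + 1)) / 2 ** trials

# Hypothetical outcome: the identical audio was rated better in 18 of 20
# trials whenever it carried the "record" label.
print(binomial_p_value(18, 20))  # ~0.0002: the preference tracks the label, not the audio
[/code]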
I don’t particularly care if people like records, there’s probably more sentimental value there than with these streaming services, but the myth is about the actual audio quality.
Alfman,
You are right. Digital vs analog is no longer a myth. I was trying to make a similar point. I can get people liking the physical feeling of the vinyl system with retro style amps. But if you are looking for pure audio quality, that is a different thing.
There are actually blind tests for experts. On a semi-related note, for example, research on “wine tasters” had really interesting conclusions: https://www.realclearscience.com/blog/2014/08/the_most_infamous_study_on_wine_tasting.html
Overall it is what you “feel”, and there *is* a placebo effect. If someone is okay with $10 headphones and an iPod, good for them. If someone only appreciates music on a $1000 pair with tube headphone amps, good for them too.
(Note: a $1000 headset will sound better regardless of the input source).
@Alfman
No need to cite any surveys, opinions, polls, tests, comparisons, or other. Having spent half my life working professionally in music & video production, I’ve heard every argument to be had, seen every technical proof, and spent countless hours one way or another on this subject. It boils down to exactly what I have already stated, that [i]“Your ears hear what they hear and you perceive that input in a completely personal way. A TV show telling you something contrary to what you experience will not change that.”[/i] No metric and no amount of data in favor of one side of the debate or the other will change your personal experience. There simply is no myth, only the illusion of one.
Some people are physically more capable of better perceiving wider frequency ranges. Some people are physically more capable of distinguishing frequencies on smaller scales. I’ve seen the placebo effect in action and I’ve seen people consistently get A/B tests correct where others completely failed. This subject has been endlessly beaten to death by everyone from people with absolutely zero technical knowledge to the most respected ears in the business, to what the math says and in the end it always winds up at the exact same place it started – Everyone’s own personal experience trumping everything else. The buck stops there, period.
Now, there’s a difference between transferring something analog into the digital realm, and emulating or reproducing it there. Algorithms that reproduce analog behavior are not 100% accurate 100% of the time, and they never will be, due to the nature of how analog sound is produced.
We exist in an analog world. Does that make analog recordings or reproductions better being native to the human experience, or does the ability to digitally duplicate sound beyond human physical limits make the question moot? You would think clearly the latter but then how do you explain those who manage to tell the difference?
friedchicken,
People may genuinely convince themselves that something sounds different even if it’s provably not the case. I’m not questioning their experience, they may genuinely perceive differences even if they aren’t there, but I am suggesting the placebo effect as the cause rather than the actual sound qualities.
It’s the same reason medical researchers use placebos to provide a control group for their trials. Some patients may get better just because they believe the medicine is helping. Despite the fact that these perceptions are largely personal, we can still account for it in scientifically rigorous studies to establish real versus fake causality between input and output.
I realize you probably disagree with me about the extent to which the placebo effect is happening with audiophiles, but I’d like to ask if you believe in the scientific basis for measuring placebo effects with control groups?
Can you cite the exact test? It really depends on the nature of the test and what it was actually comparing.
For example, records have fairly dramatic equalization curves applied to them in order to compress groove/needle motion.
https://en.wikipedia.org/wiki/RIAA_equalization
And while these are intended to be reversed on playback, the analog filter circuits that accomplish this are approximations at best. The result is lossy and may be perceptible versus the original signal. Some people may actually prefer sound that has undergone the record’s lossy equalization curves to the original unaltered audio, and that’s fine.
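To put some numbers on it: the standard RIAA time constants are 3180 µs, 318 µs, and 75 µs, and even a digital realization of the playback de-emphasis curve is only an approximation of the ideal one. Here’s a minimal sketch (my own, with an assumed sample rate and test signal) using SciPy’s bilinear transform:
[code]
import numpy as np
from scipy.signal import bilinear, lfilter

fs = 44100  # assumed sample rate in Hz

# Analog playback (de-emphasis) prototype:
# H(s) = (1 + s*318us) / ((1 + s*3180us) * (1 + s*75us))
t1, t2, t3 = 3180e-6, 318e-6, 75e-6
b_analog = [t2, 1.0]
a_analog = [t1 * t3, t1 + t3, 1.0]

# Map the analog curve to a digital filter; the frequency warping of the
# bilinear transform makes this, too, an approximation of the ideal curve.
b, a = bilinear(b_analog, a_analog, fs)

signal = np.random.randn(fs)          # one second of white noise as a test input
deemphasized = lfilter(b, a, signal)  # bass lifted, treble cut, per the curve
[/code]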
– so far I hope we are in agreement on the facts above –
When a digital recording does not contain the equalization transformations used by records, it may be closer to the original signal yet farther from the analog record’s output. And in this sense, yes, people can tell the difference between “analog” and “digital”. However, as scientists, that’s not really the difference we’re talking about, because the data being digitized did not match the record’s audio output in the first place. We’re talking about the ability to digitally represent the exact same output (within our ability to perceive it). A quality digital representation that has been mastered to replicate an analog record will be imperceptible in A/B tests.
The assumption is usually that staying in the analog domain implies no loss, but this isn’t quite right. There’s only a finite range of sound that can be accurately recorded and played back even in the analog domain. The needle size limits how fine the details can be. Also, when the frequencies get too high, momentum can prevent the needle from following the groove as quickly as the original signal that created it. Furthermore, analog circuits are not that linear. There’s a lot of imprecision in the analog domain (for better or for worse). In the digital domain, the math can be as precise as we want, so the only real limits are the precision/quality of the ADCs and DACs.
So while I have to agree that digitization is inherently lossy (as per Nyquist), it can nevertheless be built to a spec which is more accurate than what our analog records make possible.
https://en.wikipedia.org/wiki/Nyquist_frequency
I wouldn’t be surprised if early DAC and ADC circuits had poor frequency response. But our digital signal processing has gotten much better over the years and should easily surpass the limits of human hearing, at least in professionally calibrated equipment.
It would be interesting to see these comparisons on an oscilloscope. I don’t have any of the necessary equipment to do this myself, but I’m sure it’s already been done.
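In lieu of scope traces, here’s a toy numerical sketch (mine, with assumed parameters) of the Nyquist point: a tone below half the sample rate is recovered between the sample points almost exactly by Whittaker-Shannon sinc interpolation, with the residual error coming only from truncating the sinc sum to a finite window.
[code]
import numpy as np

fs = 48000.0        # assumed sample rate
f = 15000.0         # tone well below the Nyquist frequency fs / 2
n = np.arange(256)  # a finite window of samples
samples = np.sin(2 * np.pi * f * n / fs)

# Reconstruct between the sample points, away from the window edges where
# the truncation error of the finite sinc sum dominates.
t = np.linspace(40, 215, 2000) / fs
recon = np.array([np.sum(samples * np.sinc(fs * ti - n)) for ti in t])
exact = np.sin(2 * np.pi * f * t)
print(np.max(np.abs(recon - exact)))  # small, and it shrinks as the window grows
[/code]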
Can I answer that after you provide a link to the exact test you are referring to?
@friedchicken
I’m aware of the multi-domain issues from initial recording through to final output to the ear, and the neuro-psycho-social and perception issues. A minimal treatment would require at least one page of explanation per point and I leave it to others in their field who can explain it better than me.
There are technical and practical reasons why digital and analogue can each be better than the other in practice. Anyone with a cursory knowledge of the field, whether it’s recording, post-production, or creative processes, knows this. They are all connected. So it’s one of those “it depends” things. A known known.
While the subjective and placebo effects can lead to deluded opinions, at the same time they are not to be dismissed or sneered at, because they are part of the whole experience. There is a wide variety of applications of this in multiple fields, to great effect. Yes, it is true that an uneducated and coarse person prone to wild-eyed handwaving can talk nonsense; at the same time, it is true that the highly technical, focusing solely on theoretical and narrow domain knowledge, can also, with great respect, be talking out their posterior. Again, these are known knowns.
Digital in all its forms is useful, but then so is the analogue. I limit my use of digital now to basic office tasks, browsing the web, email, watching video, sometimes listening to music, and a couple of other things, but that’s it. I’ve “been there, done that”, so I have limited interest in hardware or software beyond “is it useful?” I’m old enough to remember when the world was mostly analogue, and old enough to have done digital to death. I am introducing more analogue into my life and have enough bandwidth taken up by this not to care much for pages of go-around-in-circles nerdgasms and hairsplitting and posturing. At the same time, the pandemic is forcing lots of people to work remotely and cut down on their interaction with the wider world. This is leading to mental health issues, social issues, and some degree of detachment, moderated by invisible technical decisions which themselves cause problems. Again, more known knowns.
I have decided the analogue myth is not a myth. No amount of technical posturing by people with knowledge of only one domain, and a technical domain at that, will alter this. As a counter-example, the “50 Hz” myth is a myth and took decades to undo. Film and digital are different for lots and lots of reasons before you even get to see a moving image. The brain can easily be aware of frames up to approximately 200-250 Hz. Only at that point does it become as solid as direct reality. As for an analogue music player, will I benefit? Overall no. I don’t need the faffing about or clutter. OTOH, having a real book in my hand and a cup of tea, with a classical quartet playing in the background and my mobile left at home? Not a shred of digital or connectivity. Oh, my life. What am I going to do? I’m sure there are many who would have a nervous breakdown, but I think it would be rather nice. Digital is a wonderful thing, but it has its place, and let’s not have the tail wag the dog.
God, we’ll be back to Moscow rules next. lol
All I’m saying is that ‘better’ is subjective. I don’t have any skin in that game.
Under-phil,
I didn’t disagree with your original post; my response was adding nuance to it.
Just as a patient taking a pill might feel better afterwards, a music consumer playing a record might feel it sounds better. In both cases you’d be right that “‘better’ is subjective”, but in both cases there may be biases caused by a placebo effect.
This may not matter to the people involved, they feel better and the ‘why’ isn’t important to them (if so, great). But it does matter in terms of establishing physical causality. There may be hidden variables at play. In scientific studies this ambiguity can often be resolved with double blind testing.
And yet, an archived court case file from 50 years ago is turned into pulp in my country.
I still remember the first time I used a floppy drive in Linux. No fancy automount, no fancy write until you tell it to sync either.
You would put the disk in and have to mount it. Then you’d go to copy the data, and it would make sounds as if it were writing, but it was basically just gauging whether there was enough disk space. If you didn’t run sync or the umount command before ejecting the disk, you would have no data on it.
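For anyone who missed that era, the workflow looked roughly like this. A sketch translated into Python for illustration (the device path and mount point are assumptions, and it needs root):
[code]
import os
import shutil
import subprocess

subprocess.run(["mount", "/dev/fd0", "/mnt/floppy"], check=True)
shutil.copy("data.txt", "/mnt/floppy/")  # may only land in the page cache
os.sync()                                # flush dirty pages out to the disk
subprocess.run(["umount", "/mnt/floppy"], check=True)  # now safe to eject
[/code]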
Sadly, you haven’t been able to get a motherboard with a floppy connector on it for years. Someone needs to make a PCIe version of the Catweasel.
Why do you feel that’s sad? I can’t think of a single person who would prefer a floppy connector taking up real estate on a mainboard versus a different, modern, and vastly more capable connector. I have fond memories of the days of 3.5″ floppies (and 5.25″ for that matter), but do I miss them at all? Hell no.
Well, if you insist on using internal floppy drive, you could use a floppy to USB adapter. 😉
e.g.
https://www.amazon.com/dp/B07WCRF9H3/
plus
https://www.amazon.com/dp/B06Y5C7DKH/
This is what I did. I write a lot of floppies for older machines, and so used that. I cut off the USB plug and replaced it with a 10-pin USB header so it could connect straight to the board, and installed a power switch on the 5.25″ blanking plate so it wasn’t always clicking (waiting for a disk). It’s the closest thing to an internal floppy right now.
There are new industrial boards being made with legacy connectors.
If you want a machine that can run modern Windows or Linux and have native floppy support, most Core 2-era motherboards still had a native floppy controller and connector.
Since ditching tower systems and shifting to docked laptops, my floppy disc went bye-bye. I used to have reasons for needing a floppy disc, so I bought a USB external floppy drive and it worked well. What I didn’t like was Microsoft removing easy access to the floppy disc via Explorer and, I think, dropping the floppy driver from the default installation or some such. There is nothing about Explorer which adds or removes the benefit of floppy disc access, so burying floppy disc access in a weird sub-menu somewhere else seems pointless and unnecessarily difficult for the end user. In removing one “complexity” they added another complexity.
Considering the usability and psychological issues, I’m wondering whether, given Microsoft outsourcing what is a trivial cost onto the end user, plus the influence of marketing-led “nudge theory” and “behavioral psychology”, the deprecated and modified functionality is legally actionable for end users’ additional costs and psychiatric damage, i.e. psychological assault and emotional damage.