Our testing, technical analyses and audio latency measurement database of more than 4,238 different Android models/builds show that Google has been making great progress toward solving the Android round-trip audio latency problem. However, progress seems to be slowing, as the current media server internals are not likely to be hacked much further unless fundamental changes happen. To date, we have seen no improvements in Android N with regard to audio latency.
We receive emails from all around the world, almost on a daily basis, in which developers beg us for a solution to Android Audio's 10 ms Problem. That is why we're proud to announce a solution, which you can install and demo today.
Few regular users will ever care, but for those users that do need low audio latency for music/audio creation applications, this is a godsend.
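For anyone curious where per-device numbers like those come from, a rough first-order sanity check is simply the buffer size and sample rate the device itself reports. To be clear, the database in the article is built from actual measurements rather than estimates like this; the following is only a minimal sketch using the stock AudioManager properties:

```java
import android.content.Context;
import android.media.AudioManager;

// Rough, one-direction estimate of output buffer latency from what the
// device reports. Real round-trip latency (as measured in the article's
// database) also includes input buffering, resampling and driver/DSP time.
public final class LatencyEstimate {
    public static double outputBufferMs(Context context) {
        AudioManager am = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
        // Both properties exist since API 17; they may be null on older devices.
        String frames = am.getProperty(AudioManager.PROPERTY_OUTPUT_FRAMES_PER_BUFFER);
        String rate = am.getProperty(AudioManager.PROPERTY_OUTPUT_SAMPLE_RATE);
        if (frames == null || rate == null) return -1;
        return 1000.0 * Integer.parseInt(frames) / Integer.parseInt(rate);
    }
}
```

For example, a device reporting 240 frames at 48 kHz is advertising 5 ms of output buffering before input, driver and processing time are even counted.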
So it looks like they reduced the delay from 10ms down to 8ms? Or am I missing something?
Looking at the numbers, it seems to reduce by about 8ms.
Look at the top right, it says total roundtrip time is 8.1 ms:
http://superpowered.com/images/android-low-latency-audio-solved.gif
Ah, I was looking at the charts, which seem to show a decrease of ~7ms in latency.
Seems most were not even at 10ms to start with.
The problem to solve was bringing audio latency down to 10 ms. That is what iOS achieves, and it is apparently a requirement for high-quality tools. The latency on control devices seems to have originally been about 15 ms.
Music/audio creation applications don't give two (censored) about audio latency, especially as low as 10ms. It is video apps that are affected by audio latency. In fact, in most OSes I have tried, audio sync is never perfect, but always just off enough to be noticeable. You probably won't notice any lip-sync issues, but when something in the video hits the floor, you'll notice the sound is just a bit off, enough to make it seem unnatural.
You are completely wrong. Latency in audio production is absolutely a concern. Things get messy very quickly when you're experiencing latency issues, and no, 10ms is not insignificant in a pro-audio or multitrack environment. Sync doesn't only apply to video; it's also a main component in recording.
So what about 8.1ms? Because that is what you get with the solution in the article.
Maybe I'm wrong, but I just hope the numbers are stable. I would think that's much more important than how low they are. Variable latency would be the worst.
Stability is not really an issue. The amount of latency is however, and 10ms is not insignificant. Anyone who knows what they’re talking about, or just has common sense, will tell you the lower the latency the better. You won’t find any professional smiling about 10ms latency but you will in the 2-5ms range.
Latency is important, period. There’s not a circumstance where it isn’t, or is insignificant. What impact it has depends on what & how the equipment is configured, and what you’re doing with it. Anyone who tells you different simply doesn’t know what they’re talking about.
I understand that. The problem is: this is an improvement from 10ms to 8.1ms. I'm just wondering how much of an improvement that really is.
Depending on the situation, that difference can have little impact or a lot. One general constant however is lower is better.
You are reading it wrong. What it says is that the requirement was less than 10ms, and they achieved 8ms. The improvement was bigger than that.
You are right and wrong. Latency is important, but 10 ms is insignificant, which is why the goal was to get it DOWN to 10 ms.
You probably mean something different by audio creation, but speaking as someone who has a serious musical hobby and traffics with Ableton-sporting musicians and DJs: 10 ms is absolutely a concern. If you get into effects and looping on a professional level while you’re playing live, you’re going to want to keep it under 8 ms, and be able to do it without cracks, pops, or gaps. Android has been a no-show in this department, whereas artists like Beardyman use iPads onstage.
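To put those thresholds in buffer terms (my own back-of-the-envelope, not numbers from the article): at a typical 48 kHz native rate, every buffer period costs frames/rate milliseconds, and a round trip needs at least one input period and one output period before the driver, the resampler and your own processing add anything on top.

```java
// Back-of-the-envelope: how much of a latency budget each buffer period eats.
// A round trip needs at least one input period plus one output period, and
// driver, resampling and app processing time come on top of that.
public final class BufferMath {
    public static void main(String[] args) {
        int sampleRate = 48000;              // Hz, a typical native rate on recent devices
        int[] periods = {512, 256, 128, 64}; // frames per buffer
        for (int frames : periods) {
            double ms = 1000.0 * frames / sampleRate;
            System.out.printf("%d frames @ %d Hz = %.2f ms per period (>= %.2f ms round trip)%n",
                    frames, sampleRate, ms, 2 * ms);
        }
    }
}
```

That's also why the "cracks, pops, or gaps" part matters as much as the headline number: small periods are only useful if the device can service every single one of them on time.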
Oh really? ilovebeer pointed out that you are wrong.
Imagine yourself strumming an electric guitar. Before you use it, you plug the audio cable from your guitar directly into your speaker. Or, if you have a **gadget**, you plug the cable from your guitar into the **gadget**, then a cable from the gadget to your speaker. Strum it and you hear the sound immediately.
One example: with a guitar audio application (NOT a video application) installed on your Android gadget with 15ms or more of audio latency, you plug your guitar into that Android device, then plug a cable from the Android device to your speaker. Start strumming your guitar and you will notice the delay in the sound of your guitar coming from the speaker. Audio latency of 10ms or more is very noticeable in audio applications like this; you will notice the delay.
Now try playing live with that audio latency problem and it's a nightmare.
Audio latency is critical in some audio applications. Of course it is less critical if your application is just a music player.
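A rough way to get a feel for why that delay jumps out at a guitarist (my own analogy, not something from the article): sound in air covers roughly 0.34 m per millisecond, so a given latency feels a lot like standing that much farther away from your amp.

```java
// Rough physical intuition: latency expressed as an equivalent distance
// from your amp, using the speed of sound in air (~343 m/s at room temperature).
public final class DelayAsDistance {
    public static void main(String[] args) {
        double metresPerMs = 343.0 / 1000.0;
        double[] latenciesMs = {8.1, 10, 15, 35};
        for (double ms : latenciesMs) {
            System.out.printf("%.1f ms of latency ~ playing %.1f m away from your amp%n",
                    ms, ms * metresPerMs);
        }
    }
}
```

By that yardstick, 8 ms is like being a couple of metres from the cab, which most players tolerate, while 15 ms and up starts to feel like playing from across the room.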
Not something I expect normal users to have on their devices.
Normal users don’t need this thing.
Normal users make use of audio applications that are only available on iOS/OS X and Windows, which aren't ported to Android due to this problem.
Not really true. ASIO or WASAPI users on Windows do not qualify as normal users. They are specialists.
Normal users have no idea what low-latency audio is.
Normal users are everyone who isn't nerd enough to know what a userdebug build or a rooted production build is, or who would destroy their phones in the process of rooting them.
I doubt very seriously that most DJs and music producers using sequencing software on iOS, Mac OS X and Windows are also nerds who know all about rooting their work devices.
You are partially wrong. Normal users include guitar players, bass players and other sorts of musicians. These are not geek users. They are also not ASIO or WASAPI users. They are normal users who use gadgets to power their guitars.
Therefore, if there were an audio application on Android that could do what their dumb guitar gadgets do, they would find it useful and cool. You would not have to bring those bulky gadgets along to every gig.
“Producing” on tablets, phablets, and cellphones is a novelty. If all you’re doing is screwing around or using them as a notepad for ideas then they’re fine.
this is one area where iOS has crushed android since the beginning – audio production apps and device integration.
yes it’s a subset of users, just like gamers and coders.
dare i say this will just lead to more bad EDM made by dorks pressing gadgets and pretending to work hard.
i was a button-pushing DJ for many years (98-04) and have now graduated to real instruments. kinda can't stand how kids these days believe that button-pushing DJs are the same as musicians, how they elevate them to rock star status. most EDM is like an audio-purge to my ears.
I'm not quite sure what you're saying – if you were yourself a DJ for a period, you must have some liking or respect for some genre or element of dance music.
personally i don’t think the likes of richie hawtin, dubfire, gold panda, steve bug, jeff mills, audion, laurent garnier, josh wink et (enormously more) al. could be considered as anything other than extremely skilled at what they do.
David Guetta et al., however, can go wank himself to death with an oversize rusty dildo as far as I'm concerned (hope I don't get banned for that).
But on a point of nomenclature – DJs and electronic music producers generally don’t refer to themselves as either musicians or recording artists anyway! They mostly self identify as what they are, “DJs” or “producers” – some are very much more skilled, some are less skilled. Musicians, agreed, they are not.
I'm old school — I spun mostly funk and soul and hip-hop from the early '90s back. I really don't like all-electronic music, especially one track after another. i need beats and breaks and musicians and real singers to get off.
i also produced my own music, called 2MERICA, which mixed fakees with real players; you can of course google it and check it out.
when DJ'ing i started on vinyl but moved to traktor dj when it came out for convenience. i was ahead of the curve and hated on by vinyl DJs for a bit, especially since i could do rapid-fire loops and other filtering trickery that you couldn't pull off all-analog.
that was nearly 2 decades ago though. the mp3 era makes music that sounds so horrible and compressed, and the way producers have to mix now with all kinds of trickery like parallel automated side-chain compressors — i know how to do all of that and i rejected it as a real art form.
i can’t be at a club playing this music i hate, and i’m not trying to exist as the old school guy on weeknights. i have some friends that still spin but i produce and listen now.
aphex twin is all i need in that department. and brian eno. and newcleus – the most underrated EDM band ever.
you can find me most nights getting a vinyl album on and/or listening to my wife play her piano.
anyway thanks for reading and allowing me to explain. nowadays i'm all about 24-bit hi-res, good DAPs, and restoring sound quality as something maybe the next generation will care about again.
I don’t agree with your assessment of digital audio, but I certainly do appreciate your confidence and music selection
Anytime I hear anything about audio latency it reminds me of BeOS and its 3ms latency. Amazing for its time…
I wonder if it would run under desktop Linux.
Probably, until the next tiny kernel update bricks it and takes your entire audio stack down.
Given it plugs into ALSA, I doubt that would be the problem that breaks it.
And sadly, yeah, I’ve had to tell guitarists to go with Foxconn slave made iPhones because there is no option in Android.
I record ALL. THE. TIME. I've got my Ardour rig down to ~5ms and it's liveable, but still not ideal.
You want less latency. Whatever the cost, whatever the sacrifice, less latency is the goal.
And I think people wanting to record guitar tracks are “normal users”. That is a really dumb argument. Damned near half the planet plays guitar, for Christ's sake.