A few years ago, backstage at a conference, I spotted a blind woman using her phone. The phone was speaking everything her finger touched on the screen, allowing her to tear through her apps. My jaw hit the floor. After years of practice, she had cranked the voice’s speed so high, I couldn’t understand a word it was saying.
And here’s the kicker: She could do all of this with the screen turned off. Her phone’s battery lasted forever.
Ever since that day, I’ve been like a kid at a magic show. I’ve wanted to know how it’s done. I’ve wanted an inside look at how the blind could navigate a phone that’s basically a slab of featureless glass.
This week, I got my chance. Joseph Danowsky offered to spend a morning with me, showing me the ropes.
There’s a ton to dislike about iOS, but its assistive technologies for people with disabilities are absolutely spectacular. Nothing even comes close to it.
Indeed, the assistive technologies on iOS are impressive and cover many languages.
Some time ago, when I was testing an iPad for this exact reason (to buy one for a relative of mine with eyesight issues), the only “issue” I noticed was that you can’t have both the VoiceOver feature and the Zoom feature enabled at the same time.
I’m not sure if this has changed now but if anybody has any experience I’d love to hear it.
Actually, the UWP developer tools have support for developing applications for the blind, including turning the screen off, so that sighted developers can have the same experience.
“Accessibility on Windows 10”
https://channel9.msdn.com/Events/Build/2016/P541
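For the curious, the developer-facing side of that support is mostly the `AutomationProperties` attached properties in XAML, which feed Narrator and other UI Automation clients. Here is a minimal sketch; the button and its labels are made up for illustration:

```xml
<!-- Hypothetical UWP XAML fragment. Without AutomationProperties.Name,
     a screen reader would announce this control as just "button". -->
<Button Content="▶"
        AutomationProperties.Name="Play recording"
        AutomationProperties.HelpText="Plays back the last voice memo" />
```

Testing with the screen off, as the linked talk suggests, quickly reveals which controls are missing names like these.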
Windows has horrible accessibility. They’ve been sued over it and lost; it’s the worst in the industry.
By whom and when? Where are those facts?
The accessibility situation, particularly in Windows 10, is getting worse. Microsoft is changing their implementation so that third-party accessibility tools are falling further behind. On top of that, their own accessibility solution (Narrator) falls far short of even Google TalkBack.
There was a time when Windows was one of the most accessible platforms, even though you had to resort to third-party products (some of which cost more than a computer) to make it happen. The situation now is, to be honest, depressing. I don’t much care for Windows, but I’m all for competition, even in accessibility features.
That being said, I’m not aware of any legal action taken against Microsoft. Technically, Microsoft’s pitiful efforts qualify as meeting the Section 508 requirements of the Rehabilitation Act, so I doubt they’d ever get more than a slap on the wrist even if they were taken to court over it. Further, they might even face antitrust complaints if they did try to drastically improve their accessibility tools, as they’d compete with the established third-party solutions serving the market, and the company behind most of them (a virtual monopoly in and of itself now) is known to be sue-happy and has a proven track record of it.
Microsoft Narrator is so far behind the competition that you absolutely must install third-party software like NVDA or JAWS to get a usable computer for a blind person.
Meanwhile, Mac OS X’s accessibility is 100% built-in and so well integrated that you can reinstall the OS from Internet Recovery mode while using VoiceOver. A blind person doesn’t need a sighted person to reinstall their operating system on a Mac if something has gone wrong and they end up needing to do it.
Same for Ubuntu. Although Linux accessibility is still inferior to Mac OS X’s, it’s more integrated than Windows’. Once the live-USB boot-up chime can be heard, a keyboard shortcut combo launches Orca, the screen reader. Once again, a blind person can install their operating system by themselves, without outside assistance.
As for the main topic, which was iOS vs. other mobile OSes (not Windows 10, which is completely dead in the mobile market), there is just no comparison possible. Even if Windows 10 weren’t dead, there would be the issue of the app ecosystem and developers not caring enough about it to support accessibility APIs in their apps:
https://coolblindtech.com/latest-windows-phone-10-previews-are-acces…
> Unfortunately, at the time, very few Windows Phone apps had accessibility built in. In fact, if you’re a developer interested in or already making apps for the Windows platform, click this link to learn more.
This was a problem because some of Nokia’s own settings screens were completely unusable. Cortana later introduced the ability for apps to “launch” various actions, so we were able to use Netflix, as an example.
> The game does not change with Windows Phone 10. (Or if you prefer, Windows 10 Mobile.) You still have a large number of inaccessible apps — from my own personal list, Pandora/Spotify are still inaccessible, while Swarm works with workarounds, and Netflix is passable as “OK.”
Those apps are accessible in their iOS incarnations. Mac developers more often care about developing well-integrated software than their Windows-using counterparts. The experience of using a 100% Cocoa macOS/iOS environment, or a 100% GTK GNOME environment on Linux, is just light-years ahead.
I doubt you even read the article before commenting, because:
> Joe showed me how he takes photos. As he holds up the iPhone, VoiceOver tells him what he’s seeing: “One face. Centered. Focus lock,” and so on. Later, as he’s reviewing his photos in the Camera Roll, VoiceOver once again tells him what he’s looking at: “One face; slightly blurry.”
> “If a cab or an Uber lets me off somewhere, and I’m not sure which way is uptown, I open the Compass app. Since NYC is a nice grid, it lets me know which way I’m walking.”
> “Or I might just say to Siri, ‘Where am I?’ She tells me exactly where I am.”
> Joe uses a lot of text macros. He’s set one up that says, for example, “Where are you?” when he types.
> He knows the positions of all his apps’ icons—but often, he’ll just say to Siri, “Open Calendar” (or whatever).
This level of integration with the entire system and built-in apps basically doesn’t exist anywhere else.
https://support.google.com/accessibility/android/answer/6007100
Seems like a common feature everyone has, not something special or unique to Apple.
The article did address TalkBack and pointed out that it wasn’t as good. They even linked to this article https://icodelikeagirl.com/2016/03/27/unity-accessibility-plugin-upd….
Unfortunately, it’s not as simple as having a “common feature.” Speaking as someone who must use these features every day, I have to say that Google TalkBack is so far behind iOS’ VoiceOver that it’s not even funny. To explain everything, and why Apple is so far ahead in this game at the moment, would be an article in and of itself. I’ve offered to do a comparison article here before on the accessibility features of Android/iOS/Mac/Windows, but there has been little interest expressed.
Well, it seems like Thom is now interested. I’d love to see a comparison; I don’t use the features myself, but would love to understand where each choice stands currently.
I am also professionally curious, even though I do not use it myself and am not, at least not yet, involved in any development work that needs to be accessible.
I have previously tried searching for developer resources on the topic, but a lot of the content is either on the very simple side or way too complicated for semi-casual reading by a developer who would just like to know some general best practices for making his work at least not suck.
My spouse does use screen magnification, and here macOS seriously beats the Windows Magnifier.
Because of this, I checked screen zoom on both iOS and Android as well, and this might be the only space where Android is easier to work with, as I just could never get comfortable with the three-finger gestures in iOS. She does not use it, though, except for sometimes taking a screenshot and zooming into that 🙂
This, honestly, is part of the ‘Apple premium’. I have known for decades that they were driving the market for assistive technologies by sinking in R&D that didn’t make the main bullet points, and that made me feel good about buying their stuff.
I knew we would all need it at some point. I’m heading to the eye doctor soon. My lifelong 20/20 is not so strong anymore… 🙁
I’m lucky – I’m near-sighted. As people get older, they naturally become more and more FAR-sighted, which means my eyesight gets better every year. I use the lowest prescription I’ve had since I started wearing glasses at 12-ish. My latest pair is considerably thinner than my first pair, and they’re not even the new super-thin lenses they now have available.
Cool, except that this is not how it works. Yes, you will be able to read things you previously needed glasses to focus on, but what happens is that the muscles around your lens become weaker and lose their capacity to deform it, while at the same time the lens itself becomes more rigid and more opaque. Sorry, I wish things were the way you dreamed.
Well, I’m not an ophthalmologist, but that’s what the last one I saw (about three years ago) told me. That, and the fact that my last prescription was only a quarter of the power of my first, clearly means that I (if nobody else in the world) am getting more far-sighted as I age.
I am in the same boat, except I now really cannot do without reading glasses if I have my contacts in, or I have to take off my prescription glasses to look at something at close range. Focusing takes fractionally longer and is less stable than it was even a couple of years ago, and this is from someone who breezed through the bottom line of the optician’s chart with corrected vision, something I was informed most people struggle with even at 20:20.
You should have your eyes tested annually. Stuff like glaucoma or macular degeneration is more likely the older you get. Leaving it for three years is less than smart.
Yeah, you’re right about the testing. My YOUNGER brother just got a cataract in one eye, so we’re all old enough to need that yearly testing. Too often, we just wait until we start to show symptoms, and then it’s more of a problem. Catch it early and you’re better off.
Though I don’t need assistive tech at this time, I found it to be a worthwhile read…
“There’s a ton to dislike about iOS…”
We feel that way about Android too.
Indeed. Truthfully, I dislike aspects of both. I simply dislike iOS a little less, especially from a stability and battery-life standpoint, in addition to the accessibility.
I run BeOS on my pocket computer.
I dream in infrared. I rebooted myself three days ago.
One other reason iOS’ accessibility features are superior to Android’s is the ease of using multiple languages when necessary. If there is correct language mark-up, VoiceOver will switch to an appropriate voice automatically. If not, it’s ease itself to switch between any languages you need VoiceOver to use. This is independent of your device language, so, for example, my iPad is in English but I want to read an article in Swedish. No problem: I can switch VoiceOver to Swedish in a second, read what I want, and just as quickly switch to German or English or whatever I need next. It’s ironic, but VoiceOver actually handles multiple languages better than Siri does.
With Android, you have to go through the laborious process of changing your default TTS voice every time you want to read in several languages. This not only takes thirty seconds every single time, but also… have you ever tried to read your device’s English UI while your TTS is set to Swedish? The result, while humorous, is quite difficult to comprehend. Forget about reading mixed-language text on Android. It’s an ordeal not worth even attempting.
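For anyone wondering what “correct language mark-up” means in practice: on the web it is simply the standard HTML `lang` attribute. When a page is tagged like the made-up fragment below, VoiceOver can pick a matching voice for each passage automatically, whereas (per the description above) TalkBack reads everything in whatever the single default TTS voice happens to be:

```html
<!-- Each lang attribute tells the screen reader which voice/pronunciation
     rules to use for that passage. -->
<p lang="en">The meeting is at noon.</p>
<p lang="sv">Mötet är klockan tolv.</p>    <!-- Swedish: "The meeting is at twelve." -->
<p lang="de">Das Treffen ist um zwölf.</p> <!-- German, same sentence. -->
```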