Ever since rumors started swirling that Apple was working on a wearable device, I’ve often thought about what such a device would mean for people with disabilities. My curiosity is so high, in fact, that I’ve even written about the possibilities. Make no mistake, for users with disabilities such as myself, a wearable like the Apple Watch brings with it usage and design paradigms that, I think, are of even greater impact than what the iPhone in one’s pocket has to offer.
Suffice it to say, I’m very excited for Apple Watch’s debut sometime next year.
Accessibility is definitely a strong point for Apple – at least compared to the competition – and I don’t think the Apple Watch will be any different.
Can’t say I’m excited about Apple’s or anyone else’s smartwatch, nor can I see why many people would be. Maybe I’m out of the loop, but they seem to do very little that isn’t already done better on a phone, and until they start sprouting holographic displays and beaming me up on request, I doubt that will change.
From my perspective, though I don’t see the point of them, I’ve been wondering how the heck you’d make these devices accessible to the blind. Short of duplicating what we have on touch screen phones, where one touches something and it speaks, I can’t think of any other way. The problem with that is noise pollution, both trying to hear it in a noisy environment and disturbing others. Maybe I’m old fashioned, but I still try my best not to disturb anyone with my noise, so I use a Bluetooth earpiece with my phone. At that point, however, what purpose would a smartwatch serve?
The Taptic Engine sounds promising, but the interface for the watch itself is also much more complex than Android Wear’s.
Primary navigation on Android Wear is expected to be audio, which is great for the blind, though maybe not so great for the deaf.
Depends. For that to work, the watch would have to use TTS (text-to-speech) to talk back to a blind person. Speaking to it won’t do us any good when we can’t see the result. If it’s anything like Google Now, we’d need a way to review the screen too. On phones we have this; no idea about watches yet.
motion input is already in use in console video games, but only for gaming and basic OS navigation. imagine being able to type (select) a message, reply, accept/dismiss an alert, or otherwise give the watch input through wrist and arm motion alone. imagine the wii controller built into your wrist with full iOS programmability.
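a wrist-motion command scheme like this could be prototyped with nothing more than rotation samples from a gyroscope. here’s a minimal sketch in Swift — the gesture names, the sample format, and the threshold are all invented for illustration, not any real WatchKit API:

```swift
// Hypothetical wrist-gesture classifier: sums wrist-roll deltas
// (in radians) sampled from a gyroscope and maps the net rotation
// to a command. All names and thresholds here are illustrative.
enum WristGesture {
    case turnOut   // wrist rotated outward past the threshold
    case turnIn    // wrist rotated inward past the threshold
    case none      // movement too small to count as a gesture
}

// The 0.5 rad threshold is a guess; a real device would need
// per-user tuning to avoid false positives.
func classify(rollDeltas: [Double], threshold: Double = 0.5) -> WristGesture {
    let net = rollDeltas.reduce(0, +)
    if net > threshold { return .turnOut }
    if net < -threshold { return .turnIn }
    return .none
}
```

two deliberate outward turns would then show up as two consecutive `.turnOut` classifications, which an app could map to “reply OK.”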
haptic feedback is also in use in console video games but only for vibrating cues, and apple has been experimenting with it using custom vibrations on iphones. imagine being alerted by a style of vibration on your wrist, and factor in variations in intensity, length, and pattern as well as stereo (LR) to give more alert types.
also remember they have built haptic input into the touch screen, so it will know the difference between a soft touch and a hard jab, along with all of the advanced swipe gestures that you can fit on the little screen.
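that soft-touch vs. hard-jab distinction boils down to a threshold on the reported pressure. a toy sketch, assuming a normalized 0.0–1.0 force reading — the cutoff value is made up, since apple hasn’t published the real one:

```swift
// Toy pressure classifier for the two touch styles the screen is
// said to distinguish. The 0.6 cutoff is an arbitrary illustration,
// not a documented value.
enum TouchKind { case softTouch, hardJab }

func touchKind(force: Double) -> TouchKind {
    return force >= 0.6 ? .hardJab : .softTouch
}
```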
finally there’s voice input and output to go along with the dial input and screen output. that’s a lot of UI IO and a lot of new concepts.
killer features could be things like:
– buzz my wrist one way when an email/message comes in from one of my VIPs
– reply by pattern: 2 wrist turns out to send ‘OK’, 2 wrist turns in to construct a custom message
– buzz my wrist another way when an alarm triggers, 1 slow wrist turn in to dismiss
– buzz my wrist a 3rd way when my wife contacts me
– buzz my wrist a 4th way when it’s time to take a break
– buzz my wrist left-ward or right-ward for navigation
– add in voice prompts, dial spins or clicks, soft taps, hard taps, and swipes, and you have a device that can perhaps operate without a visual display at all.
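the alert vocabulary above amounts to a lookup table from event type to a distinct vibration signature. a hedged sketch of how an app might encode it — every name and field here is hypothetical, not a real watch API:

```swift
// A vibration signature: which side buzzes, how many pulses, and
// relative intensity. All fields and values are illustrative.
struct HapticPattern: Equatable {
    enum Side { case left, right, both }
    let side: Side
    let pulses: Int
    let intensity: Double   // 0.0–1.0
}

// One distinct pattern per event type, mirroring the list above.
let alertVocabulary: [String: HapticPattern] = [
    "vipMessage":    HapticPattern(side: .both,  pulses: 1, intensity: 0.8),
    "alarm":         HapticPattern(side: .both,  pulses: 3, intensity: 1.0),
    "spouse":        HapticPattern(side: .both,  pulses: 2, intensity: 0.6),
    "breakTime":     HapticPattern(side: .both,  pulses: 1, intensity: 0.3),
    "navigateLeft":  HapticPattern(side: .left,  pulses: 1, intensity: 0.7),
    "navigateRight": HapticPattern(side: .right, pulses: 1, intensity: 0.7),
]
```

the open question — raised below — is how many entries such a table can hold before the wrist can no longer tell them apart.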
the thing about “accessibility” is we all need it, no matter how good our eyes and ears are. it’s not just for the handicapped anymore. the screens are so small and the sensor array-based software is so advanced that we all could use these features in a variety of circumstances.
i think “looking at your phone 500x a day” is apple’s next target for expiration. i’m happy to go on that journey with them. i’ve been staring at a smart phone, dropping it in and out of my pocket for 15+ years now, and just look around, that’s all people are doing anymore. staring at screens. boo.
there are several new input and output styles that will hit the mainstream with the iWatch, and as usual apple is doing the unsexy work of sorting out the basics of this new computing form, and will try to profit from it. i think they are finally going beyond star trek with the iWatch, and the android watches out now are really missing the point but will soon follow apple’s lead into this new human-machine interaction.
Problems I foresee:
– body positioning – lying in bed, lying on your side, running – all present complicated wrist positions that might confuse the watch
– how many separate vibration styles can our wrist determine and our brain process w/o having to look to clarify?
– how many times can our wrist vibrate in a day/week/year before we get ghost vibrations?
– false inputs. a watch gets banged around a lot since it’s on your wrist. we lie on them and swing our arms around for all sorts of reasons, and any of that can register as a false positive. if you have to “log in” each time you want to send it a command, that could be very tiresome. pocket calling/texting will be replaced by “i played with my son the other day and it accidentally messaged my boss a drug deal!”
– too much input and output mixing, making for a complicated ordeal that’s simpler on a phone
– nerds making all of this so uncool that it’s mocked mercilessly, the iWatch fails, and then someone else comes along 4 years later with a simpler version and succeeds (like the newton).
If you don’t want to look at your screen, you are “blind”.
If you don’t want to hear your device make noise or read text, you are “deaf”.
If you don’t want to shout commands at your device, you are “sane”.
If you have a tremor in your hand, you have motor-skill deficiencies.
If you can’t read small text, you have vision deficiencies.
What we call “accessibility” now is the future of human-machine interaction. Apple is way ahead in some of these areas, and the WatchKit API along with ingenious programmers should end the screen-addict years.
Yes, Apple is ahead in these areas. They’ve done ever so well with Siri, after all.
snark snark. apple has done fine with siri.
plus accessibility is a lot more than voice control.
if you read my post again, you’ll see that voice control is one of the least helpful ways to control a computer.