A Fair Look at Talkback and VoiceOver

Original article from applevis.com by mehgcap (AppleVis Editorial Team)

Hello there, reader. Did you come here because you’re the world’s biggest Apple fan, and are excited to join in some Android bashing? Are you in love with Android, with visions of finally hearing someone put those Apple idiots in their place? Well, my goal is to do neither. You see, I’ve used iOS for years and recently spent some time learning Talkback. I found it an interesting experience. I want to compare VoiceOver and Talkback because each has strengths and shortcomings, and each could learn some major lessons from the other. Don’t worry, though: there is a winner.

What This Article Is and Is Not

I want to be very clear about this: my goal with this article is not to provide details on how to use Talkback or Android. It’s not to offer a handy list of Android resources. It’s not to explain the ins and outs of VoiceOver. I assume you have at least a basic familiarity with VoiceOver on iOS, and an understanding of touch screen gestures and other mobile screen reader concepts. Finally, I don’t want the comments to turn into a free-for-all. Keep things respectful and helpful. And again, don’t ask me for step-by-step details on Talkback, because that’s not why we’re here. I will sometimes describe how a feature works in the text below, but only so readers can follow the comparison; I’ll try not to go beyond the basics needed to understand what I’m referencing.

The Hardware and Software

I’ve used an iPhone 7 since the model came out in 2016. My Android device for this experiment was a refurbished Pixel 1, a phone released that same year. I got the blue one. The Pixel seems to be in perfect working order, with a good battery, functional parts, and no cosmetic problems that I, or the sighted people who have looked at it, could find.

For this experiment, I was using Android 9 with the latest Talkback and other Google accessibility updates installed. My iPhone was running iOS 13.

A Quick Pixel Review

Skip this section if you’re here for Talkback. You’ll miss nothing important. If you’re curious about the Pixel, though, stick around.

My Pixel is slightly larger, in all three dimensions, than my iPhone 7. However, it isn’t uncomfortable or poorly designed. It’s not as sleek and nice-feeling as my iPhone, but it’s not bad at all.

The Pixel has an aluminum body, a glass-feeling panel on the upper half of the back, and the all-glass front of any modern smartphone. It has a large chin, which feels odd as there is no home button or other control there. That doesn’t impact performance, though, so it doesn’t bother me. The fingerprint reader is on the rear, and seems to do a good job. In fact, adding my fingerprint was faster and smoother than on iOS.

The speaker grill is on the right side of the bottom edge, facing you if you hold the phone flat in your hand with the screen facing up. On the left side of this face is an identical grill, where I presume a microphone is hidden. Between the two grills is the USB-C port. A headphone jack is on the upper edge, about where it is on most iPads. The right edge holds the lock button, with the volume buttons below it. One nice touch is the texture: the lock button is ridged, making it rough under your finger, while the volume rocker is smooth. This makes it easy to feel which button you’re about to press. Opposite these buttons is the nano SIM slot.

The Pixel’s performance felt somewhat slower than my iPhone’s, but apps open fast enough to avoid frustrating me, and Google Assistant starts listening about as fast as Siri does. There’s a delay between performing a gesture and having Talkback respond, which I imagine is partly the phone and partly the software. Benchmark comparisons show that the Pixel lags behind the iPhone 7, but as a test phone, the Pixel is more than sufficient. It wouldn’t kill me to use it daily with its current speeds, but it could definitely be faster. Remember, though, that this is still a 2016 phone.

My main complaint is the speaker, which sounds tinny and weak next to the iPhone’s dual-speaker setup and better bass response. If I had to come up with an aesthetic annoyance, it would be the sticker on the lower part of the phone’s back. It has some codes, numbers, and so on. It’s a thick, obvious rectangle that ruins the look (I asked a sighted person) and feel of the smooth aluminum shell.

Basically, the Pixel is fine. Not great, or sexy, but… it’s fine. It has a few things my iPhone doesn’t, but it’s slower and bigger. Still, if you want a cheap Android phone with no third party modifications to the software, even something as old as the Pixel 1 is a great choice for the money. There’s nothing here that makes me hesitate to recommend it, especially since it can be had for under $150 from places like eBay, or even cheaper if you go for a refurb.

A Brief Talkback Talk

I won’t give you a full Talkback primer here. There is one aspect of the screen reader, however, that is essential to understand: it only intercepts some gestures.

VoiceOver on iOS reads every touch you make on the screen, then reacts. This is why some apps have special places where VoiceOver doesn’t interfere with gestures, while other apps ask you to turn VO off entirely. In contrast, Talkback only involves itself in touch gestures when those gestures use one finger. Two or more fingers are simply ignored, with Android itself reacting instead. Drag two fingers down from the top of the screen, and it isn’t Talkback opening your notification shade; it’s Android. Talkback has no idea what just happened; it only knows it has new content to read.

This explains why Talkback lacks support for customizing gestures that include two or more fingers, and why it uses back-and-forth and angle gestures so much. The developers needed a way to pack a lot of commands into one finger, because they couldn’t use more.
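
If you write Android code, here’s roughly what that model looks like at the API level. This is a minimal sketch of an accessibility service, not Talkback’s actual source: the command mappings and helper functions are invented for illustration, and a real service would also have to request touch exploration in its configuration.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent

// Minimal, illustrative screen-reader-style service (not Talkback).
// With touch exploration requested, the framework hands ONE-finger gestures
// to onGesture(); touches with two or more fingers never reach the service
// and go straight to the app or to Android itself.
class SketchScreenReader : AccessibilityService() {

    override fun onGesture(gestureId: Int): Boolean = when (gestureId) {
        GESTURE_SWIPE_RIGHT -> { moveToNextItem(); true }
        GESTURE_SWIPE_LEFT -> { moveToPreviousItem(); true }
        // An angle gesture (up, then left), mapped here to going home.
        GESTURE_SWIPE_UP_AND_LEFT -> { performGlobalAction(GLOBAL_ACTION_HOME); true }
        // A back-and-forth gesture (right, then back left), mapped here to scrolling.
        GESTURE_SWIPE_RIGHT_AND_LEFT -> { scrollForward(); true }
        else -> false // anything unhandled falls through
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) { /* speak new content here */ }
    override fun onInterrupt() { /* stop speech */ }

    // Placeholder navigation hooks for the sketch.
    private fun moveToNextItem() {}
    private fun moveToPreviousItem() {}
    private fun scrollForward() {}
}
```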

It also explains a quirk of Talkback: if you touch the screen to explore it, you must pause for a small amount of time. If you place a finger and begin moving it right away, Talkback reads the movement as an attempt at a gesture instead of telling you what’s under your finger. This is rarely an issue, as the delay needed is brief, but I’ve run into it a few times.

Knowing that Talkback can’t read multi-finger gestures explains a lot. It won’t, however, stop me from holding this shortcoming against the screen reader. It’s a problem, no matter why it exists.

What I Like About Talkback

Let’s begin with the positives, as I usually do. There are aspects of Talkback I really, really like, and that VoiceOver would do well to consider borrowing. Note that I’m not saying VoiceOver can’t already do some of the below. This section is a mix of things VO lacks, and things VO can do that Talkback handles in a simpler way.

Super-powered Fingerprint Sensor

Talkback supports “fingerprint gestures”, which are commands you issue by swiping a finger over the fingerprint reader. Not all devices have this, but my Pixel is one that does. I can swipe one finger up, down, left, or right over the sensor on the back of my phone, and Talkback will respond. I can either use this as a menu for speech rate, volume, and other settings, or assign each of the four movements to any of Talkback’s commands.
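
Under the hood, Android 8 and later let an accessibility service subscribe to these sensor swipes. The following is a minimal, illustrative sketch, not Talkback’s code: the speech-rate helpers are invented, and the service also has to request the fingerprint-gesture capability in its configuration.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.accessibilityservice.FingerprintGestureController
import android.os.Handler
import android.os.Looper
import android.view.accessibility.AccessibilityEvent

// Illustrative sketch (API 26+): reacting to swipes over the fingerprint
// sensor from an accessibility service that has requested the
// fingerprint-gesture capability in its configuration.
class FingerprintGestureSketch : AccessibilityService() {

    override fun onServiceConnected() {
        val controller = fingerprintGestureController
        if (!controller.isGestureDetectionAvailable) return // sensor can't do gestures

        controller.registerFingerprintGestureCallback(
            object : FingerprintGestureController.FingerprintGestureCallback() {
                override fun onGestureDetected(gesture: Int) {
                    when (gesture) {
                        FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_UP -> increaseSpeechRate()
                        FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_DOWN -> decreaseSpeechRate()
                        FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_LEFT,
                        FingerprintGestureController.FINGERPRINT_GESTURE_SWIPE_RIGHT -> cycleSetting()
                    }
                }
            },
            Handler(Looper.getMainLooper())
        )
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {}
    override fun onInterrupt() {}

    // Invented helpers standing in for "adjust whatever setting you assigned".
    private fun increaseSpeechRate() {}
    private fun decreaseSpeechRate() {}
    private fun cycleSetting() {}
}
```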

I don’t use this often, mostly because of my grip. I hold phones in such a way that none of my fingers are near the reader, so I have to shift my grip anytime I want to use this option. This often requires two hands, so by the time I’m in position to issue a fingerprint gesture, I may as well have used a normal one. But I just need to work on holding my phone differently, and I can see this becoming a very helpful tool. I was hoping to also be able to assign multiple taps on the sensor to commands, but Google hasn’t gotten there yet.

Ironic Menus

At any time, you can open one of two menus with Talkback. One is contextual, offering options specific to what you’re doing. If you are editing text, you’ll have editing commands; if you’re on a notification, one of the options will be to view actions for that notification. You get it.

The other menu is also full of options, but these are global ones. They include things like opening Talkback’s settings, spelling the most recent utterance, copying the most recent utterance to the clipboard, and the like.

I love this idea. In VoiceOver, you have to remember that the “copy last spoken phrase to clipboard” gesture is tapping three fingers four times, or you’re out of luck. What if you have a hard time memorizing all the gestures VO uses? Or you’ve assigned something you use more often to that gesture? A menu of possible actions makes perfect sense. Just bring it up, choose the function you want, and you’re done.

I promised you “ironic menus”, and here’s the irony: Apple already did this! On macOS, you can press VO-H, then H again, and you’ll be in a searchable menu of every possible VoiceOver command you could ever want. Mouse movement, navigation, speech, and plenty more. For some reason, though, the feature never made it to the mobile world… Except it did. Google implemented it. Now, the only one missing out is the iOS version of VoiceOver.

Circle Menus

Talkback lets you use circular menus if you want to. The idea is that, instead of a list of items you can swipe through, you place a finger on the screen and move it in a circle. As you move, Talkback will speak various menu options, as though each were on the rim of the circle you are drawing. To select an option, just lift your finger when you hear what you want.

While you can turn this functionality off and use a regular list of menu items instead, I’ve come to like the circles. They are faster to use in most cases, even more so once you know where specific items are. For instance, I can dismiss a notification by drawing the angle gesture to open the actions menu, touching the screen, sliding my finger down a bit, and lifting up. I don’t have to swipe or touch around a list to find the option and then double tap it. Yes, it only saves me one extra gesture, but it feels faster and just as intuitive as the other method. And hey, that’s one more gesture I don’t have to worry about.

Speaking Passwords

The question of whether a screen reader should allow you to hear what you type as you enter a password is a long-standing one. One side argues that since a blind user can’t see the keys they touch, having the confirmation that they typed what they meant is useful. If they don’t want it in some situations, they can turn it off. Most of the time, though, they are somewhere where no one will overhear the characters of their password. The response to this is that apps and operating systems sometimes offer users the ability to view their passwords as they type. If this is available, the user can use it. If not, the screen reader shouldn’t override the system’s choice and speak the password anyway.

On iOS, VoiceOver won’t echo the characters you type unless the field offers a way to view the password. Talkback has a clever idea here, though. It follows VoiceOver’s model, but lets you choose whether to speak the characters you type if you have headphones connected. Presumably, headphones mean only you will hear the audio anyway, so the security of not speaking what you type is unnecessary. While I can still see both sides of this debate, I tend to favor this implementation over Apple’s more firm stance.
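
The decision logic behind that policy is simple enough to sketch. This is purely illustrative, not Talkback’s implementation, and the user-setting parameter is hypothetical: it just checks whether any private audio output is connected.

```kotlin
import android.content.Context
import android.media.AudioDeviceInfo
import android.media.AudioManager

// Illustrative only: echo typed password characters only when a private
// audio output (headphones of some kind) is connected. The user setting
// is a hypothetical stand-in for Talkback's preference.
fun shouldEchoPasswordCharacters(context: Context, echoOverHeadphones: Boolean): Boolean {
    if (!echoOverHeadphones) return false
    val audio = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager
    val privateOutputs = setOf(
        AudioDeviceInfo.TYPE_WIRED_HEADSET,
        AudioDeviceInfo.TYPE_WIRED_HEADPHONES,
        AudioDeviceInfo.TYPE_USB_HEADSET,
        AudioDeviceInfo.TYPE_BLUETOOTH_A2DP
    )
    return audio.getDevices(AudioManager.GET_DEVICES_OUTPUTS)
        .any { it.type in privateOutputs }
}
```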

The Double-Edged Sword of Navigation

VoiceOver lets you navigate by swiping up and down to move by character, word, heading, link, and the like. You change the unit these swipes move by using the rotor. This makes sense once you grasp it, but the gesture can be hard for people to master, and it requires two fingers plus an odd wrist movement.

Talkback solves this by letting you swipe a finger up or down to change what left/right swipes move by. It’s like a simplified rotor. This system makes it much easier to, say, go from jumping around by heading to moving by character or link. It’s all done with one finger and no rotor motion.
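
For the developers reading, the character, word, and line settings in that simplified rotor correspond to the movement granularities built into Android’s accessibility framework. A minimal sketch, assuming the screen reader already holds a reference to the focused node:

```kotlin
import android.os.Bundle
import android.view.accessibility.AccessibilityNodeInfo

// Illustrative: ask the focused node to move its reading cursor forward by a
// chosen granularity (character, word, line, and so on). Swiping up or down
// in Talkback effectively changes which constant gets passed in here.
// (Headings, links, and other web elements use a separate action,
// ACTION_NEXT_HTML_ELEMENT, rather than a granularity.)
fun moveNext(node: AccessibilityNodeInfo, granularity: Int): Boolean {
    val args = Bundle().apply {
        putInt(AccessibilityNodeInfo.ACTION_ARGUMENT_MOVEMENT_GRANULARITY_INT, granularity)
        putBoolean(AccessibilityNodeInfo.ACTION_ARGUMENT_EXTEND_SELECTION_BOOLEAN, false)
    }
    return node.performAction(AccessibilityNodeInfo.ACTION_NEXT_AT_MOVEMENT_GRANULARITY, args)
}

// Example: moveNext(focusedNode, AccessibilityNodeInfo.MOVEMENT_GRANULARITY_WORD)
```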

Say It How You Want

It’s no secret that Android is more open than iOS. This means developers can release Android apps that would never survive Apple’s review process. These include speech synthesizers, and even replacement screen readers. Talkback can speak using Google TTS, Eloquence, Acapela, Nuance, eSpeak, and others. You need only purchase the app you want, then change a setting. Or, you can install a different screen reader and tell Android to use it instead of Talkback. For my testing, I stuck with Talkback as my screen reader, and a mix of Google’s own voices and eSpeak for my speech.
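
To give you a sense of how open this is, any installed engine is just another TextToSpeech backend as far as an app or screen reader is concerned. A rough sketch; the eSpeak package name is an assumption on my part, so check the engine list on your own device:

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech

// Illustrative only: list the installed speech engines, then speak through a
// specific one. "com.reecedunn.espeak" is an assumed package name; call
// getEngines() on a real device to see what is actually installed.
fun speakWithChosenEngine(context: Context, text: String) {
    var tts: TextToSpeech? = null
    val onInit = TextToSpeech.OnInitListener { status ->
        if (status == TextToSpeech.SUCCESS) {
            tts?.engines?.forEach { engine ->
                println("Installed engine: ${engine.label} (${engine.name})")
            }
            tts?.speak(text, TextToSpeech.QUEUE_FLUSH, null, "demo-utterance")
        }
    }
    // The third argument picks an engine by package name; omit it for the default.
    tts = TextToSpeech(context, onInit, "com.reecedunn.espeak")
}
```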

Clever Gestures

I’ve touched on some of the gestures unique to Talkback, and the reason they exist (one finger maximum, remember). I want to highlight them, though, because they’re really cool.

You can use angle gestures by moving your finger in a straight line, then in another, perpendicular straight line. You basically draw a right angle. This gives you eight gestures to play with: up then right, up then left, right then down, down then left, and so on. By default, you have one-finger access to the local and global menus, notifications, the overview (sort of like the iOS app switcher), home, and back. Some people hate these, either because they can be finicky to get right at first, or because they can interfere with exploring the screen by touch. But I don’t mind them.

The other set of gestures are what Google calls back-and-forth ones. To perform one of these, you move your finger up, down, left, or right. Once you’ve drawn a short line, you reverse the direction, going back along the line you just drew. To jump to the top of the screen, for instance, you move one finger up, then back down. To scroll a list, you move right then left, or left then right.

Both of the above are very clever ways to do more with one finger, adding twelve more possible gestures. I’d love to see VoiceOver implement some of these, particularly the back-and-forth ones. If Apple offered those with multiple fingers, users could do a lot.

Getting Volume Keys in on the Action

Android makes much more use of physical keys than iOS does. You can press power and volume up to mute, for instance, or power twice to bring up the camera from anywhere. Talkback supports the volume keys as well. You can use them to change the value of a slider control, such as the speech rate. You also toggle Talkback on and off by pressing and holding both volume keys at the same time. While less useful than other features I’ve talked about, this is still a handy way to do things. I didn’t like it at first, but I’m coming around. Android being Android, I imagine I could find apps to let me change tracks, activate buttons, and perform other actions with these buttons.
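
For anyone curious about the plumbing, an accessibility service can ask Android to hand it hardware key events before apps see them. A minimal sketch with invented helper names; a real service also has to request key-event filtering in its configuration.

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.KeyEvent
import android.view.accessibility.AccessibilityEvent

// Illustrative sketch: a service that has requested key-event filtering can
// react to the volume keys, for example to nudge a slider-like setting.
class VolumeKeySketch : AccessibilityService() {

    override fun onKeyEvent(event: KeyEvent): Boolean {
        if (event.action != KeyEvent.ACTION_DOWN) return false
        return when (event.keyCode) {
            KeyEvent.KEYCODE_VOLUME_UP -> { nudgeSettingUp(); true }     // consume the key
            KeyEvent.KEYCODE_VOLUME_DOWN -> { nudgeSettingDown(); true } // consume the key
            else -> false // let every other key through untouched
        }
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent) {}
    override fun onInterrupt() {}

    // Invented helpers for the sketch.
    private fun nudgeSettingUp() {}
    private fun nudgeSettingDown() {}
}
```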

Impressively Specific Volumes

Android has several volume levels, all of which can be changed independently. There’s the speech volume, media volume, alarm volume, and ringer volume. There may be more I’m missing, too. Talkback even tells you which volume has been changed; if you press a volume key while the TTS is speaking, you hear that speech volume has changed, whereas pressing a button while music is playing tells you that the media volume has changed. This lets you mix the volumes of speech and media together, to get them balanced just how you want.

Also, you can set your ringer and alarms to different levels than your music and speech. You might want alarms to always sound at 100%, while music plays at 45% and your ringer is at 70%. This is all easily done.
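
To show how literal that mixing is, here’s an illustrative sketch that applies the example mix above through Android’s separate volume streams. The percentages and the helper are mine, not anything Talkback exposes.

```kotlin
import android.content.Context
import android.media.AudioManager

// Illustrative only: apply the example mix using Android's independent
// volume streams. Each stream has its own index range, so we scale from
// a percentage.
fun applyVolumeMix(context: Context) {
    val audio = context.getSystemService(Context.AUDIO_SERVICE) as AudioManager

    fun setPercent(stream: Int, percent: Int) {
        val max = audio.getStreamMaxVolume(stream)
        audio.setStreamVolume(stream, max * percent / 100, 0 /* no UI flags */)
    }

    setPercent(AudioManager.STREAM_ALARM, 100) // alarms always at full volume
    setPercent(AudioManager.STREAM_MUSIC, 45)  // media
    setPercent(AudioManager.STREAM_RING, 70)   // ringer
    // On API 26+ screen-reader speech even has its own stream,
    // AudioManager.STREAM_ACCESSIBILITY, which is what lets Talkback's voice
    // be balanced separately from your music.
}
```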

I know that iOS lets you change the ringer volume independently of other volumes, but it can’t let you mix media and speech the way Android does. It also won’t report the new level, or which volume was changed. These announcements can be irritating at times, but the idea behind them is still great.

What I Don’t Like

Sorry Google, but now it’s time for the negatives. I found plenty to like about Talkback, but I also found plenty of irritations and missing features.

Missing Gestures

Talkback fails to support a shocking number of gestures. When I first went to the setting that would let me customize its touch gestures, I was very surprised to find that a lot of obvious ones aren’t present. There are no triple or quadruple taps at all, and nothing with two or more fingers: no three-finger swipes, no two-finger double tap, nothing beyond a modest set of one-finger gestures.

We’ve already discussed Talkback’s unique gestures, and how I think they’re a good idea. Yet I can come up with twenty more gestures just off the top of my head which Talkback doesn’t have: triple and quadruple taps with one, two, three, or four fingers; one or two taps with three or four fingers; and swiping three or four fingers up, down, left, or right. The list gets much larger still if you count performing the back-and-forth gestures Talkback already supports with more than one finger. And this still doesn’t touch the position-based commands, such as the two-finger swipe from the top of the screen that shows notifications. Why not allow us to customize those, and others like them?

Not a single command in the previous paragraph is available to be customized. Most aren’t present at all. I know why this is (remember that Talkback only intercepts one-finger gestures, not all touch input). As I said when we talked about that model earlier, though, I’m still counting this against Google.

The other major missed opportunity here is the magic tap. This is a VoiceOver feature that lets you double tap two fingers to do a surprisingly wide array of tasks without needing to find a specific control on the screen. Normally, these tasks relate to audio, so you don’t need to hear speech mixed with other sounds as you try to find a pause button or answer a call. Instead of a simple gesture to accept an incoming call, Talkback’s official manual says that you must place one finger “about three quarters of the way down the screen”, then swipe right, left, or up. VoiceOver’s two-finger double tap, which can be performed anywhere on the screen, certainly seems simpler.

The caveat here is that I was unable to test phone calls during my time with Android. I use iMessage a lot, and I didn’t want Apple to find my phone number no longer associated with an iPhone. I can still use iMessage with my email addresses, but I wasn’t sure what it would take to re-associate my number. The potential hassle didn’t seem worth it. Still, I’m going off the official documentation, so I hope I have accurate information.

If nothing else, a universal “stop the music” gesture is great to have. Audio ducking isn’t at all good on my Pixel, with music quieting well after speech begins and coming back to full volume before speech has quite finished. This not only gives the ducked audio a choppy sound, it makes it harder to hear speech. I didn’t realize how nice the magic tap was until it wasn’t there.

A Lack of Actions and Options

Let’s assume for now that Talkback did have the same set of gestures as iOS, plus its right angle and reverse ones. What would you assign to all those commands? Would you use four of them to always have heading and form element navigation available? Maybe set some to spelling the last utterance or copying it to the clipboard? Too bad.

You see, Talkback has a surprisingly small number of actions. You can assign a gesture to move to the next or previous navigation option (similar to turning the rotor in VoiceOver), but you can’t set a gesture to actually move by one of these options. In other words, you can customize how you switch to moving by headings, but you can’t set up a way to simply jump to the next heading anytime you want to.

If you’re on a webpage and move by heading, then want to keep going to see what is under that heading, you have to first move back through the navigation options to “default”. If you don’t, swiping right on the heading will simply jump to the next one. I found this slowed me down a lot. After all, VoiceOver always moves by what Talkback calls “default”, regardless of the rotor setting. Land on a heading, and you need only swipe right to have VO read what follows that heading. Not so with Talkback. This is odd, because you can choose which keyboard shortcuts will move you by link, heading, and other options. I have no idea why you can’t do the same for touch gestures.

Speaking of actions, VoiceOver long ago introduced the custom actions rotor. On emails, message threads, notifications, app icons, files, links, some buttons, and countless other places across iOS, you can simply swipe up or down to find actions. Share a file, delete an email, clear or view a notification, and on and on. Simply swipe one or three times, double tap, and you’re done.

Android has actions as well, but Talkback hides them inside the actions menu. To open the menu, you have to perform a gesture. I realize this is only one extra step, but trust me, you feel it. Instead of swiping up or down, you must do the gesture that opens the actions menu, wait for the menu to appear, find the action, and double tap it. It doesn’t add more than a second, but those seconds add up. Oh, and the best part: the gesture to open the actions menu is unassigned by default! You read that right: unless I’ve missed something, you have to open the actions menu by first opening the local context menu and finding the right item. Now we’re up to three gestures at minimum just to get to something VoiceOver offers automatically.

This translates to more than just an annoyance when handling notifications, though. It means there are more buttons all over the place, and they are sometimes not convenient to find. Each notification, for instance, has a button to expand or collapse it. That’s an extra swipe per notification as you move through the list. On iOS, actions are just that: actions, tucked away in the rotor. To be fair, this can hurt discoverability on iOS, so for new users, Talkback’s approach may be better. But once you know how the rotor works, you don’t need extra buttons cluttering up your navigation.

Let’s get back to the other kind of annoying button. In the Gmail app, you must swipe twice per email: once for the message itself, and once for the control that selects or de-selects the message. Once you select a message, you have to go to the top of the screen and find the action button (delete, archive, move, etc) you want. Once done, you have to get back to the list of emails and find your place again. If you’ve used VoiceOver, you know that deleting a message is as simple as swiping up and double tapping. No selecting by using an extraneous control, no leaving the list of messages then going back to it. Again, VoiceOver’s rotor is only good if you know how to use it, so having on-screen buttons isn’t a bad thing in itself. But having those buttons be the only way to manage emails makes Android a less efficient way for me to get things done.
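
To be fair to Google, the plumbing an app uses to expose one of these actions is simple on both platforms; what differs is how the screen reader surfaces it. Here’s a minimal sketch using androidx.core’s ViewCompat helper, with a hypothetical “Archive” action on a list row:

```kotlin
import android.view.View
import androidx.core.view.ViewCompat

// Illustrative: expose a custom "Archive" action on a list row. Talkback
// shows it in its actions menu; the equivalent iOS API is what populates
// VoiceOver's actions rotor for the same kind of row.
fun addArchiveAction(row: View, onArchive: () -> Unit) {
    ViewCompat.addAccessibilityAction(row, "Archive") { _, _ ->
        onArchive()
        true // report the action as handled
    }
}
```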

No Help Mode

Every screen reader, braille notetaker, talking book player, and other blindness-specific technology I can think of includes a special help mode. When active, the press of a key or activation of a gesture will announce what should happen, but not do anything. This way, you can try something to confirm that it will do what you expect, or work through a set of new commands you’ve just learned, without actually affecting the device you’re using. I consider myself a power user on macOS, iOS, and Windows with NVDA, but I still use this help mode.

Talkback doesn’t have it, as far as I can tell. You can go into the settings and review the assigned gestures, but that’s as far as it goes. You can’t tell your phone to speak what a gesture would do. You can only try the gesture and see what happens, or try to find the gesture in Talkback’s settings. This is especially disappointing given the unique gestures Talkback offers. Instead of getting frustrated while trying to learn angle or back-and-forth gestures and triggering random focus movements along the way, it would be great to have a place to practice, where the only feedback is silence or confirmation that what you just did matches the gesture you were going for. I know I’m not alone in having used VoiceOver’s help mode to learn the rotor, how quickly to double tap, how far to move when swiping, and so on. I can’t imagine why Talkback doesn’t offer a similar mode.

The Double-Edged Sword of Navigation

I said earlier that Talkback’s method of swiping up and down to change what left and right swipes do is far simpler than VoiceOver’s rotor. It is, but I’ve also told you how much of a problem that can be. When you move through a website by heading, then want to read the text past a heading, your only option is to swipe up or down to set your navigation back to default. You can’t rely on swiping left and right to always move by element, regardless of which element it is. Not having this always-present navigation is another way Talkback hurts efficiency.

Imagine using the screen reader on a Windows or Mac computer to move around a website. You’d likely press h to move by heading, or another single key to move by link or landmark. Once you got where you wanted to be, you’d probably use your down arrow to read, right? With Talkback, it’s as though the arrow keys are all you have. Press up and down to cycle through the different elements by which left and right can move. Up arrow to headings, press right, and try to read. You can’t, without pressing up or down several times to let left and right review the text of the page again.

The analogy isn’t great, but hopefully it gives you an idea. If you’re still confused, just trust me that Talkback’s method is objectively slower than that of any other screen reader I can think of, including VoiceOver on iOS.

Bye Bye Braille

In iOS 8, Apple introduced system-wide braille input, available right on the touch screen. Suddenly, I could type text with ease, instead of poking at the on-screen keyboard. I don’t exaggerate when I say this feature was life-changing, either. With it, I’ve moved to doing far more on iOS than I ever expected. Social media, emails, texts, writing beta testing feedback, writing reviews, and more are all just a rotor twist away. My phone has become my primary computing device, and braille screen input is largely why.

Talkback has no such option, and that’s the worst part of the experience for me. I’ll give Google this: their automatic suggestions are better than those on iOS. Still, they’re no substitute for the almost effortless typing that braille screen input offers. BSI has profoundly impacted my mobile computing experience for the better, and Android can’t offer it. For a heavy user of this feature, its absence is difficult to ignore. I realize I’m in a minority, but if you’re a braille user like me, this is going to be one of the largest factors to consider when you look at Android.

Other Android Thoughts

While this post is about Talkback and VoiceOver, I wanted to take a moment to acknowledge the other aspects of Android I appreciate. Some are specific to accessibility, others are not.

Moving Apps

I’m using the Pixel Launcher, since it came with my phone and the only other one I tried, Nova, didn’t let me move apps around. The Pixel Launcher has a pretty neat way of moving apps, a way that’s easier than on iOS.

First, you touch the app to be moved. Next, you open its actions and choose “move app”. Third, you touch where you want it to go. When you touch the screen in this mode, Talkback says “move to row 4, column 3” (for example) if you touch an empty app slot. If you touch another app, you instead get, “create folder with [other app]”. It’s simple, intuitive, and does everything you need. Again, other launchers will vary.

Widgets

I didn’t appreciate widgets until I tried them. My favorites are the weather, which lets you have the temperature right on your home screen, and the Apple Music widget, which allows me to skip tracks or play/pause without opening the app itself. I haven’t used other widgets yet, but what I’ve seen so far makes me curious to explore more. It also makes me wish Apple hadn’t relegated its version of widgets to a screen that takes extra gestures to get to.

Vibrations Everywhere

One thing I’ve never understood about Apple, from the first time I booted my first iPhone, is the company’s aversion to using vibration to indicate, well, anything. On my Pixel, there’s vibration feedback when the phone restarts, when I should lift my finger while saving a fingerprint, and more. I like this, both because it feels slicker than using speech for simple notifications like those, and because it’s more accessible.

Split Screen

I don’t use an iPad often, but when I do, I usually place my two top apps side by side so I can access them both without switching between them. It’s very cool to have this same ability on a phone. Sure, it might seem visually cramped, but I don’t care. I can touch one app or the other to put my focus there, and that’s all Talkback needs. Managing them isn’t as easy as it is on iOS, but it’s at least an option. By Apple’s decree, I can’t do this at all on my iPhone.

Who Wins?

You won’t be surprised by my conclusion: VoiceOver is better in almost every way I care about, so it wins, hands down. It could certainly borrow third-party speech synthesizers, new gestures, and menus from Talkback. But what it lacks in those areas is more than made up for by what it offers that Talkback doesn’t. Talkback has no global braille input, no help mode, far fewer commands that can be assigned, fewer gestures to which to assign the commands it does have, a less efficient navigation system, and no quick way to pause media playing in the background. Even answering phone calls requires you to put your finger in exactly the right place.

That said, I can absolutely see why people prefer Android. Note that I said Android, not Talkback. Widgets are awesome, using split screen on a phone is great, assigning my own apps for my browser, mail client, and other functions is quite nice, and placing apps anywhere I want is helpful. Also, I love having the ability to install any speech synthesizer I care to.

On the flip side, I’m missing a lot of apps I use all the time. Seeing AI, Overcast, Twitterrific, various games, and even first-party apps like Apple’s Mail app are all ones for which I’ve not found Android counterparts that come close to being as good. Android’s use of buttons instead of relying on swipe actions is painful at times, such as having to swipe four times just to move from one tweet to the next, or two times to move between emails in Gmail.

I hope you found this useful and informative. Please don’t decide based on my experience alone, though. Try it for yourself, or, at the very least, seek out other people’s experiences. I’ve read posts that echo my own, and I’ve read posts that talk about how much better Talkback is. Each person has their own preferences, so your mileage will vary.


 
