Podcast: Play in new window | Download
Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA

——————— Transcript Starts Here —————————

Ben Jacobs:
Hi, this is Ben Jacobs, and I’m the owner of RebelTech Consulting, and this is your Assistive Technology Update.

Josh Anderson:
Welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 487 of Assistive Technology Update. It is scheduled to be released on September 25th, 2020.
Josh Anderson:
On today’s show we’re super excited to have a conversation with Ben Jacobs from RebelTech. We’re going to spend a little bit of time talking about the new iOS 14, some of the cool features it has as well as some of the accessibility features that are available now, and we’ll also talk about some new Android accessibility features and how they can help individuals with disabilities. We also have a story about Waymo, which is making self-driving taxis, and what the company is doing to ensure that individuals who are blind or visually impaired can access its vehicles.
Josh Anderson:
Don’t forget that we love to hear from you so if you have someone who may be a good guest, something you’d like to learn about, a question about assistive technology, or perhaps just some comments or suggestions for the show, go ahead and reach out to us. You can reach us on our listener line at (317) 721-7124. You can drop us a line on Twitter @indataproject, or send us an email at tech@eastersealscrossroads.org. We thank you so much for listening and we can’t wait to hear from you. Now let’s go ahead and get on with the show.
Josh Anderson:
As we head to our conversation with Ben here in just a little bit, we’re going to talk about accessibility in smartphones and just think over the course of time how much that has really changed things for individuals with disabilities and really and truly for all of us, I mean, we all use these things every single day. Well, something else that most of us have to use in some way, shape, or form is transportation. And I know at least here in Indiana and probably many other places, transportation for individuals with disabilities can be a challenge.
Josh Anderson:
Now, Uber, Lyft, those things have definitely made a big difference, really more in the more populated areas. I live a little bit outside of Indianapolis, quite a ways out, and you can’t get an Uber where I live. Of course, no food delivery there either; maybe it’s just a little bit too much in the country. But our first story today comes to us from TechCrunch and it’s titled “Hailing a Self-Driving Taxi When Blind: Learn How Waymo Answers That Challenge at Sight Tech Global.” It’s by Ned Desmond, and it talks a little bit about Sight Tech Global, which is holding a big event in December that is 100% free and 100% online; there are links in the article if you would like to attend. And it talks a little bit about Waymo.
Josh Anderson:
Now, Waymo is developing self-driving taxis, and some of these are already in service in the Phoenix, Arizona area. The article talks about how they’re actually working with a couple of organizations out there that help blind and low-vision individuals to make sure that these rides are completely accessible. It starts off by talking about a blind individual who was really excited to try this self-driving taxi. The safety driver says, “Hey, get in, buckle up, and hit the start button,” but doesn’t say where that start button is. You can imagine how that could be a little bit of a challenge.
Josh Anderson:
But it does say that Waymo is working very closely with the Foundation for Blind Children in Phoenix, and also consulting with the LightHouse for the Blind in San Francisco. They are getting this input just to make sure that these are accessible to everyone, because really, think about it: if you have self-driving cars, self-driving taxis, how much easier that can make transportation for individuals with all kinds of disabilities, not just the blind and visually impaired, especially if you could make these taxis actually accessible for wheelchairs and for folks with other mobility challenges as well.
Josh Anderson:
It really digs into some of the challenges with making these self-driving vehicles accessible. If you do hail a taxi, an Uber, or something like that, there’s someone driving that car, and they have a lot of duties outside of just driving. They may roll down the window and let the rider know who they are, help them find the car, say, “Hey, the handles are in front of you. Hey, let me take your bags.” Trying to build that kind of assistance into a self-driving vehicle can be a little bit challenging. How much information do you need to give? How much is too much? Can the person understand that information? So there are so many different things that can make these vehicles inaccessible, when really and truly they could be just an amazing accommodation for folks.
Josh Anderson:
It’s something we’ve talked about a lot, and I think it will even come out a little bit in the interview with Ben here in a little bit, that a lot of these accessibility features, a lot of these things that are built in, actually have uses for individuals without disabilities. So they’re really trying to get a broader focus on inclusive design in these vehicles and taking that into account, making sure they’re not just serving the disability community but also making things more accessible for individuals without disabilities. We’ll go ahead and put a link to this over in our show notes so you can check it out, learn a little bit about Waymo, and also learn about that Sight Tech Global event if that’s something that might interest you.
Josh Anderson:
iOS 14 officially released last week, and it came along with some exciting new features and accessibility settings. Not to be outdone, Android’s also adding to its accessibility features and allowing even more users to access the world around them. Today, we have Ben Jacobs from RebelTech on the show to discuss these new features and how they can help individuals with a whole range of different abilities. Ben, welcome back to the show.
Ben Jacobs:
Hey, thanks for having me, Josh.
Josh Anderson:
Yeah, it’s really great to have you back on and get to talk about all this exciting new stuff. But, can you start off by telling our audience a little bit about yourself and about RebelTech?
Ben Jacobs:
Sure. Originally, well, not really originally, but I worked in the Air Force in information technology and I really focused on integrating various technologies together to be able to accomplish a mission greater than any one of those technologies could achieve. Once I retired from the Air Force I moved to Georgia and got a job at Georgia Tech’s Assistive Technology Program. And when I was working there I established the Access Lab where I was able to demonstrate many new and different emerging technologies and how they could work together and be able to benefit anyone, making independent living possible for so many different people.
Ben Jacobs:
People would always be really excited about what they saw, especially the smart home solutions and they would always ask, “Who’s going to set all of this up for me?” I couldn’t really think of a good affordable service with the necessary expertise in both the technology and the accessibility sides, so I decided to address that service gap by leaving the Assistive Technology Program and starting my own business, RebelTech. At RebelTech Consulting, I offer individuals consulting, recommendations, installation, setup, customization, and training and maintenance for all of these different types of solutions, including smart home integration. I also offer trainings and presentations and access to expertise for organizations that serve people with disabilities as well.
Josh Anderson:
Excellent. And we’re going to put some of that expertise to the test today. Maybe not a test, but at least we’re going to hear some of it. iOS 14 came out last week. What are some of the new features that it introduced?
Ben Jacobs:
There are a lot of great new features. I read an article recently that listed off 45 features that people may not know about. But some of the big ones that are really going to help a lot of people, most people even, include widgets. Widgets are these little information blocks that you can add to your home screen. They’ll present information in a very easy-to-read way from any app on your phone. You could put an analog clock on your home screen if you wanted, or you could put a little block that has news updates and weather information on your home screen as well. This is something that’s actually been on Android since almost the beginning, but it’s great to see that iOS is adding this feature that can just make information so much easier to access for everyone.
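For listeners on the developer side, widgets like the ones Ben describes are built with Apple’s WidgetKit framework and SwiftUI. Here’s a minimal sketch of a clock-style widget; the kind string “com.example.clock” and the view layout are illustrative placeholders, not anything from Apple’s samples:

```swift
import SwiftUI
import WidgetKit

// One dated entry per minute; WidgetKit swaps entries in on schedule.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: Date())
    }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: Date()))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        // Hand the system the next hour of entries, one per minute.
        let now = Date()
        let entries = (0..<60).map { minute in
            ClockEntry(date: now.addingTimeInterval(Double(minute) * 60))
        }
        completion(Timeline(entries: entries, policy: .atEnd))
    }
}

struct ClockWidgetView: View {
    let entry: ClockEntry

    var body: some View {
        // SwiftUI renders the entry date as a time string.
        Text(entry.date, style: .time)
            .font(.largeTitle)
    }
}

@main
struct ClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "com.example.clock", provider: ClockProvider()) { entry in
            ClockWidgetView(entry: entry)
        }
        .configurationDisplayName("Clock")
        .description("Shows the current time on your home screen.")
    }
}
```

The timeline model is the key design choice here: instead of running continuously, a widget hands iOS a batch of dated entries and the system displays each one on schedule, which keeps widgets cheap on battery.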
Ben Jacobs:
They’re also adding in a feature called the App Library. Again, this is something that Android has had for quite a while, but now that iOS has added it, it allows people to find apps a lot more easily. It puts all of your apps on one screen. If you swipe all the way to the left on iOS, it’ll bring up the App Library, where you can see an alphabetical or a category listing of all of the apps installed on your phone. It makes it a lot easier to find an app you’re looking for, rather than swiping through each of your different home screens trying to find it.
Ben Jacobs:
And then another one I wanted to mention real quick that I really think is amazing: Siri now shares the screen on iOS. Originally with Siri, as soon as you talked to her your whole screen would be blacked out; she’d take over the whole thing. If you were navigating in Apple Maps or looking at something in your web browser, as soon as you asked Siri a question she’d take over the whole screen and you wouldn’t be able to see any of that other information anymore. But now with iOS 14 they’ve made it so that she just pops up at the bottom of your screen and doesn’t block out all that other information or whatever app you’re using at the time. You’ll still be able to access all of that information while talking to Siri.
Josh Anderson:
Well, and I love that the phone feature has done the same thing, just because I know it used to be that any time you’re trying to do something on the phone and the phone rings, maybe you can’t answer it right then, but you maybe don’t want to hit decline just to get it off the screen. Now it just pops up right there at the top and does not take up the whole screen. It’s amazing how simple a thing can make a huge difference.
Ben Jacobs:
Yeah, absolutely. They’re always looking for ways that they can make the user experience better for everyone and that’s definitely worked here, I think.
Josh Anderson:
It definitely has. And talking about that user experience, what are some of the new accessibility features available in iOS 14?
Ben Jacobs:
Right. There’s really a whole bunch of great accessibility features in iOS 14. One of them is sound recognition. This could be really beneficial for people that are deaf or hard of hearing. Sound recognition is an accessibility setting where you can make it so that your phone or your device is aware of various sound-based things that are happening around you. When your device picks up a particular type of sound or alert, it’ll send a notification to your device, whether that’s an Apple Watch, an iPhone, or an iPad. The sounds the system can detect are alarms like sirens, smoke alarms at home, or building fire alarms.
Ben Jacobs:
It will also detect household noises like doorbell chimes, car horns, or an appliance beeping. It’ll even detect running water and let you know that maybe you left the faucet on, or maybe there’s a leak somewhere. It’ll even tell you if it detects people yelling nearby. If someone’s trying to alert you to something that’s going on nearby and you’re not normally able to hear them, it’ll detect that and let you know on your Apple Watch or your iPhone. And it’ll detect animals as well. Right now it’s set up so that it can detect a cat meowing or a dog barking nearby.
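The Sound Recognition feature itself is toggled in Settings (Settings > Accessibility > Sound Recognition), not in code, but Apple exposes the same kind of capability to developers through its SoundAnalysis framework. Here’s a minimal sketch of classifying live microphone audio; “SirenClassifier” stands in for a hypothetical Core ML sound-classification model you would train or download, and the confidence threshold is arbitrary:

```swift
import AVFoundation
import SoundAnalysis

// Receives classification results as audio is analyzed.
final class SoundObserver: NSObject, SNResultsObserving {
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        // A real app would post a local notification here instead of printing.
        print("Heard \(top.identifier) (confidence \(top.confidence))")
    }
}

// Keep these alive for as long as you want to listen.
let engine = AVAudioEngine()
let format = engine.inputNode.outputFormat(forBus: 0)
let analyzer = SNAudioStreamAnalyzer(format: format)
let observer = SoundObserver()

do {
    // "SirenClassifier" is a hypothetical Core ML sound-classification model.
    let request = try SNClassifySoundRequest(mlModel: SirenClassifier().model)
    try analyzer.add(request, withObserver: observer)

    // Feed microphone buffers to the analyzer as they arrive.
    engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
        analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
    }
    try engine.start()
} catch {
    print("Could not start sound analysis: \(error)")
}
```

A real app would also need microphone permission (NSMicrophoneUsageDescription) before the tap produces any audio.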
Josh Anderson:
Oh, that’s great.
Ben Jacobs:
Yeah. Back Tap is a really cool one that’s brand new. This is one accessibility feature that many people can really end up using. The feature lets iPhone users do a variety of quick actions by double or triple-tapping on the back of an iPhone. Users can turn on specific accessibility features or take a screenshot. They can also scroll or open the Control Center, go to the home screen, or even open the app switcher, all by either double or triple-tapping on the back of the iPhone.
Ben Jacobs:
One thing that Back Tap doesn’t easily do is launch the camera or take a photo, but users can configure those actions by first making a Siri shortcut. For people that aren’t familiar with shortcuts, the Shortcuts app was introduced two years ago, and it’s actually built in now rather than a separate app. It automates common and routine tasks, and with shortcuts people have the ability to create customized commands.
Ben Jacobs:
Usually it’s used with Siri, where you can set up a request that brings together a bunch of different things. For example, say you’re a surfer and you want to put together a surf report and the current weather, travel time to the beach, maybe a sunscreen reminder; you can set up a shortcut that does all of that just by saying, “Hey Siri, surf time.” And any of those shortcuts that you create can actually be mapped to the Back Tap settings. As I said before with the camera, you can actually create a shortcut that opens the camera app or takes a photo, and once you have that shortcut you can assign it to Back Tap. Then it’s simply a matter of tapping the back of the phone a couple of times.
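Back Tap itself has no public API; the double and triple-tap mappings live in Settings > Accessibility > Touch > Back Tap. What apps can do is donate actions that then show up in the Shortcuts app, using the standard NSUserActivity donation API. Here’s a minimal sketch, where the activity type string and the “Surf time” phrase are illustrative placeholders:

```swift
import UIKit

/// Donates a "take photo" action so it shows up as a suggested shortcut,
/// which the user can then wire to Siri or to Back Tap in Settings.
/// The activity type and phrase are illustrative placeholders; a real app
/// would also list the type under NSUserActivityTypes in its Info.plist.
func donatePhotoShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.camera.takePhoto")
    activity.title = "Take a Photo"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true           // lets Siri suggest it
    activity.suggestedInvocationPhrase = "Surf time"  // the phrase Ben described
    activity.persistentIdentifier = "take-photo"
    viewController.userActivity = activity            // UIKit donates it while this screen is active
}
```

Once donated, the user can wrap the action in a shortcut in the Shortcuts app and bind that shortcut to a double or triple tap.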
Josh Anderson:
And what are some of the other ones, Ben? I know Back Tap is definitely one of my favorites, and I know it’s my wife’s favorite, because she just moved to an iPhone that didn’t have the home button and I thought she was going to get rid of it; she just missed that button so much. And I understand, you’re so used to it. Then when Back Tap came out last week she was like, “Oh, thank goodness. Thank goodness I can finally get home without having to think through it again.” Just real quick, what are a few of the other ones before we get into talking about Android?
Ben Jacobs:
Yeah. There’s also an update to VoiceOver. VoiceOver is Apple’s screen reader built into their devices, and now VoiceOver actually taps into Apple’s on-device machine learning and Neural Engine. It’s able to recognize and audibly describe more of what’s happening on the screen even when third-party developers haven’t enabled that ability in their apps. An iPhone or iPad will now automatically provide better optical recognition of more objects, images, text, or controls that are displayed on the screen, and VoiceOver actually gives more natural and contextual feedback now. When it comes to images or photos, VoiceOver can now read complete sentence descriptions to detail what’s on the screen, and it automatically detects user interface controls like buttons, labels, toggles, sliders, and other indicators.
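The flip side of this automatic recognition is the markup developers have always been able to supply themselves through UIKit’s accessibility properties, which VoiceOver reads directly. A minimal sketch of labeling a custom control; the image name is a placeholder:

```swift
import UIKit

// A custom play control built from a plain image view. Without this markup,
// iOS 14's Screen Recognition has to guess what the control is; with it,
// VoiceOver announces exactly what the developer specifies.
let playButton = UIImageView(image: UIImage(named: "play-icon")) // placeholder asset
playButton.isUserInteractionEnabled = true
playButton.isAccessibilityElement = true
playButton.accessibilityLabel = "Play"
playButton.accessibilityHint = "Plays the current episode"
playButton.accessibilityTraits = .button
```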
Josh Anderson:
That’s excellent, because it makes those things accessible that the developers didn’t think to make accessible on their own; it just circumvents that, and it opens up a whole new world of accessibility to folks. We don’t want to spend all our time on Apple, although I know we probably could just talking about those, but I feel like I have to give some time to Android as well. What are some of the cool new accessibility features that Android is putting on their devices?
Ben Jacobs:
Unfortunately, this past year we missed out on Google I/O, which is their big developer conference where they tend to show off a lot of new upcoming things, and accessibility is always a huge part of it. But at last year’s Google I/O they demonstrated a few different things, and those are actually coming to phones now, so it’s really great timing to be able to talk about them. Live Caption is one really huge one that Google introduced. Like I said, they showed it back in 2019, but it’s now available on any Android 10 or newer device.
Ben Jacobs:
What Live Caption does is take any sound on the phone and caption it live for you. You can be watching a video on YouTube; sure, you can get captions for that on your phone. But you can also record a video on your own device with someone talking, and it will automatically caption that for you as well. Or even podcasts: if you wanted to listen to a podcast, you can use the Live Caption accessibility feature built into Android to have it automatically generate captions for that podcast.
Ben Jacobs:
It’s really amazing; just with a touch of a button, anything that’s on your phone can have captions, making it accessible for someone that’s deaf or hard of hearing. Or even myself, I don’t know if you’ve ever watched Game of Thrones, Josh, but my wife and I did, and we had a hard time following it at first. Then when we put on the captions it made it a lot easier for us to follow all of the names and locations and stuff like that.
Josh Anderson:
That’s so funny, I had the same issue. The names were just too much alike and I couldn’t keep the places straight. Yeah, you’re right, after a few seasons you figure it out, but no, that’s a great thing to be able to help out with.
Ben Jacobs:
Yeah. Captions are great for anyone, really. Another great feature that they showed off at Google I/O back in 2019, and again is now available, is Live Relay. This kind of takes captioning and puts it onto your phone conversations. I know that there are a lot of privacy concerns when it comes to what’s being uploaded to the cloud, what’s being recorded, and whatnot. This is one of those things that’s all done on device, so nothing goes up to the cloud and nothing is recorded.
Ben Jacobs:
But how Live Relay works is, say someone calls someone that is deaf or hard of hearing on their Android phone. If the person that is deaf or hard of hearing has enabled Live Relay on their phone, what happens is as soon as they pick up, the Google Assistant tells the person that’s calling, “Hey, this is the Google Assistant. I’m helping Fred out with phone conversations. Go ahead and say whatever you want to say to Fred; I’ll type it out for him, and when he types his responses I’ll say those out loud to you.”
Ben Jacobs:
It’s pretty much a relay where whatever the caller says comes up in real time on the user’s screen as text. It looks just like a text message conversation, really, going back and forth: whatever the caller says comes up there in real time, and then the user can type their response and that response is read out loud to the caller. It makes for a really seamless conversation for anyone that’s using Android.
Ben Jacobs:
Another great feature built into Android is Lookout. This is one that’s designed for people that are blind or low-vision, where they need help identifying different objects around them. They can pull up Lookout at the grocery store, and it’ll help them find where the barcode is on a product, scan that barcode, and then let them know exactly what that product is. And you can also have it detect faces. When you first meet someone you can program them into Lookout, and then the next time you see them it’ll identify that person for you.
Ben Jacobs:
You can also have it describe a whole scene. If you point your camera at a certain area of your home, it will say, “Center, there’s a chair. To the left on the wall there’s a clock,” and actually describe out the whole scene for you. It can be really beneficial for people that are blind or low-vision to be able to navigate the world around them.
Josh Anderson:
Oh, most definitely. I mean, just think of 10 years ago, or I guess a little bit farther back when smartphones first came out; to even imagine that they could do so many things would be almost unheard of.
Ben Jacobs:
Yeah. It’s really amazing, the leaps and bounds that technology has made. I mean, this all would have been science fiction 10 years ago, so it’s really cool to see. One other feature I wanted to mention real quick with Android is a project that they’re working on called Project Euphonia. This is one that goes hand-in-hand with the captioning, and actually the Google Assistant and all of that, where they’re teaching the AI models that recognize our speech to recognize people that have slurred or hard-to-understand speech. They’re taking all that data of people with slurred or hard-to-understand speech talking and putting it into the AI models, and they’re actually making it so that the models are better able to recognize anyone, regardless of how well they’re able to enunciate, or whether they have cerebral palsy or whatever other kind of speech disability they might have. It’s really amazing.
Josh Anderson:
It is really amazing. And it’s great that they’re taking that into consideration. I mean, then more folks can access their devices, which I suppose from a business standpoint is great in the end, just because more folks can buy your devices. But to really think about that and make it accessible for anyone, it’s a great goal, and it’s great to see these big giant companies actually taking that into major consideration.
Ben Jacobs:
Yeah, absolutely. It’s really great to see all of these companies really thinking about how to make things more accessible for everyone.
Josh Anderson:
Well, talking about that, Ben, what would you love to see out of the newest version of Android or iOS 15, whenever these come out? What’s an accessibility feature that you’ve been waiting on that you would just love to see them have?
Ben Jacobs:
I mean, it’s difficult because there are so many great things being done right now with voice access. You can pretty much control your whole phone just using your voice, and there’s even switch access, so if voice doesn’t work for you, there’s that. There’s a lot of automation being brought in. It’s getting to the point where it’s difficult to really think of where we could go next; all the low-hanging fruit has definitely been gobbled up at this point.
Josh Anderson:
And it really is. Really, that was almost a trick question, because even after I asked, I was sitting here thinking, “Well, it’d be cool if it did that. Oh no, it does that.” I mean, even if you’re using Google Maps now, you can see the accessible way to get to places, or how accessible they are once you get there. With an iPhone you can use your AirPods almost as assistive listening devices now. There’s so much that whenever I do think, “Oh, it’d be really great if VoiceOver could describe these things that aren’t accessible,” it turns out, “Oh, it can.”
Ben Jacobs:
I would say the one thing that I am keeping an eye on, and it seems very sci-fi right now, is… I’m not sure if you’ve seen that Elon Musk actually recently did a demonstration of his Neuralink. This is a device that’s surgically implanted into a person’s brain, and they’re able to access devices through that Neuralink. This could be something that’s really beneficial for people that have no mobility and no ability to communicate; they’d still be able to access their devices and communicate through that.
Ben Jacobs:
I know that’s something that’s definitely way off in the future. I don’t think it’s coming to Android 12 or iOS 15, but it’s definitely something to look for. I think it’s going to make devices and communication and just everything accessible. One thing that I always say is, if you’re able to access your computer, you’re able to access the world. And really, smartphones and tablets and all of these devices that we put in our pockets are computers, and any way that anyone can access those really opens up a whole world of independence for that person.
Josh Anderson:
It definitely does, it definitely does. Well, Ben, before we let you go here, could you tell our listeners how they might be able to find out more about RebelTech if they’d like to?
Ben Jacobs:
Sure, they can find my website at rebel-tech.org. That’s R-E-B-E-L-T-E-C-H dot org. Or they can email me directly at ben.jacobs@rebeltech.org. That’s B-E-N dot J-A-C-O-B-S at R-E-B-E-L-T-E-C-H dot org.
Josh Anderson:
All right, awesome. Well, Ben, thank you so much for coming on and talking about… I know we didn’t get to all the accessibility features or all the new things but, I mean, we could probably fill a whole month with that. But I really do appreciate you coming and sharing your insights and your excitement as well with all these new accessibility features on smartphones.
Ben Jacobs:
Of course, thanks so much for having me.
Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If you do, call our listener line at (317) 721-7124. Shoot us a note on Twitter @INDATAproject, or check us out on Facebook. Are you looking for a transcript or show notes? Head on over to our website at www.eastersealstech.com. Assistive Technology Update is a proud member of the Accessibility Channel. For more shows like this plus so much more, head over to accessibilitychannel.com. The views expressed by our guests are not necessarily those of this host or the INDATA Project. This has been your Assistive Technology Update, I’m Josh Anderson, with the INDATA Project at Easterseals Crossroads in Indianapolis, Indiana. Thank you so much for listening and we’ll see you next time.