Podcast: Play in new window | Download
Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
Show Notes: Greg Stilson – Vice President of Product for Aira
YouTube: Aira visual assistance
Alexa Show Story: http://bit.ly/2ndnLtX
Sensory Bags Story: http://bit.ly/2n4SF7t
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our website: https://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA
—————— Transcript Starts Here ——————
Greg Stilson:
Hi, this is Greg Stilson. I’m the vice president of product here at Aira, and this is your assistive technology update.
Josh Anderson:
Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 435 of Assistive Technology Update. It’s scheduled to be released on September 27th, 2019. On today’s show, we’re very excited to have Greg Stilson, the vice president of product for Aira, on to talk about Aira and all the new things they’re working on. We also have a quick story about new sensory bags available at sporting events and the theater to help individuals with autism and sensory disabilities, as well as a new feature for your Amazon Echo Show devices that can actually tell you what it is that you’re holding.
Josh Anderson:
Don’t forget you can always reach us on our listener line at (317) 721-7124, send us an email at tech@eastersealscrossroads.org, or drop us a line on Twitter, @indataproject. Now let’s go ahead and get on with the show. A fun and interesting story over at Fox 2 Now out of St. Louis: it’s written by Joe Millitzer, and it’s titled, “The Enterprise Center and Stifel Theatre now have sensory bags for guests with autism.” So think about going to a sporting event, especially a hockey game, which is what’s played there at the Enterprise Center, apparently, or going to the theater, where you just have a lot of noises, a lot of things going on. You have not just cheering crowds, but also, if we think of the theater, people onstage moving, dancing, yelling, singing, all these other things.
Josh Anderson:
Then if you think of a hockey game, we think of people smashing up against the wall and cheering and jumping up and down and all that. For folks with any kind of sensory disability or challenges, that can be very overwhelming and probably makes some people just not go. So it looks like these two places now offer sensory bags in order to help individuals with dementia, autism, PTSD, and other similar conditions where they have a sensitivity to sound or to overstimulation. It says these bags contain noise-canceling headphones, fidget tools, verbal cue cards, and weighted lap pads. So a lot of different things to really help out folks who have these sensitivities and aren’t usually able to get to these kinds of events, maybe because of fear of the overwhelming noise and stimulation that would have caused issues in the past.
Josh Anderson:
So very nice that these places are thinking about opening up their doors to individuals with, again, autism, dementia, PTSD, and other similar conditions where they might get overloaded just from all the noise and everything. They’re really opening their doors and making it so those folks can enjoy these events as well. We’ll put a link to that over in our show notes. I also found an interesting story that fits in with today’s interview. It’s over on Grocery Dive, it’s written by Jessica Dumont, and it’s titled, “Amazon adds tool to aid visually impaired customers in the kitchen.” A lot of us maybe have an Amazon Echo device, and we won’t actually say her name so we don’t turn on your devices, but the story says it can now identify household pantry items.
Josh Anderson:
So if you have one of the Echo Shows, the one that actually has the camera that can see you and the screen on it, you can hold something up in front of it and say, “The A word, what am I holding?” and it will actually identify the item, using advanced computer vision and machine learning. The story says it’s now available to US customers who have a first or second generation Echo Show device. So very cool that they thought about this. You can just hold something up and say, “What is this? Is this a box of macaroni and cheese? A thing of Rice-A-Roni?” It says that Amazon actually collaborated with the Vista Center for the Blind and Visually Impaired in Santa Cruz, California, and also worked with blind Amazon employees on the research, development, and testing of the feature.
Josh Anderson:
So they partnered with an agency that’s going to know a lot about the needs of people who are blind and visually impaired, as well as with some of their own employees, just to make sure this was actually going to be really useful. Going down the story, it actually mentions Aira, so I won’t even talk about that part; we’ll be talking about Aira just a little bit later with Greg. It also notes that the technology behind this is very similar to Microsoft’s Seeing AI, which we’ve talked about on this show before, and it talks about how voice ordering has really helped out individuals who are blind and visually impaired, and how that’s been moved into a lot of different systems. We’ll put a link to this over in the show notes, but it’s very cool to think about how Amazon and some of these larger companies are thinking about individuals with disabilities, and opening up the accessibility of their products to individuals who are blind and visually impaired.
Josh Anderson:
We’ve talked about Aira and actually had them on the show, well before my time as host. I can remember first hearing about Aira at ATIA a few years back and seeing what it could do. Well, since then, Aira’s continued to innovate and offer more services and assistance, so we thought it was time to have them back on the show to tell us all about it. Our guest today is Greg Stilson, vice president of product for Aira. Greg, welcome to the show.
Greg Stilson:
Thanks, Josh. So happy to be here.
Josh Anderson:
Yeah, I’m really excited to talk about all the new things and kind of get to know you a little bit more. But before we get started, for folks who don’t know what Aira is, can you start off by just telling our listeners a little bit about it?
Greg Stilson:
Absolutely, yep. So Aira is a service we created that connects a blind or low vision person to a live, trained sighted person. We call our people on that side agents. They’re sighted agents who go through a significant training process to basically function as a pair of eyes for whoever needs it, but they do so much more than just function as a pair of eyes: anything from visually identifying things, to describing, to providing personal-assistant types of solutions, and things like that. Really, those agents are some of the most adaptable people you’ll ever meet. And you connect to them through an app on your smartphone, whether you’re using an iOS device or an Android device, or you can connect using our Horizon Smart Glasses as well. But Aira is a service that basically connects a blind or low vision person to a live, trained sighted agent whenever they need it, on their terms.
Josh Anderson:
We’ll hear more about everything you mentioned in a bit. First though, how did you come to start working for Aira? Tell us a little bit about your backstory.
Greg Stilson:
Yeah, so I’ve worked in the assistive tech industry for about 14 years, I would say. I spent 11 years with HumanWare, actually. I started there on their tech support team, and changed jobs several times, [inaudible 00:07:30] roles. I worked on their sales team, and I worked as a product specialist, always working alongside customers and really identifying what customers’ pain points are and things like that. As a product specialist I would do a lot of training workshops with HumanWare, and that kind of gave me this love for the UX side, the user experience side: identifying where the gaps are in the user experience and really creating the best user experience we possibly can.
Greg Stilson:
So I tried my hand at it later on. I was the product manager for HumanWare’s blindness products, and one of the biggest products that I oversaw from concept to reality was the BrailleNote Touch. I also worked on Victor Reader Stream projects, and Trekker Breeze projects, and things like that, and really found that I absolutely loved product management. Being able to see a product go from concept to reality is so surreal. I’ll never forget CSUN 2016, when we launched the BrailleNote Touch after developing that thing for almost four years. To be able to see that happen… Even today, as I travel the country, I’m still semi in the same space. I still see a lot of my former coworkers at various conferences, but I also see a lot of the teachers who are working with students in the classroom, talking about how their students are using the BrailleNote, and it’s still really awesome to see that side of it.
Greg Stilson:
So I worked, like I said, with HumanWare remotely, and I still am remote. I live in Madison, Wisconsin. A little bit about myself: I have a three-year-old daughter, and both my wife and I are blind. My daughter is not. So we’ll see what challenges that creates and what it doesn’t, but I’m looking forward to all of that, all those chapters and things like that. I came to change careers and joined Aira back in… I think it was December of 2017. It was really, personally and professionally, time for a change for me, and I was enamored with what Aira was doing. It was so innovative and so unique. They took a very simple idea.
Greg Stilson:
Connecting someone to a sighted person through a phone camera, or another camera, and providing visual information. But they were doing it in a way that created a level of independence that I personally, as a blind person, had never experienced. And I’ll remember this clearly for a long, long time… You mentioned ATIA. At ATIA 2017, I met Suman Kanuganti. He was our CEO at that time. And because he was kind of following what I was doing at HumanWare, he asked me, “Greg, why aren’t you using Aira?” And I said, “I don’t know that much about it, but I’ll tell you what. If this thing can really get me through an airport independently, where I don’t have to rely on anybody else, and I can get off the plane and go to my gate or go catch an Uber or whatever, independently just like any other sighted person, and not have to wait for one of those meet and assist people or deal with the whole wheelchair drama and all that other stuff…”
Greg Stilson:
I said, “You may have me hooked at that point.” And so he said, “All right, give it a try.” So ATIA was the first time I decided to try out Aira. Back then we were using Google Glass, and I used it for the first time in the airport. I said, “All right, well, I’ll give this a shot.” We were using Google Glass connected to a MiFi. There was pretty much no app involved, or a very minimal app, at that point; you had to use the glasses. But I remember I got off the plane at the Orlando airport, and I didn’t wait for a meet and assist person. I called an agent. The agent immediately had my GPS location, they had a map of where I was in the airport, and we successfully navigated it for the first time.
Greg Stilson:
For several years in the past I traveled 100,000 miles a year for HumanWare, when I was doing my work internationally and things like that. And 2017 was the first year I had ever really navigated an airport by myself, without having somebody walking with me or needing to rely on somebody like that. It was probably one of the most emotionally freeing moments I’ve had, and at that point, I was enamored with what Aira was doing. When the time came to make a career change, Suman was a very persuasive individual, and I was already enamored with the technology. I joined the product team here, and the rest is history.
Josh Anderson:
Very nice. Now, I know you said back then they were using Google Glass, but you’re not using Google Glass anymore; you have something called the Horizon Glasses. Can you tell us about those?
Greg Stilson:
Yeah, absolutely. So we’ve been through so many different glasses. We publicly released two prior versions to users. The Google Glass was for our really early adopters, and some people really liked them; for other people, they just didn’t work well. The biggest challenge we faced with both the Google Glass and the next one, which was called [Austria 00:00:13:23], was that we had to rely on a MiFi in the middle. Because they were wireless glasses, there were really two major challenges that we faced. The first was that the battery really only lasted 60 to 90 minutes on those glasses, and especially when navigating an airport, some of those calls can take a half hour or 45 minutes. At that point, you’re dead for whatever the next airport is that you want to get to.
Greg Stilson:
So there was a lot of needing to keep power packs around and charging and things like that. But I would say the largest challenge we tried to overcome was that we were using a wireless hop, going either from your smartphone to the glasses or from one of the MiFis we would send in the kit to the glasses, and what would happen is it would really amplify a poor connection. What I mean by that is not amplify in a positive way: it would take a connection that was maybe only slightly poor, and because the data had to jump wirelessly from the MiFi to the glasses, during that jump we would lose connectivity much, much faster than you would see if you were making a call on your smartphone.
Greg Stilson:
So we had frequent disconnections and things like that. Our early adopters, I think, all saw the potential in what we were doing, and they were all very patient people as we honed the experience. But we did find out that the wearable… It doesn’t have to be glasses, but a wearable device that allows you to be hands-free was really the ticket. So we said, “All right, we need to solve this.” I remember exploring with our chief product officer at that time, Austin; we looked through so many different pairs of glasses trying to figure it out. We looked at Vuzix next. We looked at so many glasses that were in prototype phases, and none of them were built for our use case. And our use case was that we needed anywhere from three to five hours of battery life, and we needed stable connectivity.
Greg Stilson:
And still, to this day, those are our requirements, and I believe we achieved them. It’s a solution that works. I will say it’s still not the optimal solution we’re looking for, and we’re still in search of what we call the glasses Holy Grail. But what it is today is a pair of pretty stylish glasses. The camera is mounted right above your nose, so it’s centered, which is great. The other glasses had the camera off to the right, so getting somebody to look at something could be a challenge at that time, because you were kind of looking to the left of whatever you were wanting to gauge and see. This camera is an ultra-wide-angle lens with a 120-degree field of view, which gives the agent peripheral vision. But it’s not perfect.
Greg Stilson:
When you have that really wide field of view, fine details like text can be a little bit more challenging to read. That’s why this is really a two-piece solution. We have the glasses, and the glasses are hardwired into what we call the Horizon Phone, or the controller device. This is really the brains of the Horizon system. Very little processing goes on in the glasses themselves; there’s no battery in there, really just a camera and a motherboard inside the glasses. All the heavy lifting and processing happens on the Horizon Phone. That’s really the brain and the battery. Done this way, the Horizon Phone gives us up to seven hours of battery life. It also provides a stable data connection: because the glasses are hardwired into it, you no longer have that loss of fidelity from a wireless hop.
Greg Stilson:
And it also gives us a second camera. So if you’re not getting the clarity you need out of the Horizon Glasses… Let’s say you need to sift through mail, and the text on a piece of mail can be really small and really fine-grained. The agent can switch right over to your Horizon Phone camera, which has a much narrower field of view and can handle more of those fine details. You can hold it over the piece of mail and read that type of information, and you can do that on the fly if you need to. So it kind of gave us the best of both worlds. As I say, it’s not a perfect solution, but we do have thousands of users today using the Horizon Glasses successfully. And when I attended the summer shows, NFB and ACB, it was amazing to see the number of people who were using Horizon Glasses, or holding up their phones, working with agents to help navigate where they needed to go, on their terms.
Josh Anderson:
Well, excellent. It sounds like you guys really listened to feedback about those issues that were there originally, and really took it to heart to try to find something that would work a lot better for folks. So that’s excellent.
Greg Stilson:
Absolutely, yep.
Josh Anderson:
So you’ve talked about them a lot and kind of mentioned them already, but tell me a little bit more about the agents, the eyes, I guess, if you will; the people who are helping out.
Greg Stilson:
Sure. So the agents are really what makes Aira, Aira. It’s not the technology, it’s the people. I always joke that I work on the product team and I work to build the best product we can build, but when we look at what we as a product team build, it’s really: how do we reduce the friction to get you to an agent as efficiently and quickly as possible? We have a metric that we look at, which is time to agent, and our goal as a product team is to get you to an agent as fast as possible with as few hurdles in the way. Our agents are extremely special people. We hire fewer than 1% of the applicants who apply to become an agent, so the running joke is that it’s harder to become an Aira agent than it is to get into Harvard. But there is a skill set that is very challenging to find.
Greg Stilson:
You need to find some special people. These people, number one, have to be tremendously customer service oriented. They have to be patient, because blind people typically are not calling for an extra pair of eyes because they’re super happy about it. They’re trying to complete a task, and there may be frustration involved. Sometimes you may call just to get something described, but most of the time, amongst all of the alternative techniques that I have, if I’m calling Aira it’s because one of those other techniques didn’t work. It’s part of that technology toolbox. So our Aira agents are extremely customer service oriented and very patient. They’re empathetic.
Greg Stilson:
In our training, one of the things we say is: empathize, don’t sympathize. So you’ll never hear an Aira agent say, “Oh, you’re so amazing,” or, “I feel so bad that you’re blind,” or any of that kind of stuff. Our agents are empathetic, and they’re trained to think like a pair of eyes, not a brain. We call the users explorers, and the explorers who call in are the decision makers at all times. You can ask for an agent’s opinion, and if they feel comfortable, they will give it to you, but you need to prompt them for it. For example, we have people call in and say, “Hey, I didn’t label this shirt or these pants. What color are these? What does the pattern look like?” And the agent will describe that.
Greg Stilson:
And some people, like myself, who have absolutely no fashion sense, will say, “Hey, what’s your opinion, do these match?” At that point, if the agent is a more fashion-oriented person, they may say, “Yeah, absolutely, it looks great,” or, “No, I wouldn’t wear that shirt with those pants,” or something like that. But you ask them for their opinion; they don’t just give it to you. So everything’s really objective at that point. They’re also trained in… They’re not O&M specialists by any means, but they know how to speak the lingo. That’s one of the things you’ll experience when you’re walking with an agent. I’m a cane user myself, but we train agents on both cane use and guide dog use. They’ll use phrases like, “Okay, off to your right is a wall, if you want to shoreline,” or, “If you tap off to your left, you’ll feel that there’s a curb, let’s say one foot to your left,” or something like that. And they’ll use the same style of terminology for dog guide users as well.
Greg Stilson:
And I think one of the really cool things… We’ve got so many videos on YouTube, but one of my favorites, one that really shows the experience in action: if you type “Zac navigates the Denver Airport, Aira” into YouTube, you’ll actually see a real experience of somebody navigating an airport. What I really like is that in that video, his cane taps poles, it taps those stanchions, it taps all of those things, and you never once hear the agent say, “Oh, watch out,” or any of that kind of stuff. They say, “Oh, you’ll feel this on your left,” or, “You’ll feel that on your right.” They know that you as a blind person know how to navigate with your own tools, so it’s really an extra available tool for you as you’re using your primary tool, which is your cane or guide dog.
Josh Anderson:
And that’s great. Some of the folks I’ve talked to who use Aira have said that. Because like you said, they’re empathetic. They’re a helper. They’re not telling you what to do or where to go. They’re just there to answer your questions and help you along as kind of an extra guide, I guess, without completely leading the way: beside you, not in front of you.
Greg Stilson:
Exactly. I’ve heard from so many folks who maybe haven’t used it or are skeptical of using it, and there are a lot of comments out there like, “Oh, well, your O&M skills are going to erode or decay because you’re using Aira.” And quite honestly, I think you hit it right on the head, Josh: there’s really no difference between walking next to a sighted person and having Aira. You still have to use those skills. The difference is that a sighted person walking next to you might grab you and pull you out of the way if you were going to run into something, whereas Aira doesn’t do that. They will give you that extra information, so you still have to be using your O&M skills. I would say there are some tremendous stories out there of folks who maybe aren’t super confident. Let’s say they came to blindness later, or are starting to lose their vision, and they have a lack of confidence initially because they don’t have that significant training.
Greg Stilson:
What we’ve heard is that Aira helps give them the confidence to go try new things that maybe they wouldn’t otherwise do. One example of that is myself. As a blind person, I hated farmers markets, because my wife is blind, I’m blind, and having to walk up to each and every stand to find out what they have is incredibly annoying. I normally wouldn’t go to a farmers market, but I said, “This might be different.” So I tried Aira, had the glasses on, and my wife was with a friend of hers and my daughter; they were just pushing the stroller around. And she’s like, “Well, why don’t you go grab this, this, and this?” And I was able to just independently walk to the various stands and pick up the items I wanted, and not have to go to every single stand. That actually made it an incredibly good and fun experience, whereas I wouldn’t have had that same experience without that level of detail.
Josh Anderson:
Listeners, unfortunately, that’s all the time we have for today, but there’s still a whole lot more to be said. So make sure to tune in next week to hear the rest of Greg’s interview and hear all the new things Aira’s doing and the great new features that are available. Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If you do, call our listener line at (317) 721-7124, shoot us a note on Twitter @indataproject, or check us out on Facebook. Are you looking for a transcript or show notes? Head on over to our website at www.eastersealstech.com. Assistive Technology Update is a member of the Accessibility Channel. For more shows like this, plus so much more, head over to accessibilitychannel.com. The views expressed by our guests are not necessarily those of this host or the INDATA project. This has been your Assistive Technology Update. I’m Josh Anderson with the INDATA project at Easterseals Crossroads in Indianapolis, Indiana. Thank you so much for listening, and we’ll see you next time.