
ATU697 – Irisbond with Hannah Erickson



Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.

 

Special Guest:
Hannah Erickson – Head of Training and Support – Irisbond
Irisbond website: https://www.irisbond.com/en/
More about BridgingApps: www.bridgingapps.org
iOS 18 Accessibility Stories
Motion Sickness Control: https://bit.ly/3N5Sx0i
Other Accessibility: https://bit.ly/47NDpOG
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA 
—– Transcript Starts Here —–

Hannah Erickson:

Hi, this is Hannah Erickson and I’m the Head of Training and Support at IRISBOND, and this is your Assistive Technology Update.

Josh Anderson:

Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 697 of Assistive Technology Update. It is scheduled to be released on October 4th, 2024.

On today’s show, we are super excited to welcome Hannah Erickson, Head of Training and Support for IRISBOND, and she’s here to tell us all about their eye tracking system and really, just the importance of eye tracking and all the great things that it can do. We’ve also got a story about some of the new accessibility features in iOS 18, and we are joined once again by Amy Barry from BridgingApps, with an app worth mentioning.

Don’t forget, listeners, we always look forward to hearing from you. Give us a call on our listener line at (317) 721-7124, or shoot us an email at tech@eastersealscrossroads.org. So please don’t hesitate to reach out if you’ve got a question, a comment, or a guest you’d like to hear on the show. Pretty much anything, really. We always love hearing from our listeners and getting your input to make this show even more enjoyable for you. Now, with that in mind, let’s go ahead and get on with the show.

I know a lot of our listeners out there are iPad, iPhone, and just iOS users. So iOS 18 came out just a week or so ago, and it definitely came with some brand new and really cool accessibility features. One of those is something called Vehicle Motion Cues, and this is meant to help reduce motion sickness when using the iPhone or iPad in a moving vehicle, so a car, a plane, anything like that. I found a story over at 9to5Mac that explains how this works a little bit better. So if you turn it on and use it, a bunch of animated dots appear on the edges of the screen and they move as you’re riding in the car. It’s really kind of hard to explain on a podcast, I suppose, but basically these little dots show up on the screen, so if you’re trying to read something, the dots would be over that. And as you move in the car, they actually move along with the movement of the vehicle.

So, why does this work? How does this help? Why is this important? Well, I’ve got to admit, this article really helped me out a lot because I didn’t understand motion sickness. I suppose I’ve had it before. I don’t really get it in the car much, maybe a little bit in a plane with turbulence and things. But I do get the thing where you’ve been out on a boat for a while, you step onto the dock, and the dock feels like it’s swaying even though it’s sitting completely still.

So, just to quote from the story here, it says that you get motion sickness when there are conflicts among your senses. It gives the example of being on a ride at the fair that’s spinning you around and upside down. Your eyes see one thing, your muscles feel another, and your inner ears sense something else. All of these mixed signals going into the brain are what cause you to feel dizzy or sick. So the idea behind this is that, with these little dots moving in the same motion as the vehicle, your brain kind of stays in tune, as opposed to reading static text or watching a video on a static device. It gives you movement that goes along with what your body is experiencing and what the world around you is doing, and that actually reduces motion sickness. So a very, very cool feature, and one that maybe you wouldn’t have thought of right off the bat, or something that’s a little bit different, but a very cool accessibility feature available in iOS 18 on iPhones and iPads.

We also got a story over at Cult of Mac that digs a little bit deeper into some of the other accessibility features, and one of those is eye tracking. Now, of course, we’re talking about eye tracking with a dedicated device here as we get into our interview about IRISBOND, but this is built-in eye tracking on the iPhone and the iPad. At the time of recording, I have not been able to get this on my iPad yet, but I have tried it on my iPhone and it works. I am able to use my eyes to control most of my device. I’m sure this is something that will get better as time goes on, but with that being said, it does have some limitations. If you’re really in need of full-on eye tracking, I would definitely say still stick with a device like the one we’re talking about later, or a full-on eye control device, just because the built-in version isn’t fully there yet.

At the same time, I mean, it does work. It’s not super intuitive, but I’m sure it will get better as time goes on. Also, I wasn’t using it in the completely correct way; it does say to use a stand and keep the device fairly close. But anyway, it is neat that they’re at least building it in, and if nothing else, it helps build awareness. As people see it in there, they know that it’s available and it’s something else that they can really consider.

Another thing they talk about is Music Haptics. I played with this a little bit too, and it is pretty neat, actually. It gives a whole other dimension to the audio. Music Haptics is kind of like rhythmic vibrations and different buzzing patterns timed to certain Apple Music songs. So you can actually feel it; the phone vibrates along with the music. For individuals with hearing impairments, individuals who might be deaf, it allows you to feel the music and really know what’s going on with it. It’s pretty neat. It doesn’t work on every song, just on some things on Apple Music, but still, it is pretty neat that they’re including that in there.

Another one that’s really great is something called Vocal Shortcuts. Now, for the longest time, I know that Google, and I’m sure Apple as well, have been trying to take individuals who have non-standard speech and train their AI engines to understand what it is that the individual is saying. This kind of takes it the other way, where you can teach the iPhone to recognize custom phrases and then have those perform a task. So here in the story in Cult of Mac, it talks about things like, “Hey Siri,” which usually wakes up your phone, and gosh, I’m sorry if I just woke up anybody’s phone, but you can have different words, different wake words, different controls that the users set up themselves. Something that’s easier for them to say, easier for them to use, and allows them access to their device even if they do have a different kind of speech pattern.

Then another one that’s available, kind of along those same lines, is Listen for Atypical Speech. This will allow Siri to listen for atypical speech to make its speech recognition and accuracy a little bit more accessible, I suppose. It can really allow individuals who maybe don’t have what you’d consider standard speech to still be able to access their devices. So, very cool. We always love when new accessibility features come out, and I’m sure these will get even better as more updates come, but I just wanted to highlight a few of them from a few different stories there so that everybody can go check them out.

Do you have a favorite new accessibility feature in iOS 18? Something we didn’t mention, something that maybe was updated, fixed, or works a whole lot better for you? Please, shoot us an email at tech@eastersealscrossroads.org, and let us know all about it.

Next up on the show, please join me in welcoming back Amy Barry from BridgingApps, with an app worth mentioning.

Amy Barry:

This is Amy Barry with BridgingApps, and this is an app worth mentioning. This week’s featured app is called Right Hear: Blind Assist. Right Hear is a free navigation tool for users with visual impairments. It helps users easily orient themselves by providing information about their surroundings, both outdoors and, in locations where Bluetooth beacons are installed, indoors.

Right Hear has a very simple interface. After downloading the app, users just have to open it to hear their current location. If they are not in a Right Hear enabled location, the app will start naming what is near them. If users are within range of a Right Hear enabled location, they will hear their current location and what is around them indoors as they turn their phone in different directions. If available, users can also call a local representative for the location they are in through the app, open the business’s web page, use the lens feature to access third-party object recognition apps such as Be My Eyes, Seeing AI, Cash Reader, and Envision AI, and find out the direction they are walking.

Right Hear was trialed by a BridgingApps staff member and her blind client in the Easterseals building. The client liked that the app was easy to use and did not require much instruction. She liked that she did not need to use her camera to navigate indoors, and that all she had to do was point her phone in different directions to receive information about her environment. As of fall 2024, the Right Hear website states that there are 2,381 Right Hear enabled locations worldwide. The app is currently available for both Android and iOS devices, and it’s free to download. For more information on this app and others like it, visit bridgingapps.org.

Josh Anderson:

Listeners, today we are kicking off AAC Awareness Month with Hannah Erickson from IRISBOND, and she is going to tell us all about a great alternative input device that can help AAC users and other individuals with disabilities access their devices, communication, and, in turn, the world around them. Hannah, welcome to the show.

Hannah Erickson:

Hi, thanks for having me.

Josh Anderson:

Yeah, I am really excited to get into talking about IRISBOND and the technology. But before we do that, could you tell our listeners a little bit about yourself?

Hannah Erickson:

Yeah, of course. Well, I grew up in Minnesota, but I came here to Spain, where IRISBOND is based, almost a decade ago as a language teacher. And although I have a background in psychology and in education as that language teacher, I didn’t have much awareness of eye tracking until I began working at IRISBOND about four years ago. But since then I’ve been opened up to the huge world of AAC and assistive technology, and I have really dived into it.

Josh Anderson:

Awesome, awesome. And I always love the way that that kind of happens. I had almost no experience in assistive technology when I kind of started in this program about a decade ago, and now I don’t know, I get to talk about it with folks like you all the time, so it’s really great. Well, Hannah, let’s kind of start with maybe the big picture, the overarching picture. What is IRISBOND?

Hannah Erickson:

Well, IRISBOND has been an AAC provider and eye tracking manufacturer here in Spain for about 10 years now; we just celebrated our 10-year anniversary last year. We work with lots of partners to bring AAC solutions that include our eye tracking device to lots of different areas around the world, like I said. And as manufacturers of the eye tracker and developers of some of the technology that goes with it, we’re always working to make sure that we’re improving, and working really closely with our end users to make sure that we’re providing some really good technology for them.

Josh Anderson:

Awesome, awesome. And I guess to take a little bit of a step back, maybe for our listeners who don’t know, what is eye tracking?

Hannah Erickson:

Yeah, that’s a good question; it’s a good place to start. Eye tracking is a form of technology that detects and follows the eye movements of the person using it.

So eye tracking can really be used in two main ways. The first way is to collect information about how we look at things, in what order we look at them, where our eye gaze is drawn to. And sometimes it’s used to look at how much time we’ve spent looking at something, to understand our interests and our attention and things like that. But the second use of the technology is really more about interacting with devices. In this context, eye tracking can be used as assistive technology, and this allows people to communicate and access their world via their eyes. So that’s really what we focus on a lot at IRISBOND.

Most of the time, eye gaze users can use their eyes to navigate through a computer, a tablet, an iPad, all of that, just by looking at different areas of the screen. They’re kind of completing what another person might do with a traditional mouse and a computer. So it’s really incredible, all the things that you can end up doing with just eye gaze.

Josh Anderson:

Awesome, awesome. And as we kind of talk about the device, tell us a little bit about the eye tracking device from IRISBOND.

Hannah Erickson:

Yeah, well, our eye tracker is compatible with lots of different options. I guess the special thing about it is that it’s compatible with both Windows systems, so your PCs and tablets on the Windows side, and also with Apple devices, so iPads. A lot of eye trackers might only be able to work with one. So our eye tracker is really designed to be very flexible and easy to use with both types of devices, and you can combine it with those devices for two objectives, I guess. One is communication, being able to communicate just using your eyes by connecting our eye tracker to the device. And the other would just be access, like what I said before, so that you can access your computer and be able to work on a myriad of different things just with your eyes.

Josh Anderson:

Awesome, awesome. And not to dig in too, too deep, I guess, but just talking about the other things that you can access, can you give us an example of what some of those are?

Hannah Erickson:

Yeah. Well, some of our users have used Hiru, which is the name of our multiplatform eye tracker, to do everything from continuing to work after a diagnosis of a degenerative disease, so accessing their work platform, sending emails, searching the internet, all of those things, to studying. I can think of another user who uses the Hiru exclusively for access for his university degree. He’s studying math and science and using the eye tracker to access all of his university platforms and write his papers and calculations and all those things.

Josh Anderson:

Nice, very cool, very cool. Yeah, I know it used to be, I don’t want to say a one-trick pony, that’s always a terrible thing, but eye tracking could kind of only be used with one thing, just a native app or maybe just one program. But I love that it’s able to control the device and actually give folks access to so many different things and so many different ways of being able to do what it is that they’re looking to do.

Hannah Erickson:

Okay. So in addition to using the Hiru and eye gaze to access your whole computer, a lot of people are really using eye tracking for communication. Normally what this looks like is using AAC or communication software that has everything inside of it adapted for use and easy access with eye gaze, which makes things a lot easier when you’re working with eye gaze. Usually this includes things like a communicator, whether that’s symbol- or text-based communication, as well as apps that are adapted for use with eye gaze. So you wouldn’t always have to be navigating your regular browser in order to use your computer with eye gaze; you could use it through these adapted apps, which can help a lot with accessibility.

Josh Anderson:

Hannah, I guess, can you dig in just a little bit more on maybe how or where the Hiru works with Windows and iPad?

Hannah Erickson:

For Windows, we have a proprietary computer access software called EasyClick. This is what allows eye gaze users to imitate a traditional mouse click or movement in order to access all parts of the device. As far as other software for communication, we have integrations with Jabbla’s Mind Express 5 software, Smartbox’s AAC software like Grid 3 and Look Lab, as well as some free access software that’s available online.

And then for iPad, things work a little bit differently. Even though it’s the same camera used for both, you have a different environment that you’re working in. On an iPad, you’ll be able to access any app via the iPad and Apple’s own AssistiveTouch tool. These are tools that are already built into the iPad, and we’re just using eye gaze to access them. Because Hiru has the MFi, or Made for iPad, certification, you’re able to access all of those tools that Apple provides in order to make your iPad work with eye gaze.

Josh Anderson:

Nice, very nice. Hannah, I’ve got to ask you, how do training and user profiles really work with the Hiru and with the systems that it controls?

Hannah Erickson:

That’s a great question. As I said before, I work in the training and support area of our organization, so I really do focus a lot on the training that we offer. Since we work a lot on implementing AAC for users on both Windows and iPad, and we’ve been doing that in Spain for years now, we wanted to extend our training options for AAC, for AAC providers, therapists, eye gaze users, and families, to the US and to other countries as well.

So one of the ways that we do that is by providing some complete, really detailed guides, based on different user characteristics or user goals, that can help someone get their device, let’s say a regular old iPad, set up for success with communication or the access that we talked about before. So you can use these setup guides to easily transform your iPad into an actual access tool or a powerful communication device that’s really personalized for you.

We have some of those things available in what we call our Knowledge Base, which is an area of our website that has lots of information for getting started with eye gaze, but we also do that through personalized training sessions. Those can, like I said, be completely personalized to fit the needs of whoever’s requesting the training. So yeah, we provide those personalized training courses about AAC implementation, eye gaze, and how to use the eye gaze in both operating systems.

Josh Anderson:

Awesome, and I know that you’re also built into some devices over here in the States, and I’m sure around the world, but what are some devices that folks might find the Hiru in, or maybe some partners that you have that are using that device?

Hannah Erickson:

Yeah, yeah. We actually have a lot of partners. Some of them are around the world, but I’ll just talk about the ones in the US, because exactly like you said, our partners often take our eye tracker and integrate it into their communication solutions. And we have partner devices for both Windows and iPad in the US.

For example, you will find us with PRC-Saltillo with the VersaEye for iPad. Control Bionics also has the Duo and the Trilogy, which are their devices for Windows and iPad; those include both our Hiru and their NeuroNode switch, which is something created by Control Bionics. Forbes also has the Hiru included in their ProSlate communication device for the iPad, as well as their WinSlate for Windows. And to finish off the list, you can also find Hiru inside some SGDs like Talk To Me Technologies’ SGD line, and RM Speech, as well as ImproveAbility. So we have a pretty long list, but the great thing about Hiru is just that: you’re able to combine it with lots of different types of technologies in different forms so that you can find the one that works best for you.

Josh Anderson:

Yeah, it’s always great. I’m sure there are probably some listeners out there who didn’t know what IRISBOND or the Hiru was when they first tuned in, and now they’re like, oh, I know exactly what that is, because I’ve seen it, used it, or maybe even use it on a daily basis. I think one of my favorite things that I found in researching for today and looking at everything was your feedback community. Can you tell our listeners a little bit about that?

Hannah Erickson:

Absolutely. Well, our feedback community is really just a program that shows how seriously we take the opinions of our users. I said at the beginning that we at IRISBOND are always trying to improve our technology, but really the only way that we can do that is by getting feedback, the good, the bad, and the ugly, from our end users. So we’ve set up our feedback community to be able to work with end users, families, and AAC professionals to make our eye tracking even better.

So what we do there is, people that are interested in joining the feedback community are able to get samples of our technology via a loan or a pre-release of different options, and just exchange opinions and feedback about it. This usually means a loan period with the device or some feedback sessions between people on our team and those end users.

Josh Anderson:

Very cool, very cool. And it seems like in doing this for a while, the products that really last and really seem to help folks are the ones that do take that feedback from the community and really kind of take it to heart as they improve and just make their devices even better and even more accessible. So Hannah, kind of on that same note, can you share a story about how IRISBOND or maybe the Hiru made a significant difference in someone’s life?

Hannah Erickson:

Yeah, definitely. I mean, we have so many great user stories that have obviously come from the feedback community, but also just the experience over the past 10 years.

But one that stands out, I would say, is probably the very first person with cerebral palsy ever to use the IRISBOND technology for communication, whose name is Gema Canales. She was a young child who started communicating with IRISBOND’s eye tracker from the time she was about three years old and has been very successful throughout the process. We learned a ton from working with Gema and her family throughout the years, as well as her SLP, Solis, who’s here in Spain, and we were able to kind of evolve and grow together.

Throughout these years she’s grown up; she’s already 16 now, so she’s a social butterfly and already finishing up high school. And she’s actually in the process of opening her own foundation, called the Fundación Gema Canales, to promote inclusive education systems and awareness of AAC, and to give everyone a voice here in Spain. So she’s definitely a user who stands out in my mind, someone we’ve been able to be very close to throughout the eye gaze journey that she’s had.

Josh Anderson:

That is absolutely, absolutely great. Hannah, if our listeners want to find out more, what’s a great way for them to do that?

Hannah Erickson:

Well, you can go to our website, just irisbond.com. There you’ll find information about how we work in the different countries around the world, and also about our feedback community and our user stories; we have a bunch on there as well. But you can also go to our YouTube channel. I think our YouTube channel is really important for people that have never seen eye gaze in action, just because it’s one of those things that you kind of have to see to believe. So yeah, our website, irisbond.com, our Knowledge Base, and our YouTube channel are the best places to get more information, and you can always reach out to us directly; you’ll find our contact information on our website.

Josh Anderson:

Awesome. We will put all that information down in the show notes so that folks can easily access it. Well, Hannah Erickson, thank you so much for coming on today, for talking about just eye tracking and the great things there at IRISBOND, the Hiru device, and all the different places that it shows up and things it’s built into. Just really appreciate you taking time out of your day and thanks for being on the show.

Hannah Erickson:

Of course, thanks so much for having me.

Josh Anderson:

Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our listener line at (317) 721-7124, send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter @INDATAproject.

Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation, or INTRAC. You can find out more about INTRAC at RelayIndiana.com. A special thanks to Nikol Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fretted over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update, and I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-bye.
