AT Update Logo

ATU691 – Envision Updates with Karthik Kannan

Your weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist people with disabilities and special needs.
Special Guest:
Karthik Kannan – Co-Founder and CTO – Envision
——————————
If you have an AT question, leave us a voice mail at: 317-721-7124 or email tech@eastersealscrossroads.org
Check out our web site: http://www.eastersealstech.com
Follow us on Twitter: @INDATAproject
Like us on Facebook: www.Facebook.com/INDATA 
—– Transcript Starts Here —–

Karthik Kannan:

Hi, my name is Karthik Kannan. I’m the co-founder and CTO of Envision. You’re listening to the Assistive Technology Update.

Josh Anderson:

Hello and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs. I’m your host Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 691 of Assistive Technology Update. It is scheduled to be released on August 23, 2024.

On today’s show, we are super excited to welcome back Karthik Kannan, co-founder and CTO of Envision, and he is here to tell us about some really great new updates to the Envision Glasses and their whole platform, including some really cool AI tools that can help folks with visual impairments access the world around them.

We’d love to send a huge thank you to everyone who attended our online training on innovative assistive technology yesterday. If you weren’t able to join us, a video of that will be uploaded to our full-day training archives here very soon. If you ever do want to go to our full-day training archives, please visit eastersealstech.com/fulldayarchives.

Today also marks a very special occasion. Tomorrow is August 24, 2024, and that marks six years since I took over hosting duties of this show from a previous host, a mentor, friend, Wade Wingler, who started the show and then did the first 300 and some odd episodes. I don’t know, I’ll have to kind of figure out exactly how many. But tomorrow marks six years, which I had to look about three times to really figure out if that was really true. Have I really been doing this six years sitting behind this microphone, getting to interview these amazing folks? And it actually has been that long.

So whether you’re a new listener, whether you’ve been here since well before my time, thank you so much for listening. Thank you for giving me the opportunity to come on here every Friday to get to talk to the amazing folks that make this assistive technology, that create these accommodations for individuals with disabilities. And I must admit that I enjoy every bit of it. I enjoy getting to learn, getting to find out just all the ways that folks are using new technology, older technologies, and just using things in whole new ways in order to increase accessibility for everyone.

I must admit, sometimes it’s very hard to kind of work this into the schedule and other duties of the job, but at the same time though, hopefully it is a good resource for you listener. I know that it definitely is for me. I share the information I learned on this podcast with my team, with others as much as I possibly can, and just really try to get that word out about assistive technology any chance that I get.

So again, to our new listeners, to folks who have been around since the beginning of the show or even for the last six years since I’ve been here hosting it, I just want to say thank you. Thank you for letting me be a part of this. Thanks for letting me just talk to these amazingly great guests. Thanks to all of our guests for coming on here for really and truly just explaining and telling me about all the great things they do. I do know, and I’ve known for a long time that my view of the world and everyone in it has been skewed quite a bit by getting to do the work I get to do and be around such amazing people all of the time.

So listeners, as always, thank you so much for listening. Again, whether it’s been for the whole, I don’t know, 11, 12 years this show’s been on, for the six that I’ve been hosting, or if it’s your very first time, thank you so much for listening. And now let’s go ahead and get on with the show.

Folks, we cannot thank you enough for giving us a listen here at Assistive Technology Update. Without you, we would not have been around for coming up on, getting pretty darn close to that 700-episode mark. But did you know this is not the only podcast that we have? You can also check out our sister show, Assistive Technology Frequently Asked Questions. This show comes out once a month and it features panelists Belva Smith, Brian Norton, and myself as we try to answer the questions that are plaguing your mind about assistive technology. We gather up all the questions we get during the month from emails, phone calls, and many other means, and then we do our best to answer them. But I got to tell you folks, believe it or not, we do not know everything. So we rely on our listeners a lot to reach out to us and give us some of those answers or maybe just talk about their personal experiences and things that have happened to them.

So if you like Assistive Technology Update, you may very well love Assistive Technology Frequently Asked Questions. Again, that’s Assistive Technology Frequently Asked Questions where you can get your questions about assistive technology answered, or if you happen to have the answers to some of the questions asked on that show, please, please, please do reach out and let us know so that we can help the community with the answers that they so desperately seek. Much like Assistive Technology Update, you can find Assistive Technology Frequently Asked Questions wherever you prefer to get your podcast. And as always listeners, thank you for listening.

Listeners, we are super excited to welcome Karthik Kannan from Envision back to the show to give us an Assistive Technology Update update on all the cool things that they’ve been working on and some updates to Envision that we’re looking super forward to hearing all about. Karthik, welcome back to the show.

Karthik Kannan:

Thank you so much for having me here, Josh. And my name is Karthik Kannan and I am the co-founder and the CTO of Envision.

Josh Anderson:

Yeah, and we are just so excited that you’re going to come back on and tell us about all the really exciting and cool updates to Envision. Before we do that, for listeners who maybe haven’t heard you on here before, could you tell us a little bit about yourself and your background?

Karthik Kannan:

Sure. So my name is Karthik, and my co-founder and I started Envision about seven years ago. And the main idea was to basically build a product that would help people who are blind or low vision to live more independently, to be able to go out independently, to be able to sit at a restaurant and order food for themselves by themselves, or being able to walk around the city, navigate independently around the bus stops and signs, and get descriptions of what’s happening around them.

So those are all things that we wanted to solve from the very beginning about seven years ago. And we initially built Envision as a smartphone app. And people really liked it. We won the Google Play Award and a whole bunch of things happened. But then there was always this nudge from the customers that, “Hey, it’s great to have everything you’re doing on a phone. I mean, you can hold a phone and then it’s great to be able to maybe use ChatGPT or other apps, take pictures, and then get descriptions, but it’s kind of hard to do this when you’re outside or when you have a cane in your hand or a dog or any of those things.”

And so that’s when the idea for Envision Glasses was born, where the main focus is to give you everything the Envision app does, which is being able to read text or being able to scan documents. You can even have a two-way conversation with the Envision app, and it all serves as an assistant. And to do all of that, but to do it on the glasses where you can be completely hands-free, where you can just go about doing your thing and not have the app come in between you and stuff.

So that’s the idea with Envision. And we’ve been on this journey for the last four years with the Envision Glasses and we’ve got thousands of people across the world today using the Envision Glasses.

Josh Anderson:

Awesome, awesome. And yeah, I know I talked to some folks that really love them, love the app and use it for a lot of different things, and other folks that really use the glasses and really find them super beneficial. We’ve got you on here today because you guys have some really new stuff and exciting updates coming out here very soon and lately. So Karthik, what is new with Envision?

Karthik Kannan:

Yeah, I think with Envision, the big, big thing is basically giving people the ability to have a back-and-forth conversation with their glasses after they take a picture. And it’s truly a back-and-forth conversation, because when you speak to me, I basically respond to you in less than half a second. Hopefully it doesn’t feel like you’re talking to a computer; you feel like you’re talking to a human because of the way we react to each other. And that is something that we focus a lot on at Envision, trying to make sure that the responses come back so fast they feel instantaneous. And that’s the big update that we have on the glasses, where today people can take a picture of what’s happening around them, get a detailed description, a very detailed description, including the description of faces or the text in the image, and then have that come back to you in the blink of an eye. That is the big update that we launched on the Envision Glasses. It’s called Instant Results.

And people who own a pair of the Envision Glasses can basically update to the latest version and then they can just go ahead and try some of these new features out that we’ve put out. And of course, apart from that, we have got the ability to scan documents of all kinds. People, for example, use the Envision Glasses at work, or if they get a letter from the utility provider, for example, they can just go ahead and scan the document automatically with the Envision Glasses and then ask how much they have to pay for it.

Or if you’re at a restaurant, the glasses know, or you can tell the glasses, “Hey, listen, I’m someone who’s vegetarian, I’m looking to lose a bit of weight or be healthy.” So when you read a menu, it would not only answer your questions about the menu, but you can also ask it for suggestions on what you could eat.

Josh Anderson:

Nice.

Karthik Kannan:

And the glasses would basically understand that you’re a vegetarian, you’re looking to be a little bit more healthy, maybe it suggests greens, it suggests low-fat options, and so on. So that’s, again, very helpful, feels almost magical for someone who’s going to be sitting at a restaurant and then just ordering by themselves.

Josh Anderson:

No, and it’s amazing. I know when I started doing this, and I mean it’s been a while, artificial intelligence was maybe there, but definitely in its infancy. But yeah, if you used any kind of scan-and-read device of any kind, you had to just wait for the information you were looking for. You had to wait for it to read every single thing back to you. And for some things…

Karthik Kannan:

Exactly.

Josh Anderson:

… menus, other stuff, I mean, that’s a lot of information to get through, especially if it’s giving you ingredients or descriptions and everything else. So I love that you can just ask for that important information and it goes ahead and just extracts that right straight for you.

Karthik Kannan:

Yeah, yeah, no, that’s basically the main idea is that we know that you’re not interested in reading all the text. Whenever someone sits down to read a menu, for example, they’re not interested in reading every single dish. They just quickly scan through or they have some idea on what they want to eat, and then they come look at the menu and then they make a decision based on that. So that’s exactly what we’re trying to do with the describe scene feature on the Envision Glasses.

And again, it’s available for people to try out. It’s been doing really, really well amongst people, especially the older members of the Envision Glasses community, because they just find it such an easy way to interact with the world instead of having to learn how to press a button or do this tap or do that tap or say this magic incantation and stuff. They just can start talking to it without trying to sound half like Harry Potter.

Josh Anderson:

Sure. And that’s a huge thing too because yeah, for folks just sometimes just that learning curve is so big that they just abandon the technology and decide it’s a little bit too much work. So I love that you’re making it easy. And like you said, just the plain language, have a conversation and be able to talk to it.

Karthik Kannan:

Exactly. I think the thing is, a lot of times even I get confused when I use different voice assistants, how do I actually trigger them? For some, it’s like, “Hey, Alexa,” or it’s like, “Hey, Siri.” Or if I say something like that during the conversation right now, my Siri just wakes up. And so there’s a lot that we are trying to simplify and I think the Envision Glasses today is the simplest possible way for anybody to interact with AI. I think it’s the simplest, most effective way because yeah, it’s just designed to get out of your way and to be less complex.

Josh Anderson:

Oh, for sure. And Karthik, I know we kind of talked about the text and finding and being able to parse that out. I know it can kind of describe a scene and a setting and everything to me. Can I ask it questions about the scene and the setting and get more information as well?

Karthik Kannan:

Yeah. So for example, a lot of people use it when they’re dressing up. They take a picture and then they’re like, “Hey, so what does this shirt look like?” And then the glasses gives you a description. And then you could ask it a question like, “Hey, is this particular skirt I’m going to wear, is that matching with my shirt,” and so on. And it is a very, very advanced AI that you can have a conversation back and forth with.

And you can ask it any type of question. You can ask it questions about the text in the image. You can ask it questions about the people. You can get the detailed descriptions of people’s faces and what they’re wearing, their hair color, and all those kind of attributes. You can get very detailed descriptions of your phone or your laptop screen. A lot of people, again, use it to debug issues or when their screen reader crashes, for example.

This is a very useful case because the other alternative is for you to call a human being on Aira or Be My Eyes and you have to get them to help you with pointing at the screen and telling you what’s exactly going on. So a lot of times people just take a picture with the Envision Glasses and then get a description of what’s going on on their screen. And they can even ask for more specific information. For example, a fairly older user was able to get the Envision AI to go ahead and help him out with something related to JAWS when his JAWS was broken on his PC.

So those are things that you can do with the Envision Glasses, describe scene. Very versatile, kind of works in all these environments. And also specifically trained or taught to be a very simple assistant that you could just ask questions to.

Josh Anderson:

Nice. And I love how it can just elaborate and give you more. And yes, for as long as I’ve done this, when certain things pop up on your computer and you’re trying to use JAWS or another screen reader and it stops it from working, that is one of the most frustrating things in the world because no matter how skilled you are, if you can’t see that darn box that just popped up, then it’s definitely not helpful at all.

Karthik, I know you got a lot of folks out there using this and everything. Has anyone used it in a way that surprised you or maybe even just a really cool way where you were like, “Oh, that’s even a use that we didn’t think of,” or anything else like that?

Karthik Kannan:

Yeah, I think people are, for example, using this, like I said, to troubleshoot their computers. I think for me, I didn’t even know that our AI was advanced enough to help someone out with JAWS. I mean, of course we did train the AI, or the AI is basically trained on the entire internet. That’s basically the vastness of how much knowledge this AI has. But you don’t consciously think of JAWS technical manuals as part of the whole training set, but it was. And it helped that person out. And now that person is like, “Hey, previously, I might have had to wait for about five to 15 minutes, depending on when I get to speak to a human being through Aira or Be My Eyes or any one of the many apps that people used to call and then point that phone at what’s going on, get an idea of it, and so on.”

But now with the glasses, they just take a picture and then just start having this back-and-forth conversation. And they can keep adding more pictures to the conversation. They can just keep adding more and more pictures. So in a given conversation, it’s able to refer to images in the past as well as the present. That’s something that the glasses can do that’s really surprised us.

I think another aspect where the glasses have really surprised us is with how much help they are when it comes to reading things like textbooks for students. Again, we’re seeing AI help students interpret graphs in their textbooks, or at least give them a basic description of what the graph is about. Along with graphs, the Envision Glasses can also decipher tables, so it can break down tables for you and you can ask questions of it. You can have it do basic arithmetic on tables as well. So if you want to know what the average of a particular column is or what the total of a particular column is, those are things that people have been using the glasses for. Again, very surprising use cases.

And I think lastly, when they’re just out and about and when they want to feel safe. Sometimes when you’re outside you don’t know exactly what’s going on and you don’t know where you are and stuff. And so people just take a quick picture of what’s going on and they get a description in the blink of an eye, and they can just keep clicking pictures as they go along and then see if something’s up or not. So yeah, it’s a very useful tool in terms of just being outside and navigating outside and understanding what’s around you and stuff.

Josh Anderson:

Awesome. Awesome. Karthik, for folks who, again, maybe don’t kind of know about Envision or the Envision Glasses and everything, can you kind of describe them to us? Just because I know we’ve probably got listeners out there that are imagining something as big as a VR headset or some other stuff like that. So just to make sure that they understand, can you describe the glasses physically to us?

Karthik Kannan:

Sure, I can describe the glasses physically to you. I think the glasses look and weigh like any other pair of sunglasses. So what I mean by that is they are about 50 to 60 grams in total. Let me check how much that is in ounces. Yeah, it’s about 1.7 [inaudible 00:19:05] 2 ounces, you can say, roughly. That’s basically the weight of the glasses themselves. The 50 grams is what it is when you put on all the accessories and deck the glasses up like it’s a Ferrari or something; yes, you have about 50 grams.

And it’s got a camera on the right side of the frames. Sorry, yeah, and the camera is angled towards the center, so you don’t have to hold documents to the right or hold things in your right hand for the glasses to pick them up. The glasses have a wide-angle camera, so it’s like the .5 camera on your smartphone. That is angled towards the center. So anything you hold in front of your nose, roughly about an inch away from your nose, can be picked up by the glasses easily. So it has a very wide field of view.

It’s also got microphones and a speaker directly on the glasses. They’re not bone conduction speakers, they’re not stereo speakers. They basically are on the right-hand side of the glasses themselves. But the volume is just loud enough for you to hear it. Other people around you won’t hear it. And most likely when you’re outside in a noisy environment, you can bump up the audio volume on the glasses, but you can also connect them to AfterShokz headphones if you have them or your AirPods or whatever. It has Bluetooth in it, so you can connect it with Bluetooth.

It relies on Wi-Fi. There are a lot of functions on the glasses that work entirely on device, but then functions like the describe scene where it’s an assistant that describes everything around you and where you can ask it back-and-forth questions. That stuff requires the internet. And you can either connect to the Wi-Fi in your house, or you can connect to the hotspot on your phone if you have a hotspot plan.

And the good thing about the glasses is that they remember your hotspot connections, and then you can just go ahead and automatically connect them to your hotspot every time you’re outside. So when you’re indoors, it connects to your indoor Wi-Fi, and then when you’re outdoors, it will look for your hotspot and connect to it if it’s turned on. So there’s no need for you to keep switching between different networks. And you can set up Wi-Fi very easily on the Envision app. It’s like two screens and you just have to type in a bunch of information and that’s it.

Josh Anderson:

Awesome.

Karthik Kannan:

So that’s how the glasses are on the front side there. And then you have on the back of the glasses, or rather towards the back of your right ear, is where you have the battery and the charging port. And the glasses themselves, they come in different frames. Some of the frames are thin, some of the frames are thick, like safety glasses. And then you can also replace the lenses on the glasses. So you can have dark lenses if you want, and you can have prescription lenses. You just take them to an optometrist and they will replace them for you for free in like five minutes.

Josh Anderson:

Perfect.

Karthik Kannan:

So that’s a possibility. And lastly, we also offer a case with the glasses where you can put your glasses in and carry them with you. It comes with a USB-C charging cable, and it’s compatible with any USB-C fast charger that is in use today. So that’s something as well.

Josh Anderson:

Nice. That’s awesome. And let’s say I’ve got them and I’m going to be wearing them around for the day. About how long does a full charge last using them?

Karthik Kannan:

Yeah. So a full charge is around… I would say about five to seven hours of battery life. I’d say pretty solid for everyday use, office use, outside use, and so on. Yeah.

Josh Anderson:

Very, very cool. Well, Karthik, you may or may not be able to tell us this, but what have you got coming down the pipeline? Not that you need any more things kind of going on right now, but we’re always excited to hear about the new updates. So is there anything you’re working on or working towards?

Karthik Kannan:

Yeah, we’re working on a new version of the document scanning feature on the Envision Glasses. We have a completely new product that we are going to be launching very soon for the blind and low-vision market, which will also work on the Envision Glasses and be made available to Envision Glasses users. It’s a completely new product that I think the blind and low-vision space has not seen yet. I think there have been some attempts and stuff, but at a time when other companies are moving out of the blind and low-vision space, very big companies are moving out of the space, I think we really believe in doubling down here with accessibility, specifically in assistive tech, and we’re going to be launching a new product very soon. I think that really would help a lot of people just use AI more effectively in their lives and make it more simple.

And we already have this on private beta. And so far the response has been pretty phenomenal and we are just trying to go ahead and figure out how to scale this to more people when we launch it publicly. So we’re just in the process of refining the stability of it, the reliability of it so that when we have all of the Envision app users, for example, using this, it should work totally fine. But yeah, we’re working on that part now.

Josh Anderson:

Awesome. Well, we can’t wait to find out more about it and kind of hear about it when that comes out. Well, Karthik, thank you so much for coming on. If our listeners want to find out more about Envision, the Envision Glasses, the app, everything else, what’s a great way for them to do that?

Karthik Kannan:

Yeah, so you can go ahead and check out our website. That’s letsenvision.com, letsenvision.com, and you can request a free demo of the Envision Glasses. We either give you a free in-person demo if you’re in the United States or in some of the territories that we have distributors in. If not, we do a free online demo of the Envision Glasses. Anyone can purchase a pair of the Envision Glasses on our website. And when you purchase a pair of Envision Glasses, you get a 30-day, no-questions-asked return policy that allows you to go ahead and return the glasses if you choose to. So you can go ahead, buy a pair of glasses, try them on. We also offer an onboarding as well when you buy a pair of the Envision Glasses.

So yes, that’s basically a quick summary of where you can find out about us. You can go to our website, letsenvision.com/glasses, and then you can request a free demo.

Josh Anderson:

Awesome. We will put that down in the show notes. Well, Karthik, thank you so much for coming back on the show today, telling us about all the great updates and all the really, really, really cool things that Envision can do to just really help folks with accessibility and getting access to just so many different things. So thank you so much.

Karthik Kannan:

Thank you. Thank you so much for having me on board, Josh.

Josh Anderson:

Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our listener line at 317-721-7124, send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter @INDATAProject. Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation or InTRAC. You can find out more about InTRAC at relayindiana.com.

A special thanks to Nikol Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fretted over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update. I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-Bye.
