Hi. This is Cornel, and I’m the CEO and founder of .lumen, and this is your Assistive Technology Update.
Josh Anderson:
Hello, and welcome to your Assistive Technology Update, a weekly dose of information that keeps you up to date on the latest developments in the field of technology designed to assist individuals with disabilities and special needs.
I’m your host, Josh Anderson, with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. Welcome to episode 687 of Assistive Technology Update. It is scheduled to be released on July 26th, 2024. On today’s show we’re super excited to have Cornel, CEO and founder of .lumen, on. We also welcome back Amy Barry from BridgingApps with an app worth mentioning. Now let’s go ahead and get on with the show.
Listeners, I want to make sure that you are fully aware of our next full day training. So as you know, here at INDATA one of the jobs is getting the word out about assistive technology. One way we do that is by podcasts like this, but we also do four to five full day trainings throughout the year. Well, our next training is coming up on August 22nd, 2024 from 9:00 AM to 3:00 PM Eastern. This training covers innovative assistive technology and it is online only. All that you have to do is register to attend. Now spots are limited, so do make sure that you go and register for it as soon as possible. I will put a link to register down in the show notes.
During this day we’ll talk about all kinds of fun stuff, including artificial intelligence, robots, the internet of things, virtual reality, augmented reality, and adaptive control interfaces, and, more importantly than just talking about these technologies, we’re going to talk about how they fit into the world of assistive technology.
I will be doing most of this training, and I will tell you I am not an expert on any of these things. Now I say that in that kind of way. What I mean is you don’t want me building your artificial intelligence systems or teaching your machines machine learning. You will not get very far in that. But I do work in the world of assistive technology and see how artificial intelligence is being used by some of the amazing creators, including our guest today.
So if you’d like to join us for a full day training, get some of those magical little CEUs that you might need for some kind of certification that you still keep, or just want to kind of have a little bit of fun, or maybe learn how these innovative and emerging technologies are being used in the world of assistive technology, please do join us for our next INDATA full day training, Innovative Assistive Technology, which will be on August 22nd, 2024 from 9:00 AM to 3:00 PM Eastern. We’ll put a link down in the show notes so that you can go and register as soon as possible. We can’t wait to see you there.
Folks, we cannot thank you enough for giving us a listen here at Assistive Technology Update. Without you we would not be getting pretty darn close to that 700 episode mark. But did you know that this is not the only podcast that we have? You can also check out our sister show, Assistive Technology Frequently Asked Questions. This show comes out once a month and it features panelists Belva Smith, Brian Norton and myself as we try to answer the questions that are plaguing your mind about assistive technology. We gather up all the questions we get during the month from emails, phone calls and many other means, and then we do our best to answer them. But I got to tell you, folks, believe it or not, we do not know everything, so we rely on our listeners a lot to reach out to us and give us some of those answers or maybe just talk about their personal experiences and things that have happened to them.
So if you like Assistive Technology Update you may very well love Assistive Technology Frequently Asked Questions. Again, this is Assistive Technology Frequently Asked Questions, where you can get your questions about assistive technology answered, or if you happen to have the answers to some of the questions asked on that show, please, please, please do reach out and let us know so that we can help the community with the answers that they so desperately seek. Much like Assistive Technology Update, you can find Assistive Technology Frequently Asked Questions wherever you prefer to get your podcasts. And as always, listeners, thank you for listening.
Next up on the show, please join me in welcoming back Amy Barry from BridgingApps with an app worth mentioning.
Amy Barry:
This is Amy Barry with BridgingApps and this is an app worth mentioning. This week I’m sharing an app called True Link. True Link Financial is a credit card company whose sole mission is to assist caregivers, guardians and fiduciaries in maintaining control of money for their loved ones with intellectual and mental health challenges, all while sustaining a controlled level of independence. This extends to clients with special needs trusts. It’s a pre-funded debit or credit card, where all parties agree to how the money is to be used and any purchases that are not preauthorized will be declined.
The True Link app is a simple way for both the funders and the spenders to maintain the prepaid account. The account and the app are very easy to set up. The transfer of money onto the prepaid card is completed in the app by the guardian or fiduciary, and then the user sees what money has been added to their card by looking at the app, as well as what the funds can be used for, and then the card can simply be used independently by the user.
This card has been a wonderful resource for our reviewer, as she has a special needs trust and often needs to rely on funds for emergency repairs to her wheelchair or speaking device, or even her van. Funds can be approved and transferred fast, so her world does not slow down. This resource is also an amazing tool for people with dementia and people in recovery from addiction, as caregivers and sponsors can help manage the user’s money by blocking specific stores and/or transactions. It really helps them maintain independence in their life while also sustaining a high level of security.
True Link is a nonprofit. There are minimal fees involved in setting up an account and each time you transfer money. There are also fees for using the card at ATMs. In addition, there’s a max balance on the card of $20,000 and a max of $5,000 per transaction. This may seem high, but also know that if the user unknowingly uses the card for purposes that are not for their benefit, these limits keep the loss from crushing their financial stability.
True Link is available for free in the iTunes and Google Play stores, and on the web. For more information on this app and others like it visit bridgingapps.org.
Josh Anderson:
Listeners, wearables and artificial intelligence are some buzzwords in the world of technology, both consumer and assistive. Our guest today is Cornel from .lumen. He’s here to tell us how these things can come together to assist individuals with navigating their surroundings and a bunch more things in a whole new way. Cornel, welcome to the show.
Cornel Amariei:
Pleasure being here. Thank you for inviting us.
Josh Anderson:
Yeah. I am really excited to get talking about the technology. But before we do that, could you tell our listeners a little bit about yourself?
Cornel Amariei:
Definitely. So I’m Cornel. I come from Eastern Europe, specifically from Romania. I was actually born into a family where every family member except myself has a disability. That means my parents, my sister, my nephew, my cousins, my grandparents. I come from a family where I’m the only person who doesn’t have a disability, and after a career in the automotive field, being an engineer and scientist by training, I decided to build things that help. So basically that’s me, in a one minute description.
Josh Anderson:
Perfect. Perfect. Well, we’ll probably hear a little bit more about your passion and everything as we get kind of talking about the tech. So I guess kind of just big picture, let’s start off with what is .lumen?
Cornel Amariei:
Sure. Well, .lumen is a startup which we started four years ago, like four years and one month ago. But let me take you to the problem why we founded this company and what problem we’re solving. The problem which we’re solving is the lack of scalability of advanced mobility solutions for the visually impaired. So let me describe it a bit.
Right now you have over 300 million people with visual impairment worldwide, and that number is growing, growing fast. But if you check the solutions which are the most used, you still come back to the white cane and to the guide dog. Now the guide dog, it’s a great solution in what it does, its features and how it can help. That’s great. But just to point out a couple of problems with it, last year we spent half a billion dollars and we only trained 2,000 guide dogs, so the real cost of a guide dog is through the roof. It is not something which is scalable.
In the entire world we only have 28,000 guide dogs, so that’s sort of where we are and where we’re going. That is the problem which we are solving at .lumen. And we are solving it with what we call the .lumen glasses. Basically what they do, they’re a self-driving car. Everything that a self-driving car does our glasses also do, but on the pedestrian side. But rather than driving wheels like a car, we actually guide people.
But to give you I think a better analogy, let’s think of the guide dog. A guide dog works by pulling your hand. So the guide dog pulls your hand, steers you around obstacles, keeps you on the sidewalk, stops at crossings. Sometimes it helps you cross. It works indoors and outdoors, et cetera.
Our glasses, they do exactly the same, but they don’t pull your hand, obviously. They’re not on your hand, they’re actually on your head, so it’s actually a headset. They actually slightly pull your head using vibrations, so you actually feel them on your head, how they pull you towards the path you should go on, steering you around obstacles, keeping you on the sidewalk, stopping at crossings, helping you cross, helping you navigate curbs or steps or stairs, all of these situations. That’s a very, very quick presentation of what they do.
Josh Anderson:
Nice, and that’s amazing. I mean they can actually sense or know that there are obstacles in my path, like you said, crosswalks, all those different things? They’re able to, I guess, identify and interpret that information and then pass that on to the user that quickly?
Cornel Amariei:
The system is much faster than a human can react to it. So basically the system understands everything, not only where the ground is and obstacles below or above the ground, because obviously a pothole, for example, is an obstacle below the ground. That’s the first level of understanding, but the most complex part about it is that it understands semantically. What do I mean by this? If you think in a [inaudible 00:11:09] geometrical world, the sidewalk and the roads, they are the same. They don’t necessarily have obstacles on them, but clearly you shouldn’t be walking on the road, you should be walking on the sidewalk.
The system knows the difference. It knows how to determine that. It knows how to determine where there’s ice, where there are bodies of water, where there are mud patches, where there’s terrain, where there are rocks. It can also determine all of the surfaces on which you shouldn’t necessarily walk, and it does that with what we believe is the most advanced AI in the world specifically for this task.
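To make that semantic layer a bit more concrete, here is a minimal sketch of the traversability idea Cornel describes: a segmentation model labels every pixel, and only semantically walkable classes are passed on to the guidance planner. The class names, IDs, and helper function below are hypothetical stand-ins for illustration, not .lumen's actual pipeline.

```python
# Minimal sketch of semantic traversability filtering. Class names/IDs and the
# segmentation output are hypothetical; a real system would use a trained model.
import numpy as np

CLASSES = {0: "sidewalk", 1: "road", 2: "ice", 3: "water", 4: "mud", 5: "rocks"}
WALKABLE = {"sidewalk"}  # geometry alone can't separate sidewalk from road; semantics can

def traversability_mask(label_map: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels the guidance planner may route over."""
    walkable_ids = [i for i, name in CLASSES.items() if name in WALKABLE]
    return np.isin(label_map, walkable_ids)

# Toy 4x4 "image": left half sidewalk, right half road -> only the left is walkable.
labels = np.array([[0, 0, 1, 1]] * 4)
print(traversability_mask(labels))
```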
Josh Anderson:
That’s awesome. Can you kind of describe the device, the glasses themselves? Can you kind of describe what they’re like or… It’s always hard on a podcast to say what they look like, but just kind of how they feel and how they work?
Cornel Amariei:
For sure. So it’s a headset. It actually sits on your forehead region, so most of the interface is actually on your forehead region. It has a component in the front and a component in the back. The part in the front, that’s where all the cameras are. The system has six cameras. It’s where the buttons are. You can control the various features with the buttons. It’s where some microphones are, so it can listen to you. You can talk with the device and it answers. It has some directional speakers on the side so you can actually hear what the device is saying without covering your ears.
And in the back you have the super-computing unit and the batteries. It’s a small unit which does all the computing, plus the batteries and everything. So this is very similar to an AR or VR headset, but it doesn’t sit on the eye region. It sits above, and it’s similar in weight, in some cases even lower, and similar in comfort and everything.
So the quickest analogy is a small AR headset. I think that’s the closest thing to how it feels. [inaudible 00:12:52] that in the forehead region there is a specific system which we have, we call it a haptic feedback array. It’s a set of haptic actuators which actually vibrate your head towards the direction where you have to go, so that’s the primary difference which separates it from other headsets.
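As a rough illustration of the haptic feedback array idea, here is a minimal sketch of how a row of forehead actuators could "pull" the wearer toward a target direction by vibrating hardest on the side to turn toward. The actuator count, their angles, and the falloff curve are assumptions made for this example, not .lumen's patented design.

```python
# Minimal sketch of directional haptic guidance. The actuator layout and the
# Gaussian falloff are illustrative assumptions, not the real device's mapping.
import math

ACTUATOR_ANGLES = [math.radians(a) for a in (-60, -30, 0, 30, 60)]  # left .. right

def actuator_intensities(heading_error_rad: float, width: float = math.radians(45)):
    """Vibrate hardest on the actuator closest to the direction the user should turn."""
    intensities = []
    for angle in ACTUATOR_ANGLES:
        diff = heading_error_rad - angle
        intensities.append(round(math.exp(-(diff / width) ** 2), 2))  # falls off with angular distance
    return intensities

# Path is 30 degrees to the user's right -> the right-side actuators dominate.
print(actuator_intensities(math.radians(30)))
```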
Josh Anderson:
Nice. And I know some things might still be in development, but how long does it usually take a user to get used to it? I know sometimes with haptics it can take a little bit to get used to feeling which side of the head it’s vibrating on and everything. How long does it usually take a user to kind of get used to figuring out how to use it?
Cornel Amariei:
We have multiple videos on the internet, which at this point have over 40 million views, of people who in the first minute or minute and a half, as they first put the device on, were able to navigate through real life complex situations. I mean people at CES, for example, and CES, you know, is a busy show, and there were visually impaired individuals who were able to do it in a minute and a half. It’s immensely intuitive. It’s one of the great wins which we have created. The way you train and the way you understand the device is incredibly fast, so it’s a minute, a minute and a half. But obviously for more advanced features it takes an hour or two.
Josh Anderson:
Still that’s definitely not bad. I think it takes a lot longer than that to get used to a guide dog, which, as you said kind of at the beginning, are not easy to get or find or train or really get into the folks’ hands that need them. I got to ask you, because I know you kind of came from the automotive world, is that where the idea came to kind of move the self-driving car technology into a, for lack of a better term, self-driving person kind of tech?
Cornel Amariei:
Partially. The experience which we had helps, because we’re a team of 50 engineers and scientists working on this, and a bunch of us come from the automotive field. But particularly for myself, I actually don’t necessarily think that’s where the idea came from. The idea I think came from the problem. So when I found out about the lack of guide dogs worldwide, that’s when we quickly got to the idea, and it just happened that automotive was the sector in which I was. It definitely helps from a technical perspective. There are a lot of similarities, but nobody tried to do a self-driving car on the head, absolutely nobody, until we did it.
One fundamental thing which I think was at the core of the idea, and we were the first in the world to do it: since the ’50s people have been trying to represent visual information in a non-visual way. What do I mean by this? If you have an obstacle, you feel a vibration or you hear a sound and things like this. Unfortunately, the world is so complex that while it can work in a lab environment, in a controlled environment, the moment you go to the real world it doesn’t work anymore. You cannot represent more than one, maybe in some cases two, obstacles or situations in a non-visual way. It’s immensely complicated.
But what we knew we wanted to do very differently from day one is that we didn’t want to represent the world, because it is not scalable, because it doesn’t work in real life situations. We looked at the guide dog. The guide dog is not barking when you have five obstacles, it’s not barking five times. It’s just guiding you around them. That’s what we do. That’s what we fundamentally do differently, and that’s what we have patents for, and we’re the only ones in the world who can do it.
Josh Anderson:
Nice. That is awesome. I know we talked a lot about kind of navigation and how the glasses are able to kind of help with that. What else either can the glasses do now, or kind of what are your plans for the future that they might be able to also do to help individuals with visual impairments?
Cornel Amariei:
So the first set of features is replicating everything that the guide dog does, but preferably better, of course with the exception of the emotional support. We’re still building technology. We cannot provide the same level of companionship and emotional support that a service dog can do, so that one, unfortunately, we cannot scale, but everything else we can.
So if you think of a guide dog, obviously there is a set of commands which is publicly known, which varies a bit region to region, guide dog school to guide dog school, but it pretty much comes down to the same couple of things. It can guide you in general. So you don’t specifically tell it a destination, you just say, “Let’s go,” and it will keep you straight, steer you around obstacles and stop when you need to take a decision. Or in some situations you can ask the guide dog to take you to a particular object or place, so, “Take me to an empty seat,” “Take me to work,” if they have been there multiple times and they remember the route. So those we call the guide me and the take me functionalities. Those are the two functionalities which the glasses also have.
But here’s something interesting about the take me functionality. You don’t need to constrain yourself to asking the glasses to take you to places you’ve been before. The glasses can take you anywhere. And what do I mean by this? You can take your smartphone, go on your favorite navigation app like Google Maps or Apple Maps, et cetera, you can find the best coffee shop around, or the best restaurant, or a particular address that you’re searching for, you can find it, you can press share, you can share it with the glasses, and the glasses will take you there, navigating all kinds of obstacles, keeping you on the sidewalk, helping you cross crossings. This is already what a fully autonomous self-driving car does, and the glasses do it.
But even more, and this is something we’re now experimenting with, it will very soon, even in [inaudible 00:18:19] it already does, it will very soon be able to help you navigate public transport, so you can actually go as far as you want. You will be able [inaudible 00:18:28]. You’ll actually go to airport level. So you will be able…. If you’re in one corner of the city and you want to take a tram or a bus or anything, it will take you to the bus station, it will help you get on the right bus, it will keep you on the bus, it will get you off at the right station, and then it will guide you to the address. That is something which was never achieved before in any kind of navigation system. I don’t mean just visually impaired technology, I mean urban navigation in general. It was never done before until we did it.
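To picture the "take me" hand-off Cornel describes, here is a minimal sketch of how a destination shared from a phone's maps app could become a navigation goal for a wearable. The shared-link format, the coordinate parsing, and the guide_to stub are illustrative assumptions, not .lumen's actual interface.

```python
# Minimal sketch of turning a shared maps destination into a guidance goal.
# The link format and the guide_to() stub are assumptions for illustration only.
import re
from typing import Optional, Tuple

def parse_shared_destination(shared_text: str) -> Optional[Tuple[float, float]]:
    """Pull a latitude/longitude pair out of shared text, if one is present."""
    match = re.search(r"(-?\d+\.\d+),\s*(-?\d+\.\d+)", shared_text)
    return (float(match.group(1)), float(match.group(2))) if match else None

def guide_to(destination: Tuple[float, float]) -> None:
    """Stub: a real system would plan a sidewalk-level route and stream haptic cues."""
    print(f"Guiding user toward {destination} along walkable surfaces...")

shared = "Best coffee shop nearby: https://maps.example.com/?q=44.4268,26.1025"
destination = parse_shared_destination(shared)
if destination:
    guide_to(destination)
```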
Josh Anderson:
Oh, that’s awesome. That does just open up a whole new level of transportation, and just… You know, that’s such a barrier for so many individuals, so that will be absolutely great. Well, I know you’ve been working on this for quite a while, I believe you said like four years. Can you maybe tell me a story about someone’s experience kind of testing it, or maybe something that you found along the way that was pretty interesting, or maybe kind of blew your mind or someone’s mind that was blown when testing and using the glasses?
Cornel Amariei:
Definitely, and it happens so many times. We’re so lucky to travel and to give demonstrations and to see people, how they react to them. One of the moments which I really clearly remember was January 2021. We founded the company during the pandemic, actually during the curfew we founded the company, a very bad moment. And in January 2021 the company was a few months old. We created the first haptic navigation system and we had this virtual path on which we invited visually impaired individuals to go on. So it was in a very large hall. There were absolutely no obstacles. We wanted it to be as scientific an experiment as possible. So we had this virtual path going around virtual obstacles, so you couldn’t use any other senses to detect the obstacles. It was just a large hall and the device, and with the first person we tested, it worked in a minute.
We never expected it to be so intuitive. We never expected it to work so well. It was all assumptions until then. I mean we tested it and we thought, yeah, I think it’s quite good. But to see visually impaired individuals ranging from 18 years old up to I think 75 or 80 years old… 80 was in the first week of testing… We invited 40 individuals, or 30, and to see all of them being able in a minute, or worst case two minutes, to navigate with the precision of centimeters on virtual paths, that was absolutely incredible. That was also the month in which we finalized the patent and we actually proved everything in the patent working. And I remember one of the individuals, a very well-known visually impaired individual, the vice-president of the National Blind Association here in Romania, he said a quote which I’ll forever remember. He said, “We believed something like this would exist, but not during our lifetime.” And that was a quote which really, really was incredible.
It took us months to realize that it was the first time ever that anything but a guide dog or a human companion actually guided with that precision. It was absolutely the first time, and that was pretty amazing. But that was a technological demonstrator. We were proving the technology. But then came proving the product. I mean a few months ago, at the beginning of the day, I remember I was at the United Nations in Vienna, at the headquarters, and there were some blind individuals testing. There was a particular woman, I think she was from the US, together with her teenage son. The son didn’t have any kind of visual impairment, and the mother, she was trained for two minutes and then she began walking. She was a guide dog user, and she began walking and the son literally began crying. It was 15 seconds until he began crying, and I never expected such a reaction. I actually didn’t believe those reactions were real when I saw them for other products or anything else, but that was the moment… When I saw that I was like, okay, we’re doing something right here.
Josh Anderson:
Most definitely. Most definitely. Well, Cornel, what kind of phase are you in as far as development and kind of getting it out to folks?
Cornel Amariei:
So basically we’re putting the first version in a limited run on the market at the end of this year. Now, what has to be understood, first of all, is that this has like 70% of the performance of a Tesla Autopilot. This is amazing, the level of technology. I know we did some research and we had an external audit on this, and basically we asked some automotive companies how long it would take them to develop something like this, and their answer was like seven or eight years and 50 million euros, about $50 million. We did it much faster and much cheaper, but with the same amount of work. It took us 120 years of work. If I add up all the hours of everybody who has worked on this project, it took us 120 years to get here. It’s pretty amazing. We tested this technology with over 300 visually impaired individuals from over 30 countries, and over 2,000 blindfolded individuals.
The second part is not really important, but it’s a fun fact, because we actually counted them. So this is something which was validated all over the world, from the West Coast of the US to Japan, in Africa, in the Nordics, in Europe, in the southern part of Europe. We have tested this all over the world. At the end of this year we’re doing a limited run for the European market. At the beginning of next year we’re releasing internationally in the European market as a medical device. Very important, it is a medical device. This is not a consumer product. It is something which is guaranteed to the criteria of medical devices. I mean, looking at other assistive technologies, we simply cannot accept the lack of regulation, and lack of performance, and lack of safety which they bring.
We do not accept… We do not condone something like this. We build things by medical device regulation, and that’s much harder to do than just a simple consumer product. So it’s a clear differentiator, which we really wanted to do. In the US our timeline is roughly the end of next year, so at the end of 2025 we want to be in the US. Obviously we are certifying according to EU regulation now, and we’re beginning very, very soon the process for FDA certification in the States. So it’s a little bit of work for the United States, but right now Europe is on a few months’ horizon.
Josh Anderson:
Nice. That is awesome. Cornel, we’re kind of running out of time. If our listeners would want to find out more, what’s a great way for them to do that?
Cornel Amariei:
I think there are two options. One of them is obviously the website, www.dotlumen.com, .lumen spelled out in letters, or social media, .lumen. We are present on LinkedIn, on X, formerly Twitter, on Facebook, Instagram, and we are also on TikTok, which was actually not expected. On TikTok apparently we just reached like 20 million views, which was really, really not expected.
Josh Anderson:
Awesome. Awesome. Well, Cornel, thank you so much for coming on today. Such a… Just a really, really great medical device to really be able to help folks. I know there hasn’t been anything out there except for the dog for so long in order to really help folks with navigation, and not just a portion of navigation, but really the entire, I don’t know, kind of scope of being able to travel. I think of how, for so many individuals, it seems like maybe they had something that could help on the city streets, but I feel like this could help anywhere, anywhere that you possibly wanted to go, and possibly wanted to walk, and possibly wanted to be, and just really great. We love the work that you’re doing and can’t wait to see it come to fruition. So thank you so much for coming on the show today and telling us all about it.
Cornel Amariei:
Thank you so much. Always a pleasure to discuss what we’re building.
Josh Anderson:
Do you have a question about assistive technology? Do you have a suggestion for someone we should interview on Assistive Technology Update? If so, call our listener line at 317-721-7124, send us an email at tech@eastersealscrossroads.org, or shoot us a note on Twitter at INDATA Project. Our captions and transcripts for the show are sponsored by the Indiana Telephone Relay Access Corporation, or Intrac. You can find out more about Intrac at relayindiana.com.
A special thanks to Nikol Prieto for scheduling our amazing guests and making a mess of my schedule. Today’s show was produced, edited, hosted, and fraught over by yours truly. The opinions expressed by our guests are their own and may or may not reflect those of the INDATA Project, Easterseals Crossroads, our supporting partners, or this host. This was your Assistive Technology Update. I’m Josh Anderson with the INDATA Project at Easterseals Crossroads in beautiful Indianapolis, Indiana. We look forward to seeing you next time. Bye-Bye.