Full Transcript
Jeff:
Welcome to Blind Abilities, I’m Jeff Thompson. Today we’re going to be talking about Envision—that’s the company that makes the Envision AI app, one of the best, most accurate OCR readers out there, available on Android or iOS. Check them out at letsenvision.com. Now we’re going to be talking about the Envision glasses, the AI-powered glasses with the Envision app built right into them, and as they say, let your glasses speak to you. And if you haven’t tried out the Envision AI app, you get a 14-day trial. So, if you have iOS or Android, download it, kick it around, and you’ll be surprised, you’ll be impressed, you’ll probably move this app right to your front page and use it as your one-stop shop for OCR and much, much more. And that very same app is going to be on the Envision glasses—these are glasses that load the app right onto them, and with some gestures and voice commands coming out, you’re going to have full functionality of the Envision AI app right from the glasses. And also with me in the studio is Scott, a.k.a. Kayaker from Boston, how’re you doing, Scott?
Scott:
I’m doing very well, thanks, Jeff.
Jeff:
Well, we’ve got Ilse in the studio from Envision, thank you so much for taking the time to come onto Blind Abilities.
Ilse:
Thank you so much, Jeff, it’s great to be here.
Jeff:
With all this shutdown, lockdown going on, and stuff like that, everyone was probably expecting to see you at the conventions and everything, well, they get to hear you here on Blind Abilities, so welcome.
Ilse:
Thank you so much, it’s so good to be here, and we wish we could have been to all the conventions and everything, but unfortunately we couldn’t.
Jeff:
Oh, that would have been so cool, so many people were going to want to try on them glasses, the Envision glass.
Ilse:
Yeah, it’s something totally new.
Jeff:
Yeah, it is, it’s exciting! I was talking to Scott earlier about how the Envision AI app would be on the glasses, and that’s like, wow, and he said, “I want to be there,” so I’m glad you could make it, Scott.
Scott:
Thanks, it’s exciting to hear about the new technology and all the cool stuff with the Enterprise Google Glass that’s definitely for us. I’m dying to hear more, so do you want to jump in and tell us what the new product is?
Ilse:
You guys know the Envision app, you know that you can, you know, read texts on the go with the app, you can read documents, multiple documents, you can scan scenes, barcodes, colors, anything like that. Well, what we’ve done with the Envision glasses is that we have put the app on the glasses, so everything you can do with your phone you can now also do with Google Glass, which is way easier because then you don’t have to hold your phone in your hand, you just have the camera on your head, and you scan everything around you and it’ll be read out to you like the app does. An extra feature we also added to the Envision glasses is the video call, so if for whatever reason you need someone to help you out somewhere, whether it’s at the store, a roadblock, or at the train station, whatnot, you can call someone who has the Envision Ally app, and they will see what you see. So basically, they can help you out wherever you are. So, yeah, that’s Envision on the glasses, and then our big goal with it is that it’s going to be really a platform also for other apps to be on there, so that you don’t just have Envision on there and Envision’s features, but also, you know, let’s say Be My Eyes or things like that, so we’re discussing with a couple of companies now, and yeah, it would be great to have some more very high-value apps on there that people can really use.
Jeff:
Oh yeah, that’s exciting, especially when you’re talking about being able to connect up to someone- say you have someone in your family or something that does have the app downloaded, and you call them-
Ilse:
Mm-hm.
Jeff:
-that’s direct information that you can use instantaneously.
Ilse:
Yeah, definitely. It’s really nice for most people to have a person that they know right there. Yeah, we’re excited about this feature.
Jeff:
Oh yeah.
Scott:
So that feature depends on your cell phone or Android or an iOS device to make the cellular call, correct?
Ilse:
Everything is on the glasses-
Scott:
Okay.
Ilse:
-for the user, the person that gets the calls or that receives the call will have the Envision app.
Scott:
Ah, so do you need cellular service with the glasses itself, or is that- how does the data work for the glasses?
Ilse:
Yes, it’s connected to WiFi or if you’re outside, your hotspot. So, that way you have your internet to make the video call.
Scott:
Gotcha. So the recommended solution if you’re at that train station and there’s no wireless there, you’d set your cell phone to be the hotspot and have your glasses just connect to that?
Ilse:
Yes-
Scott:
Gotcha.
Ilse:
Have your hotspot ready when you walk out the door, and you straight away have that WiFi that you might need on the road.
Scott:
Gotcha.
Jeff:
That’s great, because hotspot, I just became familiar with that. I’ve got a trailer up in the middle of nowhere, and I don’t have WiFi, so I hooked up the hotspot. For $10.00 extra a month I have a hotspot and 15 gigabytes through Verizon, and they’re not sponsoring this, by no means, but it works great. I mean, hotspotting is really cool, I’m new to it so I’m excited to talk about it, but we’re talking about the glasses and Envision AI right now. But, yeah, that’s great that you’re able to do that.
Ilse:
Yeah, we actually hear a lot, especially since the corona situation, that the internet is not always great anymore. So, like, a lot of us use a hotspot just to work every now and then if the internet fails on us for a bit.
Jeff:
Mm-hm. So, people can now go to the website, and that’s letsenvision.com?
Ilse:
Yeah, that’s letsenvision.com/glasses for the glasses, and the preorder is still open. We don’t have that many glasses left, but yeah, it’s still open.
Jeff:
Still open, and you expect that preorders will stop and they’ll be available, what, in August?
Ilse:
Yes, so we’re shipping the first glasses that were ordered in the preorder campaign end of August, and after that it’s going to be- yeah, regular sales.
Jeff:
So, what’s it like being at Envision AI when you’re coming up with all these innovations? I mean, you went from the app to adding feature after feature, I really like the batch mode that I just noticed, where multiple documents all at once-
Ilse:
Yeah.
Jeff:
-and now, moving to the glasses and entertaining other companies to come onto the glasses, this is huge.
Ilse:
Yeah, I think what I like most about Envision is we do everything with our users. So, all the features that you see that we’re adding, that we’re optimizing, that we’re thinking about, are all on the app because people asked us to, you know, try to get these features on the app. Same for the glasses, it’s going really well now, we have 15 glasses out for testing, and we get a lot of feedback, and we immediately try to incorporate that into the glasses, make everything better from feature requests to sounds and things like that, it’s all super important, so we really are very close to our users and potential users, just so we can make the product for them. I think that’s why everything is going so fast all the time.
Jeff:
It seems like it’s always been community driven.
Ilse:
Yeah, yeah, definitely, especially because we’re sighted, so if you want to make this product you can’t just make a product that you think will work, that doesn’t make any sense.
Jeff:
Mm-hm.
Ilse:
We work together with users, our team, our ambassadors, to really optimize and make sure we’re there. And also, the glasses- there were really a lot of questions that actually came in on wearables, you know, something that made it a little bit easier to walk around with a cane or a guide dog and still see everything, so having one hand still free, so that’s why the guys really looked into glasses, and they decided that the glass that we have now, the Google Glass, was really the best one for what we needed out of the hardware for our software, but also the partnership with Google is very nice, so that’s also really important for us.
Jeff:
The form factor is the Google Glass, but the software inside, is it a version of the Android that’s in there?
Ilse:
It’s a version of Android, but we have built our own layer.
Jeff:
Mm. That’s cool, that would be Scott’s department.
Scott:
Yeah, the Android stuff. But as far as the hardware itself, when the Google Glass first came out, there was some resistance to the style and the people- has the Enterprise Glass been modified since the initial release or can you describe a little bit about the hardware, and, you know, what camera capabilities are actually on it so people have a sense of what they would look like walking around with these things on?
Ilse:
Yeah, so it’s very delicate actually, so if you think about it, on your right ear you have a little bit of a bigger, how do you call that, a glass-
Scott:
The stem, I think, or frame? On the side of your-
Ilse:
Right on the side, yeah-
Scott:
Yeah.
Ilse:
It’s a little thicker and has a touchpad; then if you move closer towards your eye there is the camera, and a little teeny-tiny glass screen, and the other part of the glasses all the way through to the other ear is a titanium band, very thin. So it’s actually a very elegant pair of glasses that you have on your face. What we also have is the Smith Optics frame, that is a real, proper frame, but yeah, you can choose either one.
Scott:
Okay, so you can get it with sort of protective sunglasses or tinted lenses, is that correct, if-
Ilse:
Yeah, that’s correct, or for people with low vision, that-
Scott:
Right.
Ilse:
Yes, definitely.
Scott:
Gotcha.
Jeff:
Well, that’s sweet.
Scott:
Very cool. And the camera itself is just one camera, so you’re not getting any stereo footage for accurate depth perception, just a single camera, correct?
Ilse:
One camera, yeah, that’s correct.
Scott:
Alright.
Jeff:
But that would be great if I just wanted to find something out instantaneously, I could just call up through the Google Glass, and they would have access to the camera through the app that my friend would have on their phone, and boom, there we go.
Scott:
I’m always looking towards the future, and what features can be coming up the pipeline with all the developer conferences and all the advances in machine learning, and vision recognition, stuff like that. I said this, you know, three years ago, this is exactly the type of technology that you’re going to be seeing in products for us, that’ll help us navigate scenes, you know, going far beyond OCR. So, is everything that’s in the existing Android and iOS app supported in the glass, and how are the controls done, then, if I’m not using my touchscreen, can you describe a little bit about how one might execute all the functionalities of the glass?
Ilse:
Yes, definitely. So all the features that are on the apps are also going to be on the glasses, plus the video calling. It basically works like this: there’s a touchpad on the side of your head, and you tap, you scroll, like you would do on your phone, and it’s all with TalkBack. What you can also do is use voice commands.
Jeff:
Oh, sweet.
Scott:
Gotcha.
Ilse:
That makes it a little easier and fully hands-free.
Scott:
And is there a speaker on the glass itself, and- or are you relying on a bluetooth or an earpiece support as well?
Ilse:
There is a speaker on the glass itself-
Scott:
Okay.
Ilse:
-we do also have bluetooth, so you could put your own earpods in the future, but there is a speaker. And you can, you know, lower the volume-
Scott:
Right.
Ilse:
-and things like that. So, yeah, if you’re somewhere that you don’t want everyone to hear, just lower the volume.
Jeff:
Wow, this has all the- they’re checking off all the boxes there, Scott.
Scott:
Yep, it certainly is.
Ilse:
Is there anything that you haven’t heard that you’re like, “I need this on the Envision glasses,” just please let me know!
Scott:
Oh, I have a laundry list of stuff, you know, the machine learning aspect is the most- I call it the last 10 yards problem, you know, you take your Uber to your hotel or your restaurant, and what I’m waiting for is to see some machine learning AI where I can scan around and say “Door at 2:00,” because you’re looking for the door to get into the place that you just got dropped off at, or, you know, elevator, or stairs, you know, the common navigational things. I want to see some MLs for that dropped in so we can get distance and bearing types of technologies, you know, just by pointing around. That’s what I’m waiting for, and I’m starting to see some of that, and- I don’t know, can you talk about any, like, future ideas that you guys have been thinking about for the app or is that confidential at this point?
Ilse:
A little bit! A little bit, so we’re definitely thinking about these types of things, yeah, it would really make life so much easier. It’s not in the nearby future, since first, you know, basis of the glasses and a couple of features that we’re working on for the apps, but this is definitely something that is in the back of our heads, yeah, that would be amazing.
Scott:
Yeah. Do you want to describe a little bit about your scene recognition software and what types of descriptions you’ll get out of that, because I think it’s been, I’m not sure everyone’s- I hope everyone’s familiar with their phone app by now, if they’re not it would be nice to hear the types of things you can see in the scene, and maybe, you know, what directions you’re going with your technology in that regard?
Ilse:
Yeah, definitely, so what we have now is a scene description where you hear whatever you’re seeing from people sitting in a room or a lot of buildings, things like that. What you can also recognize is people that you taught Envision to remember, so if you scan Envision around and you would come across me, it would tell you “That’s Ilse.” So it basically detects every single object right now, but where we’re going in the future is you can also teach Envision objects. So right now it only teaches faces and it recognizes a lot, it would be very interesting because, you know, one chair is not the same as the chair next door, so it would be interesting to teach Envision your own products, your own stuff in your house and everything, so that’s something that’s coming in the future.
Scott:
Very cool, very cool. Is there one feature that you are most proud of, that you feel like really sets this app apart from everyone else at this point?
Ilse:
I feel that we have so many different languages. We have also the non-Latin based languages—most people use the app for reading, because it’s very accurate, and I’m just happy that we can reach so many people in so many different countries with this, and help them. That’s what we’re super proud of. In terms of, you know, what the company is doing it’s really the glasses, we’re the first, we’re really excited, we just came on as a partner on the Google website, which is like “Ah, that’s nice!” You know, it’s- yeah, we’re getting there, it’s just very, very interesting to be in assistive tech in these times with AI, with these amazing products that a couple of years ago looked so futuristic, but now we can really see it help people a lot, and I think that’s something to be proud of.
Scott:
It certainly is.
Jeff:
Absolutely, absolutely. I’m just excited because Envision AI, you guys came out with it, and Seeing AI was out there, but, you know, when you talk about the accuracy of the OCR that’s what sold me on it, you know, I like accuracy, I’m a guy that likes to get things done, I just want it to work and I don’t want to be amazed by it, I just need the information that I’m going for, so that’s why I’m with Envision AI on that note. When I heard you guys were taking it to a whole nother level with the glass, the Envision glass, I was like “Wow, this is really cool,” to see that, you know, Google partnership and everything like that, it’s like this is something to watch or something to get involved in and get invested in too.
Ilse:
Yeah, thank you for that. It’s really something new, it’s very exciting. And we just hope we can help as many people as possible.
Jeff:
Mm-hm.
Ilse:
Make lives easier.
Jeff:
Yeah. You guys did a thing for Accessibility Day where you had the community come on, and that was pretty good, I enjoyed that.
Ilse:
Yeah, thank you. It was super interesting to talk to all these innovators in the field and learn from that and learn how they think about accessibility. Yeah, and we also had our ambassadors on, who are, you know, invested in the company and in the community from another side, but, yeah, it was really nice to get these people together and talk about what really matters.
Jeff:
Great. Scott, you got any more questions?
Scott:
Sure, I’ve got tons. The hardware that you’re using—how do you see that holding up in the future? You know, we’re always updating our phones every couple of years, often enough, but do you think the current Google Glass is going to be ready to support the new features that you’re willing to add for the foreseeable future or have you thought about an upgrade program or any hardware limitations on the glass as it is now?
Ilse:
Right now no hardware limitations, that’s also why we chose these glasses.
Scott:
Yeah.
Ilse:
Yeah, they really can do everything we wanted, which is the app and many more things.
Scott:
Mm-hm.
Ilse:
And of course, in terms of updates and things we will still be Envision, like we are on the app, we have updates as much as we can, yeah, we will stay on it.
Scott:
And with any wearable, my question is always let’s talk about battery life. How long can we expect to be using this at a time and what’s the recharge system like?
Ilse:
That is actually a very important one. The battery lasts between six and eight hours—which is pretty long, it’s like a workday.
Scott:
Yeah.
Ilse:
And they charge with fast charge, like 50% is done in half an hour, and if you want to go up to 100%-
Jeff:
Oh, wow.
Ilse:
-it will take about two hours to charge. Yeah, the battery life is really pretty long. Of course, if you are, you know, video calling someone the entire day it will drain a little faster.
Scott:
Mm-hm.
Ilse:
But yeah, normal use, like if you’re at work or whatever you’re doing as a student or reading books or whatnot, then yeah, six to eight hours.
Scott:
And is it possible, if you wanted to have an external battery pack, to actually, you know, plug it in with a tether while you’re wearing it, or does that not work?
Ilse:
That is actually possible, yes, definitely.
Scott:
Okay.
Jeff:
Wow. Don’t stop now, Scott, you’re doing good.
Scott:
I just want to keep talking about the future, the future and all these new features, but, yeah. What you have now is pretty remarkable as it is, and I think if you wanted to sum it up, you basically see it as a reading device first, and these other features are sort of evolving based on feedback from your users, that’s what it sounds like.
Ilse:
Yes, definitely, and what we always make sure is that, you know, our software is device agnostic, so if there is any other type of hardware that does- is good enough-
Scott:
Right.
Ilse:
-for what we need and what our users need then we’re always happy to test it out.
Scott:
Very cool, very cool.
Jeff:
That’s great.
Ilse:
Yeah, that was like a sneak peek of the future.
Scott:
Yeah, because I think there’s a definite advantage to have, sort of the head-mounted camera type of technology because it’s always- it tracks where you’re looking and it always gives you a reference that’s accurate whereas if you’re holding your phone, you know, a slight five degree angle can really throw you off, but if it’s mounted on your head you’re going to be- you’re going to have very accurate information of where you’re looking at and get a sense of your surroundings.
Jeff:
Mm-hm.
Scott:
I always appreciate those types of things, but it’s- there aren’t a lot of options like that, so it’s nice to see you guys do it.
Ilse:
There’s some things that you can build yourself, or buy that you can use with your phone, but today we did another round of testing and it’s very interesting to see that people just automatically turn the right way to point to a piece of paper. It just goes naturally, and it’s really nice, it just makes everything a lot faster.
Scott:
Yeah, yeah.
Jeff:
Oh, plus, being hands-free, I mean, that’s totally everything-
Scott:
It’s huge.
Jeff:
-some people, yeah, you have a cane, you have a dog, you have a soda, a beverage, a coffee-
Ilse:
Yeah.
Jeff:
-yeah.
Scott:
Yeah, we’re just about always down one arm, you know, you do everything one-armed.
Jeff:
And then you want to do something, you don’t want to have to be gesturing, or, you know, moving or all that stuff, you just want to, like you said, voice commands, bingo, there you go.
Scott:
Let me jump on voice commands, so what is the trigger word or is it easy to activate, is it, you know, touch to activate, or is it, you know, always listening, how- that’s a good question to ask, I guess, as well.
Ilse:
It’s going to be easy to activate. So, we’re still building this, so we haven’t made the commands yet, but that’s also something that we’re working on with testing and everything right now, to see what the best way to activate would be for you guys.
Scott:
“Hey, Ilse.”
Jeff:
There you go!
Ilse:
I don’t know, but that would be interesting!
Scott:
No bias there at all. “Hey, Ilse.” Hey, that works! Because there’s a lot of Ilses in the world out there that would like that.
Ilse:
Yeah!
Scott:
I think there’s lots in the Netherlands, I bet.
Ilse:
Maybe I should, you know, we should make a poll and see what people think.
Jeff:
Well, Scott suggested it, I’ll second it, so there you go.
Scott:
That’s my contribution.
Ilse:
Thank you, I appreciate-
Jeff:
You tell Karthik we said that. How to invoke it, that’s really cool. I like the voice command, I mean, touching is okay, but voice commands, once you’re in a groove is just natural and, you know, like you said, naturally looking at something, it’s just going to keep going smooth. You can even turn the pages and keep on talking to the glasses, you know. That’s cool.
Ilse:
Yeah.
Jeff:
Wow. To pick up a magazine again.
Ilse:
Yes. That’s something, by the way, we’re working on hard right now for the apps as well as the glasses, the column detection.
Scott:
Yeah.
Jeff:
Oh, yeah!
Ilse:
So important, so we’re working on it very hard now because so many requests came in and, yeah, that would be, you know, a magazine, a book, whatnot, that you usually have to, you know, cover half of the book but now you don’t have to do that anymore.
Jeff:
Well, even for students, if they can detect that something’s in the margin, or in the column, you know, that would be kind of like a column, that it would isolate that and read that separately, that’s cool. I’m glad you guys are really looking into all the obstacles that we come across. I mean, OCR seems simple and easy, but there’s so many graphic designers out there that don’t want to just make a page look plain, they fancy it up a little bit, and if you guys can detect all that that’s better for the reader.
Ilse:
We try, we really try.
Jeff:
Great.
Scott:
Do you guys do some of your own machine learning models or are you leveraging third party models, or are you doing anything specific for the blind yourself in-house?
Ilse:
A little bit of both. So we have some models that we’re using, and some that we build ourselves.
Scott:
Okay.
Ilse:
I am not that technical, so-
Scott:
Sorry, that’s okay!
Ilse:
[unintelligible] that. But yes.
Scott:
Very cool.
Jeff:
Great. Well, it’s 90 degrees here in Minnesota so it’s summertime, so what kind of summer specials have you guys been talking about?
Ilse:
You know, actually, yesterday I found myself sending out an email saying “Guys, we have a summer sale,” even though it’s raining here. I felt really sad, but we have to, you know, make it feel like summer, so we’re like “Summer sale,” so we have a summer sale for our app which goes up to a 50% discount on the annual plan, 20% on lifetime, and 30% on monthly. The glasses are on the website right now for 1,899, that’s euros-
Jeff:
Mm-hm.
Ilse:
Yeah, so that’s the summer for us!
Jeff:
It’s going to be a good summer, all the things that’ll be coming out in the future, I know I just updated- I think yesterday I got an update to the Envision AI app, I was, like, excited, I was like “Wow, we’re going to be talking to Ilse tomorrow and I just got the update!” Anything else, Scott?
Scott:
I think I’ve covered just about everything I could think of, unless there’s anything else you want to tell us?
Ilse:
Really, I would say go to letsenvision.com/glasses if you want any more information, and we are always open to any feedback and any questions, whatever you want to know, please just, you know, send us an email, and we’ll be here for you.
Jeff:
Great, we’ll do that.
Scott:
Can’t ask for anything more than that.
Jeff:
No. You’ve always been- always listening to the community. Tell Karthik and Karthik hi, and Ilse, it was really nice to meet you and thank you so much for taking the time to come onto Blind Abilities.
Ilse:
Thank you so much for your time, it was great to be with you both. Thank you.
Jeff:
Alright.
[Music] [Transition noise] -When we share
-What we see
-Through each other’s eyes…
[Multiple voices overlapping, in unison, to form a single sentence]
…We can then begin to bridge the gap between the limited expectations, and the realities of Blind Abilities.
Jeff:
For more podcasts with the blindness perspective, check us out on the web at www.blindabilities.com, on Twitter @BlindAbilities. Download our app from the app store Blind Abilities, that’s two words, or send us an email at info@blindabilities.com. Thanks for listening.
Contact Your State Services
If you reside in Minnesota, and you would like to know more about Transition Services from State Services contact Transition Coordinator Sheila Koenig by email or contact her via phone at 651-539-2361.
Contact:
You can follow us on Twitter @BlindAbilities
On the web at www.BlindAbilities.com
Send us an email
Get the Free Blind Abilities App on the App Store and Google Play Store.
Check out the Blind Abilities Community on Facebook, the Blind Abilities Page, and the Career Resources for the Blind and Visually Impaired group.