Episode Page
Podcast Summary:
Jesse Anderson, Assistive Technology Specialist at State Services for the Blind of Minnesota, discusses the latest advancements in assistive technology for high school and college students. He highlights tools like Aira’s AccessAI, BeMyAI, JAWS, and smart glasses, emphasizing the importance of learning both mobile and computer-based solutions to improve independence and efficiency. Jesse also explores emerging technologies, such as multi-line Braille displays and AI-powered apps, while stressing the value of early adoption and flexibility in using various tools. His advice encourages students to embrace technology as a way to overcome challenges and succeed in academics, work, and everyday life.
To find out more about the services provided at State Services for the Blind, and what they can do for you, contact Shane DeSantis at shane.desantis@state.mn.us or call Shane at 651-385-5205.
Full Transcript:
{Music}
Jesse Anderson:
* When I meet with students for transition technology assessments, high school and college.
* Maybe I’ll use a combination of large print and audio or whatever it happens to be, but just being kind of flexible so you get the materials that you can read.
* You’re wearing smart glasses and you don’t have to hold your phone in your hand. There’s something to be said for the hands-free stuff.
* It’s never been a better time to be a blind person or a visually impaired person because there’s just so much out there.
Jeff Thompson: Welcome to Blind Abilities. I’m Jeff Thompson. Today in the studio, we have Jesse Anderson and he’s an assistive technology specialist at State Services for the Blind in Minnesota. Jesse, welcome back to Blind Abilities.
Jesse Anderson: Thank you. Thank you for having me back.
Jeff Thompson: Glad to have you back, Jesse. Let’s talk about technology. Technology that’s available to students who are in high school preparing to go to college and the workplace. And let’s also take a look at technology that’s coming in 2025. A look into the future. I’m sure you’re on top of all that as well.
Jesse Anderson: Yeah, it’s actually a pretty interesting time in both mainstream and assistive technology these days. This time of year, you have a lot of your operating system updates and assistive technology updates, starting with Apple and iOS 18. 18.2 just came out as they start rolling out the AI features they’re building into iOS 18.
This time of year, we also have your yearly major updates for your Vispero Freedom Scientific products like Fusion, ZoomText and JAWS. All of your regular software is being updated and things like that. I think in the next year or so is where it’s going to get a lot more interesting too.
Jeff Thompson: Yeah, AI has really made itself known. It’s really sliding into a lot of different software and devices. What are your thoughts on that?
Jesse Anderson: I actually find it pretty fascinating. I’ve been following it for quite a while. I think what really kicked it off for a lot of blind and visually impaired users was BeMyEyes last year, with the feature of BeMyEyes called BeMyAI.
I remember hearing that podcast from AppleVis, I think it was in mid-summer last year, and then it came out last fall. It’s really opened up a lot more doors for people because you can get such a detailed description, whether you’re taking a picture with your camera, getting a description of an existing image from your photo library that you took at some point, or, now that it’s out on Windows as well, doing the same on your computer.
When I meet with students for transition technology assessments, I will show them the app and I’ll be like, okay, if you are in let’s say a science class and you have a diagram of the human body with all the organs and stuff or if you have a geography class where you’re looking at maps, maybe you’re in a class where you’re using charts or graphs and if you have trouble getting at some of that graphical information.
BeMyEyes has worked pretty well for describing that information and then of course you have the ability to follow up and ask further questions on it. So once you get a handle of what it is you’re looking at, you can dig in and get further detail. I think that’s been really helpful.
Jeff Thompson: Yeah, it was a neat relationship to see BeMyEyes and OpenAI team up using ChatGPT. Then of course, Aira came out with Access AI, and with these models working on interactions with people with low vision or who are blind, every interaction from someone in the blindness community who uses these apps is only going to help make these models more useful for people in the blindness community.
Jesse Anderson: Yeah, there’s a lot of places that are starting to implement AI. You have your mainstream platforms like your Google Gemini and stuff, but then like you said, the Aira app is implementing AI into their app as well.
And then you have some really neat uses of AI. Like if you want to get help for something, you have the FS Companion website, and it’s also a feature of JAWS at the moment, and it’ll be coming to ZoomText and Fusion sometime next year. So if you are new to JAWS, or you forgot how to do this one thing that you’ve wanted to do, you don’t remember what the keyboard command is or you don’t remember the process for doing something.
You can go to this FS Companion feature of JAWS, or you can do it right on the FS Companion website, and it works pretty well. You can ask, you know, oh, how do I read tables in Microsoft Word, or how do I use Convenient OCR, or whatever feature it is you want. It’s actually a really nice way to quickly get help without having to dig through all of the help systems.
Jeff Thompson: Yeah, I really like that. You’re in the middle of homework, you want to find something out, boom, bang, you’re done and you move on.
Jesse Anderson: Again, you can ask follow up questions too.
Jeff Thompson: Yeah, that’s a big game changer being able to do follow up questions. Another thing that I heard coming down the line here is the Mantis, which is a refreshable Braille device. It’s not a full computer. It’s kind of a mid-range, affordable refreshable Braille device and now comes with text to speech.
Jesse Anderson: Yeah, I haven’t had a chance to try it yet, but yeah, the Mantis has some text to speech features, which is really cool because I remember when the hardware came out, I was like, well, there are volume controls and stuff on here. It would be cool if we had the text to speech that the Brailliant line does, and it looks like it’s coming out now. I definitely want to give that a shot soon.
Jeff Thompson: Yeah, that’s a great upgrade for the Mantis. Now the Mantis and the Chameleon, they’re kind of a hybrid somewhere stuck between refreshable Braille device and a note taker. Can you tell us where they fall into place here?
Jesse Anderson: Yeah, here’s what I kind of call them. You have your traditional Braille displays, which really are just refreshable Braille displays, but over the past several years there are what I kind of call smart Braille displays. They’re not a full-fledged note taker like your BrailleSense or your BrailleNote Touch, but they have some basic built-in features like a book reader, so you can read BARD or Bookshare Braille books. Some of them will have, you know, a built-in really basic note-taking thing so you can jot down a note, or your four-function calculator. And then, of course, they work as a Braille display with your computer or your phone. So it’s kind of an in-between.
Jeff Thompson: And it’s portable. You can use that in school for sure.
Jesse Anderson: Yeah. And the thing I like about the Mantis, especially for somebody who’s using a computer quite a bit, personally, I think it’s just easier that you have a QWERTY keyboard like you would have on a regular laptop. And so when you’re doing a lot of, you know, computer keyboard commands, you can do them just as you would on a regular QWERTY keyboard. Whereas it’s still possible to do those commands with a Perkins-style keyboard, but you’re holding, you know, a lot more keys at a time to perform those tasks. Because, you know, if you do Control P, well, you’ve got to do the four keys of the P and then you have to do the Control, and it gets to be a little tricky. Not enough fingers sometimes.
Jeff Thompson: Keyboard acrobats.
Jesse Anderson: Pretty much. Yeah.
Jeff Thompson: Jesse, can you tell us about the refreshable Braille devices that are coming out that have multiple lines of refreshable text? And you can also do graphic tactile imagery as well.
Jesse Anderson: Yeah, actually, a couple weeks ago I just got to see the Monarch display, and it’s a really cool device. You can do text, you can do graphics, and you can kind of do them together. It actually refreshes pretty quickly. The Braille feels good. It’s got your Perkins keyboard. It’s got some extra navigation keys. And what’s neat is that you can zoom in and out on different things. Let’s say you have a map of the United States; you could, you know, be looking at the entire United States. And then let’s say, oh, I want to explore. If the source map that you’re looking at has that level of detail, you could say, I’m going to touch and hold on where Minnesota is and then hit a button, and it’ll zoom into that part of the map. And so then instead of getting just the outline of Minnesota, you might zoom in a little bit and you might get major cities, or maybe they’ll have rivers, or whatever they choose to put on that map.
You can zoom in and out to get that amount of detail. I think what’s really interesting about these multiple-line displays is that right now the Monarch works as a standalone device, and you have to download or feed it images directly. But screen readers are going to have to start figuring out a way to present information for these multiple-line displays, because up till now, with your screen reader, everything has been sequential.
Whether you’re doing it via speech or Braille, you’re doing it one line at a time or one item at a time. Whereas now if you have, let’s say, a multi-line display, and especially I could see this being helpful if you’re doing math, or if you’re proofreading a document, yeah, you can have the line where your cursor is and you can have an indicator where your cursor is in your document, but it might be a lot easier to read things in context or get a better idea about that equation you’re trying to solve, because you can read multiple lines of it at a time.
So screen readers are going to have to figure out a way to present that if somebody has a multi-line display, and I’m really curious to see how that is going to pan out, especially since they’re working on a new Braille format. I mean, instead of your traditional BRL or BRF, now you’re going to have this upcoming format. I don’t remember what the extension is going to be, but you’ll be able to have Braille text and Braille graphics in the same file. So I know they’re working on it, and I think they’re trying to get something out by sometime next year, at least an early version of it.
I’m curious to see how that goes. So theoretically, you would be able to download a book share book and especially if it’s got a textbook, maybe it’s got some charts in there. Maybe it’s got some graphics in there. So you would have the text and the graphics all in one file.
Jeff Thompson: It’d be interesting to see if it could give you that bird’s eye view of a graph. Yeah, that overall context, like you were saying. I think that would be awesome. And I think that’s where AI is going to have to come in a little bit there too.
Jesse Anderson: Well, that’s the thing, I think there’s a lot of coming together with a lot of these different technologies and AI. And even think about how a screen reader works. Again, if you are reading everything sequentially, and maybe the website that you’re looking at isn’t designed super well, then instead of having to arrow down a whole bunch, or hope that there are headings, or do a search to see if there’s a certain keyword on the web page, if your screen reader has some AI in it, you just go, hey, is there a contact phone number listed on this page anywhere?
Or, you know, you ask something specific about a page, and that could kind of speed up some efficiency. And then you combine that with wearables. There are just a lot of things happening right now, I think.
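As a rough illustration of that kind of page lookup, here is a minimal Python sketch that grabs a page’s visible text and scans it for a phone-number pattern instead of arrowing through the page line by line. It only shows the general idea; it is not how JAWS or any screen reader implements its AI features, and the example URL is a placeholder.

```python
# Minimal sketch: pull the visible text of a page and scan it for a
# US-style phone-number pattern. Illustration only, not any screen
# reader's actual AI feature.
import re
import requests
from bs4 import BeautifulSoup

def find_contact_number(url: str) -> str | None:
    """Return the first US-style phone number found in the page text, if any."""
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    match = re.search(r"\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}", text)
    return match.group(0) if match else None

if __name__ == "__main__":
    # Hypothetical URL used only for demonstration.
    number = find_contact_number("https://example.com/contact")
    print(number or "No phone number found on this page.")
```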
Jeff Thompson: Yeah. A lot of new platforms, a lot of new updates, like you said, about the software updates coming out, a lot to stay on top of. Now, Jesse, you mentioned efficiency earlier, meaning that you can get work done sooner, faster and all that.
And when people get these tools, they don’t want to get them one week before school starts, because you have to be able to use these tools efficiently when class begins.
Jesse Anderson: Yeah, there is so much happening all the time, and there’s so much changing all the time. And it’s hard even for me, who follows this stuff all the time.
It’s hard to keep track of it all. But if you’re a transition student, one of the best things that you can do, especially if you’re thinking about work or college, is just practice using some of this technology, whether it’s something on your iPhone or iPad. And I highly, highly encourage people to start using a computer. Most of your businesses, most of your office jobs are going to be in a Windows environment. So it’s a really good idea to learn Windows, whether you’re using magnification like Windows Magnifier or ZoomText, or whether you’re using a screen reader like Fusion or JAWS or NVDA. Learn some of those things while you have that extra support, with your vision teachers and SSB to help back you up, so that you don’t have to scramble and, as you said, learn the technology that you happen to need for that class a week before you get started.
Jeff Thompson: And that’s what you do for Pre-ETS, for students who are wanting to learn about technology. You introduce them to it and find out what works, what may work for them.
Jesse Anderson: Yes, absolutely. That’s one of my key roles at SSB; there are a few of us technology specialists at SSB. I work with transition-age students, high school and college. A lot of what I do is meet with students and do a technology assessment. And in that meeting, what we’ll do is look at what types of technology somebody is currently using.
What are their current needs? Are they still in high school? Maybe they graduated and are starting college in the fall. Are they already in college and are having trouble with a certain type of class or a certain type of thing? And we will look at anything from computer software to mobile apps, CCTVs, Braille displays, anything in those areas and figure out what it is they need to be successful.
Jeff Thompson: SSB just had a panel of students, and one of the students said that they wish they would have started earlier, because they realized that they couldn’t use an iPad in college and do the work that they wanted to get done. So they quickly had to learn more of the JAWS commands on a laptop. So that just supports what you were just saying.
Jesse Anderson: Yeah, absolutely. I had planned on sitting in on that College 101 live, but I did listen to the archive, and there was a lot of really, really great discussion. I really liked what a lot of the students said there.
And you know, yeah, you’re right. I love my iPad. I love my iPhone, but there are just some things that are much, much easier, if not just really necessary to do on a computer.
Jeff Thompson: What really impressed me about the College 101 panelists is they were saying things that were actually like, that’s absolutely right. You know, I was like cheering them on in a sense. The future’s in good hands. That’s all I got to say.
Jesse Anderson: Yeah, like you, I was really impressed with what the students were saying in that panel, and I would definitely encourage other students to give that episode a listen, because the whole time I was listening, somebody would say something and I’d go, yes, good, thank you for saying that, that’s great. Yeah, I thought they did an excellent job.
Jeff Thompson: I think it was also stressed, like you said, learn a laptop, a computer. Like you said, out there in the corporate world, it’s going to be Microsoft-based stuff, unless you go into a specific field like audio, if you’re going to be a recording engineer. But to get through college, at least your first few years, if not most of college, you’re going to need a PC-type laptop. Yeah, and it’s easier to get scripts for it if you need them at a job, that type of thing.
Jesse Anderson: Yeah. And beyond the PC, beyond a laptop, there is still a place for mobile. I would be lost without my iPhone nowadays. Back when I was in college, which yes, sadly was like 20 years ago, I’m old, when I would go to a class, the syllabus would say one thing, but I can’t tell you how many times the instructor would be like, yeah, I know the syllabus says this, but this thing just came out and I want you guys to read this article and we need to discuss it.
We need to cover this type of article or news story or whatever. And so there would be a lot of times where you’d be sprung with some sort of document or something that you had to read, and maybe it wasn’t in an accessible format. Even if it was in digital format, it was one of those pesky image PDF files that a lot of things don’t get along with. But knowing that JAWS has that Convenient OCR feature, or knowing an app like Seeing AI or BeMyEyes, or even something like Voice Dream Reader with Voice Dream Scanner, which I use all the time, I can say, oh, here’s this garbage PDF file that I can’t read, let me just share it into my Voice Dream app and capture all the text out of it. And without having to wait for someone to read it to me, or, you know, send it over to disability services and have them convert it into an accessible format for me, I can just take care of that myself. And then I don’t fall behind; I keep right up with everybody else. So knowing those types of tools is so helpful.
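For readers curious what that OCR step looks like under the hood, here is a minimal Python sketch that rasterizes an image-only PDF and runs each page through Tesseract. It assumes the poppler and Tesseract tools are installed, and it only illustrates the technique; it is not how Voice Dream Scanner or JAWS Convenient OCR are actually implemented, and "handout.pdf" is a hypothetical file name.

```python
# Minimal sketch of OCR on an image-only PDF: render each page to an
# image, then extract text with Tesseract. Assumes poppler and the
# Tesseract engine are installed. Illustration only.
from pdf2image import convert_from_path
import pytesseract

def extract_text_from_image_pdf(pdf_path: str) -> str:
    """OCR every page of a scanned/image-only PDF and return the combined text."""
    pages = convert_from_path(pdf_path, dpi=300)
    return "\n\n".join(pytesseract.image_to_string(page) for page in pages)

if __name__ == "__main__":
    # "handout.pdf" stands in for an inaccessible class handout.
    print(extract_text_from_image_pdf("handout.pdf"))
```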
Jeff Thompson: And that Voice Dream Reader app will open up PDFs too.
Jesse Anderson: Yes, it absolutely will. I’ve gotten a few inaccessible documents through work, I’ve gotten some through some of my other stuff that I do on the side, and I’ve even looked at some digital magazines. A lot of them are just purely pictures of text; they’re graphics. And so I just run them through Voice Dream Scanner and boom, I’m reading the magazine.
Jeff Thompson: Yeah, it’s nice to have those options. There’s this term that’s been used a lot: having a lot of tools in your toolbox. It doesn’t mean you have to have 8,000 gadgets.
It’s just that you have to know a way to access information. It might be just an app which doesn’t take up any more space in your pocket. It’s in your phone. And so I’m glad you brought up the phone when we’re talking about all these devices. The phone seems to be that thing that’s always nearby that can do a lot of stuff for you.
Jesse Anderson: Absolutely. Like I said, I use my phone as a handheld CCTV or video magnifier every single day. I use it for communication, I use it for research, and if I need to quick get a ride somewhere, I’ll just grab a Lyft or an Uber or something, or check a bus schedule. There’s so many things that you can do.
But I think the other thing, in addition to having a lot of tools in your toolbox, is to be a little flexible too. Because in an ideal world, let’s say that I’m a heavy Braille reader, or let’s say that I like everything in a Word document format, or, you know, whatever specific thing it is. Sometimes you’re not going to be able to get everything exactly in the perfect format. For me, it’s a matter of getting the job done, and as long as it’s, you know, pretty accessible and readable, like, okay, it’s a PDF, it’s not my favorite type of document, but as long as I can read it, I’ll work with it. Or maybe I want something in hard copy Braille but I have a refreshable Braille display, because hard copy does take time to produce.
And so rather than falling behind, kind of using the tools you have. And maybe I might listen to something in audio to speed things along. Or maybe I’ll use a combination of large print and audio or whatever it happens to be. But just being kind of flexible so you get the materials that you can read.
Jeff Thompson: Yeah, it all comes down to crunch time. Yeah, you just want to get it. You want to get it done. I want it to work because I got stuff to do.
Jesse Anderson: I hear you.
Jeff Thompson: And now even the Photos app and the camera on an iPhone, if you touch a photo, it’ll read you the text. Yeah, there’s so many different ways that you can do it. So when I say having a toolbox full of tools, it’s just knowing that that’ll work. You might not use it for three weeks, but all of a sudden, in that one moment, it’s like, ah, there. And you’re saved for that moment, waiting for the next barricade.
Jesse Anderson: Yep, exactly. Like for me, for instance, I use a combination of screen reader, like NVDA, and magnification. And so sometimes I’ll run into an instance where I’ll be looking at a document or a website and something just, oh, it’s just not reading it correctly. I’m like, okay, well, if my conventional weapons won’t work, I have all these other backup tools in my toolbox. So maybe I’ll fire up JAWS and give that a shot, or maybe, okay, it’s not reading this one little thing.
Let me see if I can turn on the magnifier to click that one pesky thing that I can’t get access to. When I meet with students, let’s say they’re a screen reader-only user, even if they know one screen reader better than the others, I still recommend that people know a couple of screen readers, because I talk to a lot of screen reader users who will switch depending on the task, depending on how well it reads something; they’ll switch between different screen readers throughout the day. The same goes for low vision users. Let’s say you use ZoomText. If ZoomText is being weird and there’s a driver misbehaving, I can just fall back and fire up Windows Magnifier at the last minute. Or maybe I go to a job site, or I go to a lab, and they don’t have ZoomText on the computer and we’re working in a classroom, we’re working in a group, and I can just say, oh, well, let me just fire up Windows Magnifier, because that’s on every Windows machine. So it’s just knowing those little backup strategies and tools.
Jeff Thompson: Even with browsers sometimes, a Safari browser doesn’t work, but a Chrome browser will. Been there. So just having that option, it just works sometimes.
Jesse Anderson: Absolutely.
Jeff Thompson: As we hit 2025, can you believe it’s here? Hmm, a quarter of a century already into the 2000s. What do you see coming down the line for the future of technology?
Jesse Anderson: It’s really going to be fascinating to see where this Braille stuff goes, because we have more accessible Braille graphics with multi-line displays and such. I’m hoping that the price does come down because, ooowhee, that stuff is very, very expensive at the moment. But that’s a fascinating thing to watch, to see how screen readers take advantage of that. And I think AI is going to become even more powerful and capable of doing different things. Like one of the apps I’ve been testing for the past few months, they used to call it the Envision Assistant, but they call it Ally now. Right now they’re at a point where you can converse with it in real time, like you and I are talking right now. And while I’ve been testing this, I’ve had really interesting conversations that have had practical use, like when I’m trying to research a topic. I might say, hey, these are some songs I can play on the drums.
I like rock songs and different things, so what other songs could you recommend that are at this skill level? Or the other night I was talking to a friend of mine, and we were trying to think of a couple of these old movies that we watched growing up, and I could not remember them. You know, you think of little bits of them, but you don’t remember the names. And I was just conversing with Ally, telling it these vague clips of the movies, and I think we figured out about three or four of the ones we were trying to remember.
So that’s interesting. We already have this live conversation stuff back and forth, but I think where it’s going to become even better is if we add two key things to the AI that we’re using for assistive tech. The first is the ability to remember things between sessions.
So if I’m learning some sort of a task, and like, I’ll just use like learning a new language, for instance, we start out and we learn three to five words of a new language. Right now, when I stop that AI session, it forgets. And next time I start, it has to start from scratch.
Jeff Thompson: Groundhog Day.
Jesse Anderson: Yeah, but if it starts to remember, then it can say, oh, well, here, you learned these five words, and maybe it does a quick little overview or refresher with you, and then you can move on and build. The other thing that I think is going to be helpful is live screen monitoring or camera monitoring. And by that, I mean, if I’m using BeMyEyes, BeMyAI on my computer, I could have it monitor whatever I’m doing on the screen, you know, in addition to me talking to it. And let’s say that I am doing a visual task, or let’s say that I’m playing a game and it’s watching me play the game. At any time, because it’s seeing what’s on my screen, I can say, what does that door say up in front of me?
Or what’s my health? Or, you know, that’s just an entertainment use, but I’m sure you could come up with all kinds of work or educational uses for that too. Same thing with the phone camera, you know, if you are looking at something through your camera, right now you have to take a picture, get a description, take a picture, get a description. And it’s pretty fast. But if you get to the point where instead of taking a picture, it’s like live video, and then you’re just conversing with your AI back and forth on top of that, it’s not going to be maybe like exactly real time, but it’s going to be really close.
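To make the screen-monitoring idea a little more concrete, here is a rough Python sketch that grabs a screenshot and asks a vision-capable chat model a question about it. The model name and the OpenAI-style API usage are assumptions made only for illustration; products like BeMyAI do not necessarily work this way, and a real "live" monitor would stream frames continuously rather than grab a single shot.

```python
# Rough sketch: capture the screen and ask a vision-capable model about it.
# Model name and API shape are assumptions for illustration only.
import base64
import mss
import mss.tools
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask_about_screen(question: str) -> str:
    with mss.mss() as sct:
        shot = sct.grab(sct.monitors[1])           # full primary monitor
        png = mss.tools.to_png(shot.rgb, shot.size)
    image_b64 = base64.b64encode(png).decode("ascii")
    response = client.chat.completions.create(
        model="gpt-4o-mini",                       # assumed model name
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(ask_about_screen("What does the door in front of me say?"))
```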
Jeff Thompson: Like live audio description, interactive.
Jesse Anderson: Yeah, that would be one use for it. Or like, let’s say I’m watching a film that has subtitles, I can’t read them. But if I pointed my camera at the TV, and I said, Hey, I’m going to watch this movie, it’s in French, whenever the subtitles appear on the TV, read them. So you could just kick back and watch your movie. And then anytime subtitles come up, your AI will just read them to you. That could be a good use for it.
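Here is a small Python sketch of that subtitle idea: watch a camera feed, OCR the bottom band of each frame where subtitles usually sit, and speak any new text aloud. It assumes OpenCV, Tesseract, and pyttsx3 are available; a real AI wearable would handle this far more robustly, and this is only an illustration of the concept Jesse describes.

```python
# Sketch: OCR the bottom band of a camera feed and speak new text aloud.
# Assumes Tesseract is installed. Illustration only.
import cv2
import pytesseract
import pyttsx3

def read_subtitles(camera_index: int = 0) -> None:
    cap = cv2.VideoCapture(camera_index)
    engine = pyttsx3.init()
    last_spoken = ""
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            h = frame.shape[0]
            band = frame[int(h * 0.75):, :]        # bottom quarter of the frame
            text = pytesseract.image_to_string(band).strip()
            if text and text != last_spoken:
                engine.say(text)
                engine.runAndWait()
                last_spoken = text
    finally:
        cap.release()

if __name__ == "__main__":
    read_subtitles()
```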
Jeff Thompson: Just listening to you made me think that if I had glasses on, like the Meta Ray-Ban glasses, which are now available for about $300.
Jesse Anderson: Yep, I have a pair.
Jeff Thompson: You can record video with the cameras, and you can hook them up to BeMyAI, the BeMyEyes, or Aira, and use them with people talking to you and stuff. But I’m thinking, when you’re using them with AI in the future, we’re talking the future here, if I walked a route once and I was to repeat that route the next time, you see what I’m getting at, like breadcrumbing with video.
Jesse Anderson: Sure.
Jeff Thompson: That’ll be interesting because humans are repetitious.
Jesse Anderson: Yeah, and I can’t remember the name of the app, but there’s already an app that’ll use your smartphone camera to do that. I can’t remember what it’s called, though. But yeah, that’s another great use. Or let’s say that you’re walking a route, and maybe you looked at a map ahead of time, and you know that, okay, I’m going to go from my apartment to a restaurant, but I know that on the way I pass a bank. And so I could be walking and it’d be like, hey, let me know when I get close to the bank. So there’s all kinds of creative stuff you could do.
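A tiny Python sketch of that "tell me when I’m near the bank" idea: compare the current GPS fix against a saved waypoint and announce when it is within range. The coordinates, radius, and print-based announcement are placeholders; a real app would pull live GPS and use the platform’s speech output.

```python
# Sketch: alert when the current position is within range of a saved waypoint.
# Coordinates and the announcement are placeholders for illustration.
from geopy.distance import geodesic

BANK = (44.9778, -93.2650)       # hypothetical saved waypoint (lat, lon)
ALERT_RADIUS_METERS = 50

def check_waypoint(current_fix: tuple[float, float]) -> bool:
    """Return True (and announce) when the fix is within the alert radius."""
    distance = geodesic(current_fix, BANK).meters
    if distance <= ALERT_RADIUS_METERS:
        print(f"You're about {int(distance)} meters from the bank.")
        return True
    return False

if __name__ == "__main__":
    check_waypoint((44.9780, -93.2652))  # simulated GPS reading near the waypoint
```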
Jeff Thompson: Like BenVision, they’ve got that Speakaboo app. And that one is supposed to be able to scan the room, and you’re like, I’m looking for my shoes, and within seconds it’ll say they’re over by the chair by the door. It’s in beta right now, who knows what it’s going to be like, but it’s neat to see this stuff coming down the line, especially with Glidance coming out with the Glide. That’s going to be interesting. A lot of people are excited about it, just like the Meta Ray-Bans.
Jesse Anderson: I’ve had my Ray-Bans since late May this year. These are not glasses that are designed specifically for blind people. They are mainstream glasses that have AI in them. There are limitations, and so if you go into them with the right expectations, they’re great. Yes, they have limitations compared to apps like BeMyEyes or Seeing AI or Aira.
But for certain use cases, they’re great. And then the Glidance stuff, I am genuinely curious about. A coworker of mine actually was out in New York when they had one of their early prototypes, and he got to see one.
And I think he’s going to see it next year too. I think a lot of times you get these devices and they’re almost like a solution looking for a problem. Maybe they don’t talk to the audience that they’re trying to serve.
Like, you know, did you talk to blind people before you developed this and figure out what they need? And I think that the Glidance people, the developers, are working really well with the community, figuring out what it is they need, what kind of barriers there are, and taking some really unique approaches. I’ve heard a few different podcasts about them. I’ve watched a couple of video demonstrations of them. They actually had a session at the recent Sight Tech Global conference last week, which is this virtual tech conference where, by the way, I was a guest on a panel on accessible gaming.
They had a really unique session on a few different navigational aid devices, anywhere from the cane to Glidance to, I don’t remember the name of it, but it was not a full vest thing; it was a thing that you kind of put around your neck, and it hangs down sort of like a vest, probably down to about heart level. You know, your cane detects things on the ground, but the Glidance camera and this other wearable try to detect things that are higher up. So if you’re walking on that sidewalk and people don’t trim their branches, which is a problem I run into near my apartment quite a bit.
Jeff Thompson: We had the Glidance developer on and he was talking about it. And I like that they’re going through so many iterations. He himself is visually impaired and has worked at Google and other companies, but I’ve never seen an approach like the one Glidance is taking, where they’re bringing every iteration to the public, working it out, going back, working it out, going back, and they’re looking at fall of 2025.
Jesse Anderson: Yeah.
Jeff Thompson: And this is a device that you kind of push along, but it’ll veer, so you follow the stick, just like you would a guide dog in a sense, when it recognizes an obstacle or something to go around, or it’ll put on the brakes. So they’re really working it up; they’ve jacked up the wheel sizes, they’ve made some really big differences.
It’s only X amount of pounds, and it folds up a little bit so you can get it into a taxi or lift it up when you’re going to go upstairs. It seems like they’re really doing their homework with the community before they put it out there.
Jesse Anderson: Yeah, I’m genuinely intrigued about the Glidance device too, because like you said, it’s just really fascinating. When I watched that video clip, they had a guy going from his home to a coffee shop or something, and they talked about how, when it’s approaching stairs, as you said, it applies the brake, or it just weaves around a tree or a pole. The device doesn’t pull you; you just walk like you were pushing a shopping cart or whatever, and it will go at whatever speed you want it to. It just so happens to use its sensors and cameras and such so that it will try to turn you or steer you around things.
Whereas with a cane, you might be clanking into a lot of things, where with this one, it’s almost like an automated guide dog where you just kind of go around obstacles you didn’t even know were there.
Jeff Thompson: It’s something new and I’m sure it’s going to go through ups and downs a little bit here. I like that it’s been around for over a year now with trials and working it out. It seems realistic.
Jesse Anderson: I have to admit I am genuinely curious. I hope they have a super off-road model for, you know, the winters of Minnesota. I’m curious to see how people are going to clamber over the snowbanks with that bad boy, you know?
Jeff Thompson: Well, that’s like the All-Terrain cane for Minnesota. I have the All-Terrain cane and I have the Urban XPlorer, and those are two new canes that came out. But in Minnesota, hey, that makes sense. Sometimes on those sidewalks you come to a place where the street has been plowed out from the sidewalk and you need something to brace yourself, so I think there’s a purpose there.
Jesse Anderson: Oh yeah. I could tell you horror stories, man. The corner right by my apartment for years, it’s just been terrible.
Jeff Thompson: So it’s like there are devices out there, but not everything’s for everyone. But every once in a while, it’s nice to have something that you can turn to and use.
Jesse Anderson: I’m not even sure how much I would use something like Glidance, but there’s just the whole curiosity factor. Oh my, it’s just kind of a cool thing. I kind of want one just to play with it, you know?
Jeff Thompson: I’m thinking the mall, the Mall of America.
Jesse Anderson: Sure.
Jeff Thompson: Because you’re in a wide open space and there are people around, just to see what it’d be like. And there are escalators, so you could just hold it by your side and go up the escalator.
Jesse Anderson: Sure.
Jeff Thompson: Go around again.
Jesse Anderson: Yeah. And then you’re wearing your smart glasses and you just look up, using the AI with your smart glasses to say, hey, I need to go to this store. And then you’re just looking at the marquees above each store, and it tells you what store you’re passing, like that.
I don’t know. There’s some pretty cool stuff, you know, and you don’t have to hold your phone in your hand. There’s something to be said for the hands free stuff.
Jeff Thompson: I got to give it to the Meta Ray Ban glasses, hands free.
Jesse Anderson: Yeah. That’s really cool.
Jeff Thompson: That’s everything. You got a cart, you got a cane, you got nothing else.
Jesse Anderson: Yeah.
Jeff Thompson: You can’t hold your phone up. It’s hands free. You can do everything you want to do and you get the sound. It’s awesome. I’ve listened to books on it. I’ve listened to podcasts on it.
Jesse Anderson: The Ray Bans do work as a regular Bluetooth headset. They actually sound a lot better than I thought they would both for, like you said, audio books or voiceover or even on a zoom call or a phone call.
They actually sound really good. One of my first really early uses of the glasses was when I had just gotten them, and I was walking up to the front of my apartment complex to go get a package from the mail room. I was walking back, so I had my keys in one hand and this big box in the other hand. And right by our end of the apartment, we have some mailboxes, and there’s a big TV next to them that shows little ads or little promos and stuff. But there was this giant wall of text on the TV.
I’m like, oh, I wonder if that’s some important announcement. So not even thinking about it, just like any sighted person would do, I was walking down the hallway, paused for a second, looked at the TV and said, hey, what’s on this TV? And it gave me this summary.
And apparently, it was like this big paragraph. It was like a tribute to one of the founders of my apartment complex because they had recently passed away. And so it was kind of a little tribute. I’m like, Oh, okay, well, that’s cool. But just being able to pause for one second, look at something, and get that quick description almost instantly, you know, without taking the phone out of my pocket with my hands full. Pretty cool.
Jeff Thompson: Yeah, a friend of mine just went on a trip to Houston. They brought her along and hooked her up with Aira, and they go through WhatsApp, so it takes one minute longer. She was able to actually navigate and see what’s around her, to hear what she’s passing and have someone talk about it. Obviously, she has a cane and luggage.
So she wouldn’t be able to use something else. So these are a really good thing. But like you said, go in with the realization that it is the first iteration, it is designed for the general public, and we’re just using stuff that works for us. And some of the stuff does work for us really well.
Jesse Anderson: What I often tell people when I meet with them for a tech assessment is, I’m going to show you a lot of cool stuff today. Some of it you may need, some of it you may find helpful, some of it you might not. And what I will say is that technology doesn’t solve all of your problems, but it can often be helpful. Like you said, whether you go into the smart glasses like the Ray-Bans, or it’s, you know, oh, there’s this new app that came out and it’s going to solve all the world’s problems. No, maybe not so much. But if you go into it with the right expectations and find a good use case for it, if you find a use for it, great.
Jeff Thompson: Like you mentioned the WeWALK; to me that’s like a smartphone duct-taped to a cane.
Jesse Anderson: Kinda.
Jeff Thompson: I’m joking in a sense. But
Jesse Anderson: Yeah, yeah, yeah.
Jeff Thompson: But it’s a smart cane. And yet some of us have gone through canes; all of a sudden you get it stuck in a door jamb and boom, you need a new cane.
Jesse Anderson: Oh, I’ve snapped a few.
Jeff Thompson: Yeah. And so it’s about where you want to put your dollars. I think, like we said in the beginning here, your PC, your JAWS, the stuff that you’re going to use every day, all that stuff.
I’d really like to have more things, and some of this other stuff is great and you can take off into it. But there is no tool out there that is going to replace everything and whisk you away into some fairy tale world of best blind ninja.
Jesse Anderson: No, no, there’s not one magic device or software.
Jeff Thompson: I really liked what you said about the iPhone, bringing that up. It’s something we didn’t even have on our list, in a sense. But like your quick notes, like you said, the teacher says, oh, by the way, I thought this would be interesting, pages 33 to 74 for tomorrow. Who knew that? And boom, you can take quick notes, you’re scheduling it. All that stuff can be done right on a smartphone.
Jesse Anderson: What always happened to me was you would be in class and they would hand out this sheet of paper and they’d say, read this article and then we’re going to come back as a group and discuss it. Well, that would happen to me all the time. So I would either have to have somebody read it to me or I would just have to wait until the discussion would begin.
And then I would just have to kind of join in as I pieced things together. But now I can use my phone as a magnifier if I want to read it, or I can snap it with Seeing AI or Voice Dream Scanner, and I can just instantly have that read, just like everybody else. I don’t have to worry about whether the instructor had it in an accessible format.
Even if it’s not natively accessible, I can just use my little iPhone and boom, it’s read almost instantaneously. So it’s stuff like that, and those are the kinds of things that I tell people in assessments, too.
Jeff Thompson: That’s great.
Jesse Anderson: Just for practical examples.
Jeff Thompson: Well, Jesse Anderson, thank you so much for coming on and talking about assistive technology, because it’s important to know what’s out there, whether you use it or not, it’s nice to know something’s there. Because when you come across that barricade or brick wall, there’s some way somehow someone else has done it. And with you coming on here, they’re finding out ways of doing it. So thank you.
Jesse Anderson: And no problem. It was a fun discussion. And as I kind of always say, it’s never been a better time to be a blind person or a visually impaired person because there’s just so much out there.
Jeff Thompson: Yeah. So if you’re a student, check out State Services for the Blind of Minnesota, get ahold of Jesse, and he’ll show you what’s out there.
Jesse Anderson: There’s a lot of cool stuff.
Jeff Thompson:
To find out more about all the programs at State Services for the Blind, contact Shane.DeSantis@state.mn.us. That’s Shane.DeSantis@state.mn.us.
Be sure to contact your State Services for the Blind, your Voc Rehab, and find out what they can do for you. Live, work, read, succeed.
[Music] [Transition noise] –
When we share-
What we see
-Through each other’s eyes…
[Multiple voices overlapping, in unison, to form a single sentence]
…We can then begin to bridge the gap between the limited expectations, and the realities of Blind Abilities