Full Transcription
Dean:
Just ignore those folks who say “Well, you can’t do that, because you need to be sighted.” Really try and explore the technologies that are out there, ‘cause there’s a lot of stuff out there.
Sarah:
Accessibility should be a human right, that everybody should have the opportunity to use technology, and, you know, we love what we make and we want everybody to be able to take advantage of it.
Dean:
One of the things we’re hoping with Screen Recognition is when the third-party developer turns this on, and they sort of explore their app with VoiceOver, they come across something that says “Possibly slider,” and the developer is like “Wait, this is not ‘possibly a slider,’ this is a slider!” You know what I mean? Let me go into my code and fix it.
Sarah:
We’re really excited about some of the features that we’ve built in this year to support the sort of broad spectrum of hearing, whether that be for the Deaf community all the way to those who just want to be able to use our technology better, and I think Headphone Accommodations falls into that category of something that can be beneficial to anyone.
Brian:
Welcome back to another Blind Abilities: That Blind Tech Show special. I’m here today of course with my partner in crime Jeff Thompson. How are you doing today, Jeff?
Jeff:
I’m doing great today, Brian. Really excited about today’s show.
Brian:
Why are you so excited? Who’s on it?
Jeff:
Well, my Apple orchard – it’s the people who actually started this whole thing, so it’s nice to have the people from accessibility here with us today.
Brian:
Yeah, so, we’ve got Sarah Herrlinger, who I’m sure a lot of you have heard of, from Apple – the global Head of Accessibility at Apple – as well as-
Jeff:
-Dean Hudson, and he’s actually an engineer there, and he’s a Braille user and a VoiceOver user, so that’s great that he’s right there, Johnny-on-the-spot.
Brian:
Or, as we should say, the Dean-on-the-spot.
Jeff:
The Dean, he’s in control.
Brian:
Well, what are we doing wasting our listeners’ time listening to us, why don’t we jump right into the interview?
Jeff:
There we go!
Brian:
In the studio today, we have Sarah Herrlinger and Dean Hudson, from Apple accessibility. How are you doing today, Sarah?
Sarah:
Doing well, thank you so much for letting us join you! We’re really excited to be here.
Brian:
Excited to have you. And Dean, how are you doing today?
Dean:
I’m doing fantastic. It’s great to be here, and we’re ready to talk about some of the new, exciting stuff coming up in our products.
Brian:
Very exciting time, obviously. And Sarah, how does accessibility come to life at Apple?
Sarah:
Well, you know, accessibility for us is one of our core corporate values, you know, we’re built around six values and accessibility has been one of them right from the start. It’s interesting – I was talking to some folks about the ADA 30th anniversary coming up, and realized that Apple actually started its Office of Disability in ’85. It’s something that we’ve been really trying to make sure is a part of what we do for a really long time now, and we do that because we believe that accessibility should be a human right, that everybody should have the opportunity to use technology, and, you know, we love what we make and we want everybody to be able to take advantage of it.
So, when we look at accessibility as part of what we do, we build it in early. Our team is involved in really all of the projects as they come up, from when they’re an idea in someone’s mind to when they actually become a reality. The accessibility team is lockstep with that process and what we’re doing is kind of twofold: one, it’s making sure that whatever that amazing thing that that person is coming up with is made to be accessible for all of our users, but it’s also looking at what are some of the really cool things that we could do uniquely for our users, and making sure that everybody has a little, you know, surprise and delight of their own.
Brian:
That’s awesome, that is really awesome, and exciting. Dean, we’re at an exciting time right now with iOS 14 going on, and the public betas just came out. What in iOS 14 is going to be new for the blind and low-vision people out there?
Dean:
Well, there’s a lot. I’m going to talk a little bit about VoiceOver stuff since I’m a VoiceOver user, so it’s near and dear to my heart, and I’m just going to add on to what Sarah said. You know, people always ask me “What do you do at Apple? Do you enjoy it?” And I got to thinking, one of the things that makes it so fun to work here is what we can deliver for our users. And I’m cheating a little bit, because I see the product first, but I’m just always happy to know how this is going to affect VoiceOver users.
So, I’ll jump right in. The big thing that we’re doing with VoiceOver this year – we’re constantly making VoiceOver adjustable, making it customizable for our users, but also making it very smart. And what I mean by that is, we’ve all heard of technologies like machine learning and computer vision, and for the last few years we’ve been carefully planning and sort of figuring out how we can make those technologies work with VoiceOver – how can we make the best VoiceOver experience with those technologies?
And so what we’ve introduced this year – let me back up, it started last year. You may have noticed, in iOS 13, if you navigated over a button and there was no label, we would say “Possibly Play button,” or “Possibly Order button,” and that was sort of a time where we were dipping our toe into machine learning and figuring out what it could do. This year, we’ve really jumped into it. Not only do we read labels, but we can actually detect UI objects. So, things like sliders, tables, toolbars, scroll areas – just UI elements that our users interact with within an application. VoiceOver sort of complements the accessibility work our third-party developers have already done, and we add onto that by giving you things that might be missing. A good example of this is some apps that I use and probably a lot of our users use – ride apps, or shopping apps. You’ve added all these items to your cart, then you go to check out, but somehow the checkout button is not accessible. You may hear “Button button,” and now you’ve got to play the guessing game, and nobody wants to do that when you’re trying to get food. So now it would say “Order button,” or in some cases we would even expose the button so you could navigate to it and double-tap to place your order. So, we’re really excited about that technology; it really changes the game for our users.
A second thing that we’ve done is image descriptions – sort of friendlier, more complete descriptions. So currently, if you had a photo of two people sitting in a restaurant at a table, we might say “Two people at a table with plates.” That’s what the eye sees, but your brain goes a little bit further and adds context to that – “Oh, that’s two people enjoying a meal, because it looks like they have food there.” So now we might say “Two people in a restaurant enjoying a meal.” That’s image descriptions.
So those two items are very, very helpful – the first being screen recognition, to give access to applications, and then image descriptions, to give a photo context as to what’s going on. So that would be sort of VoiceOver on iOS, some of the big-ticket items. I do want to move to the desktop, because there is some really great work going on there as well. Coding, development – this is an area we feel our users can really take advantage of. Long gone are the days when you could get a job without any computer skills. You have to use a computer for any position you’re trying to get, but we also know that the more experience you have with coding – it doesn’t mean you have to be a developer, doesn’t mean you have to be an engineer, but some experience is going to give you a leg up.
We really need that, and so we’ve done a lot of improvements around our integrated development environment, Xcode. We now provide easier ways for you to navigate code using the rotor – we have some cool rotor functions in there – we allow you to navigate through breakpoints within your code, and then lastly, which is a huge one for our population, the live previews. So, in Swift, if you’re coding, you don’t have to compile the code – you can instantly see the results of your code on the right side of the screen – and we’ve made that accessible now. So that’s going to be a huge, huge jump. We already introduced Swift Playgrounds a few years ago on iOS, but this is now bringing development to the Mac, and it’s actually designing applications, so it allows someone to start with just, you know, “Hello, world,” the basics, all the way to developing their own application.
Just a couple more things – we’ve added some stuff to Braille, so now we’ve added auto-panning in Braille for both iOS and the desktop, and then last year we added a lot more languages to the Braille tables – I think we’re up to like 82 – but this year we’ve added the ability to switch Braille languages within the rotor, so we think that’s a really cool feature. And again, it’s customizable stuff that you can do, just to make your experience much, much better.
So, those are the big ones for VoiceOver; I’ll throw it over to Sarah for some of the other features.
Sarah:
Yeah, actually just to take a step back also on what Dean was saying, you know, I think VoiceOver Recognition – that image description and screen description piece – is really something that’s going to be incredibly beneficial to anyone who’s a VoiceOver user. But the Xcode stuff I think is personally really near and dear to my heart, in that, you know, I came to accessibility through working in Apple’s education division and working on special education, and kind of to what Dean was saying, for most jobs today you need a background in how computers work and to feel comfortable with them.
One of the things that I always focused on when I was in education was that I never wanted a student to ever be told “You can’t follow your dreams and be whatever you want to be when you grow up, because something holds you back.” And so I’m really excited at the prospect – you know, this is definitely the long game – that if we can get kids engaged in engineering now, the hope is that it will give them potential into the future, and I think that’s beneficial not just for that individual’s own employment, but also for the fact that I think every company is better for hiring diversely, and for bringing in members of the blind community to help them make their products better. So, our goal is not just to do this from an Apple perspective, but hopefully to really make some change in the world.
Dean:
Yeah, that’s all very exciting.
Jeff:
Oh, that really excites me – on Blind Abilities, we have a lot of transition-age students who are looking forward to their future, and what you guys did with Learn From Home, trying to teach educators as well, to bring them into the field so we can teach more people coding. What you’ve been doing with Swift Playgrounds and Xcode is just exceptionally well-received by our listeners, because it gives them that opportunity, like you said, to seek a future and not be limited in what they can achieve. Now they have a place where they can go and get a taste – as people say, you can get your feet wet, get a start with it – so thank you for that.
Sarah:
Yeah. And we know there’s room to grow, but I think it’s a good start. And year after year, we’re trying to keep building on this as something that we think is important. Also, I wanted to talk about a few of the other things out there – thank you, Dean, for all of that great info on voiceover. One of the other ones that we’re excited about for the low-vision community is updates to Magnifier.
Magnifier is one of our most popular and beloved accessibility features, it’s a tool that gets used by people regardless of whether they identify as having vision loss or not, and we’ve done a pretty big upgrade this year, both in terms of the UI – so, being able to now control the size, the amount of real estate the controller takes on the screen, but also being able to customize what you want as sort of the primary elements for you, in the controller. So, whether you want the ability to change the zoom magnification, or whether you want to change the contrast or things like that, you have a lot more control over that. We also have improved it to magnify more of the area that you’re pointing at, as well as to capture multi-shot freeze-frames and be able to filter each of those individually. So, we think there’s some great stuff going on for Magnifier that people are going to be excited about.
Another feature that has been actually gaining a lot of press, or a lot of people have sort of started to talk about on social and such is a feature we have called Back Tap. What Back Tap does is by literally tapping the back of different models of the iPhone, you can assign shortcuts. So you do a double or a triple tap on the back of the device, and you can assign those to different actions. So, those could be kind of general mainstream actions, like in my case, I have the double tap set up to take a screenshot and the triple tap to close the home screen, so just close the device.
But you could do it for a lot of different things; you could also set it up as, for example, the accessibility shortcut if you have more than one accessibility feature turned on, or to just turn on or off an individual accessibility feature, or even to set it up with more complex workflows from Siri Shortcuts. So, if there is something that you do all the time that takes multiple steps, you could connect one of these Back Taps to that specific Siri Shortcut and really simplify the amount of time, energy, and action it takes to do that specific thing. Dean, I know you have yours set differently than mine, what do you have yours doing right now?
Dean:
I have double tap opening the control panel, which I go to often – I actually was playing around, and I live in San Francisco, so I use BART a lot, and so I have triple tap set up to tell me when the next train is coming, so that’s very handy if you’re walking, just “Oh crap, what time is it?” And then I can pull out my phone and figure out when the next train is coming, so it’s very cool.
Brian:
It’s good that you said BART the train, because obviously in the blindness community BARD the reading app is a lot different.
Dean:
Oh, that’s right.
Brian:
But, yeah, the blindness community is very excited about the Back Tap gesture and all the possibilities it’ll offer. In iOS 13 you introduced a lot more customization – in fact, I loved being able to do a one-finger triple tap to launch my notifications and a one-finger quadruple tap to launch my control center. Is Back Tap something that developers will eventually be able to kind of work into their customization like they have with the actions rotor?
Dean:
If they take advantage of the Shortcuts API, certainly you could hook into that. So that’s kind of the way that any of those gestures, even in iOS 13, would work.
Jeff:
That’s really great. Yeah, there’s something else that was mentioned – LiDAR, which is really exciting, the potential for depth sensing and navigation when you’re talking about the future of what could be happening a couple years down the line. I think about some of the stuff that was mentioned during WWDC and afterwards – especially for you, Dean, what this potentially could do for navigation indoors, or object recognition and distance. I just got sucked into a conversation I was listening to about LiDAR one time, so I’m really excited about the potential of this phone that’s coming out.
Dean:
Yeah, it’s like I said in the beginning – we are just always excited about what we can build in for our users, and we think being able to leverage this technology can do some very, very powerful things for blind users. For all users, but specifically for VoiceOver users.
Jeff:
That’s exciting.
Brian:
Very exciting – everything you guys are doing is very exciting. It must be great, you know, working at the mothership, at Apple. Dean, as somebody who is visually impaired, what advice would you have to a student out there that’s looking to possibly get into design and development?
Dean:
Wow. You know, I was asked this question by another blindness organization last fall, and I’m almost jealous, because I didn’t have any of this type of technology when I was going through high school. It was volumes of Braille, and a big clunky Braille writer that you schlepped around. I was a computer science major, and even starting there I didn’t have speech synthesis at all. There was no screen reader – my screen reader was a human reader, so that’s how I got through my courses.
But today, it’s just amazing what you can do with all of the technologies, including stuff with the phone, the iWork stuff. I would say just plunge right in, like Sarah says – just ignore those folks who say “Well, you can’t do that, because you need to be sighted.” Really try and explore the technologies that are out there, ‘cause there’s a lot of stuff out there.
Last year, we even did something with the camera so that a blind person could take not just a photograph, but a better photograph. So, we’re doing a lot of stuff that makes creativity and design just a lot more comfortable, and so that you’re focusing on your work, not the accessibility around your work.
Jeff:
With Screen Recognition, and being able to further identify – put a sentence to what images may be, rather than just what you see – it’s not to cover for developers that don’t actually make their apps accessible, but it’s a backup. Because when I go shopping, and I want to get a product like a grill or something, and only one company’s app is accessible, that’s the grill I’m going to get. It would be nice if some of this technology you’ve got, the machine learning, will help where some developers come up a little bit short on labeling, to give us more of an opportunity to have a choice of five different grills, because the apps will work with us. So, I think that machine learning, the screen recognition, the images, all that you’re doing there, is going to offer us more opportunities to shop for different goods, especially when accessibility is the point where we decide whether to buy something or not.
Dean:
Yeah, and an aspect of this that we are really excited about as well – we have a very strong relationship with third-party developers. We have a whole team that works with them, and we have folks dedicated to accessibility that work with them on their apps. One of the things that we’re hoping with Screen Recognition is when a third-party developer turns this on, and they sort of explore their app with VoiceOver, they come across something that says “Possibly slider,” and the developer is like “Wait, this is not ‘possibly a slider,’ this is a slider!” You know what I mean? Let me go into my code and fix it. And so that’s what we’re really hoping for – we want developers to be on board and continue to make their apps accessible so that everyone can use them.
Jeff:
That’s the dream.
Brian:
Obviously there are people like us that have been using your products for a very, very long time – Jeff and I both own multiple products – but for somebody who’s recently lost their vision and is just starting out with VoiceOver, what advice might you have for them?
Dean:
Well, one of the things that we have set up in all of our stores is a technical trainer. And so if you’re recently blind and don’t know anything about phones or iPhones, that’s one place to start – go in and ask for someone who’s familiar with accessibility, and have them walk you through how to use the device. But also, on all of our devices – phones and desktops and watches – there is a setup process that, when you turn the device on, says “Would you like to use VoiceOver?” If you say yes, you turn this feature on, you give the command, and it’ll walk you through how to use VoiceOver on your system. At least on the Mac you get a sort of screen-by-screen tutorial – it goes all the way up to how to browse the web. It shows you how to use the controls, how to navigate. And so we think that also can be a very helpful thing for our users.
Brian:
That’s fantastic. The watch – the rotor is coming to the Apple Watch. What actions can you speak about that might be available to VoiceOver users coming up?
Dean:
Yes, there are quite a few. Obviously the important ones – characters, words, lines – but then we also added volume, which I find very, very useful, and headings, and speech rate. So those are the ones that are currently in there – obviously if this continues to grow we’ll add more, but those are the ones we felt were pretty valuable for our users.
Jeff:
Oh yeah, definitely. Can you talk about Headphone Accommodations? I think this is really, really cool, especially for the silver tsunami – people who are aging, who might not need hearing aids, but it seems like they’re going to be able to actually customize what they hear through the phone – what you hear on phone calls and other stuff – to some particular nuances that you may or may not want to hear when you’re using your device.
Sarah:
Yeah, sure – we’re really excited about some of the features that we’ve built in this year to support the sort of broad spectrum of hearing, whether that be for the Deaf community all the way to those who just want to be able to use our technology better, and I think Headphone Accommodations falls into that category of something that can be beneficial to anyone.
It’s a new accessibility setting that’s designed to adjust certain frequencies, which can amplify or dampen particular sounds, to better tune your audio for an individual’s unique hearing needs. And this covers movies and music and such, but also phone and FaceTime calls, and podcasts, and all of that, to make it just a more crisp and clear experience.
And the way that it works is there’s now, under Headphone Accommodations, a custom audio setup available, where you have the option to incorporate a personal audiogram if you have one; if not, you can go through a series of steps which will work to determine what frequencies are better or worse for you, so that you end up with up to nine unique profiles based on your personal sound preferences. That’s three amplification tunings with three varying strengths, so that you have some flexibility depending on the environment that you’re in, or the kind of sound you’re listening to, to be able to have it be a better experience for you.
And we’ve built this to work with AirPods Pro, the second generation of AirPods, some of the Beats headphones, and also EarPods. Another interesting element is that for AirPods Pro there’s actually one additional piece: if you have them now, you may know there’s a way in which you can set them to either transparency mode or noise cancelling. And if you have it set to transparency mode, it will take your Headphone Accommodations preferences into consideration – it uses those. So transparency mode now includes those particular settings, and in that way it sort of makes quiet voices more audible, and outside environmental sounds a little bit more detailed. So if you think about someone who, for example, might be a VoiceOver user: if you have your headphones in and you’re moving through space out in the world, kind of using VoiceOver to get navigation, this can help you be more contextually aware of your surroundings – when your headphones are in but you still want to be out on a busy street and know that you have an idea of what’s going on around you.
Jeff:
Love that separation, where you can have the noise reduction or the transparency, it’s so great that you can be involved in what you’re listening to but aware of your surroundings, that’s awesome, I love that feature.
Sarah:
Yeah! We can’t wait for more people to take advantage of this one.
Brian:
Jeff, what’s the name of the club that we started, since I always butcher the name of it?
Jeff:
Oh, the AirPod Pro Pals Club?
Sarah:
Nice, sign me up!
Brian:
There you go, yeah, sign everybody up! But, you know, I know we could go on talking to you guys for hours and asking questions, and we only have a limited amount of time with you today, so this has been a real treat for Jeff and me. We often say “If you’re going to have a disability, there’s no better time in human history to have a disability than now, because of the amazing things companies like Apple are doing.” So, Sarah and Dean, thanks so much for being with us. Jeff, I’ll let you get in a few words.
Jeff:
I think it’s really exciting, you know, there’s so much more information out there, and all the possibilities that they’re coming up with, the Apple products, especially with the silicon, the powerhouse that you guys are building, it’s just augmented reality, all the stuff for gamers and everything – everything, it’s just exciting. I think this WWDC really, really made me lean towards the future like wow, what’s coming. So, thank you.
Sarah:
Well, thank you! It’s been a pleasure to talk to you, and nice to be able to share a little bit more of what the team’s been working on over the course of this year, and the last few years to get to this point. So, we are also really excited about what the future holds, and can’t wait to keep doing this for years to come!
Brian:
That’s very exciting. It’s exciting – before coronavirus, when we used to be out and about amongst society, I’d be sitting at my neighborhood restaurant, flicking around on my iPhone with the screen curtain on, and somebody would come up to me and often say “Wait, you’re blind, how are you using an iPhone?” And I would love that people would ask me that question, because then I’d get to explain all the fantastic accessibility features on the iPhone to them. So that was very cool; hopefully I’ll get to do that again soon one day.
Sarah:
Oh, I hope so. For all of us, let’s find a good way to get back out there.
Brian:
Thanks again, Sarah and Dean, thanks so much for being with us, and keep up the great work over there at Apple.
Sarah:
Thank you very much!
Dean:
Thank you!
Jeff:
Thanks, guys.
Brian:
Very exciting stuff for iOS 14, I can’t wait to test this stuff out.
Jeff:
Yeah, you’re running the beta? I haven’t switched over to the beta yet, because I gave my beta testing phone to my granddaughter, and she loves it, and I just couldn’t ask for it back. But I need to be connected – I’ve got my Mac, my production, my whole studio, and when I go to the cabin or the trailer, everything like that, I need it to be fully productive – but I’ve got enough friends that are using it, and I stay tuned to all the news. You’re kicking it around, so it’s good news to hear of all the stuff that’ll be coming out this fall from them. And it’s not only that – I tried to get a few hard questions in there, but there’s so much stuff, and if you want to find out more, you can go to apple.com and check out the developer notes that have been coming out since WWDC. I think the future’s a lot brighter. There’s an Apple for everyone in the future, I guarantee it.
Brian:
Yeah, I mean, it’s very exciting. As you mentioned, I am running the iOS 14 beta because I do have an extra device, and very smooth, very happy with it so far, but if you only have one device, would not recommend even though the public beta’s dropped recently-
Jeff:
Mm-mm.
Brian:
-yeah, you know, you just want to be safe, you know – wait until September, it’s right around the corner.
Jeff:
They even warn you, like “Hey. Hey. Hey. Hey.”
Brian:
“Careful. Careful.”
Jeff:
If mom calls and you can’t get her, she’ll be upset, you know?
Brian:
And part of me wants to put that Big Sur beta on the computer, but another part of me is like, “Uh-uh,” you know, it’s like I’ve got two voices talking in my head, so-
Jeff:
Yeah, it is-
Brian:
There’s just a lot of excitement coming out of Apple. I mean, we’re in the middle of a pandemic, and there’s a lot of excitement coming out of Apple, Jeff.
Jeff:
Well, they’re doing a lot of stuff, you know, the stuff that they’re doing, they’re building into it. They’ve got the new silicon – that’s a beast, that’s really a beast – and it’s their own, so now they can prioritize what they want to dream up, what they want to bring to it, and it works with all their stuff; they’re not limited like they were before, when they were using third-party chips. Now they can build, design, and everything, throughout their whole orchard – all the way from the Apple TV to the Mac could be, and will be at some point, using the same silicon. And, to me, I’m sure those developers just start rubbing their hands together, you know, like “Ah-ha.”
Brian:
Well, there’s going to be more money in the developers’ pockets, you know, instead of just having potentially an app on the iPad and iPhone, it could now be on the Mac. It’s Apple, and it’s everywhere.
Jeff:
Yeah, I’m not worried about this Catalyst thing – I think Catalyst was kind of a preemptive thing, like “Hey, we could do this,” but that’s crossing over operating systems – you know, iOS to macOS – you know, when you’re going from Mac to the- or, well, now they even have the watch operating system, they’ve got the-
Brian:
watchOS, tvOS, and everything.
Jeff:
There’s four OSs now, right? Let’s see-
Brian:
No, five – iOS, iPadOS, tvOS, watchOS, and macOS. Five, I can count to five. What about you?
Jeff:
There you go. Well, they better keep it there, because you’re a one-hand guy, you know what I mean?
Brian:
I am, I am-
Jeff:
You’re a one-hand counter.
Brian:
Once I get to six, that’s where numbers start getting confusing in my head.
Jeff:
Yeah, which hand first?
Brian:
Yeah, well, for now, we are That Blind Tech Show. On the Facebook, you can join us in our group, That Blind Tech Show, you can tweet us, @BlindTechShow, you can email us at ThatBlindTechShow@gmail.com, or you can call us – we’d really love to hear from you – at our new phone number, 612-367-6093. If you do leave a message and you’re not a spam call, just let us know that we can use your tape on our show if we’d like-
Jeff:
Mm-hm.
Brian:
-and I guess for now, we-
Jeff:
Nope, nope-
Brian:
What did I forget?
Jeff:
You can also find us on blindabilities.com, on Twitter @BlindAbilities, and check out the Blind Abilities community on Facebook as well. And you can use that same phone number, because we, you know, we share the same operator.
Brian:
We share an inbox, we share an inbox, don’t let anybody know that-
Jeff:
That’s all we share, that’s all we share-
Brian:
That’s all, that’s all. Not that there’s anything wrong with it-
Jeff:
No.
Brian:
But, you know, I listen to Blind Abilities, I don’t just download, I actually listen. Heard a really great episode about some blind individual in New York City where you can probably hear the sirens coming right now-
Jeff:
Yeah, timing – you just saved me a little work there!
Brian:
-yeah, so anyhow, I heard a great episode about what it was like living with coronavirus in New York City.
Jeff:
Around the world.
Brian:
That’s an impressive series, really enjoying that series.
Jeff:
Around the world with COVID-19, from a blindness perspective, and Brian, you were number seven, I believe it was?
Brian:
Lucky number seven, I believe that’s Mickey Mantle’s number, so, yeah, lucky number seven. But, yeah, so, check that series out, along with a lot of great human interest interviews going on, on the Blind Abilities app, the website-
Jeff:
And we’re headed to the studio for That Blind Tech Show pretty quick.
Brian:
Yeah, and also, don’t just download, listen too, so-
Jeff:
Yeah. I’ll save you a donut.
Brian:
Ooh, donuts – donuts are in the studio, virtual studio?
Jeff:
Virtual donuts.
Brian:
Those are the only kind of donuts I can eat if I’m ever hoping to lose the little weight- I guess we have to get over to that other show, so for now, we are out.
[Music] [Transition noise] -When we share
-What we see
-Through each other’s eyes…
[Multiple voices overlapping, in unison, to form a single sentence]
…We can then begin to bridge the gap between the limited expectations, and the realities of Blind Abilities.
Jeff:
For more podcasts with the blindness perspective, check us out on the web at www.blindabilities.com, on Twitter @BlindAbilities. Download our app from the app store Blind Abilities, that’s two words, or send us an email at info@blindabilities.com. Thanks for listening.
Contact Your State Services
If you reside in Minnesota and you would like to know more about Transition Services from State Services, contact Transition Coordinator Sheila Koenig by email, or contact her via phone at 651-539-2361.
To find the State Services in your state, you can go to www.AFB.org and search the directory for your agency.
Contact:
You can follow us on Twitter @BlindAbilities
On the web at www.BlindAbilities.com
Send us an email
Get the Free Blind Abilities App on the App Store and Google Play Store.
Check out the Blind Abilities Community on Facebook, the Blind Abilities Page, the Career Resources for the Blind and Visually Impaired, the Assistive Technology Community for the Blind and Visually Impaired, and the Facebook group That Blind Tech Show.