Full Transcript
Timothy:
IDATA or Innovators Developing Accessible Tools for Astronomy is a team that is made up of individuals and institutions from across the United States.
Pete:
Introducing Timothy Spuck from Associated Universities Incorporated,
Timothy:
Afterglow or Afterglow Access, the software itself, has now been rewritten, we have an innovative way of taking an image or a data array, and rather than seeing it or displaying it visually, we can display it through sound.
Woman:
We think of astronomy as a visual science. But the information captured by telescopes is simply data. We choose to represent these data visually, but there are other ways we can experience them.
Timothy:
Nobody looks through the telescope and sees the detail in the spiral arms and the galaxies and all of that sort of stuff. These spectacular images are simply numbers that come in through a computer from a camera, and then the computer turns it into a visual display. The software takes something that has been visual and instead of being able to see them, we can actually hear them.
Pete:
If intelligent life without vision were to exist on some distant planet within our galaxy, these life forms would still seek to explore our universe. How exactly would they do this?
Timothy:
Research has shown that our auditory acuity is significantly better than our visual acuity as humans.
Woman:
In IDATA students with and without visual impairments work together to design and refine tools to help everyone experience the wonder of the stars. They learn how computing can support access so that they can explore the universe together.
Pete:
And now let’s join Jeff and Tim in the Blind Abilities studio.
Timothy:
You know, you often hear a picture is worth 1000 words, well, how many words is a song worth?
Woman:
With accessible astronomy tools we’re developing in IDATA, the sky is no longer the limit.
Jeff:
Welcome to Blind Abilities, I’m Jeff Thompson, and I’m in the studio with Timothy Spuck. I read an article about how the universe just became more accessible, and I was really intrigued, so I had to get hold of them and find out more. Timothy, welcome to the Blind Abilities studio.
Timothy:
Well, thank you, Jeff. It’s great to be here, especially to talk about this topic, because it’s a really interesting project, I think both from the perspective of the need that it’s fulfilling, but also the way that it opens up the universe and the way we think about data coming from the universe and how we interpret it as humans. It’s been an amazing journey for me, and I have definitely not been alone in it. IDATA, or Innovators Developing Accessible Tools for Astronomy, is a team made up of individuals and institutions from across the United States. Our key partner organizations are the University of Nevada, Las Vegas, the University of North Carolina at Chapel Hill, TERC, which is an institution in Cambridge, Massachusetts, and GLAS Education in Wisconsin, and then we have a number of key consultants across the US. Those individuals and institutions are AUI’s partners in this endeavor, along with nearly 20 teachers and students from their classes. So it’s really been a strong team effort, and the results have been really spectacular. We’re definitely very pleased. One additional perk that didn’t come along with the initial project: AUI does a lot of work in Chile, because we of course manage large radio astronomy facilities for the US federal government, so we actually have a partner in Chile at the Universidad Diego Portales in Santiago that is translating the software and resources into Spanish for broader dissemination in Latin America. All around it’s been an exciting project, and I’ve pulled together a great group of people to work on it.
Jeff:
Well, that’s great, quite a team. Now, Afterglow. That’s the web-based software that just became available to anyone for free.
Timothy:
Yeah, exactly. Afterglow, or Afterglow Access, which is what we’re calling the new version of the software, was originally developed to go along with the Skynet robotic telescope network at North Carolina, Chapel Hill. The lead software team there, Josh and Vladimir and Dan, are really the key developers of Afterglow. When we came together to initiate this project, what we wanted to do was build on the success of the original version of Afterglow and turn it into something that actually made astronomy more accessible to the blind and visually impaired. The software itself has now been rewritten with a lot of improvements and a lot of added features. In particular, it works with screen readers, and we also have an innovative way of taking an image or a data array, and rather than displaying it visually, we can display it through sound, through sonification. There have been a lot of changes within the software. The software is also designed in a way that’s modular. So what we’ve done is we’ve taken it from behind, we’ll say, the firewall of the Skynet robotic telescope network, where you had to create an account to get access to the software and the telescopes. We’ve installed it locally on our server at AUI, and we’re able to offer the software globally to anybody that is interested in trying it out, simply through their web browser, with no installation needed. This is something we’re super excited about, because this is the first time this sort of tool has been made available for the rest of the world free of charge, and it’s very simple to use.
Jeff:
Now typically, you receive all this data as numbers in grids, like a matrix of sorts. So the data that someone could turn into what we call visuals, you have turned into, like you said, audio.
Timothy:
Yeah.
Jeff:
That must have been quite a feat.
Timothy:
It was a really interesting brainstorming session that resulted in the way we’re displaying the data through sound. When I think about it, I think about it from the perspective of: if we had an intelligent civilization or species out there on another planet that never developed sight, for one reason or another they didn’t have eyes and so couldn’t pick up electromagnetic energy, they would still explore their universe and the world around them. So for me, it was asking the question, well, how? If another species did not develop visual capabilities, how would they go about exploring their universe? And thinking about your body, we’re actually equipped with a variety of sensory instruments. And then when you start to think about an image that comes off of a telescope, nobody looks through the telescope and sees the detail in the spiral arms and the galaxies and all of that sort of stuff. These spectacular images are simply numbers that come in through a computer from a camera, and then the computer basically has software that turns them into a visual display. When you drill down into it, all an image is is a bunch of pixels. So you have a pixel array, maybe 10,000 pixels in an image, maybe 100 columns by 100 rows, and that’s your data array. And if you start to think about it, you say, okay, we could look at that thing visually, but you know, I wouldn’t mind taking little Jelly Belly jellybeans and putting them in each of the boxes, in each of the pixels, and licking my way across the data, actually experiencing the data through taste. Or we could put little scented candles in those little boxes and smell our way across the data array and explore the image that way. But one of the things we wanted to do was make the software truly accessible.
And the one barrier is, how do you explore an image without utilizing visual capabilities? The other is, well, how do you make it free? How do you make it so that individuals or schools or communities that don’t have a lot of financial resources, that can’t go out and buy software, can still use it? How do you do it for free or for very minimal cost? As we looked at it, we said, okay, we can build this thing in a web browser. That means anybody who has internet access and a web browser could use it. And we didn’t want somebody to have to plug in some apparatus or device where they’d have to feel the data array. What we said is, okay, we can transfer sound over the internet pretty easily, and on any computer, if somebody has a pair of headphones, it works. That really led us in the direction of, well, let’s sonify this, and let’s imagine how we could sonify a data array. If you take that next leap and say, okay, we’re going to use sound, then you’ve got to ask yourself, well, when I look at an image, what are the things that I do? I scan an image, I look around, and I say, oh, there’s this feature in the upper left-hand portion of the image. In other words, I look at an image and what I want to know is where things are in the image: top, bottom, left, right. So, being able to navigate around the data array. The other thing is that in raw astronomical images in particular, you’re looking at the brightness of objects, and so that was the other piece of information that needed to be transferred. Taking that into consideration, we thought, all right, we have the data array; we’ll say it’s 100 pixels by 100 pixels, a total of 10,000 pixels. Now imagine along the bottom of the picture that we have a piano keyboard. And so if I have a low tone, that’s something on the left-hand side of the image.
If I have a high tone, that’s something on the right-hand side of the image, and mid tones are in the middle. So you can use tone to indicate the position of an object or a feature in the image from left to right. Then we use a time dimension. Envision an individual striking the keyboard for every row of data, going through those 100 rows of data in a period of 10 seconds. If you hear something about eight seconds into the scan, you know it’s at the top of the image; if you hear something one second into the scan, you know it’s at the bottom of the image. So now, all of a sudden, you can navigate through sound; you don’t need to see anything. You can say, I heard a low tone, and it was early in the scan, so it was in the lower left-hand portion of the image, and I can actually zoom into that area just through my auditory capabilities. Then we added one more dimension, which is volume, and volume indicates brightness. So now we can say, I’ve got a bright object, and it’s in the lower left-hand portion of the image, and then I can zoom in and actually increase the resolution of the scan to identify other potential artifacts or features within the image. That’s about the best verbal description I can give you of how the software works. So basically, we take something that has been visual, which is just looking at these spectacular images on a screen, and instead of seeing them, we can actually hear them. And one added piece: if you have nebulosity, sort of fuzziness, we add what’s called white noise, a little bit of background static, so that an observer using the tool gets that feature as well.
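The mapping Timothy lays out, column to pitch, row to time, brightness to volume, can be sketched in a few lines of Python. This is purely an illustration under the interview’s description, not code from Afterglow Access; the function name and parameters are hypothetical.

```python
def sonify(image, low_hz=220.0, high_hz=880.0, row_duration=0.1):
    """Map a 2-D brightness array to (time, pitch, volume) events.

    image: list of rows, bottom row first; values are brightness 0..1.
    (Hypothetical sketch; not the Afterglow Access implementation.)
    """
    events = []
    n_cols = len(image[0])
    for row_index, row in enumerate(image):          # row -> time in the scan
        t = row_index * row_duration                 # early = bottom of image
        for col_index, brightness in enumerate(row):
            if brightness <= 0:
                continue                             # silent (dark) pixel
            frac = col_index / max(n_cols - 1, 1)    # 0 = left, 1 = right
            pitch = low_hz + frac * (high_hz - low_hz)  # keyboard along bottom
            events.append((t, pitch, brightness))    # volume = brightness
    return events

# A bright object in the bottom-left pixel of an otherwise dark 4x4 image:
img = [[0.0] * 4 for _ in range(4)]
img[0][0] = 0.9
print(sonify(img))  # → [(0.0, 220.0, 0.9)]
```

A bright pixel in the lower-left corner comes out as an early, low, loud event, matching the “low tone, early in the scan” navigation described above.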
Jeff:
Well, that’s really cool, because now you’ve got dimension, you’ve got location. And then with the ability to zoom in on something, it opens up a whole ‘nother investigation of what’s being presented to them.
Timothy:
Right, you know, thinking about how we use the various dimensions and perception capabilities that we have in a new way has been really interesting. Especially when you consider that research has shown our auditory acuity is significantly better than our visual acuity as humans. You often hear that a picture is worth 1000 words; well, how many words is a song worth?
Jeff:
Oh, my gosh, yeah. You know, until someone is put in a position where that’s what they’re utilizing, someone that’s totally blind or someone that is subjecting themselves to eight hours of this, you know, that type of thing, that’s when you start really learning. You’re putting yourself into that, locking yourself into a perspective and gathering data from it. You mentioned all these students and teachers that came together to work on this project, how long did it take before you realized you were onto something?
Timothy:
This has been a project that’s been going on for a little more than four years. We engaged students and teachers, both blind and visually impaired and sighted, in the process. Initially, I would say the first year, there was a lot of brainstorming about, well, how do we approach this? How do we do this? And then in the subsequent years there was sort of taking the ideas we came up with, field testing them, going through an iterative process to get feedback and input from students and educators, as well as others who were part of the team, to really end up with the product that we have today. So there was an iterative design process used to make all of this possible, and we were really trying to keep the user at the center of it. So often what we do in design processes is bring experts together and say, okay, design this, and then we look at, well, now it’s designed, so how do we go get the user community’s feedback and improve it? Or how do we make this tool more accessible now that it’s been designed for the general population? We did not approach things that way. We knew from the beginning that if we were to make this thing usable and accessible for blind and visually impaired students, we needed to engage them in the process very early on. In addition, we also adopted the philosophy that, look, when we make something accessible, or more accessible, one of the things we know is that historically that has yielded many benefits for individuals who don’t necessarily have a disability. In trying to make something more accessible for individuals who have sort of been cut out of the pool, who haven’t had the opportunities, you’re actually increasing accessibility for everyone. Jeff, earlier, before we started recording, you were talking about how you were in an astronomy class and couldn’t do the labs.
Ultimately, what we want is for this tool to be utilized by middle school and high school students, whether they’re sighted or visually impaired. And who knows, a sighted individual may decide, well, I would rather hear the data; I want to listen to the data to see what I hear versus what I see, and explore the data through the multiple senses I have. If somebody is fortunate enough to be sighted and to have full hearing capabilities, this tool expands their options and makes the data more accessible to them, while actually making it accessible for individuals who have been left out historically.
Jeff:
Timothy, what got you connected with being inclusive? I’ve heard from a lot of people into science and stuff, your connection to the BVI, the blind and visually impaired community, seems to be on steroids a little bit more than what I’m typically used to seeing, and I’m really excited that you did bring in the involvement of the users from the beginning.
Timothy:
Yeah, and I want to give a shout-out to the Wisconsin School for the Blind, which was one of our key partners in this project. For me personally, I taught earth sciences and astronomy for 20-plus years. As a classroom teacher, I had a number of individuals who had various disabilities, or actually what I like to define as just varying levels of ability. I think if we looked at each of ourselves as an individual and plotted a spectrum of abilities, what we would have are individuals who maybe have strong visual capabilities but weak auditory capabilities, or the reverse, or maybe they have limited visual or auditory capabilities but really great intuition. All of these things are really important in order to explore the universe we live in. For me, it was sort of a natural move to creating a tool that makes astronomy more accessible to the blind and visually impaired; I guess it was a natural piece along my personal and professional trajectory. But beyond that, we had some exceptional partners who have done work in this area, who were part of the team and actually helped me grow throughout the project as well. Think about the University of Nevada, Las Vegas, and the Quorum group there. Andy Stefik leads that effort and initiative, where they have done a lot of work to make computing more accessible to the blind and visually impaired. Or think about Yerkes; it was initially Yerkes Observatory, but when Yerkes closed, it became GLAS Education that led the project there, and Kate Meredith and Kathy Gustafson are individuals who had a lot of experience trying to make astronomy more accessible to the deaf and hard of hearing and to the blind and visually impaired. So we had people on our team who had real expertise in this area.
And it was certainly something that allowed those on the team who maybe didn’t have as much experience going in to actually learn a lot and grow a lot in our understanding of, I guess, accessible-izing science resources.
Jeff:
Well, I think the work that you and your teams are doing is really going to enlighten minds and open up opportunities for science teachers across the world, to show that there are tools you’re developing that not only help the visually impaired and blind access this type of world, this universe, astronomy in particular right now, but also reach the teacher, to let them believe that we can teach this. Like you mentioned, I couldn’t do the labs. The teacher told me I couldn’t do the labs; we didn’t try. But I still got an A in the class, because I think I changed his opinion a little bit. If we had these tools you just mentioned, this is going to help open doors across the education spectrum, starting from the teacher, to give opportunity to the students.
Timothy:
Yeah, Jeff, I think that’s a really, really good point. What we believe this software tool creates is the opportunity for a classroom where you might have a blind or visually impaired student and a sighted student in the same room. Now they can use the same tool to do the same learning activities, each simply experiencing the data in a way that makes sense to them as an individual. These are the kinds of things we need to do in education to actually break down the barriers. And it’s not just the technological barriers; there are also the barriers of what others perceive those with disabilities can do. I think one of the things our research has uncovered is that engaging sighted students and blind and visually impaired students together in this project helped those who are not visually impaired better understand what individuals who are visually impaired can actually do. Individuals who have various disabilities, what they simply need are the right tools and the right resources. The ability is there. It’s just that, because of the way we structure and build technology, and the way we structure and build learning, the opportunity is not there. This project has, I think, been groundbreaking in that way. It has really taken us into an area where most people would say astronomy is a highly visual field. Most people I’ve talked to, when I try to explain this to them, well, we’ve built software that can do this, this, and this, they say, I don’t know how somebody who doesn’t have vision could basically look at a picture through their ear.
Jeff:
Probably the same way scientists can take a look at data and turn it into a vision.
Timothy:
Yep, exactly. Astronomy is not a visual science at all; it’s actually a data-rich science. This is where an institution like the National Science Foundation, which funded this project and continues to support it, comes in. This is where that sort of investment in potentially transformative and innovative technologies brings benefit to normal, everyday people, and makes our lives better through these kinds of advancements in technology.
Jeff:
When you were describing the matrix, the lower left hand or a 10 being up higher, or the time the sequence that it came in, I was thinking of the game called Pong, way back, one of the first video games, and basically uses a grid system and something travels through it. When I first heard the sound on the video, I was thinking, could there be a small game to enhance or enrich someone to learn how to use the sound for location? Because I’m sure there’s a learning curve, there’s going to be something of an understanding, you know, of what they’re hearing, and what they’re actually picturing in their mind or what they’re actually gaining from it. So I was thinking, could someone just make it a game, like, here’s a star, here’s the Big Dipper, and what would that sound like? It would just be interesting how you would convey that to someone.
Timothy:
Yeah, I think that’s an interesting idea. Certainly, gamification of learning is something that we know can be very powerful and very beneficial. It’s interesting that you bring that up, because we’re going to be exploring some of this in our IDATA phase two project. Let me take a step back: we actually have a follow-up proposal to the US National Science Foundation for additional support to continue this work. The initial project, IDATA one, was part of the National Science Foundation’s STEM+C program, and STEM+C was basically exploring computation, or computing, across STEM disciplines. What our research really focused on in phase one was to embed, or immerse, blind and visually impaired and sighted students in a software design and development process that would result in Afterglow Access. We then wanted to look at how this impacts students’ attitudes: does this experience make them more interested in computing? We knew there was research supporting the idea that women, girls in particular, were more interested in engaging in activities that they perceive as helpful to others. And we know, for example, that in software engineering we have very few women, so we looked at this as, well, if we get girls involved in this sort of project, will it make them more interested in potential STEM or computing careers? We’re still processing the data there, and I don’t want to share any results, but I think we’re getting some positive ones. We’re completing IDATA phase one this year. We’ve got the software out, and we have some curricular resources coming out that can be used in the classroom by teachers to help students understand astronomy concepts without the use of sight, whether through sound or touch.
Now, in phase two, this is where we further develop the capabilities of the software, but also develop a teacher training program and an initiative for broad dissemination across the United States and elsewhere: Canada, Latin America, and so on. So we’re waiting to hear from the National Science Foundation, and we’re keeping our fingers and toes crossed that we’ll be able to continue this work, take what we’ve sort of invented to date, and really build the dissemination effort and program around it. That can explore things like, well, do we want to gamify this a bit? Is that the best way to get teachers and students to learn how to use the tool? Because you’re exactly right: sure, you can go in, pull up an image, and listen to it, and you’ll hear differences between images. But without a little bit of training, you’re not necessarily going to know what the differences mean. That’s our next phase: to ensure that interpretation of data through sound is understood.
Jeff:
Yeah, I was just thinking, like in the driver’s test here, the first thing they do is give you four sample questions, just so you get the know-how to use the machine. So I was thinking, do you need one or two little examples just so they can get an idea of how to use it? And when you mentioned women and science, it reminded me of the light spectrum, you know, whether light is traveling away from you or coming towards you, like if something’s rotating around a planet or something. I believe it was a woman who came up with that, way back.
Timothy:
So I’m not exactly sure who you’re referring to, but certainly women have played a strong role in advancing astronomy over the years. For example, take what was once referred to as the Large Synoptic Survey Telescope, a large telescope that is going to survey the night sky from Chile. Roughly every two and a half nights it will survey the entire night sky, and it’s going to sit there and do that for 10 years.
Jeff:
Oh, wow.
Timothy:
It’s now called the Vera Rubin Observatory, recognizing Vera Rubin for her significant contributions to astronomy. Unfortunately, astronomy historically has been heavily male dominated. It still is today, but we’re doing a better job than we did in the past. It’s only more recently that the women in astronomy are starting to get the credit for the major discoveries that were made in the past and the discoveries being made today. And I think that’s critically important, because if we are going to attract more young girls to consider STEM careers, or careers in astronomy, they’re going to have to see examples and role models. And of course, the more we diversify a field, or the more we diversify a project, the better it is. Not only do you improve the quality of the science that’s done, you also improve its impact. That’s true whether we’re talking about attracting women or attracting the blind and visually impaired to become professional astronomers. Individuals who can experience and share the universe through a different lens serve to enrich, like I said, both the quality of the science that’s done and the impact it has.
Jeff:
You’re attracting abilities. That’s what you’re doing.
Timothy:
Exactly, right. And I’ve met a couple of really interesting individuals through this project. Wanda Díaz-Merced is an astronomer from Puerto Rico; she became an astronomer, and then went blind after she was an astronomer. She has done a lot of work in this area of sonification of astronomical data. I remember I was walking with Wanda down the street in Pittsburgh, we were actually going to see President Obama, there was an event at Carnegie Mellon University, and we passed a woman who was in a motorized wheelchair. We got maybe, I don’t know, 30 feet away from her, and before I knew anything had happened, Wanda grabs my arm as we’re walking and says, oh my gosh, she fell, we need to go back and help her. And I was like, what happened? Wanda heard all of this before I ever realized it had happened behind us. That helped me understand, my gosh, I’m actually pretty deficient in my auditory capabilities, because Wanda, she’s Superwoman with hers; it’s something she recognized well before I did. Then I had another experience with Ed Summers, who’s a software engineer for SAS. He’s a blind software engineer, and he’s on our advisory board for the IDATA project. I was standing over Ed’s shoulder, and he knew I was behind him somehow. He was going through a document on his computer, and he’s like, hey, Tim, you want to listen to what I’m reading? Because, he says, I’m speed reading right now. And I’m like, oh, okay. All right, I don’t quite understand how, but I’ll give it a shot. So he gave me the earphones, and he put one in his ear and one in mine, and the only thing I heard was drrrrr, drrrrr, drrrrr. That was it. And he’s like, well, it’s saying this, this, and this. And he says, do you need me to slow it down for you? So he had to slow it way down.
And finally I could hear the screen reader, hear the words, and make sense of it. But again, Ed’s auditory capabilities have developed far beyond mine. This is where I started thinking, well, we’re not people with disabilities; we are people who have varying levels of abilities. What you want in any project is a team that complements each other, that can offset my deficiencies and build on my strengths. I think that’s one of the things the IDATA team found. The Rubin Observatory is going to be amazing, because normally in astronomy you get data that’s collected by a telescope, and then it goes to a team of astronomers, who have one year of proprietary rights, and then it gets opened to the public. But with the Vera Rubin, this thing is going to take the images, and these images are huge, by the way, scan every image for changes, create an alert stream for that image, and then release it to the public in a matter of seconds. It’s almost going to be like a live stream off of the telescope, made publicly accessible to anybody who wants it.
Jeff:
So you said it’s going to scan the entire sky every two and a half nights?
Timothy:
The field of view for its images is huge; I think every image covers an area of at least several full moons, two or three full moons across.
Jeff:
Oh, wow.
Timothy:
So it’s big, and it’s super high resolution. It’s going to take an image, that image gets dumped out of the camera into a processing pipeline while the telescope moves to the next field, and then the results from that pipeline get released publicly in a matter of seconds. And then it just repeats that.
Jeff:
It just keeps going.
Timothy:
Pounding away, images night after night. It’ll take about two and a half nights to cover the entire night sky from Chile, and then it’ll go back to square one and start again. Nothing like this has ever been done before. We expect we’ll discover lots and lots of new asteroids.
Jeff:
Yeah, that’s what I was thinking.
Timothy:
Yep, and we’ll identify lots of stars that are changing in brightness. Some of that change in brightness is going to be due to planets orbiting a star, moving in front of it, and cutting out a little bit of the starlight. What it’s doing, in a sense, is a survey, and the survey data is going to produce all of these red flags that say, hey, you might want to take a closer look at this.
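The kind of red flag Timothy describes, a star dipping in brightness when a planet crosses it, can be sketched with a toy change detector in Python. The function name, threshold, and data here are illustrative assumptions, not the Rubin Observatory pipeline.

```python
from statistics import median

def flag_dips(light_curve, threshold=0.01):
    """Return indices where brightness drops more than `threshold`
    (fractionally) below the median of all earlier measurements.
    (Toy illustration; real alert pipelines use image differencing.)"""
    alerts = []
    for i in range(1, len(light_curve)):
        baseline = median(light_curve[:i])       # running baseline brightness
        if light_curve[i] < baseline * (1 - threshold):
            alerts.append(i)                     # significant dip -> red flag
    return alerts

# A 2% dip at samples 5 and 6, like a planet crossing the star:
curve = [1.00, 1.00, 1.001, 0.999, 1.00, 0.98, 0.98, 1.00]
print(flag_dips(curve))  # → [5, 6]
```

A real survey pipeline does far more (image differencing, noise modeling, classification), but the core idea is the same: compare new measurements to a baseline and flag significant changes for follow-up by other telescopes.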
Jeff:
And then some studies will focus.
Timothy:
Right, exactly. And then other telescopes will go and look at it.
Jeff:
No, there’s four planets blocking that light.
Timothy:
Yeah. And then the challenge, though, is, I believe, at least in the first few years, they anticipate something like 10,000 alerts per night of things that are going to change.
Jeff:
Wow, and that only happens if those planets blocking the light are on a perfect plane with us.
Timothy:
Yep, exactly. Again, you know, it’s tremendous, tremendous amounts of data. We’re entering an era in astronomy where the answers to some of the big questions out there are going to become a lot more clear. I think there’s always going to be things to explore, and refinement of answers to questions.
Jeff:
I think astronomically, you’re going to get more questions that are going to evolve from this as well.
Timothy:
You know, we think we’ve got some things figured out, and then all of a sudden we send a space probe out to Jupiter and see all of Jupiter’s moons, and we find out they’re not dead worlds like the Earth’s moon. There are all of these things going on: the moons are covered with ice, and there’s a salty, briny ocean underneath a couple kilometers of ice. And then that gets you thinking: we have life here on Earth in the very deep ocean that doesn’t need any sunlight, living off of geothermal vents, and we know there are geothermal vents on this moon Europa around Jupiter, so what does that mean? Well, if life can evolve and sustain itself at the bottom of the ocean living off of geothermal vents on Earth, why couldn’t it do the same on this moon around Jupiter? Lots of interesting things have happened in the last 50 years in astronomy, and I think will happen in the next 50 years.
Jeff:
Or in the last week with making oxygen on Mars.
Timothy:
Well, yeah, that’s it, and evolution occurs in leaps. And I think for astronomy and the space sciences, we’re in that phase where there’s this leap. There’s been public funding, government funding, that’s gone into astronomy over the years, but we’re at a point where billionaires with all this money to spend are saying, no, I want to be the first one to put a person on Mars, and they’re spending their billions of dollars to do it. And they can do it a lot cheaper, because NASA might need to say, well, there’s a 95% chance that you’re going to survive the trip, whereas these other companies can say, well, there’s an 80% chance that you’re going to survive the trip. And at 80%, they would have a ton of people in line waiting to board the ship, and these would be top astronomers and top scientists in the world. It’s an explorer mentality. You go back and think, why did Lewis and Clark do what they did? Why did the people who went to Antarctica leave the comforts of their homes to walk across a chunk of ice, knowing their chances of survival were not great? They do it because they’re explorers.
Jeff:
Yeah, the thing about exploring outer space is you’re in a tin can, or a shell, that needs to sustain your life, basically.
Timothy:
Yes.
Jeff:
That’s the difference: with Lewis and Clark, you bear the elements, but in outer space, I don’t know if you’re really bearing the elements, because if you are, you’re gone. It’s really interesting, everything from what gravity is to all these mysteries, and the size of the universe is beyond your imagination. I always wanted to get into it, but here I am at age 59, and I just think about it, while you’ve actually got hands-on, and you’re helping so many young students dip their toes into the sciences. It’s really cool. I’m really glad that you guys are going down this path and opening it up to opportunities. With STEM programs, it’s been hard to get people with disabilities involved, and even to get more and more women involved. It’s about being more inclusive, and like I said early on, it’s the abilities that we’re really looking for.
Timothy:
Right. Again, these diverse perspectives, these diverse lenses through which we come to know our universe, it’s really, really important for us to come together and share these different perspectives in a way that helps each other grow. And I think that anytime anybody or any group of individuals is left out of the picture, in a sense we’re blinding ourselves to that contribution, or to that capability, or to that portion of the ability spectrum. As we “accessible-ize,” as we make resources and tools more accessible, we simply improve the quality of life, and we improve the quality of the science that’s done through a project.
Jeff:
And like your article said here, I’m gonna pull it up. Let’s see. And from what you’re doing, you are making the universe more accessible.
Timothy:
Yeah, and Jeff, one of the other super exciting things here: astronomical data and astronomical images are typically in what’s called a FITS file, while most of the images that people see are a JPEG or a TIFF file, something like that. A JPEG image is typically made up of a red pixel array, a green pixel array, and a blue pixel array, and those arrays are simply merged to give you a color image that you can display. The way Josh has written the software, we have this neat added capability that came about more at the last minute: not only astronomical images, but an image from my camera, a JPEG file, can be uploaded into the software, and it will automatically break that image down into one red, one green, and one blue image. Then I can listen to the red image, I can listen to the green, I can listen to the blue. So we’re not just talking about astronomical images now; we’re talking about medical imagery, or images from satellites, and we can sonify and hear those too. Now, I’ll be the first to admit we’re still in the very early stages, so we’re hoping to get additional resources to continue our work. But as we think about this, it’s just data. That’s all it is. We choose whether we visualize it or use some other way of interpreting it. You could envision, at some point in the future, this tool being used to take a CAT scan, and instead of somebody visually inspecting it, somebody could hear it. And if, as research has suggested, our auditory resolution is indeed much higher than our visual resolution, this could lead to innovations in the way we analyze X-rays or other kinds of medical imagery to get a more accurate diagnosis.
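The channel-split-then-listen idea Timothy describes can be sketched very simply. This is not the Afterglow Access implementation; it is a minimal illustration in which a color image is treated as a grid of (R, G, B) pixels, separated into three single-channel arrays, and one row of a channel is mapped from brightness to tone frequency. The frequency range chosen here is an arbitrary assumption for the example.

```python
# Minimal sketch of splitting a color image into R/G/B channels and mapping
# pixel brightness to tone frequencies. Illustrative only; not Afterglow Access.

def split_channels(image):
    """Split a 2-D list of (r, g, b) tuples into three 2-D brightness arrays."""
    red   = [[px[0] for px in row] for row in image]
    green = [[px[1] for px in row] for row in image]
    blue  = [[px[2] for px in row] for row in image]
    return red, green, blue

def brightness_to_frequency(value, low_hz=220.0, high_hz=880.0):
    """Linearly map a 0-255 brightness value into an audible range (Hz)."""
    return low_hz + (value / 255.0) * (high_hz - low_hz)

def sonify_row(channel, row_index):
    """Turn one row of a single-channel image into a sequence of tone frequencies."""
    return [brightness_to_frequency(v) for v in channel[row_index]]

# A tiny 2x2 "image": each pixel is an (R, G, B) tuple.
image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (128, 128, 128)],
]

red, green, blue = split_channels(image)
tones = sonify_row(red, 0)  # frequencies for the first row of the red channel
```

The same scheme works for any single-channel data array, which is what makes the point about medical imagery or satellite data: once the image is just numbers, the choice between pixels and tones is a rendering decision.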
It is true that astronomy research has led to advancements in other fields; the technology we develop for space exploration and astronomy has spun off into everyday society, whether it’s software algorithms or CAT scans. Today, most of us don’t go in for exploratory surgery, and the reason is that we can simply get an MRI or a CAT scan that allows doctors to see inside without having to cut us open. That technology came from advancements in radio telescopes, when we discovered how to build radio interferometers; eventually, that technology found its way into medicine, and we now have CAT scans because of it. My mind goes in lots of different directions with this tool. What we’ve developed here with Afterglow Access really has tremendous potential to advance not only teaching and learning, but also research.
Jeff:
You know, it’s really interesting that you’re talking about this, and I really get sucked into it. I was just talking to Brandon Biggs, who works at the Smith-Kettlewell Eye Research Institute, and he’s working on a little program about audio. He’s trying to break it down, because if we’re standing by a road we hear cars, and we hear all sorts of noises if the ocean’s to our right and all that, but the road itself doesn’t make a noise. You can see where he’s pondering where to go with this, and it’s really neat that people are looking into it. I mean, you guys have taken it to the level where you actually have a working project, and I really hope you take it to the next level. I hope you get that funding to do so, because it’s growing, and the inclusiveness that you have, the involvement of the blind and visually impaired community and all the players and people, it’s a great thing to hear that you’re working in this area and doing such great work. It’s exciting.
Timothy:
Wow. Thank you, Jeff. Like I said, if it wasn’t for the National Science Foundation taking a chance on us, we wouldn’t be where we are today. If it wasn’t for those students who worked with us, taking time out of their summer and out of their regular school day, and the same for the teachers, we wouldn’t be where we are today. If it wasn’t for all of the partner institutions making their really important contributions, we would not be where we are today, so this truly has been a team effort. The results are pretty spectacular. Overall, when I think about where we were when we first started the project and where we are today, I think for the most part it’s where we dreamed we would be. We had great expectations, and even if we had fallen short, we still would have been very successful in what we were trying to do. But I think the team has just done a great job. We’re hopeful that we’ll continue to work together and further develop Afterglow Access, and come up with Afterglow Access 2.0, a future version that makes some of the analysis tools accessible to blind and visually impaired (BVI) users, and that also leads to a full teacher professional development experience, so we can get this into more classrooms, so there’s not a case in the future like you experienced, where you were part of the classroom but couldn’t take part in the labs. We want to address that and fix it. But we are actually now in a partnership with the Organization of American States, so AUI and the Organization of American States. I don’t know, are you familiar with the OAS, the Organization of American States?
Jeff:
Uh-uh.
Timothy:
You can think of it as a United Nations body for the Americas.
Jeff:
Okay.
Timothy:
So North and South America, but then we also have a couple of countries in Africa that are part of it. To make a long story short, what we are doing, and it sort of grew out of this IDATA project, is this: we recognize that there are all of these resources out there for individuals with varying disabilities, whether it’s something physical or even brain trauma. There are things out there, but in my experience, you kind of run into people and then you hear about them; there’s not a good centralized location where individuals can go to find resources. That’s not the case in other areas. For example, if I want to find a great restaurant, I’m just going to go to Yelp. If I want a good hotel, I’m going to go to TripAdvisor and look at the ratings. If I want a science activity, or an astronomy activity that teaches asteroids at the middle school level, I can just go to the Astronomical Society of the Pacific and the NASA Night Sky Network, because they have a portal with these activities, and I as an educator can go find what I need. I have not run into anything like that for resources for individuals with disabilities. So what we’re doing right now is a campaign to get people to share the resources they know about that address teaching and learning for individuals with disabilities. It could be software, a curricular resource, a conference that happens annually, or policy documents that have been used in various areas to improve things for individuals with disabilities.
We’re trying to create this collection, and then we want to build a centralized portal where an individual with a disability, or a family member, can say, I want to teach my blind child about the solar system. They would go to this portal, identify what they’re looking for just by checking some boxes, and the portal would deliver back to them: hey, here are the resources that are out there, and here are activities you could use. When we talk about accessibility, the first part of accessibility is knowledge: you’ve got to know what’s out there. We’re working on developing this portal, and I’ll send you the invite letter to complete the survey; anything you could do to share that with others would be super helpful.
Jeff:
Oh, yeah, I’m coming up with some names real quick in my mind right now.
Timothy:
Great.
Jeff:
I know someone who works at the New York Library who does a lot of resourcing and stuff.
Timothy:
That would be excellent. We’ve kind of set a benchmark for ourselves and our team: we’d like to have at least 500 resources that we can input into the system, and then build the database and the portal around that. Eventually, this becomes more of a broker site: if you have something to share that helps in the education of individuals with disabilities, you can go to the portal and share it, the resource gets vetted, and then it appears in the portal. If you need something, you can go there and find it. We envision this thing evolving and growing; hopefully we can start off with 500 resources and in a couple of years end up with 10,000, and then have the portal more or less grow in a grassroots sort of way.
Jeff:
Yeah. 20, 21 years ago, wait, 24 years ago, after I lost some eyesight, I asked someone, where do they keep the blind? That sounds pretty naive now, but I had no idea. And parents who have a child who is born blind or becomes blind sometimes have nowhere to turn either. I mean, there are some organizations, but it would be neat for even the organizations to be able to draw from a resource such as you envision.
Timothy:
Yep, exactly. If we start thinking about building tools or resources, particularly for education, we need to stop thinking, we’re going to build this for the masses and then modify it for those other people. If from the beginning we say, look, we want to try to build this for everyone, it actually makes for a better tool, a better resource, for everyone. That’s a mentality we just need to keep chipping away at: when we build something, we have everyone in mind from the very beginning.
Jeff:
When software products are built and accessibility is retrofitted later, the result is subpar. You don’t get the same user experience, not an inclusive one. It’s a different experience.
Timothy:
Exactly.
Jeff:
Well, great, I’ll look forward to this, and it’s really great meeting you and talking to you. I really like your attitude and your perspective.
Timothy:
Well, thanks, Jeff, I enjoyed the conversation and yeah, we went over 20 minutes, I guess!
Jeff:
It happens! It was up to you and you did a great job.
Timothy:
We’re all in this world together, I think if we build the right tools, we can build a stronger and more impactful society together.
Jeff:
Timothy Spuck, thank you so much for what you’re doing, and thanks for coming onto Blind Abilities and talking about this. I’m excited, I’m gonna keep following up with this project, and I’m really excited to hear someday about some of the students that return to do some more of these studies and hopefully they got inspired to seek out sciences, the STEM programs to get into. Good stuff.
Timothy:
Absolutely. Thanks so much, Jeff, really appreciate the opportunity to talk.
Pete:
We’d like to thank Timothy Spuck for joining us on our podcast today. IDATA is a National Science Foundation STEM+C project. For more information on the IDATA project, check out their website at idataproject.org. That’s i-d-a-t-a project dot o-r-g. And from all of us here at Blind Abilities, through these challenging times, to you, your family, and friends: stay well, stay informed, and stay strong. Thank you so much for listening, and have a great day.
[Music] [Transition noise] -When we share
-What we see
-Through each other’s eyes…
[Multiple voices overlapping, in unison, to form a single sentence]
…We can then begin to bridge the gap between the limited expectations, and the realities of Blind Abilities.
Jeff:
For more podcasts with a blindness perspective, check us out on the web at www.blindabilities.com, on Twitter @BlindAbilities, download our app from the app store, Blind Abilities, that’s two words, or send us an email at info@blindabilities.com. Thanks for listening.
Contact Your State Services
If you reside in Minnesota, and you would like to know more about Transition Services from State Services contact Transition Coordinator Sheila Koenig by email or contact her via phone at 651-539-2361.
Contact:
You can follow us on Twitter @BlindAbilities
On the web at www.BlindAbilities.com
Send us an email
Get the Free Blind Abilities App on the App Store and Google Play Store.
Give us a call and leave us some feedback at 612-367-9063 we would love to hear from you!
Check out the Blind Abilities Community on Facebook, the Blind Abilities Page, and the Career Resources for the Blind and Visually Impaired group