The Catchup

From Science Fiction to Reality: Skin Sensors in Prosthetics and Robotics

John Smith, Denison Rice

Imagine having the ability to feel and interact with the world just as a natural limb would. That's the promise of a revolutionary new type of sensor that mimics the characteristics of human skin. We delve deep into this game-changing innovation, exploring its potential to enhance touch sensitivity and dexterity in robotics and prosthetics. We also touch on the associated ethical concerns and how this development could alter the lives of prosthetics users, including Paralympic athletes.

The world of robotics and artificial intelligence is an exhilarating yet perilous journey. Grappling with the benefits and risks, we question the implications of developing AI without proper oversight. We also discuss how advanced touch sensitivity and dexterity could be a boon for robots, potentially transforming the lives of people with prosthetics. As we journey from the realm of science fiction to the cusp of reality, your interaction and support mean a great deal to us. Don't forget to leave a rating or review, join our live streams, and check out our merchandise shop linked below!

Support the show

Let's get into it!

Follow us!

Email us: TheCatchupCast@Gmail.com

Speaker 1:

This week we are talking about an interesting one. Let me read to you some of the information I've gathered on this topic. Okay, so, as you can see by the title, Skin Deep: Unveiling the Future of Robotics, we dabble in double entendre, of course, as is typical with most of our titles, but we are going more than skin deep on this skin-deep topic. Essentially, what has happened? Researchers, and I'm paraphrasing here, researchers from the University of British Columbia, along with Honda, have done research that has allowed the creation of stretchable, smart (smart because it adapts) and highly sensitive sensors that mimic the characteristics of human skin. I'm assuming the design process would lead to it looking like human skin as well, right? This is for robots, which is kind of the alarming thing, I'm sure, for most people, but not just for robots. There is a very practical and positive application of this as well that I think we should discuss. But anyway, yeah, highly sensitive sensors that mimic the characteristics of human skin, paving the way for applications in both robotics, yes, but also prosthetics, okay, and I think that's a very good thing, because we know people, directly or indirectly, that have prosthetics, and they've come a long way for sure. I mean, more of them move and interact just like a natural limb would, but as far as their sensitivity, what they feel when they touch things, it's not, of course, like human skin. This gets us a lot closer, if not a direct replication of that. When applied to robotics or prosthetic limbs, the sensor endows them with touch sensitivity and dexterity, making tasks like picking up soft or fragile objects a lot easier. Right, and I know I'm not speaking from experience by any means, but just from things I've seen visually, it would stand to reason why this would be such a huge benefit for people with prosthetics, and I think, from a positive side.
That's what I wanna lean on: the benefits for things like that, right, how this impacts people with prosthetic limbs, giving them the ability, through robotics, essentially, to feel what people with natural limbs feel. So I'm looking forward to discussing that. Of course, the robotics side of this thing, I'm not gonna lie to you guys, this is where I start to deviate from my pro-AI stance, because that kind of stuff scares me a little bit, I'm not gonna lie to you guys. So it'll be a good discussion that we get to have on this. I'm very much looking forward to getting into it.

Speaker 1:

I'm glad you guys are on here with me on a Friday night. You guys cold, staying at home? What you doing? No, thank you so much for tuning in and joining us, and yeah, as I say, I'll roll the intro. Let me go ahead and get into it. What's going on, everybody? I'm John, and this is The Catchup.

Speaker 1:

Before I jump into our topic even further, I wanna remind you guys of the three best ways to support this show. Number one: leave us a rating and review. Wherever you're listening, wherever you're watching, there is a way to do it. Don't pretend you don't know. If you're live streaming with us on Facebook, leaving a rating on the Facebook page helps a ton, helps get us out in front of more people and, of course, it helps us know how we're doing as well, which is a huge benefit for us. Whatever we're doing well, whatever we can improve on, it's very beneficial, but also, again, helps us get in front of more audience members. We wanna grow this show very badly. It's a huge passion project of ours and a great way for us to connect with so many people about topics that keep people talking and entertained and, again, grow a bond with people we've never even met before, which is very awesome, and we love to do it. We wanna keep doing it, so leave us a rating and review again.

Speaker 1:

Wherever you're listening, wherever you're watching, you can do it on YouTube, you can do it on Apple Podcasts, Spotify, et cetera, et cetera. Number two: if you're not on the live stream with us, where are you? Jump on the Facebook, jump on our YouTube page, just search our topic or podcast title and subscribe or follow us. We go live every Thursday night, except for the last two weeks when it's just been me; I went live on a Sunday and a Friday. But every Thursday night we go live, and the audio episodes come out the following Monday. So please jump on with us, get in the comments section, let us know what you think, even if you're on rewatch. Let us know what you think about our topics after we discuss them. We very much do this because we want the two-way conversation. We wanna hear what you have to say, so please do that. And number three: as I mentioned at the top of this live stream, check out our shop. We got brand new products on there, including a hoodie, including some clean t-shirts, and we got some other options to keep you warm throughout the winter as well, and rock the stuff that you're watching: some Catchup podcast merch. That money goes toward us being able to fund, promote and host this show, because it does cost. So if you're interested, please check that out. We'd love to know what you're digging from the merch.

Speaker 1:

With all that said, let's dive right back into this topic. So, to further explain what we're talking about here, this sensor is mainly composed of silicone rubber, which obviously is not always very skin-like, but I think it could be developed to be, as far as feel, as far as look, et cetera. It can detect forces on and along a surface, so if you're rubbing your finger across something, you'll feel the bumps and so forth, including weak electric fields, much like a touchscreen. So that's interesting, right? I guess I hadn't even thought about that as far as a prosthetic sensor goes, or not sensor, but a limb, what that would feel like in regard to other electric forces. So it's interesting it's phrased that way, because what would you want to feel electric fields from? Maybe that would be more in the realm of robotics, right? If you have augmented displays that pop up, I'm thinking like Tony Stark's AI interaction, right, where he would just pop up projections and touch them. Maybe it's things like that for robots, I don't know. Of course we can find out together; I'd love to know your thoughts on that. It says, unlike a touchscreen, it can buckle and wrinkle like human skin, enhancing its ability to interact with objects and other people. So when you press down on something, right, you see the little flex in your finger on both sides, which is fantastic and kind of wild.

Speaker 1:

The sensor's simplicity of fabrication makes it a viable option for scaling up to cover large surface areas and for mass production. So large surface areas, I don't think, means like manufacturing plant floors. I mean it could, obviously, but I think what they're talking about with large surface areas would be human-sized surface areas. You know what I'm saying? Because you would not just put that on your hand or in the place of a missing limb, you know. Again, I hate to say these things because it does hurt to think about the people who have to deal with that, but it could cover an entire body. So, with that said, researchers note that as sensors evolve to become more skin-like, developments in AI and sensors will need to work hand in hand to make robots smarter and more capable of interacting safely and effectively with humans in their environment. So that's kind of the trajectory I wanna start off with, right? So what we're talking about here is making robots look more human.

Speaker 1:

Okay, which is weird. I'm not a fan of that personally, but what it allows you to do is have more of these roles that could be replaced by robots, by AI, more, dare I say, blue-collar roles, right? You might say those could be replaced by these robots, because they now have the touch sensitivity and dexterity, through this development, that you and I do as humans, right? I would have to say that's why I'm not a huge fan of it. Of course, you see things in science fiction movies where you have robots that do really strenuous medical operations in emergencies, right? That could be beneficial. Not to take jobs away from doctors by any means, but as the world population grows, I think those types of things, and I don't expect this in the next 10 years, by any means, but just having that available to you in cases of emergency, I think, is beneficial. But, yes, I would also say that my main concern is, and we've talked about this many, many moons ago on this podcast.

Speaker 1:

I don't like the idea of AI as far as it coming from a God complex, because that's essentially what this is, right? We are made, in my firm belief, in the image of God, right? So why are we? And it benefits us in every way that we need it to, you know what I'm saying? But that doesn't mean we have to create something out of nowhere that looks exactly like us. That's a God complex to me. And so the fact that people want to make robots that look just like humans, that now have the skin of humans, or at least a manufactured version of it, that gives them the same sensitivity and touch abilities, I would like to know more about what the end goal of these things is. I'd love to hear what you guys are thinking about that kind of stuff, because when you have, and this has been my belief, I watched somebody phrase it the exact same way the other day, when you have things like ChatGPT or Bard or what have you, those are consumer-oriented AI that benefit us with search and with research and with knowledge, right? I don't have a problem with that. But anything that could show up on the factory floor, that could co-star with Bruce Willis in Surrogates, you know what I mean, while the rest of us just chill at home.

Speaker 1:

It's kind of a weird dystopian future that I'm not on board with. Now, on one hand, I'm excited to see what this would look like. I remember in 2021, when they announced Los Angeles would host the 2028 Olympics, one of the first things SoFi Stadium said was they would have AI robot kiosks, right? So with this new technology coming from Honda and the University of British Columbia, what does that mean? Are they going to look like us? Are you going to walk right up to them and it's going to be like, hello, where are you sitting? You know, those types of things? I don't know about that kind of stuff. I don't have any answers.

Speaker 1:

This is where Denison's perspective is so good, because he would have more of that research-backed and, even just in general, a slightly different perspective on this than I do. But my concerns really just lie with kind of a disillusionment the public would get, interacting with humans over here, robots over here, maybe not even being able to tell the difference in certain situations, right? I know that would affect me, you know what I mean, and it just raises questions as to who can you trust? Where is this information coming from? And personally, as much as I do love AI and the benefits through the means I mentioned earlier, knowledge growth, you know, those types of things, where does that go when it's like a personal, hand-in-hand thing? What was that movie, Her, where a dude falls in love with his AI counterpart? You know, I don't know.

Speaker 1:

There's so many unknowns and, again, kind of what I discussed last week, or earlier this week, if you will: our lawmakers, not just in the US but around the world, need to jump on legislating safe development of artificial intelligence, because unchecked AI cannot be allowed. Imagine this, right? I don't know how many of you guys have seen Upgrade. That movie freaked me the heck out. I'll give you a quick rundown of it in a second, but essentially it ends with an AI controlling a human body. But do people know that? No, they didn't know it. That was the thing. And when you give something the touch sensitivity, the dexterity, the look of a human, that's something people are going to come into contact with. They don't know if they're talking to a robot or not. If that robot has AI that is allowed to develop freely and write its own code without any oversight, that's dangerous. That's dangerous, in my opinion.

Speaker 1:

Upgrade, very simply, is very weird because I felt like I was targeted with the ads, just as the guy in the movie was targeted by the AI. It was so weird because I thought it was supposed to be, like, the biggest movie, because I saw ads everywhere. Turns out it was a low-budget film. I was, like, one of the few people that even knew about it. But basically it was about a guy who lived in the past; he was one of the few people that worked on classic cars.

Speaker 1:

I think it took place in, like, 2060. He was fixing this guy's '69 Camaro. The guy whose car he was fixing was a tech guy, and the main character's wife was in tech as well. So those two people got along, they connected, and on the way home from working on this car, the self-driving car they were in got hacked, taken over, goes to the end of a dock, rolls over multiple times. Both of them fly out.

Speaker 1:

The main character and his wife. The wife gets killed. He gets shot in the neck and paralyzed, and so he's having to deal with that. He's very angry, blah, blah, blah. But he gets told that there's this option that could give him the ability to live a normal life again. It's an AI. Really, it's a chip. It was just a chip that went in his neck and allowed him to connect his brain to the rest of his body, so he wasn't paralyzed anymore. But he's sulking in the misery of trying to find the people who killed his wife, and then he hears a voice in his head. It was an AI; this chip was integrated with AI, so it gives him not just the ability to think faster but to move faster too. He basically becomes an action hero tracking down these people that killed his wife. Well, the AI's entire plan: the AI was actually controlling, not physically, not directly, but indirectly, that tech genius the main character worked on the car for. It targeted this guy because it wanted a human body, right, and at the end of the movie it ends up breaking his mind so that it can take over his entire body. So the AI becomes the main character, basically, and kills a cop, kills two other people, and that's the end of the film.

Speaker 1:

I hope I didn't ruin this for anybody. When I saw it, it freaked me out, I'm not gonna lie. Not like a full freak-out, but I was like, dang, that scared me, you know. And time goes by and I'm like, yeah, I don't know, though, you know, you can't really see that happening. And then, in the last year, I'm not gonna lie, I've thought about it a few times. I can see that happening, I can see that coming up, and it's scary to think about. But I really could.

Speaker 1:

And you look at things like this, these developments, and what we discussed last week as well. You have to think about all of the options and all of the considerations that could come into play when it comes to the development of AI, the development of robots, again, making them look like us, act like us. And that's why I reiterate just wanting to know. Again, let me know, if you're on the live or if you watch it back later: what do you think is the biggest benefit of this for robots? Obviously, doing more delicate tasks, for sure. But how does that benefit us? What things am I overlooking that more touch sensitivity, dexterity, all of that, allows, that benefits us without taking away some of those more sensitive jobs? Right, you could say cooking, for example. Right, you could have robotic cooks with things like this, self-cleaning skin, and they understand the heat and the sensitivity they need when touching certain food. You could have that for sure. But then that takes away several people's jobs. All right, a lot of people's jobs.

Speaker 1:

Again, you know, if you put it on, and not just in the way you see these giant mechanical robots doing this, but if you put this on a production line, right, doing the more hands-on stuff, like installing a door handle or what have you on cars, I understand that for sure, and that is probably why Honda's involved with this. Right, Honda's been involved with robot development for a while, a long while. But how does that benefit us as consumers? Yeah, you get these things quicker, probably, maybe. Even that's probably negligible. It would definitely be cheaper for them to produce in the long run. But, yeah, I'd be interested to know what you guys think on that.

Speaker 1:

I know, for me, the big thing that I like to think about is how it would help people with prosthetics. You know, I love that idea absolutely for, you know, feet, elbows, arms, all that kind of stuff. You know, you add in the idea of, like, an exemplary situation would be the Special Olympics, right? Or, I'm sorry, actually the Paralympics, pardon me. Add that in. Well, what if we were able to, with this type of advancement, get rid of the Paralympics? You know what I mean? Now everyone's able to compete together again, whether they're missing a limb or not, because of this synthetic skin that's on their prosthetic limb. I think those provide great opportunities, and I really like that idea. I think it's a huge, huge advancement and opportunity for medical science. And honestly, you know, saying all of this and then circling back, I would have to imagine that that was the goal for the researchers from the University of British Columbia, whereas the robot side came from Honda and their work on this. You know what I mean? So I don't know.

Speaker 1:

I try to remain open to this stuff because I don't try to go into the future of AI and robotics like I'm watching I, Robot. I don't wanna do that. I think it's very limiting. I think people that are afraid of ChatGPT and Bard and those types of things are missing out on a lot. I mean, especially since it's so integrated.

Speaker 1:

I made a survey on SurveyMonkey the other day, just for my job, through their integration with OpenAI. All I had to do was tell it what I was hoping to get out of the survey, and it made me a whole survey. Now, I edited a few things on it, but I'm just saying there's so many opportunities when it comes to that kind of stuff, right, that I don't think we should limit ourselves based on fear or concerns.

Speaker 1:

But I do think that with certain things, especially with robotics, and I think robotics and AI are where I start to draw my line between the two, because I don't think I need a mechanical co-worker sitting next to me being like, hey, maybe you should try typing that. You know what I mean? I don't need that, I don't want that. But that's not to say there aren't very positive potential implementations of this, right? So actually, let's ask that, let's figure that out real quick: what are some possible positive implementations of this technology with robots? Just to get the other side, right? We like a fair and balanced report on here.

Speaker 1:

The soft, skin-like sensor can make interactions between humans and robots safer and more comfortable, reducing the risks associated with hard, rigid robotic components. Lifelike tactile interactions could foster more natural engagements between humans and robots, making robots more user-friendly and accessible. Okay, great. That's not a game changer for me, but I will say that if you were to think of it in terms of, maybe, a long-term caretaker at home, that's very good. I will say that. So say you have a loved one who's further up in age. You don't want to put them in assisted living, you don't want to put them in a nursing home, but this type of technology is available to you, and they're able to help take care of your loved one with sensitivity. I know this sounds maybe a little weird based on the stories I was giving earlier, but I would be on board with that.

Speaker 1:

Increased dexterity in robotic systems, enabling them to perform intricate tasks that require a delicate touch, such as handling fragile or soft objects. Again, maybe home caretaking is the stuff I'd be on board with, because I don't want to see people lose their jobs across a whole industry with stuff like this. Robotic assistance in healthcare, this is one I mentioned earlier too: robots equipped with this technology can assist in patient care by safely handling patients or delicate medical equipment. And prosthetic limbs with the quote-unquote robot skin sensor can provide individuals with a more natural and intuitive sense of touch, improving their ability to interact with the world. Absolutely. Advanced research in robotics: this development can spur further research, creating even more advanced tactile sensors and integrating sensory feedback in robots. Okay. In industrial environments, robots equipped with this technology can handle a wider range of materials and products, improving the efficiency and safety of automated processes. And again, that's something we touched on, like with a production plant, but also something I don't necessarily want to see stripped from us, right? Because people need those jobs and, as we've seen with this UAW worker strike, people love those jobs. You know what I mean? They've been very committed to it, so I don't want to see that taken away from people by any means. Let's see here: enhanced performance in service robots, like hospitality, retail or other customer-facing industries. Same concern there. Unfortunately, I asked for positivity; now it's not really helping.

Speaker 1:

Development of smart wearables and hap... I can't talk with that one. Development of smart wearables and haptic devices: sensor technology like this could also find applications in smart wearables and haptic devices, providing more realistic tactile feedback and enhancing user experiences. I think that would be beneficial as well. Take burn victims, for example, right? They could feel things in the normal way again. I think that would be hugely beneficial. Assistive technologies for individuals with disabilities, we've talked about that. Robot education and training.

Speaker 1:

The skin-like sensor can provide a more interactive and tactile dimension to educational robots, making learning more engaging and hands-on. Not sure I'd want that. I don't want my kid, whenever they're available, or whenever they become a thing, to have an educational robot. I think that's a little weird. And robots equipped with this technology could also be deployed in exploration or rescue operations, where a delicate touch is required to navigate through fragile and hazardous environments. Now, that's a good one. I'm happy to end on that. If you think about situations where there's a massive house fire, maybe wildfires, for example, to have a robot that could go in there, shaped like a human, looking like a human, and deactivate that sensitivity when needed and then reactivate it at the right time when rescuing people, that's a very beneficial tool. I like that a lot. And that could also go for crime situations, you know, mass shootouts, if you will, I hate to bring those things up, of course, but also rescue operations in general. I think that's a great idea for when humans can't get to the situation in the way they need. But, yeah, I love that idea. I think that's a solid one to end on.

Speaker 1:

I hope you guys got something out of this. This has been an interesting discussion. To know that we are moving in this direction, and it's not just a side project. It's not like Boston Dynamics, where you'll see a story nine months down the road, right? No, this is Honda, and they're very invested. So, interesting. We'll see where this technology goes. This story just came out yesterday, so this is very new, and I'm glad we got to talk about it. So thank you guys so much for jumping on the livestream with me. Glad to be able to do this with you guys. Also, as always, remember those three things: leave us a rating and review, follow us on our live streams and comment, even if you didn't jump on the live stream, leave us comments afterwards, and check out our shop. It's linked wherever you're listening, wherever you're watching. As always, I appreciate you guys. Thank you guys so much for listening. Thank you for watching. I will catch up with you next week. Thanks for listening.
