Computer Vision News - February 2023

Actually, it's normal that you agree, because that researcher was you! Five or six years ago, you said that in an awesome Futurism video. What progress has been made since then?

Thank you! Actually, I had forgotten about that video. I recorded it only shortly after I moved to the Max Planck Society here in Stuttgart. I enjoyed doing that interview, and it was a good outlook. Since then, I've established my department. I've been here now for six years. I have about 40 people working with me, and I've graduated three PhD students. I think six or seven of my postdocs are now faculty members, and some of the others are on the faculty job market. That's also an indication of universities getting excited about hiring people with this kind of expertise, seeing the promise of interactive touch technology and touch capabilities for robots. In those intervening years, we have seen surface haptics… touch feedback on flat screens… and people building these interaction sensors and actuators to create mechatronic systems. I could see that there was good momentum; it was feasible. There were several labs across the country and around the world working in haptics. It was part of the overall robotics domain. But you needed such specialized knowledge. There were not very many commercial devices that you could buy. There was nothing like a dataset or any standards. Now we're moving past that, and I see more and more people in robotics and in human-computer interaction starting to think about haptics and getting excited about it.

One other challenge I want to mention for the diffusion of haptics research is that, compared to computer vision or audio stimuli, I think it's harder to share the experience of a haptic interface. At our conferences, researchers not only present papers and posters, but we also present demonstrations, hands-on demonstrations where you bring your system. We brought our hugging robot HuggyBot to Hamburg, where it hugged more than 100 people over three days, so that colleagues from around the world could feel what it's like to hug a robot. Because if you just read in a paper what it's like to touch this thing in virtual reality or hug a robot, it's really hard to understand. I think it's harder to transfer that digitally. It's almost like the very problem we're trying to solve. It makes it not as easy for other people to understand what we're working on or what works well. It's harder to evaluate and quantify the better techniques. But, yes, I agree, haptics is not a niche area anymore. It's really growing.
