Our goal is to build technologies that help humans by understanding and harnessing the essence of the question "What is a human being?", starting from the sense of touch. To this end, we aim to create the theories and technologies needed for a society in which humans, machines, and AI coexist, drawing on knowledge and techniques from robotics and virtual reality.
We constructed a system that conveys material perception to the operator by transmitting force and vibrotactile information through an avatar robot.
We developed a teleoperation support system for remote assembly with industrial robots; transmitting force information improved workability.
We developed a system that converts raw vibrotactile information into signals that humans perceive easily and transmits them to the operator. Applying this system to teleoperation of construction machinery improved workability.
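A common way to make raw machine vibrations easier to feel is to extract the amplitude envelope of the signal and remodulate it onto a carrier near 200 Hz, where human vibrotactile sensitivity peaks. The sketch below illustrates this general idea only; the window size, carrier frequency, and function name are illustrative assumptions, not the system's actual implementation.

```python
import numpy as np

def remodulate(raw, fs, carrier_hz=200.0, win=64):
    """Illustrative envelope remodulation of a raw vibration signal.

    Extracts the amplitude envelope with a moving RMS window, then
    multiplies it onto a sinusoidal carrier near 200 Hz (close to the
    peak of human vibrotactile sensitivity). Parameters are assumed
    values, not the actual system's settings.
    """
    raw = np.asarray(raw, dtype=float)
    # Moving RMS envelope: mean of squared samples over a sliding window.
    sq = np.convolve(raw ** 2, np.ones(win) / win, mode="same")
    env = np.sqrt(sq)
    t = np.arange(len(raw)) / fs
    return env * np.sin(2 * np.pi * carrier_hz * t)
```

The output preserves the intensity profile of the raw signal while shifting its spectral content into a perceptually salient band.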
We experimentally investigated appropriate operation methods for each subtask, such as digging and turning, and integrated the results into a control interface that achieves high operability with low operator burden.
We developed a teleoperation support system that reduces the physical and mental burden on the operator by providing force feedback that guides the operator along the target trajectory, based on prior learning of operation trajectories and real-time evaluation of trajectory similarity.
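The trajectory-following feedback can be illustrated with a minimal sketch: pull the operator's hand toward the nearest point of a pre-learned trajectory with a virtual spring. The function name, waypoint representation, and stiffness value are assumptions for illustration; the actual system additionally evaluates trajectory similarity in real time.

```python
import numpy as np

def guidance_force(current_pos, learned_traj, stiffness=5.0):
    """Spring-like guidance force toward a pre-learned trajectory.

    current_pos: (3,) position of the operator's input device.
    learned_traj: (N, 3) array of waypoints from prior learning.
    Returns a (3,) force vector pulling toward the closest waypoint.
    All names and the stiffness value are illustrative assumptions.
    """
    traj = np.asarray(learned_traj, dtype=float)
    pos = np.asarray(current_pos, dtype=float)
    # Find the waypoint nearest to the current position.
    d = np.linalg.norm(traj - pos, axis=1)
    nearest = traj[np.argmin(d)]
    # Hooke's-law pull toward that waypoint.
    return stiffness * (nearest - pos)
```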
We experimentally compared operation-interface mechanisms to identify those that enable intuitive input with low physical and mental strain.
To support telecommunication, we constructed a system that models the palm pressure distribution arising interactively during a handshake and reproduces it with a novel tactile display.
We constructed a system that enables humans to perceive the 3D positions of surrounding objects through vibration stimuli, together with a model that dynamically controls multiple vibration stimuli according to the target position.
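One simple way such a model could map a target position to stimuli is distance-based weighting: actuators closer to the target vibrate more strongly. The sketch below is purely illustrative (function name, actuator layout, and falloff law are assumptions), not the lab's actual control model.

```python
import math

def actuator_amplitudes(target, actuators, falloff=1.0):
    """Map a 3D target position to per-actuator vibration amplitudes.

    Hypothetical distance-based weighting: amplitude decreases
    monotonically with the Euclidean distance from each actuator to
    the target. `target` and each entry of `actuators` are (x, y, z)
    tuples; returns a list of amplitudes in (0, 1].
    """
    amps = []
    for pos in actuators:
        d = math.dist(target, pos)
        amps.append(1.0 / (1.0 + falloff * d))
    return amps
```

Driving each vibrator with its amplitude then lets the strongest stimulus indicate the direction of the target, with overall intensity encoding proximity.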
We constructed a system that presents the sensation of touching various materials in VR by dynamically winding and feeding material samples in synchronization with the virtual environment.
We extended the tactile illusion known as "phantom sensation" and constructed an algorithm that presents the sensation of objects moving around the user.
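The classic phantom-sensation effect places a virtual vibration point between two physical actuators by interpolating their amplitudes; an energy-based interpolation keeps the perceived intensity roughly constant as the point moves. A minimal sketch of that basic principle, with illustrative function and parameter names (the lab's extended algorithm goes beyond this two-actuator case):

```python
import math

def phantom_amplitudes(beta, total_intensity=1.0):
    """Energy-based interpolation between two vibrotactile actuators.

    beta in [0, 1] is the normalized position of the virtual (phantom)
    vibration point between actuator 1 (beta=0) and actuator 2 (beta=1).
    Returns drive amplitudes (a1, a2) whose summed energy is constant,
    so the perceived intensity stays roughly uniform as the point moves.
    """
    a1 = math.sqrt(1.0 - beta) * total_intensity
    a2 = math.sqrt(beta) * total_intensity
    return a1, a2
```

Sweeping `beta` from 0 to 1 over time makes the phantom point appear to travel smoothly between the two actuators.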
We constructed a wearable tactile display that presents distributed information to the fingertip skin by spatio-temporally controlling the suction pressure of multiple suction ports.
We constructed a tactile display that controls the position and orientation of the fingertip contact surface in six degrees of freedom using a parallel link mechanism, making it possible to reproduce various object shapes and the forces and torques applied to the fingertip.
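For a parallel mechanism of this kind, the inverse kinematics is direct: given the desired pose of the contact plate, each actuated leg length follows from the distance between its base and platform attachment points. A sketch under assumed Stewart-type geometry (the actual mechanism's layout and parameterization may differ):

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, translation, rpy):
    """Inverse kinematics sketch for a 6-DOF parallel mechanism.

    base_pts: (N, 3) leg attachment points on the fixed base.
    plat_pts: (N, 3) attachment points on the moving contact plate,
              expressed in the plate's own frame.
    translation: (x, y, z) of the plate origin; rpy: roll/pitch/yaw.
    Returns the required length of each leg. Geometry is illustrative.
    """
    roll, pitch, yaw = rpy
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # ZYX (yaw-pitch-roll) rotation matrix.
    R = np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])
    # Transform platform points into world frame, then measure legs.
    plat_world = (R @ np.asarray(plat_pts, dtype=float).T).T + np.asarray(translation, dtype=float)
    return np.linalg.norm(plat_world - np.asarray(base_pts, dtype=float), axis=1)
```

Commanding the legs to these lengths realizes the desired 6-DOF pose of the contact surface against the fingertip.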
The tactile sensations humans perceive are diverse and vary between individuals. We constructed a multidimensional, multilevel mathematical model to represent this diversity.
Because tactile sensation arises from touching an object, it inherently involves change over time. We constructed a model that captures such time series explicitly and makes them usable for engineering purposes.
We performed simultaneous measurements of various factors contributing to tactile sensation, such as skin vibration, skin deformation, and contact force.