Abderrahmane Kheddar (1-2 of 2)
Journal Articles
fMRI-Based Robotic Embodiment: Controlling a Humanoid Robot by Thought Using Real-Time fMRI
Publisher: Journals Gateway
Presence: Teleoperators and Virtual Environments (2014) 23 (3): 229–241.
Published: 01 October 2014
Abstract
We present a robotic embodiment experiment based on real-time functional magnetic resonance imaging (rt-fMRI). In this study, fMRI is used as an input device to identify a subject's intentions and convert them into actions performed by a humanoid robot. The process, based on motor imagery, allowed four subjects located in Israel to control a HOAP3 humanoid robot in France in a relatively natural manner, experiencing the whole experiment through the eyes of the robot. Motor imagery or movement of the left hand, the right hand, or the legs was used to command the robot to turn left, turn right, or walk forward, respectively.
Includes: Multimedia, Supplementary data
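As a rough illustration of the pipeline this abstract describes (not the authors' actual implementation), the sketch below maps a decoded motor-imagery class to the corresponding high-level robot command; the decoder stub, class names, and command strings are assumptions made for the example.

```python
# Minimal sketch: mapping decoded motor-imagery classes from a real-time fMRI
# pipeline to high-level humanoid robot commands, per the abstract
# (left hand -> turn left, right hand -> turn right, legs -> walk forward).
# The decoder and the robot interface are hypothetical stand-ins.

from enum import Enum
import random


class MotorImagery(Enum):
    LEFT_HAND = "left_hand"
    RIGHT_HAND = "right_hand"
    LEGS = "legs"
    REST = "rest"


# Mapping from decoded imagery class to a robot command, as described in the abstract.
COMMAND_MAP = {
    MotorImagery.LEFT_HAND: "turn_left",
    MotorImagery.RIGHT_HAND: "turn_right",
    MotorImagery.LEGS: "walk_forward",
    MotorImagery.REST: "idle",
}


def decode_volume(volume_features):
    """Stand-in for the rt-fMRI classifier: returns one imagery class per volume."""
    # A real system would classify ROI activation patterns; here we pick randomly.
    return random.choice(list(MotorImagery))


def control_loop(n_volumes=10):
    """Per-volume loop: decode the subject's intention, send the mapped command."""
    for t in range(n_volumes):
        features = None  # placeholder for preprocessed fMRI volume features
        intention = decode_volume(features)
        command = COMMAND_MAP[intention]
        print(f"volume {t}: decoded {intention.value} -> send '{command}' to robot")


if __name__ == "__main__":
    control_loop()
```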
Journal Articles
Thermal Display for Telepresence Based on Neural Identification and Heat Flux Control
Publisher: Journals Gateway
Presence: Teleoperators and Virtual Environments (2009) 18 (2): 156–169.
Published: 01 April 2009
Abstract
We present a new approach to thermal rendering in telepresence that improves transparency; it aims to reproduce, as closely as possible, what is experienced under comparable direct-touch conditions. Our method is based on a neural-network learning classifier that generates appropriate thermal values (i.e., time trajectories) used as the desired inputs of two independent controllers: one controlling a bio-inspired remote thermal sensing device (i.e., an artificial finger), and the other controlling the user's thermal display. To this end, two databases are built from real measurements recorded during direct contact between the operator's finger and different materials. One database is used to train a classifier for online identification of the material being remotely explored; the other is used to generate desired thermal trajectories for the aforementioned control loops. The learning block is based on principal component analysis and a feed-forward neural network. Experimental tests validating our method in different scenarios have been carried out, and the results are discussed.
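As a rough illustration of the learning block this abstract describes (not the authors' code), the sketch below chains principal component analysis with a feed-forward neural network to identify a material from a measured thermal trajectory; the data are synthetic placeholders, and the material labels, trajectory length, and network size are assumptions.

```python
# Minimal sketch of the learning block: PCA for dimensionality reduction of
# measured thermal trajectories, followed by a feed-forward neural network
# that identifies the material being touched. Data are synthetic placeholders.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic database: 200 contact recordings, each a 100-sample thermal
# trajectory, labelled with one of three example materials (labels assumed).
n_samples, n_timesteps = 200, 100
X = rng.normal(size=(n_samples, n_timesteps))
y = rng.integers(0, 3, size=n_samples)  # 0: wood, 1: aluminium, 2: plastic

# PCA compresses each trajectory; the feed-forward network classifies the material.
clf = make_pipeline(
    PCA(n_components=10),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
clf.fit(X, y)

# Online identification: a newly sensed trajectory is classified, and the
# predicted material would then select the desired thermal trajectories for the
# display and sensing-device control loops.
new_trajectory = rng.normal(size=(1, n_timesteps))
print("predicted material class:", clf.predict(new_trajectory)[0])
```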