
Best Oral Presentation at the Session on Robots and Artificial Intelligence, ICICT 2022

I am very glad to announce that our work titled "Mixed reality (MR) Enabled Proprio and Teleoperation of a Humanoid Robot for Paraplegic Patients" won the Best Oral Presentation award in Session 5, "Robots and Artificial Intelligence", at the 5th International Conference on Information and Computer Technologies (ICICT), New York City (virtual), United States, March 4-6, 2022.

The awarded paper is: Filippo Sanfilippo, Jesper Smith, Sylvain Bertrand and Tor Halvard Skarberg Svendsen. Mixed reality (MR) Enabled Proprio and Teleoperation of a Humanoid Robot for Paraplegic Patients. In Proc. of the 5th International Conference on Information and Computer Technologies (ICICT), New York City (virtual), United States, March 4-6, 2022.

I sincerely thank all my co-authors and congratulate them for their contribution.

Mixed reality (MR) Enabled Proprio and Teleoperation of a Humanoid Robot for Paraplegic Patients

A new paper has been presented at the 5th International Conference on Information and Computer Technologies (ICICT), New York City (virtual), United States, March 4-6, 2022, and appears in the conference proceedings. The selected article is:

Filippo Sanfilippo, Jesper Smith, Sylvain Bertrand and Tor Halvard Skarberg Svendsen. Mixed reality (MR) Enabled Proprio and Teleoperation of a Humanoid Robot for Paraplegic Patients. In Proc. of the 5th International Conference on Information and Computer Technologies (ICICT), New York City (virtual), United States, March 4-6, 2022.

The work received positive feedback and sparked an interesting discussion. I thank my co-authors and congratulate them for their contribution.

This work is supported by the Top Research Centre Mechatronics (TRCM), University of Agder (UiA), and by Halodi. Principal investigator: Filippo Sanfilippo. The pilot project is facilitated by Inger Holen at I4Helse as a collaboration between the TRCM, the student consulting company Young Industrial Innovators (Yi2) at UiA, the Assistive Technology Center at the Norwegian Labour and Welfare Administration (NAV), Dahlske High School, and MENY Krøgenes. The pilot project is supported by Digin, the largest Norwegian information and communications technology (ICT) cluster. The authors acknowledge the contribution of Andreas Eikin, who took part in the human-subject study. The authors also acknowledge the contribution of Kristoffer Sand, Arminas Gronskis, Henning Blomfeldt Thorsen, and Emil Evnum.

AugmentedWearEdu presented at the Digital Skills for Education & Culture Workshop

Today, we had the opportunity to present the latest results of AugmentedWearEdu at the Digital Skills for Education & Culture Workshop. Thank you to the whole team!

The main objective of AugmentedWearEdu is to introduce a novel framework for e-Learning that incorporates haptic experiences to enable digital access to laboratories in higher education. This will be achieved by combining virtual reality (VR) and augmented reality (AR) tools with a novel generation of wearable haptic devices, making it possible to engage students in a hapto-audio-visual hands-on laboratory environment. In this project, we will evaluate which of the available haptic technologies are suitable for e-Learning and may foster the students’ ability to create complex simulations using existing or in-world modelling techniques and scripting tools, while offering the functionality to link to the real world and capture data that can be visualised in real time. Haptics, VR and AR tools will be adopted either from our ongoing research activity or from various low-cost commercial off-the-shelf (COTS) tools. In this way, an innovative educational and research loop will also be established. This approach will contribute towards the achievement of fully immersive, open and distance laboratory learning.
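To give a rough idea of the real-time capture-and-visualise loop mentioned above, the following minimal Python sketch reads samples from a simulated wearable sensor and forwards them to a placeholder "renderer". The SimulatedSensor class, the 30 Hz update rate, and the print-based output are illustrative assumptions of this sketch, not part of the actual AugmentedWearEdu toolchain or hardware.

```python
import math
import random
import time


class SimulatedSensor:
    """Stands in for a real wearable haptic/IMU device streaming readings.

    Purely illustrative: the signal model and update rate are placeholders.
    """

    def __init__(self, rate_hz: float = 30.0):
        self.period = 1.0 / rate_hz
        self._t = 0.0

    def read(self) -> float:
        # Fake reading: slow sine wave plus a little Gaussian noise.
        self._t += self.period
        return math.sin(self._t) + random.gauss(0.0, 0.05)


def stream_to_visualisation(duration_s: float = 2.0) -> None:
    """Capture sensor samples and hand them to a (placeholder) renderer."""
    sensor = SimulatedSensor()
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        sample = sensor.read()
        # In a real deployment this would update a VR/AR scene in real time;
        # here a console print stands in for that visualisation step.
        print(f"haptic channel value: {sample:+.3f}")
        time.sleep(sensor.period)


if __name__ == "__main__":
    stream_to_visualisation()
```

In a full setup, the print call would be replaced by an update to the VR/AR scene, while the capture loop itself stays the same.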

Filippo Sanfilippo