Project Abstract

Abstract:

We aim to develop a system that captures the depth information of the ambient environment and reproduces it with a custom-designed haptic feedback device based on the concept of a gyroscope. The system will have two major components: a Microsoft Kinect sensor responsible for taking a 3D depth image of the environment, and an ungrounded gyroscopic haptic device to render feedback forces. This system will enable people to record the current environment as a snapshot of a 3D point cloud in virtual space, and experience it again later.
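The first step of this pipeline, turning a Kinect depth image into a 3D point cloud, can be sketched with the standard pinhole back-projection model. The intrinsic parameters below are illustrative Kinect-like values, not calibrated ones:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to an Nx3 point cloud
    using the pinhole camera model: x = (u - cx) * z / fx, etc."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Fake 1.5 m planar scene at Kinect resolution; intrinsics are assumed.
depth = np.full((480, 640), 1.5)
cloud = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```

In the actual system, `depth` would come from the Kinect SDK, and the resulting cloud would be stored and processed with the Point Cloud Library.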

Materials:

  1. Microsoft Kinect sensor, Kinect SDK, Point Cloud Library.
  2. Motor-controlled gyroscope and its software.
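The force-rendering side relies on the gyroscopic effect: tilting the axis of a spinning flywheel produces a reaction torque tau = omega_gimbal x H, where H is the flywheel's angular momentum. A minimal sketch, with all numerical values assumed for illustration:

```python
import numpy as np

# Assumed rotor parameters (illustrative, not measured hardware values)
I_flywheel = 1e-4                        # kg*m^2, rotor moment of inertia
omega_spin = 500.0                       # rad/s, flywheel spin speed
spin_axis = np.array([0.0, 0.0, 1.0])    # unit vector along spin axis
H = I_flywheel * omega_spin * spin_axis  # angular momentum vector

# Tilting the spin axis at this rate about x...
omega_gimbal = np.array([2.0, 0.0, 0.0])  # rad/s

# ...yields a gyroscopic torque perpendicular to both axes.
tau = np.cross(omega_gimbal, H)  # N*m, felt as a twist by the user
```

Here the torque comes out about the y-axis, perpendicular to both the spin and gimbal axes, which is what lets an ungrounded device such as the iTorqU render directional torque cues.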

References:

  1. Rydén, F., et al. (2011). Using kinect and a haptic interface for implementation of real-time virtual fixtures. Proceedings of the 2nd Workshop on RGB-D: Advanced Reasoning with Depth Cameras (in conjunction with RSS 2011).

(http://mobilerobotics.cs.washington.edu/rgbd-workshop-2011/camera_ready/ryden-rgbd11-kinect-haptics.pdf)

  2. Winfree, K. N., et al. (2009). A high fidelity ungrounded torque feedback device: The iTorqU 2.0. Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics 2009), IEEE.