Abstract:
We aim to develop a system capable of capturing the depth information of the ambient environment and reproducing it with a custom-designed haptic feedback device based on the concept of a gyroscope. The system will have two major components: a Microsoft Kinect sensor responsible for capturing a 3D depth image of the environment, and an ungrounded gyroscopic haptic device that renders feedback forces. This system will enable people to record the current environment as a snapshot of a 3D point cloud in virtual space and experience it again later.
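The first step of the pipeline, turning a Kinect depth image into a 3D point cloud, can be sketched with the standard pinhole back-projection model. The intrinsics below (`FX`, `FY`, `CX`, `CY`) are illustrative placeholders, not the calibrated values of an actual sensor; in practice the Kinect SDK or the Point Cloud Library would supply both the depth frames and the conversion.

```python
import numpy as np

# Hypothetical Kinect-like intrinsics; real values come from calibration.
FX, FY = 525.0, 525.0   # focal lengths in pixels
CX, CY = 319.5, 239.5   # principal point in pixels

def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (metres) into an Nx3 point cloud
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]   # drop invalid (zero-depth) pixels

# Example: a synthetic 2x2 depth image with one invalid pixel
depth = np.array([[1.0, 0.0],
                  [2.0, 1.5]])
cloud = depth_to_point_cloud(depth)   # three valid 3D points
```

The resulting Nx3 array is the "snapshot" that would be stored in virtual space and later traversed by the haptic rendering loop.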
Materials:
- Microsoft Kinect sensor, Kinect SDK, Point Cloud Library
- Motor-controlled gyroscope and its control software
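The torque the gyroscopic device can render follows from the physics of a spinning flywheel: precessing the gimbal at angular rate Ω produces a reaction torque τ = Ω × L, where L = I·ω is the flywheel's angular momentum. A minimal sketch, with illustrative (not measured) flywheel parameters:

```python
import numpy as np

# Illustrative flywheel parameters; the actual values depend on the
# device design (flywheel mass, radius, and motor speed).
I_FLYWHEEL = 1e-4      # moment of inertia about the spin axis, kg*m^2
OMEGA_SPIN = 2000.0    # flywheel spin rate, rad/s

def gyroscopic_torque(spin_axis: np.ndarray, gimbal_rate: np.ndarray) -> np.ndarray:
    """Reaction torque tau = omega_gimbal x L, where the angular momentum
    L = I * omega_spin points along the (normalized) spin axis."""
    L = I_FLYWHEEL * OMEGA_SPIN * spin_axis / np.linalg.norm(spin_axis)
    return np.cross(gimbal_rate, L)

# Precessing the gimbal about x (1 rad/s) while the flywheel spins about z
# yields a torque about y, perpendicular to both axes.
tau = gyroscopic_torque(np.array([0.0, 0.0, 1.0]),
                        np.array([1.0, 0.0, 0.0]))
```

Because the torque is generated by momentum exchange rather than by pushing against a fixed base, the device can remain ungrounded, which is the key property motivating the gyroscope-based design.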
References:
- Rydén, F., et al. (2011). Using Kinect and a haptic interface for implementation of real-time virtual fixtures. In Proceedings of the 2nd Workshop on RGB-D: Advanced Reasoning with Depth Cameras (in conjunction with RSS 2011).
- Winfree, K. N., et al. (2009). A high fidelity ungrounded torque feedback device: The iTorqU 2.0. In Third Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems (World Haptics 2009), IEEE.