First working demo under MATLAB

Current status:

We are able to capture depth images from the Kinect sensor at a ~30 Hz frame rate, and use the Novint Falcon to let people feel the image. Right now the force feedback is open-loop, which means it renders the feedback force based only on the distance between the Falcon's virtual pointer and the depth image, without taking the force rendered at the previous moment into account (closed-loop). Since the MATLAB package for the Falcon is limited in terms of explicitly programming the rendered force, we will need to look into its official C++ library, which will allow more flexibility.
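To make the open-loop scheme concrete, here is a minimal Python sketch (the project code itself is in MATLAB; the function name and conventions below are hypothetical) of a Hooke's-law force computed purely from the pointer's current penetration into the depth surface:

```python
import numpy as np

def render_force(depth_map, pointer, k=200.0):
    """Open-loop spring force along the depth axis (illustrative sketch).

    depth_map : 2-D array of scene depths (metres), as from the Kinect.
    pointer   : (row, col, z) position of the Falcon's virtual pointer,
                with z in the same depth frame as depth_map.
    k         : spring stiffness in N/m (Hooke's law).
    """
    row, col, z = pointer
    surface_z = depth_map[int(row), int(col)]
    # Penetration: how far the pointer has pushed past the surface.
    penetration = z - surface_z
    if penetration <= 0.0:
        return 0.0          # pointer is in free space: no force
    return k * penetration  # push the pointer back out of the surface
```

Because the force depends only on the current penetration and never on the previously rendered force, this is exactly the open-loop behaviour described above; a closed-loop renderer would feed the last output back into the computation.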

Other improvements can be made in image smoothing and collision detection. Specifically, the current depth image has two types of undetected space, which appear in black: type A is undetected because it is too close to the sensor; type B is undetected because the area cannot be seen by both of the stereo cameras, so the disparity map cannot be calculated. To solve this, we may try image smoothing as a brute-force solution, or even break the two cases down and tackle them separately.
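As a sketch of the brute-force smoothing idea, the following Python snippet (illustrative only; the actual pipeline runs in MATLAB, and the function name is hypothetical) fills each undetected pixel, encoded here as zero, with the median of the valid depths in its immediate neighbourhood:

```python
import numpy as np

def fill_holes(depth, win=1):
    """Replace undetected (zero) pixels with the median of valid pixels
    in a (2*win+1) x (2*win+1) neighbourhood.  A brute-force sketch; a
    more careful fix would treat the two hole types (too close to the
    sensor vs. invisible to one camera) separately."""
    out = depth.astype(float).copy()
    rows, cols = depth.shape
    for r, c in zip(*np.nonzero(depth == 0)):
        r0, r1 = max(r - win, 0), min(r + win + 1, rows)
        c0, c1 = max(c - win, 0), min(c + win + 1, cols)
        patch = depth[r0:r1, c0:c1]
        valid = patch[patch > 0]        # ignore other undetected pixels
        if valid.size:
            out[r, c] = np.median(valid)
    return out
```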

Right now the collision is modeled as a spring-like mechanism. According to Hooke's law, the stiffness k can be changed to represent different types of surface: soft, hard, or something in between. It may also be interesting to try a non-linear transfer function.
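The stiffness idea fits in a few lines of Python (hypothetical names; the real rendering would live in the Falcon loop): an exponent of 1 gives the plain Hooke spring f = k·d, while an exponent above 1 is one candidate non-linear transfer function.

```python
def surface_force(penetration, k=150.0, exponent=1.0):
    """Force vs. penetration for a Hooke-like contact (sketch).

    exponent = 1.0 is the linear spring f = k * d: a small k feels
    soft, a large k feels hard.  exponent > 1 (e.g. a Hertz-style
    f = k * d**1.5) is one possible non-linear law, compliant at
    first touch and stiffening with depth.
    """
    if penetration <= 0.0:
        return 0.0
    return k * penetration ** exponent
```

Sweeping k between a soft and a hard setting, or switching the exponent per region of the depth image, would let different parts of the scene feel like different materials.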


MATLAB for Kinect V1 & Novint Falcon

We used the following links as references to set up the MATLAB environment for the Kinect V1 and the Novint Falcon:

MATLAB:

1. Installing the support package:

http://www.mathworks.com/help/imaq/installing-the-kinect-for-windows-sensor-support-package.html

2. Some demos for acquiring image data:

http://www.mathworks.com/help/imaq/acquiring-image-and-skeletal-data-using-the-kinect.html

Novint Falcon:

Tutorial – Using Novint Falcon with Matlab


Project Abstract Updated

Following the instructor’s comments, we revised our project abstract as follows:

Abstract:

We aim to develop a system that captures the depth information of the ambient environment and reproduces it with a commercially available haptic feedback device, the Novint Falcon. The major components of this system will be: a Microsoft Kinect sensor responsible for taking the 3D depth image of the environment, and a grounded 3-DOF haptic device, the Novint Falcon, to render feedback forces. This system will enable people to record the current environment as a snapshot of a 3D point cloud in virtual space, and experience it some time later.

Materials:    Microsoft Kinect V1 sensor and SDK, Novint Falcon and SDK, MATLAB

References:

1. Rydén, F., et al. (2011). Using kinect and a haptic interface for implementation of real-time virtual fixtures. Proceedings of the 2nd Workshop on RGB-D: Advanced Reasoning with Depth Cameras (in conjunction with RSS 2011).

2. Cha, J., et al. (2008). DIBHR: Depth image-based haptic rendering. Haptics: Perception, Devices and Scenarios, Springer: 640–650.


Project Abstract

Abstract:

We aim to develop a system that captures the depth information of the ambient environment and reproduces it with a custom-designed haptic feedback device based on the concept of a gyroscope. The major components of this system will be: a Microsoft Kinect sensor responsible for taking the 3D depth image of the environment, and an ungrounded gyroscopic haptic device to render feedback forces. This system will enable people to record the current environment as a snapshot of a 3D point cloud in virtual space, and experience it some time later.

Materials:

  1. Microsoft Kinect sensor, Kinect SDK, Point Cloud Library
  2. Motor-controlled gyroscope and its software

References:

  1. Rydén, F., et al. (2011). Using kinect and a haptic interface for implementation of real-time virtual fixtures. Proceedings of the 2nd Workshop on RGB-D: Advanced Reasoning with Depth Cameras (in conjunction with RSS 2011).

(http://mobilerobotics.cs.washington.edu/rgbd-workshop-2011/camera_ready/ryden-rgbd11-kinect-haptics.pdf)

  2. Winfree, K. N., et al. (2009). A high fidelity ungrounded torque feedback device: The iTorqU 2.0. EuroHaptics conference, 2009 and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. World Haptics 2009. Third Joint, IEEE.