We have been working on the combination of range, or Time-of-Flight (TOF), sensors with ordinary cameras. We use systems comprising one TOF camera and a binocular pair of RGB cameras. We also work with configurations of several such systems, as shown below. Data is available at the MixCam site.
The TOF cameras provide the scene structure, but no colour data. We would like to apply textures from the RGB cameras to the TOF-based scene models. However, the viewpoints of the three devices are different, so this is not straightforward.
We match points between the left and right RGB views of a calibration dataset. This gives a projective (and hence distorted) reconstruction of the scene. We extract a second 3D representation from the TOF data. We can now find the 3D transformation between the TOF and stereo systems, which enables us to project any 3D TOF point into the RGB images, as sketched below.
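As a concrete illustration, a 4×4 projective transformation H, satisfying X_stereo ∼ H X_tof, can be estimated from point correspondences by the standard direct linear transformation (DLT). The following Python sketch shows this; the function name and interface are our own illustrative assumptions, not part of any released code.

```python
import numpy as np

def estimate_3d_homography(X_tof, X_stereo):
    """Estimate a 4x4 projective transform H with X_stereo ~ H @ X_tof.

    X_tof, X_stereo: (N, 3) arrays of corresponding 3D points,
    with N >= 5 points in general position.
    """
    def homog(X):
        # Append a unit fourth coordinate to each 3D point.
        return np.hstack([X, np.ones((len(X), 1))])

    rows = []
    for x, y in zip(homog(X_tof), homog(X_stereo)):
        # y ~ H x  =>  y_j (H x)_k - y_k (H x)_j = 0 for every pair (j, k).
        # Each pair gives one linear equation in the 16 entries of H
        # (flattened row-major, so H[j, :] occupies indices 4j .. 4j+3).
        for j in range(4):
            for k in range(j + 1, 4):
                row = np.zeros(16)
                row[4 * k:4 * k + 4] += y[j] * x
                row[4 * j:4 * j + 4] -= y[k] * x
                rows.append(row)
    A = np.array(rows)
    # H is the null vector of A: the right singular vector with the
    # smallest singular value. (Point normalisation, which improves
    # conditioning, is omitted here for brevity.)
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(4, 4)
```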
A 3D projective mapping between the binocular and TOF reconstructions is estimated. The RGB camera matrices can now be transformed, so that the 3D TOF points project correctly into the colour images, as shown below.
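Concretely, if H maps TOF coordinates into the stereo reconstruction, then each camera matrix P of the stereo reconstruction can be replaced by P H, which projects TOF points directly. A minimal sketch, continuing the hypothetical code above (names are again our own assumptions):

```python
import numpy as np

def project_tof_points(P_rgb, H, X_tof):
    """Project TOF 3D points into an RGB image.

    P_rgb: 3x4 camera matrix from the projective stereo reconstruction,
    H:     4x4 TOF-to-stereo homography (e.g. from estimate_3d_homography),
    X_tof: (N, 3) TOF points. Returns (N, 2) pixel coordinates.
    """
    P_prime = P_rgb @ H                          # transformed camera matrix
    Xh = np.hstack([X_tof, np.ones((len(X_tof), 1))])
    x = (P_prime @ Xh.T).T                       # homogeneous image points
    return x[:, :2] / x[:, 2:3]                  # dehomogenize to pixels
```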
Gaps in the reprojection correspond to regions that are occluded from the TOF camera, and to highly scattering surfaces (e.g. hair). Please see the following publication for details: