Motion Deblurring Using Hybrid Imaging

Motion blur due to camera motion can significantly degrade the quality of
an image. Since the path of the camera motion can be arbitrary, deblurring of
motion blurred images is a hard problem. Previous methods to deal with this
problem have included blind restoration of motion blurred images, optical
correction using stabilized lenses, and special CMOS sensors that limit the
exposure time in the presence of motion. In this project, we exploit the
fundamental trade-off between spatial resolution and temporal resolution to
construct a hybrid camera that can measure its own motion during image
integration. The acquired motion information is used to compute a point spread
function (PSF) that represents the path of the camera during integration. This
PSF is then used to deblur the image. To verify the feasibility of hybrid
imaging for motion deblurring, we have implemented a prototype hybrid camera.
This prototype system was evaluated in different indoor and outdoor scenes
using long exposures and complex camera motion paths. The results show that,
with minimal resources, hybrid imaging outperforms previous approaches to the
motion blur problem.
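The underlying blur model is standard: the blurred image is the sharp scene convolved with a PSF traced out by the camera path during exposure, plus noise. The following is a minimal sketch of this forward model, not code from the project; all function and variable names are illustrative placeholders.

```python
import numpy as np
from scipy.signal import fftconvolve

def simulate_motion_blur(sharp, psf, noise_sigma=0.01, rng=None):
    """Synthesize motion blur: convolve a sharp image with a path PSF and add noise.

    `sharp` is a grayscale image in [0, 1]; `psf` is a small 2-D kernel.
    Both are illustrative placeholders, not data from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    psf = psf / psf.sum()                           # the PSF integrates to 1
    blurred = fftconvolve(sharp, psf, mode="same")  # blur along the camera path
    blurred += rng.normal(0.0, noise_sigma, size=blurred.shape)
    return np.clip(blurred, 0.0, 1.0)
```

Once the PSF is known, recovering the sharp image reduces to non-blind deconvolution, which is what measuring the camera motion makes possible.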
Publications
"Motion-based Motion Deblurring," M. Ben-Ezra and S.K. Nayar, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.26, No.6, pp.689-698, Jun, 2004. [PDF] [bib] [©]
"Motion Deblurring using Hybrid Imaging," M. Ben-Ezra and S.K. Nayar, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Vol.I, pp.657-664, Jun, 2003. [PDF] [bib] [©]
Images
Hybrid Imaging:
The hybrid camera prototype used in the project is a rig of two cameras. The
primary system consists of a 3-megapixel Nikon CoolPix camera equipped with a
6x Kenko zoom lens. The secondary system is a Sony DV camcorder. The Sony camera
images were reduced in size to simulate a low-resolution camera.
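The secondary camera's role is to observe the camera's own motion at high frame rate. The project computes global motion between successive low-resolution frames; the sketch below uses plain phase correlation as a simple stand-in that recovers only an integer-pixel translation between two frames, with illustrative names throughout.

```python
import numpy as np

def translation_phase_correlation(frame_a, frame_b):
    """Estimate the (dx, dy) translation such that frame_b is frame_a shifted
    by (dx, dy), using phase correlation. A simple stand-in for the global
    motion estimation between successive secondary frames, not the paper's
    own algorithm."""
    Fa = np.fft.fft2(frame_a)
    Fb = np.fft.fft2(frame_b)
    cross_power = np.conj(Fa) * Fb
    cross_power /= np.abs(cross_power) + 1e-12      # keep phase only
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap large peak locations around to negative displacements.
    if dy > frame_a.shape[0] // 2:
        dy -= frame_a.shape[0]
    if dx > frame_a.shape[1] // 2:
        dx -= frame_a.shape[1]
    return dx, dy
```

Accumulating these inter-frame displacements over the exposure, and scaling them by the resolution ratio between the two cameras, gives the camera path in primary-image pixels.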
Motion Blurred Primary Image:
This image was captured by the primary system. The image exposure time was one
second and the focal length was 633mm (35mm equivalent).
Point Spread Function:
This PSF was computed from the image sequence that was captured by the
secondary system. The X and Y axes span the spatial motion of the camera, while
the color indicates the duration of image integration at each point, which
reflects the speed of the camera along its path.
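In code terms, such a PSF can be approximated by rasterizing the estimated camera path and accumulating the exposure time spent at each position. The sketch below is a minimal version of this idea; the names, the evenly-spaced samples, and the nearest-pixel rounding are illustrative, and a real implementation would also interpolate the path between frames.

```python
import numpy as np

def psf_from_path(path_xy, frame_time, size):
    """Rasterize a sampled camera path into a PSF.

    `path_xy` is an (N, 2) array of (x, y) positions in primary-image pixels,
    one per secondary frame; `frame_time` is the time between samples.
    Each sample deposits its exposure time into the nearest kernel cell, so
    slow motion (densely repeated positions) yields bright PSF regions and
    fast motion yields dim ones. Illustrative only.
    """
    psf = np.zeros((size, size), dtype=float)
    center = size // 2
    origin = path_xy.mean(axis=0)                   # center the path in the kernel
    for x, y in path_xy:
        col = int(round(x - origin[0])) + center
        row = int(round(y - origin[1])) + center
        if 0 <= row < size and 0 <= col < size:
            psf[row, col] += frame_time
    return psf / psf.sum()                          # normalize PSF energy to 1
```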
Motion Deblurring:
This image was computed using the captured primary image and the computed PSF.
It is the final output of the hybrid imaging system.
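Given the blurred primary image and the computed PSF, the sharp image is recovered by non-blind deconvolution. Below is a minimal sketch of the Richardson-Lucy iteration, a standard method for this step, shown purely as an illustration; it is not the project's exact implementation and includes no regularization or ringing suppression.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(blurred, psf, num_iter=30):
    """Richardson-Lucy deconvolution with a known motion PSF.

    `blurred` is a grayscale image in [0, 1]; `psf` is a 2-D kernel that
    sums to 1. Both are illustrative placeholders.
    """
    psf = psf / psf.sum()
    psf_flipped = psf[::-1, ::-1]           # adjoint of convolution with psf
    estimate = np.full_like(blurred, 0.5)   # flat initial estimate
    eps = 1e-8
    for _ in range(num_iter):
        reblurred = fftconvolve(estimate, psf, mode="same")
        ratio = blurred / (reblurred + eps)
        estimate *= fftconvolve(ratio, psf_flipped, mode="same")
    return np.clip(estimate, 0.0, 1.0)
```

In this sketch, the PSF built from the measured camera path (for example, the output of the hypothetical psf_from_path above) would be passed in directly as `psf`.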
Motion Blurred Primary Image:
This image was captured by the primary system. The image exposure time was one
second and the focal length was 633mm (35mm equivalent).
Point Spread Function:
This PSF was computed from the image sequence captured by the secondary
system. The X and Y axes span the spatial motion of the camera, while the color
indicates the duration of image integration at each point, which reflects the
speed of the camera along its path.
Motion Deblurring:
This image was computed using the captured primary image and the computed PSF.
It is the final output of the hybrid imaging system. The magnified image
regions show details.
Videos
If you are having trouble viewing these .mpg videos in your browser, please save them to your computer first (by right-clicking and choosing "Save Target As..."), and then open them.
Low Resolution Secondary Video:
This is the image sequence captured by the secondary camera.
Low Resolution Secondary Video:
This is the image sequence captured by the secondary camera.
Slides
CVPR 2003 presentation
With videos (zip file)
Related Projects
Coded Rolling Shutter Photography: Flexible Space-Time Sampling
Super-Resolution: Jitter Camera