Rain Rendering Pipeline
 

The figure above shows the main steps of the image-based algorithm used to insert rain into images and videos. The user provides an image or a video of the scene, a coarse depth map of the scene, the properties of the light sources, and the camera and rain parameters. The algorithm uses a particle system to create the rain distribution and the drop trajectories. It then uses textures from our rain streak database to render novel rain streaks; a streak is rendered for each light source. The box on the right details the steps involved for each source, which account for the drop size, the distances from the light sources and the camera, and the properties of the light source. The streak textures due to the individual light sources and the ambient-light texture are added to obtain the final streak texture. These final textures are then scaled and rotated to account for perspective effects, and blurred and cropped to account for camera defocus and finite exposure time.
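To make the flow of the pipeline concrete, the sketch below walks through the same per-frame steps in simplified form: a toy particle system in place of the full drop simulation, a stand-in soft-line texture in place of the streak database lookup, a per-light brightness term plus an ambient term, perspective scaling of the streak, a crude defocus attenuation, a depth test against the coarse depth map, and compositing into the frame. All names (render_rain_frame, the camera dictionary keys focal_px, exposure_s, focus_m, and the light fields intensity and pos) are hypothetical and chosen only for illustration; this is a minimal sketch of the described steps, not the authors' implementation.

```python
import numpy as np

def render_rain_frame(frame, depth_map, lights, ambient, camera,
                      n_drops=500, seed=0):
    """Toy sketch of the rain-insertion loop (hypothetical, simplified)."""
    rng = np.random.default_rng(seed)
    h, w = depth_map.shape
    out = frame.astype(np.float32).copy()
    f, t_exp = camera["focal_px"], camera["exposure_s"]

    # Particle system: sample 3D drop positions (metres from the camera)
    # and assume a common terminal fall speed for all drops.
    depth = rng.uniform(1.0, 30.0, n_drops)
    xs = rng.uniform(-0.5, 0.5, n_drops) * depth * w / f
    ys = rng.uniform(-0.5, 0.5, n_drops) * depth * h / f
    fall_speed = 9.0  # m/s, roughly the terminal velocity of a raindrop

    for X, Y, Z in zip(xs, ys, depth):
        # Project the drop into the image; perspective makes distant drops
        # produce shorter, thinner streaks.
        u = int(w / 2 + f * X / Z)
        v = int(h / 2 + f * Y / Z)
        length = max(2, int(f * fall_speed * t_exp / Z))  # streak length (px)
        width = max(1, int(f * 2e-3 / Z))                 # ~2 mm drop (px)

        # Skip drops that project outside the frame or are occluded by
        # scene geometry (depth test against the coarse depth map).
        if not (0 <= u < w and 0 <= v < h) or Z > depth_map[v, u]:
            continue

        # Stand-in for the per-light streak rendering plus the ambient term:
        # brightness falls off with squared distance to each light source.
        brightness = ambient + sum(
            light["intensity"] /
            max(np.linalg.norm(np.array([X, Y, Z]) - np.asarray(light["pos"])) ** 2, 1.0)
            for light in lights)

        # Stand-in streak texture: a soft vertical line, attenuated more when
        # the drop is far from the focus plane (crude defocus model).
        col = np.exp(-((np.arange(width) - (width - 1) / 2) ** 2)
                     / (2 * (width / 3 + 1e-3) ** 2))
        tex = np.tile(col, (length, 1))
        defocus = abs(Z - camera["focus_m"]) / camera["focus_m"]
        alpha = np.clip(tex * (1.0 - 0.5 * defocus), 0.0, 1.0)

        # Composite the (brighter) streak additively, cropped to the frame.
        y0, y1 = v, min(v + length, h)
        x0, x1 = u, min(u + width, w)
        out[y0:y1, x0:x1] += (brightness * alpha[: y1 - y0, : x1 - x0])[..., None]

    return np.clip(out, 0.0, 255.0)
```

A typical call, assuming an 8-bit RGB frame and a depth map in metres, would look like `render_rain_frame(frame, depth_map, lights=[{"intensity": 50.0, "pos": [2.0, 3.0, 5.0]}], ambient=10.0, camera={"focal_px": 800.0, "exposure_s": 1/60, "focus_m": 5.0})`. In the actual pipeline, the soft-line texture would be replaced by a lookup into the rain streak database, and the per-light term by the full streak rendering that accounts for drop size, distances, and light properties.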