The visual effects of rain are complex. Rain consists of spatially distributed drops falling at high velocities. Each drop refracts and reflects the environment, producing sharp intensity patterns in an image. A group of such falling drops creates a complex time-varying signal in videos. In addition, due to the finite exposure time of the camera, intensities due to rain are motion blurred and hence depend on the background intensities. Thus, the visual manifestations of rain combine the dynamics of rain with the photometry of the environment. In this project, we have conducted a comprehensive analysis of the visual effects of rain on an imaging system. We have developed a correlation model that captures the dynamics of rain and a physics-based motion blur model that captures the photometry of rain. Based on these models, we have developed simple and efficient algorithms for detecting and removing rain from videos. The effectiveness of our algorithms is demonstrated via experiments on videos of complex scenes with moving objects and time-varying textures. The techniques presented here can be used in a wide range of applications, including video surveillance, vision-based navigation, video/movie editing, and video indexing/retrieval.
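The detection idea above can be illustrated with a minimal sketch, not the project's actual algorithm: the motion-blur model implies that a passing drop brightens a pixel for roughly a single frame, so candidate rain pixels can be flagged by comparing each frame against its temporal neighbors. The function name, the threshold `c`, and the synthetic frames below are illustrative assumptions.

```python
import numpy as np

def candidate_rain_pixels(prev, curr, nxt, c=3.0):
    """Flag pixels whose intensity spikes for a single frame.

    Because rain-induced intensity changes are transient, a candidate
    rain pixel is brighter than the same pixel in both the previous
    and the next frame by at least a threshold c (an assumed value).
    """
    brighter_than_prev = (curr - prev) >= c
    brighter_than_next = (curr - nxt) >= c
    return brighter_than_prev & brighter_than_next

# Tiny synthetic example: a static background with one transient spike.
bg = np.full((4, 4), 100.0)
frame1 = bg.copy()
frame2 = bg.copy()
frame2[1, 2] += 10.0      # brief brightening, as a streak would cause
frame3 = bg.copy()

mask = candidate_rain_pixels(frame1, frame2, frame3)
```

In a full system, such per-pixel candidates would still need to be validated against the correlation and photometric constraints described above to reject moving objects and time-varying textures.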