Computational Hydrographic Printing

ACM SIGGRAPH 2015


Yizhong Zhang¹   Chunji Yin¹   Changxi Zheng²   Kun Zhou¹

¹Zhejiang University     ²Columbia University

Abstract

Hydrographic printing is a well-known industrial technique for transferring color inks from a thin film onto the surface of a manufactured 3D object. It enables high-quality coloring of object surfaces and works with a wide range of materials, but it cannot accurately register a color texture to a complex surface geometry. As a result, ordinary users can hardly apply it to customized shapes and textures.

We present computational hydrographic printing, a new method that inherits the versatility of traditional hydrographic printing while also enabling precise alignment of surface textures to possibly complex 3D surfaces. In particular, we propose the first computational model for simulating the hydrographic printing process. This simulation enables us to compute a color image to feed into our hydrographic system for precise texture registration. We then build a physical hydrographic system on off-the-shelf hardware, integrating virtual simulation, object calibration, and controlled immersion. To overcome the difficulty of handling complex surfaces, we further extend our method to support multiple immersions, each with a different object orientation, so that the colors from the individual immersions combine into the desired texture on the object surface. We validate the accuracy of our computational model through physical experiments, and demonstrate the efficacy and robustness of our system on a variety of objects with complex surface textures.
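To make the registration step concrete, the following is a minimal sketch (in Python; not the authors' code) of how a simulated film-to-surface correspondence could be used to generate the pre-distorted image that is printed on the film. All names and array shapes here (film_to_surface_uv, contact_mask, target_texture) are illustrative assumptions about the simulation's outputs, not the paper's actual interface.

import numpy as np

def make_print_image(target_texture, film_to_surface_uv, contact_mask,
                     background=(255, 255, 255)):
    """Build the image to print on the hydrographic film by inverse lookup.

    target_texture:      (Ht, Wt, 3) uint8, desired colors on the surface
    film_to_surface_uv:  (Hf, Wf, 2) float, per-film-pixel (u, v) in [0, 1]
                         from the simulated film stretch during immersion
    contact_mask:        (Hf, Wf) bool, True where the film touches the object
    Returns an (Hf, Wf, 3) uint8 image in film coordinates.
    """
    Hf, Wf = contact_mask.shape
    Ht, Wt, _ = target_texture.shape
    image = np.full((Hf, Wf, 3), background, dtype=np.uint8)
    # Convert (u, v) to texel indices (nearest-neighbor sampling for brevity).
    u = np.clip((film_to_surface_uv[..., 0] * (Wt - 1)).round().astype(int), 0, Wt - 1)
    v = np.clip((film_to_surface_uv[..., 1] * (Ht - 1)).round().astype(int), 0, Ht - 1)
    # Each contacting film pixel receives the color of the surface point
    # it will land on, so the texture appears undistorted after transfer.
    image[contact_mask] = target_texture[v[contact_mask], u[contact_mask]]
    return image

In the physical system described above, the printed film is then floated on the water surface and the calibrated object is immersed along the simulated trajectory; for multiple immersions, one such image would be computed per orientation.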

Links

Video

Citation

Yizhong Zhang, Chunji Yin, Changxi Zheng, and Kun Zhou. Computational Hydrographic Printing. ACM Transactions on Graphics 34(4) (Proc. SIGGRAPH 2015), August 2015.

Acknowledgements

We thank the anonymous reviewers for their constructive feedback, the authors of [Batty et al. 2012] for sharing their viscous sheet simulation code, and Margaret Qian for recording the voice-over for the video. This research was supported in part by NSFC (No. 61272305), the National Program for Special Support of Eminent Professionals of China, the National Science Foundation (CAREER-1453101), as well as generous gifts from Intel. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the funding agencies or others.