A PatchMatch-based Approach for Matte Propagation in Videos

  • Marcos H. Backes UFRGS
  • Manuel M. Oliveira Neto UFRGS

Abstract


This thesis presents a temporally-coherent matte-propagation method for videos based on PatchMatch and edge-aware filtering. Given an input video and trimaps for a few frames, including the first and last, our approach generates alpha mattes for all frames of the video sequence. We also present a user scribble-based interface for video matting that takes advantage of the efficiency of our method to interactively refine the matte results. We perform quantitative comparisons against the state-of-the-art sparse-input video matting techniques and show that our method produces significantly better results according to three different metrics. We also perform qualitative comparisons against the state-of-the-art dense-input video matting techniques and show that ours produces similar quality results while requiring less than 7% of their user input.
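For readers unfamiliar with the matting terminology above, the alpha matte follows the standard compositing model from the image matting literature (background material, not a contribution of this thesis): each observed pixel color I_p is modeled as a convex combination of an unknown foreground color F_p and an unknown background color B_p, weighted by the per-pixel opacity alpha_p. A trimap labels pixels as known foreground (alpha_p = 1), known background (alpha_p = 0), or unknown, and matting estimates alpha_p in the unknown region:

$$ I_p = \alpha_p F_p + (1 - \alpha_p) B_p, \qquad \alpha_p \in [0, 1]. $$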

Keywords: Image/Video Editing, Alpha Matting, Video Matting, Video Compositing

Published
30/06/2020
How to Cite

BACKES, Marcos H.; OLIVEIRA NETO, Manuel M. A PatchMatch-based Approach for Matte Propagation in Videos. In: CONCURSO DE TESES E DISSERTAÇÕES (CTD), 33., 2020, Cuiabá. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2020. p. 43-48. ISSN 2763-8820. DOI: https://doi.org/10.5753/ctd.2020.11367.