Data-Driven Variable Synthetic Aperture Imaging Based on Semantic Feedback
by: Congcong Li, Jing Li, Yanran Dai, Tao Yang, Yuguang Xie, Zhaoyang Lu

| Format: | Article |
| --- | --- |
| Published: | IEEE 2019-01-01 |

Description
Synthetic aperture imaging, which has proved to be an effective approach for imaging occluded objects, is one of the challenging problems in the field of computational imaging. Most related research currently focuses on a fixed synthetic aperture, which is usually accompanied by mixed observation angles and foreground defocus blur; these frequently weaken the perspective effect and degrade the imaging quality of the occluded object. To solve this problem, we propose a novel data-driven variable synthetic aperture imaging method based on semantic feedback. The semantic content we are concerned with for better de-occluded imaging is the foreground occlusion rather than the whole scene. Therefore, unlike other methods that work at the pixel level, we start from the semantic layer and present a feedback-based semantic labeling method. The semantic labeling map deeply mines the visual data in the synthetic image and preserves the semantic information of the foreground occluder. Under the semantic feedback strategy, the semantic labeling map is in turn passed back to the synthetic imaging process. The proposed data-driven variable synthetic aperture imaging operates on two levels: one is an adaptively changeable imaging aperture driven by synthetic depth and perspective angle, and the other is light-ray screening driven by the visual information in the semantic labeling map. On this basis, the hybrid camera view and the superimposition of the foreground occlusion can be removed. Evaluations on several complex indoor scenes and real outdoor environments demonstrate the superior and robust performance of our proposed approach.
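To make the second level concrete, below is a minimal sketch of semantic-driven light-ray screening in synthetic aperture imaging; it is not the authors' implementation. It assumes that homographies warping each camera view onto the chosen synthetic focal plane and per-view occluder masks (a stand-in for the semantic labeling map fed back to the imaging stage) are already available, and it simply averages the warped views while discarding rays labeled as foreground occluder. The function name `synthetic_aperture_with_screening` and its parameters are hypothetical; the first level of the method, adapting the aperture (i.e., which views participate) by synthetic depth and perspective angle, is left to the caller here.

```python
import numpy as np
import cv2


def synthetic_aperture_with_screening(images, homographies, occluder_masks, out_size):
    """Warp each camera view onto the synthetic focal plane and average,
    skipping rays that the semantic occluder mask marks as foreground.

    images          -- list of HxWx3 uint8 camera views
    homographies    -- list of 3x3 arrays mapping each view to the focal plane
    occluder_masks  -- list of HxW arrays, nonzero where the view sees the occluder
    out_size        -- (height, width) of the synthesized image
    """
    h, w = out_size
    accum = np.zeros((h, w, 3), dtype=np.float64)
    weight = np.zeros((h, w, 1), dtype=np.float64)

    for img, H, occ in zip(images, homographies, occluder_masks):
        # Warp the view and its occluder mask onto the focal plane at the chosen depth.
        warped = cv2.warpPerspective(img, H, (w, h)).astype(np.float64)
        warped_occ = cv2.warpPerspective(
            occ.astype(np.uint8), H, (w, h), flags=cv2.INTER_NEAREST
        )

        # Light-ray screening: keep only rays not labeled as foreground occluder.
        keep = (warped_occ == 0)[..., None].astype(np.float64)
        accum += warped * keep
        weight += keep

    # Average the surviving rays; pixels with no valid ray remain black.
    return (accum / np.maximum(weight, 1e-6)).astype(np.uint8)
```

In this sketch the screening happens per pixel after warping, so a ray contributes only where its view does not see the occluder; without the mask term the result reduces to conventional fixed-aperture synthetic imaging, where occluder pixels are blended in as defocus blur.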