WiMi Hologram Cloud Develops A 3D Reconstruction Algorithm System Based on Image and Point Cloud Fusion
WiMi Hologram Cloud Inc., a leading global Hologram Augmented Reality (AR) Technology provider, announced the development of a 3D reconstruction algorithm system based on image and point cloud fusion.
3D reconstruction refers to establishing a mathematical model of 3D objects suitable for computer representation and processing. It is the basis for processing, manipulating, and analyzing 3D objects in a computer environment, and a key technology for representing the objective world in virtual reality. WiMi has developed a 3D reconstruction algorithm system based on image and point cloud fusion, which extracts point cloud information from multiple images and fuses the resulting point clouds through data alignment. This method combines the complementary strengths of both data sources, yielding models that are geometrically accurate and rich in visual detail.
WiMi’s system consists of three main parts:
- Data acquisition program, which acquires multi-view point cloud and image information of the target model;
- Information extraction program, which performs feature extraction and matching, camera matrix calculation, and dense reconstruction on image sequences;
- Point cloud processing program, which performs point cloud alignment and fusion of heterogeneous point clouds. It pre-computes a theoretical transformation matrix from the loaded viewpoint information, and uses the motion information extracted from the images to assist the alignment and fusion of heterogeneous point clouds.
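The three-part structure above can be sketched as a minimal pipeline. This is an illustrative outline only: the function names and the NumPy point-cloud representation are assumptions, not WiMi's implementation, and each stage is a stand-in for the corresponding program.

```python
import numpy as np

def acquire(n_views: int):
    """Stage 1 stand-in for the data-acquisition program.
    Returns per-view point clouds (N x 3 arrays) and images."""
    rng = np.random.default_rng(0)
    clouds = [rng.random((100, 3)) for _ in range(n_views)]
    images = [rng.random((48, 64)) for _ in range(n_views)]
    return clouds, images

def extract_motion(images):
    """Stage 2 stand-in for the information-extraction program.
    A real system would match features across the image sequence and
    solve for the camera matrices; here we return identity poses."""
    return [np.eye(4) for _ in images]

def align_and_fuse(clouds, poses):
    """Stage 3 stand-in for the point-cloud-processing program.
    Applies each precomputed 4x4 transform and merges the clouds."""
    fused = []
    for cloud, T in zip(clouds, poses):
        homo = np.hstack([cloud, np.ones((len(cloud), 1))])
        fused.append((homo @ T.T)[:, :3])
    return np.vstack(fused)

clouds, images = acquire(3)
poses = extract_motion(images)
model = align_and_fuse(clouds, poses)
print(model.shape)  # (300, 3)
```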
Point cloud alignment aims to align and match two point clouds of the same target, acquired from different orientations in the same environment, to obtain a complete environmental model. In the alignment process, the system obtains its initial information by simultaneously capturing multi-view point clouds and images of the environmental target.
After obtaining the initial information, the system processes the image sequence to recover motion information and, after dense reconstruction, obtains a 3D point cloud of the environment. The system calibrates the scale factor during this same process. The final environment point cloud model is obtained after fusing the point clouds again, and the 3D scene model is obtained after surface reconstruction.
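A point cloud reconstructed from images is only defined up to scale, so aligning it with a directly acquired metric cloud requires estimating a scale factor along with the rotation and translation. A standard way to do this is the Umeyama similarity-transform estimate; the sketch below assumes known point correspondences (which in practice come from the matching step) and is not WiMi's specific method.

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares scale s, rotation R, translation t such that
    dst ~= s * R @ src_i + t (Umeyama, 1991). src/dst are N x 3
    arrays of corresponding points."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    xs, xd = src - mu_s, dst - mu_d
    cov = xd.T @ xs / len(src)          # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1                    # guard against reflections
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / xs.var(0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

# Synthetic check: recover a known scale, rotation, and translation.
rng = np.random.default_rng(1)
src = rng.random((50, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
dst = 2.5 * src @ R_true.T + np.array([1.0, -2.0, 0.5])
s, R, t = similarity_transform(src, dst)
print(round(s, 3))  # 2.5
```

With noiseless correspondences the estimate is exact; with real data the same least-squares solve absorbs measurement noise.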
The information extraction stage mainly involves feature extraction, feature point matching, and calibration of the camera's extrinsic parameters. The extrinsic parameters are obtained by solving mathematically from the extracted feature points, so the number and accuracy of matched feature-point pairs underpin the entire subsequent reconstruction. After acquiring or fusing point cloud data, the system applies appropriate algorithms to pre-process the data and then performs surface reconstruction on the processed cloud. Dense reconstruction performs dense matching based on the sparse point cloud, further exploiting the image information to obtain a rich point cloud.
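The matching step described above is commonly implemented as a nearest-neighbour search over feature descriptors with a ratio test to discard ambiguous pairs. The sketch below uses random stand-in descriptors; a real system would use SIFT- or ORB-style features, and the function name is an assumption for illustration.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, ratio=0.8):
    """Return index pairs (i, j) where desc_a[i]'s nearest neighbour
    in desc_b is clearly closer than its second-nearest (Lowe's
    ratio test), which filters out ambiguous matches."""
    # Pairwise Euclidean distances, shape (len(desc_a), len(desc_b)).
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    order = np.argsort(d, axis=1)
    best, second = order[:, 0], order[:, 1]
    rows = np.arange(len(desc_a))
    keep = d[rows, best] < ratio * d[rows, second]
    return [(int(i), int(best[i])) for i in np.flatnonzero(keep)]

# Make the first 5 descriptors of desc_b reappear (slightly perturbed)
# in desc_a, so the true matches are (0, 0) .. (4, 4).
rng = np.random.default_rng(2)
desc_b = rng.random((20, 32))
desc_a = desc_b[:5] + 0.001 * rng.random((5, 32))
matches = match_descriptors(desc_a, desc_b)
print(matches)  # [(0, 0), (1, 1), (2, 2), (3, 3), (4, 4)]
```

The surviving pairs are exactly what the extrinsic-calibration solve consumes: enough well-distributed, unambiguous correspondences to constrain the camera pose.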
3D scene reconstruction has developed into a vital computer technology. It can now reconstruct multiple objects, large-scale targets, and outdoor scenes, and acquire and fuse the color and spatial-distance information of 3D target scenes with high accuracy. WiMi will continue to expand the application of its 3D reconstruction algorithm system based on image and point cloud fusion to build a bridge between natural scenes and virtual space and help shape the real digital world.