Seoul Robotics Launches First 3D Perception Software With Deep Learning
3D computer vision company sets industry standard for accuracy and reliability with new version of its proprietary software, SENSR
Seoul Robotics, a 3D computer vision company using AI and deep learning to power the future of autonomy, has introduced the most advanced version of its patented 3D perception software. Building on the success of Seoul Robotics’ proprietary technology, this update allows organizations and institutions to extract industry-leading insights from less input. The latest iteration of the SENSR™ software can detect objects that are partially obstructed, fast-moving, or clustered together, while providing unrivaled classification of bicycles, vehicles, and pedestrians.
Deep learning is pushing the limits of what is possible in the LiDAR industry, enabling unprecedented perception accuracy. While other 3D computer vision software relies on machine learning and rule-based systems, Seoul Robotics now uses deep learning to track more than 500 objects simultaneously with accuracy to within 10 cm, a level of precision that rule-based systems have yet to achieve. SENSR 2.2 also includes weather-filtering AI, allowing the software to detect and track objects even in severe weather conditions such as heavy rain and snow. Because of its versatility and accuracy, SENSR 2.2 is currently deployed by Seoul Robotics across the United States as well as in Japan, Korea, and numerous other countries.
“The introduction of deep learning into 3D perception software may be one of the last show-stopping enhancements in the LiDAR industry. Historically, the focus has been on advancing the LiDAR sensors themselves, but that’s changing. Moving forward, there will be heavy investments in 3D perception software that interprets the data into actionable solutions,” said HanBin Lee, CEO of Seoul Robotics. “The introduction of SENSR 2.2 is accelerating the adoption of solutions that will fuel autonomy across the globe.”
SENSR 2.2 is sensor-agnostic and compatible with more than 75 types of 3D sensors on the market today, including LiDAR, 3D cameras, and imaging radar. It brings heightened accuracy to a range of solutions, such as smart intersections, wrong-way detection, speed enforcement, smart railroad crossings, crowd management, and smart retail. Seoul Robotics is rapidly expanding globally and maintains partnerships with several top-tier organizations, including BMW, Mercedes-Benz, the Chattanooga Department of Transportation, Emart, and many others.
“Since we deployed Seoul Robotics’ technology into our smart city solutions, we have seen an increase in our operational efficiencies and improvements in the overall safety of our community,” said Kevin Comstock, smart city director for the City of Chattanooga. “Seoul Robotics has specifically helped the City of Chattanooga seamlessly monitor pedestrian traffic, and we are currently gathering data that will inform future capabilities of wrong-way detection. These efforts are saving money for the city, travel time for local residents, and, most importantly, lives.”
Seoul Robotics will be showcasing SENSR 2.2 at the IAA Mobility Conference, Sept. 7-12, 2021, in Munich, Germany. Stop by booth A75 in Hall B3 to learn more about SENSR 2.2 and the company’s expansive portfolio of turnkey LiDAR solutions for automotive, security, smart cities, transportation infrastructure, crowd analytics, IoT, and industrial applications.