This project focuses on combining lidar, multibeam, radar and camera survey data collected with Kongsberg's uncrewed surface vehicle (USV) Sounder.
The team will work hands-on with some of the world's most modern sensor systems, collecting, analysing and presenting data from both the seabed and land.
USV Sounder comes equipped with some of the most advanced sensor equipment available today. Sensor inputs include an EM multibeam echo sounder, a lidar system, a FLIR camera, a SIMRAD HALO radar, water sound velocity measurement systems, and a full suite of motion, position and velocity sensors. The team will need to process data from this sensor package, present a unified full-picture solution for combining lidar and multibeam data on an autonomous vehicle, and control and interface with the vehicle.
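Combining lidar (above water) and multibeam (below water) data hinges on georeferencing both into one common frame using the vehicle's motion and position sensors. The sketch below illustrates the core transform only; it is a simplified illustration, not Kongsberg's processing chain, and it omits real-world corrections such as sound-velocity refraction, timing alignment and sensor calibration. All function and parameter names are placeholders.

```python
import math

def attitude_matrix(roll, pitch, yaw):
    """Body-to-world rotation matrix from roll/pitch/yaw in radians,
    using the standard aerospace Z-Y-X (yaw, pitch, roll) sequence."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def georeference(point_body, lever_arm, attitude, vessel_pos):
    """Transform one sensor measurement (vessel body frame) into the
    survey frame: apply the sensor's lever-arm offset, rotate by the
    vessel attitude, then translate by the vessel position."""
    p = [point_body[i] + lever_arm[i] for i in range(3)]
    rotated = [sum(attitude[r][c] * p[c] for c in range(3)) for r in range(3)]
    return tuple(rotated[i] + vessel_pos[i] for i in range(3))
```

With the same transform applied to lidar and multibeam returns (each with its own lever arm), both point sets land in one georeferenced point cloud.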
Object recognition and tagging
The team will need to develop machine-learning software that can recognize objects in the dataset and automatically tag them with a likely object identification tag such as “Subsea Wreck”, “Car” or “Bridge”. This metadata needs to be made available and presented with the 3D rendered dataset.
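However the detection model itself is built, its output must end up as metadata the viewer can render next to the point cloud. A minimal sketch of such a tag record, assuming a hypothetical detector that emits a label, a confidence score and a bounding box in the survey frame:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ObjectTag:
    label: str          # e.g. "Subsea Wreck", "Car", "Bridge"
    confidence: float   # detector score in [0, 1]
    bbox: tuple         # (x_min, y_min, z_min, x_max, y_max, z_max), survey frame

def tag_metadata(detections, min_confidence=0.5):
    """Drop low-confidence detections and serialise the rest as JSON
    so the 3D viewer can attach tags to regions of the point cloud."""
    kept = [asdict(d) for d in detections if d.confidence >= min_confidence]
    return json.dumps({"tags": kept}, indent=2)
```

The threshold, field names and JSON layout here are assumptions for illustration; the actual schema would be whatever the chosen viewer expects.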
The team will need to collect, analyse and present the data in Kongsberg’s Mapping Cloud solution as a 3D rendered point cloud, with object-detection tags and a simple user interface.
Example from another ongoing Kongsberg project with online presentation: https://friskoslofjord.kognif.ai/proteus/projects/string