Our Mission

Develop innovative artificial intelligence and machine learning methods to improve positioning technologies and environment perception systems supporting the development of driver assistance systems for buses.


Our Objectives

Produce a TRL-7 prototype that supports advanced driver assistance systems (ADAS) and provides the following features:

Accurate and Robust

A highly precise and robust system for position and attitude determination, utilising a variety of sensors such as GNSS, Galileo differentiators (high precision and OSNMA services), inertial sensors, an odometer, lidar and cameras.

Environment Perception

3D bounding boxes for surrounding objects to provide detailed information on object dynamics and improve safety measures.

Situational Awareness

Recognition and classification of traffic lights to improve situational awareness and decision making.

Geo-referenced Maps

A mapping system that uses cameras and lidars to create 3D maps (based on point clouds) and RGB maps. The 3D maps support re-localisation, and the RGB maps support lane detection.

Technologies and Solutions

A key point of this project is that AI/ML techniques for GNSS, inertial and visual/lidar sensors are combined in a well-balanced way. The advantages of each sensor are thereby exploited and its limitations mitigated, resulting in an accurate, highly available and resilient overall solution.

Combining multiple complementary sensing modalities covers the shortcomings of individual sensors. This is particularly critical in autonomous driving, where the failure of one sensor can have lethal or costly consequences if it is not covered by a redundant sensor.
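As an illustration of why complementary sensors help, the following minimal sketch fuses two independent position estimates by inverse-variance weighting, a basic building block of sensor fusion. The sensor values and variances are hypothetical, not project data, and a real system would use a full filter (e.g. a Kalman filter) over all states.

```python
import numpy as np

def fuse_estimates(positions, variances):
    """Fuse independent estimates of the same quantity by inverse-variance weighting.

    positions: estimates of one coordinate (metres), e.g. from GNSS and lidar matching
    variances: corresponding error variances (m^2)
    Returns the fused estimate and its variance.
    """
    w = 1.0 / np.asarray(variances, dtype=float)          # weight = 1 / variance
    fused = float(np.sum(w * np.asarray(positions, dtype=float)) / np.sum(w))
    fused_var = float(1.0 / np.sum(w))                    # never worse than the best sensor
    return fused, fused_var

# Hypothetical example: a noisy GNSS fix and a precise lidar map-matching fix
fused, var = fuse_estimates([10.4, 10.05], [4.0, 0.25])
```

The fused variance is always at most that of the most precise contributing sensor, which is the formal sense in which redundant sensors mitigate each other's limitations.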

Precise Mapping

LIDAR and Visual Absolute Positioning

Moving Objects Detection and Removal

3D Object Detection

Spoofing Detection

AI-Based IAR

AI for Detecting Strong MP and NLOS Signals

Calibration and Denoising of an IMU with AI

Learning of scenario-based measurement weights

Target Application

Driving aids refer to a variety of technologies and systems implemented in modern vehicles to improve driving safety, prevent accidents and increase driving comfort. These technologies can monitor various aspects of driving behaviour, such as speed, vehicle position, the environment and vehicle dynamics.

This project focuses on safety-critical applications that require both high-precision positioning and environment perception features, either from the sensors on board the ego vehicle or from the information obtained from a geo-referenced map.

Red Light Warning​

To determine whether a Red-Light Warning alert should be issued, a traffic-light detection and classification module will be developed. In the event of a red light, the system will assess the need to issue a warning based on the bus position (distance to the traffic light) and velocity, as well as the type of vehicle – information obtained from configuration.
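The decision logic described above can be sketched as a simple stopping-distance check. The deceleration and reaction-time values below are illustrative assumptions, not project parameters; in practice they would come from the vehicle-type configuration mentioned in the text.

```python
def red_light_warning(distance_to_light_m, speed_mps,
                      decel_mps2=2.0, reaction_s=1.5):
    """Hypothetical red-light warning check.

    Warn when the bus cannot comfortably stop before the stop line, given its
    current speed, an assumed comfortable deceleration and a driver reaction time.
    """
    stopping_distance = (speed_mps * reaction_s                    # distance during reaction
                         + speed_mps ** 2 / (2.0 * decel_mps2))    # braking distance
    return stopping_distance >= distance_to_light_m

# A bus at 12 m/s (about 43 km/h) needs ~54 m to stop with these assumptions,
# so it would be warned 50 m from a red light but not 100 m away.
```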

Curve Speed Warning​

The curve speed warning will rely on the highly accurate and reliable position and velocity provided by the sensor fusion positioning system and on the geo-referenced map.
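One common way to realise such a warning, shown here as a hedged sketch rather than the project's actual design, is to bound the lateral acceleration in the upcoming curve: with curve radius R from the map and a comfort limit a_lat, the safe speed is sqrt(a_lat * R). The comfort limit below is an assumed value.

```python
import math

def curve_speed_warning(speed_mps, curve_radius_m, a_lat_max=2.0):
    """Warn when the current speed exceeds the comfortable speed for a curve.

    speed_mps:      current speed from the positioning system
    curve_radius_m: radius of the upcoming curve, e.g. from the geo-referenced map
    a_lat_max:      assumed comfortable lateral acceleration limit (m/s^2)
    """
    v_max = math.sqrt(a_lat_max * curve_radius_m)  # v^2 / R <= a_lat_max
    return speed_mps > v_max
```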

Wrong-Way Driving

The Wrong-Way Driving Detection Warning (WWDW) will leverage the precise attitude information provided by the sensor fusion positioning system and the geo-referenced maps – note that the project-generated maps will contain information about the correct direction of travel, based on the bus attitude recorded at map data recording time.
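A minimal sketch of this comparison, under the assumption that both the bus heading and the mapped lane heading are available in degrees: flag wrong-way driving when the bus heading opposes the lane heading by more than a threshold. The threshold value is an illustrative assumption.

```python
def heading_diff_deg(a, b):
    """Smallest absolute difference between two compass headings, in degrees."""
    d = (a - b) % 360.0
    return min(d, 360.0 - d)

def wrong_way(bus_heading_deg, lane_heading_deg, threshold_deg=120.0):
    """Flag wrong-way driving when the bus heading (from the attitude solution)
    opposes the lane heading recorded in the geo-referenced map."""
    return heading_diff_deg(bus_heading_deg, lane_heading_deg) > threshold_deg
```

The modular difference handles the 0/360 wrap-around, so a bus heading 350 degrees in a lane mapped at 10 degrees is correctly treated as only 20 degrees off.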

Collision Avoidance​

In the Collision Avoidance system, the motion of the surrounding objects will be estimated based on the 3D bounding boxes algorithm.

With this information and the bus position, velocity and attitude, a collision detector system will be developed. In the event of a collision risk, a warning alert will be prompted according to the format defined at specification time.
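As an illustration of such a detector, and not the project's actual algorithm, the sketch below assumes constant-velocity motion for the bus and for an object tracked via its 3D bounding box, and flags a risk when the predicted minimum separation falls below a safety radius. All parameter values are hypothetical.

```python
import numpy as np

def time_of_closest_approach(p_rel, v_rel):
    """Time at which two constant-velocity objects are closest (clamped to >= 0)."""
    denom = float(np.dot(v_rel, v_rel))
    if denom < 1e-9:                       # no relative motion: closest now
        return 0.0
    return max(0.0, -float(np.dot(p_rel, v_rel)) / denom)

def collision_risk(p_bus, v_bus, p_obj, v_obj, safety_radius_m=2.0):
    """Flag a collision risk from bus and object positions/velocities (2D, metres, m/s)."""
    p_rel = np.asarray(p_obj, dtype=float) - np.asarray(p_bus, dtype=float)
    v_rel = np.asarray(v_obj, dtype=float) - np.asarray(v_bus, dtype=float)
    t = time_of_closest_approach(p_rel, v_rel)
    min_dist = float(np.linalg.norm(p_rel + t * v_rel))
    return min_dist < safety_radius_m

# A stationary object 50 m ahead of a bus moving at 10 m/s is on a collision
# course; the same object offset 10 m laterally is not.
```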