
Autonomous Driving System

Our unique approach to autonomous vehicle technology, combining remote driving (teleoperation) and our autonomous driving system (ADS), has allowed us to offer commercial services to key players in retail and logistics on public roads for over three years.

An Autonomous Driving System you can have confidence in

We monitor our robots and gather data daily, constantly improving and learning from real traffic situations and feedback from our test team.

Clevon's Level 4 Autonomous System on Public Roads

How our Autonomous Driving System works

Mapping & Routing

Utilizing our teleoperation technology, we collect data and premap cities before deploying our Autonomous Driving mode. Then, coupling our recorded data with public API data, our routing algorithm finds the optimal path and the robot begins its journey.
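The routing step above can be sketched as a shortest-path search over a premapped road graph. This is a minimal illustration, not Clevon's actual algorithm; the node names and edge costs are invented for the example.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's shortest path over a premapped road graph.

    graph: dict mapping node -> list of (neighbor, cost) edges, where the
    cost could combine distance with recorded travel data.
    Returns (total_cost, path), or (inf, []) if the goal is unreachable.
    """
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return float("inf"), []

# Illustrative premapped intersections with edge costs (e.g. metres).
city = {
    "depot":    [("corner_a", 120), ("corner_b", 300)],
    "corner_a": [("corner_b", 100), ("customer", 400)],
    "corner_b": [("customer", 150)],
}
cost, path = shortest_route(city, "depot", "customer")
```

In practice the edge costs would be derived from the premapping data and public API data mentioned above, but the search structure stays the same.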

Self-Perception

Using the robot's pre-defined model and characteristics, together with its GNSS and IMU sensors, we globally localize it and estimate its current state.
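One common way to blend a motion model with a global fix, shown here as a one-dimensional complementary filter, is a simplified stand-in for whatever estimator the real system uses; the gain value is illustrative.

```python
def fuse_position(prev_pos, velocity, dt, gnss_pos, gnss_weight=0.2):
    """One step of a complementary filter: dead-reckon using the
    IMU-derived velocity, then blend in the GNSS fix.

    Setting gnss_weight toward 0 trusts the motion model more, which is
    the regime a robot falls back to when the GNSS signal degrades.
    """
    predicted = prev_pos + velocity * dt   # motion-model prediction
    return (1 - gnss_weight) * predicted + gnss_weight * gnss_pos

# Robot believed to be at 10.0 m, moving at 2 m/s; after 0.5 s a GNSS
# fix reports 11.5 m.
pos = fuse_position(10.0, 2.0, 0.5, 11.5)
```

A production estimator would track full 2D/3D pose and covariance (e.g. a Kalman filter), but the predict-then-correct structure is the same.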

Environmental Perception

Utilizing multiple deep neural networks, fusing our camera and radar information, we detect and identify the dynamic environment.
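A toy version of camera–radar fusion: after the neural networks produce labeled camera detections, each one is associated with the radar return at the closest bearing so the fused object carries both an identity and a depth/velocity estimate. The data layout and threshold are assumptions for the sketch, not Clevon's interface.

```python
def fuse_detections(camera_objs, radar_returns, max_bearing_diff=3.0):
    """Match each camera detection (label, bearing in degrees) to the
    nearest radar return (bearing, range_m, velocity_mps) within a
    bearing tolerance, yielding fused objects with label + depth + speed."""
    fused = []
    for label, cam_bearing in camera_objs:
        best = None
        for rad_bearing, range_m, vel in radar_returns:
            diff = abs(cam_bearing - rad_bearing)
            if diff <= max_bearing_diff and (best is None or diff < best[0]):
                best = (diff, range_m, vel)
        if best is not None:
            fused.append({"label": label, "range_m": best[1], "velocity_mps": best[2]})
    return fused

camera = [("pedestrian", 10.0), ("car", -25.0)]
radar = [(9.0, 12.5, 1.2), (-24.0, 30.0, -8.0), (50.0, 5.0, 0.0)]
objects = fuse_detections(camera, radar)
```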

Prediction & Planning

Based on the collected data, the current situation, and traffic laws, the robot predicts the movement of the surrounding objects. Then, taking the predictions into account, a collision-free and lawful plan that includes the robot's target trajectory and speed is generated.
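The predict-then-check idea can be illustrated with a constant-velocity motion model and a clearance test against a candidate ego trajectory. Real planners use far richer prediction models; every number here is illustrative.

```python
import math

def predict(position, velocity, horizon, dt):
    """Constant-velocity prediction of an object's future (x, y) positions."""
    steps = int(horizon / dt)
    return [(position[0] + velocity[0] * dt * k,
             position[1] + velocity[1] * dt * k) for k in range(1, steps + 1)]

def plan_is_collision_free(ego_path, obstacles, horizon=2.0, dt=0.5, clearance=1.5):
    """Reject a candidate trajectory if any predicted obstacle position
    comes within `clearance` metres of the ego position at the same step."""
    for pos, vel in obstacles:
        for ego, obs in zip(ego_path, predict(pos, vel, horizon, dt)):
            if math.hypot(ego[0] - obs[0], ego[1] - obs[1]) < clearance:
                return False
    return True

# Ego moving along the x-axis; a pedestrian crosses its path from the side.
ego = [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (4.0, 0.0)]
crossing = [((3.0, 2.0), (0.0, -1.0))]
safe = plan_is_collision_free(ego, crossing)
clear = plan_is_collision_free(ego, [((30.0, 30.0), (0.0, 0.0))])
```

If the check fails, a planner would generate an alternative trajectory or, as described here, bring the robot to a stop.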

Control

Based on the robot's received information and the newly formulated plan, the acceleration and steering angle are calculated and executed.
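A minimal control step, assuming a proportional speed controller and pure-pursuit steering toward the next waypoint of the plan. The wheelbase and gain are placeholder values, not Clevon's tuning.

```python
import math

def control_step(current_speed, target_speed, heading, target_point,
                 wheelbase=1.6, k_accel=0.5):
    """Compute an acceleration command (P-controller on speed) and a
    steering angle (pure-pursuit geometry toward a lookahead point
    given in the robot's local frame)."""
    accel = k_accel * (target_speed - current_speed)
    dx, dy = target_point
    # Angle from the robot's current heading to the lookahead point.
    alpha = math.atan2(dy, dx) - heading
    lookahead = math.hypot(dx, dy)
    steering = math.atan2(2.0 * wheelbase * math.sin(alpha), lookahead)
    return accel, steering

# Waypoint 5 m straight ahead; speed up from 2 m/s toward 4 m/s.
accel, steering = control_step(2.0, 4.0, 0.0, (5.0, 0.0))
```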

The Clevon Sensor Suite

Camera & Radar

Our camera system provides a high-resolution 360° field of view around the robot. The front radar enriches this view by adding depth and velocity. Along with our radar, image object detection gives the robot carrier perception of its surrounding environment to ensure collision avoidance. Our software is adaptable to multiple radar and camera setups.

GNSS & IMU

GNSS and RTK technology gives our robot centimeter-precision positioning. An IMU provides the vehicle with high-frequency acceleration and rotation speed data, resulting in a layer of redundancy and data density for the robot's self-estimation algorithm, which handles GNSS signal blackouts.
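Handling a GNSS blackout amounts to dead reckoning: integrating the high-frequency IMU data forward from the last good fix. A planar sketch under that assumption (the real estimator would also correct for bias and drift):

```python
def dead_reckon(last_fix, velocity, imu_samples, dt):
    """Integrate IMU accelerations from the last good GNSS fix so the
    position estimate keeps updating through a signal blackout.

    last_fix: (x, y) in metres; velocity: (vx, vy) in m/s;
    imu_samples: list of (ax, ay) accelerations, one per interval dt.
    """
    x, y = last_fix
    vx, vy = velocity
    for ax, ay in imu_samples:
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return (x, y), (vx, vy)

# Coasting at 1 m/s for two 1-second intervals with no acceleration.
pos, vel = dead_reckon((0.0, 0.0), (1.0, 0.0), [(0.0, 0.0), (0.0, 0.0)], 1.0)
```

Because integration error grows over time, dead reckoning only bridges short outages; the GNSS/RTK fix re-anchors the estimate once the signal returns.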

Frequently Asked Questions

How does our approach for autonomy differ from others?

Our approach to autonomous driving is gradual and methodical. First, we developed and polished our teleoperation functionality, so the ARC can safely operate autonomously on public roads while always having a fallback. Then, using that real traffic experience, we gradually raise the autonomy level and increase the number of vehicles one teleoperator can manage at once.

What level of autonomy have we achieved?

Our autonomous robot carriers can traverse mapped areas by themselves under the full supervision of a teleoperator. The robot stops when there is a danger of collision and notifies the operator to take over the steering when there is a situation it cannot handle on its own.
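The supervision policy described above can be condensed into a small decision function. This is an illustrative sketch of the logic, not Clevon's actual safety software, and the input flags are assumptions for the example.

```python
def supervision_action(collision_risk, can_handle):
    """Decision sketch for supervised autonomous operation:
    stop on collision danger, request a teleoperator takeover when the
    situation is outside the robot's capabilities, otherwise continue
    driving autonomously."""
    if collision_risk:
        return "stop"
    if not can_handle:
        return "request_takeover"
    return "continue_autonomous"
```

The ordering matters: stopping takes priority over a takeover request, so the robot is always in a safe state while waiting for the operator.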

What types of objects can our system detect?

Our perception system is under continuous development, and together with the evolution of technology and science, the performance and key parameters of the system will get even better over time. Our robot can perceive all pre-mapped static objects, such as drivable space, roads, lanes, buildings, traffic lights and signs, as well as the dynamic environment: pedestrians, vehicles, and animals.

Does our sensor suite include lidar?

No. The current sensor suite covers the majority of our present needs. High-performance lidars are costly and sensitive to different weather conditions. We aim to offer businesses an affordable delivery solution, so we keep manufacturing costs low while still providing safety and keeping the vehicles operational in all weather conditions year-round.

Is CLEVON 1 the right fit for your company?