
Selfie autonomous car

Selfie is a student project developing autonomous cars. The vehicles, based on 1:10 scale RC cars, are customized to operate autonomously in simulated road environments. They are equipped with a camera, a computer vision computing unit, a controller, and a set of sensors such as magnetic encoders, distance sensors and an IMU.

This page contains only basic information; more details will hopefully appear in finite time :)

Version 1.0 - Selfie

Software released as version 1.0 ran on the first ready-to-go robot, Selfie. It was prepared specifically for the Carolo-Cup 2018 competition and fulfills all the competition requirements, so it is able to perform:

  • road lane tracking,
  • collision avoidance,
  • overtaking maneuvers,
  • parallel parking,
  • intersection handling,
  • proper light signaling.

Version 2.0 - Selfie 2.0

Software released as version 2.0 was deployed on the brand new vehicle built for the International Autonomous Robot Racing Competition (IARRC) 2018 in Waterloo, Canada. Selfie won first prize, coming out on top in both the Drag and Circuit Races. The vehicle was able to reach a speed of 12 m/s in a stable Drag Race run. The picture shows the car under development (without bodywork).

The IARRC competition consists of a project presentation part (written report, video, oral presentation) and a racing part. The racing events are run outdoors, on a track marked with white and yellow lines and cones. In the Drag Race, cars compete on 60 meters of straight track. In the Circuit Race, cars have to drive about 1000 meters (3 laps of about 330 meters each).

The main challenges to overcome were start-up light detection, high-speed line detection and trajectory correction, collision avoidance, and varying luminance conditions (areas in full sunlight, fully shadowed, or partially shadowed).
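To give an idea of what line detection under varying lighting can look like, below is a minimal, illustrative sketch using OpenCV. It is not the Selfie implementation; the choice of adaptive thresholding plus a probabilistic Hough transform and all parameter values are assumptions made for this example.

```python
# Illustrative sketch only, not the Selfie vision pipeline.
import cv2
import numpy as np

def detect_track_lines(bgr_frame):
    """Return candidate line segments (x1, y1, x2, y2) found in one camera frame."""
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # Adaptive thresholding computes the threshold per local neighbourhood,
    # which helps when parts of the track are sunlit and parts are shadowed.
    binary = cv2.adaptiveThreshold(blurred, 255,
                                   cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                   cv2.THRESH_BINARY, 31, -10)
    edges = cv2.Canny(binary, 50, 150)
    # Probabilistic Hough transform turns edge pixels into line segments.
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 40,
                            minLineLength=30, maxLineGap=10)
    return [] if lines is None else [l[0] for l in lines]
```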

Version 3.0 - Selfie 3.0

Preparations for the F1/10 competition in Fall 2018 resulted in the creation not only of the next vehicle, but also of a completely new software stack, this time based on the ROS platform.

The goal of the competition was to design a system capable of autonomously driving around a race track in an indoor setup. The resulting solution navigated using LIDAR and gyroscope+encoder based odometry in combination with a previously generated map, performing real-time localization with a particle filter algorithm. It also provided the ability to do the first drive based on LIDAR only, building the map with a SLAM approach.
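For readers unfamiliar with particle filter localization, the sketch below shows the basic predict-weight-resample loop described above. It is a simplified placeholder, not the Selfie 3.0 code: the map representation, the motion noise values and the LIDAR likelihood function (`likelihood_fn`) are assumptions made for illustration.

```python
# Minimal particle filter localization loop (illustrative only).
import numpy as np

def motion_update(particles, odom_delta, noise=(0.02, 0.02, 0.01)):
    """Move every particle (x, y, theta) by the odometry increment plus noise."""
    dx, dy, dtheta = odom_delta
    n = len(particles)
    cos_t, sin_t = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += cos_t * dx - sin_t * dy + np.random.normal(0, noise[0], n)
    particles[:, 1] += sin_t * dx + cos_t * dy + np.random.normal(0, noise[1], n)
    particles[:, 2] += dtheta + np.random.normal(0, noise[2], n)
    return particles

def measurement_update(particles, scan, likelihood_fn):
    """Weight each particle by how well the LIDAR scan matches the map from its pose."""
    weights = np.array([likelihood_fn(p, scan) for p in particles])
    weights += 1e-12                      # avoid an all-zero weight vector
    return weights / weights.sum()

def resample(particles, weights):
    """Draw a new particle set, keeping particles in proportion to their weights."""
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx].copy()
```

In the real system the same loop runs at sensor rate: odometry drives the motion update, each incoming LIDAR scan drives the measurement update, and resampling keeps the particle set concentrated around the likely pose on the pre-built map.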
