Information on individual educational components (ECTS-Course descriptions) per semester

  
Degree programme: Master Computer Science
Type of degree: FH Master's Degree Programme
Full-time
Summer Semester 2021
  

Course unit title: Autonomous Systems: Perception
Course unit code: 024912020501
Language of instruction: English
Type of course unit (compulsory, optional): Compulsory optional
Semester when the course unit is delivered: Summer Semester 2021
Teaching hours per week: 4
Year of study: 2021
Level of course unit (e.g. first, second or third cycle): Second Cycle (Master)
Number of ECTS credits allocated: 6
Name of lecturer(s): Sebastian HEGENBART, Ralph HOCH, Robert MERZ


Prerequisites and co-requisites

024912010501 Autonomous Robots: Motion
Fundamentals of probability theory

Course content

Part 1: Localization

  • Sources of uncertainty: sensor noise including dead reckoning, aliasing and ambient noise, and their models
  • Probabilistic, map-based localization, belief representations and the Markov assumption
  • Kalman filter localization
  • Beacon systems
  • SLAM: autonomous mapping, EKF SLAM and graph-based SLAM
  • Path planning: configuration space, Voronoi diagrams, graph search
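As an illustration of the Kalman filter localization topic, the following is a minimal one-dimensional sketch (an invented example with assumed noise values, not part of the course materials): odometry predicts the robot's position and a noisy beacon measurement corrects it.

```python
# Minimal 1-D Kalman filter localization sketch (illustrative only):
# odometry (dead reckoning) predicts the position, a noisy range
# beacon corrects it. Noise variances q and r are assumed values.

def kalman_step(x, p, u, z, q=0.02, r=0.5):
    """One predict/update cycle.
    x, p: position estimate and its variance
    u:    odometry motion since the last step
    z:    beacon measurement of the absolute position
    """
    # Predict: dead reckoning grows the uncertainty.
    x_pred = x + u
    p_pred = p + q
    # Update: blend prediction and measurement by their confidence.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (z - x_pred)  # innovation-weighted correction
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                 # start: position 0, high uncertainty
for u, z in [(1.0, 1.1), (1.0, 2.0), (1.0, 2.9)]:
    x, p = kalman_step(x, p, u, z)
# After three steps the estimate is near 3 and the variance has shrunk.
```

The shrinking variance p is exactly the "belief" narrowing that the course covers for probabilistic localization.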

Part 2: Perception

  • Sensors for mobile robots: physical models, quantitative characteristics, importance for autonomous mobile systems
  • Aspects of sensor data processing relevant to the operating system and software architecture: real-time behaviour, interrupt handling, representation and processing of uncertainty
  • Image acquisition: perspective transformation, camera matrix, camera calibration
  • Data pre-processing: cleanup, reduction, feature extraction
  • Image processing: filter categories, edge detection, blob detection, detection of simple geometric figures
  • Classification and clustering procedures, bag-of-features

Students work on a continuous (with respect to sensors and data processing) and incremental task, starting with simple sensor technology (distance measurement with infrared, a camera viewing geometric objects in homogeneous colours) in a simulated environment and later also in a real environment.
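The edge detection topic can be illustrated with a small sketch (an invented example, not part of the course materials) that computes a Sobel gradient magnitude on a synthetic image using plain NumPy:

```python
# Sobel edge detection sketch (illustrative only): convolve 3x3
# gradient kernels over the valid region of a tiny synthetic image.
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude via 3x3 Sobel kernels (valid region only)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T  # vertical-gradient kernel is the transpose
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx = np.sum(patch * kx)   # horizontal gradient
            gy = np.sum(patch * ky)   # vertical gradient
            out[i, j] = np.hypot(gx, gy)
    return out

# A 6x6 image: dark left half, bright right half -> one vertical edge.
img = np.zeros((6, 6))
img[:, 3:] = 1.0
edges = sobel_magnitude(img)
# Responses are nonzero only in the two columns straddling the edge.
```

Libraries such as OpenCV (listed under recommended reading) provide optimized versions of this and the other filters named above.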

Learning outcomes

Students are able to

  • describe the possible sources of uncertainty of a localization and their corresponding models and explain the types of representations of a "belief" of an autonomous robot.
  • explain the possibilities for presenting the surroundings (maps).
  • classify the concept of the Kalman filter.
  • reflect on the definition of SLAM and its solution techniques (Extended Kalman Filter SLAM, graph-based SLAM).
  • name important path planning techniques.
  • describe and classify the properties of the most important sensor types of an autonomous mobile system (optical/acoustic physical models) and their quantitative characteristics such as range, accuracy and noise.
  • specify the requirements and techniques for data acquisition using sensors.
  • differentiate between the specific data preprocessing methods of the different types of sensors.
  • explore the image processing methods relevant for autonomous robotics, in particular for feature extraction during localization.
  • enumerate some pattern recognition procedures (classification, clustering).
  • explain the basic problem of the localization of an autonomous robot, its connection with environmental mapping, and the central problem of simultaneous localization and mapping (SLAM).
  • describe the advantages and disadvantages of the different localization approaches (for example, landmark-based, beacon-based).
  • describe the complex interrelations of perception, localization and navigation.
  • explain the differences among sensor modalities and their significance in the tasks of an autonomous, mobile system.
  • classify and apply the basic techniques and concepts of image processing.
  • identify the importance of statistical methods such as classification / clustering in localization and object recognition.
  • apply the abstract concepts (such as Markov assumption, Kalman filter) in solving concrete problems.
  • use the techniques and methods in concrete robot tasks, select and implement the appropriate sensor types and data processing for the respective tasks and evaluate and analyze the corresponding results.
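The Markov assumption and the belief representation named in these outcomes can be illustrated by a grid-based Bayes filter (a hypothetical example with invented door positions and sensor probabilities, not a course exercise):

```python
# Grid-based Markov localization sketch (illustrative only): the belief
# is a discrete probability distribution over cells of a circular 1-D
# corridor; doors are landmarks the robot can sense.

world = [1, 0, 0, 1, 0]        # 1 = door (landmark), 0 = wall
belief = [0.2] * 5             # uniform prior: position unknown

def sense(belief, world, z, p_hit=0.8, p_miss=0.2):
    """Bayes update: up-weight cells that match the measurement z."""
    b = [bi * (p_hit if world[i] == z else p_miss)
         for i, bi in enumerate(belief)]
    s = sum(b)
    return [bi / s for bi in b]  # normalize to a probability distribution

def move(belief, step):
    """Motion update under the Markov assumption: the new belief depends
    only on the previous belief and the motion (here an exact shift)."""
    n = len(belief)
    return [belief[(i - step) % n] for i in range(n)]

belief = sense(belief, world, z=1)   # robot sees a door
belief = move(belief, 1)             # robot moves one cell right
belief = sense(belief, world, z=0)   # now it sees a wall
# The belief concentrates on the cells consistent with door-then-wall.
```

Replacing the discrete grid with a Gaussian belief turns this same predict/update cycle into the Kalman filter discussed above.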
Planned learning activities and teaching methods

Integrated lecture:

  • Theory input as a lecture
  • Exercises with a robot simulator
  • Competition with a real system (micro robots)

Assessment methods and criteria

Continuous assessment: assessment of exercises, and presentation and assessment of solutions.

Comment

None

Recommended or required reading
  • Bekey, George A. (2005): Autonomous Robots: From Biological Inspiration to Implementation and Control. Cambridge, Mass.: MIT Press (Intelligent Robotics and Autonomous Agents series).
  • Corke, Peter (2011): Robotics, Vision and Control: Fundamental Algorithms in MATLAB. 1st ed. Berlin: Springer.
  • Gonzalez, Rafael C.; Woods, Richard E. (2007): Digital Image Processing. 3rd ed. Upper Saddle River, N.J.: Pearson.
  • Hartley, Richard; Zisserman, Andrew (2004): Multiple View Geometry in Computer Vision. Cambridge University Press.
  • OpenCV (n.d.): Online: http://opencv.org/ (accessed: 03.11.2016).
  • Siegwart, Roland; Nourbakhsh, Illah Reza; Scaramuzza, Davide (2011): Introduction to Autonomous Mobile Robots. MIT Press.
Mode of delivery (face-to-face, distance learning)

Face-to-face
