University of Liège — Thibaut Cuvelier | Research engineer at Montefiore Institute

Introduction to intelligent robotics (INFO0948), 2016-2017

General information

Lecturers: B. Boigelot, Ph. Latour, A. Lejeune, R. Marée, M. Van Droogenbroeck, L. Wehenkel.

Teaching assistants: M. Baijot (I.83), Th. Cuvelier (I.76b).

Coordinator: L. Wehenkel.

Course specifications.

Tentative schedule for the lessons

Several theoretical lessons make up this course. Their main purpose is to help you carry out the project, but they also give an overview of the field.

# Date Agenda Downloads Supplements
1 9 February 2017 Introduction to the course (ch. 1), L. Wehenkel Slides (last updated 2017-02-09 15:30) Reference book
Positions and Orientations (ch. 2), A. Lejeune Slides Reference book
Peter Corke's Robotics Toolbox: 3D transforms, quaternions, homogeneous coordinates
MATLAB Robotics Toolbox: coordinate system transforms
2 16 February 2017 Time and Motion, Mobile Robots (ch. 3 and 4), B. Boigelot Slides Reference book for Chapter 3 and Chapter 4
Peter Corke's Robotics Toolbox: trajectory interpolation
Simulator presentation, M. Baijot and Th. Cuvelier Slides. Videos:
3 23 February 2017 Navigation (ch. 5), B. Boigelot Slides Reference book
Peter Corke's Robotics Toolbox: path planning, angle difference (angdiff)
MATLAB Robotics Toolbox: path planning and occupancy grids, probabilistic roadmap (PRM), PurePursuit controller
4 2 March 2017 Localization and Kalman Filter (ch. 6), L. Wehenkel Slides Reference book
Peter Corke's Robotics Toolbox: localisation with Kalman filter and particle filter, map of landmarks
MATLAB Robotics Toolbox: particle filter, probabilistic roadmap (PRM), PurePursuit controller
5 9 March 2017 Fitting and Shape Matching (not in the reference book), Ph. Latour Slides To go further
For shape matching: an efficient data structure for working with many points, the k-d tree.
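As a sketch of why the k-d tree helps here (illustrative Python, not part of the course material): it answers the repeated nearest-neighbour queries that shape matching performs when pairing model points with scene points, without scanning every point each time.

```python
# Minimal 2-D k-d tree sketch (illustrative, standard library only).
import math

def build(points, depth=0):
    """Recursively build a k-d tree; each node is (point, left, right)."""
    if not points:
        return None
    axis = depth % 2                      # alternate splitting axis: x, y, x, ...
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build(points[:mid], depth + 1),
            build(points[mid + 1:], depth + 1))

def nearest(node, query, depth=0, best=None):
    """Return the tree point closest to `query` (Euclidean distance)."""
    if node is None:
        return best
    point, left, right = node
    if best is None or math.dist(point, query) < math.dist(best, query):
        best = point
    axis = depth % 2
    near, far = (left, right) if query[axis] < point[axis] else (right, left)
    best = nearest(near, query, depth + 1, best)
    # Only descend into the far subtree if the splitting plane is closer
    # than the best distance found so far (the classic pruning test).
    if abs(query[axis] - point[axis]) < math.dist(best, query):
        best = nearest(far, query, depth + 1, best)
    return best

tree = build([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
print(nearest(tree, (9, 2)))              # → (8, 1)
```

In practice you would use a ready-made implementation (the course pointer above, or MATLAB's knnsearch), but the pruning test is the whole idea: most of the tree is never visited.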

For the project
Peter Corke's Machine Vision Toolbox: MATLAB Computer Vision System Toolbox: MATLAB Statistics and Machine Learning Toolbox:
6 16 March 2017 Q & A for the project, M. Baijot and Th. Cuvelier
23 March 2017 Danger Milestone A1 (details below)

You are expected to produce a 5-minute video of your robot exploring the map and eventually showing the map it has built, with an audio commentary explaining your implementation (why you chose a given path-finding algorithm, how you decide on the next point to explore, rather than which functions you called). Ideally, the video should also show how your robot makes its decisions (for example, show the map being built, the next point to explore, the trajectory).
Your submission must include both your source code and the video (either directly as a file, or as a link to an external website where you host your video; in this case, make sure that we can access the video at any time after your submission).
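To decide on the next point to explore, one common approach (a minimal sketch, assuming you maintain a 2-D occupancy grid; the function name and grid encoding are illustrative, not part of the course code) is frontier-based exploration: head for the nearest known-free cell that borders unexplored space.

```python
# Frontier-based exploration sketch (illustrative, standard library only).
# Grid encoding assumed here: 0 = free, 1 = occupied, -1 = unexplored.
from collections import deque

def next_frontier(grid, start):
    """Breadth-first search from `start` through free cells; return the
    first free cell adjacent to an unexplored cell, or None if the map
    is fully explored."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        if any(0 <= i < rows and 0 <= j < cols and grid[i][j] == -1
               for i, j in neighbours):
            return (r, c)                 # frontier cell: free next to unknown
        for i, j in neighbours:
            if (0 <= i < rows and 0 <= j < cols
                    and grid[i][j] == 0 and (i, j) not in seen):
                seen.add((i, j))
                queue.append((i, j))
    return None                           # nothing left to explore

grid = [[0, 0, -1],
        [0, 1, -1],
        [0, 0,  0]]
print(next_frontier(grid, (0, 0)))        # → (0, 1)
```

Because the search is breadth-first, the returned frontier cell is one of the closest to the robot in grid steps; exploration terminates naturally when no frontier remains.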
7 30 March 2017 Image Processing (ch. 12), M. Van Droogenbroeck Slides To go further
Reducing Errors in Object-Fetching Interactions through Social Feedback: video, article.

For the project
Reference book
MATLAB Image Processing Toolbox:
8 20 April 2017 Feature Extraction, Bag-of-features for Image Classification (ch. 13), R. Marée
The course begins at 9:30
Slides (20 MB) (previous version) Reference book
Peter Corke's Machine Vision Toolbox: MATLAB Computer Vision System Toolbox (high level): MATLAB Computer Vision System Toolbox (low level): MATLAB Statistics and Machine Learning Toolbox:
27 April 2017 Project follow-up: first a brief tour of what each group has achieved since the first deadline (in more depth for groups that had problems), then time for questions
The session begins at 10:00
13 June 2017 Danger Project submission

You are expected to submit:
  • your source code for the whole project.
  • a PDF report (between five and ten pages) explaining which milestones you have implemented, the ideas behind your algorithms, why you think they should work in general for any map that respects the hypotheses of the project, and which ideas you rejected (and why). Basically, everything that you would like to present during your defence should be in your report. If you want to, you may include links to videos.
Note If you want to, you may submit your project ahead of the deadline: such a late deadline does not mean we expect you to work at full steam until mid-June, but rather that you are free to schedule your work on the project so that it does not interfere with your other academic requirements. For students with a heavy exam session, it may be preferable to hand in the project at the end of May, but the choice is up to you.
14 June 2017 Danger Project presentations

The exam will mainly consist of a live demo of your solution on a house that differs from the one provided for training. Be ready to run your code on a laptop, with a different V-REP file. Please also prepare videos showing the key elements of your solution, in case there is not enough time to run a full simulation sequence, as well as two or three slides describing the key elements of your work. The defence will last approximately ten to fifteen minutes per group.
As one examiner resides in the USA, a videoconferencing system will be used, namely Skype. Please install it on your computer beforehand and test screen sharing with your project running in the simulator (only one computer per group is required; make sure well in advance that it is powerful enough, and contact us if none of the group members' computers can handle Skype together with the simulator). Also, be present at least 15 minutes in advance to ensure the examinations go smoothly (testing shared screens, uploading the exam map, etc.).
Hour Group names First group member Second group member
15:00 PitzAmraoui Adrien Pitz Issam Amraoui
15:15 MorelleScarlata clément Morelle
15:30 s101052s150793 Michaël Paquay Bryan Peeters
15:45 Group9999 Enrico Ghidoli
16:00 SWassermann Sarah Wassermann
16:15 VanheeTasset Maxime TASSET Laurent Vanhee
16:30 group42 Florian Peters
16:45 s131529s133011 Joris Sébastien Martin Castin
17:00 PierreAntoine Antoine Germay Pierre Nicolay
17:15 Mastrodicasarauw Rauw Stephane Simon Mastrodicasa
17:30 BoileauWauquaire Quentin Boileau Odile Wauquaire
17:45 paaur Pascal Leroy Aurelien Werenne
18:00 s134961s122239 Florian Merchie quentin Diprima
18:15 DuboisWehenkel Antoine Wehenkel Dubois Antoine
18:30 RaletBricmont Bricmont Jordan ARNAUD RALET
Note The group names, member names, and member orders have been taken from the submission platform.

The chapter numbers follow those of the reference book, Robotics, Vision and Control, first edition (freely available when connected from the university network).

The second edition has been out since 22 June 2017: the numbering may have evolved, but the book is still valuable for this course (it is also freely available when connected from the university network).

The slides that were used in previous years are still available on Renaud Detry's website. (Use the university's wired network, or a password will be required.)

Project

Project statement, list of milestones, installation procedure. The project should be done in groups of two. If you have questions about the project, you can ask any teaching assistant. Submissions must be made on the dedicated platform (all the members of your group must register on the platform so that you can form a group). Deadlines:

  • 23 March: milestone A1 and short presentation of your robot exploring the room to produce a map, details above
  • 13 June: final submission, details above
  • 14 June: final examination, details above

You may find the following software useful during the project (for MATLAB only):

  • Peter Corke's Robotics Toolbox, also on GitHub (however, be cautious: it will not always work as expected); it is automatically installed when you perform the installation steps for the project (when running the script startup_robot.m)
  • MATLAB Robotics Toolbox (available since R2015a, not by default in all MATLAB editions)

A few links more specifically about the simulator and the code you will have to write:

Should you choose not to use MATLAB, here are a few links that you might find useful:

  • the list of V-REP bindings (the same commands are used for all bindings, whatever the programming language)
  • For C++: Robotics Library (for motion planning), OpenCV (for image processing, feature extraction, and machine learning), mlpack (for machine learning)
  • For Octave: Peter Corke's Robotics Toolbox is partially usable with Octave

FAQ

The simulator shows a black screen and gives incorrect position/angle for the robot. What happened?

Usually, when the simulator outputs NaN values, it means that it received invalid inputs itself, typically as velocities.
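A defensive way to avoid feeding NaN values to the simulator (a sketch in Python with an illustrative helper name; the same check is easy to write in MATLAB with isnan/isinf) is to validate every velocity before sending it:

```python
# Guard against sending NaN/infinite velocities to the simulator
# (illustrative helper; adapt the limit to your robot's actual bounds).
import math

def safe_velocity(v, limit=10.0):
    """Replace invalid values by 0 and clamp the rest to [-limit, limit]."""
    if not math.isfinite(v):              # catches NaN and +/- infinity
        return 0.0
    return max(-limit, min(limit, v))

print(safe_velocity(float('nan')))        # → 0.0
print(safe_velocity(25.0))                # → 10.0
print(safe_velocity(-3.5))                # → -3.5
```

NaN typically appears after a division by zero in your own code (for example, normalising a zero-length vector), so a clamp like this at the simulator boundary makes the root cause much easier to find.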

How to update VLFeat?

By default, the script startup_robot downloads an outdated version of VLFeat (0.9.9, while the current one is 0.9.20), which lacks many features (such as an SVM implementation). To update it, download the latest version, including binaries, from the official website (on VLFeat's download page, the file is currently under the link VLFeat 0.9.20 binary package). Extract this archive on your computer (for example, into the directory matlab/rvctools/contrib/vlfeat-0.9.20, next to the bundled version of VLFeat; here, matlab is the directory containing the startup_robot.m file).

Before using VLFeat, you must run a script that sets up the needed paths (much like startup_robot) each time you start MATLAB. If you followed the previous instructions, that script is matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_setup.m.

If you get strange errors when trying to use some functions, you may have to recompile the MEX files of VLFeat. To this end, you need a C compiler that is recognised by MATLAB, such as Visual C++ (included with Visual Studio Community) or MinGW. Once you have a compiler, start the compilation by launching the script matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_compile.m.

With Visual C++, you might still get errors when compiling:

Error using mex
vl_fisher.c
C:\Program Files (x86)\Windows Kits\10\include\10.0.10150.0\ucrt\stdio.h(1925): warning C4005:
'snprintf': macro redefinition

C:\Robotics\matlab\rvctools\contrib\vlfeat-0.9.20\vl\host.h(315): note: see previous
definition of 'snprintf'

C:\Program Files (x86)\Windows Kits\10\include\10.0.10150.0\ucrt\stdio.h(1927): fatal error
C1189: #error: Macro definition of snprintf conflicts with Standard Library function declaration

Error in vl_compile (line 140)
mex(cmd{:}) ;

In this case, edit the file matlab/rvctools/contrib/vlfeat-0.9.20/vl/host.h and comment out lines 315 and 335 (they look like # define snprintf _snprintf). (Another solution is to use the master branch of VLFeat.) Then rerun matlab/rvctools/contrib/vlfeat-0.9.20/toolbox/vl_compile.m.

How to use the simulator?

For the project in this course, you will be asked to use the simulator V-REP. It emulates a complete robot (the youBot) evolving in its environment: it can move around, position its arm, grasp objects, and take pictures within the simulator.

For the installation, please follow the TRS tutorial. It is highly recommended to use MATLAB as a programming environment.

When following the tutorial, if you have an error executing the statement binding_test() (step 3.2), make sure you have started the simulation in V-REP.

In order to record videos from V-REP, you can use the option Tools > Video recorder; however, the video will not contain any MATLAB overlay. To record your voice, you may have to use separate video-editing software, such as Kdenlive. Other tools can capture your screen (including MATLAB windows), such as OBS Studio (free and open-source software) or Camtasia (the trial version is sufficient).

Last modification: July 08 2017 21:02:56.