Thesis Projects starting 2021 S1 - Supervisor: Donald Dansereau, Australian Centre for Field Robotics

 
These projects center on using novel visual sensing to help robots see and do.
Contact: donald.dansereau@sydney.edu.au

New Imaging Technologies for Underwater Imaging

This project will investigate new imaging technologies for seeing through murky water. You will have
the opportunity to work with one or more cutting-edge approaches including event cameras, burst
photography, light field cameras, structured light cameras, speckle-projection RGBD and time of
flight cameras.

A key requirement when developing new imaging hardware is being able to measure performance in
a repeatable, quantifiable way while realistically replicating challenging real-world conditions. This
work will include establishing techniques for simulating underwater imaging conditions in the lab,
including one or more of low light, murky water, sea snow, and interference from veiling sunlight.
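As a starting point, a lab simulation of murky water can be sketched with a simple attenuation-plus-backscatter image-formation model; the function name and coefficients below are illustrative, not measured values:

```python
import numpy as np

def simulate_murky_water(clear, depth_m, beta=0.4, backscatter=0.2):
    """Simplified underwater image-formation model:
    I = J * exp(-beta * d) + B * (1 - exp(-beta * d)),
    where J is the clear image, d the water path length in metres,
    beta the attenuation coefficient, and B the veiling-light level."""
    transmission = np.exp(-beta * depth_m)
    return clear * transmission + backscatter * (1.0 - transmission)

# A toy 2x2 "image" viewed through 3 m of water: bright pixels dim,
# dark pixels are lifted by backscattered veiling light.
clear = np.array([[1.0, 0.5], [0.25, 0.0]])
murky = simulate_murky_water(clear, depth_m=3.0)
```

Richer simulations add wavelength-dependent attenuation and particulate "sea snow", but even this model reproduces the contrast loss that makes murky-water imaging hard.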

Desired skills, depending on focus:
    ●   Image processing in C++, Python, or Matlab
    ●   Lighting and imaging
    ●   Knowledge of, or a desire to learn about, the optical properties of water
Lunar Rover: Seeing in 3D on the Moon

This project works towards building robots that can see in 3D to navigate on the moon. Hard
shadows, retroreflective regolith, and severely limited compute, mass and volume budgets all add
up to a very challenging scenario.

This project will evaluate and adapt cutting-edge 3D sensing technologies to work on the moon.
Candidate technologies include event cameras, speckle projection stereo, time of flight, and light
field imaging. The work will include establishing physical test scenarios that feature physically
realistic illumination, and the use of analogue materials that reproduce the optical properties of
lunar regolith. There is also an opportunity to work with computer simulations of the lunar
environment.
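For intuition on why hard lunar shadows are challenging for technologies like speckle projection stereo, the core triangulation step can be sketched as follows (focal length and baseline are hypothetical):

```python
import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Pinhole stereo triangulation: Z = f * B / d.
    Pixels with no match (e.g. in a hard shadow where the
    projected speckle pattern is invisible) have zero disparity
    and are mapped to an invalid depth of +inf."""
    disparity = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(disparity, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# 700-pixel focal length, 10 cm baseline; the last pixel is shadowed.
depth = disparity_to_depth([35.0, 7.0, 0.0], focal_px=700.0, baseline_m=0.10)
```

The inverse relationship between disparity and depth also shows why small rovers (short baselines) lose range accuracy quickly, one of the mass and volume constraints this project must confront.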

Desired skills:
    ●   Image processing in Matlab, Python or C++
    ●   Knowledge of optics, imaging, and RGBD cameras would be an asset
Robotic Camera Calibration

A key step in making cameras work in robotics is calibration. Traditionally this involves human
intervention, moving checkerboards around the space in front of the camera. In this project you will
automate this process and investigate alternative approaches to classic checkerboard calibration.

    ●   Control the robotic arm and camera to reliably and autonomously calibrate the camera
    ●   Evaluate the quality of the calibration on-the-fly and adapt the motion strategy accordingly
    ●   Prototype and evaluate new kinds of calibration targets, leveraging recent developments in
        3D display technology
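The second step, judging calibration quality on-the-fly, usually comes down to reprojection error. A minimal sketch for a pinhole model (intrinsics and corner locations below are made up for illustration) might look like:

```python
import numpy as np

def reprojection_rms(K, points_3d, points_2d):
    """RMS reprojection error of a pinhole calibration.
    K is the 3x3 intrinsic matrix; points_3d are target corners in
    the camera frame (Z > 0); points_2d are detected pixel locations."""
    proj = (K @ points_3d.T).T          # apply intrinsics
    proj = proj[:, :2] / proj[:, 2:3]   # perspective divide
    err = np.linalg.norm(proj - points_2d, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))

# Hypothetical intrinsics and two checkerboard corners that project
# exactly onto their detections, giving zero error.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
pts3 = np.array([[0.1, 0.0, 1.0], [0.0, 0.1, 2.0]])
pts2 = np.array([[400.0, 240.0], [320.0, 280.0]])
rms = reprojection_rms(K, pts3, pts2)
```

An autonomous calibration loop would monitor this error as views accumulate and steer the arm towards poses that reduce it, e.g. views that better cover the image corners.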

There is a potential in this project to contribute to the open-source Light Field Toolbox for Matlab.

Desired skills:
    ●   Signal processing, image processing
    ●   Experience with camera calibration, robotic arms
    ●   C++, Python, or Matlab
A 3D Multispectral Camera

[Figure credit: Behmann et al. 2016]

Hyperspectral sensing gives us rich information about the health of forests, crops, and reefs. In this
work you will build techniques for working with a 3D hyperspectral camera recently developed in the
group. This camera simultaneously measures the 3D shape and spectral signature of objects,
delivering some of the world's first 3D hyperspectral point cloud measurements.

This project will focus on making sense of the new forms of information provided by this camera.
Potential applications include detecting plant health or distinguishing visually similar objects. The
work will involve development of low-level software tools to handle the raw data provided by the
camera, and high-level algorithms for making decisions from the resulting imagery.
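One classic way to turn per-point spectra into a plant-health decision is a vegetation index such as NDVI; the band indices and reflectance values below are hypothetical, since the real camera's band layout would come from its datasheet:

```python
import numpy as np

def ndvi(points_spectra, red_band, nir_band):
    """Per-point NDVI = (NIR - Red) / (NIR + Red) from an (N, B)
    array of spectra attached to an N-point cloud. Healthy
    vegetation reflects strongly in the near infrared, pushing
    its NDVI towards 1; bare ground sits near 0."""
    red = points_spectra[:, red_band]
    nir = points_spectra[:, nir_band]
    return (nir - red) / (nir + red + 1e-12)  # epsilon avoids 0/0

# Two hypothetical points: leafy (low red, high NIR) vs. soil-like (flat).
spectra = np.array([[0.05, 0.08, 0.60],
                    [0.30, 0.32, 0.34]])
scores = ndvi(spectra, red_band=1, nir_band=2)
```

Because each spectrum here is attached to a 3D point, the same score can be painted directly onto the point cloud, e.g. to map stress across a single plant's canopy.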

Desired skills as appropriate to the above:
    ●   Image processing in Matlab, Python or C++
    ●   Experience with RGBD cameras and working with point clouds
    ●   Knowledge of optics would be an asset
Smooth Motion Rendering with Light Fields

The Matrix (1999) popularized the “bullet time” effect, super-slow-motion from a moving
perspective. At the time this required a huge array of cameras and extensive processing. Modern
light field cameras offer the chance to do something similar with only a few cameras. Each light field
camera captures a small array of perspectives, and so as few as two or three can be used to generate
smooth motion trajectories.
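The simplest possible view-interpolation sketch is a cross-fade between two captured sub-aperture views along a virtual camera trajectory. This deliberately ignores parallax, which a real pipeline handles by resampling rays using depth, but it illustrates the dense-viewpoint idea behind bullet time (all names below are illustrative):

```python
import numpy as np

def interpolate_view(view_a, view_b, t):
    """Render an intermediate viewpoint between two sub-aperture
    images by linear blending, t in [0, 1]. A real light field
    renderer would instead resample the 4D light field using
    scene depth so that parallax is reproduced correctly."""
    t = float(np.clip(t, 0.0, 1.0))
    return (1.0 - t) * view_a + t * view_b

# A trajectory of 5 virtual camera positions between two captured views.
a = np.zeros((2, 2))   # stand-ins for real sub-aperture images
b = np.ones((2, 2))
frames = [interpolate_view(a, b, t) for t in np.linspace(0.0, 1.0, 5)]
```

Replacing this blend with depth-aware ray resampling, and registering the light fields so the trajectory is metrically smooth, is exactly where the project's research content lies.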

This project tackles fundamental computer vision problems in light field registration and
interpolation, and has broad applicability, from augmented reality to robotic vision.

There is an opportunity to contribute to an active open-source software project for processing light
fields.

Depending on interest and ability, this project could include:
   ● Building a registration and rendering pipeline to render a smooth bullet-time effect
   ● Building your own light field camera array; freeze dynamic events and explore them using a
      custom rendering pipeline

Desired skills as appropriate to the above:
    ●   Image processing in Matlab, Python or C++
    ●   Graphics, raytracing, image registration, camera calibration
Light Field Imaging for Autonomous Driving

Light field cameras capture a 4D representation that natively handles occlusions and complex optical
properties like transparency and reflectivity. These can see around rain, through fog, and deliver
robust depth information.
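The "seeing around rain" ability comes from synthetic-aperture (shift-and-sum) rendering: each sub-aperture view is shifted in proportion to its position in the array and the results are averaged, so objects on the chosen depth plane stay sharp while nearby occluders blur away. A minimal sketch, with integer-pixel shifts and made-up view offsets:

```python
import numpy as np

def shift_and_sum(views, offsets, slope):
    """Synthetic-aperture refocusing over a list of sub-aperture
    views. Each view is shifted by `slope` pixels per unit of its
    (u, v) offset in the camera array, then all are averaged.
    Content at the depth selected by `slope` aligns and stays sharp;
    off-plane occluders (e.g. raindrops) are averaged away."""
    acc = np.zeros_like(views[0], dtype=float)
    for view, (du, dv) in zip(views, offsets):
        shifted = np.roll(view, (int(round(slope * dv)),
                                 int(round(slope * du))), axis=(0, 1))
        acc += shifted
    return acc / len(views)

# With zero slope the (identical) views already align on the
# background plane, so the output equals each input view.
views = [np.arange(9.0).reshape(3, 3)] * 3
refocused = shift_and_sum(views, offsets=[(-1, 0), (0, 0), (1, 0)], slope=0.0)
```

Production pipelines use sub-pixel interpolation rather than `np.roll`, but the depth-selective averaging is the same principle.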

In this project you’ll work with a cutting-edge camera prototype to develop the algorithms that let
autonomous cars and other robots see better in challenging conditions. There will be opportunities
to contribute to an open-source dataset and open-source light field processing tools. There is also a
chance to develop embedded algorithms that run directly on the camera’s built-in FPGA.

Desired skills:
    ●   Image processing in Matlab, Python or C++
    ●   Knowledge of optics and imaging
    ●   For embedded processing: FPGA and/or embedded programming
Generalised Time of Flight Camera for Interactive Robotic Vision

Time of flight cameras like the Microsoft Kinect use high-speed signal processing to measure the
time it takes light to travel through a scene. Recent work has shown how these devices can be
adapted to other problems like measuring velocity, or imaging through murky water. In this project
you will work with a recently developed, flexible time of flight camera to help robots better
understand their environments.
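For background, a standard continuous-wave time of flight camera recovers depth from four correlation samples of the returned light; the sketch below shows the textbook four-bucket demodulation (the scene distance and modulation frequency are illustrative):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_depth(q0, q1, q2, q3, f_mod_hz):
    """Four-bucket continuous-wave time-of-flight demodulation.
    q0..q3 are correlation samples at 0/90/180/270 degree phase
    offsets; the recovered phase delay maps to distance via
    d = c * phi / (4 * pi * f_mod)."""
    phase = np.arctan2(q1 - q3, q0 - q2) % (2.0 * np.pi)
    return C * phase / (4.0 * np.pi * f_mod_hz)

# Synthesise the four samples for a target 2.5 m away at 20 MHz
# modulation (unambiguous range c / (2 f) is about 7.5 m here).
true_d = 2.5
f = 20e6
phi = 4.0 * np.pi * f * true_d / C
q = [np.cos(phi - k * np.pi / 2.0) for k in range(4)]
d = tof_depth(*q, f_mod_hz=f)
```

A programmable camera generalises this by letting the project redefine the modulation and sampling scheme itself, which is how the same hardware can be repurposed for velocity estimation or imaging through scattering media.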

This project will focus on developing the embedded software needed to drive the programmable
time of flight camera to collect new forms of imagery. It will include some image processing to then
make sense of this imagery in a representative scenario like autonomous driving in dynamic
environments.

Desired skills:
    ●   Image processing in Matlab, Python or C++
    ●   Knowledge of optics and imaging
    ●   FPGA and/or embedded programming