Visual Computing for Multimodal Sensor Data
Classically, Visual Computing is the discipline of computer science that deals with the acquisition, representation, manipulation, analysis, synthesis, and application of visual information, i.e. images and image sequences, in a spatial and temporal context.

This talk extends the concept of Visual Computing by broadening the notion of "visual information" to a "data array acquired by sensors" (usually 2D or 3D), where the sensors include not only classical imaging sensors, but also range sensors and sensors operating at wavelengths outside the visible spectrum, e.g. THz or multispectral infrared sensors.

The talk will give some examples of current research that fit this extended definition of Visual Computing. It will discuss the challenges of dealing with Time-of-Flight range sensing data, synthetic aperture radar (SAR) data acquired from Earth observation satellites, and THz data.
Lecturer:
Prof. Dr. Andreas Kolb
Dates:
06.05.2011, 13:00 c.t., G29-R335
Additional Information:
Lecture website