Sensor Web Technologies Group

Overview

How we learn, work, and play has been forever transformed by the always-on connectivity of the Internet. Given the flood of information now available, the automatic analysis of media in all its forms (including text, image, video, environmental, VR, and 3D) has taken centre stage in computing, because content-based applications such as retrieval, recognition, summarisation and recommendation all depend upon good-quality analysis of content.

The Sensor Web Technologies group focuses on multi-modal analysis and interaction tools that extract and leverage useful information from multimedia data. We target traditional forms of media, such as text, video and environmental data, as well as real-time web sources, social media data, wearable devices, physiological sensors (such as EEG, heart rate and galvanic skin response), IoT devices, and fixed infrastructure in the physical world. Our research seeks to bridge the boundary between the digital and physical worlds, integrate new forms of media, provide fast-reaction services to media events, and gather and curate large, valuable archives of media data. This in turn supports the gathering, organising, indexing, searching, linking and presentation of that data. Our research efforts focus on multimodal media analysis, human interaction analytics, real-world lifelogging and navigating digital archives.

Staff Members

Dr Cathal Gurrin

Dr Gareth Jones

Dr Suzanne Little

Dr Alessandra Mileo

Dr Mark Roantree

Prof. Alan Smeaton

Mr Brian Stone

Affiliated Centres

INSIGHT: Centre for Data Analytics

ISG: Interoperable Systems Group

CDVP: Centre for Digital Video Processing

MESTECH: Marine & Environmental Sensing Technology Hub

Example Research Projects

>>>Insert example research projects here. <<<