Entry Date:
June 7, 2016

Tid'Zam


The Tidmarsh project documents ecological processes in order to understand their spatial and temporal evolution. Its cross-reality component offers user experiences built on digital reconstructions of outdoor environments, driven by data collected from real-time sensor networks. Tid'Zam analyses multi-source audio streams in real time to identify events happening at Tidmarsh, such as bird calls, frog calls, or car noise. Its deep-learning stack provides a Web interface for creating and improving the different classifier units. In addition, its interactive HCI is designed as a training feedback loop between users/experts and the neural networks, improving the knowledge of both the system and its users.
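
To give a rough sense of the kind of pipeline this describes, below is a minimal sketch in Python, not the actual Tid'Zam code: incoming audio chunks are buffered into fixed-length windows, each window is reduced to a log-spectrogram feature vector, and a small neural network scores it against a few sound classes. The class list, window length, and network layout are illustrative assumptions.

    # Minimal sketch of a streaming audio-event classifier (illustrative only).
    # Assumed: sample rate, window length, class names, and model layout.
    import numpy as np
    from scipy.signal import spectrogram
    from tensorflow import keras

    SAMPLE_RATE = 44100                              # Hz, assumed stream sample rate
    WINDOW_SECONDS = 0.5                             # length of each analysis window
    CLASSES = ["bird", "frog", "car", "ambient"]     # hypothetical classifier units

    def features(window: np.ndarray) -> np.ndarray:
        """Log-magnitude spectrogram of one audio window, flattened to a vector."""
        _, _, spec = spectrogram(window, fs=SAMPLE_RATE, nperseg=1024, noverlap=512)
        return np.log1p(spec).flatten()

    def build_model(input_dim: int) -> keras.Model:
        """Small fully connected classifier; one output unit per sound class."""
        model = keras.Sequential([
            keras.layers.Input(shape=(input_dim,)),
            keras.layers.Dense(256, activation="relu"),
            keras.layers.Dense(len(CLASSES), activation="softmax"),
        ])
        model.compile(optimizer="adam", loss="categorical_crossentropy")
        return model

    def analyse_stream(stream, model: keras.Model):
        """Yield (class, confidence) for each full window pulled from an audio stream."""
        window_size = int(SAMPLE_RATE * WINDOW_SECONDS)
        buffer = np.empty(0, dtype=np.float32)
        for chunk in stream:                         # stream = iterable of audio chunks
            buffer = np.concatenate([buffer, chunk])
            while buffer.size >= window_size:
                window, buffer = buffer[:window_size], buffer[window_size:]
                probs = model.predict(features(window)[None, :], verbose=0)[0]
                yield CLASSES[int(np.argmax(probs))], float(np.max(probs))

    if __name__ == "__main__":
        # Fake "stream": one second of noise split into chunks, just to exercise the loop.
        fake = np.random.randn(SAMPLE_RATE).astype(np.float32)
        chunks = np.array_split(fake, 10)
        model = build_model(features(np.zeros(int(SAMPLE_RATE * WINDOW_SECONDS))).size)
        for label, confidence in analyse_stream(iter(chunks), model):
            print(f"{label}: {confidence:.2f}")

In a deployment like the one described above, the per-window predictions would be logged as detected events, and the user/expert feedback loop would supply corrected labels for windows the classifiers got wrong, which can then be folded back into training.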