Example research results:

Compression and Transmission of Vital Signs

Modern wearable devices allow the monitoring of vital parameters such as heart and respiratory rates, electrocardiographic, photo-plethysmographic, and even video signals, and are being massively commercialized in the consumer electronics market. A common issue of wearable technology is that signal processing and transmission are power demanding and, as such, require frequent battery charges. In our research, we consider biometric signal compression as a means to extend the battery life of wearables, while still enabling fine-grained and long-term monitoring applications. We have proposed several algorithms based on different approaches: 1) online motif extraction and pattern identification, 2) online and subject-adaptive dictionaries, and 3) denoising autoencoders. These techniques are compared with other recent algorithms from the literature based on compressive sensing, discrete cosine and wavelet transforms, principal component analysis, and lightweight temporal compression. As we quantify in our performance evaluation, our algorithms reduce the signal size by up to 70 times (dictionary-based) or 100 times (autoencoders), obtain similar reductions in energy demand, and keep the reconstruction error within 4% of the peak-to-peak signal amplitude.
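
To make the evaluation setup concrete, the minimal sketch below illustrates one of the baseline techniques named above (discrete cosine transform compression with coefficient thresholding) together with the reconstruction-error metric expressed as a percentage of the peak-to-peak amplitude. This is a hypothetical illustration, not the authors' code; the window length, number of retained coefficients, and the synthetic test signal are assumptions.

```python
# Hypothetical sketch: DCT-based compression of a biometric signal window,
# with reconstruction error reported relative to the peak-to-peak amplitude.
import numpy as np
from scipy.fft import dct, idct

def compress_window(x, n_keep):
    """Keep the n_keep largest-magnitude DCT coefficients of window x."""
    coeffs = dct(x, norm="ortho")
    idx = np.argsort(np.abs(coeffs))[::-1][:n_keep]  # indices of largest coefficients
    sparse = np.zeros_like(coeffs)
    sparse[idx] = coeffs[idx]
    return sparse  # in practice only idx + values would be stored/transmitted

def reconstruction_error(x, sparse_coeffs):
    """RMS error as a percentage of the peak-to-peak signal amplitude."""
    x_hat = idct(sparse_coeffs, norm="ortho")
    rmse = np.sqrt(np.mean((x - x_hat) ** 2))
    return 100.0 * rmse / (x.max() - x.min())

# Example: a synthetic quasi-periodic signal standing in for a PPG trace.
t = np.linspace(0, 4, 512)
x = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
sparse = compress_window(x, n_keep=32)  # roughly 16x fewer coefficients
print(f"error: {reconstruction_error(x, sparse):.2f}% of peak-to-peak")
```

The same error metric can then be applied to any of the compression schemes under comparison, so that signal-size reduction and reconstruction quality are traded off on a common scale.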

Motion Analysis

We are investigating system identification techniques based on inertial signals from wearable devices, such as smartphones. The goal is to recognize a target user from their way of walking, using the accelerometer and gyroscope (inertial) signals provided by a commercial smartphone worn in the front pocket of the user's trousers. Our design features several innovations, including: a robust and smartphone-orientation-independent walking cycle extraction block, a novel feature extractor based on convolutional neural networks, a one-class support vector machine to classify walking cycles, and the coherent integration of these components into a multi-stage authentication system. To the best of our knowledge, our system is the first to exploit convolutional neural networks as universal feature extractors for gait recognition and to combine classification results from subsequent walking cycles within a multi-stage decision-making framework. Experimental results show the superiority of our approach over state-of-the-art techniques, with misclassification rates (both false negatives and false positives) below 0.15% in fewer than five walking cycles.
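
As a rough illustration of the last two stages of such a pipeline, the sketch below fits a one-class SVM on per-cycle feature vectors and accumulates its scores over consecutive walking cycles before accepting or rejecting the user. It is a simplified stand-in, not the authors' implementation: the CNN feature extraction is assumed to have already produced one vector per cycle, and the SVM hyperparameters, feature dimensionality, and decision threshold are illustrative assumptions.

```python
# Hypothetical sketch: one-class SVM over per-cycle features, with decisions
# accumulated across consecutive walking cycles.
import numpy as np
from sklearn.svm import OneClassSVM

def train_user_model(user_cycle_features):
    """Fit a one-class SVM on the enrolled user's walking-cycle feature vectors."""
    return OneClassSVM(kernel="rbf", nu=0.1, gamma="scale").fit(user_cycle_features)

def multi_cycle_decision(model, cycle_features, threshold=0.0):
    """Average SVM scores over consecutive cycles; accept if the running mean
    of the decision scores ends up above the threshold."""
    scores = model.decision_function(cycle_features)  # one score per walking cycle
    running_mean = np.cumsum(scores) / np.arange(1, len(scores) + 1)
    return running_mean[-1] > threshold, running_mean

# Example with random stand-ins for 64-dimensional CNN feature vectors.
rng = np.random.default_rng(0)
enrolled = rng.normal(0.0, 1.0, size=(200, 64))  # cycles from the target user
model = train_user_model(enrolled)
probe = rng.normal(0.0, 1.0, size=(5, 64))       # five new walking cycles
accepted, trace = multi_cycle_decision(model, probe)
print("accepted" if accepted else "rejected", trace)
```

Accumulating evidence over several cycles is what allows the decision to be postponed until it is confident, which is the intuition behind reaching low error rates within a handful of walking cycles.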

Smartphone gait signals - DATASET

The signals used to design and test IDNet are freely available for download below. Our dataset features accelerometer and gyroscope signals collected from fifty users with a number of different smartphones. Motion traces were acquired with the smartphone worn in the front pocket of the user's trousers. Multiple acquisition sessions were carried out for each user to account for different types of terrain and clothing. The collected traces were anonymized and organized in the “.tar.gz” archive below.
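
A minimal loading sketch is given below for convenience. The per-user directory layout, file extension, and column ordering shown here are assumptions for illustration only and are not a description of the actual archive contents; please refer to the documentation shipped inside the archive.

```python
# Hypothetical sketch for unpacking the archive and reading inertial traces.
import tarfile
from pathlib import Path
import numpy as np

def extract_dataset(archive_path, out_dir="gait_dataset"):
    """Unpack the .tar.gz archive into out_dir and return its root path."""
    with tarfile.open(archive_path, "r:gz") as tar:
        tar.extractall(out_dir)
    return Path(out_dir)

def load_trace(csv_path):
    """Load one trace, assuming comma-separated numeric columns
    (e.g. timestamp, acc_x, acc_y, acc_z, gyr_x, gyr_y, gyr_z)."""
    return np.loadtxt(csv_path, delimiter=",")

# root = extract_dataset("gait_dataset.tar.gz")   # assumed archive name
# traces = [load_trace(p) for p in root.rglob("*.csv")]
```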

For further publications, see our human data analysis papers.
