Displaying 321 - 340 of 1717

A treatment–response index from wearable sensors for quantifying Parkinson's disease motor states

I. Thomas, J. Westin, M. Alam, F. Bergquist, D. Nyholm, M. Senek, M. Memedi

Accelerometry data were collected to identify dyskinesia during hand pronation. After being administered a single levodopa dose, participants performed standardized wrist rotation tests with each hand while wearing a wrist sensor.
Tags: 
Parkinson's disease, Dyskinesia

Collection and Preprocessing of Czech Sign Language Corpus for Sign Language Recognition

P. Campr, M. Hrúz, J. Trojanová

The Czech sign language corpus is intended for training and testing of sign language recognition systems. A dataset of video streams and hand tracking of sign language numbers, days, month names, alphabets, town names and selected signs is collected.
Tags: 
Sign language, Czech Sign Language

Automatic Identification of Mild Cognitive Impairment through the Analysis of Italian Spontaneous Speech Productions

D. Beltrami, L. Calzà, G. Gagliardi, E. Ghidoni, N. Marcello, R. Favretti, F. Tamburin

Linguistic symptoms of cognitive decline in elderly patients are investigated through speech. Participants completed three reading and question-answering tasks; the resulting utterances were transcribed and labeled for features.
Tags: 
Early Dementia, Dementia, Mild Cognitive Impairment

Monitoring Parkinson's Disease Progression Using Behavioural Inferences, Mobile Devices and Web Technologies

J. Vega

A dataset of smartphone usage is collected from patients with Parkinson's disease to monitor the progression of the disease. Smartphone usage data from 2 patients with Parkinson's disease is collected, along with contextual data such as location and weather.
Tags: 
Parkinson's disease

Automatic Readability Assessment

L. Feng

A text corpus annotated with indicators of reading difficulty is collected to train a machine learning algorithm for predicting the reading difficulty of a piece of text. English documents on various topics were collected.
Tags: 
Mild Intellectual Disability

Turn Right: Analysis of Rotation Errors in Turn-by-Turn Navigation for Individuals with Visual Impairments

D. Ahmetovic, U. Oh, S. Mascetti, C. Asakawa

This work studies rotation errors and their effect on turn-by-turn guidance for individuals with visual impairments to inform the design of navigation assistance in real-world scenarios. A dataset of indoor trajectories of 11 blind participants using NavCog, a turn-by-turn smartphone navigation assistant, was collected.
Tags: 
navigation

SignsWorld Atlas

Samaa M. Shohieb, Hamdy K. Elminir, A. M. Riad

A dataset of images and videos capturing Arabic sign language and non-manual signs, including facial expressions and lip reading, is collected. 10 signers under controlled lighting conditions contributed to the dataset, which contains hand shapes, Arabic fingerspelling alphabets, numbers, individual signs, signs in continuous sentences, lip movements, and facial expressions.
Tags: 
Sign language, Arabic sign language

Automatic childhood autism detection by vocalization decomposition with phone-like units

D. Xu, Jeffrey A. Richards, J. Gilkerson, U. Yapanel, S. Gray, J. Hansen

A mechanism for childhood autism detection is created using the LENA (Language ENvironment Analysis) System and speech signal processing technology. Audio was recorded through a small digital recorder (DLP, digital language processor) worn by the child in the pocket of specially designed clothing.
Tags: 
Autism detection, Autism, Autism Spectrum Disorders, Speech Analysis

LSF-ANIMAL: A Motion Capture Corpus in French Sign Language Designed for the Animation of Signing Avatars

L. Naert, C. Larboulette, S. Gibet

A MoCap dataset of body movements for French sign language is collected for animating a signing avatar. The dataset is evaluated by people, including native signers. 2 signers made gestures for 4 subsets of words and actions, which were captured by a motion capture system and annotated using the ELAN software.
Tags: 
Sign language, French Sign Language

Automatic Emotion Recognition in Robot-Children Interaction for ASD Treatment

M. Leo, M. Coco, P. Carcagni, C. Distante, M. Bernava, G. Pioggia, G. Palestra

This work analyzes facial expressions to evaluate interactions between robots and children with ASD. Children were asked to imitate the expressions of a robot while pictures were taken, and the children's behaviours were analyzed using machine-learning strategies.
Tags: 
Emotions, Emotion Recognition, Autism Spectrum Disorders

Imprecise vowel articulation as a potential early marker of Parkinson's disease: effect of speaking task.

J. Rusz, R. Cmejla, T. Tykalova

This work aims to analyze vowel articulation in individuals with Parkinson's disease. Data from speaking tasks such as sustained phonation, sentences, passages, and monologues are collected.
Tags: 
Parkinson's disease, Dysarthria

RWTH-BOSTON-400

P. Dreuw, C. Neidle, V. Athitsos, S. Sclaroff, H. Ney

A dataset of video streams of American sign language sentences is collected. Multiple RGB cameras captured the signs as signers performed them.
Tags: 
Sign language, American Sign Language

Using a Language Technology Infrastructure for German in order to Anonymize German Sign Language Corpus Data

J. Bleicken, T. Hanke, U. Salden, S. Wagner

The study aims to automatically apply anonymization to the German Sign Language dataset, DGS Korpus. Multi-camera video recordings of conversations in German sign language were collected, annotated, and translated.
Tags: 
Sign language, German sign language

Intonation and speech rate in Parkinson's disease: general and dynamic aspects and responsiveness to levodopa admission.

S. Skodda, W. Grönheit, U. Schlegel

This work aims to analyze fundamental frequency (F(0)) variability (fundamental frequency standard deviation [F(0)SD]) and net speech rate (NSR) in the course of reading in Parkinsonian patients' speech. German speaking patients with Parkinson's disease went through a reading task, and they were recorded.
Tags: 
Parkinson's disease

What Comes First: Combining Motion Capture and Eye Tracking Data to Study the Order of Articulators in Constructed Action in Sign Language Narratives

T. Jantunen, A. Puupponen, B. Burger

Authors capture eye tracking and motion capture data to explore how the dominant hand, the head, the chest and the eyes are used to sign short Finnish Sign Language stories. MoCap and eye tracking data are collected for 7 stories signed in Finnish Sign Language. ELAN is used to align the data.
Tags: 
Sign language, Finnish Sign Language, Eye tracking, Motion Capture

Unbiased and Mobile Gait Analysis Detects Motor Impairment in Parkinson's Disease

J. Klucken, J. Barth

A machine learning classifier to automatically detect motor impairment in patients with Parkinson's disease is constructed. To train the classifier, a dataset of gait movements measured by accelerometer is collected. A mobile biosensor attached to the foot, eGaIT, is used to collect accelerometer and gyroscope data during standardized gait and leg function tests. Two participant groups are used to collect independent training and validation datasets.
Tags: 
Parkinson's disease

A machine learning algorithm successfully screens for Parkinson's in web users

B. Youngmann, L. Allerh, O. Paltiel, E. Yom-Tov, D. Arkadir

Mouse and keyboard interactions with a search engine can provide data for a classifier that reliably detects individuals at high risk for actual Parkinson's disease and serves as an indicator of the progression of the disease. Microsoft Bing search queries were collected, and the cursor movements and keystrokes were recorded.
Tags: 
Parkinson's disease

Personalizing ASR for Dysarthric and Accented Speech with Limited Data

J. Shor, D. Emanuel

A new dataset and transfer learning approach is described in order to improve automatic speech recognition for underrepresented populations. Participants read sentences from the Cornell Movie-Dialogs Corpus, a collection of sentences used by text-to-speech voice actors, and a modified selection of sentences from the Boston Children's Hospital.
Tags: 
Dysarthria, Speech recognition, Amyotrophic Lateral Sclerosis, Accented Speech

Providing good memory cues for people with episodic memory impairment

Matthew L. Lee, Anind K. Dey

This work studies the best memory cues for aiding individuals with episodic memory impairment, to inform the design of lifelogging technologies that automatically capture and extract such cues and support recollection of episodic memories. Each participant wore a SenseCam during a session of a particular activity lasting 2-8 hours. The SenseCam took images every 2 minutes from a first-person perspective; participants were then shown time-distributed samples of 100 images, and their descriptions of the images were collected.
Tags: 
Alzheimer's disease, Episodic Memory Impairment

A Video-Based Method for Automatically Rating Ataxia

R. Jaroensri

A dataset of video demonstrations is collected to build a neural-network-based classifier for automatically rating ataxia. Videos of hand movements are collected while participants perform a finger-to-nose test, and the videos are then labelled by a clinician.
Tags: 
Parkinson's disease, Ataxia