
GSLC: Creation and annotation of a Greek sign language corpus for HCI

E. Efthimiou, S. Fotinea

A dataset of video streams of Greek Sign Language phonemes and sentences was collected under a standard framework (EAGLES). The recordings were performed by native signers under controlled conditions.
Tags: 
Sign language, Greek Sign Language, Greek

Gait in Parkinson’s Disease

Jeffrey M. Hausdorff

A dataset of gait from patients with Parkinson's disease and healthy people is collected. Participants wore 16 sensors that recorded the normal reaction forces of their steps and inertial measurements as they walked for 2 minutes on level ground.
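A minimal sketch of how such force recordings are often analyzed: estimating stride times from upward crossings of a force threshold. The sampling rate, signal shape, and threshold below are assumptions for illustration, not details taken from this entry; real records would be loaded from the released files instead of synthesized.

```python
import numpy as np

# Minimal sketch: estimate stride times from a vertical force signal.
# Assumed (hypothetical) setup: one force sample per time step at 100 Hz;
# a real analysis would load each record from its data file instead.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
total_force = np.clip(np.sin(2 * np.pi * 0.9 * t), 0, None)  # synthetic steps

# Heel strikes approximated as upward crossings of a force threshold.
threshold = 0.05 * total_force.max()
loaded = total_force > threshold
heel_strikes = t[1:][(~loaded[:-1]) & loaded[1:]]

stride_times = np.diff(heel_strikes)
print(f"mean stride time: {stride_times.mean():.3f} s, "
      f"CV: {100 * stride_times.std() / stride_times.mean():.1f} %")
```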
Tags: 
Parkinsons

Predicting Parkinson's Disease Progression with Smartphone Data

The Michael J. Fox Foundation For Parkinson's Research

This work aims to objectively measure the symptoms of Parkinson's disease with a smartphone. Data related to audio, accelerometry, compass, ambient light, proximity, and battery level were collected from participants' smartphones.
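As an illustration of how smartphone accelerometry of this kind is typically summarized, the sketch below computes an acceleration magnitude signal and the fraction of its spectral power in the 4 to 7 Hz band often associated with parkinsonian resting tremor. The sampling rate and input layout are assumptions, and the placeholder samples stand in for real recordings.

```python
import numpy as np

# Minimal sketch: a tremor-band power feature from smartphone accelerometry.
# Assumed (hypothetical) input: rows of [ax, ay, az] sampled at ~100 Hz.
fs = 100.0
rng = np.random.default_rng(0)
acc = rng.normal(size=(1000, 3))              # placeholder for real samples

magnitude = np.linalg.norm(acc, axis=1)
magnitude -= magnitude.mean()                 # remove the gravity/DC offset

freqs = np.fft.rfftfreq(len(magnitude), d=1.0 / fs)
spectrum = np.abs(np.fft.rfft(magnitude)) ** 2
band = (freqs >= 4.0) & (freqs <= 7.0)        # typical resting-tremor band
tremor_power = spectrum[band].sum() / spectrum.sum()
print(f"fraction of power in 4-7 Hz band: {tremor_power:.3f}")
```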
Tags: 
Parkinsons

American Sign Language Lexicon Video Dataset (ASLLVD)

C. Neidle, S. Sclaroff

The dataset consists of videos of American Sign Language. The videos show the sign language gestures from multiple angles and include linguistic annotations.
Tags: 
Sign language, American Sign Language

WeAllWalk

G. Flores, R. Manduchi

This study aims to understand how blind people navigate indoors using smartphones. Blind participants walked indoor routes carrying smartphones whose built-in inertial sensors and accelerometers recorded their movements.
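A common first step with inertial walking logs like these is step detection from the accelerometer magnitude. Below is a minimal sketch under an assumed fixed sampling rate and a simple peak-picking rule; the thresholds, timing constants, and placeholder samples are illustrative only, not part of the dataset description.

```python
import numpy as np

# Minimal sketch: count steps from smartphone accelerometer magnitude.
# Assumed (hypothetical) input: one [ax, ay, az] sample per row at 50 Hz.
fs = 50.0
rng = np.random.default_rng(1)
acc = rng.normal(loc=(0.0, 0.0, 9.8), scale=0.5, size=(500, 3))

magnitude = np.linalg.norm(acc, axis=1)
smooth = np.convolve(magnitude, np.ones(5) / 5, mode="same")

# A sample is a step candidate if it is a local maximum above a threshold
# and at least 0.3 s after the previously detected step.
threshold = smooth.mean() + smooth.std()
min_gap = int(0.3 * fs)
steps, last = [], -min_gap
for i in range(1, len(smooth) - 1):
    if smooth[i] > threshold and smooth[i] >= smooth[i - 1] and smooth[i] > smooth[i + 1]:
        if i - last >= min_gap:
            steps.append(i)
            last = i
print(f"detected steps: {len(steps)}")
```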
Tags: 
blind, blindness

Recognition of Fingerspelling Sequences in Polish Sign Language Using Point Clouds Obtained from Depth Images

D. Warchoł, T. Kapuściński, M. Wysocki

A point cloud dataset of static Polish Sign Language fingerspelling is collected to advance research on automatic sign language recognition. An unspecified number of users performed fingerspelling of 16 letters of the Polish Sign Language alphabet; the data was recorded with a 3D Kinect sensor.
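Point-cloud handshape data of this kind is usually normalized before recognition. The sketch below centers a cloud and scales it to fit a unit sphere, a standard preprocessing step; the array shape and placeholder points are assumptions, not details of the released data.

```python
import numpy as np

# Minimal sketch: normalize a hand point cloud for scale-invariant matching.
# Assumed (hypothetical) input: an (N, 3) array of x, y, z points in meters.
rng = np.random.default_rng(2)
cloud = rng.normal(size=(2048, 3))             # placeholder for Kinect points

centered = cloud - cloud.mean(axis=0)          # translate centroid to origin
scale = np.linalg.norm(centered, axis=1).max() # farthest point from centroid
normalized = centered / scale                  # fit the cloud in a unit sphere

print(normalized.shape, np.linalg.norm(normalized, axis=1).max())
```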
Tags: 
Sign language, Polish sign language, Polish

Corpus Vlaamse Gebarentaal

M. Herreweghe, M. Vermeerbergen

Signers talked about a series of themes, such as making appointments and their school days. The signed conversations were recorded with an RGB camera and then edited and annotated.
Tags: 
Sign language

LSE-Sign: A Lexical Database for Spanish Sign Language

E. Gutierrez-Sigut, B. Costello, C. Baus, M. Carreiras

A Spanish Sign Language corpus of sign units was collected with RGB video cameras from two native signers (one male, one female). The cameras recorded the signs for 2,400 words in Spanish Sign Language and 2,700 non-signs.
Tags: 
Sign language, Spanish sign language

Hakim Fluency Corpus

H. Hakim, N. Ratner

This work aims to understand the linguistic component of stuttering and to investigate an experimental method for assessing the linguistic abilities of children with specific language impairment and children who stutter. The audio files contain the participants' responses to non-word repetition tasks, used as measures of children's linguistic abilities that have been found to correlate with vocabulary and reading skills.
Tags: 
Stuttering, language impaired

POLER Corpus

William D. Gaillard, M. Berl, N. Ratner, A. Strekas, M. Lowe

This dataset is part of the project Plasticity of Language in Epilepsy Research and is used to compare the language skills of two groups of children with epilepsy: those with recent-onset seizures and those with more chronic seizure activity. Participants were asked to complete standardized IQ and language testing and to produce narratives to accompany the book Frog, Where Are You? (1969) for analysis of syntax, vocabulary, and narrative components.
Tags: 
Epilepsy, Localization-related epilepsy

Corpus NGT

O. Crasborn, I. Zwitserlood

This work is an effort to record and archive video data from Sign Language of the Netherlands. The signers, a group diverse in age and regional background, were recorded in pairs, which yielded about 1.5 hours of usable signed data per pair. An upper-body view and a top view of each signer were recorded.
Tags: 
Sign language of the Netherlands, Sign language

British Sign Language Corpus

A. Schembri, J. Fenlon, R. Rentelis, S. Reynolds, K. Cormier

Data from deaf native, near-native, and fluent signers across the United Kingdom were collected to create a machine-readable corpus of spontaneous and elicited British Sign Language (BSL) digital video. Participants were filmed in pairs, with one camera focused on each participant and a third focused on both members of the pair.
Tags: 
Sign language, British Sign Language

Gait in Aging and Disease Database

Jeffrey M. Hausdorff

This work creates a dataset of walking strides to understand differences in the gait of people with Parkinson's disease and healthy adults. Young and old healthy participants and patients with Parkinson's disease walked on level ground. Healthy participants walked a circular path for 15 minutes, and patients with Parkinson's walked a long hallway for 6 minutes, while all wore a gait belt and a 3D accelerometer.
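Recordings like these are often reduced to a stride-interval time series. The sketch below computes the basic variability statistics usually reported (mean, standard deviation, coefficient of variation), assuming the stride durations are already available as a one-dimensional array; the sample values are placeholders, not data from this database.

```python
import numpy as np

# Minimal sketch: stride-interval variability statistics.
# Assumed (hypothetical) input: one stride duration per element, in seconds.
stride_times = np.array([1.05, 1.10, 1.02, 1.08, 1.12, 1.04, 1.07])

mean_stride = stride_times.mean()
sd_stride = stride_times.std(ddof=1)
cv_percent = 100.0 * sd_stride / mean_stride   # coefficient of variation

print(f"mean {mean_stride:.3f} s, SD {sd_stride:.3f} s, CV {cv_percent:.1f} %")
```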
Tags: 
Parkinsons

EllisWeismer Corpus

S. Weismer

This dataset is part of a project titled "Linguistic Processing in Specific Language Delay" and is used to identify late talkers at risk for language disorders in order to provide them with earlier language intervention. Language sample collection was performed at yearly assessment visits at different ages, and studies have examined lexical, morphological, and discourse processing.
Tags: 
Late talkers, Language development

SLANG3k

P. Toman

The study investigates automatic identification of internet-based sign language data using neural networks. Online videos were searched for different sign languages.
Tags: 
Sign language, American Sign Language, British Sign Language, German sign language

RWTH-PHOENIX-Weather 2014 T

N. Camgoz, S. Hadfield, O. Koller, H. Ney, R. Bowden

This work aims to generate spoken-language translations from sign language videos, taking into account the different word orders and grammar, and argues that existing linguistic datasets are too small to support deep learning. The study uses the daily news and weather forecast airings of a German public TV station that feature sign language interpretation. The corpus covers a vocabulary of 1066 different signs annotated with sign glosses by deaf specialists.
Tags: 
Sign language, German sign language, German

Swedish Sign Language Dictionary

J. Mesch, L. Wallin, T. Björkstrand

This dictionary includes video files for signs in Swedish Sign Language created as a general educational resource. Each sign entry in the web-based dictionary consists of a sign demonstration, one- or two-handed variants, usage examples (sentences), and a photo and sign notation.
Tags: 
Sign language, Swedish sign language, Swedish

FluencyBank Voices-AWC Corpus

Y. Zaalen

This work aims to capture participants' experience of stuttering to build a new instrument (OASES) for measuring the overall impact of stuttering. Each participant completed a reading task and an interview to learn more about the behaviors and affective/cognitive features of living with cluttering.
Tags: 
Cluttering, Fluency disorders, Dysfluencies

The ASL Lexicon Video Dataset

V. Athitsos, C. Neidle, S. Sclaroff, J. Nash, A. Stefan, Q. Yuan, A. Thangali

Native ASL signers perform different signs as part of a project to develop a computer vision system that allows users to look up the meaning of an ASL sign. The video sequences are collected in a recording facility, using a four-camera system that simultaneously captures two frontal views, one side view, and one view zoomed in on the face of the signer.
Tags: 
Sign language, American Sign Language

WLA4ND: a Wearable Dataset of Learning Activities for Young Adults with Neurodiversity to Provide Support in Education

H. Zheng, P. Mahapasuthanon, Y. Chen, H. Rangwala, A. Evmenova, V. Motti

The dataset consists of data collected with smartwatch built-in sensors, representing learning activities from participants with neurodiversity; it can be used for activity recognition, automatic labeling, and assistive wearable applications. The activities include reading, writing, typing, answering follow-up questions, and off-task behavior, collected using motion sensors and a heart rate sensor.
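For activity recognition, wearable streams like these are typically cut into fixed-length windows with simple per-channel features. The sketch below shows that segmentation step under an assumed sampling rate and channel layout; both are illustrative choices, and the placeholder array stands in for real smartwatch recordings.

```python
import numpy as np

# Minimal sketch: sliding-window feature extraction for activity recognition.
# Assumed (hypothetical) input: smartwatch samples at 50 Hz with columns
# [ax, ay, az, gx, gy, gz, heart_rate].
fs = 50
window, step = 4 * fs, 2 * fs                 # 4 s windows, 50 % overlap
rng = np.random.default_rng(3)
stream = rng.normal(size=(3000, 7))           # placeholder for sensor data

features = []
for start in range(0, len(stream) - window + 1, step):
    seg = stream[start:start + window]
    # Per-channel mean and standard deviation as a simple feature vector.
    features.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0)]))
features = np.asarray(features)
print(features.shape)                          # (n_windows, 14)
```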
Tags: 
Neurodiversity, Autism Spectrum Disorders, Attention Deficit and Hyperactivity Disorder, Intellectual and Developmental Disability