We have already seen examples of robot projects that can convert text into sign language, but it has proven more challenging to translate from sign language into spoken languages. "In sign language, facial expressions are used to express both linguistic information and emotions. For example, an eyebrow raise is necessary to mark general questions in most sign languages. At the same time, signers use the face to express emotions – either their own, or when quoting someone else. What we need to know more about is what happens when grammar and emotions need to be expressed at the same time. Will computer software be able to capture the correct meaning?" Vadim Kimmelman asks.

Kimmelman is Associate Professor at the University of Bergen, where he works as a linguist, primarily on the grammar of sign languages, specifically Russian Sign Language (RSL) and Sign Language of the Netherlands (NGT). In his latest study, published in PLOS ONE, Kimmelman looks at what happens when different emotions are combined with different sentence types, using 2D video analysis software to track eyebrow position in video recordings. Kimmelman and his colleagues at Nazarbayev University in Kazakhstan investigated these questions for Kazakh-Russian Sign Language, the sign language used in Kazakhstan.

"In our study we asked nine native signers to produce the same sentences as statements and two types of questions (e.g. The girl fell – Did the girl fall? – Where did the girl fall?). The signers posed the questions three times, adding three different emotions (neutral, angry, surprised). Based on research on other sign languages, we had expected that both emotions and grammar would affect eyebrow position, and that we might find some interactions", Kimmelman explains.

The main novelty in this study was the use of the OpenPose software, which allows automatic tracking of hands, body, and facial features in 2D videos. "Using this software, we were able to precisely and objectively measure eyebrow positions across different conditions and conduct quantitative analysis. The study showed, as expected, that both emotions and grammar affected eyebrow position in Kazakh-Russian Sign Language. For instance, general questions are marked with raised eyebrows, surprise is marked with raised eyebrows, and anger is marked with lowered eyebrows".

Kimmelman also found that emotional and grammatical marking can be combined: "We found that with surprised general questions the eyebrows raise even higher than with neutral questions. In addition, we found some complex interactions between factors influencing eyebrow positions, indicating the need for future research. In the future, we will also investigate how eyebrow movement aligns with specific signs in the sentence, in addition to how average eyebrow position is affected, and we will investigate other facial features and head and body position".

Showcasing new possibilities

The results of this study have real practical implications, and the evolution of new technologies has clearly contributed to improving and extending the communication opportunities of hearing-impaired people, Kimmelman says: "First, students learning Kazakh-Russian Sign Language (e.g. future interpreters) should be aware of how both emotions and grammar affect facial expressions. Second, our findings will have an impact on projects on automatic recognition of sign language, as it is clear that both grammatical information and emotions should be considered by recognition models when applied to facial expressions".
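OpenPose outputs 2D keypoints per video frame, and a measure such as "eyebrow position" is then derived from those points. The sketch below is a minimal illustration of how that derivation could look, assuming the 70-point OpenPose face layout (eyebrows at indices 17–26, eyes at 36–47) and normalizing by inter-ocular distance; it is not the authors' published analysis pipeline.

```python
# Sketch: estimating eyebrow height from OpenPose-style 2D face keypoints.
# Index ranges follow the 70-point OpenPose face model (eyebrows 17-26,
# eyes 36-47); the normalization by inter-ocular distance is an
# illustrative assumption, not the study's actual measurement procedure.

def mean_point(keypoints, indices):
    """Average (x, y) over the given keypoint indices."""
    xs = [keypoints[i][0] for i in indices]
    ys = [keypoints[i][1] for i in indices]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def eyebrow_height(keypoints):
    """Vertical eyebrow-to-eye distance, normalized by inter-ocular distance.

    keypoints: list of (x, y) pairs in image coordinates (y grows downward).
    Returns a scale-free value: larger means the eyebrows are raised higher.
    """
    _, brow_y = mean_point(keypoints, range(17, 27))  # both eyebrows
    _, eye_y = mean_point(keypoints, range(36, 48))   # both eyes
    rx, ry = mean_point(keypoints, range(36, 42))     # right-eye center
    lx, ly = mean_point(keypoints, range(42, 48))     # left-eye center
    interocular = ((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5
    return (eye_y - brow_y) / interocular             # y axis points down
```

Dividing by the inter-ocular distance makes the measure independent of how far the signer sits from the camera, so frames from different recordings and conditions can be compared on the same scale.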