Please use this identifier to cite or link to this item:
https://publication.npru.ac.th/jspui/handle/123456789/1330
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Gedkhaw, Eakbodin | - |
dc.date.accessioned | 2021-08-20T08:22:33Z | - |
dc.date.available | 2021-08-20T08:22:33Z | - |
dc.date.issued | 2021-07-08 | - |
dc.identifier.uri | https://publication.npru.ac.th/jspui/handle/123456789/1330 | - |
dc.description.abstract | This paper proposes 2D Convolutional Neural Networks (2D CNNs) for Thai sign language recognition. The network was trained end-to-end for continuous gesture recognition, with 2D convolutions chosen to extract features related to Thai sign language gestures. A 2D CNN model was designed and a Thai sign language gesture data set was generated; 2D convolution and pooling layers were used to train the network to differentiate the data. The experiment used three Thai sign language gestures: "Hello", "Love", and "Sick", with 1,000 images per gesture (3,000 in total) selected for training and 100 images per gesture (300 in total) selected for testing. The experiment showed that the designed model greatly enhances gesture perception: the 2D CNN obtained an accuracy of 0.93 and a loss of 0.27, with a total training time of 1 hr 15 min 46 sec. | en_US |
dc.publisher | The 13th NPRU National Academic Conference Nakhon Pathom Rajabhat University | en_US |
dc.subject | 2D Convolutional Neural Network | en_US |
dc.subject | Sign Language Recognition | en_US |
dc.subject | Deep Learning | en_US |
dc.title | The Performance of Thai Sign Language Recognition Using 2D Convolutional Neural Networks | en_US |
dc.title.alternative | ประสิทธิภาพการรู้จำภาษามือไทยโดยใช้เครือข่ายประสาทเทียมแบบคอนโวลูชัน 2 มิติ | en_US |
Appears in Collections: | Proceedings of the 13th NPRU National Academic Conference |
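The abstract describes a model built from 2D convolution and pooling layers that extract gesture features before classification. As a minimal, self-contained sketch of those two core operations (plain NumPy, not the authors' actual model, whose architecture and hyperparameters are not given in this record):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution (cross-correlation) of a single-channel image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Slide the kernel over the image and sum the elementwise products
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2d(feature, size=2):
    """Non-overlapping max pooling, as used between convolution stages."""
    h, w = feature.shape
    h, w = h - h % size, w - w % size  # trim to a multiple of the pool size
    return feature[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# Toy example: a 6x6 "image" through one convolution + pooling stage.
# The kernel is a hypothetical vertical-edge detector, chosen only for illustration.
img = np.arange(36, dtype=float).reshape(6, 6)
edge = np.array([[1.0, -1.0],
                 [1.0, -1.0]])
feat = conv2d(img, edge)      # shape (5, 5)
pooled = max_pool2d(feat, 2)  # shape (2, 2)
```

In a full CNN these stages are stacked, followed by dense layers and a softmax over the three gesture classes; in practice a framework such as TensorFlow/Keras would provide the trainable versions of these layers.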
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
npru_075.pdf | | 349.27 kB | Adobe PDF | View/Open |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.