The work titled "Recognition of emotions from facial expressions using the SVM algorithm" by the author Lemonis Ioannis is made available under the Creative Commons Attribution 4.0 International license.
Bibliographic Reference
Ioannis Lemonis, "Recognition of emotions from facial expressions using the SVM algorithm", Diploma Thesis, School of Electrical and Computer Engineering, Technical University of Crete, Chania, Greece, 2019
https://doi.org/10.26233/heallink.tuc.80391
Recognizing facial expressions and the emotions they convey is something every human does instinctively and universally, without any training; it is effectively coded into our DNA. Human-computer interaction can benefit greatly from machines that recognize human emotions simply by looking at us, without requiring a shared language or any spoken language at all.

To achieve this, we take frames (or live video) depicting human faces with different emotional expressions and convert them to greyscale, which makes it easier and faster to locate the facial area. We then apply a pre-trained mask of 68 landmark points (each with unique coordinates), placed around the main facial regions that carry the details of emotion, namely the mouth, eyes and nose. We take all possible pairs of these 68 points, compute the Euclidean distance of each pair, and use the resulting features with a Support Vector Machine (SVM) to train the system to recognize four main emotions: Anger, Joy, Tranquility and Sadness.

The experiments use two data sets of facial expressions with their corresponding emotion labels. The Patras A.I.nD.M. data set contains 84 directed facial poses (portrait angle), while the Fer2013 data set contains 960 non-directed facial poses (random angle). Fer2013 was created for a facial-recognition competition intended to push recognition algorithms to their limits: the images have a very small resolution of 48x48 pixels, many are out of focus or obscured (by hair, hands, sunglasses, hats, bad angles, etc.), some are not even real photographs but portraits or drawings of faces, and they cover a wide variety of people from around the world.

After training, a validation test measures how accurate the system is. Of the four examined emotions, Joy is distinguished best overall, reaching 100% in all three statistical measures (Sensitivity, Specificity and Accuracy) in both Patras data set tests, and 72%, 75% and 85% (Sen, Spe, Acc) respectively in the Fer2013 data set. Anger comes second with 100%, 83% and 87.5% in the Patras data set and 56%, 85% and 78% in Fer2013. Tranquility is third with 50%, 100% and 87.5% (Sen, Spe, Acc) in the Patras data set and 45%, 67% and 62% in Fer2013. Finally, Sadness achieves 87.5%, 50% and 83% in the Patras data set and 21%, 80% and 62% in Fer2013.
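
Below is a minimal Python sketch of the pipeline described above, assuming dlib's publicly available 68-point landmark predictor, OpenCV for the greyscale conversion and scikit-learn's SVC as the SVM. The abstract does not name the specific libraries, model file or kernel, so these are illustrative stand-ins for the described steps, not the thesis implementation.

# Sketch of: greyscale conversion -> 68 facial landmarks -> pairwise
# Euclidean distances -> SVM training on four emotion classes.
# Library and model choices below are assumptions, not from the thesis.
from itertools import combinations

import cv2
import dlib
import numpy as np
from sklearn.svm import SVC

detector = dlib.get_frontal_face_detector()
# Assumed model file: dlib's pre-trained 68-landmark shape predictor.
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmark_distances(image_bgr):
    """Return the Euclidean distances between all pairs of the 68 landmarks."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)  # greyscale speeds up face detection
    faces = detector(gray, 1)
    if not faces:
        return None  # no face located in this frame
    shape = predictor(gray, faces[0])
    pts = np.array([(shape.part(i).x, shape.part(i).y) for i in range(68)], dtype=float)
    # All C(68, 2) = 2278 pairwise distances form the feature vector.
    return np.array([np.linalg.norm(pts[i] - pts[j])
                     for i, j in combinations(range(68), 2)])

def train_emotion_svm(images, labels):
    """Train an SVM on distance features; labels are the four emotion classes."""
    X, y = [], []
    for img, lab in zip(images, labels):
        feat = landmark_distances(img)
        if feat is not None:
            X.append(feat)
            y.append(lab)
    clf = SVC(kernel="linear")  # kernel choice is an assumption
    clf.fit(np.array(X), np.array(y))
    return clf

With 68 landmarks, each face yields 2278 pairwise distances, and the trained classifier can then be validated on held-out images to produce per-emotion Sensitivity, Specificity and Accuracy figures such as those reported above.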