System for Prediction of Human Emotions and Depression level with Recommendation of Suitable Therapy
DOI: https://doi.org/10.51983/ajcst-2017.6.2.1787
Keywords: Emotions, Rule-based approach, Speech synthesis, Pitch detection, Depression, Musical therapy, Blood pressure, Voice prosody, EEG signals, Decision fusion, Optimal weighting, Prediction algorithm
Abstract
In today’s competitive world, an individual must act smartly and take rapid steps to secure a place in the competition. Young people outnumber the elderly and contribute substantially to the development of society. This paper presents a methodology to extract emotion from text in real time and to add expression to textual content during speech synthesis using a corpus and an emotion recognition module. Along with recognizing emotions from human textual data, the system analyzes various human body signals, such as blood pressure, EEG signals, and vocal prosody, to predict the level of depression so that a suitable therapy can be suggested using a prediction algorithm. In text analysis, all emotional keywords and emotion-modification words are manually defined. To validate the approach, tests were carried out on a set of textual sentences, and preliminary rules were written for 34 different emotions. These rules are used in an automated procedure that assigns emotional-state values to words; the speech synthesizer then uses these values to add emotions to the input sentence during speech. A pitch detection algorithm has been implemented for pitch recognition.
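The rule-based text analysis described above can be sketched as a small keyword-and-modifier scorer. This is a minimal illustration only: the lexicon entries, the modifier weights, and the emotion labels below are hypothetical placeholders, not the paper's actual 34-emotion rule set.

```python
# Illustrative rule-based emotion scorer: manually defined emotional
# keywords plus emotion-modification words that scale a keyword's value.
# All entries here are assumptions for demonstration purposes.

EMOTION_LEXICON = {
    "happy": ("joy", 0.8),
    "delighted": ("joy", 1.0),
    "sad": ("sadness", 0.8),
    "miserable": ("sadness", 1.0),
    "furious": ("anger", 1.0),
}

# Emotion-modification words scale the keyword that follows them.
MODIFIERS = {"very": 1.5, "slightly": 0.5}

def score_sentence(sentence):
    """Assign emotional-state values to words and aggregate per emotion."""
    scores = {}
    words = sentence.lower().split()
    for i, word in enumerate(words):
        if word in EMOTION_LEXICON:
            emotion, value = EMOTION_LEXICON[word]
            # Apply a modifier if one immediately precedes the keyword.
            if i > 0 and words[i - 1] in MODIFIERS:
                value *= MODIFIERS[words[i - 1]]
            scores[emotion] = scores.get(emotion, 0.0) + value
    return scores

print(score_sentence("I am very happy but slightly sad"))
```

A speech synthesizer could then map the per-emotion scores onto prosody parameters (pitch, rate, energy) for the sentence; that mapping is outside the scope of this sketch.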
License
Copyright (c) 2017 The Research Publication

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.