Description
Highlights for Depression Recognition by Deep Learning
- An RNN-based (LSTM) deep learning model has been developed for the identification of depression.
- Computer-aided techniques based on neural networks predominantly use EEG as a biomarker for depression analysis.
- The authors' model is a multi-layered (stacked) LSTM structure.
- The EEG signals are recorded at a sampling rate of 500 Hz using a total of 64 scalp electrodes.
- The data is preprocessed for artifact removal and segmented into 8-second windows (see the windowing sketch after this list).
- The MDD Patients and Healthy Controls EEG dataset can be downloaded from this link: https://figshare.com/articles/dataset/EEG_Data_New/4244171
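The windowing step described above can be illustrated with a minimal NumPy sketch. This is not the authors' code; the function name and the 5-minute dummy recording are illustrative, and only the 500 Hz sampling rate, 64 channels, and 8-second window length are taken from the description.

```python
import numpy as np

def window_eeg(recording, fs=500, win_sec=8):
    """Split a (channels x samples) EEG recording into non-overlapping windows.

    recording : np.ndarray of shape (n_channels, n_samples), e.g. 64 channels at 500 Hz.
    Returns an array of shape (n_windows, n_channels, fs * win_sec).
    """
    win_len = fs * win_sec                      # 8 s at 500 Hz -> 4000 samples per window
    n_windows = recording.shape[1] // win_len   # drop any trailing partial window
    trimmed = recording[:, :n_windows * win_len]
    return trimmed.reshape(recording.shape[0], n_windows, win_len).swapaxes(0, 1)

# Example: a 5-minute, 64-channel recording yields 37 windows of 8 s each
dummy = np.random.randn(64, 500 * 300)
print(window_eeg(dummy).shape)   # (37, 64, 4000)
```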
Introduction
This article covers depression recognition by deep learning; the free code can be downloaded at https://free-thesis.com. Like the rest of the globe, India has observed an upsurge in mental health cases during the pandemic. Several suicides have been linked to the elevated anxiety and fear caused by COVID-19, and SARS and other past outbreaks have also unquestionably affected people's mental health. The brain's electrical activity, recorded as EEG signals with amplitudes on the order of microvolts, is difficult to interpret manually. Depressed individuals exhibit hemispheric asymmetry in their brain signals, which means that EEG can detect this unusual activity.
To minimize such problems and accurately analyze depression cases, the authors of this paper have examined modern machine learning algorithms. The approach uses EEG data to screen for depression with the help of a computer, i.e. artificial intelligence. Three levels of stacked LSTM networks are used to extract high-level features. Advanced approaches, including data mining, Deep Neural Networks, Recurrent Neural Networks (RNN), and Convolutional Neural Networks (CNN), have emerged in recent years.
DeepNet (Stacked Deep Learning Model)
Neural network methods, which are becoming increasingly popular, are used in the bulk of computer-aided EEG screening procedures. Learning with and without feedback is critical even when the relevant information in the data is spread over an extended period of time, and this difficulty can be alleviated using the LSTM approach. As seen in Figure 1, the model developed by the authors is a multi-layered LSTM structure.

Feature extraction using 1D convolution and Dropout Layer
Features may be extracted using convolution, a commonly used operation. When the operation is carried out in only one dimension, it is known as a Conv1D layer. A kernel (weighted filter) slides over the input signal, and a shift-and-compute operation is performed at each position. The convolution may be causal or non-causal; in this implementation, the authors have used a non-causal convolution.
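The following is a minimal Keras sketch of such a non-causal 1D convolution ("same" padding) followed by dropout, applied to one 8-second, 64-channel window. The filter count, kernel size, and dropout rate are illustrative assumptions, not values taken from the paper.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Non-causal ("same" padding) 1D convolution followed by dropout.
# Input: one 8-second window of 64-channel EEG at 500 Hz -> shape (4000, 64).
# Filter count (32), kernel size (16), and dropout rate (0.2) are illustrative choices.
feature_extractor = keras.Sequential([
    keras.Input(shape=(4000, 64)),
    layers.Conv1D(filters=32, kernel_size=16, padding="same"),  # kernel slides over the signal
    layers.ReLU(),
    layers.Dropout(0.2),
])
feature_extractor.summary()
```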
The stacked structure of the proposed model
The suggested model uses three stacked LSTM layers. The table below provides a breakdown of the model's layers. The input data vector is first convolved, and then a ReLU activation function is applied. Three LSTM layers are then stacked on top of one another, and the final layers contain the output neurons of the classification layer. The model is trained with the Adam optimizer, a stochastic gradient descent variant, with a learning rate of 0.001.
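A hedged Keras sketch of this architecture is given below: a Conv1D + ReLU front end, three stacked LSTM layers, and a dense classification head, compiled with Adam at the stated learning rate of 0.001. The layer widths and the binary-crossentropy loss are assumptions for illustration, not the paper's exact configuration.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Illustrative stacked model: Conv1D front end, three stacked LSTM layers,
# and a dense classification head. Layer sizes are assumptions.
model = keras.Sequential([
    keras.Input(shape=(4000, 64)),                 # one 8-s window, 64 channels
    layers.Conv1D(32, kernel_size=16, padding="same"),
    layers.ReLU(),
    layers.Dropout(0.2),
    layers.LSTM(64, return_sequences=True),        # LSTM layer 1
    layers.LSTM(64, return_sequences=True),        # LSTM layer 2
    layers.LSTM(32),                               # LSTM layer 3 (last hidden state only)
    layers.Dense(1, activation="sigmoid"),         # depressed vs. healthy
])

# Adam (a stochastic gradient descent variant) with the stated learning rate of 0.001
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001),
              loss="binary_crossentropy",
              metrics=["accuracy"])
```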

Performance Analysis
The EEG signals studied in this work were taken from an analysis carried out in the Psychology Department, University of Arizona, USA. The complete design and comparison of the screening model are carried out in Google Colaboratory, popularly known as "Colab". The data has been split into training, validation, and testing sets: 70% for training, with the remaining 30% divided into 20% for validation and 10% for testing. With this ratio, there are 9428 EEG records for training and 2563 and 1000 records for validation and testing, respectively, so the model is evaluated on 10% of data it was never trained on. The dataset has been randomly shuffled to avoid bias. The accuracy achieved by the model is 84% at 100 epochs, as shown in Fig. 2, while the model loss is around 0.35, as evident from Fig. 3. The model took 59 ms per epoch over the 100-epoch run under the test conditions. The precision and recall values recorded are 85% and 80%, respectively. These values demonstrate the power of stacked LSTM memory cells. The model exploits two main strengths of the structure: lower complexity than other existing models and comparable accuracy, both due to the inherent ability of the stacked LSTM layers.
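The split and the reported metrics can be reproduced in outline with the sketch below. It assumes `X` (windows) and `y` (labels) come from the preprocessing step and reuses `model` from the architecture sketch above; the random seed and batch size are illustrative, and the exact record counts in the paper may differ from what a plain 70/20/10 split yields.

```python
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score

# X: (n_windows, 4000, 64) EEG windows, y: (n_windows,) binary labels -- assumed inputs.
# 70% train; the remaining 30% is split 2:1 into validation (20%) and test (10%).
X_train, X_tmp, y_train, y_tmp = train_test_split(
    X, y, test_size=0.30, shuffle=True, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(
    X_tmp, y_tmp, test_size=1/3, shuffle=True, random_state=42)

model.fit(X_train, y_train, validation_data=(X_val, y_val), epochs=100, batch_size=64)

# Threshold the sigmoid output at 0.5 and report the metrics quoted in the text.
y_pred = (model.predict(X_test) > 0.5).astype(int).ravel()
print("accuracy :", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("recall   :", recall_score(y_test, y_pred))
```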
Conclusion
A stacked LSTM structure is introduced in the model for depression identification. The stacking not only increases the model's capacity to extract features, but also its predictive ability. Data from a total of 45 subjects (24 normal and 21 depressed) has been used to test the model. With a false positive rate of 0.07 percent, the model achieves a classification accuracy of 84%. The authors intend to introduce an ensemble approach to further increase the model's accuracy; this work is currently at an early stage.
Published papers similar to this work
- Ranjani, M., and P. Supraja. “Diagnosing Mental Disorders based on EEG Signal using Deep Convolutional Neural Network.”
- Chen, Xun, Chang Li, Aiping Liu, Martin J. McKeown, Ruobing Qian, and Z. Jane Wang. “Towards Open-World EEG Decoding via Deep Learning.” arXiv preprint arXiv:2112.06654 (2021).
- Zhang, Tao, Minjie Liu, Tian Yuan, and Najla Al-Nabhan. “Emotion-Aware and Intelligent Internet of Medical Things Toward Emotion Recognition During COVID-19 Pandemic.” IEEE Internet of Things Journal 8, no. 21 (2020): 16002-16013.
- Zhang, Xiaowei, Junlei Li, Kechen Hou, Bin Hu, Jian Shen, and Jing Pan. “EEG-based depression detection using convolutional neural network with demographic attention mechanism.” In 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), pp. 128-133. IEEE, 2020.
- Acharya, U. Rajendra, Shu Lih Oh, Yuki Hagiwara, Jen Hong Tan, Hojjat Adeli, and D. P. Subha. “Automated EEG-based screening of depression using deep convolutional neural network.” Computer Methods and Programs in Biomedicine 161 (2018): 103-113. doi: 10.1016/j.cmpb.2018.04.012.