Abstract: The emergence of novel technologies such as virtual reality, augmented reality, and wearable computers has led to an increasing amount of research in human-computer systems. With multiple sources of human information, a computer can infer the emotional and mental state of the user and act accordingly. This paper concentrates on the recognition of "inner" emotions from non-stationary electroencephalogram (EEG) signals. The EEG signals, in discrete format, are taken from the DEAP dataset, which contains EEG recordings of 32 participants exposed to video stimuli. This paper proposes a computer-based analysis that employs frequency-domain and wavelet signal-processing techniques for feature extraction, and artificial neural networks for classifying the participant's emotions from the characteristics exhibited in different frequency bands (gamma, beta, alpha, theta and delta). The average accuracies of the radial basis function (RBF) and multilayer perceptron (MLP) models with wavelet analysis are 85.45% and 76.36% respectively, which are better classification results than those of the frequency-domain analysis, at 54.54% and 63.63% respectively. Considering channels pairwise, the occipital-lobe channels (Oz, O1, O2) give the best results among the 40 channels covering the whole head. Considering the different frequency bands, the high-frequency bands (gamma and beta) give better results than the lower-frequency bands, as assessed by the precision, sensitivity, specificity and F-score calculated for each frequency band under both the frequency-domain and wavelet analysis techniques.
Keywords: EEG, RBF, MLP, precision, sensitivity, specificity, F-score, Fast Fourier Transform, Wavelet Transform, Gamma (γ), Beta (β), Alpha (α), Theta (θ), Delta (δ).