This project classifies emotions from EEG signals recorded in the DEAP dataset, aiming for a high accuracy score using a 1D convolutional neural network (CNN).
• A convolutional neural network is a class of deep neural networks used for both image classification and time-series classification.
• In the current work, music video clips are used as the visual stimuli to elicit different emotions. To this end, a relatively large set of music video clips was gathered.
• 32 participants took part in the experiment and their EEG and peripheral physiological signals were recorded as they watched the 40 selected music videos.
• Participants rated each video in terms of arousal, valence, like/dislike, dominance and familiarity. For 22 participants, frontal face video was also recorded.
• The database contains all recorded signal data, frontal face video for a subset of the participants and subjective ratings from the participants.
• The database can be downloaded from the following link: https://www.eecs.qmul.ac.uk/mmv/datasets/deap/
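The preprocessed Python version of DEAP stores each participant's recording as a pickled dictionary with `data` and `labels` entries. A minimal loading sketch (the function name is illustrative; the shapes follow the DEAP preprocessed-data description):

```python
import pickle

def load_deap_subject(path):
    """Load one participant's preprocessed DEAP file (e.g. "s01.dat").

    Returns the signal array of shape (40 trials, 40 channels,
    8064 samples) and the label array of shape (40 trials,
    4 ratings: valence, arousal, dominance, liking).
    """
    with open(path, "rb") as f:
        # The .dat files were pickled under Python 2, so latin1
        # decoding is needed when loading under Python 3.
        subject = pickle.load(f, encoding="latin1")
    return subject["data"], subject["labels"]
```

Usage would look like `data, labels = load_deap_subject("s01.dat")`.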
• In the code, a fast Fourier transform (FFT) is used to extract features from the preprocessed Python data files.
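One common way to turn the FFT of an EEG channel into features is the mean spectral power in the standard frequency bands; a sketch of this idea, assuming the 128 Hz sampling rate of the preprocessed DEAP files (the band edges and function name are illustrative, not taken from the repo):

```python
import numpy as np

def band_power_features(signal, fs=128.0):
    """Mean spectral power of one channel in the classic EEG bands.

    signal : 1-D array of samples
    fs     : sampling rate in Hz (preprocessed DEAP is 128 Hz)
    Returns a vector of [theta, alpha, beta, gamma] power.
    """
    bands = {"theta": (4, 8), "alpha": (8, 13),
             "beta": (13, 30), "gamma": (30, 45)}
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2     # power spectrum
    return np.array([power[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in bands.values()])
```

Concatenating such vectors over channels (and optionally sliding windows) yields the feature matrix fed to the classifier.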
• After normalization, the data is classified using a 1D CNN.
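The two steps above can be sketched in NumPy: z-score normalization of the feature matrix, and the core building block of a 1D CNN, a single convolutional layer with ReLU. This is an illustrative forward pass under assumed shapes and kernel size, not the full trained network from the repo:

```python
import numpy as np

def standardize(X):
    """Z-score each feature column: zero mean, unit variance."""
    return (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-8)

def conv1d_relu(x, kernels, bias):
    """One 1-D convolution layer with ReLU and 'valid' padding.

    x       : (length,) input feature vector
    kernels : (n_filters, kernel_size) filter weights
    bias    : (n_filters,) biases
    Returns (n_filters, length - kernel_size + 1) feature maps.
    """
    n_filters, k = kernels.shape
    # All sliding windows of length k, shape (out_len, k)
    windows = np.lib.stride_tricks.sliding_window_view(x, k)
    out = windows @ kernels.T + bias         # (out_len, n_filters)
    return np.maximum(out.T, 0.0)            # ReLU activation
```

In the real model, several such layers are stacked (with pooling) and followed by dense layers that output the emotion classes.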