
DedAI v2: Emotion-Aware Music Curation Through EEG and AI


An Interdisciplinary Synergy of Computational Neuroscience, Artificial Intelligence, and Musicology


Table of Contents

  1. Introduction
  2. Features
  3. Scientific Justification
  4. Technology Stack
  5. Collaboration with Emotiv
  6. Installation
  7. Usage
  8. API Reference
  9. Performance Metrics
  10. Case Studies
  11. Interactive Tutorials
  12. Contributing
  13. FAQ
  14. License
  15. Acknowledgements
  16. Citations
  17. Contact & Social Media

Introduction

DedAI v2 leverages computational neuroscience and machine learning to curate personalized music playlists based on real-time emotion detection from EEG (electroencephalography) data. The project sits at the confluence of neuroaesthetics, the theory of emotional intelligence in music, and state-of-the-art AI algorithms.



Features

  • Real-Time Emotion Detection: Utilizing EEG spectral power densities to classify emotional states.
  • AI-Driven Music Recommendation: Employing Reinforcement Learning and NLP algorithms to recommend music that aligns with the user's emotional state.
  • User Clustering for Enhanced Personalization: Utilizing unsupervised learning algorithms to cluster users based on their emotional and musical preferences.
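As an illustration of the user-clustering feature, the sketch below groups users by a hypothetical preference vector using k-means from scikit-learn (part of the project's stack). The feature names and cluster count are assumptions for demonstration, not the project's actual configuration:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Hypothetical per-user feature vectors:
# [mean valence, mean arousal, tempo preference], each scaled to [0, 1]
user_features = rng.random((50, 3))

# Partition users into 4 preference clusters (cluster count is illustrative)
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42)
labels = kmeans.fit_predict(user_features)  # one cluster id per user
```

Each user's cluster id can then seed the recommender with playlists that worked well for similar users.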



Scientific Justification

This section elaborates on the scientific methodologies and algorithms deployed in DedAI v2:

EEG Data Processing

  • Fast Fourier Transform (FFT)
  • Wavelet Transform
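To make the FFT step concrete, here is a minimal sketch of estimating EEG band powers from a single-channel signal with NumPy. The sampling rate and band edges are common conventions, not necessarily the project's actual parameters:

```python
import numpy as np

FS = 128  # Hz; an assumed sampling rate, typical of consumer EEG headsets
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Estimate spectral power in each EEG band via the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# Synthetic 1-second signal dominated by 10 Hz (alpha-band) activity
t = np.arange(FS) / FS
eeg = np.sin(2 * np.pi * 10 * t)
powers = band_powers(eeg)  # alpha power dominates for this signal
```

The resulting band-power vector is the kind of feature a downstream classifier can map to an emotional state.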

Machine Learning Algorithms

  • Reinforcement Learning
  • Natural Language Processing (NLP)
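As a toy illustration of the reinforcement-learning idea (an epsilon-greedy bandit, not the project's actual algorithm; track names and reward values are hypothetical), a recommender can learn which tracks a listener responds to from feedback signals:

```python
import random

random.seed(0)

# Hypothetical candidate tracks
tracks = ["calm_piano", "upbeat_pop", "ambient", "lofi"]
q_values = {t: 0.0 for t in tracks}  # running reward estimate per track
counts = {t: 0 for t in tracks}
EPSILON = 0.1  # exploration rate

def recommend():
    """Epsilon-greedy: mostly exploit the best-rated track, sometimes explore."""
    if random.random() < EPSILON:
        return random.choice(tracks)
    return max(q_values, key=q_values.get)

def feedback(track, reward):
    """Incremental-mean update of the chosen track's action value."""
    counts[track] += 1
    q_values[track] += (reward - q_values[track]) / counts[track]

# Simulated session: the user consistently enjoys "calm_piano"
for _ in range(200):
    choice = recommend()
    feedback(choice, 1.0 if choice == "calm_piano" else 0.2)
```

In DedAI the reward signal would come from the detected emotional response rather than a simulated preference.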

For more details, please refer to our Scientific Whitepaper.


Technology Stack

  • EEG Data Processing: Python, FFT, Wavelet Transform
  • Machine Learning: TensorFlow, Keras, Scikit-learn
  • Backend: Flask, RESTful API
  • Frontend: Flutter
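To show how the Flask/REST layer of the stack might expose recommendations to the Flutter frontend, here is a minimal sketch. The route, payload shape, and track names are assumptions for illustration, not the project's actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical emotion-to-playlist mapping
PLAYLISTS = {"calm": ["ambient_01"], "happy": ["upbeat_07"]}

@app.route("/api/v1/recommendations", methods=["POST"])
def recommendations():
    """Return tracks matching the emotion label sent by the EEG pipeline."""
    emotion = request.get_json(force=True).get("emotion", "neutral")
    tracks = PLAYLISTS.get(emotion, ["fallback_00"])
    return jsonify({"emotion": emotion, "tracks": tracks})

# app.run(debug=True)  # start the development server
```

A client would POST `{"emotion": "calm"}` and receive a JSON playlist in response.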

Collaboration with Emotiv

DedAI v2 is developed in partnership with Emotiv, a global leader in neuroengineering, integrating their high-fidelity EEG hardware to improve the accuracy of emotion detection.


Installation

```shell
pip install -r requirements.txt
```

Usage

Please refer to our User Guide and API Documentation.

API Reference

For an extensive API reference, visit our API Docs.

Performance Metrics

  • Emotion Detection Accuracy: TBA
  • User Satisfaction Rate: TBA

Case Studies

  • Wellness Centers: Improved patient mental health by TBA
  • Educational Institutions: Enhanced student focus by TBA

Interactive Tutorials

For a hands-on experience, refer to our Jupyter Notebooks.

Contributing

We welcome contributions! Please read our Contributing Guidelines for more information.

FAQ

  • Is DedAI v2 available for commercial use?

    • At this stage, DedAI v2 is under active development and is not available for commercial use. We operate under a restricted license that protects the intellectual property while allowing for community contributions via feature suggestions. For more details, please consult our License.
  • What types of EEG devices are compatible with DedAI v2?

    • DedAI v2 is developed in partnership with Emotiv, so Emotiv's EEG devices are fully supported. We aim to extend compatibility to other EEG hardware in the future.
  • How does DedAI v2 ensure data privacy?

    • All EEG data is anonymized and encrypted to ensure user privacy. For more details, refer to our Privacy Policy.
  • Can DedAI v2 integrate with other music streaming platforms?

    • Integration with other music streaming platforms is part of our future development roadmap.
  • What machine learning algorithms does DedAI v2 use for emotion detection?

    • We leverage a blend of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) for emotion detection. Detailed information can be found in our Technical Documentation.
  • How accurate is the emotion detection?

    • We are in the process of rigorously validating our emotion detection algorithms, and thus, a definitive accuracy figure is not available at this time.
  • Is real-time mood tracking available?

    • Real-time mood tracking is currently under development and will be introduced in a future release.
  • How can I contribute to DedAI v2?

    • We encourage community involvement and contributions. Please consult our Contributing Guidelines for more details on how you can participate.
  • What programming languages and frameworks are used in DedAI v2?

    • DedAI v2 is primarily developed using Python, TensorFlow, and Flask. The frontend is built with Flutter.
  • Are there any academic publications related to DedAI v2?

    • We are in discussions with research institutions for potential collaborations and academic publications.
  • Is there an API I can use to integrate DedAI v2 into my own projects?

    • A comprehensive API is available for developers who wish to integrate DedAI v2 functionalities into their own projects. Refer to our API Documentation for more details.

License

This project operates under the DedAI v2 Restricted License Agreement. See LICENSE.md for details.

Acknowledgements

Special thanks to the computational neuroscience community and Emotiv for their invaluable insights and collaboration. We also acknowledge the foundational algorithms and seminal papers that contribute to this project.

Citations

For scholarly use, please cite this repository using the following BibTeX entry:

```bibtex
@misc{DedAI2023,
  author = {Elliott Mitchell},
  title = {DedAI v2: Emotion-Aware Music Curation Through EEG and AI},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/HawkSP/DedAI}}
}
```