Face Emotion Detection and Recognition platform
In a world where emotions play a crucial role, the SocraTech project represents a milestone in the field of real-time emotion detection. Our main goal has been to create a product that blends a rigorous scientific approach with cutting-edge technologies, putting data privacy at the center.
Facial Emotion Recognition (FER)
Facial Emotion Recognition (FER) is a branch of computer vision that deals with identifying and interpreting an individual's facial expressions in order to determine their emotional state. This field of research has a wide range of applications, including human-computer interaction, emotion detection in surveillance contexts, evaluating users' emotional responses when interacting with digital systems, and much more. The main goal of emotion recognition is to develop artificial intelligence models capable of understanding and interpreting the emotional signals conveyed by facial expressions. This capability allows human-machine interaction systems to adapt intelligently to user emotions, improving the overall experience and enabling more natural and personalized communication. In the context of this work, we have developed a deep learning model based on a convolutional neural network (CNN) to perform the task of emotion detection and recognition.
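To give a concrete feel for what such a model does at inference time, here is a minimal sketch in Keras. The weights file, the 48x48 grayscale input size and the seven-label list are illustrative assumptions, not the project's published configuration.

```python
# Minimal inference sketch (illustrative only): the model file, input size and
# label order are assumptions, not the project's actual configuration.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

EMOTIONS = ["neutral", "anger", "disgust", "fear", "happiness", "sadness", "surprise"]

model = load_model("emotion_cnn.h5")          # hypothetical trained model file

def predict_emotion(image_path: str) -> str:
    """Return the most likely emotion label for a single face image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    img = cv2.resize(img, (48, 48)) / 255.0   # assumed 48x48 grayscale input
    probs = model.predict(img[np.newaxis, ..., np.newaxis], verbose=0)[0]
    return EMOTIONS[int(np.argmax(probs))]

print(predict_emotion("face.jpg"))
```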
The phases of the project
Dataset selection: In the initial phase of the project, we evaluated the available datasets and opted for Aff-Wild2. This data source was chosen to provide a solid foundation for training our emotion recognition model.
Data management and emotional imbalance: In machine learning projects, ensuring high-quality data is critical. During this phase, we cleaned the dataset, removing invalid and inconsistent data. In particular, we addressed the problem of emotion imbalance within the dataset: common emotions were strongly represented, while others were underrepresented. This process was vital to ensuring that our model was trained on data reflecting a full range of emotional experiences.
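One common way to counteract this kind of imbalance is to weight each class inversely to its frequency during training. The sketch below assumes integer emotion labels stored in a hypothetical labels.npy file and Keras-style training; the rebalancing strategy actually used in the project may differ.

```python
# Sketch of class rebalancing via inverse-frequency weights.
# The label file and the use of Keras' class_weight argument are assumptions.
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

y_train = np.load("labels.npy")   # hypothetical array of integer emotion labels

classes = np.unique(y_train)
weights = compute_class_weight(class_weight="balanced", classes=classes, y=y_train)
class_weight = dict(zip(classes.tolist(), weights.tolist()))

# Rare emotions now contribute more to the loss, e.g. when passed to Keras as:
# model.fit(x_train, y_train, class_weight=class_weight, ...)
print(class_weight)
```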
Data preprocessing: Preprocessing the images was another crucial aspect of our work. We detected and aligned faces to improve data quality, helping to ensure that the model could make accurate and reliable emotion predictions.
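A simplified version of this step might look like the sketch below, which uses OpenCV's bundled Haar cascade to detect the largest face and produce a normalized crop. The specific detector, the crop size and the omission of landmark-based alignment are assumptions rather than the project's exact pipeline.

```python
# Sketch of face detection and cropping with OpenCV's Haar cascade.
# The project's actual detector and alignment method may differ.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def extract_face(image_path: str, size: int = 48):
    """Detect the largest face in an image and return a normalized grayscale crop."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        return None                            # unreadable image, skipped during cleaning
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                            # no face found, frame discarded
    x, y, w, h = max(faces, key=lambda box: box[2] * box[3])  # keep the largest detection
    crop = cv2.resize(gray[y:y + h, x:x + w], (size, size))
    return crop / 255.0
```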
Modeling: Through a series of extensive experiments exploring models of different sizes and approaches, we identified a custom solution based on convolutional neural networks. This choice allowed us to find the ideal balance between accuracy, resource efficiency (reducing training time and computational requirements), and processing speed. This model represents the core of our solution, allowing us to interpret emotional information with high efficiency.
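Purely to illustrate the kind of compact CNN such a phase can converge on, a Keras sketch could look like the following. The layer sizes, input shape and class count are assumptions, not the project's actual architecture.

```python
# Illustrative compact CNN for 7-class emotion classification on 48x48 grayscale crops.
# Layer sizes and hyperparameters are assumptions, not the project's actual model.
from tensorflow.keras import layers, models

def build_emotion_cnn(input_shape=(48, 48, 1), num_classes=7):
    model = models.Sequential([
        layers.Conv2D(32, 3, activation="relu", padding="same", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),        # keeps the parameter count low
        layers.Dropout(0.3),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_emotion_cnn()
model.summary()
```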
Architecture and deployment: In the final phase of our journey, we developed a highly scalable architecture based on cutting-edge technologies such as Docker, Django and VueJs. Thanks to Django, our backend became a solid foundation, giving us the ability to effectively manage and monitor the results of model predictions and to process time series of emotions for each user. Our frontend, built in VueJs, offers an elegant and highly functional user experience. Docker played a fundamental role in deploying the machine learning model, the backend and the frontend, providing portability and scalability, two key elements for the success of our solution.
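To give a flavor of how a Django backend could expose such a model, here is a minimal sketch of a prediction endpoint. The view name, URL, module layout and response format are assumptions about how this kind of backend might look, not the project's actual API.

```python
# Sketch of a Django endpoint that accepts a face image and returns emotion scores.
# The imported inference module, view name and JSON format are assumptions.
import cv2
import numpy as np
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_POST

from .inference import model, EMOTIONS   # hypothetical module wrapping the trained CNN

@csrf_exempt
@require_POST
def predict_emotion(request):
    """Accept an uploaded face image and return per-emotion probabilities as JSON."""
    upload = request.FILES.get("image")
    if upload is None:
        return JsonResponse({"error": "missing 'image' file"}, status=400)
    data = np.frombuffer(upload.read(), dtype=np.uint8)
    face = cv2.imdecode(data, cv2.IMREAD_GRAYSCALE)
    face = cv2.resize(face, (48, 48)) / 255.0        # assumed model input size
    probs = model.predict(face[np.newaxis, ..., np.newaxis], verbose=0)[0]
    return JsonResponse({label: float(p) for label, p in zip(EMOTIONS, probs)})

# Wired into urls.py as, for example: path("api/predict/", predict_emotion)
```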
SocraTech is an advanced Emotion Recognition solution that fuses the principles of psychology with artificial intelligence through machine learning. The primary goal of this innovative platform is to accurately detect an individual's emotional state, enabling a deeper understanding of their reactions and interactions. SocraTech builds an innovative bridge between the science of psychology and artificial intelligence, with the aim of providing a more in-depth and ethically responsible understanding of human emotions, while respecting privacy and current regulations. We are very proud to have been part of this unique project.