wolfeiq committed (verified)
Commit 971764e · Parent(s): e39b708

Upload gyroscope_data_compiled_all.xlsx
🎯 Emotion Recognition ML on the Basis of Smartphone Accelerometer Sensors
✨ Feel Your Words in Color ✨

Hi 👋,

I tried to find a correlation between emotional states (or vibes, as they are referred to in the deployed mobile applications) and the way a user moves their phone while typing (for example, a message) 📱. I then wanted to represent these emotions in color 🎨 (as an accompanying color bubble after each message on WhatsApp, for example 💬).

I collected data from myself in different emotional states, ending up with around an hour's worth of data for each emotion ⏱️. In this approach, I identified the emotions as calm/relaxed 🧘, frustrated 😤, happy 😄, anxious 😰, and sad 😢. The data was collected from both the accelerometer and gyroscope sensors at 50 Hz 📊 through a custom-coded application while I was typing on an Android Samsung A52 device 🤖. I visualized the collected data in Excel and removed the outliers that were visible by eye 👀.

The gyroscope data differed only insignificantly between emotional states ❌, so I did not preprocess it further and focused on the accelerometer data ✅. For the same reason, I refrained from applying a Kalman, Butterworth, or complementary filter ⚙️.

Feature selection on the five-emotion accelerometer data was done manually 🛠️: I examined the graphs visually and selected the parameters that I thought distinguished the graphs from each other the most 📈. For each selected feature, I defined an acceptable range per emotion 🧠. If a feature's ranges overlapped significantly between different emotions, I discarded that feature ❌. This was all done by hand, and it is also how I removed further outliers from the initial data set. No synthetic data was introduced 🚫. Lastly, I trained a very simple Random Forest model 🌳, which reached 85% accuracy, and deployed it in my applications 🚀. A minimal sketch of these steps follows below.
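To make the pipeline concrete, here is a minimal Python sketch of windowed feature extraction followed by Random Forest training. Everything in it is an illustrative assumption rather than this repo's exact setup: the column names (`ax`/`ay`/`az`), the 2-second window, the statistics used as features, and the file name are all hypothetical, and the real features were hand-picked from the graphs as described above.

```python
# Hypothetical sketch: windowed feature extraction + Random Forest training.
# Column names, window length, feature set, and file name are assumptions,
# not the exact choices used in this repository.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

WINDOW = 100  # assumed 2 s windows at the 50 Hz sampling rate

def window_features(df: pd.DataFrame) -> pd.DataFrame:
    """Slice the accelerometer stream into fixed windows and compute
    simple per-axis statistics as stand-ins for the hand-picked features."""
    rows = []
    for start in range(0, len(df) - WINDOW + 1, WINDOW):
        w = df.iloc[start:start + WINDOW]
        feats = {}
        for axis in ("ax", "ay", "az"):
            feats[f"{axis}_mean"] = w[axis].mean()
            feats[f"{axis}_std"] = w[axis].std()
            feats[f"{axis}_range"] = w[axis].max() - w[axis].min()
        rows.append(feats)
    return pd.DataFrame(rows)

# Hypothetical input: one sheet with columns ax, ay, az, emotion.
raw = pd.read_excel("accelerometer_data_compiled_all.xlsx")
data = pd.concat(
    [window_features(g).assign(emotion=label) for label, g in raw.groupby("emotion")],
    ignore_index=True,
)
X, y = data.drop(columns="emotion"), data["emotion"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2f}")
```

Features whose per-emotion ranges overlapped too much would simply be dropped from `X` before training, mirroring the manual filtering step described above.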

📎 The raw gyroscope and accelerometer data I collected from myself can be accessed here (it takes a while to load because of the graphs).

📁 The feature selection data and the Random Forest training file are uploaded in this repository.

In the second approach, I collected data from 10 participants 👥 for three emotions (relaxed, frustrated, happy). I similarly dismissed the gyroscope data 🌀. The Random Forest model showed a similar accuracy of around 83% 📉 on this accelerometer data set. I then trained four different CNN models 🧠 to see which would perform best: a simple baseline CNN, a deep CNN, an attention-based model, and a residual model 🔬. The baseline and attention-based models had their hyperparameters optimized with the Optuna framework 🔧 (see the sketch below). In the end, all else being equal, the baseline CNN performed best after Optuna optimization 🏆. The file is accessible here. The data that was fed in was minimally preprocessed and not denoised in any way ✂️. Note, however, that the CNNs were trained on the three-emotion data set, i.e. 3 classes rather than 5.
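For illustration, here is a hedged Python sketch of how an Optuna search over a baseline 1D CNN's hyperparameters might look. The architecture, search ranges, window shape, and the randomly generated placeholder data are all assumptions; the actual models and training code live in the linked training file.

```python
# Hypothetical sketch: Optuna hyperparameter search over a baseline 1D CNN.
# Architecture, search ranges, and data shapes are illustrative assumptions.
import numpy as np
import optuna
from tensorflow import keras

NUM_CLASSES = 3            # relaxed / frustrated / happy
WINDOW, CHANNELS = 100, 3  # assumed: 2 s windows at 50 Hz, 3 accelerometer axes

# Placeholder data standing in for the windowed accelerometer segments.
x_train = np.random.randn(256, WINDOW, CHANNELS).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, 256)
x_val = np.random.randn(64, WINDOW, CHANNELS).astype("float32")
y_val = np.random.randint(0, NUM_CLASSES, 64)

def build_baseline_cnn(filters: int, kernel: int, dropout: float) -> keras.Model:
    """Small 1D CNN over raw (window, axes) accelerometer segments."""
    model = keras.Sequential([
        keras.layers.Input(shape=(WINDOW, CHANNELS)),
        keras.layers.Conv1D(filters, kernel, activation="relu"),
        keras.layers.MaxPooling1D(2),
        keras.layers.Conv1D(filters * 2, kernel, activation="relu"),
        keras.layers.GlobalAveragePooling1D(),
        keras.layers.Dropout(dropout),
        keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def objective(trial: optuna.Trial) -> float:
    model = build_baseline_cnn(
        filters=trial.suggest_categorical("filters", [16, 32, 64]),
        kernel=trial.suggest_int("kernel", 3, 7, step=2),
        dropout=trial.suggest_float("dropout", 0.1, 0.5),
    )
    model.fit(x_train, y_train, epochs=10, batch_size=32, verbose=0)
    _, acc = model.evaluate(x_val, y_val, verbose=0)
    return acc

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=30)
print("best hyperparameters:", study.best_params)
```

The attention-based and residual variants would swap in their own builder functions against the same objective; Optuna simply maximizes validation accuracy across trials.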

🛠️ My projects using the RF model currently in action:
🌐 phatedapp.com

🌐 portablehackerhouse.com

🛡️ The data is not stored on servers unless the Premium option is purchased in the Phated app to fine-tune the model; otherwise it is processed ephemerally ⚡.

.gitattributes CHANGED
@@ -57,3 +57,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 # Video files - compressed
 *.mp4 filter=lfs diff=lfs merge=lfs -text
 *.webm filter=lfs diff=lfs merge=lfs -text
+gyroscope_data_compiled_all.xlsx filter=lfs diff=lfs merge=lfs -text
gyroscope_data_compiled_all.xlsx ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f2c9a6f8d506418500aebd211f7b1897bfadda1f3d67a87348f71b713961d081
+size 188341416