When a developer programs, she experiences a broad range of emotions, from happiness and joy to sheer frustration. Psychology has done a lot of work on the connection between emotional state and productivity: if you are happier, you work better. This connection, however, has hardly been studied in software engineering.
Some work has been done based on IDE interactions (if there is less activity, a developer might be stuck), but Sebastian wants to go further and take biometric measurements into account. It has been established that a person's emotional state can be detected from biometric signals. Sebastian builds on this and investigates whether these signals can be used in software engineering too. A study was conducted with 17 participants: 6 professionals and 11 students. They had to perform two small change tasks for 30 minutes, while wearing the sensors:
When strong positive or negative emotions were sensed, the participants got a small questionnaire in which they had to rate their progress, valence (happiness) and arousal, and indicate the reasons for the current emotion.
Before the study, they had to watch a two-minute video of a fish tank, to serve as a baseline. After the tasks, they were shown happy and sad pictures to induce emotions and cross-validate.
This all resulted in 213 data points, with which the research questions could be answered.
Correlation between emotions and progress
It turns out that valence has a very strong correlation with progress, but the participant has a large effect too: apparently, this correlation differs per person. Sebastian found there are two types of people: a correlation was found for 12 participants, but not for the other 5.
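The paper's exact analysis isn't spelled out here, but the per-participant check can be sketched as computing a Pearson correlation between each participant's valence and progress ratings. The participant IDs and rating values below are invented for illustration:

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical questionnaire data: per participant, a series of
# valence ratings paired with self-reported progress ratings.
ratings = {
    "P1": ([2, 4, 5, 3, 4], [1, 3, 5, 2, 4]),  # valence tracks progress
    "P2": ([3, 3, 4, 2, 5], [4, 1, 2, 5, 3]),  # no clear relationship
}

for pid, (valence, progress) in ratings.items():
    print(pid, round(pearson(valence, progress), 2))
```

Running this per participant would separate the two groups: those whose coefficient is strong and positive, and those near zero.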
Reasons for change
So, what makes developers happy? Finding relevant parts of the code. But this also appears on the list of frustrating things: not being able to find code, or not knowing what steps to take next.
Emotions and progress are often connected: participants specifically reported being frustrated because they did not get something done.
Can biometric sensors be used to determine emotions?
The machine learning algorithm Sebastian developed correctly predicted emotions in 82% of the cases (an improvement of about 40% over a random baseline). For progress, 68% of the cases were correctly classified (an improvement of 30% over random). So yes, biometric sensing works.
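The talk doesn't detail which classifier or features were used, but the general idea can be illustrated with a minimal nearest-centroid classifier over baseline-normalized biometric features. All feature values and labels below are invented for the sketch:

```python
from math import dist  # Euclidean distance between two points

# Hypothetical training data: feature vectors of (heart-rate delta from
# the fish-tank baseline, skin-conductance delta), labeled with the
# self-reported emotion class from the questionnaires.
train = [
    ((12.0, 0.8), "negative"),
    ((10.0, 0.6), "negative"),
    ((-3.0, 0.1), "positive"),
    ((-5.0, 0.2), "positive"),
]

def centroids(samples):
    # Average the feature vectors of each class.
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def classify(sample, cents):
    # Assign the class whose centroid is closest to the sample.
    return min(cents, key=lambda label: dist(sample, cents[label]))

cents = centroids(train)
print(classify((11.0, 0.7), cents))  # prints "negative"
```

A real pipeline would of course use richer features and a stronger model, and report accuracy via cross-validation over the 213 data points.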
This gives rise to nice future work, for example tools that communicate that a developer is in flow and really should not be disturbed, or that suggest help when a developer is really stuck. But of course, there are some interesting privacy and ethical issues here.