There is some debate surrounding the birth of AI (Artificial Intelligence). John McCarthy coined the term for the Dartmouth Conference in 1956, the year Elvis Presley released his first hit record, 'Heartbreak Hotel'. At around the same time, IBM created a computer with a 5 MB hard drive. A breakthrough in technology, the computer as a whole weighed in at around a ton and spanned 16 square metres of floor space; the hard drive alone required a forklift truck to move it around. To add a little perspective, it would be another four years until the Etch A Sketch appeared and charmed children (and adults) across the globe. I genuinely believe it had about the same processing power as my first home computer!
Advances in research are supported by the relentless pace of technology. Gordon Moore's law, formulated in 1965, is still as relevant today as it was then. We are now in an age where machines are genuinely challenging the Turing Test in their display of 'cognitive awareness'.

Blurring the boundaries

As humans, we are extremely good at matching complex patterns. Computers have always struggled with some of the things we take completely for granted, such as spotting a friend in a busy crowd, reading poorly scrawled handwriting or picking out a specific conversation in a noisy environment. Our unique ability to perform these tasks is quickly being eroded: cognitive services now exist that promise these abilities without human intervention. At least, that is what the marketing materials say. We decided to partner with Microsoft to see how their facial emotion service really performs.

Going live at Microsoft

Our challenge, at Objectivity, was simple: with very limited time and resources, conduct a proof-of-concept exercise whose output would be an application performing real-time emotional analysis of an audience. We would need a venue, an audience and, most probably, a few late evenings. Microsoft kindly solved the first challenge; our account managers called on potential and existing customers to be willing participants. We presented our findings on 16th March 2017 at Microsoft's Paddington offices.

Our PoC was a combination of Python, Microsoft Cognitive Services and open-source technologies. Prayers, crossed fingers and the occasional whispered expletive also contributed to the overall experience. The live demo went extremely well. A webcam, focused on the audience, continually passed data to the Microsoft Emotion API. Our application processed the information from the API and presented it back to us in a visually compelling way. The diagram shows how we put it all together.
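Under the hood, each webcam frame was simply sent to the Emotion API as a binary JPEG, and the per-face scores came back as JSON. The sketch below is illustrative rather than our exact code: the endpoint URL reflects the standalone Emotion API as it existed at the time (it has since been folded into the Azure Face API), and the sample response values are invented.

```python
EMOTION_URL = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"

def build_request(api_key, frame_jpeg):
    """Assemble the pieces of one Emotion API call for a JPEG frame (bytes).

    The caller would pass these to e.g. requests.post(url, headers=headers, data=body).
    """
    headers = {
        "Ocp-Apim-Subscription-Key": api_key,        # per-account subscription key
        "Content-Type": "application/octet-stream",  # raw image bytes, not JSON
    }
    return EMOTION_URL, headers, frame_jpeg

def dominant_emotion(face):
    """Return the highest-scoring emotion for one detected face."""
    scores = face["scores"]
    return max(scores, key=scores.get)

# Shape of a single face entry in the API response; the numbers are invented.
sample_face = {
    "faceRectangle": {"left": 68, "top": 97, "width": 64, "height": 64},
    "scores": {"anger": 0.01, "contempt": 0.02, "disgust": 0.0, "fear": 0.0,
               "happiness": 0.90, "neutral": 0.05, "sadness": 0.01, "surprise": 0.01},
}

print(dominant_emotion(sample_face))  # -> happiness
```

In the real demo this ran in a loop against frames grabbed from the webcam, with the JSON results feeding the visualisation.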
I think the best part of the demo was watching the happiness measurement jump when I asked for a smile to send back to my managing director. We learned a lot from the experience; it was encouraging how easily the different technologies worked together.

So how well did we do? A few examples show how it all works far better than I can describe. The examples here are taken from our pre-event tests.

The first image shows some of our team at work and, as a result, naturally happy. If you look closely, you will see that the application failed to identify one of the faces, because he is partly hidden by the person in front of him. The chart shows an anonymous summary of the prevailing emotion of the group.

The second image is interesting. It shows the winners of our IDEA_App competition. Michal, on the right, is largely facing away and, not unreasonably, the API does not detect his face. The prevailing emotion for Blazej, on the left, is contempt. From a purely facial perspective, this seems like a fair analysis; in reality, the overall context shows a team proud to be voted winners. It's a great example of how complex we are: whilst our faces show clear emotions, our body language as a whole must be considered to understand how we truly feel. The chart in this case shows a broader distribution of emotions, but the prevailing sentiment of happiness seems to be correct.

Final remarks

In conclusion, the Emotion API performed extremely well. In all of our tests, the prevailing emotions were in line with human judgement. On its own, there are use cases where the Emotion API can provide value; coupled with sentiment analysis and other emerging technologies, the possibilities really start to get interesting. Some scenarios are:
- Measuring the emotions of an audience at a seminar, workshop or presentation to show how engaged the audience is.
- Having advertising boards with motion sensors that detect when someone approaches and measure their interest. Of course, the same applies to shop windows or even online advertising.
- User experience testing, measuring emotions coupled with eye tracking when browsing a site or using an application.
- Monitoring students to show the general level of interest in a lecture. If their faces are not recognised, they're not looking forwards!
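For what it's worth, the 'prevailing emotion' charts described earlier reduce to a very simple aggregation: sum each emotion's score across every detected face and take the maximum. A minimal sketch of that idea (the field names follow the Emotion API's response shape; the face data is invented):

```python
from collections import defaultdict

def prevailing_emotion(faces):
    """Sum each emotion's score across all detected faces and
    return the emotion with the highest combined total."""
    totals = defaultdict(float)
    for face in faces:
        for emotion, score in face["scores"].items():
            totals[emotion] += score
    return max(totals, key=totals.get)

# Invented example mirroring the second image: one face leaning towards
# contempt, one clearly happy one.
faces = [
    {"scores": {"happiness": 0.30, "contempt": 0.55, "neutral": 0.15}},
    {"scores": {"happiness": 0.85, "contempt": 0.05, "neutral": 0.10}},
]
print(prevailing_emotion(faces))  # -> happiness (1.15 vs 0.60 for contempt)
```

Summing rather than voting means a single very confident face can tip the group result, which matched how our charts behaved in practice.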
In each of these scenarios, real-time feedback allows the presenter, tutor or organisation to adjust their approach based on what they are being told. I'm sure you can think of many more ways this could benefit your business. I'd like to thank Microsoft for their help with this exercise, and everyone who came along to hear what we had to say. Innovation is the lifeblood that delivers value for our customers and attracts the best people to our business.