Key take-aways from the Sentiment Analysis Symposium 2016
In July 2016, I was fortunate enough to speak at the Sentiment Analysis Symposium in New York. It is one of the most important events for those who invent text analytics solutions and for those who use them. I also attended the co-located sentiment analysis tutorial run by Jason Baldridge. Overall, this was an excellent event to get up to speed on the current state of affairs and the future outlook. So, here are my key take-aways.
1. Personality mining
Sentiment Analysis is now used not just for opinion mining, but also for detecting people’s personalities. This leads to a multitude of new business use cases. For example, it turns out that people who are less neurotic and more open in their statements are more susceptible to ads.
Furthermore, the level of neuroticism can be guessed from their language.
2. It’s all about data!
When doing Sentiment Analysis, data trumps lexicons and (in many cases) algorithms. You often get ‘more bang for your buck’, that is, better accuracy, by annotating more examples rather than by experimenting with better algorithm settings.
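To make this concrete, here is a minimal sketch (my own illustration, not from any talk) of how one might compare the two kinds of effort with scikit-learn; the `examples` list of (text, label) pairs is a placeholder for your own annotated data.

```python
# Sketch: more labelled data vs. better algorithm settings for the same classifier.
# `examples` is a placeholder list of (text, sentiment_label) pairs.
from sklearn.pipeline import make_pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

texts = [text for text, _ in examples]
labels = [label for _, label in examples]

# Option A: keep default settings, annotate and add more examples.
for n in (500, 1000, 2000, 4000):
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    score = cross_val_score(model, texts[:n], labels[:n], cv=5).mean()
    print(f"{n} examples, default settings: {score:.3f}")

# Option B: keep the data fixed, tune the regularization strength instead.
grid = GridSearchCV(
    make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000)),
    param_grid={"logisticregression__C": [0.01, 0.1, 1, 10]},
    cv=5,
)
grid.fit(texts[:500], labels[:500])
print(f"500 examples, tuned settings: {grid.best_score_:.3f}")
```

On many sentiment datasets, curve A keeps improving long after search B has plateaued, which is exactly the ‘more bang for your buck’ point.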
3. Deep Learning is ‘just’ a better Machine Learning
Deep Learning trumps traditional Machine Learning when it comes to capturing non-regularities in your data, and it means you need to do less feature engineering. But the same thing applies here: garbage in, garbage out.
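As a rough, hypothetical illustration of the ‘less feature engineering’ point, the Keras sketch below feeds raw text straight into the model and lets an embedding layer learn the features; `texts` and `labels` are placeholders for your own data, and nothing here protects you from the garbage-in, garbage-out problem.

```python
# Sketch: a small Keras text classifier that learns its own features from raw text.
# `texts` (list of strings) and `labels` (list of 0/1 ints) are placeholders.
import tensorflow as tf

vectorize = tf.keras.layers.TextVectorization(max_tokens=20000, output_sequence_length=100)
vectorize.adapt(texts)  # build the vocabulary from raw, un-engineered text

model = tf.keras.Sequential([
    tf.keras.Input(shape=(), dtype=tf.string),        # one raw string per example
    vectorize,                                         # strings -> token ids
    tf.keras.layers.Embedding(20000, 64),              # learned word features
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),    # positive vs. negative
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(tf.constant(texts), tf.constant(labels), epochs=5, validation_split=0.2)
```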
4. Emotion Detection is the new black
Anjali Lai from Forrester gave an excellent keynote on emotions, arguing that emotion drives loyalty, and loyalty drives business. Companies that can evoke certain emotions in their customers, for example by telling compelling brand stories, will be more successful.
5. Wheels of emotions are overrated
Several presenters and attendees discussed various wheels of emotions, one of which even appeared in the event’s logo. But not all emotions are equally useful. According to Forrester, some specific emotions correlate with loyalty: for example, customers who feel surprised by a brand spend more.
So, how would one detect emotions?
- Detecting emotions in text is usually lexicon-driven and relatively simple (see the sketch after this list), but not all people talk emotively;
- Using visual cues to analyze facial expressions is language-independent. However, the effective cues differ from one use case to another, and facial coding, in general, won’t work on a poker face;
- Analyzing the tone of voice is culturally agnostic but requires a large training set and is limited in the number of emotions it can detect;
- Measuring heartbeat or brain activity via fMRI is objective, but equipment-heavy and intrusive.
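For the first option, a lexicon-driven detector can be as simple as counting matches against per-emotion word lists. The sketch below uses a toy lexicon I made up for illustration; real systems draw on much larger resources such as the NRC Emotion Lexicon, and, as noted above, it will stay silent when people don’t talk emotively.

```python
# Sketch: lexicon-driven emotion detection with a tiny, made-up lexicon.
import re
from collections import Counter

EMOTION_LEXICON = {
    "joy": {"love", "great", "delighted", "happy", "wonderful"},
    "anger": {"hate", "furious", "terrible", "awful", "annoyed"},
    "surprise": {"wow", "unexpected", "amazed", "astonishing", "surprise"},
}

def detect_emotions(text):
    """Count lexicon hits per emotion; all-zero counts mean no emotive language found."""
    tokens = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for emotion, words in EMOTION_LEXICON.items():
        counts[emotion] = sum(token in words for token in tokens)
    return counts

print(detect_emotions("Wow, I love this brand, such a wonderful surprise!"))
# Counter({'joy': 2, 'surprise': 2, 'anger': 0})
```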
I loved how diverse the conference was. Technologists, marketers, analysts, academics and business people from at least 6 different countries attended. Kudos to the organizer, Seth Grimes, who also made a noticeable effort to ensure that the speakers and panelists were diverse.
If you’d like to speak at or attend a similar event, check out LT-Accelerate in Brussels, coming up in November. I will be speaking at the Customer Experience Management Asia Summit in September and at O’Reilly’s Strata + Hadoop World in December, both in Singapore. Drop me a line if you plan to attend any of these!