September 28, 2011
Text Analytics News caught up with three professionals at companies that are pioneering text analytics and big data to ask them some questions about their personal experiences in the analytics industry and the future outlook of the text analytics market. The three professionals include:
- Catherine Van Zuylen – Vice President of Products, Attensity
- Tom H. C. Anderson – President, Anderson Analytics
- Daniel Graham - General Manager Enterprise Systems, Teradata
Q. When and how did you first get into text analytics, and what was it that attracted you to this technology in the first place?
Anderson: From 2001 through 2005 I worked on Starwood Hotels' Customer Satisfaction Program, and during that time we had over a million VOC surveys coming in per year. It was very different from other types of research I had conducted in the past, and while we were using very advanced analysis on quantitative data points, I realized that the text comments could add so much more explanatory power to the analysis.
In graduate school my main area of interest was in gaining consumer insights through data mining, and though text mining (NLP) was still relatively new, at least in terms of adoption, I started exploring the various software packages already on the market and found we could more easily answer various business questions by leveraging the unstructured data available to us. So in 2005 I started Anderson Analytics, and during the past six years we've leveraged these new techniques in a number of ways I hadn't even thought of at the time.
The rise of social media during this time certainly helped propel this field and offered even greater opportunities. Expanding beyond survey research to working with data from bulletin boards, LinkedIn, Facebook and Twitter has certainly kept it exciting.
Graham: Around 2000, I was at IBM working with Watson Labs on text analysis projects. At the time, we were working with a stock brokerage in Manhattan with their call center. Consumers would call in and tell the contact center person dozens of personal things, some of which could be used to improve service and also sell more to the consumer. I learned a great deal about ontologies, text processing, and text analytics. At about the same time, a few Professional Services people were able to help Ford Motor use a data warehouse to solve the Firestone tire blowout problem affecting their SUVs and trucks. There was a lot of text processing and data warehouse analytics needed for that.
Being in analytics for most of my career, I found the use of text analysis fascinating. There was huge potential in many applications. I think the complexity of the text problems was also attractive – these were tough computer science problems, which just makes it all the more fun when you solve them. I have always been fascinated with why the computer can't read. This is a start.
Zuylen: I’ve been involved in the text analytics space for nearly a decade. I was attracted to it because it provides the ability to truly organize the world’s textual information – transforming research papers, tweets, emails, and billions of other documents into actionable insights that have the power to transform human knowledge and drive real global understanding and innovation.
Q. How does your company use text analytics?
Graham: Teradata builds data warehouses for the Global 3000 and mid-market customers. We are helping our customers add text into the data warehouse, primarily in the area of customer relationship marketing. Numerous customers have already been doing text processing and feeding the results into the data warehouse for analysis. It’s simple enough to capture emails and tweets in the data warehouse.
Deriving value from these sources allows marketing people to assess likes and dislikes about their products and how better to promote them. This drove us to partnerships with Attensity, SAS, and Clarabridge and our acquisition of AsterData. AsterData opens up social media analytics and social networking graphs so we can tie together consumer preferences with key influencers in the marketplace. Sifting through an internet full of text requires extreme scalability and parallel processing coupled with the text operators AsterData provides. It’s exciting.
Anderson: Initially Anderson Analytics leveraged what was already on the market. Notably, early on we were heavy users of Text Mining for Clementine. It was very much about finding the best tool for the job. We found that no one vendor had the best solution for every situation.
There are more social media monitoring companies now than I can keep track of. Eventually, after trying so many different tools on so many projects, you begin to develop a set of best practices. We published and won awards for some of our early white papers on leveraging text analytics in traditional market research as well as social media analysis. Finally we realized that vendors weren’t really offering something for the needs of our main use case (market research), so we started to build our own internal tools and are now developing OdinText, which truly leverages what we have learned over the years.
Zuylen: At Attensity, we use text analytics to bring order out of chaos so companies can better connect with their customers. We take billions of social media posts, emails, CRM records, and other documents in 32 languages, analyze them, and route them to the appropriate person for action.
Q. What do you think the future of text analytics holds for us?
Zuylen: Not since the Tower of Babel have we been presented with the ability to rapidly share information on such a global scale – which I believe has the power to transform the world in such disparate efforts as cancer research, world diplomacy, customer understanding, and more. As we continue to innovate, it will become easier and easier for us to “read” more languages and structure an ever-growing amount of information so that we can more effectively learn from each other and accelerate our improvement of the global condition.
Graham: Text is one form of “big data” – it’s high volume and continuous in nature. Right now, the hot spot is in social media analytics, things like Facebook and Twitter where consumers voice opinions. There is a lot of work to do to harness this treasure trove of data and convert it into marketing decisions that grow revenues. Similarly, a lot of brand management and customer support will get immense value from a large corpus of text pulled from the internet into the data warehouse. The transition from anecdotal decision making to decisions based on trends derived from huge amounts of text can only be good for us all. So it’s easy to predict social media analytics will drive text analytics into the mainstream. This will open up dozens of other text analytic uses such as fraud detection, risk analysis, healthcare research, government trend analysis, security applications, etc. It will take another 10 years before the computer can read, so this is a fun time of innovation.
Anderson: Certainly, as I said earlier, social media is an interesting and game-changing technology. It’s now more important than ever to utilize text analytics to understand the relationship between company, employee and customer.
While we do not know yet where social media will go, or all the ways we can leverage it for competitive advantage, I think it’s equally important not to forget about the many other sources of text data. As I support development and innovation in this field, I feel very strongly that in the near future true value will come less from incremental advances in NLP and more from use-case expertise provided by domain experts. Ours happens to be consumer insights, but there will be many more software options leveraging text analytics in fields like pharmaceuticals, finance, and security, all providing value by incorporating unique domain knowledge into methods and software applications.