
When Big Data is not so big anymore


We are inundated with information. There is so much of it around us that a special term was coined - Big Data - to emphasize its sheer size.

Dealing with such a large amount of data is, of course, a problem, and various solutions have been created to address it efficiently.

At nmodes we developed a semantic technology that accurately filters relevant conversations. We applied it to social networks, particularly Twitter. Twitter is a poster child of Big Data. They have 500 million conversations every day. A staggering number. And yet, we found that for many topics, when they are narrowed down and accurately filtered, there are not that many relevant conversations after all.

No more than five people, for example, are looking for CRM solutions on Twitter on an average day. Even fewer - two per day on average - are explicitly asking for new web hosting providers, although many more are complaining about their existing providers (which might or might not suggest they are ready to switch or looking for a new option).

We often have businesses coming to us asking to find relevant conversations and expecting a large number of results. This is what Big Data is supposed to deliver, they assume. This expectation is likely a product of our 'keyword search dependency'. Indeed, when we run a keyword search on Twitter, a search engine, or anywhere else, we get a long list of results. The fact that most of them (up to 98% in many cases) are irrelevant is often lost in the visual illusion of having this long, seemingly endless list in front of our eyes.

With quality solutions that accurately deliver only relevant results, we experience, for the first time, a situation in which there are no longer long lists of random results - only a handful of relevant ones.

This is so much more efficient. It saves time, increases productivity, clarifies the picture, and makes Big Data manageable.  

Time for businesses to embrace the new approach.

 

Interested in reading more? Check out our other blogs:

Meet Eliza, the Mother of AI


Today, Artificial Intelligence seems to be the buzz of every major enterprise. Salesforce is formally announcing Einstein this fall, IBM has worked on Watson for years now, and after 20 years of working with AI, Microsoft has made a few attempts to bring the technology to the market. With all this activity, you may be asking yourself what kind of impact AI will have on you and your business, and where you might want to look to investigate the possibilities Artificial Intelligence represents.

Before we discuss how AI will impact customer support and consumer experience, and how you may leverage it in your contact center, I thought it would be fun to take a look at where AI got its start.

The term AI was coined in 1956 by computer scientist John McCarthy, who organized the Dartmouth Conference that year to advance the ideas and technologies associated with machine intelligence. While this collective of thought leaders and scientists made huge advancements through programs at MIT and elsewhere, most of their work circulated only in academic fields.

Not many people were aware of Artificial Intelligence, how it worked, or its potential uses until around 1964, when MIT computer scientist Joseph Weizenbaum wrote Eliza, a program based on Natural Language Processing that was able to successfully question and respond to human interactions in such a way as to almost sound like a real human being. With almost no information about human responses, Eliza was able to use scripts and pattern matching to simulate the exchanges that might occur between two people.

The most famous of these simulations, highlighting AI's ability to intersect with modern needs and technology, was DOCTOR, which was able to question and respond to a human in such a way as to almost sound like an actual psychotherapist. As the human subject made statements, DOCTOR asked questions and made statements relevant to the conversation as if it were a present and conscious being… almost.
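The script-and-pattern-matching approach described above can be sketched in a few lines of code. The rules and responses below are hypothetical miniatures in the spirit of DOCTOR, not Weizenbaum's original script: each rule pairs a pattern with response templates, and captured text is "reflected" back at the speaker.

```python
import re
import random

# First-person words swapped for second-person ones when reflecting input back.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Hypothetical example rules: a regex pattern plus response templates.
# The final catch-all rule keeps the conversation going when nothing matches.
RULES = [
    (re.compile(r"i feel (.*)", re.I),
     ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (re.compile(r"i am (.*)", re.I),
     ["Why do you say you are {0}?"]),
    (re.compile(r".*"),
     ["Please tell me more.", "I see. Go on."]),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones ('my job' -> 'your job')."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(statement: str) -> str:
    """Apply the first matching rule, reflecting any captured text into the template."""
    for pattern, templates in RULES:
        match = pattern.match(statement.strip())
        if match:
            template = random.choice(templates)
            if match.groups():
                return template.format(reflect(match.group(1)))
            return template
    return "Please go on."

print(respond("I feel anxious about my job"))
```

Saying "I feel anxious about my job" yields a reply such as "Why do you feel anxious about your job?" - the program has no understanding of jobs or anxiety, yet the reflected phrasing creates a convincing illusion of attention, which is exactly the effect Weizenbaum observed.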

Over the years, computer scientists, whether academics or industry professionals, have worked tirelessly to improve on these developments, hoping to deliver a computer program capable not only of asking and responding but of understanding the context of a conversation. A program that can relate relevant data to its responses, thus providing value to the human it's conversing with, while helping to chart the course of the conversation, just as if you and I were talking over a cup of coffee or across a conference room table.

Why is this important, you may ask? With the introduction of Chatbots, we began to see some of the potential of Artificial Intelligence. Companies could now front-end customer chat interactions, allowing them to be more responsive to their customers while shortening wait times and deflecting inquiries from the call center, which, as we all know, is hugely expensive.

The one problem with Chatbots? Customers hated dealing with limited technology that was cold, often incorrect, and frustrating. People are accustomed to the cold, sterile nature of technology when they type numbers into a phone to be routed, but they expect a human when they are chatting. These negative experiences have made a number of companies a little gun-shy about implementing true Artificial Intelligence. The last thing a business wants is a customer complaining, especially on Social Media, about a poor customer experience due to a bad interaction with technology.

There is a significant difference between Chatbot technology and true AI, and consequently the outcomes and customer experience are proving to be very different. Where a Chatbot is more like an IVR, answering simple questions and routing customers to the correct agent, Artificial Intelligence is aware of the conversation and able to present relevant responses, delivering faster answers, shorter customer interaction times, and better customer service. I mean, if Eliza's DOCTOR could simulate a psychotherapist in 1964, what can AI do for your contact center in 2016?


The End of the Digital Monitoring Paradigm


The digital industry is changing rapidly.

For the last decade, analysis of social chatter and capture of consumer sentiment were considered the cutting edge of marketing strategy. In those early days of the new era of digital information, businesses were told to listen to what the market was saying about them. They were educated on the importance of media monitoring and the advantages it creates for strategic growth.

This picture has become outdated.

Listening to Big Data, in all its aspects and forms, is no longer enough. Once you have successfully listened to and understood what a customer is saying, the next natural step is to act, or respond. And so the digital domain is now expanding to include responses, with a host of innovative technological solutions reshaping the field rapidly. Advances in artificial intelligence in particular create disruptive, scalable opportunities in a space traditionally known for its slow manual progression.

Facebook was among the first to enter the market, introducing bots into the process of connecting users with brands. Then came Microsoft's turn.

Following these developments bots became the hottest trend in Silicon Valley in 2016.

nmodes fits seamlessly into this new world order. We deliver AI solutions that power the business sales process. Our listening solution accurately monitors and captures the real-time needs and interests of individual customers within a defined audience. And our Intelligent Assistant solution brings scalability to responses without compromising on quality.

 
