Chatbots and assistant programs designed to support conversations with human users rely on natural language processing (NLP), a field of scientific research that aims to make computers understand the meaning of sentences in natural language. The algorithms developed by NLP researchers helped power the first generation of virtual assistants such as Siri and Cortana. Now the same algorithms are available to the developer community, helping companies build their own specialized virtual assistants. Industry products that offer NLP capabilities based on these algorithms are often called AI engines.
The most powerful and advanced AI engines currently available on the market are (in no particular order): IBM Watson, Google DialogFlow, Microsoft LUIS, and Amazon Lex.
All these engines use intents and entities as the primary linguistic identifiers to convey the meaning of incoming sentences, and all of them offer conversation flow capability. In other words, intents and entities help the engine understand what the incoming sentence is about. Once the incoming sentence is correctly identified, you can use the engine to provide a reply. Repeat these two steps enough times and you have created a conversation, or dialog.
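To make the idea concrete, here is a minimal, engine-agnostic sketch in Python of how a detected intent and its entities might be turned into a reply. The intent names, entity types, and reply templates are hypothetical examples and do not reflect any specific engine's API.

    # Hypothetical intent-to-reply mapping; real engines let you define these.
    INTENT_REPLIES = {
        "check_order_status": "Sure, let me look up order {order_id} for you.",
        "greeting": "Hello! How can I help you today?",
    }

    def handle_message(intent, entities):
        # Pick a reply template for the detected intent and fill in any entities.
        template = INTENT_REPLIES.get(intent, "Sorry, I did not understand that.")
        return template.format(**entities)

    # An engine would classify "Where is my order 12345?" roughly as the
    # intent "check_order_status" with an "order_id" entity:
    print(handle_message("check_order_status", {"order_id": "12345"}))

The engine's job is the hard part, classifying the free-form sentence into the intent and extracting the entities; the reply selection shown here is deliberately trivial.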
In terms of language processing ability and simplicity of user experience, IBM Watson and Google DialogFlow are currently ahead of the pack. Microsoft LUIS is adequate too; still, keeping in mind that Microsoft is aggressively territorial and prefers users to stay within its ecosystem, it is most efficient to use LUIS together with other Microsoft products such as MS Bot Framework.
Using the AI engine's conversation flow to create dialogs makes building conversations a simple, almost intuitive, task with no coding involved. On the flip side, it limits how natural and flexible your conversations can be. The alternative, delegating the conversation flow to the business layer of your chatbot, adds richness and flexibility to your dialog but makes the process more complicated, as it now requires coding. You cannot sell the cow and drink the milk at the same time, can you?
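As an illustration of the second approach, here is a simplified Python sketch of conversation flow kept in the business layer as a hand-rolled state machine, with the engine used only for intent and entity detection. The states, intents, and slots are hypothetical examples, not taken from any particular product.

    class BookingDialog:
        # Conversation flow lives here, in application code, rather than in the
        # engine's visual dialog builder.
        def __init__(self):
            self.state = "ASK_DESTINATION"
            self.slots = {}

        def next_reply(self, intent, entities):
            if self.state == "ASK_DESTINATION":
                if "destination" in entities:
                    self.slots["destination"] = entities["destination"]
                    self.state = "ASK_DATE"
                    return "When would you like to travel?"
                return "Where would you like to go?"
            if self.state == "ASK_DATE":
                if "date" in entities:
                    self.slots["date"] = entities["date"]
                    self.state = "DONE"
                    return "Booking a trip to {destination} on {date}.".format(**self.slots)
                return "Which date works for you?"
            return "Is there anything else I can help with?"

    dialog = BookingDialog()
    print(dialog.next_reply("book_trip", {"destination": "Paris"}))   # asks for a date
    print(dialog.next_reply("provide_date", {"date": "May 3"}))       # confirms the booking

The price of this flexibility is exactly what the paragraph above describes: every state, slot, and fallback now has to be written and maintained by your developers.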
Amazon Lex lacks the semantic sophistication of its competitors. One could say (somewhat metaphorically) that IBM Watson was created by linguists and computer scientists, while Amazon Lex was created by salespeople. As a product it is well packaged and initially pleasing to the eye, but once you start digging deeper you notice the limitations. Also, Amazon has traditionally excelled in the voice recognition component (Amazon Alexa) rather than in actual language processing.
The space of conversational AI is fluid and changes happen rapidly. The existing products are evolving continuously and a new generation of AI engines is in the process of being developed.

What Is an AI Engine and Do I Need It?

The End of the Digital Monitoring Paradigm
The digital industry is changing rapidly.
For the last decade, analysis of social chatter and capture of consumer sentiment were considered the cutting edge of marketing strategy. In those early days of the digital information era, businesses were told to listen to what the market was saying about them. They were educated on the importance of media monitoring and the advantages it creates for strategic growth.
This picture has become outdated.
Listening to Big Data, in all its aspects and forms, is no longer enough. Once you have listened and understood what the customer said, the natural next step is to act, or respond. And so the digital domain is now spreading to include responses, with a host of innovative technological solutions reshaping the field rapidly. Advances in artificial intelligence in particular create disruptive, scalable opportunities in a space traditionally known for its slow manual progression.
Facebook was among the first to enter the market, introducing bots into the process of connecting users with brands. Then came Microsoft's turn.
Following these developments bots became the hottest trend in Silicon Valley in 2016.
nmodes fits seamlessly into this new world order. We deliver AI solutions that power the business sales process. Our listening solution accurately monitors and captures the real-time needs and interests of individual customers within a defined audience. And our Intelligent Assistant solution brings scalability to responses without compromising on quality.
When Big Data Is Not So Big Anymore
We are inundated with information. There is so much information around us that a special term, Big Data, was coined to emphasize its sheer size.
Dealing with such large amounts of data is, of course, a problem, and various solutions have been created to address it efficiently.
At nmodes we developed a semantic technology that accurately filters relevant conversations. We applied it to social networks, particularly Twitter. Twitter is the poster child of Big Data, with 500 million conversations every day - a staggering number. And yet we found that for many topics, once they are narrowed down and accurately filtered, there are not that many relevant conversations after all.
No more than 5 people are looking for CRM solutions on an average day on Twitter. Even fewer - two per day on average - are explicitly asking for new web hosting providers, although many more are complaining about their existing providers (which may or may not mean they are ready to switch or are looking for a new option).
We often have businesses coming to us asking us to find relevant conversations and expecting a large number of results. This is what Big Data is supposed to deliver, they assume. Such an expectation is likely a product of our 'keyword search dependency'. Indeed, when we run a keyword search on Twitter, on search engines, or anywhere else, we get a long list of results. The fact that most of them (up to 98% in many cases) are irrelevant is often lost in the visual illusion of having this long, seemingly endless list in front of our eyes.
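As a toy illustration of this 'keyword search dependency', the Python sketch below contrasts a plain keyword match with a crude relevance filter. The sample tweets and the filtering rules are made up for the example; they are not how nmodes' semantic technology actually works.

    tweets = [
        "Looking for a CRM recommendation for my small business - any advice?",
        "Just renewed my CRM certification, feeling accomplished!",
        "Our CRM crashed again, what a morning",
        "Anyone hiring CRM administrators in Toronto?",
    ]

    # Keyword search: every tweet containing "CRM" comes back.
    keyword_hits = [t for t in tweets if "crm" in t.lower()]
    print(len(keyword_hits))   # 4 results, most of them useless to a CRM vendor

    # Relevance filtering: keep only tweets that express a purchase need.
    buying_signals = ("looking for", "recommendation", "any advice")
    relevant = [t for t in tweets if any(s in t.lower() for s in buying_signals)]
    print(len(relevant))       # 1 result - the only actual sales opportunity

Even in this tiny example, the keyword list looks four times longer while containing exactly one genuine opportunity; at Twitter scale the same effect produces the long, mostly irrelevant lists described above.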
With quality solutions that accurately deliver only relevant results, we experience, for the first time, a situation where there are no longer big lists of random results - only a handful of relevant ones.
This is so much more efficient. It saves time, increases productivity, clarifies the picture, and makes Big Data manageable.
Time for businesses to embrace the new approach.