What Is an AI Engine and Do I Need One?

Chatbots and assistant programs designed to support conversations with human users rely on natural language processing (NLP), the field of research that aims to make computers understand the meaning of sentences in natural language. The algorithms developed by NLP researchers helped power the first generation of virtual assistants such as Siri and Cortana. Now the same algorithms are available to the developer community, helping companies build their own specialized virtual assistants. Industry products that offer NLP capabilities based on these algorithms are often called AI engines.
The most powerful and advanced AI engines currently on the market are (in no particular order) IBM Watson, Google DialogFlow, Microsoft LUIS, and Amazon Lex.
All of these engines use intents and entities as the primary linguistic identifiers that convey the meaning of incoming sentences, and all of them offer conversation flow capability. In other words, intents and entities help the engine understand what an incoming sentence is about. Once the incoming sentence is correctly identified, you can use the engine to provide a reply. Repeating these two steps many times creates a conversation, or dialog.
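As a rough illustration, here is a minimal Python sketch of that two-step loop. The `detect_intent` and `reply_for` functions are invented for this example; a real implementation would call the engine's API instead of matching keywords.

```python
# Minimal sketch of the two-step loop described above: classify the
# incoming sentence into an intent plus entities, then pick a reply.
# detect_intent() is a toy keyword matcher standing in for a real
# AI engine call; each engine exposes an equivalent operation under
# its own name.

def detect_intent(sentence: str) -> dict:
    """Toy stand-in for an AI engine's intent/entity detection."""
    text = sentence.lower()
    if "hello" in text or "hi" in text:
        return {"intent": "greeting", "entities": {}}
    if "table" in text or "dine" in text:
        entities = {"city": "Paris"} if "paris" in text else {}
        return {"intent": "book_table", "entities": entities}
    return {"intent": "unknown", "entities": {}}

def reply_for(intent: str, entities: dict) -> str:
    """Map the detected intent to a canned reply."""
    if intent == "greeting":
        return "Hello! How can I help?"
    if intent == "book_table":
        city = entities.get("city", "your city")
        return f"Looking for a table in {city}. For how many people?"
    return "Sorry, I didn't understand that."

# Repeating understand-then-reply produces a conversation (dialog).
for sentence in ["Hello!", "I want to dine in Paris tonight"]:
    result = detect_intent(sentence)
    print(reply_for(result["intent"], result["entities"]))
```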
In terms of language processing ability and simplicity of user experience, IBM Watson and Google DialogFlow are currently ahead of the pack. Microsoft LUIS is capable too; still, keeping in mind that Microsoft is aggressively territorial and prefers users to stay within its ecosystem, LUIS is most efficient when used together with other Microsoft products such as the MS Bot Framework.
Using an AI engine's conversation flow to create dialogs makes building conversations a simple, almost intuitive task, with no coding involved. On the flip side, it limits how natural you can make those conversations feel. The alternative, delegating the conversation flow to the business layer of your chatbot, adds richness and flexibility to your dialog but makes the process more complicated, as it now requires coding. You cannot sell the cow and drink the milk at the same time, can you?
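To make the trade-off concrete, here is a sketch of what a hand-coded flow in the business layer might look like. The state table and transitions are hypothetical; the point is only that the flow lives in ordinary code, where you can branch, call external services, or skip steps freely.

```python
# Sketch of conversation flow kept in the chatbot's business layer
# instead of the AI engine's built-in flow editor. The states and
# transitions below are invented for illustration.

STATES = {
    "start": {
        "prompt": "Hi! Are you looking for a restaurant? (yes/no)",
        "next": {"yes": "ask_city", "no": "goodbye"},
    },
    "ask_city": {
        "prompt": "Great - which city are you dining in?",
        "next": {},  # terminal state for this sketch
    },
    "goodbye": {
        "prompt": "No problem, come back any time!",
        "next": {},
    },
}

def run_flow():
    state = "start"
    while True:
        node = STATES[state]
        print(node["prompt"])
        if not node["next"]:
            break  # reached a terminal state
        answer = input("> ").strip().lower()
        # Because this is plain code, we control every transition and
        # can add arbitrary logic here (database lookups, API calls).
        state = node["next"].get(answer, "goodbye")

if __name__ == "__main__":
    run_flow()
```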
Amazon Lex lacks the semantic sophistication of its competitors. One could say (somewhat metaphorically) that IBM Watson was created by linguists and computer scientists while Amazon Lex was created by salespeople. As a product it is well packaged and initially pleasing to the eye, but once you start digging deeper you notice the limitations. Also, Amazon has traditionally excelled in the voice recognition component (Amazon Alexa) rather than in actual language processing.
The conversational AI space is fluid, and changes happen rapidly. Existing products evolve continuously, and a new generation of AI engines is already in development.

Integrated Real-Time Data Boosts Content Delivery
How do you make content more relevant and appealing to the consumer?
This is a problem that has been on the minds of content creators for some time now. In our age of information abundance, it is not easy to stand out and make your voice heard. The competition for the consumer's attention is escalating, and with the number of information sources ever increasing, it will only get tougher.
Traditionally, content delivery does not change across the target audience. A commercial or a blog looks the same, and is experienced in the same way, by all viewers and readers. We are entrenched in this paradigm and can hardly imagine it being otherwise.
It turns out that new technologies capable of capturing individual intent in real time open up new opportunities for creating personalized experiences within the framework of content delivery.
This is how content can become more relevant - by becoming more personalized.
In a rudimentary form, we are already familiar with this approach from online advertising. Some web and social resources aim to personalize their promotional campaigns based on whatever drops of behavioural patterns and interests they can squeeze out of our web searches. The problem, of course, is that the technologies powering these campaigns understand human behaviour poorly, and the results therefore more often than not leave a great deal to be desired. To put it mildly.
nmodes has been working on semantic processing of intent for several years. We can now capture intent from unstructured data (human conversations) with 99% accuracy. (Interestingly, many businesses do not require this level of accuracy and are satisfied with 90%-92%, but we know how to deliver it anyway.)
We recently started to experiment with personalizing content by using available consumer intent.
We used Twitter because of its real-time appeal.
We started by publishing a story, divided into several episodes.
And we kept a constant stream of data flowing, concentrating on the intent to dine in Paris.
We then merged the content of the story with the consumer intent to dine in Paris, as captured by our semantic software.
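The sketch below shows the shape of that merging step under simplifying assumptions: `capture_intent` is a toy keyword classifier standing in for our semantic software, the episode texts are invented, and a short list of strings stands in for the live Twitter stream.

```python
# Sketch of merging story episodes with captured consumer intent.
# capture_intent() is a toy stand-in for real semantic intent capture,
# and the list at the bottom stands in for a live Twitter stream.

EPISODES = [
    "Episode 1: She arrived in Paris at dusk...",
    "Episode 2: The little bistro was still open...",
]

def capture_intent(tweet: str) -> str:
    """Toy classifier: does this tweet express intent to dine in Paris?"""
    text = tweet.lower()
    dining_words = ("dine", "dinner", "eat", "restaurant")
    if "paris" in text and any(word in text for word in dining_words):
        return "dine_in_paris"
    return "other"

def merge(stream):
    """Reply to each matching tweet with the next story episode."""
    episode = 0
    for tweet in stream:
        if capture_intent(tweet) == "dine_in_paris" and episode < len(EPISODES):
            print(f"reply: {EPISODES[episode]}")
            episode += 1

merge([
    "Looking for a great place to eat in Paris tonight!",
    "Just landed in Tokyo.",
    "Any restaurant tips near the Louvre in Paris?",
])
```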
This merging approach shows promising results - the engagement rate jumped above 90%.
Overall, we are only at the beginning of a tremendous journey. We know that other companies are beginning to experiment, and the opportunities for introducing AI-related technologies into content delivery are plentiful.
There is a long road ahead, and we've made one small step. But it is a step in a very exciting direction.
Pros and cons of automation
Automation drives the economy forward. It allows businesses to scale and serve large groups of customers. Automation first appeared in traditional industries, such as cotton production in 18th-century England or automobile assembly lines in the early 20th-century US. That automation replaced physical labour.
With the invention of computers, automated systems began to replace intellectual labour such as mathematical calculation. Most of the software applications we use today can be described as automation. Online payment processing, online ticket purchasing, tax return software, computer games, search engines, and endless other programs are all examples of software automation systems.
As a next step, we are now aiming to automate human decision-making and other high-level intellectual activities, historically considered the sole domain of humans.
One interesting aspect of automation is that it delivers a lower quality of service than manual service does.
This is to be expected: if we gain in quantity, we lose in quality. The gain in quantity is what automation is about - it allows us to reach a large number of customers, whereas a manual product or service can reach individuals only. The price we pay for the ability to deliver a product or provide a service en masse is a drop in quality.
Sometimes automation is an obvious choice - when the gain (scalability) hugely outweighs the cost (lower quality). The search engine is a popular and successful example. In other cases the advantage is not so obvious. Online travel booking offers fast service without leaving the comfort of home, but it does not always deliver the best option, such as the cheapest flight, and therefore many people still use 'manual' travel agents.