
How nmodes technology is unique



nmodes AI is based on semantic algorithms, which require significantly less computational capacity than the standard machine learning algorithms used by the majority of conversational AI systems today.

As a result, the infrastructure requirements are drastically reduced. In simple terms, what Google Home or Amazon Alexa do with the help of supercomputers or advanced computer farms, nmodes AI can do on a basic server.

This gives nmodes the ability to delegate conversational capacity to its users. With the help of nmodes AI, every business (and individual) can create their own AI to handle the details of their business (products, customers, etc.) that are not accessible from the outside.


MAKING AI MAINSTREAM



We are experiencing strong demand for conversational AI solutions. It comes from every corner of the B2C market and is growing by the day.

Conversational AI is becoming increasingly popular among the consumer-facing business community. It is easy to see why - AI offers sales and customer service scalability and is therefore critical for the long-term success of a business.

Conversational AI solutions such as chatbots, voice bots, and virtual assistants provide much-needed speed and efficiency in an age where the rapid advancement of technology makes them virtually the only sustainable customer service solution.

But there is a catch - AI is complicated. Mainstream businesses do not have in-house AI expertise, and it is not part of their business model to develop it.

Today’s market offers several good conversational AI solutions, such as IBM Watson and Google DialogFlow. However, getting business value out of them requires the very AI expertise that mainstream companies do not possess.

So what can be done?

Any AI solution should follow these three steps in order for the mainstream business community to fully benefit from it:

  1. Conversational AI should come as a service,
  2. The service should be available in natural language,
  3. The service should be fully personalized.

In the next several posts we will explore how the AI industry, including nmodes, is moving towards achieving these goals.

What Is an AI Engine and Do I Need It?

Chatbots and assistant programs designed to support conversations with human users rely on natural language processing (NLP). This is a field of scientific research that aims at making computers understand the meaning of sentences in natural language. The algorithms developed by NLP researchers helped power the first generation of virtual assistants such as Siri and Cortana. Now the same algorithms are made available to the developer community to help companies build their own specialized virtual assistants. Industry products that offer NLP capabilities based on these algorithms are often called AI engines.

The most powerful and advanced AI engines currently available on the market are (in no particular order): IBM Watson, Google DialogFlow, Microsoft LUIS, Amazon Lex.

All these engines use intents and entities as the primary linguistic identifiers to convey the meaning of incoming sentences, and all of them offer conversation flow capability. In other words, intents and entities help the engine understand what the incoming sentence is about. Once the incoming sentence is correctly identified, you can use the engine to provide a reply. You can repeat these two steps any number of times, thus creating a conversation, or dialog.
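To make this more concrete, here is a minimal, engine-agnostic sketch of that identify-and-reply loop in Python. The intents, entities, and keyword rules are invented for illustration only; the engines listed above classify sentences with trained models rather than hand-written rules.

```python
# Hypothetical sketch of the intent/entity loop (not any vendor's API).
# Real AI engines use trained models; keyword matching here is only
# meant to show the shape of one conversational turn.

def classify(sentence):
    """Return (intent, entities) describing what the sentence is about."""
    text = sentence.lower()
    words = [w.strip("?.!,") for w in text.split()]
    if "order" in words and "status" in text:
        order_id = next((w for w in words if w.isdigit()), None)
        return "check_order_status", {"order_id": order_id}
    if "hours" in words or "open" in words:
        return "store_hours", {}
    return "fallback", {}

REPLIES = {
    "check_order_status": "Let me look up order {order_id} for you.",
    "store_hours": "We are open 9am to 6pm, Monday to Saturday.",
    "fallback": "Sorry, I did not catch that. Could you rephrase?",
}

def reply(sentence):
    """One turn: identify the incoming sentence, then provide a reply."""
    intent, entities = classify(sentence)
    return REPLIES[intent].format(**entities)

# Repeating the classify-and-reply turn creates a conversation, or dialog.
print(reply("What is the status of order 12345?"))
print(reply("When are you open?"))
```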

In terms of language processing ability and simplicity of user experience, IBM Watson and Google DialogFlow are currently ahead of the pack. Microsoft LUIS is solid too; still, keeping in mind that Microsoft is aggressively territorial and prefers users to stay within its ecosystem, it is most efficient to use LUIS together with other Microsoft products such as MS Bot Framework.

Using the AI engine's conversation flow to create dialogs makes building conversations a simple, almost intuitive task, with no coding involved. On the flip side, it limits your ability to make the conversations feel natural. The alternative, delegating the conversation flow to the business layer of your chatbot, adds richness and flexibility to your dialogs but makes the process more complicated, as it now requires coding. You cannot sell the cow and drink the milk at the same time, can you?
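For comparison, here is a small illustrative sketch (again, not any vendor's API) of what owning the conversation flow in your chatbot's business layer can look like: the application itself tracks the dialog state and decides the next step in code, which is exactly where the extra richness - and the extra work - comes from.

```python
# Illustrative sketch only: the chatbot's business layer owns the
# conversation flow. The application tracks dialog state itself, which
# allows branching on business data but requires writing and
# maintaining this logic in code.

class OrderDialog:
    def __init__(self):
        self.state = "ask_product"
        self.slots = {}

    def next_turn(self, user_message):
        """Advance the dialog one step and return the bot's reply."""
        if self.state == "ask_product":
            self.slots["product"] = user_message
            self.state = "ask_quantity"
            return "How many would you like?"
        if self.state == "ask_quantity":
            self.slots["quantity"] = user_message
            self.state = "confirm"
            return (f"Just to confirm: {self.slots['quantity']} x "
                    f"{self.slots['product']}. Shall I place the order?")
        if self.state == "confirm":
            self.state = "done"
            if user_message.strip().lower().startswith("y"):
                return "Great, your order has been placed."
            return "No problem, I have cancelled the request."
        return "This conversation has already finished."

dialog = OrderDialog()
print(dialog.next_turn("a blue kettle"))  # -> asks for quantity
print(dialog.next_turn("two"))            # -> asks for confirmation
print(dialog.next_turn("yes"))            # -> places the order
```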

Amazon Lex lacks the semantic sophistication of its competitors. One can say (somewhat metaphorically) that IBM Watson was created by linguists and computer scientists while Amazon Lex was created by salespeople. As a product it is well packaged and initially pleasing to the eye, but once you start digging deeper you notice the limitations. Also, Amazon has traditionally excelled in the voice recognition component (Amazon Alexa) and not necessarily in actual language processing.

The space of conversational AI is fluid and changes happen rapidly. The existing products are evolving continuously and a new generation of AI engines is in the process of being developed.
