
When Big Data is not so big anymore


We are inundated with information. There is so much of it around us that a special term - Big Data - was coined to emphasize its sheer size.

Dealing with such a large amount of data is, of course, a problem, and various solutions have been created to address it efficiently.

At nmodes we developed a semantic technology that accurately filters relevant conversations. We applied it to social networks, particularly Twitter. Twitter is the poster child of Big Data: it hosts 500 million conversations every day, a staggering number. And yet we found that for many topics, once they are narrowed down and accurately filtered, there are not that many relevant conversations after all.

No more than 5 people are looking for CRM solutions on Twitter on an average day. Even fewer - two per day on average - are explicitly asking for new web hosting providers, although many more are complaining about their existing providers (which may or may not mean they are ready to switch or are looking for a new option).

We often have businesses coming to us asking us to find relevant conversations and expecting a large number of results. This, they assume, is what Big Data is supposed to deliver. Such an expectation is likely a product of our ‘keyword search dependency’. Indeed, when we run a keyword search on Twitter, on search engines, or anywhere else, we get a long list of results. The fact that most of them (up to 98% in many cases) are irrelevant is often lost in the visual illusion of having this long, seemingly endless list in front of our eyes.
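To make the keyword-versus-relevance gap concrete, here is a toy sketch in Python. It is an illustration only, not nmodes’ technology: the sample tweets are invented, and the “relevance filter” is a crude phrase list standing in for a real semantic classifier. The point is simply that the keyword pass returns everything that mentions the term, while the relevance pass keeps only the handful of messages expressing actual intent.

```python
# Toy illustration only: a naive keyword search versus a stricter relevance filter.
# The sample tweets and intent phrases are invented for this example; a real
# semantic filter would use a trained classifier, not a phrase list.

tweets = [
    "Our CRM webinar starts at 3pm, join us!",
    "Looking for a CRM recommendation for a 5-person sales team",
    "CRM stock closed up 2% today",
    "Anyone know a good CRM that integrates with Gmail?",
    "I hate filling out CRM fields all day",
]

def keyword_search(messages, keyword):
    """Return every message that merely mentions the keyword."""
    return [m for m in messages if keyword.lower() in m.lower()]

INTENT_PHRASES = ("looking for", "anyone know", "recommend me", "which should i")

def relevance_filter(messages):
    """Keep only messages that express buying intent (crude stand-in for a
    semantic classifier)."""
    return [m for m in messages if any(p in m.lower() for p in INTENT_PHRASES)]

matches = keyword_search(tweets, "CRM")
relevant = relevance_filter(matches)

print(f"keyword matches: {len(matches)}")   # 5 - the long, mostly irrelevant list
print(f"relevant leads:  {len(relevant)}")  # 2 - the handful that actually matter
```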

With quality solutions that accurately deliver only relevant results we experience, for the first time, a situation where there are no longer long lists of random results - only a handful of relevant ones.

This is so much more efficient. It saves time, increases productivity, clarifies the picture, and makes Big Data manageable.  

Time for businesses to embrace the new approach.

 


Microsoft AI products


Microsoft’s product strategy has always been, and still remains, one of ‘zero alternative’. Their ultimate policy is for customers to have no choice but to embrace only Microsoft products. Consequently they created, and are offering, products and solutions in (almost) every segment of the enterprise and consumer IT market, including, but certainly not limited to, their own database, cloud services, operating system, office tools, programming language, and many more.

Not only does Microsoft offer a wide variety of products, they tie them together in a unified ecosystem that makes it easy for components to connect and interact. At the same time, this ecosystem is hostile to non-Microsoft products.

Microsoft’s strategy for the burgeoning, fast-growing AI segment is similar:

Create products that address all parts of the AI market, and add them to the ecosystem to ensure easy compatibility from within and difficulty of use from outside.

Currently the products on offer are:

- Microsoft’s AI engine, called LUIS. It is meant to compete with other major industrial AI systems such as IBM Watson, and has a similar training methodology. It offers webhook interfacing via endpoints (a query sketch follows this list).

- Microsoft’s chatbot-building platform, called, surprisingly, the Microsoft Bot Platform. It addresses the popular demand for easy chatbot design and provides seamless connectivity with the main user interfaces, such as the web, SMS, mobile, and messaging platforms.

- In addition, Microsoft offers its own messaging platform in Skype.
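As a rough illustration of the endpoint-based interfacing mentioned in the first item above, a LUIS application can be queried over plain HTTPS. The sketch below is a minimal example assuming the LUIS v2.0 REST endpoint format; the region, application ID, and subscription key are placeholders, and response fields such as topScoringIntent may differ between API versions.

```python
# Minimal sketch of querying a LUIS app over its REST endpoint.
# Assumes the v2.0 endpoint format; the region, app ID, and subscription key
# below are placeholders, not real credentials.
import requests

REGION = "westus"                     # placeholder LUIS region
APP_ID = "<your-luis-app-id>"         # placeholder application ID
SUBSCRIPTION_KEY = "<your-luis-key>"  # placeholder subscription key

def query_luis(utterance: str) -> dict:
    """Send a user utterance to LUIS and return the parsed JSON response."""
    url = f"https://{REGION}.api.cognitive.microsoft.com/luis/v2.0/apps/{APP_ID}"
    resp = requests.get(
        url,
        params={"subscription-key": SUBSCRIPTION_KEY, "q": utterance, "verbose": "true"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = query_luis("I need help choosing a web hosting provider")
    # The v2.0 response typically includes a top-scoring intent and any entities.
    print(result.get("topScoringIntent"))
    print(result.get("entities"))
```

A chatbot built on the Bot Platform would then typically route the top-scoring intent into its own dialog logic.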

The main advantage of using Microsoft AI products is the built-in connectivity with user interfaces.

The main disadvantage is their ‘zero alternative’ policy - once you’ve chosen a Microsoft product, you will likely be forced to choose only Microsoft products for the duration of your project.

 


MAKING AI MAINSTREAM



We are experiencing a strong demand for conversational AI solutions. It is coming from every corner of the B2C market. It is growing by the day.

Conversational AI is becoming increasingly popular among the consumer facing business community. It is easy to see why - AI offers sales and customer service scalability and therefore is critical for the long-term success of a business.

Conversational AI solutions such as chatbots, voice bots, and virtual assistants provide much-needed speed and efficiency in an age where the rapid advancement of technology makes them virtually the only sustainable customer service solution.

But there is a catch - AI is complicated. Mainstream businesses do not have in-house AI expertise, and it is not part of their business model to develop such expertise.

Today’s market offers several good conversational AI solutions, such as IBM Watson or Google DialogFlow. However, getting business value out of them requires the very AI expertise that mainstream companies do not possess.

So what can be done?

Any AI solution should follow these three steps in order for the mainstream business community to fully benefit from it:

  1. Conversational AI should come as a service,
  2. The service should be available in natural language,
  3. The service should be fully personalized.

In the next several posts we will explore how the AI industry, including nmodes, is moving towards achieving these goals.