
Microsoft AI products


Microsoft's product strategy has always been, and still remains, one of 'zero alternative'. Their ultimate goal is for customers to have no choice but to embrace only Microsoft products. Consequently, they have created and offer products and solutions in (almost) every segment of the enterprise and consumer IT market, including, but certainly not limited to, their own database, their own cloud services, operating system, office tools, programming languages, and many more.

Not only does Microsoft offer a wide variety of products, they tie them together in a unified ecosystem that makes it easy for components to connect and interact. At the same time, this ecosystem is hostile to non-Microsoft products.

Microsoft's strategy for the burgeoning AI segment is similar:

Create products to address every part of the AI market, and add them to the ecosystem to ensure easy compatibility from within and difficulty of use from outside.

Currently the products on offer are:

- Microsoft's AI engine, LUIS (Language Understanding Intelligent Service). It is meant to compete with other major industrial AI systems such as IBM Watson, and has a similar training methodology. It offers webhook interfacing via REST endpoints.

- Microsoft's chatbot-building platform, called, unsurprisingly, the Microsoft Bot Framework. It addresses the popular demand for easy chatbot design and provides seamless connectivity with the main user interfaces, such as the web, SMS, mobile, and messaging platforms.

- In addition, Microsoft offers its own messaging platform in Skype.
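To make the LUIS endpoint idea concrete, here is a minimal Python sketch of building a prediction URL and reading the response. The region, app ID, key, and the exact response fields shown here are illustrative assumptions modeled on LUIS's documented v2 REST API; a real integration should follow Microsoft's current documentation.

```python
import json
from urllib.parse import urlencode

# Illustrative values - a real app would use its own region, app ID, and key.
REGION = "westus"
APP_ID = "YOUR-APP-ID"
SUBSCRIPTION_KEY = "YOUR-KEY"

def build_luis_url(query: str) -> str:
    """Build a LUIS v2-style prediction URL for a user utterance."""
    base = f"https://{REGION}.api.cognitive.microsoft.com/luis/v2.0/apps/{APP_ID}"
    params = urlencode({"subscription-key": SUBSCRIPTION_KEY, "q": query})
    return f"{base}?{params}"

def top_intent(response_body: str) -> tuple:
    """Extract the highest-scoring intent from a LUIS-style JSON response."""
    data = json.loads(response_body)
    intent = data["topScoringIntent"]
    return intent["intent"], intent["score"]

# A trimmed-down example of the JSON shape a LUIS endpoint returns.
sample = json.dumps({
    "query": "book a flight to Paris",
    "topScoringIntent": {"intent": "BookFlight", "score": 0.97},
    "entities": [{"entity": "paris", "type": "Location"}],
})

name, score = top_intent(sample)
print(name, score)  # BookFlight 0.97
```

The appeal of this style of interface is that the chatbot layer (the Bot Framework) only needs to forward the user's utterance to the endpoint and act on the returned intent.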

The main advantage of using Microsoft AI products is the built-in connectivity with user interfaces.

The main disadvantage is their 'zero alternative' policy - once you have chosen a Microsoft product, you will likely be forced to choose only Microsoft products for the duration of your project.

 

Interested in reading more? Check out our other blogs:

When Big Data is not so big anymore


We are inundated with information. There is so much information around us that a special term, Big Data, was coined to emphasize its sheer size.

Dealing with such a large amount of data is, of course, a problem, and various solutions have been created to address it efficiently.

At nmodes we developed a semantic technology that accurately filters relevant conversations. We applied it to social networks, particularly Twitter. Twitter is the poster child of Big Data, with 500 million conversations every day - a staggering number. And yet we found that for many topics, once they are narrowed down and accurately filtered, there are not that many relevant conversations after all.

No more than 5 people are looking for CRM solutions on an average day on Twitter. Even fewer - two per day on average - are explicitly asking for new web hosting providers, although many more are complaining about their existing providers (which may or may not suggest they are ready to switch or are looking for a new option).

We often have businesses coming to us asking us to find relevant conversations and expecting a large number of results. This is what Big Data is supposed to deliver, they assume. This expectation is likely a product of our 'keyword search dependency'. Indeed, when we run a keyword search on Twitter, on search engines, or anywhere else, we get a long list of results. The fact that most of them (up to 98% in many cases) are irrelevant is often lost in the visual illusion of having this long, seemingly endless list in front of our eyes.
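The gap between keyword matching and relevance filtering is easy to demonstrate. Here is a toy Python sketch; the tweets and the intent cues are invented for illustration, and the phrase-matching "filter" is only a crude stand-in for genuine semantic filtering, which models meaning rather than surface phrases.

```python
# Toy illustration: keyword matching over-returns; intent filtering narrows it.
tweets = [
    "Looking for a CRM recommendation for my small business - any advice?",
    "Our CRM went down again today. So frustrating.",
    "Great article on CRM market share this year.",
    "Hiring: CRM administrator, 3 years experience required.",
    "Anyone know a good CRM for a nonprofit? Need suggestions.",
]

# Keyword search: every tweet containing the term counts as a "result".
keyword_hits = [t for t in tweets if "crm" in t.lower()]

# Crude stand-in for intent filtering: keep only tweets that actually
# express buying intent.
INTENT_CUES = ("looking for", "recommendation", "any advice",
               "suggestions", "know a good")
relevant = [t for t in keyword_hits
            if any(cue in t.lower() for cue in INTENT_CUES)]

print(len(keyword_hits))  # 5 - the long list a keyword search returns
print(len(relevant))      # 2 - the handful that are actually relevant
```

Even in this tiny sample, three of the five keyword hits are noise - a complaint, a news article, and a job posting - which is exactly the pattern that makes long keyword-search result lists so misleading.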

With quality solutions that accurately deliver only relevant results, we experience, for the first time, a situation where there are no longer long lists of random results - only a handful of relevant ones.

This is so much more efficient. It saves time, increases productivity, clarifies the picture, and makes Big Data manageable.  

Time for businesses to embrace the new approach.

 


Pros and cons of automation

Automation drives the economy forward. It allows businesses to scale and serve large groups of customers. Automation first appeared in traditional industries, such as cotton production in 18th-century England or car assembly lines in the early-20th-century US. This automation replaced physical labor.

With the invention of computers, automated systems began to replace intellectual labor, such as mathematical calculation. Most of the software applications we use today can be described as automation. Online payment processing, online ticket purchasing, tax return software, computer games, search engines, and endless other programs are all examples of software automation systems.

As a next step, we are now aiming at automating human decision-making and high-level intellectual activities, historically considered the sole domain of humans.

 

One interesting aspect of automation is its lower quality of service compared to manual service.

This is to be expected: if we gain in quantity, we lose in quality. The gain in quantity is what automation is about - it allows a business to reach a large number of customers, whereas a manual product or service can reach only individuals. The price we pay for the ability to deliver a product or provide a service en masse is the drop in quality.

 

Sometimes automation is an obvious choice. This is when the gain (scalability) hugely outweighs the cost (lower quality). Search engines are a popular and successful example. In other cases, the advantage is not so obvious. Online travel booking offers fast service without leaving the comfort of home, but it often fails to deliver the best option, such as the cheapest flight, and therefore many people still use 'manual' travel agents.

 
