
When Big Data is not so big anymore


We are inundated with information. There is so much of it around us that a special term, Big Data, was coined to emphasize its sheer size.

Dealing with large amounts of data is, of course, a problem, and various solutions have been created to address it efficiently.

At nmodes we developed a semantic technology that accurately filters relevant conversations. We applied it to social networks, particularly Twitter. Twitter is a poster child of Big Data, with 500 million conversations every day - a staggering number. And yet we found that for many topics, once they are narrowed down and accurately filtered, there are not that many relevant conversations after all.

No more than five people are looking for CRM solutions on an average day on Twitter. Even fewer - two per day on average - are explicitly asking for new web hosting providers, although many more are complaining about their existing providers (which may or may not suggest they are ready to switch or are looking for a new option).

We often have businesses coming to us asking us to find relevant conversations and expecting a large number of results. This is what Big Data is supposed to deliver, they assume. Such an expectation is likely a product of our ‘keyword search dependency’. Indeed, when we run a keyword search on Twitter, in search engines, or anywhere else, we get a long list of results. The fact that most of them (up to 98% in many cases) are irrelevant is often lost in the visual illusion of having this long, seemingly endless list in front of our eyes.
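To make the contrast concrete, here is a minimal, purely illustrative sketch. The sample tweets and the naive "buying cue" rule are invented for this example (our actual semantic technology is far more sophisticated); the point is only to show how a raw keyword match inflates the result list compared to a relevance filter:

```python
# Hypothetical illustration: keyword search vs. relevance filtering.
# The sample tweets and the naive intent rule are invented for this sketch.

tweets = [
    "Looking for a CRM solution for my small sales team, any recommendations?",
    "Our CRM went down again, ugh.",
    "Just read a listicle: top 10 CRM vendors of the year.",
    "Hiring: CRM administrator, 3+ years experience.",
    "Anyone know a good CRM that integrates with Gmail?",
]

# Keyword search: everything containing "CRM" comes back.
keyword_hits = [t for t in tweets if "crm" in t.lower()]

# Relevance filter: keep only messages that actually ask for a recommendation.
buying_cues = ("looking for", "recommendations", "anyone know a good")
relevant = [t for t in keyword_hits
            if any(cue in t.lower() for cue in buying_cues)]

print(len(keyword_hits))  # 5 -> the long, seemingly endless list
print(len(relevant))      # 2 -> the handful of genuinely relevant results
```

Even on this tiny invented sample, three of the five keyword hits are noise; at Twitter's scale, that ratio is what turns a "big" result list into a small, manageable one.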

With quality solutions that accurately deliver only relevant results we experience, for the first time, a situation where there are no longer big lists of random results - only a handful of relevant ones.

This is so much more efficient. It saves time, increases productivity, clarifies the picture, and makes Big Data manageable.  

It is time for businesses to embrace this new approach.


Interested in reading more? Check out our other blogs:

AI: Our Only Weapon Against Climate Change?



Artificial Intelligence, more commonly referred to as simply AI, has been changing our lives in many ways since its early days, and has become one of the greatest inventions of the human mind. When thinking of AI, we do not normally associate it with helping farmers grow more crops to feed an exponentially growing population, helping develop cancer treatments, or keeping kids safe from trafficking and abuse by flagging improper online activity. Instead we think of computers and phones, self-driving cars and robots. However, AI doesn’t just power the gadgets we have grown so accustomed to in our daily lives; it is increasingly being used to help solve pressing social challenges.

One of these pressing social issues is the quite literally hot topic of global warming. The challenges of global warming are growing by the day, as its impacts become more severe and harder to manage. Melting ice caps, severe weather changes, and the extinction of species are just a few consequences of the man-made climate change plaguing our world today. Despite widespread acceptance and awareness, the rate at which the world is embracing positive change is unfortunately not fast enough.

Fortunately, many large companies are setting an example by using AI to develop new ways to battle global warming. In fact, it seems as though AI is the only solution we have. It is helping us not only track our present data, but also analyze our past data so that we can make informed decisions about the future. One example is the use of AI to collect large amounts of data on land, animals, weather, ecosystems and more, and organize it, so that scientists and governments can determine what needs to be done and the most cost-effective ways to engage conservation methods.

We are seeing more and more AI initiatives being undertaken to help create a more eco-friendly world.

“In order to reduce human influence on nature, increasing levels of human interference with natural processes are required” (Harvard University)

Whatever the downfalls of AI may be, its ability to help us stop destroying our planet is perhaps its most important trait - because as hard as it may be to accept, our planet is dying, and AI can help us prevent that.


Integrated Real-Time Data Boosts Content Delivery

How do you make content more relevant and appealing to the consumer?

This is a problem that has been on the mind of content creators for some time now. In our age of information abundance it is not easy to stand out and make your voice heard. The competition for the consumer’s attention is escalating, and with the number of information sources ever increasing, it will only get tougher.

Traditionally, content delivery does not change across the target audience. A commercial or a blog looks, and is experienced, the same way by all viewers and readers. We are entrenched in this paradigm and can hardly imagine it being otherwise.

It turns out that new technologies capable of capturing individual intent in real time open up new opportunities for creating personalized experiences within the framework of content delivery.

This is how content can become more relevant - by becoming more personalized.

In a rudimentary form, we are already familiar with this approach from online advertising. Some web and social resources aim to personalize their promotional campaigns based on whatever drops of behavioural patterns and interests they can squeeze out of our web searches. The problem, of course, is that the technologies powering these campaigns understand human behaviour poorly, and the results therefore more often than not leave a great deal to be desired. To put it mildly.

nmodes has been working on semantic processing of intent for several years. We can now capture intent from unstructured data (human conversations) with 99% accuracy. (Interestingly, many businesses do not require this level of accuracy, being satisfied with 90%-92%, but we know how to deliver it anyway.)
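For readers curious what an accuracy figure like this means in practice, here is a rough sketch of how such numbers are typically measured: predicted intents are compared against human-labeled conversations, and accuracy is the fraction that match. The conversations, labels, and predictions below are invented for illustration; our actual pipeline and evaluation data are not shown here.

```python
# Invented example: measuring intent-classification accuracy
# against human-labeled conversations.
labeled = [
    ("I need a table for two in Paris tonight", "dine_in_paris"),
    ("Paris was lovely last spring",            "other"),
    ("Best bistro near the Louvre?",            "dine_in_paris"),
    ("Booking flights to Paris next week",      "other"),
]

# Hypothetical classifier outputs, one per conversation above.
predictions = ["dine_in_paris", "other", "dine_in_paris", "dine_in_paris"]

# Accuracy = correctly predicted intents / total labeled conversations.
correct = sum(pred == gold for (_, gold), pred in zip(labeled, predictions))
accuracy = correct / len(labeled)
print(f"accuracy = {accuracy:.0%}")  # 75% on this tiny invented sample
```

On a real evaluation set the same calculation runs over thousands of labeled conversations, which is what makes the difference between 92% and 99% meaningful.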

We recently started to experiment with personalizing content by using available consumer intent.

We used Twitter because of its real-time appeal.

We started by publishing a story, divided into several episodes.


And we kept a constant stream of data flowing, concentrating on the intent to dine in Paris.

We then merged the content of the story with the consumer intent to dine in Paris as captured by our semantic software.

This merging approach shows promising results - the engagement rate jumped above 90%.

Overall we are only at the beginning of a tremendous journey. We know that other companies are beginning to experiment, and the opportunities from introducing artificial intelligence related technologies into content delivery are plentiful.

There is a long road ahead, and we've made one small step. But it is a step in a very exciting direction.

 
