
Towards smarter data - accuracy and precision


There is a huge amount of information out there, and it is growing. To use it efficiently and gain a competitive advantage, we need to evolve and start using information in a smart way, concentrating on data that drives business value because it is accurate, actionable, and agile. Accuracy is a key measure of the quality of a data processing solution.

How is accuracy calculated?

It is easy to calculate with structured data, because the requirements can be formalized. It is less obvious with unstructured data, e.g. a stream of social feeds, or any data set that involves natural language. Sentences in natural language are open to multiple interpretations and therefore allow a degree of subjectivity. For example, should the sentence ‘I haven’t been on a sea cruise for a long time’ qualify for a data set of people interested in going on a cruise? Both answers, yes and no, seem valid.

In such cases an argument has been put forward that a consensus approach, one that polls data providers, is the best way to judge data accuracy. This approach essentially claims that the attribute with the highest consensus across data providers is the most accurate.
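To make the idea concrete, here is a minimal sketch of such a consensus poll as a majority vote over provider labels; the function name and labels are invented for illustration, not any provider's actual implementation:

```python
from collections import Counter

def consensus_label(provider_labels):
    """Majority vote over the labels different providers assign to one record."""
    label, _ = Counter(provider_labels).most_common(1)[0]
    return label

# Three providers label the same person's interest attribute.
print(consensus_label(["interested", "interested", "not interested"]))  # -> interested
```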

At nmodes we deal with unstructured data all the time, because we process natural language messages, primarily from social networks. We do not favor this simplistic approach: it is biased, inviting people to make assumptions based on what they already believe to be true, and it makes no distinction between precision and accuracy. The difference is that precision measures only what you got right among the items you picked out, while accuracy measures both what you got right and what you got wrong. Accuracy is a more inclusive and therefore more valuable characteristic.
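To make the distinction concrete, here is a minimal sketch using the standard definitions; the labels and data are invented for illustration, and this is not nmodes code:

```python
def precision_and_accuracy(predicted, actual, positive="relevant"):
    """Precision counts only the positives you flagged: TP / (TP + FP).
    Accuracy also counts what you left out: (TP + TN) / total."""
    pairs = list(zip(predicted, actual))
    tp = sum(1 for p, a in pairs if p == positive and a == positive)
    fp = sum(1 for p, a in pairs if p == positive and a != positive)
    tn = sum(1 for p, a in pairs if p != positive and a != positive)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    accuracy = (tp + tn) / len(pairs)
    return precision, accuracy

predicted = ["relevant", "relevant", "irrelevant", "irrelevant"]
actual    = ["relevant", "irrelevant", "irrelevant", "relevant"]
print(precision_and_accuracy(predicted, actual))  # -> (0.5, 0.5)
```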

Our approach is twofold:

a) to validate data against independent third-party sources (typically of academic origin) that contain trusted sets and reliable demographics. Validating nmodes data against third-party sources allows us to verify that our data achieves the greatest possible balance of scale and accuracy.

b) to enrich the existing test sets by purposefully including examples that are ambiguous in meaning and intent, and providing additional levels of categorization to cover them, as in the sketch below.
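As a rough sketch of both steps under an assumed data shape (the trusted set, labels, and subcategory field are invented for illustration, not nmodes' actual schema):

```python
# Hypothetical shape: each example carries a text, a label, and an optional
# finer-grained subcategory used for deliberately ambiguous cases.
trusted_set = [
    {"text": "Booked our Mediterranean cruise for June!", "label": "interested"},
    {"text": "I haven't been on a sea cruise for a long time.",
     "label": "ambiguous", "subcategory": "unexpressed intent"},
]

def validate(predict, reference):
    """Fraction of a trusted third-party set on which a classifier agrees."""
    hits = sum(1 for ex in reference if predict(ex["text"]) == ex["label"])
    return hits / len(reference)

# A toy stand-in for a real classifier.
toy = lambda text: "interested" if "booked" in text.lower() else "ambiguous"
print(validate(toy, trusted_set))  # -> 1.0
```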

Accuracy becomes increasingly important as businesses move from the rudimentary data use typical of the first Big Data years to today’s more measured and careful approach. Understanding how it is calculated and the value it brings helps in achieving long-term sustainability and success.

 

Interested in reading more? Check out our other blogs:

Integrated Real-Time Data Boosts Content Delivery

How to make content more relevant and appealing to the content consumer?

This is a problem that has been on the mind of content creators for some time now. In our age of information abundance it is not easy to stand out and make your voice heard. The competition for the consumer’s attention is escalating, and with the number of information sources ever increasing, it will only get tougher.

Traditionally, content delivery does not change across the target audience. A commercial or a blog looks, and is experienced, the same way by all viewers and readers. We are entrenched in this paradigm and can hardly imagine it being otherwise.

It turns out that the advancement of new technologies capable of capturing individual intent in real time opens up new opportunities for creating personalized experiences within the framework of content delivery.

This is how content can become more relevant - by becoming more personalized.

In a rudimentary form, we are already familiar with this approach from online advertising. Some web and social resources aim at personalizing their promotional campaigns based on whatever drops of behavioural patterns and interests they can squeeze out of our web searches. The problem, of course, is that the technologies powering these campaigns understand human behaviour poorly, and the results therefore more often than not leave a great deal to be desired. To put it mildly.

nmodes has been working on semantic processing of intent for several years. We can now capture intent from unstructured data (human conversations) with an accuracy of 99%. (Interestingly, many businesses do not require this level of accuracy and are satisfied with 90%-92%, but we know how to deliver it anyway.)
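For a sense of the input and output involved, here is a deliberately naive keyword-based sketch; nmodes’ actual semantic processing is not keyword matching, and the intent name and cue phrases below are made up:

```python
# Toy intent detector over short messages; purely illustrative.
INTENT_CUES = {
    "dine_in_paris": ["dinner in paris", "restaurant in paris", "eat in paris"],
}

def detect_intent(message):
    text = message.lower()
    for intent, cues in INTENT_CUES.items():
        if any(cue in text for cue in cues):
            return intent
    return None

print(detect_intent("Looking for a good restaurant in Paris tonight"))  # -> dine_in_paris
```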

We recently started to experiment with personalizing content by using available consumer intent.

We used Twitter because of its real-time appeal.

We started by publishing a story, divided into several episodes.


And we kept a constant stream of data flowing, concentrating on the intent to dine in Paris.

We then merged the content of the story with the consumer intent to dine in Paris captured by our semantic software.
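In outline, the merge step looked something like this hedged sketch; the episode texts, intent test, and pairing policy are placeholders, not the actual campaign logic:

```python
# Pair each captured message expressing the target intent with the next
# episode of the story; placeholder data throughout.
episodes = ["Episode 1 ...", "Episode 2 ...", "Episode 3 ..."]

def merge(messages, has_intent):
    queue = iter(episodes)
    for msg in messages:
        if has_intent(msg):
            yield msg, next(queue, episodes[-1])  # reuse the finale when out of episodes

captured = ["Any good place for dinner in Paris?", "Paris weather is lovely"]
for msg, episode in merge(captured, lambda m: "dinner in paris" in m.lower()):
    print(msg, "->", episode)
```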

This merging approach shows promising results - the engagement rate jumped above 90%.

Overall, we are only at the beginning of a tremendous journey. We know that other companies are beginning to experiment, and the opportunities for introducing AI-related technologies into content delivery are plentiful.

There is a long road ahead, and we’ve made one small step. But it is a step in a very exciting direction.

 


AI: Our Only Weapon Against Climate Change?



Artificial Intelligence, more commonly referred to simply as AI, has been changing our lives in many ways since its early days, and has become one of the greatest inventions of the human mind. When thinking of AI, we do not normally associate it with helping farmers grow more crops to feed the exponentially growing population, helping develop cancer treatments, or keeping kids safe from trafficking and abuse by detecting improper online activity. Instead we think of computers and phones, self-driving cars and robots. However, AI doesn’t just power the gadgets we have grown so accustomed to in our daily lives; it is increasingly being used to help solve pressing social challenges.

One of these pressing social issues is the quite literally hot topic of global warming. Its challenges are growing by the day, as its impacts become more severe and harder to manage. Melting ice caps, severe weather changes, and the extinction of species are just a few consequences of the man-made climate change plaguing our world today. Despite widespread acceptance and awareness, the rate at which the world is embracing positive change is unfortunately not fast enough.

Fortunately, there are many large companies setting an example by using AI to develop new ways to battle global warming. In fact, it seems as though AI is the only solution we have. It is helping us not only track our present data, but also analyze our past data, so that we can make informed decisions about the future. One such example is the use of AI to collect large amounts of data on land, animals, weather, ecosystems, and more, and organize it so that scientists and governments can determine what needs to be done and the most cost-effective conservation methods to pursue.

We are certainly seeing more and more AI initiatives being undertaken to help create a more eco-friendly world.

“In order to reduce human influence on nature, increasing levels of human interference with natural processes are required” (Harvard University)

Whatever the downfalls of AI may be, its ability to help us stop destroying our planet is perhaps its most important trait, because as hard as it may be to accept, our planet is dying, and AI can help us prevent that.
