
AI: Our Only Weapon Against Climate Change?



Artificial Intelligence, more commonly referred to simply as AI, has been changing our lives in many ways since its early days, and has become one of the greatest inventions of the human mind. When thinking of AI, we do not normally picture it helping farmers grow more crops to feed a rapidly growing population, developing cancer treatments, or keeping children safe from trafficking and abuse by detecting improper online activity. Instead we think of computers and phones, self-driving cars and robots. Yet AI does not just power the gadgets we have grown so accustomed to in our daily lives; it is increasingly being used to help solve pressing social challenges.

One of these pressing social issues is a quite literally hot topic: global warming. Its challenges grow by the day, as its impacts become more severe and harder to manage. Melting ice caps, extreme weather, and the extinction of species are just a few of the consequences of the man-made climate change plaguing our world today. Despite widespread acceptance and awareness, the rate at which the world is embracing positive change is unfortunately not fast enough.

Fortunately, many large companies are setting an example by using AI to develop new ways to battle global warming. In fact, it sometimes seems as though AI is the only solution we have. It helps us not only track our present data, but also analyze our past data so that we can make informed decisions about the future. One example is the use of AI to collect and organize large amounts of data on land, animals, weather, ecosystems, and more, so that scientists and governments can determine what needs to be done and the most cost-effective ways to deploy conservation methods.

We are quite surely seeing more and more AI initiatives being undertaken to help create a more eco-friendly world.

“In order to reduce human influence on nature, increasing levels of human interference with natural processes are required” (Harvard University)

Whatever the downsides of AI may be, its ability to help us stop destroying our planet is perhaps its most important trait, because as hard as it may be to accept, our planet is dying and AI can help us prevent that.

Interested in reading more? Check out our other blogs:

Building Facebook Messenger chatbot: what they forgot to tell you.


There are lots of written tutorials and online videos on this subject.

Yet many of them omit important details of the bot-building process. These details can vary from one user to another and are difficult to describe in a universal fashion, so it is easier for tutorial writers not to mention them at all. We try here to fill the gap and provide some additional clarity.

1. Creating a Facebook app.

One of the first steps in building a Facebook Messenger bot is creating a Facebook app, which requires a business Facebook page. This might seem obvious to avid social users, yet it is worth mentioning: a business Facebook page can only be created from a personal Facebook page. If you already have a business Facebook page, move on to the next step. If you have only a personal Facebook page, go ahead and create a business page. And if you are among the lucky ones who live without a Facebook presence, now is your chance to become like everybody else.

2. Getting an SSL certificate.

Next you need to set up a webhook. Your web application is hosted on a web server, and the webhook's role is to establish a connection between Facebook and your web application via that server. For the webhook to work you need an SSL certificate, because Facebook supports only secure connections (HTTPS) to external web servers. So first, you need to purchase a certificate. Costs vary from one company to another, but it is important to buy a reliable certificate, otherwise Facebook might reject it. Most major hosting providers and certificate authorities offer SSL products. Second, you need to install it on your web server. The installation process can be tricky. Sometimes you can get technical help from the company that sold you the certificate (as a rule of thumb, the bigger the brand, the better its technical support is supposed to be, though the cost may be higher too). You can also rely on popular tools, such as the keytool command-line utility, assuming you know how to use them. In any case, it might be a good idea to allocate several days, up to a week, for this step when planning your project.
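Once HTTPS is in place, the webhook handshake itself is simple. Below is a minimal Python sketch of the verification step Facebook performs when you subscribe a webhook: it sends a GET request with `hub.mode`, `hub.verify_token`, and `hub.challenge` parameters, and expects the challenge echoed back. The `VERIFY_TOKEN` value is a hypothetical placeholder; you choose it yourself and enter the same value in the Facebook app dashboard.

```python
# Sketch of Facebook's webhook verification handshake, assuming the request's
# query parameters have already been parsed into a dict by your web framework.
# VERIFY_TOKEN is a hypothetical placeholder you also enter in the app dashboard.
VERIFY_TOKEN = "my-secret-token"

def verify_webhook(params, verify_token=VERIFY_TOKEN):
    """Return the challenge string to echo back (HTTP 200) if the handshake
    is valid, otherwise None (the server should then respond with HTTP 403)."""
    if (params.get("hub.mode") == "subscribe"
            and params.get("hub.verify_token") == verify_token):
        return params.get("hub.challenge")
    return None
```

Whatever framework you use, this check runs once per subscription; after that, Facebook delivers message events to the same endpoint as POST requests.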

3. Choosing the server environment.

Your options are (almost) unlimited. Many online tutorials use Heroku, a cloud-based web application platform, but a simple Tomcat web server would suffice too. Your decision should be based on your business requirements. A lightweight server such as Tomcat is a good fit for web-centric, user-facing applications; if backend integration comes into play, a full web application server should be considered.

Your choice of programming languages is also broad. PHP is one popular option, Java is another, but the list by no means ends here. Your chatbot app communicates with Facebook using POST requests, so any language that supports web protocols will work. Again, make decisions with your business goals in mind.
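To make that POST exchange concrete, here is a short Python sketch of its two halves: parsing an incoming webhook event body, and building the JSON payload a reply to the Send API expects. The field names follow Facebook's documented webhook event format; the helper function names are our own.

```python
import json

def extract_messages(event_body):
    """Pull (sender_id, text) pairs out of a Messenger webhook POST body."""
    pairs = []
    for entry in event_body.get("entry", []):
        for event in entry.get("messaging", []):
            text = event.get("message", {}).get("text")
            if text is not None:
                pairs.append((event["sender"]["id"], text))
    return pairs

def build_reply(recipient_id, text):
    """Build the JSON payload for a text reply via the Send API."""
    return json.dumps({
        "recipient": {"id": recipient_id},
        "message": {"text": text},
    })
```

The same structure translates directly to PHP, Java, or any other language: read JSON in, write JSON out over HTTPS.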


Towards smarter data - accuracy and precision


There is a huge amount of information out there, and it is growing. To use it efficiently and increase our competitive advantage, we need to evolve and start using information in a smart way, concentrating on data that drives business value because it is accurate, actionable, and agile. Accuracy is an important measure of the quality of a data processing solution.

How is accuracy calculated?

It is easy to do with structured data, because the requirements can be formalized. It is less obvious with unstructured data, e.g. a stream of social feeds, or any data set that involves natural language. Sentences in natural language are open to multiple interpretations and therefore allow a degree of subjectivity. For example, should the sentence ‘I haven’t been on a sea cruise for a long time’ qualify for a data set of people interested in going on a cruise? Both answers, yes and no, seem valid.

In such cases an argument has been put forward that a consensus approach, which polls data providers, is the best way to judge data accuracy. This approach essentially claims that the attribute values with the highest consensus across data providers are the most accurate.
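As a toy illustration of that consensus approach (the function name is ours, not an established API), a majority vote across providers can be sketched in a few lines of Python:

```python
from collections import Counter

def consensus(provider_values):
    """Return the attribute value the most data providers agree on.

    provider_values: a list with one reported value per provider."""
    return Counter(provider_values).most_common(1)[0][0]
```

For example, if three providers report "interested" and one reports "not interested", the consensus value is "interested", regardless of which provider is actually right.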

At nmodes we deal with unstructured data all the time, because we process natural language messages, primarily from social networks. We do not favor this simplistic approach: it is biased, inviting people to make assumptions based on what they already believe to be true, and it makes no distinction between precision and accuracy. The difference is that precision measures how much of what you flagged was right, while accuracy measures how well you handled everything, both what you got right and what you got wrong. Accuracy is a more inclusive and therefore more valuable characteristic.
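The distinction can be made concrete with the standard confusion-matrix definitions, sketched here in Python (tp, fp, tn, fn stand for true/false positives and negatives):

```python
def precision(tp, fp):
    # Of the items the system labelled positive, how many really were positive?
    return tp / (tp + fp)

def accuracy(tp, tn, fp, fn):
    # Of all items, positive and negative, how many did the system get right?
    return (tp + tn) / (tp + tn + fp + fn)
```

A system that flags only a handful of obviously safe cases can score near-perfect precision while missing most of what it should have caught; its accuracy, which also counts those misses, stays low. That is why the two should not be conflated.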

Our approach is

a) to validate data against third-party independent sources (typically of academic origin) that contain trusted sets and reliable demographics. Validating nmodes data against third-party sources allows us to verify that our data achieves the greatest possible balance of scale and accuracy.

b) to enrich the existing test sets by purposefully including examples that are ambiguous in meaning and intent, and providing additional levels of categorization to cover these examples.
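For instance, the validation step in (a) can be sketched as a simple agreement measure against a trusted labelled set. The helper below is a hypothetical illustration of the idea, not our production code: it compares two label mappings on the items they share.

```python
def agreement(our_labels, trusted_labels):
    """Fraction of items present in both sets whose labels agree.

    our_labels, trusted_labels: dicts mapping item id -> label."""
    shared = our_labels.keys() & trusted_labels.keys()
    if not shared:
        return 0.0
    return sum(our_labels[k] == trusted_labels[k] for k in shared) / len(shared)
```

A low agreement score against a trusted academic set signals that either the labelling rules or the categorization levels need revisiting before the data can be considered accurate at scale.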

Accuracy becomes important as businesses move from the rudimentary data use typical of the first Big Data years to the more measured and careful approach of today. Understanding how it is calculated and the value it brings helps in achieving long-term sustainability and success.

