The Panama Papers: It's all about the data!

#artificialintelligence

The Panama Papers leak has shaken the world. As we all know, the Panama Papers are a 2.6 TB trove comprising 11.5 million confidential documents with detailed information about more than 214,000 offshore companies listed by the Panamanian corporate service provider Mossack Fonseca. The Panama Papers have set an excellent example for the world of the importance of data science when it comes to analyzing big data. This leak makes us realize that appropriate approaches are needed to handle the challenges of data management, both now and in the future. Let's take a deep dive into the Panama Papers and dig into the secret behind the biggest leak ever. The leak contains 4.8 million emails, 3 million database entries, 2.15 million PDFs, around one million images and 320,000 text documents.
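
The first step in taming a dump of this scale is simply taking stock of what it contains. Below is a minimal sketch of such an inventory pass; the directory name and the extension-to-category mapping are hypothetical, but the categories mirror the breakdown cited above.

```python
# Minimal sketch (hypothetical paths and categories): inventorying a large,
# mixed-format document dump by file type, the first step before any
# indexing or entity extraction can happen at Panama Papers scale.
from collections import Counter
from pathlib import Path

# Map file extensions to the coarse categories cited in the leak's breakdown.
CATEGORIES = {
    ".eml": "emails", ".msg": "emails",
    ".pdf": "PDFs",
    ".jpg": "images", ".png": "images", ".tif": "images",
    ".txt": "text documents",
    ".db": "database entries", ".sql": "database entries",
}

def inventory(root: str) -> Counter:
    """Walk the dump and tally files per category (unknown types -> 'other')."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file():
            counts[CATEGORIES.get(path.suffix.lower(), "other")] += 1
    return counts

if __name__ == "__main__":
    for category, n in inventory("./leak_dump").most_common():
        print(f"{category}: {n:,}")
```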


Why Machine Learning and Big Data need Behavioral Economists

#artificialintelligence

Researchers from Princeton University received mass media attention when they recently predicted the demise of Facebook. Data scientists at Facebook soon hit back with their own "study": "In keeping with the scientific principle (used by Princeton) 'correlation equals causation,' our research unequivocally demonstrated that Princeton may be in danger of disappearing entirely." Is it surprising that the original Princeton study found its way onto the front pages of newspapers and magazines across the world? Probably not – the fact is that statistical results with a causal interpretation have a stronger effect on our thinking than non-causal information. What the researchers at Princeton relied upon in presenting their paper was our individual human inability to think statistically.
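
The fallacy is easy to demonstrate: any two series that merely share a time trend will correlate strongly, with no causal link whatsoever. A toy illustration (synthetic data, invented variable names):

```python
# Illustrative only: two independently generated series that both trend
# upward over time show a high correlation, the "correlation equals
# causation" trap the Facebook team was mocking.
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(100)

# Hypothetical series: each is just a linear trend plus noise.
facebook_users = 50 + 0.8 * t + rng.normal(0, 4, t.size)
unrelated_metric = 10 + 0.5 * t + rng.normal(0, 3, t.size)

r = np.corrcoef(facebook_users, unrelated_metric)[0, 1]
print(f"Pearson r = {r:.2f}")  # typically > 0.9, yet neither causes the other
```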


Television And Geography As Big Data: Mapping A Decade Of Television News

#artificialintelligence

What happens when we begin to think of all information as data that can be explored to yield new insights into our world? What would it look like to take nearly a decade of CNN, Fox News, and MSNBC television broadcasts, plus two years of BBC News broadcasts, run them through sophisticated natural language processing algorithms to identify every mention of a location on earth in their coverage, and then create a series of maps that visualize the places we hear about when we turn to the news? What would those maps look like, and what might they tell us about what we see when we turn on our televisions each day? Half a decade ago, I began working with the Internet Archive's incredible Television News Archive to explore how powerful computer algorithms could allow us to "see" the news in entirely new ways. From simple longitudinal keyword searches to mass emotion mining to geographic mapping to the most powerful deep learning algorithms watching political ads, television has an incredible amount to teach us as we explore it through the modalities and lenses of massive data mining.
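
One plausible way to approximate the location-extraction step with off-the-shelf tools is named entity recognition over broadcast transcripts. This is only an illustrative sketch, not the author's actual pipeline; it assumes spaCy and its small English model are installed.

```python
# Sketch: count geographic entity mentions in one transcript using
# off-the-shelf NER. Requires `pip install spacy` and the
# en_core_web_sm model (`python -m spacy download en_core_web_sm`).
from collections import Counter
import spacy

nlp = spacy.load("en_core_web_sm")

def location_mentions(transcript: str) -> Counter:
    """Tally GPE (geopolitical) and LOC (location) entity mentions."""
    doc = nlp(transcript)
    return Counter(ent.text for ent in doc.ents if ent.label_ in {"GPE", "LOC"})

sample = "Tonight we report from Aleppo, Syria, while talks continue in Geneva."
print(location_mentions(sample))  # e.g. Aleppo: 1, Syria: 1, Geneva: 1
```

Turning such tallies into maps would additionally require geocoding each mention against a gazetteer to resolve ambiguous place names, which is where most of the real engineering effort lies.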


Artificial Intelligence and Machine Learning in Big Data and IoT: The Market for Data Capture …

@machinelearnbot

NEW YORK, Dec. 16, 2016 /PRNewswire/ -- Overview: More than 50% of enterprise IT organizations are experimenting with Artificial Intelligence (AI) in various forms, such as Machine Learning, Deep Learning, Computer Vision, Image Recognition, Voice Recognition, Artificial Neural Networks, and more. AI is not a single technology but a convergence of various technologies, statistical models, algorithms, and approaches. Machine Learning is a sub-field of computer science that evolved from the study of pattern recognition and computational learning theory in AI. Every large corporation collects and maintains a huge amount of human-oriented data associated with its customers, including their preferences, purchases, habits, and other personal information. As the Internet of Things (IoT) progresses, there will be an increasingly large amount of unstructured machine data.
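
As a concrete, if toy, instance of the kind of pattern recognition over machine data the report describes: flagging anomalous sensor readings with an isolation forest. The data here is synthetic and the parameters are arbitrary; it assumes scikit-learn is available.

```python
# Toy sketch: anomaly detection over (synthetic) IoT sensor readings.
# All values and thresholds are invented for illustration.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal = rng.normal(loc=[20.0, 50.0], scale=[1.0, 5.0], size=(500, 2))  # temp, humidity
faults = rng.uniform(low=[35.0, 5.0], high=[45.0, 15.0], size=(10, 2))  # failing units
readings = np.vstack([normal, faults])

model = IsolationForest(contamination=0.02, random_state=0).fit(readings)
flags = model.predict(readings)  # 1 = normal, -1 = anomaly
print(f"flagged {np.sum(flags == -1)} of {len(readings)} readings")
```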


Fake News? Big Data And Artificial Intelligence To The Rescue

#artificialintelligence

The impact of fake news on the recent election has focused public attention on this multi-tentacled and growing problem. Vast swaths of the population fall prey to such misinformation, while others struggle to discern unbiased truth from the morass of lies and distortions that surrounds us. Experts recommend that we follow basic principles of information hygiene to separate fake from real, including checking sources, looking for bad grammar and typos, and seeking out corroborating information. And top of the list: never believe anything you read on . However, none of these techniques is particularly effective.
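
This is where the "big data and AI" angle comes in: treating fake-news detection as a text-classification problem. The sketch below is a deliberately tiny baseline on hypothetical headlines and labels; a production system would need large labeled corpora, source metadata, and fact-checking signals well beyond word counts.

```python
# Deliberately tiny baseline: a bag-of-words classifier over labeled
# headlines. Data and labels are hypothetical; requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

headlines = [
    "Scientists confirm water found on Mars surface",
    "Senate passes budget bill after lengthy debate",
    "SHOCKING!! Doctors HATE this one weird miracle cure",
    "You wont believe what this celebrity said, share before deleted",
]
labels = [0, 0, 1, 1]  # 0 = credible, 1 = fake (invented labels)

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(headlines, labels)
print(clf.predict(["Miracle cure doctors dont want you to know, SHARE NOW"]))
```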