In many respects, we are reinventing modern programming tools for the A.I. age. Models and expensive resources like talent, data, and computing power are currently centralized within large tech corporations. TensorFlow, TensorFlow Hub, AutoML, Algorithmia, and cloud computing are all examples of the increasing decentralization of artificial intelligence. Decentralization can accelerate development (1,000 brains are better than 100) and make A.I. safer (more people are involved to check and balance development).
Businesses have entered the most rapid period of technological change in history, and artificial intelligence (AI) is on the cusp of revolutionizing the entire workforce, Ginni Rometty, chairman, president, and CEO of IBM, said in a keynote address at the 2018 Gartner Symposium/ITxpo in Orlando on Tuesday. "The pace is unabated," Rometty said. "You have to change the way you work, because this isn't going to stop." Yet AI has also become one of the great, meaningless buzzwords of our time. In this video, the Chief Data Scientist of Dun & Bradstreet explains AI in clear business terms.
Having worked in the cryptography space for over two decades, and having been an active participant in the cryptocurrency evolution since its inception, I take a deep interest in the subject. In particular, I believe that the intersection of artificial intelligence (AI) and blockchain is an exciting but challenging new development. Matt Turck recently discussed why the topic matters and highlighted interesting projects in the space, referring to AI (big data, data science, machine learning) and blockchain (decentralized infrastructure) as the defining technologies of the next decade. The time is already ripe for these concepts, even though both remain novel and underdeveloped. Meanwhile, AI startups are overwhelmingly being acquired by companies such as IBM, Apple, Facebook, Amazon, Google, Intel, and Alibaba, among others.
Abu Sebastian, an author on the paper, explained that executing certain computational tasks in the computer's memory would increase the system's efficiency and save energy. "If you look at human beings, we compute with 20 to 30 watts of power, whereas AI today is based on supercomputers which run on kilowatts or megawatts of power," Sebastian said. "In the brain, synapses are both computing and storing information. In a new architecture, going beyond von Neumann, memory has to play a more active role in computing." The IBM team drew on three different levels of inspiration from the brain.
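The point about memory playing an active role in computing can be illustrated with a toy model. In a memristive crossbar, each stored conductance both holds a weight and contributes directly to a matrix-vector multiply via Ohm's and Kirchhoff's laws, so the weights never travel across a von Neumann memory bus. The sketch below is a purely illustrative simulation, not IBM's actual system; the values are invented.

```python
# Illustrative sketch of compute-in-memory: a crossbar's column currents
# are I_j = sum_i G[i][j] * V[i], i.e. the stored conductance matrix G
# performs the multiply "in place" as voltages V are applied to its rows.

def crossbar_mvm(conductances, voltages):
    """Column currents of a crossbar: each cell stores AND computes."""
    rows, cols = len(conductances), len(conductances[0])
    return [sum(conductances[i][j] * voltages[i] for i in range(rows))
            for j in range(cols)]

# A 2x3 weight array stored as conductances (hypothetical values).
G = [[0.50, 1.0, 0.0],
     [0.25, 0.0, 2.0]]
V = [1.0, 2.0]  # input voltages applied to the rows

print(crossbar_mvm(G, V))  # [1.0, 1.0, 4.0]
```

In hardware this multiply happens in a single analog step inside the memory array itself, which is the source of the energy savings Sebastian describes.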
Edge services and edge computing have been discussed since at least the 1990s. When edge computing is extended to the cloud, it can be managed and consumed as if it were local infrastructure, much as humans find it hard to interact with infrastructure that is too far away. Edge analytics is an area of data analytics that is gaining a lot of attention these days. While traditional analytics answers questions such as what happened, why it happened, what is likely to happen, and what you should do about it, edge analytics performs that analysis in real time, close to where the data is generated.
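A minimal sketch of the edge-analytics idea: rather than shipping every raw reading to a central cluster, an edge device can keep a short rolling window and forward only readings that deviate sharply from the recent mean. The window size, threshold, and readings below are illustrative assumptions, not from any particular product.

```python
# Hypothetical edge-analytics filter: analyze the stream where it is
# produced, and send upstream only the readings worth acting on.
from collections import deque

def edge_filter(stream, window=5, threshold=2.0):
    """Yield readings that differ from the rolling mean by more than threshold."""
    recent = deque(maxlen=window)
    for reading in stream:
        if len(recent) == window:
            mean = sum(recent) / window
            if abs(reading - mean) > threshold:
                yield reading  # anomaly: worth forwarding to the cloud
        recent.append(reading)

readings = [10.1, 10.0, 9.9, 10.2, 10.0, 15.7, 10.1, 10.0]
print(list(edge_filter(readings)))  # [15.7]
```

Eight readings go in, one comes out: the bandwidth saving is the whole point of analyzing at the edge.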
Every second, approximately 6,000 tweets are posted on Twitter. That's a significant amount of data -- and it represents only one social media platform out of hundreds. Social media offers an enormous volume of unstructured data that can generate knowledge and help make better decisions on a larger scale. While humans are clearly efficient data generators, computers are having a difficult time processing and analyzing the sheer volume of data. Arizona State University Associate Professor Ming Zhao leads the development of GEARS, a big data computing infrastructure designed for today's demanding big data challenges.
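A quick back-of-the-envelope calculation shows the scale behind that 6,000-tweets-per-second figure (the per-tweet size below is an illustrative assumption):

```python
# Rough scale of the tweet firehose cited above.
tweets_per_second = 6_000
tweets_per_day = tweets_per_second * 60 * 60 * 24
print(tweets_per_day)  # 518400000 -- over half a billion tweets a day

# Assuming ~1 KB of text plus metadata per tweet, that is on the order
# of half a terabyte of raw data per day, from one platform alone.
approx_gb_per_day = tweets_per_day * 1_000 / 1e9
print(round(approx_gb_per_day, 1))  # 518.4
```

Multiply that by hundreds of platforms and the motivation for infrastructure like GEARS becomes clear.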
Today marks the start of the fall Strata Data Conference in New York City, which has traditionally been the big data community's biggest show of the year. It's been a wild ride for the big data crowd in 2018, one that's brought its share of highs and lows. Now it's worth taking some time to consider how far big data has come, and where it might be headed. Here are five things to keep in mind as the Strata Data Conference kicks off. We've said this before, but it bears repeating: Hadoop is just one of many technologies angling for relevance in today's increasingly heterogeneous at-scale computing environment.
Twenty years ago, the Open Source Definition was published, launching what would become the most significant trend in software development since. Whether you want to call it "free software" or "open source", ultimately it's all about making application and system source code widely available and putting the software under a license that favors user autonomy. According to Ovum, open source is already the default option across several big data categories ranging from storage, analytics and applications to machine learning. In the latest Black Duck Software and North Bridge survey, 90% of respondents reported they rely on open source "for improved efficiency, innovation and interoperability," most commonly because of "freedom from vendor lock-in; competitive features and technical capabilities; ability to customize; and overall quality." There are now thousands of successful open source projects that companies must strategically choose from to stay competitive.
As artificial intelligence (AI) and machine learning (ML) begin to move out of academia and into the business world, there's been a lot of focus on how they can help business intelligence (BI). There is a lot of potential in systems that use natural language search to help management more quickly investigate corporate information, perform analysis, and define business plans. A previous column discussing "self-service" BI briefly mentioned two areas of focus where ML can help BI. While the user interface and user experience (UX) matter, their visibility is only the tip of the iceberg; the data being supplied to the UX is even more important.
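To make the natural-language-search idea concrete, here is a deliberately tiny sketch: keywords in a question are mapped onto a filter and an aggregation over tabular data. The field names, data, and keyword rules are invented for illustration; real BI systems use full natural language parsing, not keyword matching.

```python
# Toy "natural language search" over a small sales table (hypothetical data).
sales = [
    {"region": "east", "revenue": 120},
    {"region": "west", "revenue": 90},
    {"region": "east", "revenue": 60},
]

def answer(question):
    """Map keywords in the question to a filter plus an aggregation."""
    q = question.lower()
    # Filter: keep rows whose region is mentioned; otherwise keep all rows.
    rows = [r for r in sales if r["region"] in q] or sales
    # Aggregation: pick one based on the question's wording.
    if "total" in q:
        return sum(r["revenue"] for r in rows)
    if "average" in q:
        return sum(r["revenue"] for r in rows) / len(rows)
    return rows

print(answer("total revenue in the east region"))  # 180
```

Even this toy shows why the data layer matters more than the UX: the answer is only as good as the rows the query layer supplies.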
This is an eclectic collection of interesting blog posts, software announcements and data applications I've noted over the past month or so. ONNX Model Zoo is now available, providing a library of pre-trained, state-of-the-art deep learning models in the ONNX format. In the 2018 IEEE Spectrum Top Programming Language rankings, Python takes the top spot and R ranks #7. Julia 1.0 has been released, marking the stabilization of the scientific computing language and promising forward compatibility. Google announces Cloud AutoML, a beta service to train vision, text categorization, or language translation models from provided data.