Deep Learning


Malevolent Machine Learning

Communications of the ACM

At the start of the decade, deep learning restored the reputation of artificial intelligence (AI) after years stuck in a technological winter. Within a few years of becoming computationally feasible, systems trained on thousands of labeled examples began to exceed human performance on specific tasks. One, for example, was able to decode road signs that had been rendered almost completely unreadable by the bleaching action of the sun. It just as quickly became apparent, however, that the same systems could easily be misled. In 2013, Christian Szegedy and colleagues working at Google Brain found that subtle pixel-level changes, imperceptible to a human but extending across the image, would lead a deep neural network (DNN) to classify a bright yellow U.S. school bus as an ostrich.
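To make the idea concrete, here is a minimal sketch of a gradient-based perturbation of the kind the article describes. It uses the later, simpler fast gradient sign method rather than the L-BFGS procedure from Szegedy's 2013 paper; the `model`, `epsilon`, and tensor shapes are illustrative assumptions, not details from the article.

```python
# A minimal sketch (assumed PyTorch classifier, illustrative epsilon) of a
# gradient-sign perturbation: a tiny, human-imperceptible pixel change that
# can flip a deep network's predicted class.
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.01):
    """Return an image nudged in the direction that increases the model's loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Small step along the sign of the gradient -- imperceptible to a human,
    # but often enough to change the classification.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()
```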


Enhancement of Deep Learning in Image Classification Performance Using Xception with the Swish Activation Function for Colorectal Polyp Preliminary Screening

#artificialintelligence

One of the leading forms of cancer is colorectal cancer (CRC), which is responsible for increasing mortality among young people. The aim of this paper is to present an experimental modification of the Xception deep learning model with the Swish activation function and to assess the feasibility of a preliminary colorectal polyp screening system by training the proposed model on a colorectal topogram dataset in two-class and three-class settings. The results indicate that the proposed model improves on the original convolutional neural network, achieving classification accuracy of up to 98.99% for two classes and 91.48% for three classes. When tested on external images, the proposed method also improves prediction over the traditional method, with 99.63% accuracy for true prediction of two classes and 80.95% for three classes.
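The sketch below is not the authors' code: the paper replaces activations inside Xception itself, whereas this minimal Keras example only pairs a stock Xception backbone with a Swish-activated classification head. Image size, head width, and class count are illustrative assumptions, and it presumes a recent TensorFlow release that ships `tf.keras.activations.swish`.

```python
# A minimal sketch: Xception backbone + Swish-activated head for a 2- or
# 3-class polyp screening task (all hyperparameters are assumptions).
import tensorflow as tf

NUM_CLASSES = 3  # 2 in the two-class setting, 3 in the extended setting

base = tf.keras.applications.Xception(
    include_top=False, weights="imagenet", input_shape=(299, 299, 3))

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(256, activation=tf.keras.activations.swish),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```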


Now Available on Amazon SageMaker: The Deep Graph Library - Amazon Web Services

#artificialintelligence

Today, we're happy to announce that the Deep Graph Library, an open source library built for easy implementation of graph neural networks, is now available on Amazon SageMaker. In recent years, deep learning has taken the world by storm thanks to its uncanny ability to extract elaborate patterns from complex data such as free-form text, images, or videos. However, many datasets don't fit these categories and are better expressed as graphs. Intuitively, we can feel that traditional neural network architectures such as convolutional neural networks or recurrent neural networks are not a good fit for such datasets, and a new approach is required. A primer on graph neural networks: graph neural networks (GNNs) are one of the most exciting developments in machine learning today, and these reference papers will get you started.
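For readers who have not tried the library, here is a minimal sketch of what working with DGL looks like, assuming a recent DGL release with the PyTorch backend; the tiny four-node graph and feature sizes are illustrative, not from the announcement.

```python
# A minimal sketch: build a small graph and apply one graph-convolution layer
# with the Deep Graph Library (PyTorch backend assumed).
import dgl
import torch
from dgl.nn import GraphConv

# A directed graph with 4 nodes and edges 0->1, 1->2, 2->3, 3->0.
g = dgl.graph((torch.tensor([0, 1, 2, 3]), torch.tensor([1, 2, 3, 0])))
g = dgl.add_self_loop(g)           # let each node also see its own features
features = torch.randn(4, 8)       # one 8-dimensional feature vector per node

conv = GraphConv(in_feats=8, out_feats=4)
node_embeddings = conv(g, features)
print(node_embeddings.shape)       # (4, 4): one embedding per node
```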


The Combination of AI & Blockchain Could Revolutionize These 10 Industries

#artificialintelligence

Organizations are increasingly looking to adopt blockchain technologies for alternative data storage. And with heaps of data distributed across blockchain ledgers, the need for data analytics with AI is growing. The combination of AI and blockchain is fueling the onset of the "Fourth Industrial Revolution" by reinventing economics and information exchange. From healthcare to government, the potent combination of both AI and blockchain is slowly but surely transforming industries. Google DeepMind is developing an "auditing system for healthcare data".


Deep Learning for Automatically Visual Evoked Potential Classification During Surgical Decompression of Sellar Region Tumors - TVST

#artificialintelligence

In the previous paper, we built a convolutional neural network to differentiate normal VEPs from abnormal VEPs using signals obtained from multifocal VEP examination.7 Still images are better suited to a convolutional neural network; for data with dynamic properties, a combination of convolutional and recurrent neural networks is more suitable. The recurrent neural network has proven useful for analyzing such data as clinical notes,23,24 anesthesia parameters,25 and cardiographs.26 Here, we combined a convolutional neural network and a recurrent neural network on the assumption that the former can differentiate static images and the latter can recognize dynamic patterns. We chose the long short-term memory (LSTM) layer because of its ability to selectively remember and forget patterns over long and short durations of time.
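The sketch below shows the general CNN-plus-LSTM pattern the excerpt describes, not the authors' architecture: convolutional layers pick up local waveform shapes, and the LSTM tracks how they evolve over the recording. The window length of 600 samples, layer sizes, and the binary normal/abnormal label are all assumptions.

```python
# A minimal sketch of a CNN + LSTM classifier for fixed-length signal windows
# (shapes and layer sizes are illustrative assumptions).
import tensorflow as tf

model = tf.keras.Sequential([
    # Convolutional stage: local waveform features within each window.
    tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu",
                           input_shape=(600, 1)),   # 600 samples, 1 channel
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    # Recurrent stage: how those features change over time.
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs. abnormal
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```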


Enabling the Deep Learning Revolution - KDnuggets

#artificialintelligence

Deep Learning (DL) models are revolutionizing the business and technology world with jaw-dropping performances in one application area after another -- image classification, object detection, object tracking, pose recognition, video analytics, synthetic picture generation -- just to name a few. However, they are anything but classical Machine Learning (ML) algorithms. DL models use millions of parameters and create extremely complex and highly nonlinear internal representations of the images or datasets that are fed to them. Whereas classical ML often requires domain experts and data scientists to write hand-crafted algorithms to extract and represent high-dimensional features from the raw data, deep learning models automatically extract and work on these complex features. Much of the theory and mathematical machinery behind classical ML (regression, support vector machines, etc.) was developed with linear models in mind.
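The contrast can be made concrete with a toy example. Below is a minimal sketch, on made-up data, of a classical pipeline whose features are hand-crafted by the practitioner versus a small network that learns its own representation from raw pixels; the histogram features, shapes, and hyperparameters are illustrative assumptions.

```python
# A minimal sketch contrasting hand-crafted features + SVM with a CNN that
# learns features from raw pixels (random stand-in data throughout).
import numpy as np
from sklearn.svm import SVC
import tensorflow as tf

images = np.random.rand(100, 28, 28)          # stand-in for real image data
labels = np.random.randint(0, 10, size=100)

# Classical ML: decide up front what the features are (here, crude brightness
# histograms), then fit a kernel model on them.
def hand_crafted_features(imgs):
    return np.stack([np.histogram(im, bins=16, range=(0, 1))[0] for im in imgs])

svm = SVC().fit(hand_crafted_features(images), labels)

# Deep learning: feed raw pixels; the convolutional layers learn the features.
cnn = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
cnn.fit(images[..., np.newaxis], labels, epochs=1, verbose=0)
```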


An introduction to deep learning with Brain.js - LogRocket Blog

#artificialintelligence

Using Brain.js is a fantastic way to build a neural network. It learns the patterns and relationships between inputs and outputs in order to make a somewhat educated guess when dealing with related problems. One example of a neural network is Cloudinary's image recognition add-on system. I was also shocked the first time I read the documentation of Brain.js. In this post, we will discuss some aspects of how neural networks work.


Popular Deep Learning Courses of 2019 - KDnuggets

#artificialintelligence

Deep Learning is gaining momentum and prominence among this decade's generation of data scientists. A few years ago, it was not as mainstream as Machine Learning techniques such as logistic regression and random forests. Nowadays, it is all about neural networks, activation functions, multiple layers, dropout, and so on. There is a good reason for this, which is simply that Deep Learning has been shown to outperform Machine Learning algorithms at times. The following courses are popular among practitioners for learning about the new wave of Deep Learning and AI.


What is TensorFlow? Where Can TensorFlow Be Used? - Besant Technologies

#artificialintelligence

TensorFlow is an open-source machine learning library known for its Python-friendly interface. It is used for numerical computation and can speed up the pace at which users build machine learning models. Created by Google, it has grown into one of the most successful deep learning libraries for learners. TensorFlow is a brainchild of Google's research division and aims to make large-scale machine learning simpler. The product is designed with researchers, programmers, and data scientists in mind.
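As a minimal sketch of what that means in practice, the example below shows TensorFlow used both for plain numerical computation on tensors and for defining a small model with its bundled Keras API; the values and layer sizes are illustrative.

```python
# A minimal sketch of TensorFlow: low-level tensor math plus a tiny Keras model.
import tensorflow as tf

# Numerical computation on tensors.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0], [0.5]])
print(tf.matmul(a, b))            # plain matrix multiplication

# Higher-level machine learning with the bundled Keras API.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```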


AI Hardware Built from a Software-first Perspective: Groq's Flexible Silicon Architecture - News

#artificialintelligence

Semiconductor industry startups are usually founded by hardware engineers who develop a silicon architecture and then figure out how to map software onto that specific hardware. Here is a tale of a chip startup, founded in the age of artificial intelligence (AI), that has software in its DNA. Groq was founded in 2016 by a group of software engineers who wanted to solve AI problems from the software side. Because they approached the problem without any preconceptions of what an AI architecture needs to look like, they were able to create an architecture that can be mapped to different AI models. The company is focused on the inference market for data centers and autonomous vehicles, and its first product is a PCIe plug-in card for which Groq designed the ASIC and AI accelerator and developed the software stack.