Neural Networks


How to teach computers to recognize dogs and cakes in images

#artificialintelligence

Artificial intelligence is a fascinating topic that evokes strong emotions, because the development of new technologies brings both opportunities and threats. Some artificial intelligence technologies have been around for a long time, but advances in computational power and numerical optimization routines, together with the availability of huge amounts of data, have led to great breakthroughs in this field. Artificial intelligence is widely used to provide personalized recommendations when shopping or simply searching for information on the web. More advanced inventions include autonomous self-driving cars, which, put simply, decide on a vehicle's next movements based on data collected from the various sensors installed in it.


Demystifying Differentiation and Optimisers in Neural Network

#artificialintelligence

In the previous article, I discussed the different loss functions available and introduced the gradient function. The gradient function differentiates the loss with respect to each weight to calculate how the network must change that weight to move closer to the expected output.
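For readers who prefer to see the idea in code, here is a minimal sketch of a single-weight gradient-descent update; the toy data, squared-error loss, and learning rate are assumptions chosen only to illustrate how differentiating the loss with respect to the weight drives it toward the expected output.

```python
# Toy example: one weight, one training pair (values are illustrative assumptions).
x, y_true = 2.0, 10.0   # input and expected output
w = 0.5                 # initial weight
lr = 0.05               # learning rate (assumed)

for step in range(100):
    y_pred = w * x                     # forward pass
    loss = (y_pred - y_true) ** 2      # squared-error loss for this sample
    grad = 2 * (y_pred - y_true) * x   # d(loss)/d(w) via the chain rule
    w -= lr * grad                     # gradient-descent update

print(round(w, 4))  # converges towards y_true / x = 5.0
```

More sophisticated optimisers (momentum, Adam, and so on) change how the raw gradient is applied, but all of them start from this same derivative of the loss with respect to the weight.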


spaCy Version 3.0 Released: All Features & Specifications

#artificialintelligence

The 3.0 version has state-of-the-art transformer-based pipelines and pre-trained models in seventeen languages. The first version of spaCy was a preliminary release with little support for deep-learning workflows. The second version, however, introduced convolutional neural network models in seven different languages. The third version is a massive improvement over both of these versions. The 3.0 version has completely dropped support for Python 2 and works only on Python 3.6 and above.
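As a rough illustration (the pipeline name and install steps below are standard spaCy usage, not details taken from this announcement), loading one of the 3.0 transformer-based English pipelines looks like this:

```python
import spacy

# Assumes spaCy >= 3.0 with the transformers extra installed and the English
# transformer pipeline downloaded, e.g.:
#   pip install "spacy[transformers]"
#   python -m spacy download en_core_web_trf
nlp = spacy.load("en_core_web_trf")

doc = nlp("spaCy 3.0 ships transformer-based pipelines in many languages.")
for token in doc:
    print(token.text, token.pos_, token.dep_)   # part-of-speech and dependency labels
for ent in doc.ents:
    print(ent.text, ent.label_)                 # named entities
```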


GPU Training on Apple M1

#artificialintelligence

Since its release in November 2020, the first Macs with an Arm-based M1 chip have been a topic of discussion in the developer community. The new M1 chip in the MacBook Pro consists of an 8-core CPU, an 8-core GPU, and a 16-core Neural Engine, among other things. Both the processor and the GPU are far superior to the previous-generation Intel configurations, and so far the chip has proven itself superior to anything Intel has offered. However, the deep learning community has struggled with native Arm support, especially since most libraries and frameworks target CUDA and the x86 architecture.
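The article does not name a specific framework, but as one illustration of what targeting the M1 GPU can look like, here is a minimal PyTorch sketch using the Metal Performance Shaders (MPS) backend that later shipped in PyTorch 1.12; treat the backend choice as an assumption rather than the setup described above.

```python
import torch

# Fall back to the CPU when the MPS (Apple GPU) backend is unavailable.
device = torch.device("mps") if torch.backends.mps.is_available() else torch.device("cpu")

model = torch.nn.Linear(128, 10).to(device)   # tiny model, just for the sketch
x = torch.randn(32, 128, device=device)       # a batch of dummy inputs
out = model(x)
print(out.device)                             # "mps:0" on an M1 Mac, otherwise "cpu"
```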


Real-World Blind Face Restoration with Generative Facial Prior

#artificialintelligence

Technological developments in this decade have led to some of the most awe-inspiring discoveries. With rapidly changing technology, and systems to support it and provide back-end processing power, the world seems to become a better place to live in day by day. Technology has reached such heights that little our ingenious minds conceive of today looks impossible to accomplish. The driving force behind such advancements in this new era of technological and computational superiority seems to be two of the most highly debated domains: Machine Learning and Artificial Intelligence. The canvas and creative space these two domains provide seem unfathomable.


How AI is powering the future of financial services

#artificialintelligence

Financial institutions are using AI-powered solutions to unlock revenue growth opportunities, minimise operating expenses, and automate manually intensive processes. Many in the financial services industry believe strongly in the potential of AI. A recent NVIDIA survey of financial services professionals showed 83% of respondents agreeing that AI is important to their company's future success. The survey, titled 'State of AI in Financial Services', also showed a substantial financial impact of AI on enterprises, with 34% of respondents agreeing that AI will increase their company's annual revenue by at least 20%. The approach to using AI differed based on the type of financial firm.


USC researchers enable AI to use its "imagination." - USC Viterbi

#artificialintelligence

The new AI system takes its inspiration from humans: when we see a color on one object, we can easily apply it to any other object by substituting the original color with the new one. Imagine, for example, a cat. Now, imagine the same cat, but with coal-black fur. Now, imagine the cat strutting along the Great Wall of China. As you do this, a quick series of neuron activations in your brain comes up with variations of the picture presented, based on your previous knowledge of the world. In other words, as humans, it's easy to envision an object with different attributes.


DEELIG: A Deep Learning Approach to Predict Protein-Ligand Binding Affinity - Docwire News

#artificialintelligence

Protein-ligand binding prediction has extensive biological significance. Binding affinity helps in understanding the degree of protein-ligand interaction and is a useful measure in drug design. Protein-ligand docking using virtual screening and molecular dynamics simulations are required to predict the binding affinity of a ligand to its cognate receptor. Performing such analyses to cover the entire chemical space of small molecules requires intense computational power. Recent developments in deep learning have enabled us to make sense of massive amounts of complex data; the ability of a model to "learn" intrinsic patterns in a complex plane of data is the strength of the approach.
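To make the general idea concrete, the sketch below maps a fixed-size numeric featurisation of a protein-ligand complex to a single predicted affinity value; the feature size, layers, and dummy data are illustrative assumptions, not the DEELIG architecture itself.

```python
import torch
import torch.nn as nn

class AffinityRegressor(nn.Module):
    """Toy regressor: complex features -> predicted binding affinity (e.g. pKd)."""
    def __init__(self, n_features: int = 256):    # feature size is an assumption
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, 64), nn.ReLU(),
            nn.Linear(64, 1),                      # single affinity value
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

model = AffinityRegressor()
features = torch.randn(8, 256)    # dummy featurisations of 8 complexes
print(model(features).shape)      # torch.Size([8])
```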


Real-time Interpretation: The next frontier in radiology AI - MedCity News

#artificialintelligence

In the nine years since AlexNet spawned the age of deep learning, artificial intelligence (AI) has made significant technological progress in medical imaging, with more than 80 deep-learning algorithms approved by the U.S. FDA since 2012 for clinical applications in image detection and measurement. A 2020 survey found that more than 82% of imaging providers believe AI will improve diagnostic imaging over the next 10 years, and the market for AI in medical imaging is expected to grow 10-fold over the same period. Despite this optimistic outlook, AI still falls short of widespread clinical adoption in radiology. A 2020 survey by the American College of Radiology (ACR) revealed that only about a third of radiologists use AI, mostly to enhance image detection and interpretation; of the two-thirds who did not use AI, the majority said they saw no benefit to it. In fact, most radiologists would say that AI has not transformed image reading or improved their practices.


Researchers demonstrate that malware can be hidden inside AI models

#artificialintelligence

Researchers Zhi Wang, Chaoge Liu, and Xiang Cui published a paper last Monday demonstrating a new technique for slipping malware past automated detection tools--in this case, by hiding it inside a neural network. The three researchers embedded 36.9 MiB of malware into a 178 MiB AlexNet model without significantly altering the function of the model itself. The malware-embedded model classified images with near-identical accuracy, within 1% of the malware-free model. Just as importantly, squirreling the malware away inside the model broke it up in ways that prevented detection by standard antivirus engines. VirusTotal, a service that "inspects items with over 70 antivirus scanners and URL/domain blocklisting services, in addition to a myriad of tools to extract signals from the studied content," did not raise any suspicions about the malware-embedded model.
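To see why such embedding leaves accuracy nearly unchanged, consider the sketch below (an illustration of the general idea, not the authors' exact procedure): overwriting only the least significant byte of each 32-bit weight perturbs its value very slightly while still storing one payload byte per weight.

```python
import numpy as np

weights = np.random.randn(1024).astype(np.float32)   # stand-in for model weights
payload = b"example payload bytes"                   # stand-in for embedded data

raw = weights.view(np.uint8).copy()                  # weights as raw bytes
# On a little-endian machine, byte 0 of each float32 is the lowest mantissa byte.
raw[::4][:len(payload)] = np.frombuffer(payload, dtype=np.uint8)
stego = raw.view(np.float32)                         # "weights" carrying hidden bytes

print(np.max(np.abs(stego - weights)))               # tiny perturbation per weight
recovered = stego.view(np.uint8)[::4][:len(payload)].tobytes()
print(recovered == payload)                          # True: payload is recoverable
```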