The Coming Convergence of NFTs and Artificial Intelligence

#artificialintelligence

In the near future, we should see the value of AI-generated NFTs expand beyond generative art into more generic NFT utility categories, providing a natural vehicle for leveraging the latest deep learning techniques. An example of this value proposition can be seen in digital artists like Refik Anadol, who are already experimenting with cutting-edge deep learning methods for the creation of NFTs. Anadol's studio has been a pioneer in using techniques such as GANs, and even dabbling in quantum computing, training models on hundreds of millions of images and audio clips to create astonishing visuals. NFTs have been one of the recent delivery mechanisms explored by Anadol.


Forecasting with Machine Learning Models

#artificialintelligence

TL;DR: We introduce mlforecast, an open source framework from Nixtla that makes using machine learning models for time series forecasting fast and easy. It allows you to focus on the model and features instead of implementation details. With mlforecast you can run experiments more easily, and it has built-in backtesting functionality to help you find the best-performing model. You can use mlforecast in your own infrastructure or use our fully hosted solution; just send us an email at federico@nixtla.io.
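To make the workflow concrete, here is a minimal sketch of the lag-feature forecasting loop that a framework like mlforecast automates. It uses plain pandas and scikit-learn rather than mlforecast's own API, and the column names, lags, data, and model choice are illustrative assumptions.

```python
# Generic lag-feature forecasting, assuming a DataFrame with columns
# "ds" (timestamp) and "y" (target). This illustrates the kind of feature
# engineering and holdout evaluation that a framework such as mlforecast
# automates; it does not use mlforecast's own API.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

def add_lags(df, lags=(1, 7, 14)):
    out = df.copy()
    for lag in lags:
        out[f"lag_{lag}"] = out["y"].shift(lag)
    return out.dropna()

# Hypothetical daily series, purely for illustration.
df = pd.DataFrame({
    "ds": pd.date_range("2021-01-01", periods=200, freq="D"),
    "y": pd.Series(range(200)).astype(float),
})

data = add_lags(df)
train, valid = data.iloc[:-28], data.iloc[-28:]   # hold out the last 28 days
features = [c for c in data.columns if c.startswith("lag_")]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train[features], train["y"])

# One-step-ahead evaluation on the holdout; a full backtest would
# forecast recursively and roll the evaluation window forward.
preds = model.predict(valid[features])
print("MAE on holdout:", mean_absolute_error(valid["y"], preds))
```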


epitope3D: a machine learning method for conformational B-cell epitope prediction - PubMed

#artificialintelligence

The ability to identify antigenic determinants of pathogens, or epitopes, is fundamental to guide rational vaccine development and immunotherapies, which are particularly relevant for rapid pandemic response. A range of computational tools has been developed over the past two decades to assist in epitope prediction; however, they have shown limited performance and generalization, particularly for the identification of conformational B-cell epitopes. Here, we present epitope3D, a novel scalable machine learning method capable of accurately identifying conformational epitopes trained and evaluated on the largest curated epitope data set to date. Our method uses the concept of graph-based signatures to model epitope and non-epitope regions as graphs and extract distance patterns that are used as evidence to train and test predictive models. We show epitope3D outperforms available alternative approaches, achieving Matthews Correlation Coefficient and F1-scores of 0.55 and 0.57 on cross-validation and 0.45 and 0.36 during independent blind tests, respectively.
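For reference, the two evaluation metrics quoted above can be computed with scikit-learn. The sketch below uses hypothetical binary labels (1 = epitope residue, 0 = non-epitope) purely to illustrate the metrics, not the epitope3D method itself.

```python
# Computing the evaluation metrics cited above (MCC and F1) on
# hypothetical binary labels; this only illustrates the metrics,
# not the graph-based signature method used by epitope3D.
from sklearn.metrics import matthews_corrcoef, f1_score

y_true = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1]   # ground-truth labels (illustrative)
y_pred = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]   # model predictions (illustrative)

print("MCC:", matthews_corrcoef(y_true, y_pred))
print("F1: ", f1_score(y_true, y_pred))
```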


How Edtech Platforms Are Finding Takers In This Ever-evolving Industry

#artificialintelligence

Technological progression in India is changing the landscape of the education industry, especially as the sector seamlessly integrates the latest advancements into its systems. With the pandemic catching the country off guard and on-site education models coming to a halt, virtual classrooms and tech-enabled learning are gaining pace. Aligning with the new normal, tech-enabled learning is reshaping the education culture for millennial learners. Edtech has not only escalated the game to a whole new level for educators and universities but is also anticipated to be the future of learning. The paradigm shift in the education sector is proving to be a necessary intervention as it is on its way to making learning more accessible and transformational for everyone.


How to debug a synapse classifier with webKnossos

#artificialintelligence

What is your role at scalable minds? As a data scientist, I continuously work on our products, talk to collaborators about their needs, handle deadlines, and train machine learning models. Sometimes that means training a classifier and debugging it iteratively; at other times it means building data processing pipelines. What do you like most about your work here? The fact that we support researchers in neuroscience and the life sciences, which is a meaningful purpose.


Diversity's Critical Role in AI and Innovation

#artificialintelligence

We were delighted to be joined by over 100 Women in AI at the end of November for the first instalment of our Women in AI virtual evenings. The evening of virtual networking, discussion and keynote presentations, supported by TD Bank, covered topics including 'Diversity's Critical Role in AI and Innovation', 'Action Recognition for Behaviour Understanding from Video in 2020' and more. Speakers included Jane Ho, Associate VP, Data & Analytics at TD Bank, Ashley Cohen, Principal Analytical Lead at Google, Tanmana Sadhu, Computer Vision Engineer at Huawei Canada, Inmar Givoni, Director of Engineering at Uber ATG, Sedef Akinli Kocak, Senior Lecturer at Ryerson University and Hakimeh Purmehdi, Senior Data Scientist at Ericsson. A summary of highlights is below, including a video recording of the panel discussion. Artificial intelligence and machine learning models are heavily reliant on the data that feed them. While AI can improve human decision making, the data can reflect biased human decisions made in the past, so AI output may inherit or even amplify those biases.


MLPerf™ HPC: A Holistic Benchmark Suite for Scientific Machine Learning on HPC Systems

#artificialintelligence

Scientific communities are increasingly adopting machine learning and deep learning models in their applications to accelerate scientific insights. High performance computing systems are pushing the frontiers of performance with a rich diversity of hardware resources and massive scale-out capabilities. There is a critical need to understand fair and effective benchmarking of machine learning applications that are representative of real-world scientific use cases. MLPerf™ is a community-driven standard for benchmarking machine learning workloads, focusing on end-to-end performance metrics. In this paper, we introduce MLPerf HPC, a benchmark suite of large-scale scientific machine learning training applications, driven by the MLCommons™ Association.


10 Days of No Code Artificial Intelligence Bootcamp

#artificialintelligence

The no-code AI revolution is here! Do you have what it takes to leverage this new wave of no-code tools paving the way for the future of AI? Businesses of all sizes want to harness the power of Machine Learning and AI, but the barriers to entry are high. That's where no-code AI/ML tools are changing the game. With fast implementation, lower development costs and ease of use, departments across healthcare, finance, marketing and more are turning to no-code solutions to deliver real impact. But groundbreaking as they are, they're nothing without talent like YOU calling the shots... Yes?! Then this course is for you.


Introduction to Boosted Trees

#artificialintelligence

Welcome to my new article series: Boosting algorithms in machine learning! This is Part 1 of the series. Here, I'll give you a short introduction to boosting, its objective, some key definitions and a list of boosting algorithms that we intend to cover in the next posts. You should be familiar with elementary tree-based machine learning models such as decision trees and random forests. In addition to that, it is recommended to have good knowledge of Python and its Scikit-learn library.
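As a concrete starting point, here is a minimal boosted-trees sketch using scikit-learn's GradientBoostingClassifier. The dataset and hyperparameters are illustrative assumptions; the specific boosting algorithms themselves are the subject of the later posts in the series.

```python
# A minimal boosted-trees example with scikit-learn. The dataset and
# hyperparameters are illustrative only.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Boosting builds shallow trees sequentially, each one fit to the
# errors of the ensemble built so far.
clf = GradientBoostingClassifier(
    n_estimators=100, learning_rate=0.1, max_depth=3, random_state=42
)
clf.fit(X_train, y_train)

print("Test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```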


How to Improve Deep Learning Forecasts for Time Series

#artificialintelligence

Clustering time series data before fitting can improve accuracy by 33% -- src. In 2021, researchers at UCLA developed a method that can improve model fit on many different time series. By aggregating similarly structured data and fitting a model to each group, our models can specialize. While the approach is fairly straightforward to implement, as with any other complex deep learning method we are often computationally limited by large data sets. However, all of the methods listed have support in both R and Python, so development on smaller datasets should be pretty "simple."
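Here is a minimal sketch of the cluster-then-fit idea: group series by shape, then fit a separate model to each group. The synthetic data, k-means on z-normalized values, and pooled linear models are illustrative assumptions, not the UCLA method itself.

```python
# Cluster-then-fit sketch: group similar time series, then fit one simple
# model per cluster. Synthetic data and plain k-means on the raw values
# are illustrative stand-ins for the approach described above.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
t = np.arange(48)

# 20 synthetic series: half roughly linear trends, half seasonal.
trend = np.stack([0.5 * t + rng.normal(0, 1, t.size) for _ in range(10)])
season = np.stack([5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)
                   for _ in range(10)])
series = np.vstack([trend, season])

# Step 1: cluster the series by shape (z-normalized values).
normed = (series - series.mean(axis=1, keepdims=True)) / series.std(axis=1, keepdims=True)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(normed)

# Step 2: fit one model per cluster (here: a pooled linear trend on the
# time index), so each model specializes on structurally similar series.
for k in np.unique(labels):
    group = series[labels == k]
    X = np.tile(t, group.shape[0]).reshape(-1, 1)   # time index as the only feature
    y = group.ravel()
    model = LinearRegression().fit(X, y)
    print(f"cluster {k}: {group.shape[0]} series, R^2 = {model.score(X, y):.2f}")
```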