Teikametrics Raises $15 Million to Extend Its AI Multi-Channel Optimization


Teikametrics, a leading SaaS provider of AI-powered optimization for brands and sellers on Amazon and Walmart, announced the completion of a $15 million strategic funding round backed by new and existing investors. The announcement follows Teikametrics' selection as one of Walmart's first exclusive advertising optimization partners, and the addition of Srinivas Guddanti, a 14-year senior Amazon veteran, as its Chief Product Officer. Jump Capital led the round and was joined by follow-on investments from Granite Point Capital, MIT Professor of Econometrics Jerry Hausman, and the former Head of Growth at Facebook and Uber, Ed Baker. "We're thrilled to lead this new round of capital in Teikametrics," said Michael McMahon, founding partner of Jump Capital. "The Company has grown rapidly, and the success of its proprietary AI technology for Amazon is a strong proof point for a broader ecommerce platform opportunity. The partnership with Walmart is a landmark event and we are excited to fund the expansion of the Teikametrics platform across multiple ecommerce channels."

5 supply chain technologies that deliver competitive advantage MIT Sloan


Facing globalization, increased product complexity, and heightened customer demands, companies are adopting advanced technologies to transform their supply chain from a pure operations hub into the epicenter of business innovation. Using sensors and ever-improving internet connectivity, forward-thinking companies are collecting data at every checkpoint, from the flow of raw materials to the condition and location of finished goods. Machine learning, artificial intelligence (AI), and advanced analytics help drive automation and deliver insights that promote efficiencies -- making on-the-fly route changes to accelerate product delivery, for example, or swapping out materials to take advantage of better pricing or availability. Additive manufacturing is also opening doors to easy production of spare parts, enabling companies to slash inventory, cut costs, and create supplementary revenue streams. These advanced technologies are serving as a springboard for new business models -- for example, many firms are piggybacking off the "internet of things" (IoT) to offer predictive maintenance services that guarantee product uptime while generating recurring revenue.

New machine learning method could supercharge battery development for electric vehicles


Battery performance can make or break the electric vehicle experience, from driving range to charging time to the lifetime of the car. Now, artificial intelligence has made dreams like recharging an EV in the time it takes to stop at a gas station a more likely reality, and could help improve other aspects of battery technology. For decades, advances in electric vehicle batteries have been limited by a major bottleneck: evaluation times. At every stage of the battery development process, new technologies must be tested for months or even years to determine how long they will last. But now, a team led by Stanford professors Stefano Ermon and William Chueh has developed a machine learning-based method that slashes these testing times by 98 percent.

12-Hour Machine Learning Challenge: Build & deploy an app with Streamlit and DevOps tools - KDnuggets


TL;DR -- In this article, I want to share my learnings, process, tools, and frameworks for completing a 12-hour ML challenge. I hope you find it useful for your personal or professional projects. Disclaimer: this is not sponsored by Streamlit, any of the tools I mention, nor any of the firms I work for. Follow me on Medium, LinkedIn, or Twitter. It used to be the time of year when I hung out with my wife and puppy on the couch and binge-watched movies and shows.

Validation Set Over-fitting because of Random Initialization & Selection Bias


Deep Learning models are often called universal function approximators. Their strength comes from their ability to model the relationship between a given input and output. However, this is also their primary weakness when it comes to finding generalizable solutions to a problem, and it is what makes them so prone to over-fitting (memorizing) the training set and failing on new data. The standard method of ensuring the generalization of Deep Learning models is simply to use a validation set to decide how many iterations (epochs) of training should be run -- in other words, early stopping. The Data Scientist would then evaluate the trained model on a blind test set to ensure that none of the training hyper-parameters have also over-fit.
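The early-stopping procedure described above can be sketched in a few lines. This is a minimal illustration, not code from the article: the validation losses are synthetic, and `patience` (how many non-improving epochs to tolerate) is a hypothetical parameter choice.

```python
# Minimal sketch of early stopping: monitor the validation loss each
# epoch and stop once it fails to improve for `patience` epochs in a row.
# In practice the losses would come from evaluating the model on a
# held-out validation set; here they are a synthetic curve.

def early_stopping_epoch(val_losses, patience=3):
    """Return the index of the best epoch, stopping after `patience`
    consecutive epochs without improvement."""
    best_loss = float("inf")
    best_epoch = 0
    waited = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best_loss:
            best_loss = loss
            best_epoch = epoch
            waited = 0
        else:
            waited += 1
            if waited >= patience:
                break  # validation loss has plateaued or worsened
    return best_epoch

# Synthetic curve: the loss improves, then the model starts over-fitting.
losses = [0.9, 0.7, 0.55, 0.5, 0.52, 0.56, 0.6, 0.65]
print(early_stopping_epoch(losses))  # → 3 (the epoch with loss 0.5)
```

The blind test set mentioned above would still be held out entirely: it is only touched once, after the stopping epoch and all other hyper-parameters have been fixed.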

Spectroscopy and Chemometrics News Weekly #7, 2020


LINK "An overview of near-infrared spectroscopy (NIRS) for the detection of insect pests in stored grains"
LINK "A high-throughput quantification of resin and rubber contents in Parthenium argentatum using near-infrared (NIR) spectroscopy"
LINK The latest generation of near-infrared (NIR) spectroscopy systems designed for on-line measurement opens up new possibilities for measuring product properties.
LINK "In situ ripening stages monitoring of Lamuyo pepper using a new generation NIRS sensor"
LINK "Detection of aflatoxin B1 on corn kernel surfaces using visible-near infrared spectra"
LINK "Estimation of soil phosphorus availability via visible and near-infrared spectroscopy"
LINK "Multivariate Classification of Prunus Dulcis Varieties using Leaves of Nursery Plants and Near Infrared Spectroscopy"
LINK "Detection of Dibutyl Phthalate (DBP) Content in Liquor Based on Near Infrared Technology"
LINK "Analysis of incensole acetate in Boswellia species by near infrared ...

What math classes are relevant for machine learning?


Finally, you need to have practical experience. Nothing beats connecting with a community that shares your interests. You can also take part in competitions. Most of this answer comes from previous answers (1 and 2) that I gave on this (Brazilian) site. I must have forgotten lots of great references... Sorry about that.

Inside The Machine Learning that Google Used to Build Meena: A Chatbot that Can Chat About Anything


It seems that every year Google plans to shock the artificial intelligence (AI) world with astonishing new progress in natural language understanding (NLU) systems. Last year, the BERT model definitely stole the headlines of the NLU research space. Just a few weeks into 2020, Google Research published a new paper introducing Meena, a new deep learning model that can power chatbots able to engage in conversations about any domain. NLU has been one of the most active areas of research of the last few years and has produced some of the most widely adopted AI systems to date. However, despite all the progress, most conversational systems remain highly constrained to a specific domain, which contrasts with our ability as humans to naturally converse about different topics.

Improving the Expected Improvement Algorithm

Neural Information Processing Systems

The expected improvement (EI) algorithm is a popular strategy for information collection in optimization under uncertainty. The algorithm is widely known to be too greedy, but nevertheless enjoys wide use due to its simplicity and ability to handle uncertainty and noise in a coherent decision theoretic framework. To provide rigorous insight into EI, we study its properties in a simple setting of Bayesian optimization where the domain consists of a finite grid of points. This is the so-called best-arm identification problem, where the goal is to allocate measurement effort wisely to confidently identify the best arm using a small number of measurements. In this framework, one can show formally that EI is far from optimal.
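In the Gaussian setting the EI score of each grid point has a well-known closed form: with incumbent value f* (the best posterior mean) and posterior mean mu and standard deviation sigma at a point, EI = (mu - f*) * Phi(z) + sigma * phi(z), where z = (mu - f*) / sigma and Phi, phi are the standard normal CDF and PDF. The sketch below illustrates this scoring on a toy three-arm grid; it is a generic illustration of EI, not the paper's analysis, and the posterior means and standard deviations are made-up numbers.

```python
import math

def norm_pdf(z):
    """Standard normal density phi(z)."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    """Standard normal CDF Phi(z), via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(means, stds):
    """EI score for each arm on a finite grid with Gaussian posteriors.

    The incumbent f* is the highest posterior mean; EI measures how much
    an arm's (unknown) value is expected to exceed that incumbent.
    """
    best = max(means)
    scores = []
    for mu, sigma in zip(means, stds):
        if sigma == 0.0:
            scores.append(max(mu - best, 0.0))
            continue
        z = (mu - best) / sigma
        scores.append((mu - best) * norm_cdf(z) + sigma * norm_pdf(z))
    return scores

# Three arms: arm 1 has the highest mean, but arm 2 is uncertain enough
# that EI prefers measuring it over re-measuring the incumbent.
means = [0.2, 1.0, 0.8]
stds = [0.1, 0.1, 0.5]
scores = expected_improvement(means, stds)
print(max(range(len(scores)), key=scores.__getitem__))  # → 2
```

Note that EI does reward uncertainty (arm 2 wins here despite its lower mean); the paper's critique is that this trade-off is still far from the optimal measurement allocation in the best-arm identification setting.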

AIoT - Convergence of Artificial Intelligence with the Internet of Things


Even though the full optimization of AI and the IoT is relatively far away, the two technologies are now being combined across industries in scenarios where problem-solving and information can improve outcomes for all stakeholders. The last such great convergence occurred in the late 1990s, as mobile phones and the internet collided to change the course of human history. The convergence of AI and the IoT will bring about a similar revolution on an even grander scale. The ability to capture data through IoT is a large-scale evolution that has exploded onto the scene over the past five years. These new advancements have been accompanied by new concerns and threats associated with privacy and security.