Human biases cause problems for machines trying to learn chemistry


Machine learning models are a valuable tool in chemical synthesis, but they are trained on data from the literature, where positive results are favoured and the dark reactions – experiments that were tried but didn't work – are usually left out. 'Including these failures is essential for generating predictive machine learning models,' says Joshua Schrier of Fordham University, US, part of a team that studied hydrothermal syntheses of amine-templated metal oxides and found that people's choices of reaction parameters introduce biases into the literature.

The team found that models trained on a small randomised sample of reactions outperformed those trained on larger human-selected datasets. The result underlines the importance of including experimental outcomes that people might dismiss as unimportant when developing predictive tools for chemists. 'We considered extra dark reactions – a class of reactions that humans don't even attempt, not because of scientific or practical reasons, but simply because it's humans who make the decisions,' Schrier says.
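The effect described above can be illustrated with a toy sketch. The data, the success rule, and the threshold model below are all hypothetical inventions for illustration, not the team's actual method or dataset: a "literature" set that keeps only successful reactions lets a model learn nothing about failure, while a small randomised sample containing dark reactions recovers the true success boundary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reactions with one parameter (say, a normalised temperature);
# a reaction "succeeds" when the parameter exceeds 0.5.
X = rng.uniform(0, 1, 1000)
y = X > 0.5

# "Literature" set: large, but only successes survive publication bias.
# A model fit to it can only ever learn "reactions succeed".
def lit_predict(x):
    return np.ones_like(x, dtype=bool)

# Small randomised set: includes failed (dark) reactions, so a success
# threshold is learnable, e.g. the midpoint between the lowest success
# and the highest failure observed in the sample.
idx = rng.choice(1000, 50, replace=False)
thr = 0.5 * (X[idx][y[idx]].min() + X[idx][~y[idx]].max())

def rand_predict(x):
    return x > thr

# Evaluate both on the full set of reactions, failures included.
acc_lit = (lit_predict(X) == y).mean()    # near 0.5: failures all missed
acc_rand = (rand_predict(X) == y).mean()  # near 1.0: boundary recovered
```

The smaller randomised sample wins because it preserves the information the human-curated set discards: where reactions stop working.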
