
Amazon.com: Probability and Statistics for Data Science: Math + R + Data (Chapman & Hall/CRC Data Science Series) (9781138393295): Matloff, Norman: Books

#artificialintelligence

I believe that the book describes itself quite well when it says: "Mathematically correct yet highly intuitive… This book would be great for a class that one takes before one takes my statistical learning class. I often run into beginning graduate Data Science students whose background is not math (e.g., CS or Business) and they are not ready… The book fills an important niche, in that it provides a self-contained introduction to material that is useful for a higher-level statistical learning course. I think that it compares well with competing books, particularly in that it takes a more 'Data Science' and 'example driven' approach than more classical books." "This text by Matloff (Univ. of California, Davis) affords an excellent introduction to statistics for the data science student… Its examples are often drawn from data science applications such as hidden Markov models and remote sensing, to name a few… All the models and concepts are explained well in precise mathematical terms (not presented as formal proofs), to help students gain an intuitive understanding."


Nine Firms Changing Real Estate With Artificial Intelligence

#artificialintelligence

Real estate is not a new industry. It has taken shape over centuries to make the risky and important process of buying, selling, and leasing property safer and easier. Many technologies have been adopted to help property professionals do their jobs more efficiently. Websites, email, spreadsheets, CRMs, valuation calculators, search engines: all of these have changed the way real estate is transacted. But it seems we are approaching the limit of how far such technology can help humans work more efficiently.


New system cleans messy data tables automatically

#artificialintelligence

MIT researchers have created a new system that automatically cleans "dirty data" -- the typos, duplicates, missing values, misspellings, and inconsistencies dreaded by data analysts, data engineers, and data scientists. The system, called PClean, is the latest in a series of domain-specific probabilistic programming languages written by researchers at the Probabilistic Computing Project that aim to simplify and automate the development of AI applications (others include one for 3D perception via inverse graphics and another for modeling time series and databases). According to surveys conducted by Anaconda and Figure Eight, data cleaning can take a quarter of a data scientist's time. Automating the task is challenging because different datasets require different types of cleaning, and common-sense judgment calls about objects in the world are often needed (e.g., which of several cities called "Beverly Hills" someone lives in). PClean provides generic common-sense models for these kinds of judgment calls that can be customized to specific databases and types of errors.
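PClean itself is a probabilistic programming language, and its models go well beyond hand-written rules. As a hedged illustration only of the kinds of fixes it automates (duplicates, missing values, misspellings, and the "which Beverly Hills?" style of judgment call), here is a minimal pandas sketch; the table, the `corrections` map, and the fill strategy are all invented for this example:

```python
import pandas as pd

# Toy "dirty" table: an exact duplicate row, a misspelling, and a missing value
df = pd.DataFrame({
    "city": ["Beverly Hills", "Beverly Hills", "Beverley Hills", None],
    "state": ["CA", "CA", "CA", "FL"],
})

# 1. Drop exact duplicate rows
df = df.drop_duplicates()

# 2. Apply a hand-maintained correction table for known misspellings
corrections = {"Beverley Hills": "Beverly Hills"}
df["city"] = df["city"].replace(corrections)

# 3. Fill missing cities with the most common value (a crude stand-in for
#    the probabilistic inference PClean performs over a whole database)
df["city"] = df["city"].fillna(df["city"].mode()[0])
```

The point of PClean is precisely that these rules do not have to be written by hand for every dataset: the system infers likely corrections from a customizable common-sense model.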


Artificial Intelligence and Machine Learning Drive the Future of Supply Chain Logistics

#artificialintelligence

Artificial intelligence (AI) is more accessible than ever and is increasingly used to improve business operations and outcomes, not only in transportation and logistics management but also in diverse fields like finance, healthcare, and retail. An Oxford Economics and NTT DATA survey of 1,000 business leaders conducted in early 2020 reveals that 96% of companies were at least researching AI solutions, and over 70% had either fully implemented or at least piloted the technology. Nearly half of survey respondents said failure to implement AI would cause them to lose customers, with 44% reporting their company's bottom line would suffer without it. Simply put, AI enables companies to parse vast quantities of business data and make well-informed, critical business decisions fast. And the transportation management industry specifically is using this intelligence and its companion technology, machine learning (ML), to gain greater process efficiency and performance visibility, driving impactful changes that bolster the bottom line.



C3.ai: Differentiated And Highly Efficient AI

#artificialintelligence

A deep-down analysis reveals that the company indeed has some key … the company's highly efficient machine learning process in the medium term.


Principal Component Analysis Demystified

#artificialintelligence

We see that the "Post Weekday" column has low variance and the "Lifetime Post Total Reach" column has comparatively high variance. Therefore, if we apply PCA without standardizing the data, more weight will be given to the "Lifetime Post Total Reach" column during the calculation of the eigenvectors and eigenvalues, and we will get biased principal components. Now we will standardize the dataset using the RobustScaler from the sklearn library. sklearn provides other scalers as well, such as StandardScaler and MinMaxScaler, which can be chosen as required. Unless specified otherwise, the number of principal components will equal the number of attributes.
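The effect described above can be demonstrated with a small sketch. The two columns below are synthetic stand-ins for "Post Weekday" and "Lifetime Post Total Reach" (the article's actual dataset is not reproduced here); without scaling, the high-variance column dominates the first principal component, and after RobustScaler the components are far more balanced:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import RobustScaler

rng = np.random.default_rng(0)
# Synthetic columns: one low-variance (weekday 1-7), one high-variance (reach)
X = np.column_stack([
    rng.integers(1, 8, 200),           # "Post Weekday": tiny variance
    rng.normal(20000, 5000, 200),      # "Lifetime Post Total Reach": huge variance
])

# PCA on raw data: the reach column dominates the first component
pca_raw = PCA().fit(X)

# RobustScaler centers on the median and scales by the interquartile range
X_scaled = RobustScaler().fit_transform(X)
pca_scaled = PCA().fit(X_scaled)  # n_components defaults to the number of attributes

print(pca_raw.explained_variance_ratio_)     # first ratio is nearly 1.0
print(pca_scaled.explained_variance_ratio_)  # variance spread across components
```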


Machine Learning Model Dashboard

#artificialintelligence

Nowadays, creating a machine learning model is easy thanks to the many Python libraries available, such as sklearn and lazypredict. These libraries are easy to use and can create different types of models, along with visualizations and measures of model performance. If you don't know how lazypredict works, check out the article given below. The main challenge nowadays is that models are not easily interpreted, which makes it difficult for a non-technical person to understand the logic behind how a model works. Explainer dashboard is an open-source Python library that creates machine learning model dashboards that can be used to easily understand and analyze the important factors a model relies on, such as feature importance, model performance, and visualizations.
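As a minimal sketch of the kind of summary such a dashboard surfaces, the snippet below trains a model on a standard sklearn dataset and extracts its accuracy and top feature importances by hand. The dataset and model choice are assumptions for illustration, not taken from the article:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Train a simple classifier on a built-in dataset
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Two of the summaries a model dashboard presents interactively:
accuracy = model.score(X_test, y_test)
top = sorted(zip(X.columns, model.feature_importances_),
             key=lambda t: -t[1])[:5]  # five most important features

# With the explainerdashboard package installed, the same model could be
# served as an interactive dashboard, roughly (untested sketch):
# from explainerdashboard import ClassifierExplainer, ExplainerDashboard
# ExplainerDashboard(ClassifierExplainer(model, X_test, y_test)).run()
```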


Computer deciphers brain signals of imagined writing

#artificialintelligence

This is an Inside Science story. A man paralyzed below the neck can imagine writing by hand and, with the help of artificial intelligence software, use electronics hooked up to his brain to translate his mental handwriting into words at speeds comparable to typing on a smartphone, a new study finds. By helping convert thoughts into actions, brain-computer interfaces can help people move or speak. Recently, scientists have sought to help people with disabilities communicate by using these mind-machine interfaces to move a cursor on a screen to point and click on letters on a keyboard. The previous speed record for typing with such devices was about 40 characters per minute.


Brain implants turn imagined handwriting into text on a screen / Humans + Tech - #80

#artificialintelligence

Researchers implanted tiny electrodes on the surface of the brain of a man paralysed from the neck down. As he imagined writing letters with his hand, the researchers analysed the neural patterns for each letter. They created an algorithm that transformed these neural patterns into words on a screen [Anushree Dave, ScienceNews]. From his brain activity alone, the participant produced 90 characters, or 15 words, per minute, Krishna Shenoy, a Howard Hughes Medical Institute investigator at Stanford University, and colleagues report May 12 in Nature.