New AI tools make BI smarter -- and more useful


Companies looking to make good on the promise of machine learning for data analysis are turning to a somewhat unlikely old friend. Business intelligence systems, long used mainly for analyzing past performance, are being retrofitted with artificial intelligence to bring predictive features to their reporting capabilities. The Symphony Post Acute Network is one such organization. The health care company, which has 5,000 beds across 28 facilities in Illinois, Indiana and Wisconsin, wanted to use artificial intelligence and machine learning to improve care for the up to 80,000 patients a year recovering from procedures such as knee surgery or receiving dialysis treatment. For example, buried deep in a patient's medical record could be an indication that the patient is at particular risk of a dangerous fall and therefore requires extra precautions.

[D] Best software to write a machine learning based master thesis • r/MachineLearning


I am studying for an MSc in Applied Statistics, and my master's thesis is titled "Unsupervised learning techniques applied to the classification of gymnasts through the measurement of individual elements from the acrobatic gymnastics discipline". That is, I would need an index, citations, plots, math (for the theoretical background of the machine learning models) and so on. I am going to use Python for development. However, I am still thinking about which software I should use for typesetting. Regarding the first option, the main inconvenience is that the code has to be developed outside LaTeX (for testing) and then pasted into LaTeX, which results in a loss of formatting (at least that is what happened when I first used it) and the consequent time lost redoing it.
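For what it's worth, one common workaround for the copy-paste problem described above (a sketch, not part of the original question) is LaTeX's `listings` package, which pulls the Python source file in directly from disk, so the code in the PDF always matches the tested script. The file name `clustering.py` here is a hypothetical placeholder:

```latex
% Preamble: configure the listings package for Python source.
\usepackage{listings}
\usepackage{xcolor}
\lstset{
  language=Python,
  basicstyle=\ttfamily\small,
  keywordstyle=\color{blue},
  commentstyle=\color{gray},
  numbers=left,
  breaklines=true
}

% In the document body: include the script straight from disk, so any
% edits to clustering.py (hypothetical file name) appear on recompile.
\lstinputlisting[caption={Clustering of gymnasts}]{clustering.py}
```

Because the file is read at compile time, there is no pasting step and therefore no formatting to lose; the `minted` package is a popular alternative when better syntax highlighting is wanted.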

It's all deep learning


Artificial intelligence (AI) stands out as a transformational technology of our digital age -- and its practical application throughout the economy is growing apace. Neural networks are a subset of machine learning techniques, loosely modelling the way that neurons interact in the brain. AI practitioners refer to these techniques as "deep learning". Deep learning requires thousands of data records for models to become relatively good at classification tasks and, in some cases, millions for them to perform at the level of humans. By one estimate, a supervised deep-learning algorithm will achieve acceptable performance with around 5,000 labelled examples per category and will match or exceed human-level performance when trained with a data set containing at least 10 million labelled examples.

Machine Learning


Bootcamp Description: Everything we have learned in software development is undergoing change. Thousands of jobs go unfilled at companies that can't find qualified engineers who know machine learning. TiE's one-day bootcamp will set you on the right path. This is one investment in your career you can't afford to miss. Whether you are just starting out or want to manage a team of machine learning experts, here is your chance to take just one day from your schedule and walk away with real hands-on experience with machine learning concepts.

The importance of Big Data in AI technologies


To say that AI is big data is to overstate things a bit. And yet, without big data, AI wouldn't be where it is today. In the last few decades, the two technologies have advanced in lock-step, largely because without big data, however clever the AI programmers were, they couldn't get past the theoretical stage. Mainly, this comes down to what big data is used for.

Learn TensorFlow Slim(TF-Slim) From Scratch Udemy


Welcome to this course: Learn TensorFlow Slim(TF-Slim) From Scratch. TensorFlow-Slim is a light-weight library for defining, training, and evaluating complex models in TensorFlow. With the TensorFlow-Slim library, we can build, train, and evaluate the model easier by providing lots of high-level layers, variables, and regularizers. At the end of this course, you will be geared up to take on any challenges of implementing TensorFlow-Slim in your machine learning environment.

Open Machine Learning Course. Topic 6. Feature Engineering and Feature Selection


In this course, we have already seen several key machine learning algorithms. However, before moving on to the fancier ones, we'd like to take a small detour and talk about data preparation. The well-known concept of "garbage in -- garbage out" applies 100% to any task in machine learning. Any experienced professional can recall numerous times when a simple model trained on high-quality data proved better than a complicated multi-model ensemble built on data that wasn't clean. This article will contain almost no math, but there will be a fair amount of code. Some examples will use the dataset from the company Renthop, which is used in the Two Sigma Connect: Rental Listing Inquiries Kaggle competition. In this task, you need to predict the popularity of a new rental listing, i.e. classify the listing into one of three classes: ['low', 'medium', 'high']. To evaluate the solutions, we will use the log loss metric (the smaller, the better).
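To make the evaluation metric concrete, here is a minimal sketch of multiclass log loss: the mean negative log of the probability a model assigned to each example's true class. (This is a generic illustration with made-up probabilities, not code from the course; in practice `sklearn.metrics.log_loss` computes the same quantity.)

```python
import numpy as np

def multiclass_log_loss(y_true, y_prob, eps=1e-15):
    """Mean negative log-probability assigned to each true class.

    y_true: list of integer class indices, one per example.
    y_prob: list of per-example probability rows, one column per class.
    """
    y_prob = np.clip(np.asarray(y_prob, dtype=float), eps, 1 - eps)
    # Renormalise rows so probabilities still sum to 1 after clipping.
    y_prob /= y_prob.sum(axis=1, keepdims=True)
    # Pick out the probability of the true class for each example.
    p_true = y_prob[np.arange(len(y_true)), y_true]
    return -np.mean(np.log(p_true))

# Two listings, classes indexed 0='low', 1='medium', 2='high'.
probs = [[0.9, 0.05, 0.05],   # confident and correct
         [0.1, 0.8,  0.1]]    # confident and correct
print(multiclass_log_loss([0, 1], probs))  # -> ~0.164
```

Confidently wrong predictions are punished hard: a single row assigning 0.01 to the true class contributes -log(0.01) ≈ 4.6 on its own, which is why "the smaller, the better" rewards well-calibrated probabilities.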

CLS Architetti and Arup use a portable robot to 3D print a house in Milan


Engineering firm Arup and architecture studio CLS Architetti have used a portable robot to 3D print a concrete house, which is on show for Milan design week. Printed onsite on Milan's Piazza Cesare Beccaria, the 100-square-metre house was formed over the course of a week. Made up of 35 modules, the house features curved walls, a living area, bedroom, kitchen and bathroom. The walls were built by a robot designed by CyBe Construction, a 3D-printing company from the Netherlands, using a special mix of concrete and additives developed by Italcementi, one of the world's largest cement suppliers. The roof, windows and doors were added afterwards.

DHL and IBM report cites benefits and potential of AI in logistics


The advent of Artificial Intelligence (AI) technology has been making inroads over the years in various sectors. But a joint report issued this week by global express and logistics services provider DHL and technology powerhouse IBM takes an in-depth look at the impact of AI within logistics. The report, entitled "Artificial Intelligence in Logistics: A collaborative report by DHL and IBM on implications and use cases for the logistics industry," examines different ways in which AI can be used to augment logistics operations, especially now at a time when leveraging AI is more accessible and affordable than it has been in the past. "Everything can be enhanced through modern technology, and I think AI is at the beginning of really big usefulness," said Ken Allen, CEO of DHL Express, in an interview. "We already have big data and IoT, and this is another part of that."

How Artificial Intelligence Will Impact Corporate Communications


I have seen a glimpse of the future impact of artificial intelligence on corporate communications – and it is good. AI will bring a new level of trust to information, improve the way information is delivered (i.e., via augmented reality and virtual reality apps) and provide better insights and predictive analytics for decision making by corporate communications professionals. My exposure to artificial intelligence has primarily been in the trusted identity technology industry, where AI is starting to revolutionize the digitization of identity and access management, physical access control and cybersecurity, especially as a proactive approach to threat and fraud detection. The management of identities, either physical or digital, is changing rapidly, requiring new ways of thinking to add trust. Trust is an important topic for corporate communications, too.