Building your first machine learning model using KNIME (no coding)


One of the biggest challenges for beginners in machine learning and data science is that there is too much to learn at once, especially if you do not know how to code. You need to quickly get comfortable with linear algebra, statistics, and other mathematical concepts, and learn how to code them all at the same time, which can be overwhelming for newcomers. If you have no coding background and find this difficult to cope with, you can start learning data science with a GUI-driven tool.



KNIME is an open-source data analytics platform for data science, ML, AI, AutoML, big data, and more. This is a course for business enthusiasts looking for data-driven decision-making techniques for different business scenarios. It provides a basic to intermediate understanding of different machine learning algorithms and how they can be implemented in KNIME. It also teaches students how to evaluate the different machine learning algorithms and judge which ones fit a given business scenario. KNIME is free, powerful software with a vast number of business use cases.

Introduction to Components with Knime Analytics - Analytics Vidhya


This article was published as a part of the Data Science Blogathon. In the last article, A Friendly Introduction to KNIME Analytics Platform, I provided a brief insight into the open-source software KNIME Analytics Platform and what it is capable of. With the help of a customer segmentation example, I showed the general functions of KNIME Analytics Platform. This article takes up a topic that was briefly mentioned at the end of the last article: components. I'll provide an in-depth explanation of what components are, what functionalities they have, and why they are useful.

Boosting the Assembly and Deployment of Artificial Intelligence Solutions with KNIME Visual Data Science Tools Amazon Web Services


With rapid advancements in machine learning (ML) techniques over the past decade, intelligent decision-making and prediction systems are poised to transform productivity and lead to significant economic gains. A study conducted by PwC Global concludes that by the end of this decade, the total positive impact of artificial intelligence (AI) on the global economy could exceed $15 trillion, driven mostly by enhancements in consumer products. To make that happen, however, businesses must make strategic investments in the kind of technology that moves AI projects into production (productionizing) and helps customers deploy them. Unfortunately, PwC's survey reveals that the percentage of executives planning to deploy AI dropped from 20 percent a year earlier to only 4 percent at the beginning of 2020. The primary reason for this decrease is the gap between the growing volume of data and data-driven modeling capabilities on one side, and the necessary skills and toolsets on the other.

Data Analytics and Mining for Dummies – Data Science Blog (English only)


Data Analytics and Mining is often perceived as an extremely tricky task cut out for data analysts and data scientists with thorough knowledge spanning several domains, such as mathematics, statistics, computer algorithms, and programming. However, several tools available today make it possible for novice programmers, or people with absolutely no algorithmic or programming expertise, to carry out data analytics and mining. One such tool, which is very powerful and provides a graphical user interface and an assembly of nodes for ETL (Extraction, Transformation, Loading), modeling, data analysis, and visualization with little or no programming, is the KNIME Analytics Platform. KNIME, the Konstanz Information Miner, was developed at the University of Konstanz and is now supported by a large international community of developers. Originally made for commercial use, KNIME is now available as open-source software; it has been used extensively in pharmaceutical research since 2006 and is also a powerful data mining tool for the financial data sector. It is frequently used in the Business Intelligence (BI) sector as well.