Previously, we discussed Machine Learning and its subsets -- Supervised Learning, Unsupervised Learning, and Reinforcement Learning. If you missed that post, you can find it at the following link: Branches of Artificial Intelligence. I'm almost certain you now want to learn about these branches in greater detail. Worry not, I'll open the gates to these subsets in the posts to come.
Linear and Logistic regressions are usually the first modelling algorithms that people learn for Machine Learning and Data Science. Both are great since they're easy to use and interpret. However, their inherent simplicity also comes with a few drawbacks and in many cases they're not really the best choice of regression model. There are in fact several different types of regressions, each with their own strengths and weaknesses. In this post, we're going to look at 7 of the most common types of regression algorithms and their properties.
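To ground the two starting points mentioned above, here is a minimal sketch of both: ordinary least squares fit in closed form, and logistic regression trained with plain gradient descent. The toy data, learning rate, and iteration count are all illustrative assumptions, not anything specific to this post.

```python
import numpy as np

# --- Linear regression on toy data lying exactly on y = 2x + 1 ---
X = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 * X + 1.0

# Add an intercept column and solve the least-squares problem
A = np.column_stack([np.ones_like(X), X])
(intercept, slope), *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"slope={slope:.2f}, intercept={intercept:.2f}")  # ~2.00 and ~1.00

# --- Logistic regression on a tiny 1-D binary classification problem ---
Xc = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
yc = np.array([0, 0, 0, 1, 1, 1])

w, b = 0.0, 0.0
lr = 0.5  # illustrative learning rate
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * Xc + b)))  # sigmoid probabilities
    # Gradients of the average log loss with respect to w and b
    w -= lr * np.mean((p - yc) * Xc)
    b -= lr * np.mean(p - yc)

preds = (1.0 / (1.0 + np.exp(-(w * Xc + b))) > 0.5).astype(int)
print(preds)
```

The contrast is the point: linear regression predicts a continuous value and has a closed-form solution, while logistic regression squashes a linear score through a sigmoid to model class probabilities and is typically fit iteratively.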