# Decision Tree Regressor

A decision tree is an algorithm that solves both classification and regression problems.

For classification, follow blogs 1 and 2.

For regression, a decision tree works differently than it does for classification. In classification, the split in a decision tree is decided by entropy or Gini. …

It is one of the most powerful techniques for building predictive models.

Gradient boosting is a machine learning technique for regression, classification, and other tasks. It produces a prediction model in the form of an ensemble of weak prediction models (weak learners), typically decision trees or linear regression.
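The ensemble-of-weak-learners idea can be sketched by hand (toy data and hyperparameters are assumptions here): for squared loss, each new shallow tree fits the residuals of the current ensemble, which is the negative gradient of the loss.

```python
# A hand-rolled gradient-boosting sketch for regression with squared loss.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(1)
X = rng.uniform(0, 4, (100, 1))
y = X.ravel() ** 2 + rng.normal(0, 0.2, 100)

n_rounds, lr = 50, 0.1
pred = np.full_like(y, y.mean())   # start the ensemble from the mean
trees = []
for _ in range(n_rounds):
    residual = y - pred            # negative gradient of squared loss
    t = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    pred += lr * t.predict(X)      # nudge the ensemble toward the residuals
    trees.append(t)

mse = np.mean((y - pred) ** 2)
```

After boosting, the training MSE is far below the variance of `y`, which is what predicting the mean alone would give.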

Weak learners are…

# L1 and L2 Regularization

Regularization is a technique for tuning a function by adding a penalty term to the error function. The added term restrains an excessively fluctuating function so that the coefficients don't take extreme values.
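The penalized error functions can be written out directly (notation is an assumption): L2 (ridge) adds λ·Σθ², while L1 (lasso) adds λ·Σ|θ| to the squared error.

```python
# Minimal sketch of the L1- and L2-penalized error functions.
import numpy as np

def ridge_loss(theta, X, y, lam):
    # mean squared error plus an L2 penalty on the coefficients
    err = X @ theta - y
    return np.mean(err ** 2) + lam * np.sum(theta ** 2)

def lasso_loss(theta, X, y, lam):
    # mean squared error plus an L1 penalty; the L1 term tends to push
    # some coefficients to exactly zero
    err = X @ theta - y
    return np.mean(err ** 2) + lam * np.sum(np.abs(theta))
```

With `lam=0` both reduce to plain mean squared error; increasing `lam` penalizes large coefficients more heavily.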

In simple words, it adds some noise so that our model cannot…

# Mean Absolute Error, L1 loss

It is calculated as the average of the absolute differences between the actual and predicted values of y (the output).

# Introduction to Random Forest

Random Forest works on both classification and regression problem statements.

Random Forest uses N decision trees as base models and gives a sample of the data to each decision tree for prediction.
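The sampling step can be sketched as bagging (toy data and tree settings are assumptions): each tree is trained on a bootstrap sample drawn with replacement, and the final prediction averages the trees.

```python
# Bagging sketch behind a random forest for regression.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(2)
X = rng.uniform(0, 3, (60, 1))
y = 2 * X.ravel() + rng.normal(0, 0.1, 60)

n_trees = 10
forest = []
for _ in range(n_trees):
    idx = rng.randint(0, len(X), len(X))   # bootstrap sample with replacement
    forest.append(DecisionTreeRegressor(max_depth=4).fit(X[idx], y[idx]))

# final prediction = average over the N trees
x_new = np.array([[1.0]])
pred = np.mean([t.predict(x_new)[0] for t in forest])
```

Note that scikit-learn's `RandomForestRegressor` additionally samples a random subset of features at each split; this sketch shows only the data-sampling part described above.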

If you don't know about decision trees, see blogs 1 and 2.

# What is Gini Index?

The Gini Index or Gini Impurity is calculated by subtracting the sum of the squared probabilities of each class from one. It mostly favors larger partitions and is very simple to implement.
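That definition translates directly into a few lines of Python (labels below are made up):

```python
def gini_impurity(labels):
    # 1 minus the sum of squared class probabilities
    n = len(labels)
    probs = [labels.count(c) / n for c in set(labels)]
    return 1 - sum(p ** 2 for p in probs)

g_mixed = gini_impurity(["a", "a", "b", "b"])  # 0.5: maximally mixed, 2 classes
g_pure = gini_impurity(["a", "a", "a", "a"])   # 0.0: pure node
```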

In simple terms, it calculates the probability…

# Introduction to Decision Tree

A decision tree is a decision support tool that uses a tree-like model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility. It is one way to display an algorithm that only contains conditional control statements. (Definition from Wikipedia.)

In simple terms, it has a tree…

# Introduction to Logistic Regression

In statistics, the logistic model is used to model the probability of a certain class or event existing such as pass/fail, win/lose, alive/dead or healthy/sick. This can be extended to model several classes of events such as determining whether an image contains a cat, dog, lion, etc.
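Modeling a pass/fail probability can be sketched with scikit-learn (the hours-studied data below is made up):

```python
# Logistic regression: model P(pass) from hours studied.
import numpy as np
from sklearn.linear_model import LogisticRegression

# hours studied -> pass (1) / fail (0)
X = np.array([[0.5], [1.0], [1.5], [2.0], [3.0], [3.5], [4.0], [4.5]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = LogisticRegression().fit(X, y)
proba = clf.predict_proba([[4.0]])[0, 1]  # probability of class "pass"
```

`predict_proba` returns a probability between 0 and 1, which is what distinguishes the logistic model from a hard yes/no rule.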

Logistic Regression came…

# Linear Regression

Before diving deep into the algorithm, let's talk about:

# What is Linear in Linear Regression?

Linear regression is called 'linear regression' not because the x's (the independent variables) are linear with respect to y (the dependent variable), but because the parameters, the thetas, are.
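One way to see this (toy data below is an assumption): a model like y = θ₀ + θ₁x + θ₂x² is nonlinear in x but still a *linear* model, because it is linear in the thetas, so ordinary least squares solves it exactly.

```python
# Linearity in the parameters: fit y = theta0 + theta1*x + theta2*x^2
# by ordinary least squares, even though y is nonlinear in x.
import numpy as np

rng = np.random.RandomState(3)
x = rng.uniform(-2, 2, 50)
y = 1.0 + 2.0 * x + 3.0 * x ** 2 + rng.normal(0, 0.1, 50)

# design matrix with a nonlinear feature of x -- still linear in theta
A = np.column_stack([np.ones_like(x), x, x ** 2])
theta, *_ = np.linalg.lstsq(A, y, rcond=None)
# theta recovers roughly [1, 2, 3]
```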

If you don't understand the above lines, don't worry. First…

# Naive Bayes Classifier

Before going into the depths of Naive Bayes, let's talk about its name:

# Why is the algorithm called the Naive Bayes' Classifier?

First, the word "classifier": it means using some training data to understand how given input variables relate to the class.

Second, the word "Bayes": the algorithm is based on Bayes' theorem. …
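The theorem behind the name can be sketched with made-up numbers: P(class | features) is proportional to P(class) · P(features | class), and the "naive" part is the assumption that features are independent given the class, so the likelihoods simply multiply.

```python
# Bayes' theorem under the naive independence assumption:
# unnormalized P(class | features) = P(class) * product of P(feature_i | class).
def posterior(prior, likelihoods):
    p = prior
    for l in likelihoods:
        p *= l
    return p

# made-up spam-filter example with two observed binary features
spam = posterior(0.4, [0.8, 0.7])  # P(spam) * P(f1|spam) * P(f2|spam)
ham  = posterior(0.6, [0.1, 0.3])  # P(ham)  * P(f1|ham)  * P(f2|ham)
# spam > ham, so the classifier would label the message as spam
```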