Machine Learning-Logistic Regression


Logistic regression, also known as classification, is a subset of supervised learning. Classification models depend on a binary output, and one common way of producing that output is the sigmoid function.

Why the sigmoid function:

The primary reason we choose the sigmoid function is that its output lies strictly between 0 and 1. This makes it especially useful for models where we need to predict a probability as the output: since the probability of anything exists only in the range 0 to 1, the sigmoid is the right choice. In a sense, the sigmoid function is used to draw decision boundaries.
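This behavior is easy to verify numerically. Below is a minimal sketch of the sigmoid in Python: every real input is squashed into the open interval (0, 1), with 0 mapping to 0.5, the usual decision boundary.

```python
import math

def sigmoid(x):
    """Squash any real number into the open interval (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Large negative inputs approach 0, large positive inputs approach 1,
# and 0 maps to exactly 0.5 -- the usual decision boundary.
print(sigmoid(-10))  # close to 0
print(sigmoid(0))    # 0.5
print(sigmoid(10))   # close to 1
```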

Sigmoid function derivation:

The logistic regression model passes a linear combination of the inputs through the sigmoid:

h(x) = 1 / (1 + e^(-x))

where x = X*Θ^T
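The original derivation figure is missing here; as a sketch of what it likely contained, the sigmoid's derivative (used later when training by gradient descent) follows from the quotient rule:

```latex
g(x) = \frac{1}{1 + e^{-x}}
\qquad
g'(x) = \frac{e^{-x}}{(1 + e^{-x})^2}
      = \frac{1}{1 + e^{-x}} \cdot \frac{e^{-x}}{1 + e^{-x}}
      = g(x)\,\bigl(1 - g(x)\bigr)
```

This compact form, g'(x) = g(x)(1 − g(x)), is what makes the gradient of the logistic cost function cheap to compute.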

Quick math: log(0) = −∞; log(1) = 0

Equation of the cost function when the expected output is 1:

Cost(h(x), y) = −log(h(x))

Equation of the cost function when the expected output is 0:

Cost(h(x), y) = −log(1 − h(x))

We can combine the above two equations as shown below, and the result serves as the overall cost function of a logistic model:

J(Θ) = −(1/m) Σ [ y log(h(x)) + (1 − y) log(1 − h(x)) ]
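A minimal sketch of the per-example cost makes the behavior concrete: when the label is 1, only the −log(h) term is active; when the label is 0, only the −log(1 − h) term is.

```python
import math

def cost(h, y):
    """Cross-entropy cost for one prediction h in (0, 1) and label y in {0, 1}."""
    return -(y * math.log(h) + (1 - y) * math.log(1 - h))

# Confident and correct (h near 1, y = 1) -> cost near 0.
print(cost(0.99, 1))
# Confident and wrong (h near 0, y = 1) -> very large cost,
# approaching infinity as h approaches 0.
print(cost(0.01, 1))
```

Averaging this quantity over all m training examples gives the overall J(Θ) above.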

Using this cost function, we can conclude that the error approaches infinity when the expected output and the predicted output disagree completely, and the error is zero when the expected output and the predicted output are the same.



Scenario: identify whether a given image of a digit is a 1, 2, 3, or 4

Input: 1

We have to train a model with a large amount of labeled data consisting of the pixel values of each image. The concept required here is called 'one vs. all': for each of the four categories, a binary classifier estimates the probability that the input falls under that category. When the input is given to the model, it returns an array of probabilities of belonging to each class of numbers, which might look like [0.9, 0.1, 0.3, 0.2], corresponding to the classes [1, 2, 3, 4]. Hence, it can identify the number as 1.
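The final decision step can be sketched as follows. The probability values here are hypothetical, standing in for the outputs of four trained binary logistic regression models (one per class); the prediction is simply the class whose classifier is most confident.

```python
# One-vs-all decision: the example probabilities are hypothetical
# placeholders for the outputs of four trained binary classifiers.
classes = [1, 2, 3, 4]
probabilities = [0.9, 0.1, 0.3, 0.2]  # P(input belongs to each class)

# Pick the class whose binary classifier reports the highest probability.
best_index = max(range(len(classes)), key=lambda i: probabilities[i])
predicted = classes[best_index]
print(predicted)  # 1
```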