Linear regression is one of the basic supervised learning techniques, used to predict an outcome. In a simple linear regression problem, we find the best-fit line through sample points with one independent variable and one dependent variable. The basic idea is to find a linear function that predicts the values of the dependent variable as a function of the independent variable.


To find the best-fit line, i.e. the linear function that fits the data, we can use the **ordinary least squares** method (minimizing the sum of squared residuals) or the **least absolute deviations** method (minimizing the sum of absolute values of the residuals). Residuals are the vertical distances between the points of the data set and the fitted line (wiki).

Figure: linear best-fit line (blue) for the data points (red); the green segments indicate the errors/residuals. (Source: wiki)

**Least squares method**:- In this approach, the vertical distances between the data set points and the fitted line are computed, and the line is chosen so that the sum of the squares of these distances is minimum.
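As a sketch of this objective (the helper name below is my own, not from the original post), the following snippet computes the sum of squared residuals for a candidate line over the post's dataset, and shows that a line close to the least-squares fit (m ≈ -0.83, c ≈ 10.88 for this data) yields a smaller sum than an arbitrary guess:

```python
def sum_squared_residuals(points, m, c):
    """Sum of squared vertical distances between the points and the line y = m*x + c."""
    return sum((y - (m * x + c)) ** 2 for x, y in points)

points = [(2, 10), (4, 9), (3, 6), (6, 6), (8, 6), (8, 3), (10, 2)]

ssr_fit = sum_squared_residuals(points, -0.83, 10.88)   # near the least-squares line
ssr_guess = sum_squared_residuals(points, -1.0, 12.0)   # an arbitrary candidate line
print(ssr_fit < ssr_guess)  # True: the fitted line has the smaller residual sum
```

Least squares picks, out of all possible (m, c) pairs, the one minimizing this sum.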

**Dataset**:- (x, y) = (2,10), (4,9), (3,6), (6,6), (8,6), (8,3), (10,2)

**Algorithm**: To find the best-fit line (y = mx + c), we have to find the values of m (slope) and c (intercept). Follow the steps below.

1. Compute the mean of the x and y values. Here x̄ and ȳ are the means of the x and y data points.

2. Calculate the slope of the line (linear classifier): m = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²

3. Calculate the intercept of the line: c = ȳ − m·x̄
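The three steps above can be sketched in plain Python (a minimal implementation I wrote for illustration, not code from the original post):

```python
def fit_line(xs, ys):
    """Ordinary least squares fit for one independent variable.
    Returns (slope m, intercept c) of the best-fit line y = m*x + c."""
    n = len(xs)
    x_mean = sum(xs) / float(n)          # step 1: mean of x values
    y_mean = sum(ys) / float(n)          # step 1: mean of y values
    p = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    q = sum((x - x_mean) ** 2 for x in xs)
    m = p / q                            # step 2: slope
    c = y_mean - m * x_mean              # step 3: intercept
    return m, c

xs = [2, 4, 3, 6, 8, 8, 10]
ys = [10, 9, 6, 6, 6, 3, 2]
m, c = fit_line(xs, ys)
print(round(m, 2), round(c, 2))  # -0.83 10.88
```

The same dataset is worked through by hand in the table below, so the hand computation can be checked against this function.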

---

**Use the Python terminal** to find the mean of the x and y data points:

```python
Python 2.7.10 (default, Jul 15 2017, 17:16:57)
[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.31)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
>>> a = [2, 4, 3, 6, 8, 8, 10]
>>> numpy.mean(a)
5.8571428571428568
>>> b = [10, 9, 6, 6, 6, 3, 2]
>>> numpy.mean(b)
6.0
```

x̄ = 5.86, ȳ = 6.0

**Find slope (m):** Pre-process the sample data in the tabular form below and compute the slope of the line.

| iteration # | xi | yi | xi - x̄ | yi - ȳ | (xi - x̄)(yi - ȳ) = P | (xi - x̄)² = Q |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | 2 | 10 | -3.86 | 4 | -15.44 | 14.90 |
| 2 | 4 | 9 | -1.86 | 3 | -5.58 | 3.46 |
| 3 | 3 | 6 | -2.86 | 0 | 0 | 8.18 |
| 4 | 6 | 6 | 0.14 | 0 | 0 | 0.02 |
| 5 | 8 | 6 | 2.14 | 0 | 0 | 4.58 |
| 6 | 8 | 3 | 2.14 | -3 | -6.42 | 4.58 |
| 7 | 10 | 2 | 4.14 | -4 | -16.56 | 17.14 |
| Σ | | | | | -44.00 | 52.86 |

**Slope of line:** m = ΣP / ΣQ = -44.0 / 52.86 ≈ -0.83

**Compute y-intercept:** c = ȳ - m·x̄ = 6.0 - (-0.83 × 5.86) ≈ 10.88

**Now equation of line:** y = -0.83x + 10.88

This best-fit line can now be used to predict the outcome for new (test) data points.
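As a sanity check (this snippet is my addition, not part of the original post), NumPy's `polyfit` with degree 1 performs the same least-squares fit, and the resulting line can predict outcomes for new x values:

```python
import numpy as np

xs = [2, 4, 3, 6, 8, 8, 10]
ys = [10, 9, 6, 6, 6, 3, 2]

# Degree-1 polynomial fit = straight line; returns [slope, intercept].
m, c = np.polyfit(xs, ys, 1)
print(round(m, 2), round(c, 2))  # -0.83 10.88

# Predict the outcome for a new (test) data point, e.g. x = 5:
x_new = 5
y_pred = m * x_new + c
print(round(y_pred, 2))  # 6.71
```

The coefficients match the hand computation above up to rounding.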

Tags:
MLandAI
