Linear Regression & Gradient Descent

Linear Regression Definition Linear Regression is one of the simplest supervised learning algorithms; it is used to predict a continuous value. The ALVINN autonomous-driving video shown in the previous post, Introduction to Machine Learning, was a supervised learning problem, and specifically a regression problem, because the output Y you want to predict is a continuous value. Rather than starting from a self-driving car, which is quite complicated, let’s build up a supervised learning algorithm using a simpler example. ...
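As a quick taste of what the full post covers, here is a minimal sketch of linear regression trained with batch gradient descent on a toy dataset. The data, learning rate, and iteration count are illustrative choices, not taken from the post itself.

```python
# Toy data generated from y = 2x + 1 (no noise), so gradient descent
# should recover an intercept near 1 and a slope near 2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]

theta0, theta1 = 0.0, 0.0  # intercept and slope, initialized at zero
alpha = 0.05               # learning rate (an assumed, illustrative value)

for _ in range(2000):
    # Batch gradient of the mean squared-error cost for h(x) = theta0 + theta1 * x
    g0 = sum((theta0 + theta1 * x) - y for x, y in zip(xs, ys)) / len(xs)
    g1 = sum(((theta0 + theta1 * x) - y) * x for x, y in zip(xs, ys)) / len(xs)
    theta0 -= alpha * g0
    theta1 -= alpha * g1

print(round(theta0, 2), round(theta1, 2))  # converges toward 1.0 2.0
```

Each iteration uses the whole dataset to compute the gradient, which is the "batch" flavor of gradient descent discussed in the post.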

March 8, 2025 · 25 min · 5115 words · _AminVilan
Locally Weighted and Logistic Regression

Locally Weighted Regression Locally weighted regression is a way to modify linear regression so that it can fit very non-linear functions, rather than only straight lines. What if the data isn’t fit well by a straight line? How can we address this problem? We use an idea called locally weighted regression, also known as locally weighted linear regression or LWR. The shape of such data may be clear to the eye, but how do you fit a curve like that? It’s actually quite difficult to hand-pick features: is it $\sqrt{x}$, or $\log x$, or $x^{\frac{2}{3}}$, and what set of features lets you do this? ...
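The idea sketched in this excerpt can be written in a few lines: to predict at a query point, weight each training example by how close it is to that point (a Gaussian kernel is one common choice), then solve a weighted least-squares fit. The kernel, bandwidth `tau`, and data below are illustrative assumptions, not details from the post.

```python
import math

def lwr_predict(xs, ys, x_q, tau=0.5):
    """Locally weighted linear regression at query point x_q.

    Each example gets weight w_i = exp(-(x_i - x_q)^2 / (2 * tau^2)),
    then we solve the weighted normal equations for h(x) = theta0 + theta1 * x.
    """
    ws = [math.exp(-((x - x_q) ** 2) / (2 * tau ** 2)) for x in xs]
    sw = sum(ws)
    sx = sum(w * x for w, x in zip(ws, xs))
    sy = sum(w * y for w, y in zip(ws, ys))
    sxx = sum(w * x * x for w, x in zip(ws, xs))
    sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    theta1 = (sw * sxy - sx * sy) / (sw * sxx - sx * sx)
    theta0 = (sy - theta1 * sx) / sw
    return theta0 + theta1 * x_q

# Example: noise-free sqrt data. No feature like sqrt(x) is hand-picked,
# yet the local linear fit near x = 2 lands close to sqrt(2).
xs = [0.25 * i for i in range(1, 17)]
ys = [math.sqrt(x) for x in xs]
print(round(lwr_predict(xs, ys, 2.0), 3))
```

Because the fit is recomputed for every query point, LWR sidesteps the feature-selection question raised above at the cost of keeping the whole training set around at prediction time.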

March 9, 2025 · 28 min · 5760 words · _AminVilan
Perceptron and Generalized Linear Model

Perceptron The perceptron algorithm is not something that is widely used in practice; we study it mostly for historical reasons, and because it is nice and simple and easy to analyze. As you know, Logistic Regression uses the sigmoid function, which essentially squeezes the entire real line, from $-\infty$ to $+\infty$, into the interval between 0 and 1, where the output represents the probability that the label is 1. ...

March 12, 2025 · 25 min · 5140 words · _AminVilan