Deconstructing Linear Regression
Submitted by Vishal (@vishalgokhale) on Sunday, 14 June 2015
This short talk aims to “deconstruct” Linear Regression and explain the steps the library functions perform before handing back the intercept and slope.
We usually use linear regression when we know that our dependent variable has a linear relationship with the independent variable. We use the library functions to identify the parameters and move on.
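That workflow is often a one-liner. A minimal sketch with NumPy (the data here is made up for illustration; a real dataset would replace it):

```python
import numpy as np

# Toy data lying exactly on y = 2x + 1 (made up for illustration)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 * x + 1.0

# Fit a degree-1 polynomial: polyfit returns [slope, intercept]
slope, intercept = np.polyfit(x, y, 1)
```

We read off the slope and intercept and move on, without ever seeing how they were chosen.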
But how does the library function choose a single line, out of the infinite possibilities?
How does it know that the line it chooses is the one that fits the data best?
Or rather, what is a best fit, in the first place?
What if the technique it uses has some inherent flaws… could knowing them guide me to a smarter choice of model?
Have these questions come to your mind? Are you still in search of the answers? If yes, this talk is for you.
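As a small teaser of what the talk will unpack: under the ordinary least-squares definition of “best fit” (minimising the sum of squared vertical residuals), the line has a closed-form answer derived with basic calculus. A plain-Python sketch (function and variable names are mine, not from any library):

```python
def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x.

    Minimises sum((y_i - a - b*x_i)^2). Setting the partial
    derivatives with respect to a and b to zero gives:
        b = sum((x - xbar)*(y - ybar)) / sum((x - xbar)**2)
        a = ybar - b * xbar
    """
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    return a, b

# Noisy made-up data, roughly linear
a, b = ols_fit([1, 2, 3, 4, 5], [3.1, 4.9, 7.2, 9.0, 10.8])
```

Where those two formulas come from, and what they quietly assume, is exactly the kind of chunk the talk digests.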
We won’t go into the depths of all the techniques and everything under the sky about Linear Regression.
We’ll just scratch the surface and digest one small chunk of Simple Linear Regression.
Enough, though, to get curious minds wriggling!
Participants: Familiarity with fundamental calculus.
Infra: White-board, duster and markers.
I am a Java programmer and a stats/math enthusiast with 10+ years of coding experience, including nearly 5 years of working with data scientists and wildlife biologists.
As much as I love learning techniques (like Linear Regression, for instance), I also love learning the derivations and the philosophy involved. And just in case it is not evident, I love to talk about that too.