
Why is the error term considered to be of mean Zero in linear regression?


@NSS wrote:

I was studying some of the mathematics behind linear regression and I came across the following.

If we denote the variable we are trying to predict as $Y$ and our covariates as $X$, we may assume there is a relationship of the form $Y = f(X) + \epsilon$, where the error term $\epsilon$ is normally distributed with mean zero: $\epsilon \sim \mathcal{N}(0, \sigma_\epsilon)$.
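To make the setup concrete, here is a minimal simulation sketch of that model, assuming a hypothetical linear $f(X) = 2 + 3X$ and $\sigma_\epsilon = 0.5$ (both values made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate the model from the question: Y = f(X) + eps,
# with a hypothetical f(X) = 2 + 3*X and eps ~ N(0, sigma).
n = 1000
sigma = 0.5
X = rng.uniform(0, 10, size=n)
eps = rng.normal(loc=0.0, scale=sigma, size=n)  # mean-zero error term
Y = 2.0 + 3.0 * X + eps

# Ordinary least squares fit of intercept and slope via least squares.
A = np.column_stack([np.ones(n), X])
beta_hat = np.linalg.lstsq(A, Y, rcond=None)[0]
print(beta_hat)    # roughly [2.0, 3.0]
print(eps.mean())  # close to 0, consistent with E[eps] = 0
```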

Why is the error term assumed to have mean zero?

Is it a mathematical assumption?

And what if the error is not normally distributed, or has a non-zero mean?

Any help would be greatly appreciated.

Thanks
Neeraj
