Channel: Data Science, Analytics and Big Data discussions - Latest topics

Calculating Information gain in Decision Trees while choosing which attribute to split on


@crisis1.08 wrote:

Suppose we choose the attribute to split on by computing the entropy -(p+)log2(p+) - (p-)log2(p-) and comparing the resulting uncertainty. What happens if we encounter a pure set, say 4 Yes / 0 No? In that case the second term becomes 0 * log2(0), i.e. 0 * (-Infinity). Do we just treat that term as 0?
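Yes, the standard convention is to take 0 * log(0) = 0, which matches the limit of -p log p as p approaches 0, so a pure set gets entropy 0. A minimal sketch in Python (the function name `entropy` is mine, not from the thread):

```python
import math

def entropy(pos, neg):
    """Binary entropy of a set with `pos` positive and `neg` negative examples.

    Uses the convention 0 * log(0) = 0, so a pure set has entropy 0.
    """
    total = pos + neg
    if total == 0:
        return 0.0
    h = 0.0
    for count in (pos, neg):
        p = count / total
        if p > 0:  # skip zero-probability terms: lim p->0 of -p*log(p) = 0
            h -= p * math.log2(p)
    return h

print(entropy(4, 0))  # pure set -> 0.0
print(entropy(2, 2))  # evenly split -> 1.0
```

Skipping the p = 0 terms avoids the 0 * (-Infinity) indeterminate form entirely, and the result agrees with the mathematical limit.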

Posts: 7

Participants: 2

