Question
Consider the training examples shown in the following table for a binary classification problem. What is the entropy of this collection of training examples with respect to the target class variable? What are the information gains of a_1 and a_2 relative to these training examples? For a_3, which is a continuous attribute, compute the information gain for every possible split. What is the best split (among a_1, a_2, and a_3) according to the information gain? What is the best split (between a_1 and a_2) according to the Gini index?
Explanation / Answer
a)
Entropy = -(4/8) log2(4/8) - (4/8) log2(4/8) = 1
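The arithmetic above follows from the collection containing 4 positive and 4 negative examples out of 8 (the original data table did not survive here, but the 4/8 fractions imply this split). A minimal sketch of the entropy and information-gain computations, with the gain demo using hypothetical child-node counts since the attribute values are not available:

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a list of class counts."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total)
                for c in counts if c > 0)

def info_gain(parent_counts, children_counts):
    """Information gain = parent entropy minus the
    weighted average entropy of the child partitions."""
    total = sum(parent_counts)
    weighted = sum(sum(child) / total * entropy(child)
                   for child in children_counts)
    return entropy(parent_counts) - weighted

# Part a): 4 positive, 4 negative examples -> entropy 1.0
print(entropy([4, 4]))  # 1.0

# Hypothetical split into [3+, 1-] and [1+, 3-] child nodes,
# purely to illustrate the information-gain formula:
print(round(info_gain([4, 4], [[3, 1], [1, 3]]), 4))
```

The same `info_gain` helper applies to the a_1, a_2, and a_3 parts of the question once the counts are read off the table; for the continuous attribute a_3, each candidate split threshold yields its own pair of child counts.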