This page challenges you to answer a series of multiple-choice exercises about the material discussed in the "Training Decision Trees" unit.
Question 1
What are the effects of replacing the numerical features with their negative values (for example, changing the value +8 to -8) when using the exact numerical splitter?
- The same conditions will be learned; only the positive/negative children will be switched.

  Fantastic.

- Different conditions will be learned, but the overall structure of the decision tree will remain the same.

  If the features change, then the conditions will change.

- The structure of the decision tree will be completely different.

  The structure of the decision tree will actually be pretty much the same. The conditions will change, though.
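
The intuition behind the correct answer can be checked empirically. Below is a minimal sketch (not from the unit; the helper names and the midpoint-based exact splitter are assumptions) showing that negating a numerical feature mirrors the best threshold, so the same examples land in the two children, only with the positive/negative sides swapped:

```python
import math

def entropy(labels):
    """Binary Shannon entropy (base 2) of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return sum(-q * math.log2(q) for q in (p, 1 - p) if q > 0)

def best_exact_split(feature, labels):
    """Try every midpoint between consecutive feature values; return the
    split with the highest information gain and the resulting children."""
    best = None
    values = sorted(set(feature))
    for lo, hi in zip(values, values[1:]):
        threshold = (lo + hi) / 2
        pos_side = [l for x, l in zip(feature, labels) if x >= threshold]
        neg_side = [l for x, l in zip(feature, labels) if x < threshold]
        gain = entropy(labels) - (
            len(pos_side) / len(labels) * entropy(pos_side)
            + len(neg_side) / len(labels) * entropy(neg_side)
        )
        if best is None or gain > best[0]:
            best = (gain, threshold, pos_side, neg_side)
    return best

feature = [1.0, 2.0, 3.0, 8.0, 9.0, 10.0]
labels = [0, 0, 0, 1, 1, 1]

gain, threshold, pos, neg = best_exact_split(feature, labels)
gain_neg, threshold_neg, pos_neg, neg_neg = best_exact_split(
    [-x for x in feature], labels)

print(gain == gain_neg)                   # True: same information gain
print(threshold, threshold_neg)           # 5.5 and -5.5: mirrored threshold
print(pos == neg_neg and neg == pos_neg)  # True: the children are swapped
```
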
Question 2
Which two answers best describe the effect of testing only half (randomly selected) of the candidate threshold values in X?
- The information gain would be higher or equal.
- The information gain would be lower or equal.

  Well done.

- The final decision tree would have worse testing accuracy.
- The final decision tree would have no better training accuracy.

  Well done.
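
The reasoning behind the two correct answers is that the splitter maximizes information gain over the candidate thresholds it actually evaluates, and a maximum taken over a random half of the candidates can never exceed the maximum over all of them. A minimal sketch of that argument, using made-up gain values:

```python
import random

# Hypothetical information-gain values, one per candidate threshold in X.
gains = [0.02, 0.10, 0.31, 0.05, 0.27, 0.18]

best_full = max(gains)

# Test only half of the candidate thresholds, selected at random.
random.seed(0)
sampled = random.sample(gains, k=len(gains) // 2)
best_sampled = max(sampled)

# The maximum over a subset is always <= the maximum over the full set,
# so the selected split's information gain is lower or equal.
assert best_sampled <= best_full
print(best_full, best_sampled)
```
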
Question 3
What would happen if the "information gain" versus "threshold" curve
had multiple local maxima?
- It is impossible to have multiple local maxima.

  Multiple local maxima are possible.

- The algorithm would select the local maximum with the smallest threshold value.
- The algorithm would select the global maximum.

  Well done.
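
As an illustration, the sketch below (toy data and illustrative helper names, not from the unit) scores every candidate threshold on a small feature whose gain-versus-threshold curve has two local maxima, and shows that taking the argmax selects the global maximum:

```python
import math

def entropy(labels):
    """Binary Shannon entropy (base 2) of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return sum(-q * math.log2(q) for q in (p, 1 - p) if q > 0)

def information_gain(feature, labels, threshold):
    """Information gain of splitting on `feature >= threshold`."""
    right = [l for x, l in zip(feature, labels) if x >= threshold]
    left = [l for x, l in zip(feature, labels) if x < threshold]
    weighted = (len(left) * entropy(left)
                + len(right) * entropy(right)) / len(labels)
    return entropy(labels) - weighted

# Toy feature chosen so the gain-vs-threshold curve has two local maxima.
feature = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
labels = [1, 1, 1, 0, 0, 0, 1, 1, 0, 0]

thresholds = [x + 0.5 for x in feature[:-1]]  # midpoints between values
gains = [information_gain(feature, labels, t) for t in thresholds]

for t, g in zip(thresholds, gains):
    print(f"threshold {t:>4} -> gain {g:.3f}")

# The splitter keeps the global maximum, not just any local maximum.
best = max(zip(gains, thresholds))
print("selected threshold:", best[1])  # 3.5 in this toy example
```
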
Question 4
Compute the information gain of the following split:
Node | # of positive examples | # of negative examples
---|---|---
parent node | 10 | 6
first child | 8 | 2
second child | 2 | 4
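
For reference, here is a minimal sketch of how this information gain could be computed, assuming binary Shannon entropy with log base 2 (the function and variable names are illustrative):

```python
import math

def binary_entropy(num_pos, num_neg):
    """Shannon entropy (base 2) of a node with the given label counts."""
    total = num_pos + num_neg
    entropy = 0.0
    for count in (num_pos, num_neg):
        p = count / total
        if p > 0:
            entropy -= p * math.log2(p)
    return entropy

parent = (10, 6)          # 10 positive, 6 negative examples
children = [(8, 2), (2, 4)]

parent_size = sum(parent)
parent_entropy = binary_entropy(*parent)

# Weighted average of the children's entropies.
children_entropy = sum(
    (sum(child) / parent_size) * binary_entropy(*child) for child in children
)

information_gain = parent_entropy - children_entropy
print(f"parent entropy   : {parent_entropy:.3f}")    # ~0.954
print(f"children entropy : {children_entropy:.3f}")  # ~0.796
print(f"information gain : {information_gain:.3f}")  # ~0.159
```
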