Decision trees: Check your understanding
This page challenges you to answer a series of multiple choice exercises about the material discussed in the "Decision trees" unit.
Question 1
The inference of a decision tree runs by routing an example...
from the leaf to the root.
All inference starts from the root (the first condition).
from one leaf to another.
All inference starts from the root, not from a leaf.
from the root to the leaf.
Well done!
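The root-to-leaf routing described above can be sketched as a walk down a binary tree. The node classes, feature names, and thresholds here are illustrative, not taken from the quiz:

```python
# Minimal sketch of decision-tree inference: route an example from the
# root down to a leaf. Node layout and thresholds are made up for
# illustration.

class Leaf:
    def __init__(self, prediction):
        self.prediction = prediction

class Node:
    def __init__(self, feature, threshold, yes_branch, no_branch):
        self.feature = feature        # feature tested by this condition
        self.threshold = threshold    # axis-aligned test: feature >= threshold
        self.yes_branch = yes_branch
        self.no_branch = no_branch

def predict(node, example):
    """Start at the root and follow conditions until reaching a leaf."""
    while isinstance(node, Node):
        goes_yes = example[node.feature] >= node.threshold
        node = node.yes_branch if goes_yes else node.no_branch
    return node.prediction

# A tiny two-level tree.
tree = Node("x1", 1.0,
            Node("x2", 0.5, Leaf("orange"), Leaf("blue")),
            Leaf("blue"))
print(predict(tree, {"x1": 2.0, "x2": 0.7}))
```

Note that `predict` never starts from a leaf: inference always begins at the root condition and only terminates when a leaf is reached.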
Question 2
Do all conditions involve only a single feature?
Yes.
Oblique conditions test multiple features.
No.
Although axis-aligned conditions only involve a single feature, oblique conditions involve multiple features.
Question 3
Consider the following prediction map on two features x1 and x2:

Which of the following decision trees matches the prediction map?
Decision Tree A.
Yes!
Decision Tree B.
If the condition x2 ≥ 0.5 is no, the leaf may or may not be blue, so this is a bad condition.
Decision Tree C.
If x1 is not ≥ 1.0, the leaf should be "blue" rather than "orange", so this is the wrong leaf.
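One way to check whether a candidate tree matches a prediction map is to sample a grid of (x1, x2) points and compare outputs. Since the figure is not reproduced here, both the reference map and the candidate tree below are hypothetical stand-ins, loosely shaped after the thresholds mentioned in the explanations:

```python
# Sketch: verify a candidate decision tree against a prediction map by
# sampling grid points. The map and tree are illustrative only.

def reference_map(x1, x2):
    # Hypothetical ground-truth coloring of the feature space.
    return "orange" if (x1 >= 1.0 and x2 >= 0.5) else "blue"

def candidate_tree(x1, x2):
    # Hypothetical candidate tree to verify against the map.
    if x1 >= 1.0:
        return "orange" if x2 >= 0.5 else "blue"
    return "blue"

def matches(tree, ref, steps=20):
    """Return True if tree and map agree on every sampled grid point."""
    for i in range(steps):
        for j in range(steps):
            x1, x2 = 2.0 * i / steps, 1.0 * j / steps
            if tree(x1, x2) != ref(x1, x2):
                return False
    return True

print(matches(candidate_tree, reference_map))
```

A tree like "Decision Tree C" in the quiz would fail such a check: any point with x1 < 1.0 that it colors orange disagrees with a map that is blue throughout that region.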
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-07-27 (UTC).