Section outline

  • Lesson Goal: Introduce decision trees as a simple yet powerful machine learning method that learns logical if-then rules from data. Students will see how a decision tree captures explicit decision rules, organized like a flowchart, through two examples: an English spelling rule and a medical prediction task. They will also learn how decision trees are built (splitting criteria) and weigh their pros and cons (interpretable, but prone to overfitting).

    • Micro-Topic 3.1: What is a Decision Tree?

      Goal: Explain what a decision tree is and how it represents a series of decisions as a flowchart of questions leading to an outcome. This provides a foundation before diving into the examples.
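
      To make the flowchart idea concrete before any learning happens, here is a minimal hand-written sketch in Python. The "take an umbrella?" decision and its two questions are purely hypothetical; the point is that a decision tree is exactly this kind of nested if-then structure, except that the questions are learned from data rather than written by hand.

      ```python
      # A decision tree is a flowchart of questions: each internal node asks a
      # question, each leaf gives an outcome. This hand-written toy mirrors the
      # structure a learned tree would have.
      def take_umbrella(raining: bool, forecast_rain_pct: int) -> str:
          if raining:                       # root question
              return "yes"                  # leaf: outcome
          if forecast_rain_pct > 50:        # second-level question
              return "yes"
          return "no"

      print(take_umbrella(raining=False, forecast_rain_pct=70))  # -> yes
      ```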

    • Micro-Topic 3.2: Case Study: Learning a Spelling Rule

      Goal: Demonstrate a simple, relatable example of a decision tree learning a logical rule – the English spelling rule “i before e except after c” – showing how the tree can capture such a rule and its exceptions.
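
      As a companion to this case study, the sketch below shows one way the rule could be recovered automatically. It assumes scikit-learn is available; the handful of example words and the single "preceded by c?" feature are illustrative choices, not a prescribed dataset.

      ```python
      # Minimal sketch: a tiny hand-made dataset where the only feature is
      # "is the letter pair preceded by 'c'?" and the label is the observed
      # letter order ("ie" or "ei").
      from sklearn.tree import DecisionTreeClassifier, export_text

      #            word        after 'c'?  observed order
      examples = [("believe",  0,          "ie"),
                  ("friend",   0,          "ie"),
                  ("piece",    0,          "ie"),
                  ("receive",  1,          "ei"),
                  ("ceiling",  1,          "ei"),
                  ("deceive",  1,          "ei")]

      X = [[after_c] for _, after_c, _ in examples]
      y = [order for _, _, order in examples]

      tree = DecisionTreeClassifier().fit(X, y)
      print(export_text(tree, feature_names=["after_c"]))
      # The printed rules amount to: after_c <= 0.5 -> "ie", otherwise -> "ei",
      # i.e. the tree rediscovers "i before e except after c".
      ```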

    • Micro-Topic 3.3: Case Study: Decision Tree for Diabetes Prediction

      Goal: Follow up on Lesson 1’s medical example in more detail: how to build a decision tree model that predicts diabetes risk from patient data, demonstrating how the tree splits on health factors and how we interpret the resulting rules.
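
      The sketch below follows the same shape this case study would take, but on synthetic, clearly made-up numbers rather than real patient records. It assumes scikit-learn and NumPy are available; the glucose/BMI features and the hidden labeling rule are illustrative assumptions.

      ```python
      # Synthetic "patients": glucose and BMI drawn at random, with a hidden
      # rule deciding the high-risk label, so we can check whether the learned
      # tree recovers interpretable thresholds.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      rng = np.random.default_rng(0)
      glucose = rng.uniform(70, 200, size=300)          # mg/dL, synthetic
      bmi = rng.uniform(18, 45, size=300)               # kg/m^2, synthetic
      X = np.column_stack([glucose, bmi])
      y = ((glucose > 140) | (bmi > 35)).astype(int)    # hidden labeling rule

      tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
      print(export_text(tree, feature_names=["glucose", "bmi"]))
      # The learned splits should land near glucose = 140 and bmi = 35, which is
      # how we read off rules such as "if glucose > 140 then high risk".
      ```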

    • Micro-Topic 3.4: How Decision Trees Learn (Splitting Criteria)

      Goal: Give a peek under the hood of the decision tree training process: how the algorithm decides which question to ask at each step, and how concepts like information gain and Gini impurity (explained intuitively) guide the building of the tree.
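
      For the "under the hood" part, Gini impurity is simple enough to compute by hand. The sketch below uses a made-up age/risk toy dataset to show the core idea: score every candidate question by how well it separates the labels, then ask the best one. (Information gain works the same way, just with entropy instead of Gini.)

      ```python
      # Gini impurity measures how "mixed" a group of labels is: 0 means pure,
      # 0.5 is a 50/50 binary mix. A split's score is the size-weighted average
      # impurity of the two groups it creates; lower is better.
      def gini(labels):
          n = len(labels)
          return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

      def split_score(values, labels, threshold):
          left  = [y for x, y in zip(values, labels) if x <= threshold]
          right = [y for x, y in zip(values, labels) if x > threshold]
          n = len(labels)
          return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

      ages   = [25, 30, 45, 50, 60, 65]
      labels = ["low", "low", "low", "high", "high", "high"]

      for t in [27.5, 47.5, 62.5]:           # candidate questions: "age <= t?"
          print(f"age <= {t}: score {split_score(ages, labels, t):.3f}")
      # The threshold between 45 and 50 scores 0.0 (both groups are pure), so
      # the tree would ask "age <= 47.5?" first.
      ```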

    • Micro-Topic 3.5: Pros and Cons of Decision Trees

      Goal: Summarize the advantages and disadvantages of decision trees, preparing students to understand when to use them and what pitfalls to watch out for (like overfitting), connecting to the theme of always choosing the right tool in the “war” against complex problems.
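
      To ground the overfitting warning, here is a minimal sketch (assuming scikit-learn and NumPy are available) on noisy synthetic data: an unrestricted tree memorizes the training labels, noise and all, while a depth-limited tree generalizes about as well as it trains. The data, noise level, and depths are illustrative choices.

      ```python
      # Compare an unrestricted tree with a depth-limited tree on data whose
      # labels contain 20% random noise.
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      X = rng.uniform(0, 10, size=(400, 2))
      y = (X[:, 0] > 5).astype(int)            # true rule: first feature > 5
      flip = rng.random(400) < 0.2             # 20% of labels flipped (noise)
      y = np.where(flip, 1 - y, y)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

      for depth in [None, 2]:                  # None = grow until leaves are pure
          tree = DecisionTreeClassifier(max_depth=depth, random_state=0)
          tree.fit(X_tr, y_tr)
          print(f"max_depth={depth}: train {tree.score(X_tr, y_tr):.2f}, "
                f"test {tree.score(X_te, y_te):.2f}")
      # Expect the unrestricted tree to be near-perfect on training data but
      # noticeably worse on test data, while the depth-2 tree scores similarly
      # on both; that gap is overfitting.
      ```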