
Commit d2ba756

author ArturoAmorQ committed:
Notebook related to classification tree exercise
1 parent 564d9f5 commit d2ba756

File tree

2 files changed: +59 -26 lines changed


notebooks/trees_ex_01.ipynb

+35-13
Original file line number | Diff line number | Diff line change
@@ -6,16 +6,13 @@
66
"source": [
77
"# \ud83d\udcdd Exercise M5.01\n",
88
"\n",
9-
"In the previous notebook, we showed how a tree with a depth of 1 level was\n",
10-
"working. The aim of this exercise is to repeat part of the previous experiment\n",
11-
"for a depth with 2 levels to show how the process of partitioning is repeated\n",
12-
"over time.\n",
9+
"In the previous notebook, we showed how a tree with 1 level depth works. The\n",
10+
"aim of this exercise is to repeat part of the previous experiment for a tree\n",
11+
"with 2 levels depth to show how such parameter affects the feature space\n",
12+
"partitioning.\n",
1313
"\n",
14-
"Before to start, we will:\n",
15-
"\n",
16-
"* load the dataset;\n",
17-
"* split the dataset into training and testing dataset;\n",
18-
"* define the function to show the classification decision function."
14+
"We first load the penguins dataset and split it into a training and a testing\n",
15+
"sets:"
1916
]
2017
},
2118
{
@@ -61,10 +58,35 @@
6158
"metadata": {},
6259
"source": [
6360
"Create a decision tree classifier with a maximum depth of 2 levels and fit the\n",
64-
"training data. Once this classifier trained, plot the data and the decision\n",
65-
"boundary to see the benefit of increasing the depth. To plot the decision\n",
66-
"boundary, you should import the class `DecisionBoundaryDisplay` from the\n",
67-
"module `sklearn.inspection` as shown in the previous course notebook."
61+
"training data."
62+
]
63+
},
64+
{
65+
"cell_type": "code",
66+
"execution_count": null,
67+
"metadata": {},
68+
"outputs": [],
69+
"source": [
70+
"# Write your code here."
71+
]
72+
},
73+
{
74+
"cell_type": "markdown",
75+
"metadata": {},
76+
"source": [
77+
"Now plot the data and the decision boundary of the trained classifier to see\n",
78+
"the effect of increasing the depth of the tree.\n",
79+
"\n",
80+
"Hint: Use the class `DecisionBoundaryDisplay` from the module\n",
81+
"`sklearn.inspection` as shown in previous course notebooks.\n",
82+
"\n",
83+
"<div class=\"admonition warning alert alert-danger\">\n",
84+
"<p class=\"first admonition-title\" style=\"font-weight: bold;\">Warning</p>\n",
85+
"<p class=\"last\">At this time, it is not possible to use <tt class=\"docutils literal\"><span class=\"pre\">response_method=\"predict_proba\"</span></tt> for\n",
86+
"multiclass problems. This is a planned feature for a future version of\n",
87+
"scikit-learn. In the mean time, you can use <tt class=\"docutils literal\"><span class=\"pre\">response_method=\"predict\"</span></tt>\n",
88+
"instead.</p>\n",
89+
"</div>"
6890
]
6991
},
7092
{

python_scripts/trees_ex_01.py

+24-13
Original file line number | Diff line number | Diff line change
@@ -14,16 +14,13 @@
1414
# %% [markdown]
1515
# # 📝 Exercise M5.01
1616
#
17-
# In the previous notebook, we showed how a tree with a depth of 1 level was
18-
# working. The aim of this exercise is to repeat part of the previous experiment
19-
# for a depth with 2 levels to show how the process of partitioning is repeated
20-
# over time.
17+
# In the previous notebook, we showed how a tree with a depth of 1 level
18+
# works. The aim of this exercise is to repeat part of the previous
19+
# experiment for a tree with a depth of 2 levels to show how this parameter
20+
# affects the feature space partitioning.
2121
#
22-
# Before to start, we will:
23-
#
24-
# * load the dataset;
25-
# * split the dataset into training and testing dataset;
26-
# * define the function to show the classification decision function.
22+
# We first load the penguins dataset and split it into training and testing
23+
# sets:
2724

2825
# %%
2926
import pandas as pd
@@ -48,10 +45,24 @@
4845

4946
# %% [markdown]
5047
# Create a decision tree classifier with a maximum depth of 2 levels and fit the
51-
# training data. Once this classifier trained, plot the data and the decision
52-
# boundary to see the benefit of increasing the depth. To plot the decision
53-
# boundary, you should import the class `DecisionBoundaryDisplay` from the
54-
# module `sklearn.inspection` as shown in the previous course notebook.
48+
# training data.
49+
50+
# %%
51+
# Write your code here.
52+
53+
# %% [markdown]
54+
# Now plot the data and the decision boundary of the trained classifier to see
55+
# the effect of increasing the depth of the tree.
56+
#
57+
# Hint: Use the class `DecisionBoundaryDisplay` from the module
58+
# `sklearn.inspection` as shown in previous course notebooks.
59+
#
60+
# ```{warning}
61+
# At this time, it is not possible to use `response_method="predict_proba"` for
62+
# multiclass problems. This is a planned feature for a future version of
63+
# scikit-learn. In the meantime, you can use `response_method="predict"`
64+
# instead.
65+
# ```
5566

5667
# %%
5768
# Write your code here.

0 commit comments