AI learning with decision trees
Horses and Riders
Decision trees help machines emulate human behavior. Thanks to artificial intelligence, programs can autonomously build decision trees by learning from existing real-life data. Can an AI system determine who was operating a particular vehicle by analyzing driving style?
In the 1950s, TV stations were broadcasting quiz shows that probably wouldn't interest anyone today. One was "What's My Line?" with host John Daly, in which a blindfolded team of four panelists had to guess the identity of a star guest using only yes or no questions.
A popular strategy was to pose general questions first ("Are you male?") and then ask increasingly specific ones, until the panelists closed in on the guest star and their identity was finally revealed.
In machine learning, programmers employ similar techniques to teach computers to imitate learned behaviors. As a somewhat contrived example, take the behavior of an AND gate (Table 1), which always shows the value 0 at its output until a 1 is applied to both inputs. The gate is normally implemented with binary operators, but for the purpose of this investigation, I'll be using the decision tree in Figure 1.
Table 1: x/y AND Connection

| Input x | Input y | Output x AND y |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 0 |
| 1 | 0 | 0 |
| 1 | 1 | 1 |
The algorithm starts at the head node and checks whether the input variable X is set to 1. If this is not the case, further testing is unnecessary, and the result of 0 is final. If X is 1, on the other hand, the algorithm proceeds to the lower left, to the Y==1? node. After evaluating it, the end result is clear: If Y is also equal to 1, the AND gate outputs 1; if Y is 0, the gate value is 0.
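To make the tree walk concrete, here is a minimal, hand-coded version of the decision tree in Figure 1; the function name and test loop are my own illustration, not part of the article's listing:

```python
# Hand-coded decision tree for the AND gate, mirroring Figure 1:
# the head node tests X first; only if X is 1 does the Y node run.
def and_gate(x, y):
    if x != 1:      # head node: X == 1? If not, the result is final.
        return 0
    if y == 1:      # lower left node: Y == 1?
        return 1
    return 0

# Exercise the tree with all four input combinations from Table 1.
for x in (0, 1):
    for y in (0, 1):
        print(f"{x} AND {y} = {and_gate(x, y)}")
```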
Supervised Learning
Now, of course, artificial intelligence (AI) can capture and simulate far more complex relationships than simple AND gates. However, the decision trees used there follow exactly the same procedure as in our simple example. In supervised learning, they receive input values such as X and Y, along with the expected output values, like the output values of the AND gate. From this, decision trees are built automatically inside the system and gradually honed until, in a fully trained state in production, the system generates exact results from new live input values – or at least values close to the ideal result.
AI kits [1] like the Python module scikit-learn help create these decision trees automatically from existing input data; the module is easily installed using

pip3 install scikit-learn scipy

preferably in a virtualenv environment, to avoid clutter on your computer:

virtualenv -p /usr/local/bin/python3 dt
source dt/bin/activate
pip3 install [...]
Listing 1 [2] additionally imports the pydotplus module to print the decision tree for study purposes.
Listing 1: decision-tree.py
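The listing itself is not reprinted here, but based on the description in the next section, a script along these lines does the job; the exact variable and file names are my reconstruction, not a verbatim copy:

```python
#!/usr/bin/env python3
# decision-tree.py -- train a decision tree on the AND gate's truth table
from sklearn import tree
import pydotplus

# Known input tuples (x, y) and the expected AND gate outputs
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
Y = [0, 0, 0, 1]

# Fit a classifier; internally it builds the decision tree
clf = tree.DecisionTreeClassifier()
clf = clf.fit(X, Y)

# Predict results for all possible input values
print(clf.predict(X))    # expected: [0 0 0 1]

# Dump the tree in graphviz text notation and render it as a PNG
dot_data = tree.export_graphviz(clf, out_file=None,
                                feature_names=["x", "y"])
pydotplus.graph_from_dot_data(dot_data).write_png("tree.png")
```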
Mandatory Algorithm
The Python list X contains the previously known input values of the gate as a series of x/y pairs, also in list form. The variable Y contains the corresponding output values in the same order. The tree module from scikit-learn (the abbreviated name sklearn is also valid) offers the DecisionTreeClassifier, which approximates relationships by fitting with the fit() method; internally, it builds a tree and later predicts the results for new input values using predict().
To take a look at the internal processes in the production line of this procedure, the export_graphviz() method collects the generated tree and outputs it in text notation for the graphviz chart tool. The pydotplus Python module (installed with pip3) creates the PNG file in Figure 2, which requires the graphviz package. You can install the graphviz package on Ubuntu by typing:

sudo apt-get install graphviz
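With graphviz in place, running the script and viewing the generated image boils down to two commands (tree.png is the output name assumed in the sketch above):

```
python3 decision-tree.py
xdg-open tree.png
```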
The output from the script in Figure 3 shows that, after the fitting phase, perfect results are obtained from all possible input values.
At first glance, the machine-generated decision tree in Figure 2 looks a bit unorthodox, but machines think differently from humans; they just rigidly follow their algorithms. For example, the classifier first checks whether X is less than 0.5, and if that is the case, it outputs class=0 as the result in the True node at the lower left. If X is greater than 0.5, however, the False node at the lower right checks whether Y is less than 0.5 and displays 0 if yes and 1 if no – with binary inputs, Y being greater than 0.5 simply means that Y equals 1 in real life.
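If you only want to inspect the thresholds without rendering an image, newer scikit-learn versions can also dump a fitted tree as plain text with export_text(); this convenience function is not used in the article's listing:

```python
from sklearn.tree import DecisionTreeClassifier, export_text

clf = DecisionTreeClassifier().fit(
    [[0, 0], [0, 1], [1, 0], [1, 1]],  # inputs
    [0, 0, 0, 1])                      # AND gate outputs
# Print the split thresholds and leaf classes as ASCII art
print(export_text(clf, feature_names=["x", "y"]))
```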
The procedure performs just as optimally as the tree in Figure 1 but – typical of a machine – in a somewhat non-human way, much like the AlphaGo mastermind [3], which now ranks among the world's top Go players thanks in part to seemingly absurd moves that no one had ever tried before.
From Uncertain to Certain
The method used by scikit-learn [4] is mathematically proven and builds the tree by splitting the set of all possible input tuples and their preexisting results into subsets. Each division aims to minimize the uncertainty in the system – its entropy. At the head node, the entropy is still high, because nothing is yet known about the result, so the algorithm would have to select either 0 or 1 as the result. Because the output is 0 in 75 percent of all cases and 1 in 25 percent of cases (Table 1), the entropy of the system is initially:

-(1/4) × ln(1/4) - (3/4) × ln(3/4) = 0.56
If the first node separates the possible input values into two subsets by the X<=0.5 comparison,

[[0,0],[0,1]]

and

[[1,0],[1,1]]

then the entropy of the first subset is equal to 0 (the result is 0 in any case; that is, it's certain), while the second subset, which still contains one 0 and one 1 result, has an entropy of -(1/2) × ln(1/2) - (1/2) × ln(1/2) = 0.69. Weighting each subset by its share of the samples, the total entropy is:

(1/2 × 0) + (1/2 × 0.69) = 0.35

With only one node in the tree, the system's entropy has thus decreased from 0.56 to 0.35. An additional node splits the remaining two result tuples into two individual results. In the case of X>0.5 where Y<=0.5, the entropy of the whole system drops to 0. Therefore, with two nodes, the tree can correctly predict the result for each arbitrary input tuple with 100 percent certainty.
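The arithmetic is easy to verify in a few lines of Python; the entropy() helper below is my own, not part of scikit-learn:

```python
from math import log

def entropy(outputs):
    """Shannon entropy (natural log) of a list of outcomes."""
    n = len(outputs)
    probs = [outputs.count(v) / n for v in set(outputs)]
    return sum(-p * log(p) for p in probs)

print(entropy([0, 0, 0, 1]))  # full truth table: 0.56
print(entropy([0, 0]))        # left subset (x=0): 0.0
print(entropy([0, 1]))        # right subset (x=1): 0.69
# Weighted total entropy after the first split:
print(0.5 * entropy([0, 0]) + 0.5 * entropy([0, 1]))  # 0.35
```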
In this case, the exercise is of academic interest at most, because an AND gate can be set up in hardware or firmware much more efficiently. However, an algorithm that knows nothing of the internal structure at first and only learns from combinations of input and output values, mimicking the behavior of the unknown system, can be used to tackle arbitrarily complex problems. As a bonus, it even supports out-of-spec input values such as [0.9,0.8], which it transforms into useful results (in this case, 1).
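A quick test confirms this; the snippet below refits the classifier from above and feeds it the off-grid tuple:

```python
from sklearn.tree import DecisionTreeClassifier

clf = DecisionTreeClassifier().fit(
    [[0, 0], [0, 1], [1, 0], [1, 1]],  # inputs
    [0, 0, 0, 1])                      # AND gate outputs
# Both values exceed the 0.5 thresholds, so the tuple lands
# in the class=1 leaf of the tree.
print(clf.predict([[0.9, 0.8]]))  # [1]
```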