Question
Below is an example of the ID3 algorithm in Unity using C#. I'm not sure how ID3Example fits into the whole thing; can someone explain it in more detail, please? I am trying to use it with this data set, a .txt file:
Alternates?:Bar?:Friday?:Hungry?:#Patrons:Price:Raining?:Reservations?:Type:EstWaitTime:WillWait?
Yes:No:No:Yes:Some:$$$:No:Yes:French:0-10:True
Yes:No:No:Yes:Full:$:No:No:Thai:30-60:False
No:Yes:No:No:Some:$:No:No:Burger:0-10:True
Yes:No:Yes:Yes:Full:$:Yes:No:Thai:10-30:True
Yes:No:Yes:No:Full:$$$:No:Yes:French:>60:False
No:Yes:No:Yes:Some:$$:Yes:Yes:Italian:0-10:True
No:Yes:No:No:None:$:Yes:No:Burger:0-10:False
No:No:No:Yes:Some:$$:Yes:Yes:Thai:0-10:True
No:Yes:Yes:No:Full:$:Yes:No:Burger:>60:False
Yes:Yes:Yes:Yes:Full:$$$:No:Yes:Italian:10-30:False
No:No:No:No:None:$:No:No:Thai:0-10:False
Yes:Yes:Yes:Yes:Full:$:No:No:Burger:30-60:True
Learning to use decision trees
We have already seen the power and flexibility of decision trees for adding a decision-making component to our game, and we can also build them dynamically through supervised learning. That's why we're revisiting them in this chapter.
There are several algorithms for building decision trees that are suited for different uses such as prediction and classification. In our case, we'll explore decision-tree learning by implementing the ID3 algorithm.
Getting ready
Although we built decision trees in a previous chapter, and the ones we will implement now are based on the same principles, we will use different data types to fit the needs of the learning algorithm.
We will need two data types: one for the decision nodes and one for storing the examples to be learned.
The code for the DecisionNode data type is as follows:
using System.Collections.Generic;

public class DecisionNode
{
    public string testValue;
    public Dictionary<float, DecisionNode> children;

    public DecisionNode(string testValue = "")
    {
        this.testValue = testValue;
        children = new Dictionary<float, DecisionNode>();
    }
}
The code for the Example data type is as follows:
using UnityEngine;
using System.Collections.Generic;

public enum ID3Action
{
    STOP, WALK, RUN
}

public class ID3Example : MonoBehaviour
{
    public ID3Action action;
    public Dictionary<string, float> values;

    public float GetValue(string attribute)
    {
        return values[attribute];
    }
}
How to do it
We will create the ID3 class with several functions for computing the resulting decision tree.
Create the ID3 class:
using UnityEngine;
using System.Collections.Generic;

public class ID3 : MonoBehaviour
{
    // next steps
}
Start the implementation of the function responsible for splitting the examples into sets according to the values of a given attribute:
public Dictionary<float, List<ID3Example>> SplitByAttribute(
        ID3Example[] examples,
        string attribute)
{
    Dictionary<float, List<ID3Example>> sets;
    sets = new Dictionary<float, List<ID3Example>>();
    // next step
}
Iterate through all the examples received, and extract each one's value for the attribute in order to assign it to a set:
foreach (ID3Example e in examples)
{
    float key = e.GetValue(attribute);
    if (!sets.ContainsKey(key))
        sets.Add(key, new List<ID3Example>());
    sets[key].Add(e);
}
return sets;
Create the function for computing the entropy for a set of examples:
public float GetEntropy(ID3Example[] examples)
{
    if (examples.Length == 0)
        return 0f;
    int numExamples = examples.Length;
    Dictionary<ID3Action, int> actionTallies;
    actionTallies = new Dictionary<ID3Action, int>();
    // next steps
}
Iterate through all of the examples to tally how often each action occurs:
foreach (ID3Example e in examples)
{
    if (!actionTallies.ContainsKey(e.action))
        actionTallies.Add(e.action, 0);
    actionTallies[e.action]++;
}
Compute the entropy:
int actionCount = actionTallies.Keys.Count;
if (actionCount == 0)
    return 0f;
float entropy = 0f;
float proportion = 0f;
foreach (int tally in actionTallies.Values)
{
    proportion = tally / (float)numExamples;
    entropy -= proportion * Mathf.Log(proportion, 2);
}
return entropy;
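For reference, this loop computes the standard Shannon entropy that ID3 is built on, where p_i is the proportion of examples labelled with action i:

H(S) = -\sum_{i} p_i \log_2 p_i

As a worked example, the 12-row restaurant data set in the question has 6 True and 6 False outcomes, so its initial entropy is -(0.5 \log_2 0.5 + 0.5 \log_2 0.5) = 1 bit.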
Implement the function for computing the entropy for all the sets of examples. This is very similar to the preceding one; in fact, it uses it:
public float GetEntropy(
        Dictionary<float, List<ID3Example>> sets,
        int numExamples)
{
    float entropy = 0f;
    foreach (List<ID3Example> s in sets.Values)
    {
        float proportion;
        proportion = s.Count / (float)numExamples;
        // Accumulate the entropy of each subset, weighted by its size.
        entropy += proportion * GetEntropy(s.ToArray());
    }
    return entropy;
}
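This overload yields the weighted entropy of a split; MakeTree (defined next) subtracts it from the entropy of the unsplit examples to obtain the information gain it uses to pick the best attribute:

H(S \mid A) = \sum_{v \in \mathrm{Values}(A)} \frac{|S_v|}{|S|}\, H(S_v), \qquad \mathrm{Gain}(S, A) = H(S) - H(S \mid A)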
Define the function for building a decision tree:
public void MakeTree(
        ID3Example[] examples,
        List<string> attributes,
        DecisionNode node)
{
    float initEntropy = GetEntropy(examples);
    if (initEntropy <= 0)
        return;
    // next steps
}
Declare and initialize all the required members for the task:
int numExamples = examples.Length;
float bestInfoGain = 0f;
string bestSplitAttribute = "";
float infoGain = 0f;
float overallEntropy = 0f;
Dictionary<float, List<ID3Example>> bestSets;
bestSets = new Dictionary<float, List<ID3Example>>();
Dictionary<float, List<ID3Example>> sets;
Iterate through all the attributes in order to get the best set based on the information gain:
foreach (string a in attributes)
{
    sets = SplitByAttribute(examples, a);
    overallEntropy = GetEntropy(sets, numExamples);
    infoGain = initEntropy - overallEntropy;
    if (infoGain > bestInfoGain)
    {
        bestInfoGain = infoGain;
        bestSplitAttribute = a;
        bestSets = sets;
    }
}
Select the root node based on the best split attribute, and rearrange the remaining attributes for building the rest of the tree:
node.testValue = bestSplitAttribute;
List<string> newAttributes = new List<string>(attributes);
newAttributes.Remove(bestSplitAttribute);
Iterate through the sets produced by the best split, calling the function recursively to build each child subtree:
foreach (List<ID3Example> set in bestSets.Values)
{
    float val = set[0].GetValue(bestSplitAttribute);
    DecisionNode child = new DecisionNode();
    node.children.Add(val, child);
    MakeTree(set.ToArray(), newAttributes, child);
}
How it works
The class is modular in terms of functionality: it doesn't store any state itself, but computes everything that the tree-building function needs. SplitByAttribute takes the examples and divides them into sets keyed by their value for a given attribute; these sets are what the entropy is computed over. GetEntropy is an overloaded function that computes the entropy either of a list of examples or of a collection of example sets, using the formulae defined by the ID3 algorithm. Finally, MakeTree works recursively: at each node it picks the attribute with the greatest information gain as the test, splits the examples on it, and builds a child subtree for each resulting set.
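To connect this back to the question's data set: the text file stores categorical strings, while ID3Example holds float values and an ID3Action label, so each row has to be encoded numerically before the tree can be built. The sketch below is a minimal, hypothetical illustration of that wiring, not part of the book's code; the ID3Test class, the encoding scheme, and the choice of WALK/STOP to stand in for the WillWait? outcomes are all assumptions.

using UnityEngine;
using System.Collections.Generic;

// Hypothetical driver showing how the pieces fit together (not from the book).
public class ID3Test : MonoBehaviour
{
    void Start()
    {
        // Encode one row of the restaurant data (the first line of the .txt)
        // by mapping each categorical string to a float by hand, e.g.
        // No = 0, Yes = 1; None = 0, Some = 1, Full = 2; $ = 1 ... $$$ = 3.
        ID3Example row = gameObject.AddComponent<ID3Example>();
        row.values = new Dictionary<string, float>
        {
            { "Alternates", 1f }, { "Bar", 0f }, { "Friday", 0f },
            { "Hungry", 1f }, { "Patrons", 1f }, { "Price", 3f },
            { "Raining", 0f }, { "Reservations", 1f },
            { "Type", 0f }, { "EstWaitTime", 0f }
        };
        // WillWait? plays the role of the action label; with only two
        // outcomes, WALK/STOP can stand in for True/False.
        row.action = ID3Action.WALK;

        // ...encode the remaining eleven rows the same way...
        ID3Example[] examples = { row /*, the other rows */ };

        // Build the tree from the encoded examples.
        List<string> attributes = new List<string>(row.values.Keys);
        DecisionNode root = new DecisionNode();
        ID3 id3 = gameObject.AddComponent<ID3>();
        id3.MakeTree(examples, attributes, root);
        // With the full 12-row data set, root.testValue would now hold the
        // attribute ID3 chose to split on first.
        Debug.Log("Root split attribute: " + root.testValue);
    }
}

One encoding detail matters: SplitByAttribute branches on exact float values, so every row must use the same mapping for a given attribute.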