TDIDT

Acronym    Definition
TDIDT      Top-Down Induction of Decision Trees
References in periodicals archive
Algorithm 1: TDIDT

procedure PCT(E) returns tree
1: (t*, h*, P*) = BestTest(E)
2: if t* ≠ none then
3:   for each E_i ∈ P* do
4:     tree_i = PCT(E_i)
5:   return node(t*, ∪_i {tree_i})
6: else
7:   return leaf(Prototype(E))

procedure BestTest(E)
1: (t*, h*, P*) = (none, 0, ∅)
2: for each possible test t do
3:   P = partition induced by t on E
4:   h = heuristic value of t computed on P
5:   if (h > h*) ∧ Acceptable(t, P) then
6:     (t*, h*, P*) = (t, h, P)
7: return (t*, h*, P*)
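A minimal runnable Python sketch of the top-down induction loop above (illustrative only: the helper names, the threshold tests, and the use of information gain as the heuristic are assumptions, not the quoted paper's exact definitions):

# Sketch of TDIDT-style induction, assuming examples are (features, label)
# pairs, candidate tests are threshold splits, and the heuristic is
# information gain; Prototype is taken as the majority label.
from collections import Counter
from math import log2

def entropy(examples):
    counts = Counter(label for _, label in examples)
    total = len(examples)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def best_test(examples):
    """Return (test, gain, partition); test is None if no split improves purity."""
    best = (None, 0.0, [examples])
    for idx in range(len(examples[0][0])):
        values = sorted({x[idx] for x, _ in examples})
        for threshold in values[:-1]:
            left = [e for e in examples if e[0][idx] <= threshold]
            right = [e for e in examples if e[0][idx] > threshold]
            gain = entropy(examples) - sum(
                len(part) / len(examples) * entropy(part)
                for part in (left, right))
            if gain > best[1]:          # plays the role of Acceptable(t, P)
                best = ((idx, threshold), gain, [left, right])
    return best

def tdidt(examples):
    test, _, partition = best_test(examples)
    if test is None:                    # no useful split -> make a leaf
        majority = Counter(l for _, l in examples).most_common(1)[0][0]
        return ("leaf", majority)
    return ("node", test, [tdidt(part) for part in partition])

if __name__ == "__main__":
    data = [([2.0, 1.0], "a"), ([3.0, 1.0], "a"),
            ([2.0, 4.0], "b"), ([3.0, 4.0], "b")]
    print(tdidt(data))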
For instance, in the case of top-down induction of decision trees (TDIDT), information entropy is an efficient heuristic that keeps the generated decision tree small and reduces the number of leaves.
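As a hypothetical worked example of that heuristic, the snippet below compares the information gain of two candidate splits on made-up class counts; the purer split wins, which is what keeps the induced tree small:

# Worked example (hypothetical data): entropy-based gain prefers the split
# that yields purer subsets.
from math import log2

def H(pos, neg):
    """Binary entropy of a set with `pos` positive and `neg` negative examples."""
    total = pos + neg
    return -sum(p * log2(p) for p in (pos / total, neg / total) if p > 0)

# Parent node: 8 positive and 8 negative examples -> H = 1.0 bit.
parent = H(8, 8)

# Split A separates the classes well: children (7+, 1-) and (1+, 7-).
gain_a = parent - (8 / 16) * H(7, 1) - (8 / 16) * H(1, 7)

# Split B barely helps: children (5+, 3-) and (3+, 5-).
gain_b = parent - (8 / 16) * H(5, 3) - (8 / 16) * H(3, 5)

print(f"gain A = {gain_a:.3f} bits, gain B = {gain_b:.3f} bits")
# gain A ≈ 0.456 > gain B ≈ 0.046, so an entropy-driven TDIDT splits on A first.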