A Review of Splitting Criteria for Decision Tree Induction
Abstract
Decision tree techniques are used to build classification models in data mining. A decision tree is a hierarchical structure composed of decision nodes, each corresponding to a test on an attribute, and its construction is driven by an attribute selection measure. This paper reviews splitting criteria such as Information Gain, Gain Ratio, Gini Index, the Jaccard Coefficient, and Least Probable Intersections. In decision tree construction, the splitting criterion is a heuristic for selecting the attribute that best partitions the dataset at a node: each candidate attribute is scored, the score is measured either as impurity reduction or as purity gain, and the attribute with the best score is chosen as the splitting attribute. The paper provides a comparative study of these attribute selection measures for top-down induction of decision trees.
Keywords
Decision Tree, Splitting Criterion, Attribute Selection.
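
To illustrate the scoring step described in the abstract, the following minimal Python sketch (not taken from the reviewed paper; all function names, the toy dataset, and the attribute names are illustrative assumptions) computes entropy-based Information Gain, Gain Ratio, and Gini-index impurity reduction for categorical attributes. The attribute with the best score would be selected as the splitting attribute for the node.

# Minimal sketch of impurity-based attribute selection for a categorical split.
# All names here are illustrative and not taken from the reviewed paper.
from collections import Counter
import math


def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())


def gini(labels):
    """Gini index (impurity) of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())


def split_by(rows, attr):
    """Partition rows (dicts) into subsets that share one value of attr."""
    parts = {}
    for row in rows:
        parts.setdefault(row[attr], []).append(row)
    return parts.values()


def information_gain(rows, attr, target):
    """Purity gain: parent entropy minus the weighted entropy of the children."""
    parent = entropy([r[target] for r in rows])
    n = len(rows)
    children = sum(len(p) / n * entropy([r[target] for r in p])
                   for p in split_by(rows, attr))
    return parent - children


def gain_ratio(rows, attr, target):
    """Information gain normalised by the split information of the attribute."""
    split_info = entropy([r[attr] for r in rows])
    return information_gain(rows, attr, target) / split_info if split_info else 0.0


def gini_reduction(rows, attr, target):
    """Impurity reduction measured with the Gini index instead of entropy."""
    parent = gini([r[target] for r in rows])
    n = len(rows)
    children = sum(len(p) / n * gini([r[target] for r in p])
                   for p in split_by(rows, attr))
    return parent - children


if __name__ == "__main__":
    # Toy weather-style dataset; "play" is the class label.
    data = [
        {"outlook": "sunny", "windy": "false", "play": "no"},
        {"outlook": "sunny", "windy": "true", "play": "no"},
        {"outlook": "overcast", "windy": "false", "play": "yes"},
        {"outlook": "rain", "windy": "false", "play": "yes"},
        {"outlook": "rain", "windy": "true", "play": "no"},
    ]
    # The attribute with the highest score would become the splitting attribute.
    for attr in ("outlook", "windy"):
        print(attr,
              round(information_gain(data, attr, "play"), 3),
              round(gain_ratio(data, attr, "play"), 3),
              round(gini_reduction(data, attr, "play"), 3))

Running this sketch scores each candidate attribute against the class label; a top-down induction algorithm would pick the highest-scoring attribute, split the node's rows accordingly, and recurse on each child subset.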