Total records: 19, listing 1-10
On the VC-dimension of univariate decision trees
(2012)
In this paper, we prove lower bounds on the VC-dimension of the univariate decision tree hypothesis class. The VC-dimension of a univariate decision tree depends on the VC-dimension values of its subtrees and ...
Statistical tests using hinge/ε-sensitive loss
(Springer-Verlag, 2013)
Statistical tests used in the literature to compare algorithms rely on the misclassification error, which is based on the 0/1 loss for classification, and on the square loss for regression. Kernel-based support vector machine classifiers (regressors) ...
Feature extraction from discrete attributes
(IEEE, 2010)
In many pattern recognition applications, decision trees are preferred due to their simplicity and easily interpretable nature. In this paper, we extract new features by combining k discrete attributes, where for each ...
Univariate margin tree
(Springer, 2010)
In many pattern recognition applications, decision trees are preferred due to their simplicity and easily interpretable nature. In this paper, we propose a new decision tree learning algorithm called univariate margin ...
Parallel univariate decision trees
(Elsevier B.V., 2007-05-01)
Univariate decision tree algorithms are widely used in data mining because (i) they are easy to learn and (ii) once trained they can be expressed in a rule-based manner. In several applications, mainly including data mining, the ...
Calculating the VC-dimension of decision trees
(IEEE, 2009)
We propose an exhaustive search algorithm that calculates the VC-dimension of univariate decision trees with binary features. The VC-dimension of a univariate decision tree with binary features depends on (i) the ...
Tree Ensembles on the induced discrete space
(Institute of Electrical and Electronics Engineers Inc., 2016-05)
Decision trees are widely used predictive models in machine learning. Recently, the K-tree has been proposed, in which the original discrete feature space is expanded by generating all orderings of the values of k discrete attributes and ...
Searching for the optimal ordering of classes in rule induction
(IEEE, 2012-11-15)
Rule induction algorithms, such as Ripper, solve a K > 2 class problem by converting it into a sequence of K - 1 two-class problems. As a usual heuristic, the classes are fed into the algorithm in order of increasing ...
Regularizing soft decision trees
(Springer, 2013)
Recently, we have proposed a new decision tree family called soft decision trees, where a node chooses both its left and right children with different probabilities, as given by a gating function, unlike a hard ...
Budding trees
(IEEE Computer Soc, 2014-08-24)
We propose a new decision tree model, named the budding tree, where a node can be both a leaf and an internal decision node. Each bud node starts as a leaf node and can then grow children, but later on, if necessary, its ...