Bagging soft decision trees

dc.authorid0000-0001-5838-4615
dc.authorid0000-0001-7506-0321
dc.contributor.authorYıldız, Olcay Taneren_US
dc.contributor.authorİrsoy, Ozanen_US
dc.contributor.authorAlpaydın, Ahmet İbrahim Ethemen_US
dc.date.accessioned2017-03-13T12:32:59Z
dc.date.available2017-03-13T12:32:59Z
dc.date.issued2016
dc.departmentIşık Üniversitesi, Mühendislik Fakültesi, Bilgisayar Mühendisliği Bölümüen_US
dc.departmentIşık University, Faculty of Engineering, Department of Computer Engineeringen_US
dc.description.abstractThe decision tree is one of the earliest predictive models in machine learning. In the soft decision tree, based on the hierarchical mixture of experts model, internal binary nodes take soft decisions and choose both children with probabilities given by a sigmoid gating function. Hence, for an input, all the paths to all the leaves are traversed, and all those leaves contribute to the final decision, but with different probabilities, as given by the gating values on the path. Tree induction is incremental: the tree grows when needed by replacing leaves with subtrees, and the parameters of the newly added nodes are learned using gradient descent. We have previously shown that such soft trees generalize better than hard trees; here, we propose to bag such soft decision trees for higher accuracy. On 27 two-class classification data sets (ten of which are from the medical domain) and 26 regression data sets, we show that the bagged soft trees generalize better than single soft trees and bagged hard trees. This contribution falls in the scope of research track 2 listed in the editorial, namely, machine learning algorithms.en_US
dc.description.versionPublisher's Versionen_US
dc.identifier.citationYıldız, O. T., İrsoy, O. & Alpaydın, A. İ. E. (2016). Bagging soft decision trees. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 9605, 25-36. doi:10.1007/978-3-319-50478-0_2en_US
dc.identifier.doi10.1007/978-3-319-50478-0_2
dc.identifier.endpage36
dc.identifier.isbn9783319504773
dc.identifier.isbn9783319504780
dc.identifier.issn0302-9743
dc.identifier.scopus2-s2.0-85006494215
dc.identifier.scopusqualityQ3
dc.identifier.startpage25
dc.identifier.urihttps://hdl.handle.net/11729/1200
dc.identifier.urihttp://dx.doi.org/10.1007/978-3-319-50478-0_2
dc.identifier.volume9605
dc.identifier.wosWOS:000408904100003
dc.identifier.wosqualityQ4
dc.indekslendigikaynakWeb of Scienceen_US
dc.indekslendigikaynakScopusen_US
dc.indekslendigikaynakBook Citation Index – Science (BKCI-S)en_US
dc.institutionauthorYıldız, Olcay Taneren_US
dc.institutionauthorid0000-0001-5838-4615
dc.language.isoenen_US
dc.peerreviewedYesen_US
dc.publicationstatusPublisheden_US
dc.publisherSpringer Verlagen_US
dc.relation.publicationcategoryBook Chapter - Internationalen_US
dc.rightsinfo:eu-repo/semantics/closedAccessen_US
dc.sourceLecture Notes in Computer Scienceen_US
dc.subjectBaggingen_US
dc.subjectDecision treesen_US
dc.subjectRegression treesen_US
dc.subjectRegularizationen_US
dc.titleBagging soft decision treesen_US
dc.typeBook Chapteren_US
dspace.entity.typePublication
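
As an illustration of the approach summarized in the abstract, below is a minimal Python sketch, not the authors' implementation: it shows a soft decision tree whose internal nodes gate both children with a sigmoid, so every leaf contributes to the prediction, and a bagging step that averages the outputs of several such trees. The class SoftNode, the function bagged_predict, and the random (untrained) parameters are assumptions made for illustration only; the paper's incremental tree growth and gradient-descent training of node parameters are omitted.

    # Minimal sketch of soft gating and bagged averaging (illustrative only).
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    class SoftNode:
        def __init__(self, w=None, b=0.0, left=None, right=None, value=None):
            self.w, self.b = w, b            # gating parameters (internal nodes)
            self.left, self.right = left, right
            self.value = value               # leaf response (leaves only)

        def predict(self, x):
            # A leaf returns its response; an internal node mixes both
            # children with the sigmoid gating probability g(x).
            if self.value is not None:
                return self.value
            g = sigmoid(np.dot(self.w, x) + self.b)
            return g * self.left.predict(x) + (1.0 - g) * self.right.predict(x)

    def bagged_predict(trees, x):
        # Bagging: average the soft-tree outputs over the ensemble members.
        return np.mean([t.predict(x) for t in trees])

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        d, B = 5, 10                         # input dimension, ensemble size
        # Build B depth-1 soft trees with placeholder (untrained) parameters;
        # in the paper, each tree would be trained on a bootstrap sample.
        trees = [SoftNode(w=rng.normal(size=d), b=0.0,
                          left=SoftNode(value=rng.normal()),
                          right=SoftNode(value=rng.normal()))
                 for _ in range(B)]
        x = rng.normal(size=d)
        print(bagged_predict(trees, x))

The soft prediction is the sum over leaves of each leaf's response weighted by the product of gating probabilities along its path, which the recursion above computes implicitly; bagging then averages these soft outputs across bootstrap-trained trees.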

Files

Original bundle
Now showing 1 - 1 of 1
Name:
1200.pdf
Size:
237.27 KB
Format:
Adobe Portable Document Format
Description:
Publisher's Version
License bundle
Now showing 1 - 1 of 1
Name:
license.txt
Size:
1.71 KB
Format:
Item-specific license agreed upon to submission
Description: