Data mining with decision trees: theory and applications
Main Author: Rokach, Lior
Format: Book
Language: English
Published: New Jersey [et al.]: World Scientific, 2010
Edition: 1. publ., repr.
Series: Series in machine perception and artificial intelligence; 69
Subjects: Data Mining; Entscheidungsbaum
Online Access: Table of contents
Notes: Bibliography: p. 215-242
Physical Description: XVIII, 244 p., diagrams
ISBN: 9789812771711; 9812771719
MARC (internal format)
LEADER 00000nam a2200000zcb4500
001    BV036128331
003    DE-604
005    20110424
007    t
008    100420s2010 xxkd||| |||| 00||| eng d
020 __ |a 9789812771711 |9 978-981-277-171-1
020 __ |a 9812771719 |9 981-277-171-9
035 __ |a (OCoLC)634520852
035 __ |a (DE-599)BVBBV036128331
040 __ |a DE-604 |b ger |e aacr
041 0_ |a eng
044 __ |a xxk |c GB
049 __ |a DE-19 |a DE-473
084 __ |a ST 530 |0 (DE-625)143679: |2 rvk
100 1_ |a Rokach, Lior |e Verfasser |4 aut
245 10 |a Data mining with decision trees |b theory and applications |c Lior Rokach ; Oded Maimon
250 __ |a 1. publ., repr.
264 _1 |a New Jersey [u.a.] |b World Scientific |c 2010
300 __ |a XVIII, 244 S. |b graph. Darst.
336 __ |b txt |2 rdacontent
337 __ |b n |2 rdamedia
338 __ |b nc |2 rdacarrier
490 1_ |a Series in machine perception and artificial intelligence |v 69
500 __ |a Literaturverz. S. 215 - 242
650 07 |a Entscheidungsbaum |0 (DE-588)4347788-4 |2 gnd |9 rswk-swf
650 07 |a Data Mining |0 (DE-588)4428654-5 |2 gnd |9 rswk-swf
689 00 |a Data Mining |0 (DE-588)4428654-5 |D s
689 01 |a Entscheidungsbaum |0 (DE-588)4347788-4 |D s
689 0_ |5 DE-604
699 __ 
700 1_ |a Maimon, Oded Z. |e Sonstige |0 (DE-588)143250833 |4 oth
830 _0 |a Series in machine perception and artificial intelligence |v 69 |w (DE-604)BV006668231 |9 69
856 42 |m Digitalisierung UB Bamberg |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=020210683&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis
999 __ |a oai:aleph.bib-bvb.de:BVB01-020210683
Record in the search index
_version_ | 1804142799912697856 |
adam_text | Contents

Preface vii

1. Introduction to Decision Trees 1
  1.1 Data Mining and Knowledge Discovery 1
  1.2 Taxonomy of Data Mining Methods 3
  1.3 Supervised Methods 4
    1.3.1 Overview 4
  1.4 Classification Trees 5
  1.5 Characteristics of Classification Trees 8
    1.5.1 Tree Size 9
    1.5.2 The Hierarchical Nature of Decision Trees 10
  1.6 Relation to Rule Induction 11

2. Growing Decision Trees 13
    2.0.1 Training Set 13
    2.0.2 Definition of the Classification Problem 14
    2.0.3 Induction Algorithms 16
    2.0.4 Probability Estimation in Decision Trees 16
      2.0.4.1 Laplace Correction 17
      2.0.4.2 No Match 18
  2.1 Algorithmic Framework for Decision Trees 18
  2.2 Stopping Criteria 19

3. Evaluation of Classification Trees 21
  3.1 Overview 21
  3.2 Generalization Error 21
    3.2.1 Theoretical Estimation of Generalization Error 22
    3.2.2 Empirical Estimation of Generalization Error 23
    3.2.3 Alternatives to the Accuracy Measure 24
    3.2.4 The F-Measure 25
    3.2.5 Confusion Matrix 27
    3.2.6 Classifier Evaluation under Limited Resources 28
      3.2.6.1 ROC Curves 30
      3.2.6.2 Hit Rate Curve 30
      3.2.6.3 Qrecall (Quota Recall) 32
      3.2.6.4 Lift Curve 32
      3.2.6.5 Pearson Correlation Coefficient 32
      3.2.6.6 Area Under Curve (AUC) 34
      3.2.6.7 Average Hit Rate 35
      3.2.6.8 Average Qrecall 35
      3.2.6.9 Potential Extract Measure (PEM) 36
    3.2.7 Which Decision Tree Classifier is Better? 40
      3.2.7.1 McNemar's Test 40
      3.2.7.2 A Test for the Difference of Two Proportions 41
      3.2.7.3 The Resampled Paired t Test 43
      3.2.7.4 The k-fold Cross-validated Paired t Test 43
  3.3 Computational Complexity 44
  3.4 Comprehensibility 44
  3.5 Scalability to Large Datasets 45
  3.6 Robustness 47
  3.7 Stability 47
  3.8 Interestingness Measures 48
  3.9 Overfitting and Underfitting 49
  3.10 No Free Lunch Theorem 50

4. Splitting Criteria 53
  4.1 Univariate Splitting Criteria 53
    4.1.1 Overview 53
    4.1.2 Impurity-based Criteria 53
    4.1.3 Information Gain 54
    4.1.4 Gini Index 55
    4.1.5 Likelihood Ratio Chi-squared Statistics 55
    4.1.6 DKM Criterion 55
    4.1.7 Normalized Impurity-based Criteria 56
    4.1.8 Gain Ratio 56
    4.1.9 Distance Measure 56
    4.1.10 Binary Criteria 57
    4.1.11 Twoing Criterion 57
    4.1.12 Orthogonal Criterion 58
    4.1.13 Kolmogorov-Smirnov Criterion 58
    4.1.14 AUC Splitting Criteria 58
    4.1.15 Other Univariate Splitting Criteria 59
    4.1.16 Comparison of Univariate Splitting Criteria 59
  4.2 Handling Missing Values 59

5. Pruning Trees 63
  5.1 Stopping Criteria 63
  5.2 Heuristic Pruning 63
    5.2.1 Overview 63
    5.2.2 Cost Complexity Pruning 64
    5.2.3 Reduced Error Pruning 65
    5.2.4 Minimum Error Pruning (MEP) 65
    5.2.5 Pessimistic Pruning 65
    5.2.6 Error-Based Pruning (EBP) 66
    5.2.7 Minimum Description Length (MDL) Pruning 67
    5.2.8 Other Pruning Methods 67
    5.2.9 Comparison of Pruning Methods 68
  5.3 Optimal Pruning 68

6. Advanced Decision Trees 71
  6.1 Survey of Common Algorithms for Decision Tree Induction 71
    6.1.1 ID3 71
    6.1.2 C4.5 71
    6.1.3 CART 71
    6.1.4 CHAID 72
    6.1.5 QUEST 73
    6.1.6 Reference to Other Algorithms 73
    6.1.7 Advantages and Disadvantages of Decision Trees 73
    6.1.8 Oblivious Decision Trees 76
    6.1.9 Decision Trees Inducers for Large Datasets 78
    6.1.10 Online Adaptive Decision Trees 79
    6.1.11 Lazy Tree 79
    6.1.12 Option Tree 80
  6.2 Lookahead 82
  6.3 Oblique Decision Trees 83

7. Decision Forests 87
  7.1 Overview 87
  7.2 Introduction 87
  7.3 Combination Methods 90
    7.3.1 Weighting Methods 90
      7.3.1.1 Majority Voting 90
      7.3.1.2 Performance Weighting 91
      7.3.1.3 Distribution Summation 91
      7.3.1.4 Bayesian Combination 91
      7.3.1.5 Dempster-Shafer 92
      7.3.1.6 Vogging 92
      7.3.1.7 Naïve Bayes 93
      7.3.1.8 Entropy Weighting 93
      7.3.1.9 Density-based Weighting 93
      7.3.1.10 DEA Weighting Method 93
      7.3.1.11 Logarithmic Opinion Pool 94
      7.3.1.12 Gating Network 94
      7.3.1.13 Order Statistics 95
    7.3.2 Meta-combination Methods 95
      7.3.2.1 Stacking 95
      7.3.2.2 Arbiter Trees 97
      7.3.2.3 Combiner Trees 99
      7.3.2.4 Grading 100
  7.4 Classifier Dependency 101
    7.4.1 Dependent Methods 101
      7.4.1.1 Model-guided Instance Selection 101
      7.4.1.2 Incremental Batch Learning 105
    7.4.2 Independent Methods 105
      7.4.2.1 Bagging 105
      7.4.2.2 Wagging 107
      7.4.2.3 Random Forest 108
      7.4.2.4 Cross-validated Committees 109
  7.5 Ensemble Diversity 109
    7.5.1 Manipulating the Inducer 110
      7.5.1.1 Manipulation of the Inducer's Parameters 111
      7.5.1.2 Starting Point in Hypothesis Space 111
      7.5.1.3 Hypothesis Space Traversal 111
    7.5.2 Manipulating the Training Samples 112
      7.5.2.1 Resampling 112
      7.5.2.2 Creation 113
      7.5.2.3 Partitioning 113
    7.5.3 Manipulating the Target Attribute Representation 114
    7.5.4 Partitioning the Search Space 115
      7.5.4.1 Divide and Conquer 116
      7.5.4.2 Feature Subset-based Ensemble Methods 117
    7.5.5 Multi-Inducers 121
    7.5.6 Measuring the Diversity 122
  7.6 Ensemble Size 124
    7.6.1 Selecting the Ensemble Size 124
    7.6.2 Pre Selection of the Ensemble Size 124
    7.6.3 Selection of the Ensemble Size while Training 125
    7.6.4 Pruning - Post Selection of the Ensemble Size 125
      7.6.4.1 Pre-combining Pruning 126
      7.6.4.2 Post-combining Pruning 126
  7.7 Cross-Inducer 127
  7.8 Multistrategy Ensemble Learning 127
  7.9 Which Ensemble Method Should be Used? 128
  7.10 Open Source for Decision Trees Forests 128

8. Incremental Learning of Decision Trees 131
  8.1 Overview 131
  8.2 The Motives for Incremental Learning 131
  8.3 The Inefficiency Challenge 132
  8.4 The Concept Drift Challenge 133

9. Feature Selection 137
  9.1 Overview 137
  9.2 The "Curse of Dimensionality" 137
  9.3 Techniques for Feature Selection 140
    9.3.1 Feature Filters 141
      9.3.1.1 FOCUS 141
      9.3.1.2 LVF 141
      9.3.1.3 Using One Learning Algorithm as a Filter for Another 141
      9.3.1.4 An Information Theoretic Feature Filter 142
      9.3.1.5 An Instance Based Approach to Feature Selection - RELIEF 142
      9.3.1.6 Simba and G-flip 142
      9.3.1.7 Contextual Merit Algorithm 143
    9.3.2 Using Traditional Statistics for Filtering 143
      9.3.2.1 Mallows Cp 143
      9.3.2.2 AIC, BIC and F-ratio 144
      9.3.2.3 Principal Component Analysis (PCA) 144
      9.3.2.4 Factor Analysis (FA) 145
      9.3.2.5 Projection Pursuit 145
    9.3.3 Wrappers 145
      9.3.3.1 Wrappers for Decision Tree Learners 145
  9.4 Feature Selection as a Means of Creating Ensembles 146
  9.5 Ensemble Methodology as a Means for Improving Feature Selection 147
    9.5.1 Independent Algorithmic Framework 149
    9.5.2 Combining Procedure 150
      9.5.2.1 Simple Weighted Voting 151
      9.5.2.2 Naïve Bayes Weighting using Artificial Contrasts 152
    9.5.3 Feature Ensemble Generator 154
      9.5.3.1 Multiple Feature Selectors 154
      9.5.3.2 Bagging 156
  9.6 Using Decision Trees for Feature Selection 156
  9.7 Limitation of Feature Selection Methods 157

10. Fuzzy Decision Trees 159
  10.1 Overview 159
  10.2 Membership Function 160
  10.3 Fuzzy Classification Problems 161
  10.4 Fuzzy Set Operations 163
  10.5 Fuzzy Classification Rules 164
  10.6 Creating Fuzzy Decision Tree 164
    10.6.1 Fuzzifying Numeric Attributes 165
    10.6.2 Inducing of Fuzzy Decision Tree 166
  10.7 Simplifying the Decision Tree 169
  10.8 Classification of New Instances 169
  10.9 Other Fuzzy Decision Tree Inducers 169

11. Hybridization of Decision Trees with other Techniques 171
  11.1 Introduction 171
  11.2 A Decision Tree Framework for Instance-Space Decomposition 171
    11.2.1 Stopping Rules 174
    11.2.2 Splitting Rules 175
    11.2.3 Split Validation Examinations 175
  11.3 The CPOM Algorithm 176
    11.3.1 CPOM Outline 176
    11.3.2 The Grouped Gain Ratio Splitting Rule 177
  11.4 Induction of Decision Trees by an Evolutionary Algorithm 179

12. Sequence Classification Using Decision Trees 187
  12.1 Introduction 187
  12.2 Sequence Representation 187
  12.3 Pattern Discovery 188
  12.4 Pattern Selection 190
    12.4.1 Heuristics for Pattern Selection 190
    12.4.2 Correlation based Feature Selection 191
  12.5 Classifier Training 191
    12.5.1 Adjustment of Decision Trees 192
    12.5.2 Cascading Decision Trees 192
  12.6 Application of CREDT in Improving of Information Retrieval of Medical Narrative Reports 193
    12.6.1 Related Works 195
      12.6.1.1 Text Classification 195
      12.6.1.2 Part-of-speech Tagging 198
      12.6.1.3 Frameworks for Information Extraction 198
      12.6.1.4 Frameworks for Labeling Sequential Data 199
      12.6.1.5 Identifying Negative Context in Non-domain Specific Text (General NLP) 199
      12.6.1.6 Identifying Negative Context in Medical Narratives 200
      12.6.1.7 Works Based on Knowledge Engineering 200
      12.6.1.8 Works based on Machine Learning 201
    12.6.2 Using CREDT for Solving the Negation Problem 201
      12.6.2.1 The Process Overview 201
      12.6.2.2 Step 1: Corpus Preparation 201
      12.6.2.3 Step 1.1: Tagging 202
      12.6.2.4 Step 1.2: Sentence Boundaries 202
      12.6.2.5 Step 1.3: Manual Labeling 203
      12.6.2.6 Step 2: Patterns Creation 203
      12.6.2.7 Step 3: Patterns Selection 206
      12.6.2.8 Step 4: Classifier Training 208
      12.6.2.9 Cascade of Three Classifiers 209

Bibliography 215
Index 243
any_adam_object | 1 |
author | Rokach, Lior |
author_GND | (DE-588)143250833 |
author_facet | Rokach, Lior |
author_role | aut |
author_sort | Rokach, Lior |
author_variant | l r lr |
building | Verbundindex |
bvnumber | BV036128331 |
classification_rvk | ST 530 |
ctrlnum | (OCoLC)634520852 (DE-599)BVBBV036128331 |
discipline | Informatik |
edition | 1. publ., repr. |
format | Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01706nam a2200421zcb4500</leader><controlfield tag="001">BV036128331</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20110424 </controlfield><controlfield tag="007">t</controlfield><controlfield tag="008">100420s2010 xxkd||| |||| 00||| eng d</controlfield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9789812771711</subfield><subfield code="9">978-981-277-171-1</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9812771719</subfield><subfield code="9">981-277-171-9</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)634520852</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV036128331</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">aacr</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">xxk</subfield><subfield code="c">GB</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-19</subfield><subfield code="a">DE-473</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 530</subfield><subfield code="0">(DE-625)143679:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Rokach, Lior</subfield><subfield code="e">Verfasser</subfield><subfield code="4">aut</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Data mining with decision trees</subfield><subfield code="b">theory and applications</subfield><subfield code="c">Lior Rokach ; Oded Maimon</subfield></datafield><datafield tag="250" ind1=" " ind2=" "><subfield code="a">1. publ., repr.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">New Jersey [u.a.]</subfield><subfield code="b">World Scientific</subfield><subfield code="c">2010</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XVIII, 244 S.</subfield><subfield code="b">graph. Darst.</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="b">n</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="b">nc</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="490" ind1="1" ind2=" "><subfield code="a">Series in machine perception and artificial intelligence</subfield><subfield code="v">69</subfield></datafield><datafield tag="500" ind1=" " ind2=" "><subfield code="a">Literaturverz. S. 
215 - 242</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Entscheidungsbaum</subfield><subfield code="0">(DE-588)4347788-4</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Data Mining</subfield><subfield code="0">(DE-588)4428654-5</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Data Mining</subfield><subfield code="0">(DE-588)4428654-5</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2="1"><subfield code="a">Entscheidungsbaum</subfield><subfield code="0">(DE-588)4347788-4</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Maimon, Oded Z.</subfield><subfield code="e">Sonstige</subfield><subfield code="0">(DE-588)143250833</subfield><subfield code="4">oth</subfield></datafield><datafield tag="830" ind1=" " ind2="0"><subfield code="a">Series in machine perception and artificial intelligence</subfield><subfield code="v">69</subfield><subfield code="w">(DE-604)BV006668231</subfield><subfield code="9">69</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">Digitalisierung UB Bamberg</subfield><subfield code="q">application/pdf</subfield><subfield code="u">http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=020210683&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-020210683</subfield></datafield></record></collection> |
id | DE-604.BV036128331 |
illustrated | Illustrated |
indexdate | 2024-07-09T22:37:31Z |
institution | BVB |
isbn | 9789812771711 9812771719 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-020210683 |
oclc_num | 634520852 |
open_access_boolean | |
owner | DE-19 DE-BY-UBM DE-473 DE-BY-UBG |
owner_facet | DE-19 DE-BY-UBM DE-473 DE-BY-UBG |
physical | XVIII, 244 S. graph. Darst. |
publishDate | 2010 |
publishDateSearch | 2010 |
publishDateSort | 2010 |
publisher | World Scientific |
record_format | marc |
series | Series in machine perception and artificial intelligence |
series2 | Series in machine perception and artificial intelligence |
spelling | Rokach, Lior Verfasser aut Data mining with decision trees theory and applications Lior Rokach ; Oded Maimon 1. publ., repr. New Jersey [u.a.] World Scientific 2010 XVIII, 244 S. graph. Darst. txt rdacontent n rdamedia nc rdacarrier Series in machine perception and artificial intelligence 69 Literaturverz. S. 215 - 242 Entscheidungsbaum (DE-588)4347788-4 gnd rswk-swf Data Mining (DE-588)4428654-5 gnd rswk-swf Data Mining (DE-588)4428654-5 s Entscheidungsbaum (DE-588)4347788-4 s DE-604 Maimon, Oded Z. Sonstige (DE-588)143250833 oth Series in machine perception and artificial intelligence 69 (DE-604)BV006668231 69 Digitalisierung UB Bamberg application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=020210683&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Rokach, Lior Data mining with decision trees theory and applications Series in machine perception and artificial intelligence Entscheidungsbaum (DE-588)4347788-4 gnd Data Mining (DE-588)4428654-5 gnd |
subject_GND | (DE-588)4347788-4 (DE-588)4428654-5 |
title | Data mining with decision trees theory and applications |
title_auth | Data mining with decision trees theory and applications |
title_exact_search | Data mining with decision trees theory and applications |
title_full | Data mining with decision trees theory and applications Lior Rokach ; Oded Maimon |
title_fullStr | Data mining with decision trees theory and applications Lior Rokach ; Oded Maimon |
title_full_unstemmed | Data mining with decision trees theory and applications Lior Rokach ; Oded Maimon |
title_short | Data mining with decision trees |
title_sort | data mining with decision trees theory and applications |
title_sub | theory and applications |
topic | Entscheidungsbaum (DE-588)4347788-4 gnd Data Mining (DE-588)4428654-5 gnd |
topic_facet | Entscheidungsbaum Data Mining |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=020210683&sequence=000002&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
volume_link | (DE-604)BV006668231 |
work_keys_str_mv | AT rokachlior dataminingwithdecisiontreestheoryandapplications AT maimonodedz dataminingwithdecisiontreestheoryandapplications |