Feature extraction: foundations and applications
Saved in:
Format: | Kit Book |
---|---|
Language: | English |
Published: |
Berlin [u.a]
Springer
2006
|
Series: | Studies in fuzziness and soft computing
207 |
Subjects: | |
Online access: | Table of contents |
Description: | XXIV, 778 pages, illustrations, diagrams, 1 CD-ROM (12 cm) |
ISBN: | 9783540354871 3540354875 |
Internal format
MARC
LEADER | 00000nom a2200000 cb4500 | ||
---|---|---|---|
001 | BV021799129 | ||
003 | DE-604 | ||
005 | 20140710 | ||
008 | 061107s2006 gw ||| 0| bneng d | ||
016 | 7 | |a 97984083X |2 DE-101 | |
020 | |a 9783540354871 |9 978-3-540-35487-1 | ||
020 | |a 3540354875 |9 3-540-35487-5 | ||
035 | |a (OCoLC)633075017 | ||
035 | |a (DE-599)BVBBV021799129 | ||
040 | |a DE-604 |b ger |e rakddb | ||
041 | 0 | |a eng | |
044 | |a gw |c XA-DE-BW | ||
049 | |a DE-20 |a DE-83 |a DE-11 | ||
082 | 0 | |a 006.301519 |2 22/ger | |
084 | |a ST 302 |0 (DE-625)143652: |2 rvk | ||
084 | |a ST 330 |0 (DE-625)143663: |2 rvk | ||
084 | |a 004 |2 sdnb | ||
245 | 1 | 0 | |a Feature extraction |b foundations and applications |c Isabelle Guyon ... eds. |
264 | 1 | |a Berlin [u.a] |b Springer |c 2006 | |
300 | |a XXIV, 778 S. |b Ill., graph. Darst. |e 1 CD-ROM (12 cm) | ||
490 | 1 | |a Studies in fuzziness and soft computing |v 207 | |
650 | 0 | 7 | |a Merkmalsextraktion |0 (DE-588)4314440-8 |2 gnd |9 rswk-swf |
655 | 7 | |0 (DE-588)4143413-4 |a Aufsatzsammlung |2 gnd-content | |
689 | 0 | 0 | |a Merkmalsextraktion |0 (DE-588)4314440-8 |D s |
689 | 0 | |5 DE-604 | |
700 | 1 | |a Guyon, Isabelle |e Sonstige |4 oth | |
830 | 0 | |a Studies in fuzziness and soft computing |v 207 |w (DE-604)BV021858135 |9 207 | |
856 | 4 | 2 | |m GBV Datenaustausch |q application/pdf |u http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=015011666&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |3 Inhaltsverzeichnis |
999 | |a oai:aleph.bib-bvb.de:BVB01-015011666 |
Record in the search index
_version_ | 1804135713722073088 |
---|---|
adam_text | Isabelle Guyon, Steve Gunn, Masoud Nikravesh, Lotfi A. Zadeh (eds.): Feature Extraction: Foundations and Applications. Springer.

Contents

An Introduction to Feature Extraction (Isabelle Guyon, André Elisseeff) 1
1 Feature extraction basics 1; 2 What is new in feature extraction? 7; 3 Getting started 9; 4 Advanced topics and open problems 16; 5 Conclusion 22; References 23; A Forward selection with Gram-Schmidt orthogonalization 24; B Justification of the computational complexity estimates 25

Part I: Feature Extraction Fundamentals

1 Learning Machines (Norbert Jankowski, Krzysztof Grabczewski) 29
1.1 Introduction 29; 1.2 The learning problem 29; 1.3 Learning algorithms 35; 1.4 Some remarks on learning algorithms 57; References 58

2 Assessment Methods (Gérard Dreyfus, Isabelle Guyon) 65
2.1 Introduction 65; 2.2 A statistical view of feature selection: hypothesis tests and random probes 66; 2.3 A machine learning view of feature selection 78; 2.4 Conclusion 86

3 Filter Methods (Wlodzislaw Duch) 89
3.1 Introduction to filter methods for feature selection 89; 3.2 General issues related to filters 91; 3.3 Correlation-based filters 96; 3.4 Relevance indices based on distances between distributions 99; 3.5 Relevance measures based on information theory 101; 3.6 Decision trees for filtering 104; 3.7 Reliability and bias of relevance indices 106; 3.8 Filters for feature selection 108; 3.9 Summary and comparison 110; 3.10 Discussion and conclusions 113; References 114

4 Search Strategies (Juha Reunanen) 119
4.1 Introduction 119; 4.2 Optimal results 119; 4.3 Sequential selection 121; 4.4 Extensions to sequential selection 123; 4.5 Stochastic search 129; 4.6 On the different levels of testing 133; 4.7 The best strategy? 134; References 135

5 Embedded Methods (Thomas Navin Lal, Olivier Chapelle, Jason Weston, André Elisseeff) 137
5.1 Introduction 137; 5.2 Forward-backward methods 139; 5.3 Optimization of scaling factors 150; 5.4 Sparsity term 156; 5.5 Discussions and conclusions 161; References 162

6 Information-Theoretic Methods (Kari Torkkola) 167
6.1 Introduction 167; 6.2 What is relevance? 167; 6.3 Information theory 169; 6.4 Information-theoretic criteria for variable selection 172; 6.5 MI for feature construction 178; 6.6 Information theory in learning distance metrics 179; 6.7 Information bottleneck and variants 180; 6.8 Discussion 181; References 182

7 Ensemble Learning (Eugene Tuv) 187
7.1 Introduction 187; 7.2 Overview of ensemble methods 188; 7.3 Variable selection and ranking with tree ensembles 191; 7.4 Bayesian voting 200; 7.5 Discussions 201; References 203

8 Fuzzy Neural Networks (Madan M. Gupta, Noriyasu Homma, Zeng-Guang Hou) 205
8.1 Introduction 205; 8.2 Fuzzy sets and systems: an overview 207; 8.3 Building fuzzy neurons using fuzzy arithmetic and fuzzy logic operations 215; 8.4 Hybrid fuzzy neural networks (HFNNs) 222; 8.5 Concluding remarks 230; References 231

Part II: Feature Selection Challenge

9 Design and Analysis of the NIPS2003 Challenge (Isabelle Guyon, Steve Gunn, Asa Ben Hur, Gideon Dror) 237
9.1 Introduction 237; 9.2 Benchmark design 239; 9.3 Challenge results 245; 9.4 Post-challenge verifications 253; 9.5 Conclusions and future work 259; References 260; A Details about the fifty feature subset study 261

10 High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees (Radford M. Neal, Jianguo Zhang) 265
10.1 Bayesian models vs. learning machines 266; 10.2 Selecting features with univariate tests 269; 10.3 Reducing dimensionality using PCA 271; 10.4 Bayesian logistic regression 271; 10.5 Bayesian neural network models 274; 10.6 Dirichlet diffusion tree models 276; 10.7 Methods and results for the challenge data sets 280; 10.8 Conclusions 294; References 295

11 Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems (Kari Torkkola, Eugene Tuv) 297
11.1 Introduction 297; 11.2 Regularized least-squares classification (RLSC) 298; 11.3 Model averaging and regularization 300; 11.4 Variable filtering with tree-based ensembles 301; 11.5 Experiments with challenge data sets 302; 11.6 Future directions 310; 11.7 Conclusion 312; References 313

12 Combining SVMs with Various Feature Selection Strategies (Yi-Wei Chen, Chih-Jen Lin) 315
12.1 Introduction 315; 12.2 Support vector classification 316; 12.3 Feature selection strategies 317; 12.4 Experimental results 320; 12.5 Competition results 321; 12.6 Discussion and conclusion 322; References 323

13 Feature Selection with Transductive Support Vector Machines (Zhili Wu, Chunhung Li) 325
13.1 Introduction 325; 13.2 SVMs and transductive SVMs 326; 13.3 Feature selection methods related with SVMs 329; 13.4 Transductive SVM-related feature selection 331; 13.5 Experimentation 332; 13.6 Conclusion and discussion 339; 13.7 Acknowledgement 340; References 340

14 Variable Selection using Correlation and Single Variable Classifier Methods: Applications (Amir Reza Saffari Azar Alamdari) 343
14.1 Introduction 343; 14.2 Introduction to correlation and single variable classifier methods 344; 14.3 Ensemble averaging 347; 14.4 Applications to NIPS 2003 feature selection challenge 350; 14.5 Conclusion 355; References 357

15 Tree-Based Ensembles with Dynamic Soft Feature Selection (Alexander Borisov, Victor Eruhimov, Eugene Tuv) 359
15.1 Background 359; 15.2 Dynamic feature selection 361; 15.3 Experimental results 367; 15.4 Summary 373; References 374

16 Sparse, Flexible and Efficient Modeling using L1 Regularization (Saharon Rosset, Ji Zhu) 375
16.1 Introduction 375; 16.2 The L1-norm penalty 380; 16.3 Piecewise linear solution paths 384; 16.4 A robust, efficient and adaptable method for classification 388; 16.5 Results on the NIPS-03 challenge datasets 390; 16.6 Conclusion 392; References 393

17 Margin Based Feature Selection and Infogain with Standard Classifiers (Ran Gilad-Bachrach, Amir Navot) 395
17.1 Methods 395; 17.2 Results 398; 17.3 Discussion 399; References 400

18 Bayesian Support Vector Machines for Feature Ranking and Selection (Wei Chu, S. Sathiya Keerthi, Chong Jin Ong, Zoubin Ghahramani) 403
18.1 Introduction 403; 18.2 Bayesian framework 404; 18.3 Post-processing for feature selection 410; 18.4 Numerical experiments 414; 18.5 Conclusion 416; References 416

19 Nonlinear Feature Selection with the Potential Support Vector Machine (Sepp Hochreiter, Klaus Obermayer) 419
19.1 Introduction 419; 19.2 The potential support vector machine 420; 19.3 P-SVM discussion and redundancy control 424; 19.4 Nonlinear P-SVM feature selection 427; 19.5 Experiments 429; 19.6 Conclusion 436; References 436

20 Combining a Filter Method with SVMs (Thomas Navin Lal, Olivier Chapelle, Bernhard Schoelkopf) 439
20.1 The parameters σ and C of the SVM 439; 20.2 Feature ranking 440; 20.3 Number of features 441; 20.4 Summary 443; References 445

21 Feature Selection via Sensitivity Analysis with Direct Kernel PLS (Mark J. Embrechts, Robert A. Bress, Robert H. Kewley) 447
21.1 Introduction 447; 21.2 Partial least squares regression (PLS) 448; 21.3 Regression models based on direct kernels 450; 21.4 Dealing with the bias: centering the kernel 452; 21.5 Metrics for assessing the model quality 453; 21.6 Data conditioning and preprocessing 454; 21.7 Sensitivity analysis 455; 21.8 Heuristic feature selection policies for the NIPS feature selection challenge 456; 21.9 Benchmarks 459; 21.10 Conclusions 460; References 461

22 Information Gain, Correlation and Support Vector Machines (Danny Roobaert, Grigoris Karakoulas, Nitesh V. Chawla) 463
22.1 Introduction 463; 22.2 Description of approach 464; 22.3 Final results 467; 22.4 Alternative approaches pursued 468; 22.5 Discussion and conclusion 469; References 470

23 Mining for Complex Models Comprising Feature Selection and Classification (Krzysztof Grabczewski, Norbert Jankowski) 471
23.1 Introduction 471; 23.2 Fundamental algorithms 472; 23.3 Fully operational complex models 481; 23.4 Challenge data exploration 483; 23.5 Conclusions 486

24 Combining Information-Based Supervised and Unsupervised Feature Selection (Sang-Kyun Lee, Seung-Joon Yi, Byoung-Tak Zhang) 489
24.1 Introduction 489; 24.2 Methods 490; 24.3 Experiments 494; 24.4 Conclusions 496; References 498

25 An Enhanced Selective Naive Bayes Method with Optimal Discretization (Marc Boullé) 499
25.1 Introduction 499; 25.2 The enhanced selective naive Bayes method 500; 25.3 The MODL discretization method 502; 25.4 Results on the NIPS challenge 504; 25.5 Conclusion 506; References 506

26 An Input Variable Importance Definition Based on Empirical Data Probability Distribution (V. Lemaire, F. Clérot) 509
26.1 Introduction 509; 26.2 Analysis of an input variable influence 510; 26.3 Application to feature subset selection 512; 26.4 Results on the NIPS feature selection challenge 513; 26.5 Conclusions 516; References 516

Part III: New Perspectives in Feature Extraction

27 Spectral Dimensionality Reduction (Yoshua Bengio, Olivier Delalleau, Nicolas Le Roux, Jean-François Paiement, Pascal Vincent, Marie Ouimet) 519
27.1 Introduction 519; 27.2 Data-dependent kernels for spectral embedding algorithms 524; 27.3 Kernel eigenfunctions for induction 532; 27.4 Learning criterion for the leading eigenfunctions 539; 27.5 Experiments 541; 27.6 Conclusion 544; References 547

28 Constructing Orthogonal Latent Features for Arbitrary Loss (Michinari Momma, Kristin P. Bennett) 551
28.1 Introduction 551; 28.2 General framework in BLF 554; 28.3 BLF with linear functions 557; 28.4 Convergence properties of BLF 561; 28.5 PLS and BLF 563; 28.6 BLF for arbitrary loss 564; 28.7 Kernel BLF 571; 28.8 Computational results 572; 28.9 Conclusion 581; References 582

29 Large Margin Principles for Feature Selection (Ran Gilad-Bachrach, Amir Navot, Naftali Tishby) 585
29.1 Introduction 585; 29.2 Margins 586; 29.3 Algorithms 589; 29.4 Theoretical analysis 592; 29.5 Empirical assessment 593; 29.6 Discussion and further research directions 602; References 604; A Complementary proofs 604

30 Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study (Ilya Levner, Vadim Bulitko, Guohui Lin) 607
30.1 Introduction 607; 30.2 Existing feature extraction and classification methods 611; 30.3 Experimental results 615; 30.4 Conclusion 622; 30.5 Acknowledgements 623; References 623

31 Sequence Motifs: Highly Predictive Features of Protein Function (Asa Ben-Hur, Douglas Brutlag) 625
31.1 Introduction 625; 31.2 Enzyme classification 627; 31.3 Methods 628; 31.4 Results 636; 31.5 Discussion 642; 31.6 Conclusion 643; References 643

Appendix A: Elementary Statistics (Gérard Dreyfus) 649
1 Basic principles 649; 2 Estimating and learning 651; 3 Some additional useful probability distributions 654; 4 Confidence intervals 655; 5 Hypothesis testing 657; 6 Probably approximately correct (PAC) learning and guaranteed estimators 660; References 662

Appendix B: Feature Selection Challenge Datasets
Experimental design (Isabelle Guyon) 665; Arcene 669; Gisette 677; Dexter 683; Dorothea 687; Madelon 691; Matlab code of the Lambda method 697; Matlab code used to generate Madelon 699

Appendix C: Feature Selection Challenge Fact Sheets
10 High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees (Radford M. Neal, Jianguo Zhang) 707
11 Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems (Kari Torkkola, Eugene Tuv) 709
12 Combining SVMs with Various Feature Selection Strategies (Yi-Wei Chen, Chih-Jen Lin) 711
13 Feature Selection with Transductive Support Vector Machines (Zhili Wu, Chunhung Li) 713
14 Variable Selection using Correlation and SVC Methods: Applications (Amir Reza Saffari Azar Alamdari) 715
15 Tree-Based Ensembles with Dynamic Soft Feature Selection (Alexander Borisov, Victor Eruhimov, Eugene Tuv) 717
16 Sparse, Flexible and Efficient Modeling using L1 Regularization (Saharon Rosset, Ji Zhu) 719
17 Margin Based Feature Selection and Infogain with Standard Classifiers (Ran Gilad-Bachrach, Amir Navot) 721
18 Bayesian Support Vector Machines for Feature Ranking and Selection (Wei Chu, S. Sathiya Keerthi, Chong Jin Ong, Zoubin Ghahramani) 723
19 Nonlinear Feature Selection with the Potential Support Vector Machine (Sepp Hochreiter, Klaus Obermayer) 725
20 Combining a Filter Method with SVMs (Thomas Navin Lal, Olivier Chapelle, Bernhard Schoelkopf) 729
21 Feature Selection via Sensitivity Analysis with Direct Kernel PLS (Mark J. Embrechts, Robert A. Bress, Robert H. Kewley) 731
22 Information Gain, Correlation and Support Vector Machines (Danny Roobaert, Grigoris Karakoulas, Nitesh V. Chawla) 733
23 Mining for Complex Models Comprising Feature Selection and Classification (Krzysztof Grabczewski, Norbert Jankowski) 735
24 Combining Information-Based Supervised and Unsupervised Feature Selection (Sang-Kyun Lee, Seung-Joon Yi, Byoung-Tak Zhang) 737
25 An Enhanced Selective Naive Bayes Method with Optimal Discretization (Marc Boullé) 741
26 An Input Variable Importance Definition Based on Empirical Data Probability Distribution (V. Lemaire, F. Clérot) 743

Appendix D: Feature Selection Challenge Results Tables
Result tables of the NIPS2003 challenge (Isabelle Guyon, Steve Gunn) 747; Arcene 749; Dexter 753; Dorothea 757; Gisette 761; Madelon 765; Overall results 769

Index 773 |
any_adam_object | 1 |
any_adam_object_boolean | 1 |
building | Verbundindex |
bvnumber | BV021799129 |
classification_rvk | ST 302 ST 330 |
ctrlnum | (OCoLC)633075017 (DE-599)BVBBV021799129 |
dewey-full | 006.301519 |
dewey-hundreds | 000 - Computer science, information, general works |
dewey-ones | 006 - Special computer methods |
dewey-raw | 006.301519 |
dewey-search | 006.301519 |
dewey-sort | 16.301519 |
dewey-tens | 000 - Computer science, information, general works |
discipline | Informatik |
discipline_str_mv | Informatik |
format | Kit Book |
fullrecord | <?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>01529nom a2200373 cb4500</leader><controlfield tag="001">BV021799129</controlfield><controlfield tag="003">DE-604</controlfield><controlfield tag="005">20140710 </controlfield><controlfield tag="008">061107s2006 gw ||| 0| bneng d</controlfield><datafield tag="016" ind1="7" ind2=" "><subfield code="a">97984083X</subfield><subfield code="2">DE-101</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">9783540354871</subfield><subfield code="9">978-3-540-35487-1</subfield></datafield><datafield tag="020" ind1=" " ind2=" "><subfield code="a">3540354875</subfield><subfield code="9">3-540-35487-5</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(OCoLC)633075017</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(DE-599)BVBBV021799129</subfield></datafield><datafield tag="040" ind1=" " ind2=" "><subfield code="a">DE-604</subfield><subfield code="b">ger</subfield><subfield code="e">rakddb</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="044" ind1=" " ind2=" "><subfield code="a">gw</subfield><subfield code="c">XA-DE-BW</subfield></datafield><datafield tag="049" ind1=" " ind2=" "><subfield code="a">DE-20</subfield><subfield code="a">DE-83</subfield><subfield code="a">DE-11</subfield></datafield><datafield tag="082" ind1="0" ind2=" "><subfield code="a">006.301519</subfield><subfield code="2">22/ger</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 302</subfield><subfield code="0">(DE-625)143652:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield code="a">ST 330</subfield><subfield code="0">(DE-625)143663:</subfield><subfield code="2">rvk</subfield></datafield><datafield tag="084" ind1=" " ind2=" "><subfield 
code="a">004</subfield><subfield code="2">sdnb</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Feature extraction</subfield><subfield code="b">foundations and applications</subfield><subfield code="c">Isabelle Guyon ... eds.</subfield></datafield><datafield tag="264" ind1=" " ind2="1"><subfield code="a">Berlin [u.a]</subfield><subfield code="b">Springer</subfield><subfield code="c">2006</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">XXIV, 778 S.</subfield><subfield code="b">Ill., graph. Darst.</subfield><subfield code="e">1 CD-ROM (12 cm)</subfield></datafield><datafield tag="490" ind1="1" ind2=" "><subfield code="a">Studies in fuzziness and soft computing</subfield><subfield code="v">207</subfield></datafield><datafield tag="650" ind1="0" ind2="7"><subfield code="a">Merkmalsextraktion</subfield><subfield code="0">(DE-588)4314440-8</subfield><subfield code="2">gnd</subfield><subfield code="9">rswk-swf</subfield></datafield><datafield tag="655" ind1=" " ind2="7"><subfield code="0">(DE-588)4143413-4</subfield><subfield code="a">Aufsatzsammlung</subfield><subfield code="2">gnd-content</subfield></datafield><datafield tag="689" ind1="0" ind2="0"><subfield code="a">Merkmalsextraktion</subfield><subfield code="0">(DE-588)4314440-8</subfield><subfield code="D">s</subfield></datafield><datafield tag="689" ind1="0" ind2=" "><subfield code="5">DE-604</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Guyon, Isabelle</subfield><subfield code="e">Sonstige</subfield><subfield code="4">oth</subfield></datafield><datafield tag="830" ind1=" " ind2="0"><subfield code="a">Studies in fuzziness and soft computing</subfield><subfield code="v">207</subfield><subfield code="w">(DE-604)BV021858135</subfield><subfield code="9">207</subfield></datafield><datafield tag="856" ind1="4" ind2="2"><subfield code="m">GBV Datenaustausch</subfield><subfield code="q">application/pdf</subfield><subfield 
code="u">http://bvbr.bib-bvb.de:8991/F?func=service&amp;doc_library=BVB01&amp;local_base=BVB01&amp;doc_number=015011666&amp;sequence=000001&amp;line_number=0001&amp;func_code=DB_RECORDS&amp;service_type=MEDIA</subfield><subfield code="3">Inhaltsverzeichnis</subfield></datafield><datafield tag="999" ind1=" " ind2=" "><subfield code="a">oai:aleph.bib-bvb.de:BVB01-015011666</subfield></datafield></record></collection> |
genre | (DE-588)4143413-4 Aufsatzsammlung gnd-content |
genre_facet | Aufsatzsammlung |
id | DE-604.BV021799129 |
illustrated | Illustrated |
index_date | 2024-07-02T15:47:13Z |
indexdate | 2024-07-09T20:44:53Z |
institution | BVB |
isbn | 9783540354871 3540354875 |
language | English |
oai_aleph_id | oai:aleph.bib-bvb.de:BVB01-015011666 |
oclc_num | 633075017 |
open_access_boolean | |
owner | DE-20 DE-83 DE-11 |
owner_facet | DE-20 DE-83 DE-11 |
physical | XXIV, 778 S. Ill., graph. Darst. 1 CD-ROM (12 cm) |
publishDate | 2006 |
publishDateSearch | 2006 |
publishDateSort | 2006 |
publisher | Springer |
record_format | marc |
series | Studies in fuzziness and soft computing |
series2 | Studies in fuzziness and soft computing |
spelling | Feature extraction foundations and applications Isabelle Guyon ... eds. Berlin [u.a] Springer 2006 XXIV, 778 S. Ill., graph. Darst. 1 CD-ROM (12 cm) Studies in fuzziness and soft computing 207 Merkmalsextraktion (DE-588)4314440-8 gnd rswk-swf (DE-588)4143413-4 Aufsatzsammlung gnd-content Merkmalsextraktion (DE-588)4314440-8 s DE-604 Guyon, Isabelle Sonstige oth Studies in fuzziness and soft computing 207 (DE-604)BV021858135 207 GBV Datenaustausch application/pdf http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=015011666&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA Inhaltsverzeichnis |
spellingShingle | Feature extraction foundations and applications Studies in fuzziness and soft computing Merkmalsextraktion (DE-588)4314440-8 gnd |
subject_GND | (DE-588)4314440-8 (DE-588)4143413-4 |
title | Feature extraction foundations and applications |
title_auth | Feature extraction foundations and applications |
title_exact_search | Feature extraction foundations and applications |
title_exact_search_txtP | Feature extraction foundations and applications |
title_full | Feature extraction foundations and applications Isabelle Guyon ... eds. |
title_fullStr | Feature extraction foundations and applications Isabelle Guyon ... eds. |
title_full_unstemmed | Feature extraction foundations and applications Isabelle Guyon ... eds. |
title_short | Feature extraction |
title_sort | feature extraction foundations and applications |
title_sub | foundations and applications |
topic | Merkmalsextraktion (DE-588)4314440-8 gnd |
topic_facet | Merkmalsextraktion Aufsatzsammlung |
url | http://bvbr.bib-bvb.de:8991/F?func=service&doc_library=BVB01&local_base=BVB01&doc_number=015011666&sequence=000001&line_number=0001&func_code=DB_RECORDS&service_type=MEDIA |
volume_link | (DE-604)BV021858135 |
work_keys_str_mv | AT guyonisabelle featureextractionfoundationsandapplications |