Optimal Bayesian classification:
Saved in:
Main authors: | Dalton, Lori A.; Dougherty, Edward R. |
---|---|
Format: | Electronic eBook |
Language: | English |
Published: | Bellingham, Washington, USA : SPIE Press, [2020] |
Subjects: | Bayesian statistical decision theory; Statistical decision |
Online access: | FHD01 full text |
Summary: | "The most basic problem of engineering is the design of optimal operators. Design takes different forms depending on the random process constituting the scientific model and the operator class of interest. This book treats classification, where the underlying random process is a feature-label distribution, and an optimal operator is a Bayes classifier, which is a classifier minimizing the classification error. With sufficient knowledge we can construct the feature-label distribution and thereby find a Bayes classifier. Rarely do we possess such knowledge. On the other hand, if we had unlimited data, we could accurately estimate the feature-label distribution and obtain a Bayes classifier. Rarely do we possess sufficient data. The aim of this book is to best use whatever knowledge and data are available to design a classifier. The book takes a Bayesian approach to modeling the feature-label distribution and designs an optimal classifier relative to a posterior distribution governing an uncertainty class of feature-label distributions. In this way it takes full advantage of knowledge regarding the underlying system and the available data. Its origins lie in the need to estimate classifier error when there is insufficient data to hold out test data, in which case an optimal error estimate can be obtained relative to the uncertainty class. A natural next step is to forgo classical ad hoc classifier design and simply find an optimal classifier relative to the posterior distribution over the uncertainty class, this being an optimal Bayesian classifier"-- |
Description: | 1 online resource |
ISBN: | 9781510630710 |
DOI: | 10.1117/3.2540669 |
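The summary contrasts a Bayes classifier (optimal when the feature-label distribution is known) with an optimal Bayesian classifier (optimal on average over a posterior-weighted uncertainty class of distributions). A minimal illustrative sketch, not taken from the book: the toy two-Gaussian model, the function names, and the simulated posterior samples are all assumptions made here for illustration.

```python
import numpy as np

# Toy setting: two equally likely classes with unit-variance Gaussian features.

def bayes_classifier(x, mu0, mu1):
    """Bayes classifier for KNOWN class means: with equal priors and equal
    variances, it labels each x by the nearer mean (midpoint threshold)."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x - mu1) < np.abs(x - mu0)).astype(int)

def optimal_bayesian_classifier(x, posterior_samples):
    """When the means are unknown, average the class-conditional densities
    over posterior samples of (mu0, mu1) -- the uncertainty class -- and
    label each x by the larger posterior-expected density."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    dens0 = np.zeros_like(x)
    dens1 = np.zeros_like(x)
    for mu0, mu1 in posterior_samples:
        dens0 += np.exp(-0.5 * (x - mu0) ** 2)
        dens1 += np.exp(-0.5 * (x - mu1) ** 2)
    return (dens1 > dens0).astype(int)

rng = np.random.default_rng(0)
# Hypothetical posterior over the unknown means (e.g. from conjugate
# normal updates on a small labeled sample).
samples = [(rng.normal(-1.0, 0.2), rng.normal(1.0, 0.2)) for _ in range(200)]

labels_known = bayes_classifier([-2.0, 2.0], -1.0, 1.0)         # array([0, 1])
labels_obc = optimal_bayesian_classifier([-2.0, 2.0], samples)  # array([0, 1])
```

As the posterior concentrates on the true means (more data), the averaged densities approach the true ones and the optimal Bayesian classifier converges to the Bayes classifier, which is the convergence behavior the book's Chapter 4 analyzes.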
Internal format
MARC
LEADER  00000nmm a2200000 c 4500
001     BV046693671
003     DE-604
005     20200616
007     cr|uuu---uuuuu
008     200426s2020 |||| o||u| ||||||eng d
020     |a 9781510630710 |9 978-1-5106-3071-0
035     |a (OCoLC)1152208406
035     |a (DE-599)BVBBV046693671
040     |a DE-604 |b ger |e rda
041 0   |a eng
049     |a DE-1050
100 1   |a Dalton, Lori A. |e Verfasser |4 aut
245 1 0 |a Optimal Bayesian classification |c Lori A. Dalton, Edward R. Dougherty
264   1 |a Bellingham, Washington, USA |b SPIE Press |c [2020]
300     |a 1 Online-Ressource
336     |b txt |2 rdacontent
337     |b c |2 rdamedia
338     |b cr |2 rdacarrier
505 8   |a Preface -- Acknowledgments -- 1. Classification and error estimation: 1.1. Classifiers; 1.2. Constrained classifiers; 1.3. Error estimation; 1.4. Random versus separate sampling; 1.5. Epistemology and validity -- 2. Optimal Bayesian error estimation: 2.1. The Bayesian MMSE error estimator; 2.2. Evaluation of the Bayesian MMSE error estimator; 2.3. Performance evaluation at a fixed point; 2.4. Discrete model; 2.5. Gaussian model; 2.6. Performance in the Gaussian model with LDA; 2.7. Consistency of Bayesian error estimation; 2.8. Calibration; 2.9. Optimal Bayesian ROC-based analysis -- 3. Sample-conditioned MSE of error estimation: 3.1. Conditional MSE of error estimators; 3.2. Evaluation of the conditional MSE; 3.3. Discrete model; 3.4. Gaussian model; 3.5. Average performance in the Gaussian model; 3.6. Convergence of the sample-conditioned MSE; 3.7. A performance bound for the discrete model; 3.8. Censored sampling; 3.9. Asymptotic approximation of the RMS -- 4. Optimal Bayesian classification: 4.1. Optimal operator design under uncertainty; 4.2. Optimal Bayesian classifier; 4.3. Discrete model; 4.4. Gaussian model; 4.5. Transformations of the feature space; 4.6. Convergence of the optimal Bayesian classifier; 4.7. Robustness in the Gaussian model; 4.8. Intrinsically Bayesian robust classifiers; 4.9. Missing values; 4.10. Optimal sampling; 4.11. OBC for autoregressive dependent sampling
505 8   |a 5. Optimal Bayesian risk-based multi-class classification: 5.1. Bayes decision theory; 5.2. Bayesian risk estimation; 5.3. Optimal Bayesian risk classification; 5.4. Sample-conditioned MSE of risk estimation; 5.5. Efficient computation; 5.6. Evaluation of posterior mixed moments: discrete model; 5.7. Evaluation of posterior mixed moments: Gaussian models; 5.8. Simulations -- 6. Optimal Bayesian transfer learning: 6.1. Joint prior distribution; 6.2. Posterior distribution in the target domain; 6.3. Optimal Bayesian transfer learning classifier; 6.4. OBTLC with negative binomial distribution -- 7. Construction of prior distributions: 7.1. Prior construction using data from discarded features; 7.2. Prior knowledge from stochastic differential equations; 7.3. Maximal knowledge-driven information prior; 7.4. REMLP for a normal-Wishart prior -- References -- Index
520     |a "The most basic problem of engineering is the design of optimal operators. Design takes different forms depending on the random process constituting the scientific model and the operator class of interest. This book treats classification, where the underlying random process is a feature-label distribution, and an optimal operator is a Bayes classifier, which is a classifier minimizing the classification error. With sufficient knowledge we can construct the feature-label distribution and thereby find a Bayes classifier. Rarely do we possess such knowledge. On the other hand, if we had unlimited data, we could accurately estimate the feature-label distribution and obtain a Bayes classifier. Rarely do we possess sufficient data. The aim of this book is to best use whatever knowledge and data are available to design a classifier. The book takes a Bayesian approach to modeling the feature-label distribution and designs an optimal classifier relative to a posterior distribution governing an uncertainty class of feature-label distributions. In this way it takes full advantage of knowledge regarding the underlying system and the available data. Its origins lie in the need to estimate classifier error when there is insufficient data to hold out test data, in which case an optimal error estimate can be obtained relative to the uncertainty class. A natural next step is to forgo classical ad hoc classifier design and simply find an optimal classifier relative to the posterior distribution over the uncertainty class, this being an optimal Bayesian classifier"--
650   4 |a Bayesian statistical decision theory
650   4 |a Statistical decision
700 1   |a Dougherty, Edward R. |d 1945- |e Verfasser |0 (DE-588)172047439 |4 aut
776 0 8 |i Erscheint auch als |n Online-Ausgabe, epub |z 978-1-5106-3070-3
776 0 8 |i Erscheint auch als |n Online-Ausgabe, kindle edition |z 978-1-5106-3072-7
776 0 8 |i Erscheint auch als |n Druck-Ausgabe, paperback |z 978-1-5106-3069-7
856 4 0 |u https://doi.org/10.1117/3.2540669 |x Verlag |z URL des Erstveröffentlichers |3 Volltext
912     |a ZDB-50-SPI
999     |a oai:aleph.bib-bvb.de:BVB01-032104379
966 e   |u https://doi.org/10.1117/3.2540669 |l FHD01 |p ZDB-50-SPI |x Verlag |3 Volltext