F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way /
Saved in:
Main Author: Mukherjee, Sudipta
Other Authors: Herbrich, Ralf
Format: Electronic eBook
Language: English
Published: Birmingham, England ; Mumbai [India] : Packt Publishing, 2016.
Series: Community experience distilled.
Subjects: F♯ (Computer program language) ; Machine learning
Online Access: Full text
Description: Includes index.
Physical Description: 1 online resource (194 pages) : color illustrations, tables.
ISBN: 9781783989355, 1783989351
Internal format (MARC)
LEADER 00000cam a2200000Mi 4500
001    ZDB-4-EBA-ocn961851724
003    OCoLC
005    20241004212047.0
006    m o d
007    cr |n|||||||||
008    160226t20162016enka o 001 0 eng d
040 __ |a VT2 |b eng |e pn |c VT2 |d OCLCO |d COO |d OCLCQ |d OCLCF |d UOK |d N$T |d LVT |d G3B |d IGB |d STF |d OCLCO |d OCLCQ |d OCLCO |d OCLCL
020 __ |a 9781783989355 |q (electronic bk.)
020 __ |a 1783989351 |q (electronic bk.)
020 __ |z 9781783989348
020 __ |z 1783989343
020 __ |z 1783989351
035 __ |a (OCoLC)961851724
050 _4 |a QA76.73.F16 |b .M854 2016eb
072 _7 |a COM |x 021000 |2 bisacsh
082 7_ |a 005.133 |2 23
049 __ |a MAIN
100 1_ |a Mukherjee, Sudipta.
245 10 |a F♯ for machine learning essentials : |b get up and running with machine learning with F♯ in a fun and functional way / |c Sudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon.
260 __ |a Birmingham, England ; |a Mumbai [India] : |b Packt Publishing, |c 2016.
300 __ |a 1 online resource (194 pages) : |b color illustrations, tables.
336 __ |a text |b txt |2 rdacontent
337 __ |a computer |b c |2 rdamedia
338 __ |a online resource |b cr |2 rdacarrier
490 1_ |a Community Experience Distilled
500 __ |a Includes index.
588 0_ |a Online resource; title from PDF title page (ebrary, viewed July 29, 2016).
505 0_ |a Cover -- Copyright -- Credits -- Foreword -- About the Author -- Acknowledgments -- About the Reviewers -- www.PacktPub.com -- Table of Contents -- Preface -- Chapter 1: Introduction to Machine Learning -- Objective -- Getting in touch -- Different areas where machine learning is being used -- Why use F#? -- Supervised machine learning -- Training and test dataset/corpus -- Some motivating real life examples of supervised learning -- Nearest Neighbour algorithm (a.k.a k-NN algorithm) -- Distance metrics -- Decision tree algorithms -- Unsupervised learning -- Machine learning frameworks -- Machine learning for fun and profit -- Recognizing handwritten digits -- your "Hello World" ML program -- How does this work? -- Summary -- Chapter 2: Linear Regression -- Objective -- Different types of linear regression algorithms -- APIs used -- Math.NET Numerics for F# 3.7.0 -- Getting Math.NET -- Experimenting with Math.NET -- The basics of matrices and vectors (a short and sweet refresher) -- Creating a vector -- Creating a matrix -- Finding the transpose of a matrix -- Finding the inverse of a matrix -- Trace of a matrix -- QR decomposition of a matrix -- SVD of a matrix -- Linear regression method of least square -- Finding linear regression coefficients using F# -- Finding the linear regression coefficients using Math.NET -- Putting it together with Math.NET and FsPlot -- Multiple linear regression -- Multiple linear regression and variations using Math.NET -- Weighted linear regression -- Plotting the result of multiple linear regression -- Ridge regression -- Multivariate multiple linear regression -- Feature scaling -- Summary -- Chapter 3: Classification Techniques -- Objective -- Different classification algorithms you will learn -- Some interesting things you can do -- Binary classification using k-NN -- How does it work?.
505 8_ |a Finding cancerous cells using k-NN: a case study -- Understanding logistic regression -- The sigmoid function chart -- Binary classification using logistic regression (using Accord.NET) -- Multiclass classification using logistic regression -- How does it work? -- Multiclass classification using decision trees -- Obtaining and using WekaSharp -- How does it work? -- Predicting a traffic jam using a decision tree: a case study -- Challenge yourself! -- Summary -- Chapter 4: Information Retrieval -- Objective -- Different IR algorithms you will learn -- What interesting things can you do? -- Information retrieval using tf-idf -- Measures of similarity -- Generating a PDF from a histogram -- Minkowski family -- L1 family -- Intersection family -- Inner Product family -- Fidelity family or squared-chord family -- Squared L2 family -- Shannon's Entropy family -- Similarity of asymmetric binary attributes -- Some example usages of distance metrics -- Finding similar cookies using asymmetric binary similarity measures -- Grouping/clustering color images based on Canberra distance -- Summary -- Chapter 5: Collaborative Filtering -- Objective -- Different classification algorithms you will learn -- Vocabulary of collaborative filtering -- Baseline predictors -- Basis of User-User collaborative filtering -- Implementing basic user-user collaborative filtering using F# -- Code walkthrough -- Variations of gap calculations and similarity measures -- Item-item collaborative filtering -- Top-N recommendations -- Evaluating recommendations -- Prediction accuracy -- Confusion matrix (decision support) -- Ranking accuracy metrics -- Prediction-rating correlation -- Working with real movie review data (Movie Lens) -- Summary -- Chapter 6: Sentiment Analysis -- Objective -- What you will learn -- A baseline algorithm for SA using SentiWordNet lexicons.
505 8_ |a Handling negations -- Identifying praise or criticism with sentiment orientation -- Pointwise Mutual Information -- Using SO-PMI to find sentiment analysis -- Summary -- Chapter 7: Anomaly Detection -- Objective -- Different classification algorithms -- Some cool things you will do -- The different types of anomalies -- Detecting point anomalies using IQR (Interquartile Range) -- Detecting point anomalies using Grubb's test -- Grubb's test for multivariate data using Mahalanobis distance -- Code walkthrough -- Chi-squared statistic to determine anomalies -- Detecting anomalies using density estimation -- Strategy to convert a collective anomaly to a point anomaly problem -- Dealing with categorical data in collective anomalies -- Summary -- Index.
650 _0 |a F♯ (Computer program language) |0 http://id.loc.gov/authorities/subjects/sh2008001530
650 _0 |a Machine learning. |0 http://id.loc.gov/authorities/subjects/sh85079324
650 _6 |a Apprentissage automatique.
650 _7 |a COMPUTERS / Databases / General. |2 bisacsh
650 _7 |a F♯ (Computer program language) |2 fast
650 _7 |a Machine learning |2 fast
700 1_ |a Herbrich, Ralf.
758 __ |i has work: |a F♯ for machine learning essentials (Text) |1 https://id.oclc.org/worldcat/entity/E39PCGvBMRPryt6HXrxjMQ9TDy |4 https://id.oclc.org/worldcat/ontology/hasWork
776 08 |i Print version: |a Mukherjee, Sudipta. |t F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way. |d Birmingham, England ; Mumbai, [India] : Packt Publishing, ©2016 |h x, 169 pages |k Community experience distilled. |z 9781783989348
830 _0 |a Community experience distilled. |0 http://id.loc.gov/authorities/names/no2011030603
856 40 |l FWS01 |p ZDB-4-EBA |q FWS_PDA_EBA |u https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1191129 |3 Volltext
938 __ |a EBSCOhost |b EBSC |n 1191129
994 __ |a 92 |b GEBAY
912 __ |a ZDB-4-EBA
049 __ |a DE-863
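Records like this one are commonly exported by library catalogs as MARCXML, the Library of Congress MARC 21 XML schema (namespace `http://www.loc.gov/MARC21/slim`), in which each MARC field becomes a `datafield` element with `subfield` children. A minimal sketch of reading such a record with Python's standard library; the fragment below is excerpted from this record, and the helper name `subfields` is illustrative, not part of any catalog API:

```python
import xml.etree.ElementTree as ET

# Namespace used by the MARC 21 XML schema (MARCXML).
NS = {"m": "http://www.loc.gov/MARC21/slim"}

# A minimal MARCXML fragment carrying this record's 100 (main author)
# and 245 (title statement) fields.
MARCXML = """<record xmlns="http://www.loc.gov/MARC21/slim">
  <datafield tag="100" ind1="1" ind2=" ">
    <subfield code="a">Mukherjee, Sudipta.</subfield>
  </datafield>
  <datafield tag="245" ind1="1" ind2="0">
    <subfield code="a">F&#9839; for machine learning essentials :</subfield>
    <subfield code="b">get up and running with machine learning with F&#9839; in a fun and functional way /</subfield>
  </datafield>
</record>"""

def subfields(record, tag):
    """Return {code: value} for the first datafield with the given tag."""
    field = record.find(f"m:datafield[@tag='{tag}']", NS)
    if field is None:
        return {}
    return {sf.get("code"): sf.text for sf in field.findall("m:subfield", NS)}

record = ET.fromstring(MARCXML)
title = subfields(record, "245")
author = subfields(record, "100")
print(title["a"], title["b"])  # subfield a: main title, subfield b: remainder of title
print(author["a"])
```

Using the standard library keeps the sketch dependency-free; for production work a dedicated MARC library (e.g. pymarc) also handles binary MARC 21 and repeated fields.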
subject_GND | http://id.loc.gov/authorities/subjects/sh2008001530 http://id.loc.gov/authorities/subjects/sh85079324 |
title | F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way / |
title_auth | F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way / |
title_exact_search | F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way / |
title_full | F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way / Sudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon. |
title_fullStr | F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way / Sudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon. |
title_full_unstemmed | F♯ for machine learning essentials : get up and running with machine learning with F♯ in a fun and functional way / Sudipta Mukherjee ; foreword by Dr. Ralf Herbrich, director of machine learning science at Amazon. |
title_short | F♯ for machine learning essentials : |
title_sort | f♯ for machine learning essentials get up and running with machine learning with f♯ in a fun and functional way |
title_sub | get up and running with machine learning with F♯ in a fun and functional way / |
topic | F♯ (Computer program language) http://id.loc.gov/authorities/subjects/sh2008001530 Machine learning. http://id.loc.gov/authorities/subjects/sh85079324 Apprentissage automatique. COMPUTERS / Databases / General. bisacsh F♯ (Computer program language) fast Machine learning fast |
topic_facet | F♯ (Computer program language) Machine learning. Apprentissage automatique. COMPUTERS / Databases / General. Machine learning |
url | https://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&AN=1191129 |
work_keys_str_mv | AT mukherjeesudipta fformachinelearningessentialsgetupandrunningwithmachinelearningwithfinafunandfunctionalway AT herbrichralf fformachinelearningessentialsgetupandrunningwithmachinelearningwithfinafunandfunctionalway |