
Conformal prediction for reliable machine learning : theory, adaptations, and applications [electronic resource]

by Balasubramanian, Vineeth [editor]; Ho, Shen-Shyang [editor]; Vovk, Vladimir [editor].
Material type: Book (electronic resource)
Publisher: Amsterdam ; Boston : Morgan Kaufmann, ©2014
Description: 1 online resource
ISBN: 9780124017153; 0124017150; 1306697484; 9781306697484
Subject(s): Machine learning | COMPUTERS -- General | Electronic books
Online resources: ScienceDirect
Contents:
Half Title; Title Page; Copyright; Copyright Permissions; Contents; Contributing Authors; Foreword; Preface; Book Organization; Part I: Theory; Part II: Adaptations; Part III: Applications; Companion Website; Contacting Us; Acknowledgments; Part I: Theory; 1 The Basic Conformal Prediction Framework; 1.1 The Basic Setting and Assumptions; 1.2 Set and Confidence Predictors; 1.2.1 Validity and Efficiency of Set and Confidence Predictors; 1.3 Conformal Prediction; 1.3.1 The Binary Case; 1.3.2 The Gaussian Case; 1.4 Efficiency in the Case of Prediction without Objects.
1.5 Universality of Conformal Predictors; 1.6 Structured Case and Classification; 1.7 Regression; 1.8 Additional Properties of Validity and Efficiency in the Online Framework; 1.8.1 Asymptotically Efficient Conformal Predictors; Acknowledgments; 2 Beyond the Basic Conformal Prediction Framework; 2.1 Conditional Validity; 2.2 Conditional Conformal Predictors; 2.2.1 Venn's Dilemma; 2.3 Inductive Conformal Predictors; 2.3.1 Conditional Inductive Conformal Predictors; 2.4 Training Conditional Validity of Inductive Conformal Predictors; 2.5 Classical Tolerance Regions.
2.6 Object Conditional Validity and Efficiency; 2.6.1 Negative Result; 2.6.2 Positive Results; 2.7 Label Conditional Validity and ROC Curves; 2.8 Venn Predictors; 2.8.1 Inductive Venn Predictors; 2.8.2 Venn Prediction without Objects; Acknowledgments; Part II: Adaptations; 3 Active Learning; 3.1 Introduction; 3.2 Background and Related Work; 3.2.1 Pool-based Active Learning with Serial Query; SVM-based methods; Statistical methods; Ensemble-based methods; Other methods; 3.2.2 Batch Mode Active Learning; 3.2.3 Online Active Learning; 3.3 Active Learning Using Conformal Prediction.
3.3.1 Query by Transduction (QBT); Algorithmic formulation; 3.3.2 Generalized Query by Transduction; Algorithmic formulation; Combining multiple criteria in GQBT; 3.3.3 Multicriteria Extension to QBT; 3.4 Experimental Results; 3.4.1 Benchmark Datasets; 3.4.2 Application to Face Recognition; 3.4.3 Multicriteria Extension to QBT; 3.5 Discussion and Conclusions; Acknowledgments; 4 Anomaly Detection; 4.1 Introduction; 4.2 Background; 4.3 Conformal Prediction for Multiclass Anomaly Detection; 4.3.1 A Nonconformity Measure for Multiclass Anomaly Detection; 4.4 Conformal Anomaly Detection.
4.4.1 Conformal Anomalies; 4.4.2 Offline versus Online Conformal Anomaly Detection; 4.4.3 Unsupervised and Semi-supervised Conformal Anomaly Detection; 4.4.4 Classification Performance and Tuning of the Anomaly Threshold; 4.5 Inductive Conformal Anomaly Detection; 4.5.1 Offline and Semi-Offline Inductive Conformal Anomaly Detection; 4.5.2 Online Inductive Conformal Anomaly Detection; 4.6 Nonconformity Measures for Examples Represented as Sets of Points; 4.6.1 The Directed Hausdorff Distance; 4.6.2 The Directed Hausdorff k-Nearest Neighbors Nonconformity Measure.
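Section 4.6.1 of the contents above refers to the directed Hausdorff distance, a standard way to compare examples represented as sets of points. As an illustration only (not the book's implementation), it can be computed as the largest distance from any point of one set to its nearest point in the other:

```python
import numpy as np

def directed_hausdorff(A, B):
    """Directed Hausdorff distance h(A, B) = max over a in A of
    the distance from a to its nearest point in B."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    # Pairwise Euclidean distances, shape (|A|, |B|)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    # For each a: nearest b; then take the worst case over a
    return d.min(axis=1).max()
```

Note that the measure is directed: h(A, B) and h(B, A) generally differ, e.g. when A contains a point far from every point of B but not vice versa.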
Summary: "Traditional, low-dimensional, small-scale data have been successfully dealt with using conventional software engineering and classical statistical methods, such as discriminant analysis, neural networks, genetic algorithms and others. But the change of scale in data collection and the dimensionality of modern data sets has profound implications on the type of analysis that can be done. Recently several kernel-based machine learning algorithms have been developed for dealing with high-dimensional problems, where a large number of features could cause a combinatorial explosion. These methods are quickly gaining popularity, and it is widely believed that they will help to meet the challenge of analysing very large data sets. Learning machines often perform well in a wide range of applications and have nice theoretical properties without requiring any parametric statistical assumption about the source of data (unlike traditional statistical techniques). However, a typical drawback of many machine learning algorithms is that they usually do not provide any useful measure of confidence in the predicted labels of new, unclassified examples. Confidence estimation is a well-studied area of both parametric and non-parametric statistics; however, usually only low-dimensional problems are considered"-- Provided by publisher.
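The summary notes that many learning algorithms give no confidence measure for their predicted labels; conformal prediction, the book's subject, addresses this by turning any nonconformity score into prediction sets with a guaranteed error rate. A minimal sketch of a split (inductive) conformal classifier, using a simple 1-nearest-neighbor nonconformity score — illustrative only, with made-up data and parameter names, not code from the book:

```python
import numpy as np

def split_conformal_classify(X_train, y_train, X_cal, y_cal, X_test, alpha=0.1):
    """Return a prediction set for each test point: all labels whose
    conformal p-value exceeds the significance level alpha."""
    labels = np.unique(y_train)

    def score(x, y):
        # Nonconformity: distance to the nearest training example
        # with the candidate label y (one simple choice among many).
        same = X_train[y_train == y]
        return np.min(np.linalg.norm(same - x, axis=1))

    # Nonconformity scores on a held-out calibration set
    cal_scores = np.array([score(x, y) for x, y in zip(X_cal, y_cal)])
    n = len(cal_scores)

    prediction_sets = []
    for x in X_test:
        pset = []
        for y in labels:
            s = score(x, y)
            # p-value: fraction of calibration scores at least as nonconforming
            p = (np.sum(cal_scores >= s) + 1) / (n + 1)
            if p > alpha:
                pset.append(y)
        prediction_sets.append(pset)
    return prediction_sets
```

With only a handful of calibration points the sets are necessarily conservative (p-values cannot drop below 1/(n+1)); in practice a larger calibration split is used.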
No physical items for this record


Includes bibliographical references and index.

Print version record.


Last Updated on September 15, 2019
© Dhaka University Library. All Rights Reserved.