
An elementary introduction to statistical learning theory / Sanjeev Kulkarni, Gilbert Harman.

By: Kulkarni, Sanjeev
Contributor(s): Harman, Gilbert
Material type: Text
Series: Wiley series in probability and statistics
Publication details: Hoboken, N.J. : Wiley, ©2011.
Description: 1 online resource (1 volume) : illustrations
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9781118023471
  • 1118023471
  • 9781118023433
  • 1118023439
  • 1283098687
  • 9781283098687
Subject(s):
Genre/Form:
Additional physical formats: Print version: Elementary introduction to statistical learning theory.
DDC classification:
  • 006.3/1 22
LOC classification:
  • Q325.5 .K85 2011
NLM classification:
  • Q 325.5
Online resources:
Contents:
Introduction: Classification, Learning, Features, and Applications -- Probability -- Probability Densities -- The Pattern Recognition Problem -- The Optimal Bayes Decision Rule -- Learning from Examples -- The Nearest Neighbor Rule -- Kernel Rules -- Neural Networks: Perceptrons -- Multilayer Networks -- PAC Learning -- VC Dimension -- Infinite VC Dimension -- The Function Estimation Problem -- Learning Function Estimation -- Simplicity -- Support Vector Machines -- Boosting.
Summary: "A joint endeavor from leading researchers in the fields of philosophy and electrical engineering An Introduction to Statistical Learning Theory provides a broad and accessible introduction to rapidly evolving field of statistical pattern recognition and statistical learning theory. Exploring topics that are not often covered in introductory level books on statistical learning theory, including PAC learning, VC dimension, and simplicity, the authors present upper-undergraduate and graduate levels with the basic theory behind contemporary machine learning and uniquely suggest it serves as an excellent framework for philosophical thinking about inductive inference"--Back cover.

Includes bibliographical references and index.

Print version record.

Electrical & Electronic Engineering