Abstract: We present a class of statistical learning algorithms formulated in terms of minimizing Bregman distances, a family of generalized entropy measures associated with convex functions. The inductive learning scheme is akin to growing a decision tree, with the Bregman distance filling the role of the impurity function in tree-based classifiers. Our approach is based on two components. In the feature selection step, each linear constraint in a pool of candidate features is evaluated by the reduction in Bregman distance that would result from adding it to the model. In the constraint satisfaction step, all of the parameters are adjusted to minimize the Bregman distance subject to the chosen constraints. We introduce a new iterative estimation algorithm for carrying out both the feature selection and constraint satisfaction steps, and outline a proof of the convergence of these algorithms.

1 Introduction

In this paper we present a class of statistical learning algorithms formulated in terms...
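As background for the abstract's central object: the Bregman distance associated with a convex function F is defined as D_F(p, q) = F(p) - F(q) - <∇F(q), p - q>. The sketch below (not code from the paper, just a standard-fact illustration) computes this quantity for two classic choices of F: the squared norm, which recovers the squared Euclidean distance, and the negative entropy, which recovers the KL divergence between probability distributions.

```python
import math

def bregman_distance(F, grad_F, p, q):
    """Bregman distance D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>.

    F is a convex function on vectors (given as lists of floats) and
    grad_F is its gradient. D_F is nonnegative and zero iff p == q,
    but it is not symmetric in general.
    """
    inner = sum(g * (pi - qi) for g, pi, qi in zip(grad_F(q), p, q))
    return F(p) - F(q) - inner

# F(x) = ||x||^2 yields the squared Euclidean distance ||p - q||^2.
sq = lambda x: sum(xi * xi for xi in x)
grad_sq = lambda x: [2.0 * xi for xi in x]

# F(x) = sum_i x_i log x_i (negative entropy) yields, for probability
# vectors p and q, the KL divergence KL(p || q) = sum_i p_i log(p_i / q_i).
negent = lambda x: sum(xi * math.log(xi) for xi in x)
grad_negent = lambda x: [math.log(xi) + 1.0 for xi in x]

p, q = [0.2, 0.8], [0.5, 0.5]
d_sq = bregman_distance(sq, grad_sq, p, q)           # equals ||p - q||^2 = 0.18
d_kl = bregman_distance(negent, grad_negent, p, q)   # equals KL(p || q)
```

In the paper's setting, D_F plays the role that an impurity measure (e.g. Gini or entropy) plays in tree growing: each candidate feature is scored by how much adding it would reduce the Bregman distance to the data.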