THEORY OF LEARNING

2004:

Poggio, T., R. Rifkin, S. Mukherjee and P. Niyogi. General Conditions for Predictivity in Learning Theory, Nature, Vol. 428, 419-422, 2004.

Rakhlin, A., S. Mukherjee, and T. Poggio. On Stability and Concentration of Measure, CBCL Paper #239, Massachusetts Institute of Technology, Cambridge, MA, June 2004.

Rakhlin, A., D. Panchenko and S. Mukherjee. Risk Bounds for Mixture Density Estimation, CBCL Paper #233/AI Memo #2004-001, Massachusetts Institute of Technology, Cambridge, MA, January 2004.

Rifkin, R. and A. Klautau. In Defense of One-vs-All Classification, Journal of Machine Learning Research, Vol. 5, 101-141, 2004.

2003:

Poggio, T. and S. Smale. The Mathematics of Learning: Dealing with Data, Notices of the American Mathematical Society (AMS), Vol. 50, No. 5, 537-544, 2003.

2002:

Crane, A. (S.M. Thesis, EECS, MIT, September 2002): Object Recognition with Partially Labeled Examples.

Evgeniou, T., M. Pontil, C. Papageorgiou and T. Poggio. Image Representations and Feature Selection for Multimedia Database Search, IEEE Transactions on Knowledge and Data Engineering, May 2002, to appear.

Kumar, V. (Ph.D. Thesis, BCS, MIT, June 2002): Towards Trainable Man-machine Interfaces: Combining Top-down Constraints with Bottom-up Learning in Facial Analysis.

Mukherjee, S., P. Niyogi, T. Poggio and R. Rifkin. Statistical Learning: Stability is Sufficient for Generalization and Necessary and Sufficient for Consistency of Empirical Risk Minimization, CBCL Paper #223, Massachusetts Institute of Technology, Cambridge, MA, December 2002 [January 2004 revision].

Mukherjee, S., R. Rifkin and T. Poggio. Regression and Classification with Regularization. In: Lecture Notes in Statistics: Nonlinear Estimation and Classification, Proceedings from MSRI Workshop, D.D. Denison, M.H. Hansen, C.C. Holmes, B. Mallick and B. Yu (eds.), Springer-Verlag, 171, 107-124, 2002.

Papageorgiou, C., F. Girosi and T. Poggio. Sparse Correlation Kernel Reconstruction and Superresolution. In: Probabilistic Models of the Brain: Perception and Neural Function, R. Rao, B. Olshausen and M. Lewicki (eds.), Cambridge, MA, The MIT Press, 155-177, 2002.

Poggio, T., S. Mukherjee, R. Rifkin, A. Rakhlin and A. Verri. b. In: Uncertainty in Geometric Computations, J. Winkler and M. Niranjan (eds.), Kluwer Academic Publishers, 131-141, 2002.

Poggio, T., R. Rifkin, S. Mukherjee, and A. Rakhlin. Bagging Regularizes, CBCL Paper #214/AI Memo #2002-003, Massachusetts Institute of Technology, Cambridge, MA, February 2002.

Rakhlin, A., G. Yeo and T. Poggio. Extra-label Information: Experiments with View-based Classification. In: Proceedings of the Sixth International Conference on Knowledge-Based Intelligent Information & Engineering Systems (KES'2002), Podere d'Ombriano, Crema, Italy, September 16-18, 2002.

Rifkin, R. (Ph.D. Thesis, EECS & OR, MIT, September 2002): Everything Old Is New Again: A Fresh Look at Historical Approaches in Machine Learning.

Szummer, M. (Ph.D. Thesis, EECS, MIT, September 2002): Learning from Partially Labeled Data.

2001:

Cauwenberghs, G. and T. Poggio. Incremental and Decremental Support Vector Machine Learning. In: Advances in Neural Information Processing Systems (NIPS*2000), MIT Press, Vol. 13, 409-415, Cambridge, MA, 2001.

Chapelle, O., V. Vapnik, O. Bousquet and S. Mukherjee. Choosing Multiple Parameters for Support Vector Machines, Machine Learning - Special Issue on Support Vector Machines, 2001.

Evgeniou, T. and M. Pontil. A Note on the Generalization Performance of Kernel Classifiers with Margin. In: Proceedings of Algorithmic Learning Theory 2000 Conference, Lecture Notes in Artificial Intelligence, Sydney, Australia (December 11-13, 2000), Vol. 1968, 306-315, 2001.

Mukherjee, S. (Ph.D. Thesis, BCS, MIT, June 2001): Application of Statistical Learning Theory to DNA Microarray Analysis.

Peshkin, L. and S. Mukherjee. Bounds on Sample Size for Policy Evaluation in Markov Environments. In: Proceedings of COLT 2001: The Fourteenth Annual Conference on Computational Learning Theory 2001, Amsterdam, Netherlands (July 16-19, 2001), to appear.

Poggio, T., S. Mukherjee, R. Rifkin, A. Rakhlin and A. Verri. b, CBCL Paper #198/AI Memo #2001-011, Massachusetts Institute of Technology, Cambridge, MA, July 2001.

Shelton, C. (Ph.D. Thesis, EECS, MIT, August 2001): Importance Sampling for Reinforcement Learning with Multiple Objectives.

Shelton, C. Policy Improvement for POMDPs Using Normalized Importance Sampling, CBCL Paper #194/AI Memo #2001-002, Massachusetts Institute of Technology, Cambridge, MA, March 2001.

Shelton, C. Balancing Multiple Sources of Reward in Reinforcement Learning. In: Proceedings of Advances in Neural Information Processing Systems (NIPS), 1082-1088, 2001.

Weston, J., S. Mukherjee, O. Chapelle, M. Pontil, T. Poggio and V. Vapnik. Feature Selection for SVMs. In: Advances in Neural Information Processing Systems, NIPS 13, 668-674, 2001.

2000:

Evgeniou, T. (Ph.D. Thesis, EECS, MIT, June 2000): Learning with Kernel Machine Architectures.

Evgeniou, T., M. Pontil and T. Poggio. Statistical Learning Theory: A Primer, International Journal of Computer Vision, 38, 1, 9-13, 2000.

Evgeniou, T., L. Perez-Breva, M. Pontil, and T. Poggio. Bounds on the Generalization Performance of Kernel Machine Ensembles. In: Proceedings of Seventeenth International Conference on Machine Learning, Stanford University, June 29-July 2, 2000, (to appear).

Evgeniou, T., M. Pontil and T. Poggio. Regularization Networks and Support Vector Machines, Advances in Computational Mathematics, 13, 1, 1-50, 2000.

Ezzat, T. and T. Poggio. Visual Speech Synthesis by Morphing Visemes, International Journal of Computer Vision, 38, 1, 45-57, 2000.

Giese, M.A. and T. Poggio. Morphable Models for the Analysis and Synthesis of Complex Motion Patterns, International Journal of Computer Vision, 38, 1, 59-73, 2000.

Kumar, V. and T. Poggio. Learning-based Approach to Estimation of Morphable Model Parameters, CBCL Paper #191/AI Memo #1696, Massachusetts Institute of Technology, Cambridge, MA, September 2000.

Papageorgiou, C. A Trainable System for Object Detection in Images and Video Sequences, CBCL Paper #186/AI Tech Report #1685, Massachusetts Institute of Technology, Cambridge, MA, May 2000.

Papageorgiou, C. and T. Poggio. A Trainable System for Object Detection, International Journal of Computer Vision, 38, 1, 15-33, 2000.

Poggio, T. and C. Shelton. Learning in Brains and Machines, Spatial Vision, Vol. 13, No. 2-3, 287-296, 2000.

Poggio, T. and A. Verri. Introduction: Learning and Vision at CBCL, International Journal of Computer Vision, 38, 1, 5-7, 2000.

Pontil, M., S. Mukherjee, and F. Girosi. On the Noise Model of Support Vector Machine Regression. In: Proceedings of Algorithmic Learning Theory 2000 Conference, Sydney, Australia, December 11-13, 2000, (to appear).

Shelton, C. Morphable Surface Models, International Journal of Computer Vision, 38, 1, 75-91, 2000.

1999:

Evgeniou, T. and M. Pontil. On the V Gamma Dimension for Regression in Reproducing Kernel Hilbert Spaces, CBCL Paper #172/AI Memo #1656, Massachusetts Institute of Technology, Cambridge, MA, May 1999.

Evgeniou, T., M. Pontil, and T. Poggio. A Unified Framework for Regularization Networks and Support Vector Machines, CBCL Paper #171/AI Memo #1654, Massachusetts Institute of Technology, Cambridge, MA, March 1999.

Marques, J. (S.M. Thesis, EECS, MIT, May 1999): An Automatic Annotation System for Audio Data Containing Music.

Meila, M. An Accelerated Chow and Liu Algorithm: Fitting Tree Distributions to High Dimensional Sparse Data, CBCL Paper #169/AI Memo #1652, Massachusetts Institute of Technology, Cambridge, MA, 1999.

Mukherjee, S., P. Tamayo, J.P. Mesirov, D. Slonim, A. Verri and T. Poggio. Support Vector Machine Classification of Microarray Data, CBCL Paper #182/AI Memo #1676, Massachusetts Institute of Technology, Cambridge, MA, December 1999.

Mukherjee, S. and V. Vapnik. Multivariate Density Estimation: An SVM Approach, CBCL Paper #170/AI Memo #1653, Massachusetts Institute of Technology, Cambridge, MA, April 1999.

Niyogi, P. and F. Girosi. Generalization Bounds for Function Approximation from Scattered Noisy Data, Advances in Computational Mathematics, Vol. 10, 51-80, 1999.

Pérez-Breva, L. (Ingenieria Superior, Institut Químic de Sarrià, Universitat Ramon Llull, Barcelona, July 1999): Applying Learning Techniques to Solve Engineering Problems: Preprocessing, Learning and Measuring.

Poggio, T. and C. Shelton. Machine Learning, Machine Vision and the Brain, AI Magazine, Vol. 20, No. 3, 37-55, 1999.

Rifkin, R., M. Pontil, and A. Verri. A Note on Support Vector Machine Degeneracy, CBCL Paper #177/AI Memo #1661, Massachusetts Institute of Technology, Cambridge, MA, June 1999.

1998:

Girosi, F. An Equivalence between Sparse Approximation and Support Vector Machines, Neural Computation, Vol. 10, 1455-1480, 1998.

Hofmann, T. and J. Puzicha. Statistical Models for Co-occurrence Data, CBCL Paper #161/AI Memo #1625, Massachusetts Institute of Technology, Cambridge, MA, February 1998.

Mukherjee, S., E. Osuna, and F. Girosi. Nonlinear Prediction of Chaotic Time Series using a Support Vector Machine. In: IEEE Neural Networks for Signal Processing (NNSP'97), Amelia Island, FL, September 1997, in press.

Niyogi, P. and F. Girosi. Generalization Bounds for Function Approximation from Scattered Noisy Data, Advances in Computational Mathematics, in press.

Niyogi, P., T. Poggio, and F. Girosi. Incorporating Prior Information in Machine Learning by Creating Virtual Examples. In: IEEE Proceedings on Intelligent Signal Processing, September 1998, in press.

Osuna, E. (Ph.D. Thesis, EECS & OR, MIT, June 1998): Support Vector Machines: Training and Applications.

Poggio, T. and F. Girosi. Notes on PCA, Regularization, Sparsity and Support Vector Machines, CBCL Paper #161/AI Memo #1632, Massachusetts Institute of Technology, Cambridge, MA, April 1998.

Poggio, T. and F. Girosi. A Sparse Representation for Function Approximation, Neural Computation, Vol. 10, No. 6, 1445-1454, 1998.

Pontil, M. and A. Verri. Properties of Support Vector Machines, Neural Computation, Vol. 10, 955-974, 1998.

Pontil, M., R. Rifkin and T. Evgeniou. From Regression to Classification in Support Vector Machines, CBCL Paper #166/AI Memo #1649, Massachusetts Institute of Technology, Cambridge, MA, November 1998.

Pontil, M., S. Mukherjee and F. Girosi. On the Noise Model of Support Vector Machine Regression, CBCL Paper #168/AI Memo #1651, Massachusetts Institute of Technology, Cambridge, MA, October 1998.

1997:

Girosi, F. An Equivalence between Sparse Approximation and Support Vector Machines, CBCL Paper #161/AI Memo #1606, Massachusetts Institute of Technology, Cambridge, MA, May 1997.

Jones, M. (Ph.D. Thesis, EECS, MIT, June 1997): Multidimensional Morphable Models: A Framework for Representing and Matching Object Classes.

Meila, M., M.I. Jordan and Q. Morris. Estimating Dependency Structure as a Hidden Variable, CBCL Paper #151/AI Memo #1611, Massachusetts Institute of Technology, Cambridge, MA, June 1997.

Meila, M. and M.I. Jordan. Triangulation by Continuous Embedding, CBCL Paper #146/AI Memo #1605, Massachusetts Institute of Technology, Cambridge, MA, March 1997.

Osuna, E., R. Freund, and F. Girosi. An Improved Training Algorithm for Support Vector Machines. In: IEEE Neural Networks for Signal Processing (NNSP'97), Amelia Island, FL, September 1997.

Osuna, E., R. Freund and F. Girosi. Support Vector Machines: Training and Applications, CBCL Paper #144/AI Memo #1602, Massachusetts Institute of Technology, Cambridge, MA, May 1997.

Pontil, M. and A. Verri. Properties of Support Vector Machines, CBCL Paper #152/AI Memo #1612, Massachusetts Institute of Technology, Cambridge, MA, August 1997.

Schoelkopf, B., K.K. Sung, C. Burges, F. Girosi, P. Niyogi, T. Poggio and V. Vapnik. Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers. In: IEEE Transactions on Signal Processing, Vol. 45, No. 11, 2758-2765, 1997.

Weiss, Y. Belief Propagation and Revision in Networks with Loops, CBCL Paper #155/AI Memo #1616, Massachusetts Institute of Technology, Cambridge, MA, November 1997.

1996:

Ghahramani, Z. and M.I. Jordan. Factorial Hidden Markov Models, CBCL Paper #130/AI Memo #1561, Massachusetts Institute of Technology, Cambridge, MA, January 1996.

Jaakkola, T.S., L.K. Saul and M.I. Jordan. Fast Learning by Bounding Likelihoods in Sigmoid Type Belief Networks, CBCL Paper #129/AI Memo #1560, Massachusetts Institute of Technology, Cambridge, MA, January 1996.

Jaakkola, T.S. and M.I. Jordan. Computing Upper and Lower Bounds on Likelihoods in Intractable Networks, CBCL Paper #136/AI Memo #1571, Massachusetts Institute of Technology, Cambridge, MA, March 1996.

Jordan, M.I. and C.M. Bishop. Neural Networks, CBCL Paper #131/AI Memo #1562, Massachusetts Institute of Technology, Cambridge, MA, March 1996.

Lemm, J.D. Prior Information and Generalized Questions, CBCL Paper #141/AI Memo #1598, Massachusetts Institute of Technology, Cambridge, MA, December 1996.

Niyogi, P. and F. Girosi. On the Relationship between Generalization Error, Hypothesis Complexity and Sample Complexity for Radial Basis Functions, Neural Computation, 8, 819-842, 1996.

Olshausen, B.A. Learning Linear, Sparse, Factorial Codes, CBCL Paper #138/AI Memo #1580, Massachusetts Institute of Technology, Cambridge, MA, July 1996.

Poggio, T. Networks that Learn and How the Brain Works. In: Proceedings of Symposia in Pure Mathematics (PSPUM), D. Jerison, I.M. Singer and D.W. Stroock (eds.), American Mathematical Society, 1996.

Sabes, P.N. and M.I. Jordan. Reinforcement Learning by Probability Matching, CBCL Paper #134/AI Memo #1568, Massachusetts Institute of Technology, Cambridge, MA, January 1996.

Saul, L.K., T. Jaakkola and M.I. Jordan. Mean Field Theory for Sigmoid Belief Networks, CBCL Paper #135/AI Memo #1570, Massachusetts Institute of Technology, Cambridge, MA, March 1996.

Schoelkopf, B., K. Sung, C. Burges, F. Girosi, P. Niyogi, T. Poggio and V. Vapnik. Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers, CBCL Paper #142/AI Memo #1599, Massachusetts Institute of Technology, Cambridge, MA, December 1996.

Smyth, P., D. Heckerman and M.I. Jordan. Probabilistic Independence Networks for Hidden Markov Probability Models, CBCL Paper #132/AI Memo #1565, Massachusetts Institute of Technology, Cambridge, MA, February 1996.

1995:

Chan, N. (S.M. Thesis, EECS, MIT, May 1995): The Complexity and A Priori Knowledge of Learning From Examples.

Cohn, D.A. Minimizing Statistical Bias with Queries, CBCL Paper #124/AI Memo #1552, Massachusetts Institute of Technology, Cambridge, MA, September 1995.

Girosi, F. Approximation Error Bounds that Use VC-bounds. In: Proceedings of the International Conference on Artificial Neural Networks, 295-302, Paris, October 9-13, 1995.

Girosi, F. and N. Chan. Prior Knowledge and the Creation of Virtual Examples for RBF Networks. In: Neural Networks for Signal Processing - Proceedings of the 1995 IEEE-SP Workshop, IEEE Signal Processing Society, Cambridge, MA, 201-210, September 1995.

Girosi, F., M. Jones, and T. Poggio. Regularization Theory and Neural Networks Architectures, Neural Computation, Vol. 7, No. 2, 219-269, 1995.

Niyogi, P. (Ph.D. Thesis, EECS, MIT, February 1995): The Informational Complexity of Learning from Examples.