Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- 1 January 1995
- Journal article
- Published by Springer Nature in Machine Learning
- Vol. 18 (2), 131-148
- https://doi.org/10.1007/bf00993408
Abstract
No abstract available.
This publication has 19 references indexed in Scilit:
- Finiteness results for sigmoidal “neural” networks. Association for Computing Machinery (ACM), 1993.
- Vapnik-Chervonenkis Classes of Definable Sets. Journal of the London Mathematical Society, 1992.
- Results on learnability and the Vapnik-Chervonenkis dimension. Information and Computation, 1991.
- Learnability and the Vapnik-Chervonenkis dimension. Journal of the ACM, 1989.
- A general lower bound on the number of examples needed for learning. Information and Computation, 1989.
- What Size Net Gives Valid Generalization? Neural Computation, 1989.
- Occam's Razor. Information Processing Letters, 1987.
- A linear time algorithm for the Hausdorff distance between convex polygons. Information Processing Letters, 1983.
- Lower bounds for algebraic computation trees. Association for Computing Machinery (ACM), 1983.
- Central Limit Theorems for Empirical Measures. The Annals of Probability, 1978.