Although consistent learning is sufficient for PAC-learning, it is not known which strategy makes learning more efficient, especially in terms of the sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that have consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of n-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated and O(d/ε
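The abstract's example class, axis-parallel rectangles, admits a well-known consistent learner with one-sided error: output the tightest axis-parallel box enclosing the positive examples. Since that box is contained in the target rectangle, the hypothesis can only err by labeling true positives as negative. The sketch below is illustrative only (not taken from the paper); the function names are hypothetical.

```python
# Illustrative sketch (not from the paper): the classic consistent learner
# with one-sided error for n-dimensional axis-parallel rectangles.
# The hypothesis is the tightest axis-parallel box enclosing the positive
# examples; it is a subset of the target rectangle, so it never labels a
# true negative as positive (one-sided error).

def learn_tightest_box(samples):
    """samples: list of (point, label) pairs; point is a tuple of floats."""
    positives = [x for x, y in samples if y]
    if not positives:
        return None  # empty hypothesis: classify everything as negative
    n = len(positives[0])
    lo = tuple(min(p[i] for p in positives) for i in range(n))
    hi = tuple(max(p[i] for p in positives) for i in range(n))
    return lo, hi

def classify(box, point):
    """True iff point lies inside the learned box."""
    if box is None:
        return False
    lo, hi = box
    return all(l <= v <= h for l, v, h in zip(lo, point, hi))
```

The learner is consistent by construction: every positive example lies inside the returned box, and any negative example consistent with some target rectangle lies outside it.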
The copyright of the original papers published on this site belongs to IEICE. Unauthorized use of the original or translated papers is prohibited. See IEICE Provisions on Copyright for details.
Eiji TAKIMOTO, Akira MARUOKA, "On the Sample Complexity of Consistent Learning with One-Sided Error" in IEICE TRANSACTIONS on Information,
vol. E78-D, no. 5, pp. 518-525, May 1995.
Abstract: Although consistent learning is sufficient for PAC-learning, it is not known which strategy makes learning more efficient, especially in terms of the sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that have consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of n-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated and O(d/ε
URL: https://globals.ieice.org/en_transactions/information/10.1587/e78-d_5_518/_p
@ARTICLE{e78-d_5_518,
author={Eiji TAKIMOTO and Akira MARUOKA},
journal={IEICE TRANSACTIONS on Information},
title={On the Sample Complexity of Consistent Learning with One-Sided Error},
year={1995},
volume={E78-D},
number={5},
pages={518-525},
abstract={Although consistent learning is sufficient for PAC-learning, it is not known which strategy makes learning more efficient, especially in terms of the sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that have consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of n-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated and O(d/ε},
month={May},
}
TY - JOUR
TI - On the Sample Complexity of Consistent Learning with One-Sided Error
T2 - IEICE TRANSACTIONS on Information
SP - 518
EP - 525
AU - Eiji TAKIMOTO
AU - Akira MARUOKA
PY - 1995
JO - IEICE TRANSACTIONS on Information
VL - E78-D
IS - 5
JA - IEICE TRANSACTIONS on Information
Y1 - May 1995
AB - Although consistent learning is sufficient for PAC-learning, it is not known which strategy makes learning more efficient, especially in terms of the sample complexity, i.e., the number of examples required. As a first step toward this problem, classes that have consistent learning algorithms with one-sided error are considered. A combinatorial quantity called the maximal particle set is introduced, and an upper bound on the sample complexity of consistent learning with one-sided error is obtained in terms of maximal particle sets. For the class of n-dimensional axis-parallel rectangles, one of the classes that are consistently learnable with one-sided error, the cardinality of the maximal particle set is estimated and O(d/ε
ER -