No Learner Left Behind: On the Complexity of Teaching Multiple Learners Simultaneously
Xiaojin Zhu, Ji Liu, Manuel Lopes
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
Main track. Pages 3588-3594.
https://doi.org/10.24963/ijcai.2017/502
We present a theoretical study of algorithmic teaching in the setting where the teacher must use the same training set to teach multiple learners. This problem is a theoretical abstraction of the real-world classroom setting in which the teacher delivers the same lecture to academically diverse students. We define a minimax teaching criterion to guarantee the performance of the worst learner in the class. We prove that the teaching dimension increases with class diversity in general. For the classes of conjugate Bayesian learners and linear regression learners, respectively, we exhibit the corresponding minimax teaching sets. We then propose a method to enhance teaching by partitioning the class into sections. We present cases where the optimal partition minimizes the overall teaching dimension while maintaining the guarantee on all learners. Interestingly, we show that personalized education (one learner per section) is not necessarily the optimal partition. Our results generalize algorithmic teaching to multiple learners and offer insight into how to teach large classes.
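To make the minimax criterion concrete, here is a minimal illustrative sketch (not the paper's algorithm or its optimal constructions): a class of ridge-regression learners differing only in their regularization strength plays the role of a diverse class, and a teacher scores each candidate training set by the worst learner's parameter error, then picks the best-scoring set. The target weights, the penalty values, and the random-search teacher are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical setup (for illustration only): several linear regression
# learners share one teaching set but differ in ridge penalty "lam" --
# a stand-in for class diversity.
rng = np.random.default_rng(0)
w_star = np.array([1.0, -2.0])   # target model the teacher wants taught
lams = [0.01, 0.1, 1.0]          # the diverse class of learners

def learner_estimate(X, y, lam):
    """One learner's ridge solution: (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def minimax_score(X):
    """Minimax criterion: the WORST learner's parameter error when
    taught with inputs X and noiseless labels y = X w*."""
    y = X @ w_star
    return max(np.linalg.norm(learner_estimate(X, y, lam) - w_star)
               for lam in lams)

# A naive "teacher": try random candidate teaching sets of 4 examples
# and keep the one whose worst learner does best.
candidates = [rng.normal(size=(4, 2)) for _ in range(200)]
best = min(candidates, key=minimax_score)
print("worst-learner error of chosen set:", minimax_score(best))
```

The point of the sketch is the objective, not the search: the teacher optimizes `max` over learners rather than the average, so the guarantee holds for every learner in the class, mirroring the paper's "no learner left behind" criterion.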
Keywords:
Machine Learning: Learning Theory
Machine Learning: New Problems