
Optimal Teaching Curricula with Compositional Simplicity Priors

RiuNet: Institutional Repository of the Universidad Politécnica de Valencia


dc.contributor.author García-Piqueras, Manuel es_ES
dc.contributor.author Hernández-Orallo, José es_ES
dc.date.accessioned 2022-04-27T11:33:09Z
dc.date.available 2022-04-27T11:33:09Z
dc.date.issued 2021-09-17 es_ES
dc.identifier.isbn 978-3-030-86514-6 es_ES
dc.identifier.issn 0302-9743 es_ES
dc.identifier.uri http://hdl.handle.net/10251/182198
dc.description.abstract [EN] Machine teaching under strong simplicity priors can teach any concept in universal languages. Remarkably, recent experiments suggest that the teaching sets are shorter than the concept description itself. This raises many important questions about the complexity of concepts and their teaching size, especially when concepts are taught incrementally. In this paper we put a bound to these surprising experimental findings and reconnect teaching size and concept complexity: complex concepts do require large teaching sets. Also, we analyse teaching curricula, and find a new interposition phenomenon: the teaching size of a concept can increase because examples are captured by simpler concepts built on previously acquired knowledge. We provide a procedure that not only avoids interposition but builds an optimal curriculum. These results indicate novel curriculum design strategies for humans and machines. es_ES
dc.description.sponsorship This work was funded by the EU (FEDER) and Spanish MINECO under RTI2018-094403-B-C32, G. Valenciana under PROMETEO/2019/098 and EU's Horizon 2020 research and innovation programme under grant 952215 (TAILOR). es_ES
dc.language English es_ES
dc.publisher Springer es_ES
dc.relation.ispartof Machine Learning and Knowledge Discovery in Databases: Applied Data Science Track. European Conference, ECML PKDD 2021, Bilbao, Spain, September 13-17, 2021, Proceedings, Part IV. LNCS, volume 12978 es_ES
dc.rights All rights reserved es_ES
dc.subject Machine teaching es_ES
dc.subject Interposition es_ES
dc.subject Kolmogorov complexity es_ES
dc.subject.classification LENGUAJES Y SISTEMAS INFORMATICOS es_ES
dc.title Optimal Teaching Curricula with Compositional Simplicity Priors es_ES
dc.type Conference paper es_ES
dc.type Article es_ES
dc.type Book chapter es_ES
dc.identifier.doi 10.1007/978-3-030-86486-6_43 es_ES
dc.relation.projectID info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/RTI2018-094403-B-C32/ES/RAZONAMIENTO FORMAL PARA TECNOLOGIAS FACILITADORAS Y EMERGENTES/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/GVA//PROMETEO%2F2019%2F098//DEEPTRUST/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/952215/EU es_ES
dc.rights.accessRights Open access es_ES
dc.contributor.affiliation Universitat Politècnica de València. Departamento de Sistemas Informáticos y Computación - Departament de Sistemes Informàtics i Computació es_ES
dc.description.bibliographicCitation García-Piqueras, M.; Hernández-Orallo, J. (2021). Optimal Teaching Curricula with Compositional Simplicity Priors. Springer. 705-721. https://doi.org/10.1007/978-3-030-86486-6_43 es_ES
dc.description.accrualMethod S es_ES
dc.relation.conferencename European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECMLPKDD 2021) es_ES
dc.relation.conferencedate September 13-17, 2021 es_ES
dc.relation.conferenceplace Online es_ES
dc.relation.publisherversion https://doi.org/10.1007/978-3-030-86486-6_43 es_ES
dc.description.upvformatpinicio 705 es_ES
dc.description.upvformatpfin 721 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.relation.pasarela S\458393 es_ES
dc.contributor.funder Generalitat Valenciana es_ES
dc.contributor.funder Agencia Estatal de Investigación es_ES
dc.contributor.funder European Regional Development Fund es_ES
dc.description.references Antoniol, G., Di Penta, M.: Library miniaturization using static and dynamic information. In: International Conference on Software Maintenance, pp. 235–244 (2003) es_ES
dc.description.references Balbach, F.J.: Models for algorithmic teaching. Ph.D. thesis, U. of Lübeck (2007) es_ES
dc.description.references Balbach, F.J.: Measuring teachability using variants of the teaching dimension. Theoret. Comput. Sci. 397(1–3), 94–113 (2008) es_ES
dc.description.references Brown, T.B., Mann, B., Ryder, N., et al.: Language models are few-shot learners. arXiv:2005.14165 (2020) es_ES
dc.description.references Cicalese, F., Laber, E., Molinaro, M., et al.: Teaching with limited information on the learner’s behaviour. In: ICML, pp. 2016–2026. PMLR (2020) es_ES
dc.description.references Devlin, J., Chang, M.W., Lee, K., Toutanova, K.: BERT: pre-training of deep bidirectional transformers for language understanding. arXiv: 1810.04805 (2018) es_ES
dc.description.references Elias, P.: Universal codeword sets and representations of the integers. IEEE Trans. Inf. Theory 21(2), 194–203 (1975) es_ES
dc.description.references Gao, Z., Ries, C., Simon, H.U., Zilles, S.: Preference-based teaching. J. Mach. Learn. Res. 18(1), 1012–1043 (2017) es_ES
dc.description.references Garcia-Piqueras, M., Hernández-Orallo, J.: Conditional teaching size. arXiv: 2107.07038 (2021) es_ES
dc.description.references Gong, C.: Exploring commonality and individuality for multi-modal curriculum learning. In: AAAI, vol. 31 (2017) es_ES
dc.description.references Gong, C., Yang, J., Tao, D.: Multi-modal curriculum learning over graphs. ACM Trans. Intell. Syst. Technol. (TIST) 10(4), 1–25 (2019) es_ES
dc.description.references Gong, T., Zhao, Q., Meng, D., Xu, Z.: Why curriculum learning & self-paced learning work in big/noisy data: a theoretical perspective. BDIA 1(1), 111 (2016) es_ES
dc.description.references Gulwani, S., Hernández-Orallo, J., Kitzelmann, E., Muggleton, S.H., Schmid, U., Zorn, B.: Inductive programming meets the real world. Commun. ACM 58(11), 90–99 (2015) es_ES
dc.description.references Hendrycks, D., Burns, C., Basart, S., Zou, A., Mazeika, M., Song, D., Steinhardt, J.: Measuring massive multitask language understanding. In: ICLR (2021) es_ES
dc.description.references Hernández-Orallo, J., Telle, J.A.: Finite and confident teaching in expectation: Sampling from infinite concept classes. In: ECAI (2020) es_ES
dc.description.references Kumar, A., Ithapu, V.: A sequential self teaching approach for improving generalization in sound event recognition. In: ICML, pp. 5447–5457 (2020) es_ES
dc.description.references Lake, B.M., Salakhutdinov, R., Tenenbaum, J.B.: Human-level concept learning through probabilistic program induction. Science 350(6266), 1332–1338 (2015) es_ES
dc.description.references Leibniz, G.W., Rabouin, D.: Mathesis universalis: écrits sur la mathématique universelle. Mathesis. Librairie philosophique J. Vrin, Paris (2018) es_ES
dc.description.references Li, M., Vitányi, P.M.: An Introduction to Kolmogorov Complexity and Its Applications, 3rd edn. Springer, Heidelberg (2008). https://doi.org/10.1007/978-3-030-11298-1 es_ES
dc.description.references Li, Y., Mao, J., Zhang, X., Freeman, W.T., Tenenbaum, J.B., Wu, J.: Perspective plane program induction from a single image. In: CVPR, pp. 4434–4443 (2020) es_ES
dc.description.references Liu, W., et al.: Iterative machine teaching. In: ICML, pp. 2149–2158 (2017) es_ES
dc.description.references Manohar, S., Zokaei, N., Fallon, S., Vogels, T., Husain, M.: Neural mechanisms of attending to items in working memory. Neurosci. Biobehav. Rev. 101, 1–12 (2019) es_ES
dc.description.references Nye, M.I., Solar-Lezama, A., Tenenbaum, J.B., Lake, B.M.: Learning compositional rules via neural program synthesis. arXiv: 2003.05562 (2020) es_ES
dc.description.references Oberauer, K., Lin, H.Y.: An interference model of visual working memory. Psychol. Rev. 124(1), 21 (2017) es_ES
dc.description.references Peng, B., Li, C., Li, J., Shayandeh, S., Liden, L., Gao, J.: Soloist: building task bots at scale with transfer learning and machine teaching. arXiv: 2005.05298 (2020) es_ES
dc.description.references Pentina, A., Sharmanska, V., Lampert, C.H.: Curriculum learning of multiple tasks. In: Proceedings of Computer Vision and Pattern Recognition (2015) es_ES
dc.description.references Radford, A., Wu, J., Child, R., Luan, D., Amodei, D., Sutskever, I.: Language models are unsupervised multitask learners. OpenAI Blog 1(8), 9 (2019) es_ES
dc.description.references Rakhsha, A., Radanovic, G., Devidze, R., Zhu, X., Singla, A.: Policy teaching via environment poisoning: training-time adversarial attacks against reinforcement learning. In: ICML, pp. 7974–7984 (2020) es_ES
dc.description.references Salkind, N.: An Introduction to Theories of Human Development. Sage Publications (2004) es_ES
dc.description.references Schneider, W.X., Albert, J., Ritter, H.: Enabling cognitive behavior of humans, animals, and machines: a situation model framework. ZiF 1, 21–34 (2020) es_ES
dc.description.references Shi, Y., Mi, Y., Li, J., Liu, W.: Concept-cognitive learning model for incremental concept learning. IEEE Trans. Syst. Man Cybern. Syst. (2018) es_ES
dc.description.references Shindyalov, I., Bourne, P.: Protein structure alignment by incremental combinatorial extension of the optimal path. Prot. Eng. Des. Sel. 11(9), 739–747 (1998) es_ES
dc.description.references Shukla, S., et al.: Conversation learner-a machine teaching tool for building dialog managers for task-oriented dialog systems. arXiv: 2004.04305 (2020) es_ES
dc.description.references Solomonoff, R.J.: A formal theory of inductive inference. Part I. Inf. Control 7(1), 1–22 (1964) es_ES
dc.description.references Solomonoff, R.J.: A system for incremental learning based on algorithmic probability. In: Proceedings of the Sixth Israeli Conference on Artificial Intelligence, Computer Vision and Pattern Recognition, pp. 515–527 (1989) es_ES
dc.description.references Soviany, P., Ionescu, R.T., Rota, P., Sebe, N.: Curriculum learning: a survey. arXiv:2101.10382 (2021) es_ES
dc.description.references Such, F.P., Rawal, A., Lehman, J., Stanley, K., Clune, J.: Generative teaching networks: accelerating neural architecture search by learning to generate synthetic training data. In: ICML, pp. 9206–9216 (2020) es_ES
dc.description.references Telle, J.A., Hernández-Orallo, J., Ferri, C.: The teaching size: computable teachers and learners for universal languages. Mach. Learn. 108(8), 1653–1675 (2019). https://doi.org/10.1007/s10994-019-05821-2 es_ES
dc.description.references Vygotsky, L.S.: Mind in Society: Development of Higher Psychological Processes. Harvard University Press, Cambridge (1978) es_ES
dc.description.references Weinshall, D., Cohen, G., Amir, D.: Curriculum learning by transfer learning: theory and experiments with deep networks. In: ICML, pp. 5235–5243 (2018) es_ES
dc.description.references Zhou, T., Bilmes, J.A.: Minimax curriculum learning: machine teaching with desirable difficulties and scheduled diversity. In: ICLR (Poster) (2018) es_ES
dc.description.references Zhu, X.: Machine teaching: an inverse problem to machine learning and an approach toward optimal education. In: AAAI, pp. 4083–4087 (2015) es_ES
dc.description.references Zhu, X., Singla, A., Zilles, S., Rafferty, A.: An overview of machine teaching. arXiv: 1801.05927 (2018) es_ES

