About the symposium

The symposium brings together international experts, researchers, and practitioners around the theme of Learning Analytics for feedback at scale. The goal of the symposium is to contribute to bridging the gap between research and actual educational practice. The symposium focuses in particular on using learning analytics to provide the involved stakeholders with feedback at scale. These stakeholders include students and teachers, but also student advisers and policy makers.

Invited international experts will share their research and experiences in plenary sessions. Additionally, all attendees are invited to submit a poster for discussion during the symposium. The symposium will provide plenty of opportunities for interaction among attendees and speakers.

Program

The program is as follows:

  • 11:15 – 11:30: Registration
  • 11:30 – 12:50: Poster and demo session & lunch
  • 12:50 – 13:00: Welcome by Katrien Verbert
  • 13:00 – 13:30: Closing the Loop in Education – Carlos Delgado Kloos

    In control systems there is a feedback loop that compares the obtained output with the expected one, which makes it possible to correct deviations. Robust theories exist and many systems incorporate them in their operation: thermostats, power steering systems, and SCADA systems, to mention just a few.
    Learning frameworks have in essence been open-loop systems, or systems with a very weak and coarse feedback loop. At the end of a course, if a student's learning reaches a minimum level, the student can advance to the next course; if not, the course has to be repeated. That is a feedback loop, but at a very coarse level. However, one can include much faster feedback loops in the learning design in order to achieve better learning results. Examples of learning frameworks with fine-grained closed loops are formative assessment, learning-analytics-based interventions, mastery learning, and improvement of teaching material through effectiveness analysis.

    In the talk, we will analyse educational systems from this point of view and give appropriate examples.

  • 13:30 – 14:00: Learning from Learning Analytics: risky assumptions about the efficacy of feedback for learners – Ed Foster

    If assessment is the ‘engine that drives learning’, feedback is ‘the oil that lubricates the cogs of understanding’ (Race, 2006). Arguably, learning analytics provides learners, lecturers, and study advisers with new levels of sophisticated feedback data in a far timelier manner than has historically been available. This ought to be transformative. Ought to be.

    Nottingham Trent University implemented a whole-institution learning analytics resource in 2014/15. The Student Dashboard provides information to both students and staff about how students are engaging with their studies. Our experience is that around 40% of students frequently use the resource; these students also show the highest engagement and are the most likely to progress and achieve the best grades. It appears that for students with sophisticated approaches to learning, learning analytics is an absolute boon. However, far more work is needed to develop students’ capacity to learn from feedback presented directly by learning analytics, or mediated by staff. Assuming that all students can learn from learning analytics feedback is, at the very least, risky.

    (Race, P., 2006. The Lecturer’s Toolkit: A practical guide to assessment, learning and teaching. Oxon: Routledge.)

  • 14:00 – 14:30: Feedback automation for complex learning tasks – Monique Snoeck
    Teaching complex subjects generates a high demand for personal, elaborative, and cognitive feedback. With large numbers of students this quickly becomes too time-consuming a task for a teacher. In this presentation I will explain two approaches to feedback automation for teaching conceptual modelling. The first approach consists of feedback features built into a conceptual modelling tool; the second approach makes use of standard features in the edX platform. Both approaches rely on the systematic analysis of student errors. These errors are then used to create cognitive feedback and exercises for part-task practice with corresponding feedback. The feedback has been shown to have a positive impact on learning.
  • 14:30 – 15:00: Scaling personalized feedback – Yi-Shan Tsai

    Feedback is a crucial part of communication between students and teachers in terms of clarifying expectations, monitoring the current progress of learners, and moving towards desired learning goals. However, there is substantial evidence showing that higher education struggles to deliver consistent, timely, and constructive feedback that meets the needs and expectations of students. This inadequacy is partly due to the conflict between an increasing focus on ‘massiveness’, ‘inclusiveness’, and ‘personalisation’ in higher education and the as-yet unmatched capacity of staff to produce feedback that speaks to the needs of individual students. OnTask is a semi-automated feedback tool that uses ‘if…then’ rules to help teachers compose personalised feedback for a large cohort of students based on parameters relevant to the course design (see the illustrative sketch after the program below). In this talk, I will discuss key elements of effective feedback and how we can leverage feedback practice using OnTask.

  • 15:00 – 15:30: Break and poster and demo session
  • 15:30 – 16:00: Learning analytics and feedback: not only for students – Jan Elen
    Learning happens in an environment. Learning results from the interaction between characteristics of the learners and characteristics of the environment. In learning analytics research the focus is on the learner: using data on learning behaviour, learning analytics provides feedback to the learner with a view to predicting learning outcomes and/or encouraging learners to change their behaviour. The characteristics of the learning environment largely remain out of scope.
    In this contribution two other potential uses of learning analytics data are suggested. The first proposal pertains to using learning analytics data to provide feedback on the validity of instructional design models. The second proposal inverts the current learning analytics logic by using learning analytics data to provide feedback on the quality of learning environments rather than on students.
  • 16:00 – 16:30: Interactive recommender systems and dashboards for learning – Katrien Verbert
    Researchers have become increasingly aware that the effectiveness of recommender systems goes beyond recommendation accuracy. Research on human factors has therefore gained interest, for instance by combining interactive visualization techniques with recommendation techniques to support transparency and controllability of the recommendation process. In this talk, I will present our work on interactive visualizations that enable end-users to interact with recommender systems. The objective is two-fold: 1) to explain the rationale of recommendations as a basis for increasing user trust and acceptance of recommendations, and 2) to incorporate user feedback and input into the recommendation process and help users steer this process. In addition, I will present the results of several user studies that investigate how such explanations and user control interact with personal characteristics, such as expertise and visual working memory. I will also present concrete applications in the field of learning analytics and job recommender systems that use visualization techniques to support interaction with recommender systems.
  • 16:30 – 17:00: Bridging the gap between educational research and practice with learning analytics: insights from studies in primary and secondary education – Frederik Cornillie
    In educational research, the collection and analysis of data on technology-mediated learning behaviour have long been used with a view to answering fundamental questions on learning. Typically, however, these data were collected in rather sterile learning tasks and less realistic contexts, compromising the ecological validity of the findings. Mobile technologies and learning analytics have now made it possible to study fundamental questions on learning in a variety of more realistic and less controlled environments, and the results of such research can inform educational technology practice. In this talk, we present two empirical studies informed by learning analytics that were carried out in primary and secondary education classrooms. The first study focuses on automatization processes and the role of corrective feedback in language learning; the second study focuses on task complexity and adaptive learning in maths practice. The findings provide instructional designers and teachers with insights on how to enhance and personalize the learning environments for their pupils.
  • 17:00 – 17:30: Learning Dashboards for feedback at scale – Tinne De Laet
    Learning analytics is hot. But are learning dashboards scalable and sustainable solutions for providing actionable feedback to students? Can learning dashboards be applied for feedback at scale? Is learning analytics applicable in more traditional higher education settings? This talk will share experiences and lessons learned from three European projects (STELA, ABLE, and LALA) that focus on scalable applications of learning dashboards and their integration within actual educational practice. Can learning dashboards, deployed at scale, create new learning traces? The talk will challenge your beliefs regarding “chances of success”, predictive models, and explainable interpretations.

  • 17:30 – 17:40: Closing
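
As a purely illustrative aside to the OnTask talk above, the sketch below shows one way ‘if…then’ rules could be used to compose personalised feedback for a cohort of students. It is a minimal sketch under assumed names: the student attributes (logins_last_week, quiz_score), the thresholds, and the compose_feedback helper are hypothetical examples and do not reflect the actual OnTask implementation or API.

    # Minimal, hypothetical sketch of rule-based personalised feedback,
    # in the spirit of the 'if...then' rules described in the OnTask talk.
    # Attribute names, thresholds, and messages are invented examples.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Student:
        name: str
        logins_last_week: int   # hypothetical engagement indicator
        quiz_score: float       # hypothetical formative quiz result (0-100)

    @dataclass
    class Rule:
        condition: Callable[[Student], bool]   # the 'if' part
        message: str                           # the 'then' part: a feedback fragment

    RULES: List[Rule] = [
        Rule(lambda s: s.logins_last_week == 0,
             "We noticed you have not logged in this week; try to revisit the course materials."),
        Rule(lambda s: s.quiz_score < 50,
             "Your last quiz score suggests the recent topic needs more practice."),
        Rule(lambda s: s.quiz_score >= 80,
             "Well done on the last quiz; consider trying the extension exercises."),
    ]

    def compose_feedback(student: Student) -> str:
        """Collect the messages of all rules whose condition holds for this student."""
        fragments = [rule.message for rule in RULES if rule.condition(student)]
        lines = [f"Dear {student.name},"] + fragments
        return "\n".join(lines)

    if __name__ == "__main__":
        cohort = [Student("Alex", logins_last_week=0, quiz_score=45.0),
                  Student("Sam", logins_last_week=4, quiz_score=88.0)]
        for s in cohort:
            print(compose_feedback(s), end="\n\n")

In an actual course, such rules would be authored by the teacher and grounded in parameters from the course design and learning analytics data, as described in the talk, rather than hard-coded.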

Speakers

Carlos Delgado Kloos

Carlos Delgado Kloos received the Ph.D. degree in Computer Science from the Technische Universität München (Germany) and in Telecommunications Engineering from the Universidad Politécnica de Madrid (Spain). He is Full Professor of Telematics Engineering at the Universidad Carlos III de Madrid, where he is the Director of the GAST research group, Director of the UNESCO Chair on “Scalable Digital Education for All”, and Vice President for Strategy and Digital Education. He is also the Coordinator of the eMadrid research network on Educational Technology in the Region of Madrid. He is the Spanish representative at IFIP TC3 on Education, Senior Member of IEEE, and Associate Editor of IEEE Transactions on Learning Technologies. He has been the Manager of ICT research projects at the Spanish Ministry and has carried out research stays at several universities such as MIT, Harvard, Munich, and Passau.

Ed Foster

Ed Foster is the Student Engagement Manager in Nottingham Trent University’s (NTU) Centre for Student and Community Engagement. He leads the implementation and delivery of NTU’s learning analytics resource, the Student Dashboard. Ed has worked on three Erasmus+ projects (ABLE, STELA and OfLA) exploring the technological, pedagogical, and ethical issues of using big data in education. Ed’s research interests include using learning analytics to support students, approaches to overcoming socio-economic disadvantage in education, student transition, and student engagement. Ed’s favourite chord is Dsus4. Ed blogs about learning analytics at www.livinglearninganalytics.blog.

Monique Snoeck

Monique Snoeck is a full professor at KU Leuven, Research Center for Management Informatics (LIRIS), and a visiting professor at the University of Namur. Her research focuses on smart learning environments, enterprise modelling, requirements engineering, model-driven engineering, and business process management. Her main guiding research themes are the integration of different modelling approaches into a comprehensive approach, the quality of models through formal grounding, model-to-code transformations, and the educational aspects of conceptual modelling. She has published over 130 peer-reviewed papers.

Yi-Shan Tsai

Yi-Shan Tsai is a research associate at the School of Informatics at the University of Edinburgh, with an affiliation to the Centre for Research in Digital Education. She currently works on two large multinational research projects on learning analytics and blended learning in collaboration with 11 different institutional partners in Europe and Latin America. Prior to this, she took the lead on a large learning analytics project involving 6 European institutional partners and investigated social and cultural factors that influence institutional adoption of learning analytics. Yi-Shan is currently an executive member of the Society for Learning Analytics Research (SoLAR). Her research interests span from learning analytics and digital storytelling to reading cultures and multimodal texts.

Jan Elen

Jan Elen is a full professor at KU Leuven, Centre for Instructional Psychology and Technology. For several years he was the head of the educational support office of KU Leuven. He has been the coordinator of the Special Interest Group on Instructional Design of the European Association for Research on Learning and Instruction, coordinator of the expertise network School of Education, vice-dean for education of the Faculty of Psychology and Educational Sciences, and senior editor of Instructional Science. He is currently academically responsible for the educational master’s programme in behavioural sciences. His research pertains to the design of learning environments for complex learning outcomes, with and without the use of technology. In addition to research projects funded by national and international research agencies in Europe, he also collaborates in projects in Peru, Ecuador, Ethiopia and Congo. He teaches courses at the bachelor and master level pertaining to the design of learning environments and to higher education.

Katrien Verbert

Katrien Verbert is an Associate Professor at the HCI research group of KU Leuven, Belgium. Her research interests include recommender systems, visualization techniques, visual analytics, and applications in healthcare, learning analytics, precision agriculture, and digital humanities.

Frederik Cornillie

Frederik Cornillie, PhD is research and valorization manager at ITEC, an imec research group at KU Leuven devoted to interdisciplinary research in educational technology. He is passionate about research and innovation in technology-enhanced learning, and collaborates with researchers, industrial partners, educational institutions, and other not-for-profit partners to bring engaging and effective technology-mediated learning solutions to society. His research specializes in CALL (computer-assisted language learning) and adaptive learning.

Tinne De Laet

Tinne De Laet is an Associate Professor at the Faculty of Engineering Science, KU Leuven, Belgium. She is the Head of the Tutorial Services of Engineering Science. Her research focuses on learning analytics, conceptual learning in mechanics, multiple-choice tests, and study success.

Sponsors

LESEC

LESEC, the Leuven Engineering and Science Education Center of the University of Leuven, was established in 2009 within the Science, Engineering and Technology group. All members of the center share a passion for engineering and science education and the drive to actively participate in engineering and science education research and development, together with other LESEC members and with other interested people from both within and outside the University of Leuven.

LICT

The mission of the LICT Center is to coordinate and promote top-level research on the design and application of ICT (Information and Communication Technology) systems, both hardware and software, to address societal and industrial needs, to make human life more comfortable and more secure, to improve our health and to conserve energy and the environment.

LALA project

The LALA (Learning Analytics Latin America) project aims to build local capacity to design, implement, and use learning analytics tools in Latin American Higher Education Institutions (HEIs), with the aid of European universities, so that these institutions have a powerful tool for problems where analysis of academic data is needed.

Erasmus+ program of European Commission

The European Commission support for the production of this publication does not constitute an endorsement of the contents which reflects the views only of the authors, and the Commission cannot be held responsible for any use which may be made of the information contained therein.

Practical details

Registration is free, but a no-show fee will be charged. The symposium will take place in the Erik Duval auditorium of the Department of Computer Science, KU Leuven. More information about the venue can be found on the department website. If you have any questions regarding the symposium, please send us an email: augment@cs.kuleuven.be.

Organizers

Katrien Verbert

Associate Professor

Katrien Verbert is an Associate Professor at the HCI research group of KU Leuven, Belgium. Her research interests include recommender systems, visualization techniques, visual analytics, and applications in healthcare, learning analytics, precision agriculture and digital humanities.

Robin De Croon

Postdoctoral Researcher

Robin De Croon is a postdoctoral researcher at the HCI research group of KU Leuven, Belgium. His research interests include healthcare informatics, visualization techniques, and gamification.

Tinne De Laet

Associate Professor

Tinne De Laet is an Associate Professor at the Faculty of Engineering Science, KU Leuven. She is the Head of the Tutorial Services of Engineering Science. Her research focuses on learning analytics, conceptual learning in mechanics, multiple-choice tests, and study success.

Martijn Millecamp

PhD Researcher

Martijn Millecamp is a PhD student at the HCI research group of KU Leuven, Belgium. His research interests include user interfaces for music recommender systems and dashboards for learning analytics.

Tom Broos

PhD Researcher

Tom Broos is a PhD student at the HCI research group of KU Leuven, Belgium. He researches scalable learning analytics interventions to support first-year students in their transition to higher education. His work emphasizes the active role of the receiver, analytical transparency, and privacy.