e-Analytics for e-Learning

Dafinka Miteva, Krassen Stefanov, Eliza Stefanova
DOI: 10.4018/IJHCITP.2017100101

Abstract

Nowadays, education is inconceivable without e-learning environments and the benefits they provide. Modern learning systems offer a wide range of facilities for gathering relevant user traces for data analysis. The main goal of the research described in this paper is to explore how learning analytics methods can be used to improve the results of e-Learning. A novel method for cross-system data collection is implemented in a newly designed database and system, e-Analytics. Using the collected data, various analyses and reports are presented and discussed, aiming to reveal important learner behaviors and regularities in the educational process. Recommendations are made for further improvements of the teaching and learning process. The paper concludes by enumerating challenges and further work for creating effective Learning Analytics tools.

Introduction

The process of education implies building a complex set of relationships between teachers and learners. In an effort to hand over their knowledge and experience to students, teachers ask themselves many questions related to increasing the quality of their teaching: “Will my course be useful for the students?”, “What is their background, and do they differ greatly in preliminary training?”, “Are my course materials suitable for this particular class of students, or do I need to adapt them further?”, and so on. These are just a small part of the hundreds of questions that teachers ask themselves and whose answers they try to find throughout their careers.

Researchers have been interested in what methods and tools can facilitate and automate educators’ efforts to analyze data about students’ results, achievements, interests and feedback. In 2010, in an attempt to support these efforts, George Siemens introduced a new concept, Learning Analytics (LA), defined as “the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning” (Siemens, 2010). This definition was refined during the first international conference on Learning Analytics and Knowledge (LAK, 2011) to “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”.

In parallel, the appearance of computer-based learning environments able to log every student activity, together with the availability of large computer storage capacity, led to a new trend in the analysis of student information – Educational Data Mining (EDM), “concerned with developing, researching, and applying computerized methods to detect patterns in large collections of educational data that would otherwise be hard or impossible to analyze due to the enormous volume of data within which they exist” (Romero & Ventura, 2013). At the same time, Rebecca Ferguson pointed out other main trends of educational analytics: “Educational data mining focused on the technical challenge: How can we extract value from these big sets of learning-related data?” and “Academic analytics focused on the political/economic challenge: How can we substantially improve learning opportunities and educational results at national or international levels?” (Ferguson, 2012).

In 2015 an Innovation and Technology in Computer Science Education (ITiCSE) working group published a detailed overview of the use of educational data mining and learning analytics in the teaching and learning of programming (Ihantola, et al., 2015). Surveying a decade of literature (2005 to 2015) on mining students’ programming processes, they highlighted the critical need to replicate studies, reproduce previous work and validate results, and introduced a taxonomy for carrying out such replication analyses. In their article the working group defined five grand challenges for researchers and practitioners:

1. To build and maintain a multi-language, multi-institution, multi-nation learning process data and experiment result database
2. To systematically analyze and verify previous studies using data from multiple contexts to tease out tacit factors that contribute to previously observed outcomes
3. To use pilots and experiments, with control and treatment groups, to evaluate and explain the results
4. To adopt results and practices into classroom use to continuously monitor and improve offered education
5. To generalize the results to other contexts, if possible, and help practitioners apply them in their respective fields. (Ihantola, et al., 2015)
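
To make the idea behind the first challenge, and behind the cross-system data collection used in e-Analytics, more concrete, the following is a purely illustrative sketch (not taken from the paper): learner traces from different systems are mapped onto one shared event form and merged into a single chronologically ordered dataset. All names here (LearningEvent, from_moodle_log, collect_events, the assumed log fields) are hypothetical assumptions, not the actual e-Analytics schema.

    # Hypothetical sketch: normalizing learner activity traces from several
    # e-learning systems into one shared form for later analysis.
    from dataclasses import dataclass
    from datetime import datetime
    from typing import Iterable, List

    @dataclass
    class LearningEvent:
        """A single learner trace in a system-independent form (assumed schema)."""
        learner_id: str      # pseudonymous learner identifier
        source_system: str   # e.g. "moodle", "assessment-tool"
        course_id: str
        action: str          # e.g. "viewed_resource", "submitted_quiz"
        timestamp: datetime

    def from_moodle_log(row: dict) -> LearningEvent:
        """Map one assumed Moodle-style log row onto the shared event form."""
        return LearningEvent(
            learner_id=str(row["userid"]),
            source_system="moodle",
            course_id=str(row["courseid"]),
            action=row["eventname"],
            timestamp=datetime.fromisoformat(row["timecreated"]),
        )

    def collect_events(sources: Iterable[Iterable[LearningEvent]]) -> List[LearningEvent]:
        """Merge events from all systems into one chronologically ordered list."""
        merged = [event for source in sources for event in source]
        return sorted(merged, key=lambda e: e.timestamp)

Once traces from different systems share one representation, analyses such as activity counts per learner or per course can be computed over a single dataset, which is the general idea behind cross-system learning analytics databases of the kind the paper describes.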
