Rethinking eLearning Practices in Higher Education: A Critical Reflection from Teaching Economic Statistics (EC203)

This week’s learning extended beyond understanding eLearning concepts to critically engaging with how these practices can be applied within my own teaching context. As a tutor for EC203: Economic Statistics, I found this particularly relevant, as teaching quantitative subjects online presents both opportunities and challenges that require careful pedagogical design.

A key insight from this session is that effective eLearning is not simply about integrating technology, but about aligning pedagogy, content, and assessment. The development of a Micro Learning Environment Design (MicroLED) reinforced this idea. Focusing on a single learning outcome required me to critically evaluate what students actually need to learn, rather than what I want to cover. In economic statistics, where topics such as regression analysis or hypothesis testing can be conceptually dense, this approach is particularly valuable. However, it also raises a critical question: can complex statistical reasoning truly be captured in short microlearning formats without oversimplifying the content? While microlearning promotes clarity and focus, breaking content into smaller units risks fragmenting understanding if those units are not carefully sequenced.
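To make this concrete, a single-concept micro-unit for EC203 might walk students through exactly one statistic and nothing more. The sketch below is my own illustration, not course material: the data and the benchmark value are hypothetical, and it assumes students have access to Python. It computes a one-sample t-statistic step by step using only the standard library, which is roughly the scale of content a micro-unit could carry.

```python
# A minimal sketch of a single-concept micro-unit: computing a one-sample
# t-statistic step by step. The scores and the benchmark are hypothetical.
import math
import statistics

scores = [52, 48, 57, 61, 49, 55, 53, 58]  # hypothetical student scores
mu0 = 50                                    # benchmark mean under H0

n = len(scores)
xbar = statistics.mean(scores)              # sample mean
s = statistics.stdev(scores)                # sample standard deviation
t = (xbar - mu0) / (s / math.sqrt(n))       # t-statistic with n - 1 df

print(f"mean = {xbar:.2f}, s = {s:.2f}, t = {t:.2f} with {n - 1} df")
```

The unit would stop here; interpreting the t-statistic against a critical value would be the next unit, which is precisely where careful sequencing matters.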

The process of redesigning a lesson from face-to-face to an online format further highlighted that eLearning requires transformation rather than replication. As noted in the Horizon Report (2016), emerging technologies such as learning analytics and adaptive systems are reshaping higher education by enabling more personalised learning experiences. In practice, tools such as Moodle logs and Early Warning Systems (EWS) can support instructors in identifying at-risk students and providing timely interventions. At the University of the South Pacific (USP), the implementation of EWS demonstrates how learning analytics can be used to monitor student engagement and improve outcomes. However, Ferguson and Clow (2017) caution that learning analytics should not be viewed as a purely technical solution; issues of data interpretation, ethics, and institutional capacity must also be considered. In the Pacific context, where infrastructure and digital literacy vary significantly, the effectiveness of such systems may be uneven.

Similarly, the discussion on BYOD and mLearning highlights both the potential and the limitations of mobile-first strategies. While mobile devices increase access to learning, particularly in geographically dispersed regions like the Pacific, access does not necessarily equate to equity. Students may rely heavily on smartphones, which can limit their ability to engage with data-intensive tasks such as statistical analysis. This is particularly relevant for EC203, where students are expected to work with datasets and software tools. Therefore, while BYOD strategies are identified in the Horizon Report (2016) as imminent trends, their implementation must be critically evaluated within local contexts.

The exploration of Virtual Reality (VR), Augmented Reality (AR), and makerspaces further underscores the sector's growing emphasis on experiential and interactive learning. These technologies have the potential to transform how abstract concepts are taught. For instance, in economic statistics, visualisation tools could enhance students’ understanding of data patterns and relationships. However, their adoption in Pacific Island contexts remains constrained by cost, infrastructure, and digital skills. As such, while these innovations are promising, their practical relevance may be limited in the short term.

The role of assessment in eLearning also emerged as a critical area of reflection. As Boud argues, students may be able to compensate for poor teaching, but not for poor assessment. This is particularly important in the context of increasing concerns around academic integrity and the rise of generative AI. Traditional forms of assessment may no longer be sufficient, and there is a growing need to design authentic, application-based tasks that encourage critical thinking. In EC203, this could involve using real-world datasets and requiring students to interpret results rather than simply perform calculations. Additionally, tools such as Turnitin, while useful, are limited as a means of preventing plagiarism and are more effective when used as tools for feedback and learning.

Feedback, both formal and informal, is another essential component of effective eLearning. The integration of technology allows for more immediate and personalised feedback, which can enhance student learning. However, this also requires instructors to be more intentional in how feedback is designed and delivered. In my own teaching, I recognise that feedback is often underutilised, and this session has encouraged me to think more critically about how to incorporate it as an ongoing process rather than a one-time activity.

Finally, engaging with eLearning case studies highlighted the importance of context in evaluating educational innovations. While global trends provide valuable insights, their application in the Pacific must be carefully adapted to local realities. Issues such as connectivity, resource availability, and institutional support play a significant role in determining the success of eLearning initiatives. This reinforces the need for educators to adopt a critical and reflective approach when integrating technology into their teaching.

In conclusion, this week has reinforced that effective eLearning is not about adopting the latest technologies, but about making informed pedagogical choices that enhance learning. For my teaching in EC203, this means focusing on clarity, alignment, and meaningful engagement while being mindful of the contextual challenges faced by students. Moving forward, I aim to integrate these insights into my practice, while continuing to critically evaluate the role of technology in higher education.


References

Ferguson, R., & Clow, D. (2017). Where is the evidence? A call to action for learning analytics. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 56-65). ACM.

New Media Consortium. (2016). NMC Horizon Report: 2016 Higher Education Edition.


