Written by: Andrew Ramsden – Strategic Consultant EMEA, Blackboard International

Ipswich, United Kingdom

Becoming an independent or self-regulated learner is an important part of becoming a more effective learner. Self-regulated learners are characterised by “their responsiveness to feedback regarding the effectiveness of their learning, and by their self perceptions of academic accomplishment” (Zimmerman, 1990:14). For this to occur, not only must the learner take responsibility for their weaknesses, but Faculty need to give them the opportunity to identify, correct and improve upon those weaknesses (Fritz, 2013).

The following discussion has two aims: first, to outline how a Faculty member can redesign their learning activities using the virtual learning environment (VLE) to provide richer feedback and trigger conversations that nudge students towards becoming more effective self-regulated learners; and second, to outline the case for Senior Managers to drive this redesign process so that the institution can take better advantage of learning analytics and data-driven decision making.

Davenport et al. (2010) define learning analytics as “the application of analytic techniques to analyze educational data, including data about the learner and teacher activities, to identify patterns of behaviors and provide actionable information to improve learning and learning related activities”. An important aspect of this definition is that the data must be “actionable”, whether by the learner, the teacher or another stakeholder group.

To gather data and take action, the learning model needs to move away from an orthodox approach designed around a few high-stakes summative assessments (typically one essay and an unseen exam) towards one which provides more frequent feedback opportunities and learning loops for the student, while remaining sustainable and scalable for the Faculty member. The orthodox design does not readily develop self-regulated learners because it seldom generates enough reliable and actionable data points.

Consequently, the redesigned approach should include using the VLE quiz engine, the submission of short online writing tasks (with defined marking criteria) and, potentially, classroom voting technologies. For example, you could deploy four online tests (of five questions each) to be completed by all students at regular intervals throughout the course. This would generate a significant amount of actionable data without significant extra work. By using a variety of question types you can also test the higher-order thinking skills of analysis and synthesis. The question types might include Likert scale items (to what extent do you agree with …) and short-answer questions (in less than 200 words, explain why …).

Each student will be able to access their score and feedback online and compare themselves to the average grade, while the Faculty member can dedicate a proportion of the next face-to-face teaching session to providing additional feedback on the questions. The learning model should align with the seven principles of good feedback, which include helping to clarify what good performance is, encouraging reflection in learning, encouraging teacher and peer dialogue around learning, and providing information to Faculty that can shape their teaching (Nicol & Macfarlane-Dick, 2006).
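To make the “compare themselves to the average grade” step concrete, the minimal Python sketch below shows how quiz scores exported as a CSV could be summarised so that each student sees their own mark alongside the cohort average for every test. The file name, column names and student identifier are illustrative assumptions, not a Blackboard export format or API.

```python
# Illustrative sketch: turn an exported quiz-score spreadsheet into simple
# per-student feedback against the cohort average.
import csv
from statistics import mean

# Assumed export format: one row per student, columns "Student", "Test 1" .. "Test 4".
QUIZ_COLUMNS = ["Test 1", "Test 2", "Test 3", "Test 4"]

def load_scores(path):
    """Read the exported CSV into a dict of {student: {test: score}}."""
    scores = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            scores[row["Student"]] = {
                col: float(row[col]) for col in QUIZ_COLUMNS if row.get(col)
            }
    return scores

def cohort_averages(scores):
    """Average score per test, across the students who attempted it."""
    return {
        col: mean(vals)
        for col in QUIZ_COLUMNS
        if (vals := [s[col] for s in scores.values() if col in s])
    }

def feedback_for(student, scores, averages):
    """One comparative feedback line per completed test for a single student."""
    lines = []
    for col, own in scores[student].items():
        diff = own - averages[col]
        lines.append(f"{col}: {own:.1f} (cohort average {averages[col]:.1f}, {diff:+.1f})")
    return "\n".join(lines)

if __name__ == "__main__":
    scores = load_scores("grade_centre_export.csv")   # hypothetical export file
    averages = cohort_averages(scores)
    print(feedback_for("student_001", scores, averages))
```

A summary like this could be pasted into individual feedback messages or used to decide which questions deserve attention in the next face-to-face session.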

So, why should Senior Managers encourage Faculty to enhance their technology-based learning designs? The simple answer is that, from the institutional perspective, using the VLE to generate and share data underpins the effective adoption of learning analytics. These individual redesigns act “as a series of small steps designed to gain experience and make the case that data based decisions have enhanced value” (Bichsel, 2012:26). Small-scale pilots are essential: learning analytics does not require perfect data, but to identify its benefits an institution needs to apply it to its own unique conditions. A number of small-scale pilots focussed on enhanced reporting and effective intervention strategies will therefore enable the institution to improve its readiness for learning analytics, and will instigate conversations around institutional culture, people and processes, as well as technology infrastructure.

Where next for you? If you are Faculty, a good starting point would be to explore the functionality and possibilities of the online quiz tool and Grade Center. If you are a Senior Manager, a good starting point would be to explore effective academic adoption models within the context of your institution. Information on both is available from Blackboard (http://www.blackboard.com).


————————————————

Bichsel, J. (2012) ECAR Study of Analytics in Higher Education. EDUCAUSE Center for Applied Research. Available at: http://www.educause.edu/library/resources/2012-ecar-study-analytics-higher-education (Accessed October 2015)

Davenport, T., Harris, J., and Morrison, R. (2010) Analytics at Work. Harvard Business School Publishing Corporation

Fritz, J. (2013) Using Analytics at UMBC. EDUCAUSE Center for Applied Research, Research Bulletin. Available at: https://net.educause.edu/ir/library/pdf/ERB1304.pdf

Nicol, D., and Macfarlane-Dick, D. (2006) Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218

Nussbaumer, A., Hillemann, E-C., Gutl, C., and Albert, D. (2015) Competence-based Service for Supporting Self-Regulated Learning in Virtual Environments. Journal of Learning Analytics, 2(1), 101-133

Zimmerman, B. (1990) Self-Regulated Learning and Academic Achievement: An Overview. Educational Psychologist, 25(1), 3-17