The Impact of the Online Learning Readiness Self-Check Survey with Australian Tertiary Enabling Students

Abstract

This study reports on two key aspects of the use of the Online Learning Readiness Self-Check (OLRSC) survey, which has been proposed as a means of identifying non-traditional students’ readiness for online learning, and their strengths and weaknesses in six key areas. The first aspect validates the use of the instrument based on data from 199 students engaged in an online tertiary enabling course at a regional university in Australia. Factor analysis verified the scale structure of the instrument; however, two items were removed prior to the final analysis due to low communality and/or high cross-loading with other items. This is followed by an examination of whether the instrument might be useful for the early identification of students who are at risk of disengagement from the enabling program. While it was hypothesised that the instrument, which measures factors such as the quality of interaction with peers and instructors, the capacity to manage technology and how well students manage their learning, would be a useful tool for identifying early disengagement, the hypothesis was not supported. No significant associations were identified between any of the instrument’s scales and early withdrawal from the course or completion of the first unit of study. Future recommendations for educators are made with a view to improving student engagement.

Share and Cite:

Whannell, R., Parkes, M., Bartlett-Taylor, T. and Harrington, I. (2024) The Impact of the Online Learning Readiness Self-Check Survey with Australian Tertiary Enabling Students. Creative Education, 15, 856-866. doi: 10.4236/ce.2024.155052.

1. Introduction

Students who enter university enabling programs demonstrate a wide range of learning readiness. Irrespective of age, there are a variety of reasons why students may be challenged in their attempt to re-engage with education, including having a limited educational background, personal and/or environmental barriers, competing opportunities, being challenged by previous educational experiences, or being absent from formal education for such an extended period of time that they have limited confidence in their ability to successfully engage with a tertiary enabling course. For these reasons, an underpinning characteristic of tertiary enabling programs is a focus on student support to ensure that each student has the opportunity to succeed to their potential (Crawford et al., 2019; Motta & Bennett, 2018). This paper describes one attempt at an Australian regional university to identify non-traditional online students in a tertiary enabling program who may be at higher risk of disengaging with their study, so that targeted interventions could take place.

2. Background

A key goal for tertiary enabling education is to “assist academically underprepared learners to acquire the necessary knowledge, skills and confidence to transition to and succeed in higher education” (Willans & Seary, 2018: p. 48). Strategies identified to help increase the likelihood of success in enabling programs include building appropriate supportive relationships with the university, academic staff and peers (Lisciandro & Gibbs, 2016; Pham, 2022), developing an appropriate learning environment in which to study (Shah et al., 2014) and supporting students’ capacity to maintain commitment, motivation and self-belief related to their study and learning goals (Syme et al., 2022; Whannell & Whannell, 2015). One of the key challenges in enabling education, in addition to managing the diversity of students in such programs, is retaining them (Willans & Seary, 2018).

Students who enrol in enabling education programs in Australia have frequently been reported as being at high risk of attrition. For example, Li and Carroll (2017) found that students from equity groups were at greater risk of university attrition. Nelson et al. (2009) reported similar findings for equity students at regional universities. Further, they found that for students belonging to multiple equity groups, as may be the case for many enabling students, these factors compounded, resulting in an additional impact on completions. Accordingly, in higher education generally, and enabling education in particular, there has been substantial research that has attempted to identify students who are at a high risk of attrition (e.g. Chai & Gibson, 2015; Whannell & Whannell, 2014; Willans & Seary, 2018). This study adds to this literature by exploring the utility of the Online Learning Readiness Self-Check (OLRSC) (Cheon et al., 2021) with equity students at a regional Australian university.

The Online Learning Readiness Self-Check Survey

The OLRSC survey was developed and validated by Cheon et al. (2021). The validation was completed using a dataset comprising “505 prospective online learners with diverse background[s]” (p. 599) and employed both exploratory and confirmatory factor analysis. The instrument comprises 23 items, which were identified as having a six-factor structure; the items and the factor structure are summarised in Table 1 below.

The Cronbach’s alpha values indicate a high level of internal consistency in each of the scales (Ho, 2006).

Cheon et al. (2021) proposed a number of opportunities available in the use of the OLRSC with non-traditional students, including that these learners would be able to “recognize their strengths and weaknesses in regard to online learning” (p. 614). They also argued that students with lower online readiness scores, as identified by the OLRSC, might abandon online learning, and that the use of the instrument was appropriate to “evaluate current readiness levels and provide online learning tips or guidelines to improve factors with low scores” (p. 614). The provision of customised resources for students was also recommended.

Of particular interest to the researchers in this study was that, although Cheon et al. (2021) proposed that the OLRSC was suitable for use with non-traditional students, they stated that the “majority of the participants (79%) had a postsecondary degree. In particular, 33.3% of the participants had a graduate degree” (p. 606). This contrasts greatly with the demographics of students who enrol in tertiary enabling courses in Australia, including at the institution where this study was conducted, where few have any form of post-secondary qualification. This brought into question whether the instrument was valid for use with enabling students in the Australian context. Despite the difference in the background of the target cohort, the scales were considered appropriate for use in this study.

3. Method

The research questions that guided the project were:

· How valid is the Online Learning Readiness Self-Check survey for use with students enrolled in a tertiary pathways enabling course?

· What capacity does the early use of the Online Learning Readiness Self-Check survey in a tertiary pathways enabling course have for predicting student attrition/retention?

Table 1. OLRSC factor structure (Cheon et al., 2021).

3.1. Unit Content

The OLRSC as developed by Cheon et al. (2021) includes six scales, namely Learning Management, Space Management, Technology Management, Interaction with Instructors, Interaction with Peers and Motivation Management. Content was developed to support each of the areas addressed by these scales and was included in a module in the Moodle LMS. The existing Moodle material also included content to specifically target the enhancement of students’ academic skills relating to writing, numeracy, information technology and how to interact with academic staff and peers. The study plan for the module required students to complete the survey, which was available in Qualtrics. On completion, the result for each scale was provided, and the student was advised to continue to the supporting content in the LMS, particularly in those areas where the student’s scale result was considered low. The additional content was expected to take approximately one week to complete.

3.2. Participants

Potential participants were students enrolled in the two foundation units located in the enabling course for the Trimester 2 and 3 sessions in 2022. Students were notified of the research in week 2 of the trimester by an announcement in the Moodle LMS used by the institution, which also generated an email to each student. The survey was available for completion up to the end of week 5 of the trimester via the online survey tool Qualtrics. At the completion of the study, and following cleaning of the dataset in which incomplete responses were removed, 199 surveys were available for analysis from a total population of 480 students, representing a 41.5% response rate. Of the students who responded, over 82% identified as female, indicating a strong gender bias among the participants. A similar gender bias was seen in the total enrolments in the pathways program for these trimesters, where 74% of all students were female. Participant ages ranged from 17 to 72, with a mean of 29.4.

At the conclusion of the survey, participants were presented with their summative scores on each of the six scales available. They were also provided with a link to the customised resources in the teaching materials in Moodle that could be used to understand the meaning of the result, and how they might develop their capacities in each area.

At the completion of the project, the data was downloaded, and the researchers were provided with the email address for each of the participants to allow matching of survey results with engagement in the enabling unit. This process was in accordance with the ethics approval for the project.

3.3. Analysis

The data available was examined and partial responses were removed prior to analysis. Given that the six-factor structure of the OLRSC had been validated by Cheon et al. (2021), the initial analysis to confirm this structure was a Principal Components Analysis (PCA) with Direct Oblimin rotation, which allows for correlation between the factors, conducted using all 23 items (Ho, 2006). Factors were considered suitable for use if the eigenvalue for the factor was greater than one and the Scree Test indicated suitability (Ford, MacCallum, & Tait, 1986). Individual items were considered appropriate for inclusion in a factor if the item communality was 0.5 or greater (Child, 2006) and the item loaded on the factor with a value of greater than 0.5, with cross-loadings of less than 0.2 (Ho, 2006).
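To make these criteria concrete, the following is a minimal sketch of how the factor-retention and item checks described above might be reproduced in Python using the open-source factor_analyzer package. The study itself used SPSS, so the file name and column labels here are hypothetical.

```python
# Sketch only: the original analysis was run in SPSS; factor_analyzer is used
# here as an open-source stand-in. File name and column labels are hypothetical.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_kmo

items = pd.read_csv("olrsc_responses.csv")  # 23 OLRSC items, one row per respondent

# Sampling adequacy check (cf. Dziuban & Shirkey, 1974).
_, kmo_total = calculate_kmo(items)
print(f"KMO = {kmo_total:.3f}")

# Principal components extraction with Direct Oblimin rotation,
# which allows the factors to correlate.
fa = FactorAnalyzer(n_factors=6, method="principal", rotation="oblimin")
fa.fit(items)

# Factor retention: eigenvalues greater than one (the scree plot
# should also be inspected).
eigenvalues, _ = fa.get_eigenvalues()
print("Factors with eigenvalue > 1:", (eigenvalues > 1).sum())

# Item retention: communality >= 0.5, primary loading > 0.5,
# cross-loadings < 0.2.
communalities = pd.Series(fa.get_communalities(), index=items.columns)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)  # pattern matrix
print("Low-communality items:\n", communalities[communalities < 0.5])
```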

To allow testing of the capacity of the OLRSC to be used as a tool to predict outcomes in the enabling unit, student engagement was operationalised using two variables. The first variable, Engagement, was calculated from an examination of the Moodle logs. Students had access to Moodle for a period of 14 weeks, from when the unit became available to the date of submission of the final assessment task for the unit. Students who completed all assessment tasks, irrespective of whether a pass grade was achieved, were given a result on the Engagement variable of 14. Students who did not complete all assessments were allocated an Engagement value corresponding to the first week in which the Moodle activity logs indicated that they no longer accessed the unit content. Thus, a student whose final access to Moodle was in week 7 was allocated an Engagement value of 8. Due to a high incidence of student attrition early in the trimester, the resultant negative skewing of the Engagement variable indicated that the non-parametric Spearman’s rho correlation was appropriate for use (Field, 2013).
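As an illustration of this operationalisation, the sketch below derives the Engagement variable from a simplified log extract and computes Spearman’s rho against one scale score. The file names, column names and student IDs are assumptions, as the actual Moodle export format was not reported.

```python
# Sketch only: a simplified reconstruction of the Engagement variable.
# File names, column names and student IDs are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

logs = pd.read_csv("moodle_logs.csv")    # columns: student, week (1-14); one row per access
scales = pd.read_csv("olrsc_scales.csv", index_col="student")
completed_all = {"s012", "s047"}         # hypothetical IDs of students who submitted every task

last_access = logs.groupby("student")["week"].max()

def engagement(student) -> int:
    """Completers score 14; otherwise the first week with no further access."""
    if student in completed_all:
        return 14
    return min(int(last_access.get(student, 0)) + 1, 14)  # final access in week 7 -> 8

scales["Engagement"] = [engagement(s) for s in scales.index]

# Spearman's rho, as the early-attrition pattern skews Engagement.
rho, p = spearmanr(scales["InteractionWithInstructors"], scales["Engagement"])
print(f"rho = {rho:.3f}, p = {p:.3f}")
```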

It was hypothesised that there would be statistically significant differences in one or more of the OLRSC scales based on unit completion. The second variable, Completion, was a nominal variable operationalised by examining whether students had completed and submitted all assessment tasks for the unit. Due to non-normal distributions in some of the OLRSC scales, the Mann-Whitney U-test was used to determine whether there were any significant differences in the OLRSC scales based on Completion (Field, 2013).
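A companion sketch for the completion comparison, again with hypothetical file and column names: each OLRSC scale score is compared across completers and non-completers using scipy’s Mann-Whitney U-test.

```python
# Sketch only: Mann-Whitney U-tests of each OLRSC scale by Completion (0/1).
import pandas as pd
from scipy.stats import mannwhitneyu

scales = pd.read_csv("olrsc_scales.csv")  # hypothetical layout, as above
scale_names = ["LearningManagement", "SpaceManagement", "TechnologyManagement",
               "InteractionWithInstructors", "InteractionWithPeers",
               "MotivationManagement"]

for name in scale_names:
    completers = scales.loc[scales["Completion"] == 1, name]
    others = scales.loc[scales["Completion"] == 0, name]
    u, p = mannwhitneyu(completers, others, alternative="two-sided")
    print(f"{name}: U = {u:.0f}, p = {p:.3f}")
```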

4. Findings

4.1. Validation of the Online Learning Readiness Self-Check Survey

The 199 valid responses to the survey were entered into SPSS version 27. The initial PCA, using Direct Oblimin rotation to allow for correlation between the factors, was conducted using all 23 items (Ho, 2006). An examination of the scree plot and of the factor eigenvalues greater than one indicated that the six-factor solution was supported, with a Kaiser-Meyer-Olkin Measure of Sampling Adequacy of 0.839 and 73.4% of the variance in the items accounted for (Dziuban & Shirkey, 1974). However, item LM1 demonstrated a relatively low communality (0.434) and a high cross-loading on the Motivation Management factor (0.294), while item LM5 demonstrated a high cross-loading on the Interaction with Peers factor (0.362). These two items were removed and the analysis was repeated.

The final PCA using the remaining 21 items demonstrated a Kaiser-Meyer-Olkin Measure of Sampling Adequacy of 0.822 with item communalities ranging from 0.519 to 0.946. Six factors were identified accounting for 76.1% of the variation in the items. The Scree plot is shown in Figure 1.

The factor loadings for the PCA are shown below in Table 2.

Figure 1. Principal components analysis scree plot.

Table 2. Pattern matrix for PCA using direct Oblimin rotation.

Table 3. Descriptive statistics-OLRSC scales.

The Cronbach’s alpha for each of the final scales was between 0.761 and 0.963, indicating a sound to high level of internal consistency for all scales: Learning Management—0.761; Interaction with Peers—0.899; Technology Management—0.838; Space Management—0.872; Interaction with Instructors—0.963; Motivation Management—0.885 (Buckingham & Saunders, 2004).
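For readers wishing to reproduce the reliability check, a minimal sketch computing Cronbach’s alpha directly from its definition is given below. The item column labels (II1 to II4) and the number of items per scale are assumptions carried over from the earlier hypothetical file.

```python
# Sketch only: Cronbach's alpha from its definition,
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

data = pd.read_csv("olrsc_responses.csv")  # hypothetical file, as above
# Hypothetical item labels for the Interaction with Instructors scale.
print(f"alpha = {cronbach_alpha(data[['II1', 'II2', 'II3', 'II4']]):.3f}")
```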

Table 3 provides the descriptive statistics for each of the scales. It is evident that all scales demonstrated a wide spread of scores, with some students reporting very low results.

An examination of the histograms and box plots indicated a negative skewing of the data in some scales. This analysis indicated that the 21-item version of the OLRSC would be appropriate for use with the non-traditional students enrolled in the pathways enabling program; however, the decision was made that further data analysis would be conducted using non-parametric techniques (Field, 2013).

4.2. Predictive Capacity of the OLRSC

Based on the reviewed literature, it was hypothesised that students who scored higher on the scales of the OLRSC would demonstrate a higher level of engagement and completion in the enabling unit. The Spearman correlations of Engagement with the OLRSC scales are shown in Table 4. The second variable, Completion, was a nominal variable indicating whether the student had completed all assessment tasks in the unit. The Mann-Whitney U-test results for each of the scales based on Completion are shown in Table 5.

5. Discussion

It was hypothesised that the OLRSC scales would be a useful indicator of a non-traditional online student’s capacity to successfully engage with their online study in the tertiary enabling course and, as a consequence, would be useful in identifying at-risk students. While the OLRSC appears to be a robust survey for use with Australian enabling students, its usefulness as a tool to assist with the early identification of at-risk students was not supported in this study.

The lack of a significant correlation between the Engagement variable and any of the OLRSC scales, and the lack of statistically significant differences based on unit completion, are contrary to what was expected based on the literature. By way of example, a study by Farr-Wharton et al. (2018) of first- and second-year undergraduates at a similar Australian regional university to that where this study was conducted found “compelling evidence regarding the role of lecturer-student relationships in enhancing student outcomes” (p. 167). Studies with tertiary enabling students have also supported this view (e.g. Cavanagh et al., 2012; Bunn, 2019). Syme et al. (2022) argue that in the tertiary enabling context, high quality outcomes require “a trusting and open student-teacher relationship” (p. 2428). In this study, there was little association between the Engagement variable and the nature of the interaction with instructors (r = 0.037, p = 0.606). Similarly, there was little evidence of a difference in the variable based on unit completion (U = 3752, p = 0.568). When the items of the Interaction with Instructors scale of the OLRSC are considered, it appears that the items are quite limited in scope, with a focus on practical actions that are required to access content via the instructor (e.g. II1: I ask the instructor questions when needed; II2: I seek assistance from the instructor when needed). These items do not address the quality or nature of the relationship with instructors, and may therefore not be capturing those aspects that are predictive of overall outcome.

Table 4. Spearman correlations: Engagement with OLRSC scales.

Table 5. Mann-Whitney U-test: OLRSC scales based on unit completion.

When the rationale provided by Cheon et al. (2021) for the items included in the OLRSC is considered, it appears appropriate and well grounded in the extant literature. The items appear to address aspects of the tertiary study environment that non-traditional students would need to develop in order to succeed at university. For this reason, the instrument and the associated support materials developed for this study are still included in the unit content. The introduction to the survey and the support materials are presented as tools to assist students in understanding the level of development of specific skills and how they might be enhanced.

6. Conclusion

This research project investigated the validity of the OLRSC survey for use with two cohorts of Australian tertiary enabling students. With the exception of two items in the Learning Management scale, which were excluded, the survey provided six scales that validated appropriately and demonstrated the same scale structure as that found by Cheon et al. (2021). However, the capacity of the instrument to be used for predictive purposes, to identify an enabling student who may be more at risk of disengagement and early attrition, was not supported. This finding appears somewhat contrary to what was expected based on the literature and on what the scales of the OLRSC appear to measure. When the nature of the items in the various scales of the instrument is considered, the lack of predictive capacity of the instrument is unusual. This appears to be an area that warrants additional research with a view to developing an instrument that can be used to identify students who are at risk of attrition, so that appropriate interventions may be put in place.

A limitation of this study is that it has used a quantitative approach based on engagement and unit completion. Future qualitative research would need to be undertaken to establish whether the completion of the OLRSC and engagement with the accompanying support materials were of use to students, either in terms of assisting them to develop the relevant skills addressed or to inform them of strategies that could be used to assist them in the transition into their tertiary study.

Data Availability Statement

The data that support the analysis and findings of this study are not available, as the ethics approval does not allow sharing of data beyond the researchers involved.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

References

[1] Buckingham, A., & Saunders, P. (2004). The Survey Methods Handbook. Polity Press.
[2] Bunn, R. J. (2019). We Need to Help Students Discover Themselves and See into the Life of Things: Advice from Open Foundations Lecturers. In A. Jones, A. Olds, & J. G. Lisciandro (Eds.), Transitioning Students into Higher Education (pp. 2-11). Routledge.
https://lo.unisa.edu.au/pluginfile.php/2570959/mod_resource/content/1/Rosalie Bunn - the art of being an enabling educator.pdf
https://doi.org/10.4324/9780429279355-19
[3] Cavanagh, T., Macfarlane, A., Glynn, T., & Macfarlane, S. (2012). Creating Peaceful and Effective Schools through a Culture of Care. Discourse, 33, 443-455.
https://doi.org/10.1080/01596306.2012.681902
[4] Chai, K., & Gibson, D. (2015). Predicting the Risk of Attrition for Undergraduate Students with Time Based Modelling. In 12th Cognition and Exploratory Learning in Digital Age (pp. 109-116).
http://files.eric.ed.gov/fulltext/ED562154.pdf
[5] Cheon, J., Cheng, J., & Cho, M. (2021). Validation of the Online Learning Readiness Self-Check Survey. Distance Education, 42, 599-619.
https://doi.org/10.1080/01587919.2021.1986370
[6] Child, D. (2006). The Essentials of Factor Analysis (3rd ed.). Continuum International Publishing Group.
[7] Crawford, N., Kift, S., & Jarvis, L. (2019). Supporting Student Mental Wellbeing in Enabling Education: Practices, Pedagogies and a Philosophy of Care. In A. Jones, A. Olds, & J. G. Lisciandro (Eds.), Transitioning Students into Higher Education (pp. 161-170). Routledge.
https://doi.org/10.4324/9780429279355-20
[8] Dziuban, C., & Shirkey, E. (1974). When Is a Correlation Matrix Appropriate for Factor Analysis? Some Decision Rules. Psychological Bulletin, 81, 358-361.
https://doi.org/10.1037/h0036316
[9] Farr-Wharton, B., Charles, M. B., Keast, R. L., Woolcott, G., & Chamberlain, D. E. (2018). Why Lecturers Still Matter: The Impact of Lecturer-Student Exchange on Student Engagement and Intention to Leave University Prematurely. Higher Education, 75, 167-185.
https://doi.org/10.1007/s10734-017-0190-5
[10] Field, A. (2013). Discovering Statistics Using IBM SPSS. Sage.
[11] Ford, J., MacCallum, R., & Tait, M. (1986). The Application of Exploratory Factor Analysis in Applied Psychology: A Critical Review and Analysis. Personnel Psychology, 39, 291-314.
https://doi.org/10.1111/j.1744-6570.1986.tb00583.x
[12] Ho, R. (2006). Handbook of Univariate and Multivariate Data Analysis and Interpretation with SPSS. Chapman & Hall/CRC.
https://doi.org/10.1201/9781420011111
[13] Li, I., & Carroll, D. (2017). Factors Influencing University Student Satisfaction, Dropout and Academic Performance: An Australian Higher Education Equity Perspective.
https://www.ncsehe.edu.au/publications/factors-influencing-university-student-satisfaction-dropout-and-academic-performance-an-australian-higher-education-equity-perspective/
[14] Lisciandro, J., & Gibbs, G. (2016). On Track to University: Understanding Mechanisms of Student Retention in an Australian Pre-University Enabling Program. Australian Journal of Adult Learning, 56, 198-224.
https://files.eric.ed.gov/fulltext/EJ1107578.pdf
[15] Motta, S., & Bennett, A. (2018). Pedagogies of Care, Care-Full Epistemological Practice and ‘Other’ Caring Subjectivities in Enabling Education. Teaching in Higher Education, 23, 631-646.
https://doi.org/10.1080/13562517.2018.1465911
[16] Nelson, K., Duncan, M., & Clarke, J. (2009). Student Success: The Identification and Support of First Year University Students at Risk of Attrition. Studies in Learning, Evaluation, Innovation and Development, 6, 1-15.
https://eprints.qut.edu.au/28064/
[17] Pham, M. (2022). Enabling the Enablers: Professional Development for Peer Leaders to Enhance the Learning Experience of Enabling Education Students. Journal of Peer Learning, 15, 4-16.
https://ro.uow.edu.au/ajpl/vol15/iss1/2
[18] Shah, M., Goode, E., West, S., & Clark, H. (2014). Widening Student Participation in Higher Education through Online Enabling Education. Widening Participation and Lifelong Learning, 16, 36-57.
https://doi.org/10.5456/WPLL.16.3.36
[19] Syme, S., Roche, T., Goode, E., & Crandon, E. (2022). Transforming Lives: The Power of an Australian Enabling Education. Higher Education Research & Development, 41, 2426-2440.
https://doi.org/10.1080/07294360.2021.1990222
[20] Whannell, R., & Whannell, P. (2014). Identifying Tertiary Bridging Students at Risk of Failure in the First Semester of Undergraduate Study. Australian Journal of Adult Learning, 54, 101-120.
http://files.eric.ed.gov/fulltext/EJ1033865.pdf
[21] Whannell, R., & Whannell, P. (2015). Identity Theory as a Theoretical Framework to Understand Attrition for University Students in Transition. Student Success, 6, 43-52.
https://doi.org/10.5204/ssj.v6i2.286
[22] Willans, S., & Seary, K. (2018). Why Did We Lose Them and What Could We Have Done? Student Success, 9, 47-60.
https://doi.org/10.5204/ssj.v9i1.432

Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.