Turnitin Originality Report

neil by Neil Evans

From Chapter 4 Part 3 (Moodle TT) (Honours research report - AIS506 (Moodle TT))

  • Processed on 26-Oct-2015 09:33 SAST
  • ID: 590244918
  • Word Count: 8627
 
Similarity Index
21%
Similarity by Source
Internet Sources:
19%
Publications:
9%
Student Papers:
N/A
sources:
paper text:
Predicting the Acceptance of Electronic Learning at the University of Zululand

Neil Evans and Jerry le Roux, University of Zululand
EvansN@unizulu.ac.za, lerouxcjb@gmail.com

ABSTRACT We present a new and compelling method to help understand some of the important needs, perceptions and expectations of users of electronic (e-)learning resources at the University of Zululand by contextualizing their pedagogic place in this blended tertiary learning environment and confirming their approval by both academic staff and students. Predicting their acceptance was achieved conceptually by adopting the Unified Theory of Acceptance and Use of Technology (UTAUT) model and statistically validating its application to predict the behavioural intentions and usage behaviour of the primary users towards e-learning. While this validation rests on a positivist epistemological belief and deductive reasoning, this paper also embraces an interpretive research paradigm to include the researchers’ views on the topic. Partial least squares structural equation modelling and inferential statistics predicted the level of acceptance of e-learning by academic staff (adjusted R2 = 0.41) and students (adjusted R2 = 0.39) and illustrated the strengths and significances of the postulated UTAUT relationships and their moderating effects. Academic performance gains proved to be the strongest significant influence on both primary users’ intentions to use e-learning. Although the results may not be generalised to other institutions, they do contribute to UTAUT’s theoretical validity and empirical applicability to the management of e-learning-based initiatives. We argue that the high predictive accuracies found in Venkatesh et al.
(2003) could be obtained if significant moderators contextualised to the education sector were added to the SEM, although cognisance of maintaining a parsimonious structural equation model should also be taken before inflating the coefficient of determination (R2), which measures the ratio of a latent variable’s explained variance to its total variance.

Keywords: e-learning, Unified Theory of Acceptance and Use of Technology (UTAUT), Partial Least Squares Structural Equation Modeling (PLS-SEM), University of Zululand, South Africa

INTRODUCTION AND BACKGROUND Since the beginning of the 21st century, the abundant use of various technologies (e-Learning Africa, 2013:10) connected to digital networks (Mansell and Tremblay, 2013:iii) has changed the way information, especially multimedia information, is accessed, stored and disseminated (Agyei, 2007:5). Users today essentially want information delivered to them (Sturges, 2006). This has been particularly pertinent to institutions of higher education, which constantly have to evaluate instructional policies and technical frameworks to accommodate new pedagogies and educational technologies required by a rapidly growing generation of students with different learning styles and different needs (Siemens, 2004a). The growing use of Information and Communication Technologies (ICT) and Information Systems (IS) for teaching and learning gives rise to new pedagogies, for example connectivism, where Siemens (2004a) purports that language, together with technology and media, acts as a conduit of information, promoting greater participation, collaboration and interaction between networked learners, who socially construct an active learning experience within different learning networks (Siemens, 2004a in Evans, 2013).
For the purposes of Evans’ (2013) study, electronic (e-)learning was broadly defined as the use of ICTs and IS in teaching and learning, which at the University of Zululand occurs within a blended learning environment, where traditional face-to-face teaching and learning is combined with e-learning, experiential learning, research and community engagement. E-learning resources can specifically include office and classroom ICT and IS, portable presentation tools for lectures, research databases, the institutional repository and the Learning Management System (LMS) — a software application for the administration, documentation, tracking, reporting and delivery of e-learning activities and resources — all of which can be accessed through the wired or wireless network services available to both staff and students. A number of e-learning projects have been initiated on the main campus since 2000, ranging from basic departmental websites hosting “virtual classrooms” to the actual deployment of various LMSs: WebCT (now called Blackboard) in 2000; MyCMT, developed in-house by the Department of Accounting and Auditing in 2002; and Moodle (Modular Object-Orientated Dynamic Learning Environment), introduced in 2007 and piloted by the author (Neil Evans) in the Department of Information Studies. Moodle was officially adopted as the preferred LMS on campus in 2009, with one instance installed for each of the four faculties. Dillon and Morris (1996) defined user acceptance as “the demonstrable willingness within a user group to employ information technology for the tasks it is designed to support”. Dillon (2001) believes that by developing and testing models of the variables influencing user acceptance, researchers seek to provide direction to the process of design and implementation in a manner that will minimize the risk of disapproval by users of these resources.
In 2006 the Wageningen University Research Centre, in cooperation with the University of Zululand, launched and funded the Netherlands Universities Foundation for International Cooperation (NUFFIC) research project, which became known as the Wageningen University Zululand University (WUZULU) project, with the aim of “enhancing the quality and relevance of education and research in the social and natural sciences at the university” (Definite Schedule WUR-visit, 2006). One of the themes of the project was the role of e-learning. However, a proposal for a structured e-learning initiative at the university initially made little or no progress because of the heavy workloads of participating staff. In an attempt to address this problem, a special e-learning task team was established in 2008 to revitalise the project. After a revised proposal (Muller and Evans, 2008) was tabled and accepted by the WUZULU project, the task team drew up a SWOT (Strengths, Weaknesses, Opportunities and Threats) analysis, which was presented to the university management in the form of a road show. Both the Registrar and the Vice-Rector of academic affairs subsequently agreed that e-learning should be an integrated part of the university’s curricula. In the same year (2008), the e-learning task team was invited to participate in a developmental study towards effective practices in Technology Assisted Learning (TAL) by the University of Johannesburg in collaboration with Edge Hill University (United Kingdom). The study’s coordinators, Boere and Kruger (2008), invited role players from 23 South African universities to use 12 so-called “lenses” of self-evaluation to review and organise their information regarding TAL or e-learning within their institutions. Fifteen institutions, including the University of Zululand, accepted the invitation to participate.
The workshop provided some valuable insight into TAL benchmarks and it emerged that Zululand was significantly behind other institutions such as the University of Stellenbosch (Boere and Kruger, 2008:8) and the University of Johannesburg (Boere and Kruger, 2008:13) in the 12 lenses of review. E-learning challenges noted in the self-evaluation included:

  • Poor computer literacy among some academic staff, and a low ratio (1:26) of desktop computers (720) (E-Learning Working Group, 2013) to students (16 582) (University of Zululand registration website, 2013).
  • A general reference to the use of technologies in the University of Zululand’s Teaching and Learning Policy (2004:2), but no specific policy that refers to or promotes e-learning.
  • No specific quality management processes to emphasise and enhance e-learning.
  • Limited initiatives for the professional development of staff to integrate e-learning within the existing traditional curricula.
  • Few structures in place for technical and system support and for working with pre-determined digital standards.
  • Few contributions from leadership and change management, hence a reliance on a bottom-up approach in its implementation and support.

In 2009, the e-learning task team developed a strategy and implementation plan which recommended a two-phased implementation approach (Muller and Evans, 2009). The first phase included a requirements analysis for academic staff, students and other stakeholders to determine their e-learning needs and expectations. The second phase involved creating the necessary organisational changes to facilitate, support and roll out e-learning on campus. Later in 2009, the e-learning implementation strategy and plan was presented to all four faculty boards and Senate, and all bodies unanimously adopted it.
In the next two years no support was received from the university to implement this plan; however, in 2012 the document, together with a budget proposal, was submitted, via Academic Development (AD), to the Department of Higher Education and Training (DHET), requesting funding through the Teaching Development Grant (TDG). The theme, Creating a Sound Teaching Environment through e-learning, was allocated R5.6 million out of a total TDG of R15.2 million for the period 2012 to 2013 (University of Zululand teaching development proposal, 2012/2013). Moodle’s popularity rests on its social constructivist approach, whereby it deliberately sets out to enable academic staff and students to have control of their learning environments, rather than having ideas imposed in a top-down approach. The idea was to give students more choice in how they learn and academic staff more resources to create a good learning strategy. The original four Moodle version 1.8 instances were upgraded to version 2.2 in 2014. Enrolment figures were as follows: the Faculty of Science’s instance had 80 academic staff and 5550 students; the Faculty of Commerce, Administration and Law instance had 32 academic staff and 2732 students; the Faculty of Education LMS had 51 academic staff and 1160 students; and the Faculty of Arts had 31 academic staff and 2956 students (Faculty learning management systems, 2014). Not all LMS users were active for all login accounts given above. The limited initiatives for the professional development of academic staff to integrate e-learning within the existing traditional curricula were identified as a threat to the use of e-learning resources. Therefore numerous introductory workshops on e-learning and Moodle were conducted by volunteer staff through the AD offices from 2011–2014, with a total of 93 academic staff attending.
In 2014 a further 7 staff, who had volunteered to help support the individual faculty instances, attended in-depth Moodle administrator training (5 front-end, 2 back-end), with the hope that this small pocket of enthusiasts would constitute hubs for the further development of staff in the current and newly upgraded LMSs. The rollout of 30 electronic smart boards also started in 2014; however, many staff who were invited to attend a capacity-building demonstration of the e-boards requested additional training that has to date not materialized, so many of these e-boards are still being used as projectors only. According to the Council on Higher Education (2014:i), higher education has an important role to play in contributing to the reconstruction and development of all aspects of South African society. In any education system, the quality of the academic staff profoundly affects the quality of student learning. In addition to disciplinary expertise, academics need skills in pedagogy, curriculum development, assessment and moderation, as well as other skills and attributes that include digital literacy. There are often limited opportunities and incentives for academic staff to acquire such skills. On the contrary, university reward and promotion criteria often act as disincentives for academics to put time into developing teaching skills, since research output is often the main criterion for recognition, promotion and financial rewards. With the renewed impetus in 2014 to enhance all aspects of teaching and learning through the Quality Enhancement Project (QEP) and to improve student success by integrating e-learning resources into teaching and learning methods at the university, this paper analyses the relationships that influence the primary users’ (academic staff and students) acceptance of these resources, a prerequisite for the successful planning, implementation and support of e-learning.
THEORETICAL FRAMEWORK A learning theory like connectivism (Siemens, 2004a) provides insight into a learning ecology for the digital era, where forming connections within expert communities, using language, media and technology as conduits of information and ethics, beliefs and perspectives as filters of information, builds the skills required to work in the knowledge economies of today. The pedagogy also emphasizes that creating a blended learning network around the intent of learning will result in a greater change or transformation in the learner’s knowledge, experience and skills. At the University of Zululand this blend varies among faculties, departments and academic staff but could include face-to-face teaching, experiential learning (where students volunteer for work experience during their holidays), research (an essential component of any academic programme), community outreach, e-learning, self-learning and informal learning. Based on the assumption that people’s actions are guided by their emotions or how they feel (Hayes, 2013:24), it has been shown theoretically that a user’s initial reaction or attitude towards ICT and IS technologies affects their intentions to use them, which in turn influences their actual use of such technologies (Venkatesh et al., 2003:427; Davis, Bagozzi, and Warshaw, 1989 in Theories Used in IS Research Website, 2008). Hayes (2013:vii) explains that, analytically, questions of “how” are approached through a process called mediation analysis, while questions of “when” are normally answered through moderation analysis; combining the two analytically leads to what the author calls conditional process analysis.
According to Evans (2013:56), predicting the acceptance of e-learning requires the review of psychology-based theories, which include the original Theory of Reasoned Action (TRA) by Fishbein and Ajzen (1975, 1980) (Dillon and Morris, 1996; Ajzen, 2008); the Technology Acceptance Model (TAM) by Davis, Bagozzi, and Warshaw (Dillon and Morris, 1996; Dillon, 2001); and the Theory of Planned Behaviour (TPB) by Ajzen (1985, 1991) (Ajzen, 2008), among others that are essentially modifications of the three above-mentioned models. Venkatesh et al. (2003:425) empirically compared 8 models to formulate the Unified Theory of Acceptance and Use of Technology (UTAUT) model, which incorporated all validated relationships across all models as well as a selected subset of additional moderators. The UTAUT model has thus condensed the 32 variables found in the 8 existing models into 4 main effects and 4 moderating factors (Venkatesh et al., 2003:467). The UTAUT model proposes that 4 constructs, namely performance expectancy (PE), effort expectancy (EE), social influence (SI) and facilitating conditions (FC), are significant determinants of the two dependent variables, behavioural intentions (BI) and use behaviour (UB), and hence of user acceptance of ICT innovations (Venkatesh et al., 2003). These constructs are moderated, in varying degrees, by gender, age, experience, and voluntary or compulsory use. For the sake of parsimony (Bhattacherjee, 2012:1), the moderation of voluntariness of use on the social influence to use e-learning was not postulated in this study, as the use of e-learning at the university can be seen as both compulsory (structured lectures) and voluntary (using resources after hours) at the same time. By applying a user acceptance model, criteria that contribute to the primary users’ reaction, their intention to use and their actual use of e-learning can be measured, thus enabling the study to predict its adoption.
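The postulated structure described above can be sketched as a small data structure. This is an illustrative representation only, not code from the study; all names here are our own:

```python
# Hypothetical sketch of the UTAUT relationships postulated in the study:
# four determinants, two dependent variables, and the three moderators
# retained after voluntariness of use was dropped for parsimony.
UTAUT_PATHS = {
    "BI": ["PE", "EE", "SI"],   # determinants of behavioural intentions
    "UB": ["BI", "FC"],         # determinants of use behaviour
}
MODERATORS = ["gender", "age", "experience"]  # voluntariness excluded

def determinants_of(construct):
    """Return the postulated direct determinants of a dependent construct."""
    return UTAUT_PATHS.get(construct, [])
```

Representing the paths explicitly like this makes the later structural-model tests (one significance test per postulated path) easy to enumerate.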
While the UTAUT model’s ability to predict students’ and academic staff’s behavioural intentions to accept e-learning at the university was empirically validated by Evans (2013) using a strictly positivist epistemological belief and deductive reasoning, this paper also attempts to interpret the level of acceptance, behavioural intentions and usage behaviour of academic staff towards e-learning within this institution by inductive reasoning and the use of constructivism or interpretivism, which recognises the important role of the observer and society in constructing the patterns we study (Moses and Knutsen, 2012:9). PROBLEM STATEMENT While the incorporation of e-learning within higher education seems inevitable, contextualizing its pedagogic place in this blended tertiary education environment is required. Predicting the acceptance, adoption and use of these resources by students and academic staff will help identify, interpret and support relationships that are important to facilitate this development and ensure that, first, users take ownership of and use the resources; second, that the resources serve their intended purpose; and, lastly, that the resources give a good return on investment. RESEARCH QUESTIONS The research questions were as follows: 1. How does e-learning fit into a blended learning ecology at the University of Zululand? 2. Will students and academic staff accept/adopt e-learning at the university? 3. To what level of efficiency can the UTAUT model be used to predict the acceptance of e-learning by students and academic staff at the university? 4. How will the constructs and their moderating variables in the UTAUT model impact the acceptance of e-learning, with special reference to their specific influence on students’ and academic staff’s behavioural intention to use, and their use of, e-learning at the university? 5. How strong is the adopted user acceptance model’s theoretical validity and practical applicability?
METHODOLOGY AND DATA COLLECTION A non-experimental statistical method was used to analyse the quantitative data from both primary users — inferential statistics was used to predict the level of acceptance of e-learning by students and academic staff, whereas descriptive statistics was used to report on the biographical data. The two target populations of students and academic staff represented the primary users of e-learning resources at the university. Static probability sampling of the primary users made use of Probability Proportionate to Size (PPS) and Equal Probability Selection Methods (EPSEM) to randomly and proportionally select students, stratified according to their faculty and academic year, and academic staff, stratified according to their positions, from their respective sample frames — clusters of academic modules registered on the computer laboratories’ timetables for students, and the academic staff email address book. The survey of students and academic staff was conducted using three self-administered questionnaires. Two online questionnaires (one for students and one for academic staff) and an additional multi-mode paper version (for academic staff) were used as instruments to measure the biographic and indicator statements of the key relationships of both target populations. The survey questions were mapped to constructs of the UTAUT model to measure the four independent variables or determinants (PE, EE, SI and FC) and their moderating effects (gender, age and experience), together with the two dependent variables, BI and UB. Survey participants were asked to indicate their response to each indicator statement using a five-point Likert scale, with 1 representing strong disagreement and 5 strong agreement with the statement. According to Hair et al. (2010:94), factor analysis is an interdependence technique used to define the underlying structure among variables in the analysis, which are the building blocks of relationships.
The authors recommend that the sample size should not be fewer than 50 but preferably 100 or larger. Hair et al. (2010:10), however, caution against samples that are too large because, at any given alpha (α) level, increasing the sample size always produces greater power for the statistical test and, with a very large sample size, smaller and smaller effects become statistically significant. According to Chin (1997), Partial Least Squares (PLS) can be a powerful method of analysis for a number of reasons, including its minimal demands on sample size. Guidelines advocated by Chin (1997) include a sample size equal to the larger of two possibilities: first, 10 times the scale with the largest number of formative (i.e., causal) indicators, which equates to 10 times the 5 (students and academic staff) indicators of PE and gives a minimum sample of 50 student and 50 academic staff participants; and second, 10 times the largest number of precursor constructs used to determine a dependent variable, or 10 times 3 (30), the number of constructs (PE, EE and SI) used to determine BI. Although a minimal sample size can give results, Urbach and Ahlemann (2010:17) warn that the situation can be more complicated. They give the example where small sample sizes (e.g., n = 20) do not allow the discovery or validation of structural paths with small coefficients (Chin and Newsted, 1999 in Urbach and Ahlemann, 2010), and in such cases sample sizes are required that are similar to those necessary for covariance-based approaches, where samples should be greater than 150. Taking this into account, the study recognised the possible limitations of the minimum sample sizes (insensitive) and relatively large sample sizes (overly sensitive) (Hair et al., 2010:10) and thus aimed to obtain the minimum recommended sample size from 150 academic staff and 300 students to provide a statistically strong sample size with the correct balance in power.
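Chin's heuristic, as described above, reduces to simple arithmetic. A minimal sketch (the function name is ours, not Chin's):

```python
# Chin's (1997) "10 times" rule as described in the text: the minimum
# sample is the larger of (a) 10 x the largest number of formative
# indicators on any one construct and (b) 10 x the largest number of
# precursor constructs pointing at any dependent variable.
def chin_minimum_sample(max_formative_indicators, max_precursor_constructs):
    return max(10 * max_formative_indicators, 10 * max_precursor_constructs)

# In this study: PE has 5 indicators, and BI is determined by 3
# constructs (PE, EE, SI), so the binding constraint is 10 x 5 = 50.
minimum_n = chin_minimum_sample(5, 3)
```

The study's targets of 150 staff and 300 students comfortably exceed this floor, addressing Urbach and Ahlemann's caution about small-coefficient paths.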
Once all responses from the primary users had been captured, data were exported from the www.stellarsurvey.com website in a Comma Separated Value (.CSV) file format. Data were then imported into SmartPLS (Ringle et al., 2004) for the statistical analysis. PLS regression was the statistical data analysis method used to process the results and was made available through specialised statistical software, SmartPLS (Ringle et al., 2004). After data quality was evaluated and outliers and spoilt responses discarded, the PLS regression algorithm was run to calculate the UTAUT model’s parameter estimates for each target population. Statistical output was analysed according to recommendations by Urbach and Ahlemann (2010:18) for model validation, which represents the process of systematically evaluating whether the hypotheses expressed by the structural model are supported by the data or not. Urbach and Ahlemann (2010:18) state that although PLS does not provide an established global goodness-of-fit criterion, there are several criteria for assessing partial model structures, a systematic application of which was carried out in a two-step process, including (1) the assessment of the measurement models, covering internal consistency reliability, indicator reliability, convergent validity and discriminant validity, and (2) the assessment of the structural model, including its validity and predictive relevance (Urbach and Ahlemann, 2010:18). The paper refers readers to Evans’ (2013) study for all the statistics. Both SmartPLS (Ringle et al., 2004) and PROCESS, designed by Hayes (2013) and installed as an add-on in the regression tools of IBM SPSS Statistics, were used to analyse the hypothesized moderating effects of the UTAUT model. Evans (2013) administered a pilot survey to a sample of staff and postgraduate students to evaluate the survey instrument and to obtain feedback on its quality.
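As an illustration of the bootstrapping used throughout the analysis for significance testing, a generic sketch in plain Python (not SmartPLS's implementation): a path coefficient's empirical t value is its original estimate divided by the standard deviation of estimates obtained from resampled data.

```python
import random
import statistics

def slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

def bootstrap_t(xs, ys, n_boot=1000, seed=0):
    """Return (estimate, empirical t value) for the slope of y on x,
    with the standard error obtained by resampling cases with replacement."""
    rng = random.Random(seed)
    n = len(xs)
    beta = slope(xs, ys)
    boots = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]  # resample with replacement
        boots.append(slope([xs[i] for i in idx], [ys[i] for i in idx]))
    se = statistics.stdev(boots)                    # bootstrap standard error
    return beta, beta / se
```

A relationship is then judged significant at the 5% level when the empirical t value exceeds the critical value of 1.96, exactly as applied to the path coefficients later in the paper.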
After the pilot, 3 UB indicators were added to allow the self-measurement of use behaviour of e-learning by participants, one indicator statement with low loadings for EE was removed, and 3 SI indicator statements were removed and replaced to improve the content validity of the survey (Evans, 2013:104). STATISTICAL ANALYSIS AND FINDINGS The total available participant pool included 692 students, who were enrolled in 10 academic modules that were randomly selected (probability proportionate to size) from the four faculties to participate in the study (University of Zululand registration website, 2013). The student population was selected using cluster sampling according to faculty and stratification according to the academic level of the randomly selected modules. Lecturers whose modules were selected for the survey were contacted in advance, and permission was sought to administer the survey to their students during scheduled lecturing/revision time. All modules were administered in this manner, except for one hydrology module, whose class was not being held in the computer labs during the time of the survey. In this case, a link to the survey was posted on their Moodle LMS module, and messages were sent to all enrolled students asking for their participation. The student survey opened on 15 October 2013 and closed a week later on the 22nd; a total of 511 responses were captured on the hosting website, 90 of which were incomplete and so discarded. This left a sample size of 421, which equals a response rate of 59%. Responses were then filtered for those that had 4 or more non-random missing answers for the construct- and moderator-related questions on page 2 of the questionnaire. This resulted in 16 more cases being removed from the sample, leaving a final student sample size of 405. The academic staff survey opened on 23 October 2013 and closed three weeks later on 14 November.
From the 310 academic staff listed in the institution’s email address book and stratified according to their positions (contract lecturers, junior lecturers, lecturers, senior lecturers, associate professors and professors), 150 were selected using SRS (with replacement) and the PPS formula. Four tracked emails and a paper questionnaire — placed in the post boxes of staff who had not responded after the second email — elicited a total of 98 online submissions and 5 paper questionnaires, giving a total of 103 responses. One of the paper questionnaires was blank; the remaining 4 paper questionnaires’ data were captured manually onto the hosting website’s database. There were 27 incomplete responses that were excluded from the academic staff sample, leaving a total of 75 participants. After delivering the paper questionnaires to the postal services, it was discovered that 4 staff members had left the institution, one had retired and another had died, leaving a total possible participant pool of 144 and a response rate of 52%. Responses were then filtered for those that had 4 or more non-random missing answers for the construct- and moderator-related questions on page 2 of the questionnaire, and 2 more cases were removed, leaving a final academic staff sample size of 73. The majority of the student respondents were female (245; 61%), with the minority being male (160; 39.5%). Most of the student respondents were between the ages of 18 and 23 years (70.1%), followed by the age categories 24–29 (23%) and 30–35 (6%). It was unexpected to find one student as young as 16 responding. The average age of the student participants was 23 years, with a standard deviation of almost 4 years. The academic level of the student sample showed that the majority were in their first year (200; 49%), followed by second years (96; 23.7%), third years (79; 20%) and fourth years (29; 7%); one postgraduate featured in the survey.
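The academic staff response-rate figures above reduce to simple arithmetic, reproduced here as a plain check (not code used in the study):

```python
# Arithmetic behind the academic staff response rate quoted in the text.
responses = 98 + 5                # online submissions plus paper questionnaires
usable = responses - 1 - 27       # minus one blank form and 27 incomplete responses
pool = 150 - 4 - 1 - 1            # minus staff who left, retired or died
rate = round(usable / pool * 100) # usable participants as a share of the pool
```

This yields 75 usable participants out of a possible pool of 144, i.e. the 52% response rate reported above.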
The great majority (98%) of the students were South African, while 2% were from other African countries. The academic staff sample consisted of fewer females (29; 40%) than males (44; 60%), and the average age of the staff who participated was 45 years, with a standard deviation of 10 years. Academic staff from all four faculties were represented: Arts (28; 38%), Science and Agriculture (25; 35%), Commerce, Administration and Law (10; 14%) and Education (10; 14%). The stratification of academic staff according to their position showed that the majority were lecturers (45; 62%), followed by senior lecturers (10; 14%), junior lecturers (8; 11%), associate professors (6; 8%), professors (3; 4%) and one other (1; 1%). The majority (84%) of the staff who participated were South African nationals, while 16% were from other countries. Brown (2011:13) explains that Likert items and Likert scales (made up of multiple items) are reported in different ways, and that the question of whether individual Likert items are nominal, ordinal or interval becomes irrelevant when using Likert scale data, which can be taken to be interval. This paper provides only a summary of findings later in the discussion and refers readers to the full study (Evans, 2013) for detailed ordinal and interval scale statistics for the individual indicator statements used to measure the various latent variables in the SEM. Evans’ (2013) study took cognisance of the warning of Hair et al. (2014:8) that it is not appropriate to calculate arithmetic means or variances for ordinal data, because the researcher cannot assume that the differences in order are equally spaced. However, with a well-structured Likert scale with appropriate categories (1 = strongly disagree, 2 = disagree, 3 = neutral, 4 = agree and 5 = strongly agree), the inference is that the “distance” between categories 1 and 2 is the same as between 3 and 4, etc. (Hair et al., 2014:8–9).
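A minimal illustration of the interval-scale treatment described above: once the five categories are assumed equally spaced, a mean and standard deviation become meaningful summaries of an item. The responses below are hypothetical, not data from the study:

```python
import statistics

# Hypothetical responses to one Likert item on the 1-5 scale
# (1 = strongly disagree ... 5 = strongly agree).
responses = [5, 4, 4, 3, 5, 2, 4, 4]

# Under the interval assumption these summaries are legitimate; under a
# strictly ordinal reading only the median and mode would be.
mean = statistics.mean(responses)
sd = statistics.pstdev(responses)
```

Without the equal-spacing assumption warned about by Hair et al. (2014:8), the mean above would be uninterpretable, which is exactly why the study states the assumption explicitly.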
Measurement Models Students The PLS-SEM algorithm converged in 6 iterations in the first PLS algorithm run for students, and 5 in the last, showing that the algorithm could find a stable solution in one fewer iteration after the unreliable indicators were removed from the student measurement model. Both the student and staff reflective measurement models were validated by following the guidelines suggested by Straub et al. (2004) and Lewis et al. (2005) in Urbach and Ahlemann (2010:18), Hair et al. (2011) and Hair et al. (2014). To start the approximations for the relationships between the reflective latent variables and their indicators, the indicators’ outer loadings were investigated. Statistics led to the removal of unreliable indicator items. All these items were below the recommended loading value of 0.70 and, because they did not adequately explain their associated latent variables, were considered unreliable for the student data analysis. A weaker behavioural intention indicator below this value (0.64) was, however, retained for content validity, as observed by Hair et al. (2011:146). The significance of the indicator loadings was also tested using the resampling method of bootstrapping (Efron 1979; Efron and Tibshirani 1993 in Urbach and Ahlemann, 2010:18) and all proved significant. The Fornell-Larcker criterion showed evidence of discriminant validity between each reflective construct and their remaining reliable indicators. The average variance extracted (AVE) of each LV should be greater than the LV’s highest squared correlation with any other LV, which is the same as comparing the correlation with the square root of the AVE. Academic staff The PLS-SEM algorithm converged in 6 iterations in both the first PLS algorithm run and the last, showing that the algorithm could find a stable solution relatively easily for academic staff.
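The Fornell-Larcker test described above can be sketched as follows. The AVE and correlation values here are illustrative, not the study's statistics:

```python
import math

def fornell_larcker_ok(ave, correlations):
    """Fornell-Larcker criterion: for every construct, sqrt(AVE) must
    exceed its highest absolute correlation with any other construct
    (equivalently, AVE must exceed the highest squared correlation).

    ave: {construct: AVE}; correlations: {(a, b): r} for pairs a != b."""
    for construct, value in ave.items():
        max_r = max(abs(r) for pair, r in correlations.items() if construct in pair)
        if math.sqrt(value) <= max_r:
            return False   # discriminant validity fails for this construct
    return True

# Illustrative values: three reflective constructs with modest overlap.
ave = {"PE": 0.70, "EE": 0.65, "BI": 0.75}
corr = {("PE", "EE"): 0.52, ("PE", "BI"): 0.58, ("EE", "BI"): 0.49}
```

With these values every sqrt(AVE) exceeds the relevant correlations, so discriminant validity would be supported, mirroring the result reported for both samples.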
Statistics led to the removal of the unreliable indicator items in the academic staff's reflective outer measurement model. The significance of the indicator loadings was also tested using the resampling method of bootstrapping (Efron 1979; Efron and Tibshirani 1993 in Urbach and Ahlemann, 2010:18) and all remaining reliable indicators proved significant. The Fornell-Larcker criterion showed evidence of discriminant validity between each reflective construct and its remaining reliable indicators.

Structural Models

Students

The first step in assessing the students' PLS-SEM structural model was to run the collinearity assessments for the two sets of predictor constructs (BI and FC for Use, and PE, EE and SI for BI) (Hair et al., 2014:168), which were run in IBM SPSS Statistics and showed no unwanted collinearity. Note: variance inflation factor (VIF) values above 5 and tolerance values below 0.20 are indicative of unwanted collinearity (Hair et al., 2014:170). The second step in assessing the PLS-SEM structural model was to examine the path coefficients after running the PLS-SEM algorithm, as these represent the postulated relationships between the independent and dependent constructs (Hair et al., 2014:170). The significance of the path coefficients depends on their standard errors, which were obtained by bootstrapping in SmartPLS (Hair et al., 2014:171). For the student sample (n = 405), the empirical t value has to be larger than the critical t value (1.96) at a significance level of 5%, and the p value should therefore be less than 0.05 for the hypothesized relationships to be significant. The significance of the total effects, including the direct (PE, EE and SI on BI; BI and FC on Use) and indirect (PE, EE and SI on Use) effects, was obtained by bootstrapping in SmartPLS and is summarised in Table 1.
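The collinearity screen described above can be sketched in a few lines. This is an illustrative computation on synthetic data, not the study's SPSS output: each predictor is regressed on the remaining predictors, and VIF_j = 1 / (1 - R²_j); values above 5 (equivalently, tolerance below 0.20) would flag unwanted collinearity.

```python
import numpy as np

# Illustrative VIF computation on synthetic data (not the study's data):
# VIF_j = 1 / (1 - R2_j), where R2_j comes from regressing predictor j
# on the remaining predictors. Tolerance is simply 1 / VIF.

def vif(X):
    """Variance inflation factor for each column of predictor matrix X."""
    n, k = X.shape
    factors = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))
        factors.append(float(1.0 / (1.0 - r2)))
    return factors

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))          # three nearly independent predictors
print([round(v, 2) for v in vif(X)])   # values near 1 -> no collinearity
```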
Table 1: Significance testing of the total effects for the structural model of students

Path        Total effect   t value   Significance   p value   LLCI   ULCI
BI -> USE   0.22           4.50      ***            0.00      0.13   0.32
EE -> BI    0.31           4.35      ***            0.00      0.16   0.42
EE -> USE   0.07           3.01      ***            0.00      0.02   0.11
FC -> USE   0.27           5.86      ***            0.00      0.17   0.34
PE -> BI    0.34           5.77      ***            0.00      0.24   0.48
PE -> USE   0.07           3.33      ***            0.00      0.03   0.13
SI -> BI    0.11           2.07      **             0.05      0.00   0.19
SI -> USE   0.02           1.88      NS             0.07      0.00   0.04

Note: LLCI and ULCI are the lower and upper bounds of the 95% confidence interval. NS = not significant; **p < 0.05; ***p < 0.01.

The coefficient of determination (R2), which is a measure of the model's predictive accuracy (Hair et al., 2014:174), the adjusted R2, and the Stone-Geisser Q2 value, which indicates the model's predictive relevance (Hair et al., 2014:178), can be seen in Table 2.

Table 2: Endogenous LVs' R2, adjusted R2 and Q2 values for the students' structural model

LV    R square   Adjusted R square   Q square
BI    0.40       0.39                0.22
USE   0.16       0.16                0.11

(The remaining LVs, EE, FC, PE and SI, are exogenous and therefore have no R2 values.)

In addition to the endogenous latent variables' R2 values, the change in R2 when a selected exogenous latent variable is included in or excluded from the model was estimated by running the PLS-SEM algorithm twice, to calculate the f2 effect sizes (Hair et al., 2014:177); these can be seen in Evans' (2013) full study. Similar to the f2 effect-size approach to assessing R2 values, the relative predictive relevance of the exogenous latent variables in explaining the endogenous ones can be compared using the q2 effect size (Hair et al., 2014:183), which can also be seen in Evans' (2013) full study.

Academic staff

The collinearity assessments (Hair et al., 2014:168) for the two sets of predictor constructs (BI and FC for Use, and PE, EE and SI for BI) of academic staff, which were run in IBM SPSS Statistics, showed no unwanted collinearity.
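The f2 effect-size calculation referred to above (Hair et al., 2014) can be sketched as follows; the q2 effect size is computed analogously from Q2 values obtained with and without the construct. The R2 values used here are hypothetical, not the study's.

```python
# Sketch of the f-squared effect size (Hair et al., 2014): the model's
# R2 is computed with and without the exogenous construct of interest,
# and the change is scaled by the unexplained variance. The analogous
# q-squared uses Q2 values in place of R2. Inputs below are hypothetical.

def f_squared(r2_included, r2_excluded):
    """Effect size of an exogenous LV on an endogenous LV's R2."""
    return (r2_included - r2_excluded) / (1.0 - r2_included)

# e.g. hypothetically dropping PE from the students' BI model
print(round(f_squared(0.40, 0.31), 2))  # 0.15, a small-to-medium effect
```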
When analysing the path coefficients representing the hypothesized relationships between the independent and dependent constructs for the academic staff sample (n = 73), the empirical t value had to be larger than the critical t value (1.99) at a significance level of 5%, and the p value had to be less than 0.05 for the hypothesized relationships to be significant. The significance of the total effects, including the direct (PE, EE and SI on BI; BI and FC on Use) and indirect (PE, EE and SI on Use) effects, was obtained by bootstrapping, as shown in Table 3.

Table 3: Significance testing of the total effects for the structural model of academic staff

Path        Total effect   t value   Significance   p value   LLCI    ULCI
BI -> USE   0.42           3.46      ***            0.00       0.19   0.70
EE -> BI    0.14           1.37      NS             0.16      -0.07   0.40
EE -> USE   0.06           1.35      NS             0.16      -0.03   0.18
FC -> USE   0.22           2.09      **             0.04       0.01   0.46
PE -> BI    0.54           4.42      ***            0.00       0.32   0.83
PE -> USE   0.23           2.25      **             0.03       0.03   0.48
SI -> BI    0.06           0.77      NS             0.30      -0.13   0.30
SI -> USE   0.02           0.79      NS             0.29      -0.06   0.13

Note: LLCI and ULCI are the lower and upper bounds of the 95% confidence interval. NS = not significant; **p < 0.05; ***p < 0.01.

The coefficient of determination (R2), adjusted R2 and Stone-Geisser Q2 values can be seen in Table 4.

Table 4: Endogenous LVs' R2, adjusted R2 and Q2 values for the academic staff's structural model

LV    R square   Adjusted R square   Q square
BI    0.43       0.41                0.28
USE   0.33       0.31                0.22

Academic staff's f2 and q2 effect sizes can be seen in Evans' (2013) full study.

Moderation

Having described the relationships of the UTAUT constructs for the primary users of e-learning resources at the University of Zululand, attention shifted to understanding under what conditions the constructs operate. Hayes (2013:27) explains that a relationship between two variables X and Y is said to be moderated when its size and sign depend on a third variable or set of variables M.
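Hayes' definition can be made concrete with a minimal simulated illustration (the data and coefficients are synthetic, not the study's): Y is regressed on X, M and their product, and the coefficient on the X*M term captures how the size (and sign) of the X -> Y relationship depends on M.

```python
import numpy as np

# Synthetic illustration of moderation (Hayes, 2013): regress Y on X, M
# and X*M; a non-zero coefficient on X*M means the X -> Y relationship
# depends on the moderator M. All values here are simulated.

rng = np.random.default_rng(42)
n = 500
x = rng.normal(size=n)
m = rng.normal(size=n)
y = 0.5 * x + 0.2 * m + 0.4 * x * m + rng.normal(scale=0.1, size=n)

design = np.column_stack([np.ones(n), x, m, x * m])
beta, *_ = np.linalg.lstsq(design, y, rcond=None)
print(round(float(beta[3]), 1))  # 0.4: the recovered moderation effect
```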
Gender was coded as a 0/1 dummy variable, consistent with previous research (Venkatesh and Morris 2000 in Venkatesh et al., 2003:439), and age was coded as a continuous variable, also consistent with prior research (Morris and Venkatesh 2000 in Venkatesh et al., 2003:439). Experience was operationalised via a dummy variable that took ordinal values of 1 to 5 to capture increasing levels of user experience with the system. Using an ordinal dummy variable, rather than categorical variables, is consistent with recent research (e.g. Venkatesh and Davis 2000:197).

Students

While conducting the moderation analysis in SmartPLS (Ringle et al., 2004), a positive correlation between experience and use of e-learning resources increased the R2 of UB in the students' UTAUT model (0.29). Bootstrapping showed two of the moderating effects in the model to be significant at the 95% confidence level: experience moderating social influence towards behavioural intention, SI*Exper (t = 2.33), and experience moderating facilitating conditions towards use, FC*Exper (t = 4.05). However, on close inspection the convergent validity (AVE) and composite reliability values did not meet the required criteria for these effects to be included in the model.

Academic staff

While conducting the academic staff's moderation analysis in SmartPLS (Ringle et al., 2004), the PLS algorithm calculation also showed slightly different results when all the moderating effects were run together. The study observed a strong correlation between experience and use of e-learning resources, with UB's R2 (0.69) almost doubling. Bootstrapping, however, indicated only one moderating effect to be significant within the constructs of the academic staff's UTAUT model: that of experience on the facilitating conditions of academic staff, FC*Exper (t = 1.98). However, on close inspection the convergent validity (AVE) values did not meet the required criterion for this effect to be included in the model.
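The moderator coding described above can be sketched as follows. This is a hedged illustration on synthetic data with hypothetical variable names, not the study's dataset: gender as a 0/1 dummy, age as a continuous variable, experience as an ordinal 1-5 variable, and a product term such as FC*Exper forming a tested moderation effect.

```python
import numpy as np

# Hedged sketch (synthetic data, hypothetical names) of the moderator
# coding above: gender 0/1, age continuous, experience ordinal 1..5,
# with a product term (FC*Exper) as the moderation effect to be tested.

rng = np.random.default_rng(7)
n = 100
gender = rng.integers(0, 2, size=n)   # 0/1 dummy variable
age = rng.normal(45, 10, size=n)      # continuous, in years
exper = rng.integers(1, 6, size=n)    # ordinal experience levels 1..5

fc = rng.normal(3.5, 0.8, size=n)     # facilitating-conditions scores
# mean-centre before multiplying to reduce collinearity with main effects
fc_x_exper = (fc - fc.mean()) * (exper - exper.mean())

print(fc_x_exper.shape)  # (100,)
```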
DISCUSSION OF FINDINGS

The results of the study demonstrate the acceptance of e-learning resources by both students and academic staff in a blended teaching and learning environment; extending this acceptance requires the promotion of positive relationships that influence both their behavioural intentions to use, and their usage behaviour of, these resources. The empirical findings show that, for students, resources that improved their academic performance and were easy to use were the most significant variables. For academic staff, the most significant relationship was with resources that improved their performance in teaching, research and administration. Another aim of the study was to investigate UTAUT's efficiency in predicting the behavioural intentions, as well as the use behaviour, of the primary users of these resources. The empirical results demonstrated moderate predictive accuracies (adjusted R2) for students (0.39) and academic staff (0.41), and both structural equation models had some predictive relevance (Q2) in this respect. While UTAUT had a low predictive accuracy (adjusted R2) for students' usage behaviour (0.16), the theory was twice as accurate in predicting the academic staff's usage behaviour (0.31) of e-learning resources, which suggests almost moderate accuracy. A further aim was to validate the individual relationships between UTAUT's four exogenous and two endogenous LVs, while also investigating under what conditions these LVs are moderated by the primary users' gender, age and experience at using e-learning resources.

1. Performance expectancy (PE)

The relationship between the expected academic performance gains of primary users and their behavioural intentions to use e-learning resources proved to be significant and the strongest direct effect for both primary user groups.
Gender and age were found not to influence the students' or academic staff's performance expectancy effect on their behavioural intentions to use e-learning resources at the university.

2. Effort expectancy (EE)

The ease of use, or the amount of effort, that students associated with using e-learning resources proved to be significant and the second-strongest positive relationship towards their behavioural intentions to use these resources. For academic staff, however, the path coefficient between effort expectancy and behavioural intentions showed a weak, non-significant relationship. The latter finding possibly demonstrates the theory that this effect on individuals' behavioural intentions becomes insignificant with increased experience or sustained usage (Agarwal and Prasad 1997:570, 1998; Thompson, Higgins and Howell, 1991:140). The moderating effect of gender provided the first inconclusive result in the student sample, with a significant interaction found through Hayes' (2013) PROCESS analysis but an insignificant effect observed through bootstrapping in SmartPLS (Ringle et al., 2004). For academic staff, the moderating effect of gender did not support the UTAUT theory and was non-significant. Age and experience were both found to be insignificant influences on the relationship between students' and academic staff's effort expectancy and their behavioural intentions to use e-learning resources.

3. Social influence (SI)

Social influences on students proved to be the weakest positive relationship (but still significant) towards their behavioural intentions to use e-learning resources. For academic staff, this also proved to be the weakest effect (and non-significant) towards their intentions to use these resources. The social influence on students' behavioural intentions to use e-learning was found to be greater in females than males, as hypothesised by Venkatesh et al. (2003:452); however, the effect was non-significant.
Age and experience were found to be non-significant in moderating the social influences of both primary user groups. In academic staff, the social influence effects on females' behavioural intentions to use e-learning were slightly higher than those on males; however, the moderating effect was non-significant.

4. Facilitating conditions (FC)

The relationship between facilitating conditions and usage behaviour proved to be the strongest and most significant direct effect on students' use of e-learning resources. For academic staff, this relationship proved to be a less salient effect on their usage behaviour and was only just significant at the 95% confidence level. The moderating effects of age and experience had virtually no effect on students' usage behaviour and were both non-significant. The age of academic staff showed virtually no effect on the relationship between facilitating conditions and their usage behaviour and was non-significant. However, the experience of academic staff showed a small negative effect on the relationship between facilitating conditions and their usage behaviour, which was not significant in Hayes' (2013) PROCESS analysis but was significant (t = 1.97) when bootstrapping in SmartPLS (Ringle et al., 2004), leading to the study's second inconclusive result.

5. Behavioural intentions (BI)

The behavioural intentions of students had a moderately weak but significant positive relationship with their use of e-learning resources. The LVs performance expectancy and effort expectancy had the greatest effects on students' behavioural intentions to use e-learning resources, while facilitating conditions had a greater direct effect on students' usage behaviour than their behavioural intentions did. For academic staff, the significant relationship between their behavioural intentions and use behaviour was almost twice as strong as that found in students.
Performance expectancy had the greatest effect on academic staff's behavioural intentions to use e-learning resources, and the academic staff's behavioural intentions, in turn, influenced their use behaviour the most.

6. Usage behaviour (UB)

The students' use of e-learning resources was influenced more by facilitating conditions than by their behavioural intentions. For academic staff, their behavioural intentions to use e-learning resources were the most influential.

CONCLUSION

Benchmarked tertiary education has integrated e-learning with face-to-face teaching, research, experiential learning and community outreach to form a blended learning ecology, where the learner can interact with different conduits of information (language, media or technologies) to attain knowledge, meaning and understanding of a programme's exit-level outcomes and their potential forte in life. Before institutions of higher education implement e-learning technologies, it is recommended that they identify, understand and support the variables that facilitate their acceptance and use, so that these resources are employed for their intended purposes and give a good return on investment. At the University of Zululand there are various e-learning resources that require specific investigations to fully support their use in teaching and learning, research and administration. Many tools, including the Unified Theory of Acceptance and Use of Technology (UTAUT), can be used to gauge the needs, perceptions and expectations of users in a requirements analysis, which forms the first phase of the e-learning strategy and implementation plan at the university; UTAUT partially validated the model's psychological relationships that influenced primary users' attitudes towards, and their use of, e-learning at the university.
The theory showed moderate predictive accuracy and relevance towards the attitudes of those using these resources; although the results of the study cannot be generalised to all institutions of higher education, the conclusions may be applied to similar learning environments. UTAUT proved more successful in its predictive accuracy and relevance towards academic staff's usage behaviour; however, few faculty members integrated these resources into their formal teaching. This finding suggested both a need for teaching and learning resources like the LMS to be accessible during timetabled classes, and appropriate skills development and support for these resources. While UTAUT is not the most parsimonious SEM, it does attempt to achieve larger predictive accuracies by including a moderation analysis of its exogenous LVs (PE, EE, SI and FC). The moderating effects hypothesised by Venkatesh et al. (2003) were found not to have any significant effects at this university, and the paper postulates that the acceptance of technologies in different sectors of the economy (industrial, financial and educational) requires its own contextualised socio-economic moderators for these to be significant. The relationship between the expected academic performance gains of primary users and their behavioural intentions to use e-learning proved to be significant and the strongest direct effect. There was thus general agreement among both students and academic staff that e-learning improved their performance in the blended teaching and learning environment, with this relationship being twice as strong in academic staff as in students. The result possibly indicates that performance is more salient for adults who are employed than for others. The study's results question the hypothesis of Venkatesh et al.
(2003) that the relationship between users' performance expectancies and their behavioural intentions will be moderated by gender and age, such that the effect is stronger for males, and particularly younger males, especially within the tertiary education sector. Thus the acquisition of quality e-learning resources, combined with relevant skills development and support, should improve performance gains, and hence the behavioural intentions of both students and staff to use e-learning. The relationship between effort expectancy and individuals' behavioural intentions to use e-learning proved significant only for students. These results are consistent with previous findings that the effect of this construct diminishes with increased experience (Agarwal and Prasad 1997:570, 1998; Thompson, Higgins and Howell, 1991:140; Venkatesh et al., 2003:450). The results could also indicate that many first-year students, who made up almost half the student sample, found the two computer literacy modules relatively easy. The paper takes cognisance of the finding that although the majority of students and academic staff agreed that they find it easy to use e-learning resources, a minority said they do not, and recommends that these users be flagged using a similar instrument in the ongoing quality promotion mechanisms at the university. The effect of social influences on the primary users' behavioural intentions to employ e-learning was found to be more significant in situations where use of the resources was mandatory. For example, the relationship between the social influences of academic staff on students' behavioural intentions to use these resources was significant for students who had scheduled classes that were compulsory to attend. For most academic staff, who considered the use of these resources voluntary, the relationship was insignificant.
The introduction of policies making the use of these resources by academic staff mandatory might strengthen this effect, as relationships between management, academic and support staff become more significant. Although the study found that gender moderated the relationships between social influences and behavioural intentions in both primary user groups, such that social influences were greater in females than males, both effects were found to be insignificant. Facilitating conditions had the strongest, and most significant, direct effect on students' use of e-learning resources, whereas for academic staff their behavioural intentions had twice the effect that facilitating conditions had on their use behaviour. These results indicate the importance of creating conducive facilitating conditions for students, and positive behavioural intentions in academic staff, to facilitate the use of e-learning. This should include improving student access to these resources by, for example, subsidising first-year students with portable devices that can be utilised on the wireless campus area network. The moderation analysis of the experience of academic staff on the relationship between facilitating conditions and use behaviour showed a negative value whose significance was inconclusive, possibly indicating that the more experienced academic staff become at using e-learning, the less satisfied they are with the facilities and support at the university.

Recommendations for future research include:

- Extending the scope of UTAUT for predicting the acceptance and use of specific e-learning resources, for example, the Moodle LMS.
- Incorporating the instrument and data analysis methods of the study into the quality assurance and support of e-learning resources at a departmental and modular level.
- Extending and improving the static, one-shot cross-sectional measurement of primary users by using dynamic longitudinal and multiple-time-period measurements that do not rely on self-reporting methods, if possible.
- Undertaking a broad mediation and moderation analysis of UTAUT to identify significant contextual socio-economic moderators and mediators of the theory's four exogenous LVs' (PE, EE, SI and FC) relationships with the two endogenous LVs (BI and UB).

REFERENCES

Agarwal, R. and Prasad, J. 1997. The Role of Innovation Characteristics and Perceived Voluntariness in the Acceptance of Information Technologies. Decision Sciences (28:3), pp. 557–582.

Agyei, V. 2007. From Libraries to E-learning Centres: A South African Library Experience. World Library and Information Congress: 73rd IFLA General Conference and Council, 19–23 August 2007, Durban, South Africa.

Ajzen, I. 2008. Website. Available: http://people.umass.edu/aizen/f&a1975.html [3 July 2008].

Bhattacherjee, A. 2012. Social Science Research: Principles, Methods, and Practices. Second edition, published under the Creative Commons Attribution-NonCommercial-ShareAlike Unported License. Available: http://scholarcommons.usf.edu/cgi/viewcontent.cgi?article=1002&context=oa_textbooks [12 September 2013].

Brand, J. 2006. Consumer Adoption of the Online Desktop. Available: [22 August 2013].

Boere, I. and Kruger, M. 2008. Developmental Study towards Effective Practices in Technology-Assisted Learning. Third Combined Report from Fifteen Participating South African Universities by University of Johannesburg in Collaboration with Mark Schofield of Edge Hill University, UK.

Brown, J.D. 2011. Likert Items and Scales of Measurement? SHIKEN: JALT Testing & Evaluation SIG Newsletter, March 2011, (15:1), pp. 10–14. Available: http://jalt.org/test/PDF/Brown34.pdf [24 November 2012].

Council for Higher Education. 2014. Framework for Institutional Quality Enhancement in the Second Period of Quality Assurance.
Definite Schedule WUR-visit to the University of Zululand, October 16–25, 2006.

Dillon, A. 2001. User Acceptance of Information Technology. In W. Karwowski (ed.). Encyclopedia of Human Factors and Ergonomics. London: Taylor and Francis.

Dillon, A. and Morris, M. 1996. User Acceptance of New Information Technology: Theories and Models. In M. Williams (ed.). Annual Review of Information Science and Technology, (31), Medford, NJ: Information Today, pp. 3–32. Available: http://www.ischool.utexas.edu/~adillon/BookChapters/User acceptance.htm [8 August 2010].

Evans, N.D. 2013. Predicting User Acceptance of Electronic Learning at the University of Zululand. Available:

Faculty learning management systems. 2014. Available: http://elearn.uzulu.ac.za/moodle/sci/admin/user.php, http://elearn.uzulu.ac.za/moodle/commlaw/admin/user.php, http://elearn.uzulu.ac.za/moodle/edu/admin/user.php and http://elearn.uzulu.ac.za/moodle/arts/admin/user.php [15 September 2014].

Gefen, D., Straub, D.W. and Boudreau, M-C. 2000. Structural Equation Modeling and Regression: Guidelines for Research Practice. Communications of the Association for Information Systems, (4), Article 7, August 2000. Available: http://www.cis.gsu.edu/dstraub/Papers/Resume/Gefenetal2000.pdf [4 October 2013].

Hair, J.F., Hult, G.T.M., Ringle, C.M. and Sarstedt, M. 2014. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM). London: Sage Publications.

Hair, J.F., Ringle, C.M. and Sarstedt, M. 2011. PLS-SEM: Indeed a Silver Bullet. Journal of Marketing Theory and Practice, (19:2), pp. 139–151. Available: https://sem-n-r.wistia.com [13 November 2013].

Hayes, A.F. 2013. Introduction to Mediation, Moderation, and Conditional Process Analysis: A Regression-Based Approach. New York: The Guilford Press, ebook.

Mansell, R. and Tremblay, G. 2013. Renewing the Knowledge Societies Vision: Towards Knowledge Societies for Peace and Sustainable Development.
Available: http://en.unesco.org/post2015/sites/post2015/files/UNESCOKnowledge-Society-Report-Draft--11-February-2013.pdf [27 June 2013].

Moran, M. 2006. College Students' Acceptance of Tablet Personal Computers: A Modification of the Unified Theory of Acceptance and Use of Technology Model. Available: http://www.homepages.dsu.edu/moranm/Research/Dissertation/Mark_Moran_Dissertation__final_.pdf [25 June 2008].

Muller, W. and Evans, N.D. 2008. University of Zululand E-Learning Implementation Strategy and Plan. First draft.

Muller, W. and Evans, N.D. 2009. University of Zululand E-Learning Implementation Strategy and Plan. Final draft. Available: http://elearn.uzulu.ac.za/docs/e-Learning implementation strategy and plan of Unizulu.pdf [10 January 2010].

Ringle, C., Wende, S. and Will, A. 2004. SmartPLS Software Version 2.0.M3. Available: http://www.smartpls.de [9 August 2013].

Siemens, G. 2004a. Connectivism: A Learning Theory for the Digital Age. Available: http://www.elearnspace.org/Articles/connectivism.htm [4 September 2008].

Sturges, P. 2006. Finding New Ways of Serving Real Needs: The Future of Information Services. Open Lecture at the University of Zululand. Available: http://www.lis.uzulu.ac.za/lectures/paulsturges2006.pdf [4 December 2012].

Taiwo, A.A. and Downe, A.G. 2013. The Theory of User Acceptance and Use of Technology (UTAUT): A Meta-Analytic Review of Empirical Findings. Journal of Theoretical and Applied Information Technology, (49:1).

Thompson, R.L., Higgins, C.A. and Howell, J.M. 1991. Personal Computing: Toward a Conceptual Model of Utilization. MIS Quarterly, (15:1), pp. 124–143.

University of Zululand. 2012/2013. Teaching Development Proposal.

Urbach, N. and Ahlemann, F. 2010. Structural Equation Modeling in Information Systems Research Using Partial Least Squares. Journal of Information Technology Theory and Application (JITTA), (11:2).
Available: https://www.researchgate.net/publication/228467554_Structural_equation_modeling_in_information_systems_research_using_partial_least_squares [18 September 2013].

Venkatesh, V., Morris, M., Davis, G. and Davis, F.D. 2003. User Acceptance of Information Technology: Toward a Unified View. MIS Quarterly, (27:3), pp. 425–478. Available: http://www.cis.gsu.edu/~ghubona/info790/VenkEtAlMIQ03.pdf [3 July 2008].

Warshaw, P.R. 1980. A New Model for Predicting Behavioral Intentions: An Alternative to Fishbein. Journal of Marketing Research, (17:2), pp. 153–172.