Teaching smarter: How mining ICT data can inform and
improve learning and teaching practice
Shane Dawson
Graduate School of Medicine
University of Wollongong
Erica McWilliam and Jen Pei-Ling Tan
ARC Centre of Excellence for Creative Industries and Innovation
Queensland University of Technology
The trend to greater adoption of online learning in higher education institutions means an
increased opportunity for instructors and administrators to monitor student activity and
interaction with the course content and peers. This paper demonstrates how the analysis of
data captured from various IT systems can be used to inform decision-making processes for
university management and administration. It does so by providing details of a large
research project designed to identify the range of applications for LMS derived data for
informing strategic decision makers and teaching staff. The visualisation of online student
engagement/effort is shown to afford instructors early opportunities for providing
additional student learning assistance and intervention – when and where it is required. The
capacity to establish early indicators of ‘at-risk’ students provides timely opportunities for
instructors to re-direct or add resources to facilitate progression towards optimal patterns of
learning behaviour. The project findings provide new insights into student learning that
complement the existing array of evaluative methodologies, including formal evaluations of
teaching. Thus the project provides a platform for further investigation into new suites of
diagnostic tools that can, in turn, provide new opportunities to inform continuous, sustained
improvement of pedagogical practice.
Keywords: Academic analytics, data mining, evaluation, ICT, social networking
Introduction
Like higher education institutions (HEIs) around the world, Australian universities are undergoing rapid
organisational and operational change. This is due to ongoing government reforms to higher education,
increased accountability and competition, as well as changes in the composition and expectations of the
student body. Alongside the many institutional transformations directed towards improving the student
learning experience (Ryan, 2005), we have seen an associated re-examination of the criteria and methods
for measuring what represents quality education (Coates, 2005). This in turn has focused attention on
finding ‘new’ value-add methodologies for monitoring, demonstrating and reporting quality education
outcomes.
The concept of ‘academic analytics’ is now gaining increasing momentum as a process for providing
HEIs with the data necessary to respond to the reportage and decision making challenges facing
contemporary universities. For example, Campbell, DeBlois and Oblinger (2007) have proposed that the
analysis of data captured from various IT systems could be used to inform decision-making processes for
university management and administration:
In responding to internal and external pressures for accountability in higher education,
especially in the areas of improved learning outcomes and student success, IT leaders may
soon become critical partners with academic and student affairs. IT can help answer this
call for accountability through academic analytics, which is emerging as a new tool for a
new era. (Campbell et al., 2007, p.40)
Academic analytics describes a method for harvesting and analysing institutional data for informing
decision making and reporting processes. In essence, academic analytics involves the extraction of large
volumes of data from institutional databases and the application of various statistical techniques in order
to identify patterns and correlations (Campbell et al., 2007). While the practice of data mining has long
Proceedings ascilite Melbourne 2008: Full paper: Dawson, McWilliam & Tan 222
been utilised in the commercial sector, to date there has been limited interest within the academy
(Goldstein & Katz, 2005). However, the quantity and diversity of data currently accessible to HEIs are
now making it possible to exploit more fully the potential of academic analytics in order to inform a range
of key activities within the academy, from strategic decision-making to instructor practice. The challenge
for HEIs is no longer simply to generate data and make it available, but rather to readily and accurately
interpret data and translate such findings to practice.
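By way of illustration, the following sketch (in Python, with invented column names and figures rather than data from any institution) shows the simplest form this pattern-finding step might take: per-student LMS interaction counts are examined for a correlation with final grades.

import pandas as pd
from scipy import stats

# Illustrative extract of institutional data: one row per student.
# Column names and values are hypothetical, not taken from the study.
records = pd.DataFrame({
    "student_id":       ["s01", "s02", "s03", "s04", "s05", "s06"],
    "lms_interactions": [310, 45, 120, 260, 15, 180],   # total clicks/posts logged by the LMS
    "final_grade":      [78, 52, 61, 74, 48, 66],       # final mark (%)
})

# The core of 'academic analytics' as described above: apply a statistical
# technique to harvested data to look for patterns and correlations.
r, p_value = stats.pearsonr(records["lms_interactions"], records["final_grade"])
print(f"Correlation between LMS activity and final grade: r = {r:.2f} (p = {p_value:.3f})")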
The ubiquitous adoption of information and communication technologies (ICT) across the HE sector in
recent times has provided institutions with additional expansive data sets that capture student learning
behaviours through user online interactions. For example, the vast majority of Australian universities
have adopted learning management systems (LMS) to support student learning. At Queensland University
of Technology in Australia for instance, approximately 40,000 students and 5,000 staff use the
commercial LMS BlackBoard (BB) as part of their daily course experience. Increasingly these systems
provide the essential infrastructure which mediates student access to learning resources, and facilitates
student-student and student-lecturer interaction. Additionally, these systems can provide sophisticated
levels of institutional-wide data on areas of student demographics, academic performance, learning
pathways, user engagement, online behaviour, and development and participation within social networks.
These data can also be used to promote practitioner reflection for professional development, as well as
to identify students who may require additional scaffolding and/or early learning support.
The importance of early intervention has been highlighted in recent research undertaken by John
Campbell (2007) at Purdue University. Campbell stresses the importance of further developing the field
of academic analytics in order to assist instructors in making informed decisions regarding their
pedagogical practice. Deriving from his work in academic analytics, Campbell developed a predictive
model to identify potential ‘at-risk’ students. This model is based on prior academic results (aptitude) and
the analysis of student behaviour within the LMS (effort). The identification of online student
engagement/effort affords teachers timely opportunities for implementing learning assistance and
strategic interventions. The capacity to establish early indicators of 'at-risk' students allows instructors to re-
direct or add resources to facilitate progression towards patterns of behaviour more characteristic of a low
risk category (e.g. participation in group discussions, regular access of discipline content and review of
assessment criteria). In the Australian context, Dawson (2006a; 2006b; 2008) has also applied academic
analytics to identify relationships between student online user-behaviour and perceived sense of
community. Essentially, these findings demonstrate the potential for ICT user behaviours to inform and
predict factors influencing student learning outcomes and overall satisfaction (e.g. cognitive engagement,
participation in social networks and sense of community).
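Campbell's model itself is not reproduced here, but the general idea of combining an aptitude measure (prior academic results) with an effort measure (LMS activity relative to the class) can be sketched as a toy rule-based classifier. All field names, thresholds and the classification rule below are illustrative assumptions rather than elements of the Purdue model.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    student_id: str
    prior_gpa: float        # aptitude: prior academic results (assumed 0.0-4.0 scale)
    lms_interactions: int   # effort: interactions logged by the LMS this term

def risk_category(student: StudentRecord, class_median_interactions: float) -> str:
    """Toy classification in the spirit of an aptitude + effort model.

    A student is flagged as higher risk when both prior results and online
    effort (relative to the class) are low. Thresholds are arbitrary
    illustrations, not values from Campbell's model.
    """
    low_aptitude = student.prior_gpa < 2.5
    low_effort = student.lms_interactions < 0.5 * class_median_interactions

    if low_aptitude and low_effort:
        return "high risk"
    if low_aptitude or low_effort:
        return "medium risk"
    return "low risk"

# Example: a student with solid prior results but very little LMS activity.
student = StudentRecord("s42", prior_gpa=3.1, lms_interactions=12)
print(risk_category(student, class_median_interactions=80))   # -> "medium risk"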
Despite the vast volumes of data captured and recorded within these systems, there has been minimal
research that investigates how the ‘rubber’ of such analytics can meet the ‘road’ of design, delivery and
evaluation of teaching and learning practices (Dawson, 2008; Goldstein & Katz, 2005). This paper addresses this
research deficit by presenting examples of data derived from an institution-wide LMS in order to
demonstrate the usefulness of applications of ICT-driven academic analytics. Although HEIs have widely
adopted LMS, the extraction and reporting of captured data has often been fragmented, with poor
visualisation tools militating against effective interpretation and subsequent user-action (Mazza &
Dimitrova, 2007). Put simply, data derived from these LMS can and should be reported and incorporated
into the current suite of tools available to teaching staff in more effective ways. This will allow a better
understanding of the learning behaviours of the student cohort, and thus an improved capacity to respond
to their immediate learning needs and to assess the impact of enacted learning and teaching activities.
What follows is the outline of a current international study investigating the application of academic
analytics not only to inform instructor practice but also to identify key tools for benchmarking an
institution’s level of integration of ICT into mainstream educational practice. The discussion presents
examples of the application of ICT data derived from an institutional LMS to:
• inform the decision making process for senior management with regards to future ICT related
initiatives; and,
• enable individual instructors to evaluate student engagement and the impact of implemented learning
and teaching practices.
Project and findings
The project outlined here was an initial exploratory study designed to identify the range of applications
for LMS derived data for informing strategic decision makers and teaching staff. Specifically, the study
extracted data from an institutional LMS (BlackBoard Vista – formerly known as WebCT Vista) and
analysed this data in the context of the various levels of institutional operations, namely, enterprise
(whole of institution), faculty management, and instructor. All data pertinent to the study were drawn
from a large Canadian University with approximately 45,000 full-time equivalent students and 5000
teaching and sessional staff. The ICT data were extracted from BlackBoard Vista using the WebCT
PowerSight application over a period of 19 weeks (second semester) from August to December 2007.
Approximately 800 teaching units were included in the enterprise-level analysis.
Enterprise
Faculty exemplars were identified from the initial enterprise-level assessment. Exemplars consisted of
faculties that were ‘most active’, as determined by the overall number of interactions recorded by the
LMS per full-time equivalent student (FTE) enrolled for the teaching period investigated (Table 1).
Interactions were recorded by the LMS within each distinct tool area, such as the discussion forum,
content page, and assessment page. Interactions were then identified as either non-student (teaching and support
staff) or student (enrolled students). Due to the diversity of interaction types recorded and the differences
in their associated teaching intent, the aggregation of interaction data at this level has some inherent
complications. For instance, discussion forum interactions are dissimilar to content interactions and as
such, foster alternate student learning outcomes. Thus, the enterprise analysis merely provides the
capacity to identify trends of behaviour and LMS adoption.
The faculty-level data were further refined by identifying the 'most active' teaching units in terms of
online interactions. These were extracted and analysed for ICT tool, instructor and student usage.
Analysis of this data allowed for specific teaching units to be examined for learning trends and potential
indicators of student engagement and achievement. One large teaching unit (n = 1026) is
presented and discussed as an exemplar of this process.
Table 1: Most active faculty identified through LMS enterprise data
% of total interactions / FTE*: Applied Science 11.45%; Arts 2.44%; Dentistry 0.12%; Education 16.88%; Land & Food Systems 46.99%; Medicine 1.62%; Science 20.5%; Total 100%
* Interactions refer to all online access by students and faculty, e.g. posting to a forum, accessing content or participating in chat sessions. Analysis per full time equivalent (FTE) student standardises against interaction increases due to faculty size.
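The standardisation described in the note to Table 1 amounts to dividing each faculty's interaction count by its FTE enrolment and expressing the result as a share of the institutional total. The sketch below illustrates that calculation; the faculty names echo Table 1, but the raw counts and FTE figures are invented placeholders.

# Hypothetical raw extracts per faculty: (total LMS interactions, FTE students).
# The counts below are placeholders; only the normalisation mirrors Table 1.
faculty_data = {
    "Applied Science":     (120_000, 4_000),
    "Arts":                (90_000, 9_000),
    "Land & Food Systems": (150_000, 1_200),
    "Science":             (300_000, 6_000),
}

# Standardise against faculty size: interactions per full-time equivalent student.
per_fte = {name: total / fte for name, (total, fte) in faculty_data.items()}

# Express each faculty's per-FTE activity as a percentage of the sum across faculties,
# which is the "% of total interactions / FTE" measure reported in Table 1.
grand_total = sum(per_fte.values())
for name, value in sorted(per_fte.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:22s} {100 * value / grand_total:6.2f}%")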
The LMS data were analysed using the software package SPSS for Windows © (Vers 15.0). Statistical
analyses incorporated descriptive statistics, ordinary least squares regression analyses, as well as
parametric and non-parametric tests of significant differences between and among groups.
Examination of the enterprise-level data indicated marked differences in faculty usage trends. For
example the highest users of the LMS were the Land and Food Systems Faculty and the Science Faculty,
comprising 47% and 20% of the total recorded online interactions respectively (excluding distance
education offerings). In contrast, the faculty of Dentistry contributed less than 1% of the total online
interactions per full time equivalent student.
The range of data extracted from the LMS also provided an indication of the types of ICT tools utilised
across the university. In this study, the dominant tool was the discussion forum, representing over 80% of
all interactions in the online environment. The second most commonly utilised tool among staff and
students was the content page (Figure 1). During the initial weeks of teaching, other LMS tools, such as
the announcement and assignment tools, were incorporated into the online teaching units. However, the
degree of usage of these tools was minimal in comparison to the discussion forum and content page
(Figure 1).
Examination of the LMS data aggregated to an enterprise level can provide a strong indicator of the stage
of e-learning adoption across the university. For example, the discussion forum in this instance was
identified as the dominant tool in terms of integration and quantity of recorded interactions, while other
tools such as online assessment quizzes, wikis and blogs were infrequently utilised or adopted by the end-users.
Figure 1: Identified dominant tools (discussion forum and content page) as a percentage of the total interactions recorded by the LMS over the 19 week teaching period
This is not to say that these tools have not been incorporated into individual learning and teaching
practices via alternate external platforms such as local versions of Moodle or Elgg. However, given the
high level of technical sophistication often required to house or access such external platforms, it is
unlikely that these technologies were or are pervasive.
Rogers' (2003) oft-cited diffusion of innovations theory suggests that an innovation spreads through a
social system by being taken up first by innovators and early adopters, then by the early majority and
finally the late majority. The widespread adoption of the discussion forum observed in the present study suggests
that this resource is no longer considered a ‘new’ innovation. The discussion forum can be seen to be a
mainstream tool commonly integrated into educational practice and as frequently utilised as the
PowerPoint presentation. However, technology, and more importantly students' experience and familiarity
with ICTs, are no longer limited to the discussion forum and static content pages. Yet while more
technically ambitious educators might be embedding Web 2.0 technologies (e.g. social networking sites,
collaborative editing, social bookmarks, blogs and wikis) into their teaching and learning activities, such
resources remain on the periphery of mainstream education. This can be remediated in
part by using the data extracted from the LMS to monitor the adoption of current and future Web 2.0
related technologies and identify instances of how these tools are integrated into educational practice for
the purposes of research and provision of teaching support. In this way, the data derived from BB and
other associated ICT systems can be used to guide and inform the diffusion of technology and integration
into learning and teaching activities.
Visualisation of data
Although not available in current versions of BB, the aggregation of data around tool design provides
instructors and managers with an effective method for assisting in the interpretation of the derived data.
For example: discussion forum, chat, and ‘who’s online’ are three LMS tools that focus on student
engagement. Table 2 outlines the LMS tools available in BB and their associated thematic category. The
analysis and presentation of data in alignment with tool design categories illustrates that content and
engagement are primary modes of LMS tool use (Figure 2). Although these categories are not exclusive,
because each tool can be adopted for multiple purposes, the categorisations do serve as a starting point to
assist staff in the alignment of observed student online behaviour. Further ongoing analysis of this data at
a faculty and individual teaching unit level provides a more refined and readily interpretable framework for
informing teaching practice.
Table 2: LMS tool design framework
Administration: Announcements, File manager, Tracking, Calendar
Assessment: My grades, Assessments, Assignments
Content: Student bookmarks, Notes, Content page, Printable view, File, Search, Weblinks, Syllabus
Engagement: Discussion forum, Chat, Who's online, Mail
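Applying the Table 2 framework to raw tracking data is essentially a mapping and counting exercise, as the sketch below suggests. The event records are hypothetical and only a subset of the Table 2 tools is mapped; the aggregation step is the point of the example.

from collections import Counter

# Tool-to-category mapping following (a subset of) Table 2.
TOOL_CATEGORY = {
    "announcements": "Administration", "file manager": "Administration",
    "tracking": "Administration", "calendar": "Administration",
    "my grades": "Assessment", "assessments": "Assessment", "assignments": "Assessment",
    "content page": "Content", "notes": "Content", "weblinks": "Content", "syllabus": "Content",
    "discussion forum": "Engagement", "chat": "Engagement",
    "who's online": "Engagement", "mail": "Engagement",
}

# Illustrative tracking events: (student_id, tool used). Values are hypothetical.
events = [
    ("s01", "discussion forum"), ("s01", "content page"), ("s02", "chat"),
    ("s02", "discussion forum"), ("s03", "assignments"), ("s03", "content page"),
]

# Aggregate interactions by design category rather than by individual tool.
by_category = Counter(TOOL_CATEGORY[tool] for _, tool in events)
total = sum(by_category.values())
for category, count in by_category.most_common():
    print(f"{category:15s} {count:3d} ({100 * count / total:.0f}% of interactions)")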
Figure 2: Enterprise usage trends for LMS tool design categories (administration, assessment, content and engagement) as a percentage of total interactions over the 19 week teaching period
Faculty
Analysis of faculty level data sought to identify trends in the adoption of various LMS tools and instructor
and student use of the resources. Again, the data allows for a more refined benchmark of activity that
provides senior management with an indication of which teaching units have high or low adoption and
usage trends. For example within the faculty of Science, data related to the level of tool interaction per
member (non student and student) revealed significant differences between the various Science Schools
(Tables 3, 4). School 1 was observed to be the most active in terms of Instructor usage (non-student).
Despite this high level of activity, this School was not the highest rated in terms of the level of student
interaction with the online teaching units (Table 3). Anderson, Rourke and Garrison (2001) have
described the importance of promoting a “teacher presence” for effective online learning and developing
student community. In essence, these authors noted that the instructor facilitation of discussion and an
ongoing active presence in the discussion forum and other online communication media is essential for
maintaining high levels of student engagement. Although teacher presence is crucial for developing
student online engagement, the results of this study suggest that the relationship is not uni-dimensional.
That is, both the quantity and the quality of 'teacher presence' influence the development and maintenance
of student online interactions. For instance, while School 1 presented the highest levels of instructor
interaction, it did not promote the highest levels of student
interaction. Discussions with course coordinators indicated that response time to student discussion
queries may be a significant factor in promoting discussion and further student involvement. School 2
(highest level of student interaction) established early guidelines relating to response times via email and
discussion forum. In contrast, School 1 responded rapidly to student queries and discussion points. While
the quantity of work undertaken by the instructor to assist students in their learning was admirable, the
depth of discussion among the student cohort was minimal. As Salmon (2000) has suggested, the early
establishment of goals and guidelines in discussion forum etiquette provides a foundation for higher-order
usage. In this instance, providing students with reflection time and an expectation of student discussion
and peer response may well be crucial to facilitating engagement.
Table 3: Kruskal-Wallis ANOVA ranks test for differences in instructor and
student mean levels of interaction between Science schools
Group Instructor mean rank Student mean rank
School 1 50.89 40.26
School 2 40.95 55.26
School 3 16.89 27.32
School 4 43.56 25.94
Instructor: Chi = 25.986, df = 3, asymp. sig. = 0.000; Student: Chi = 22.448, df = 3, asymp. sig. = 0.000
Table 4: Mann Whitney U test for differences in instructor and student mean levels of interaction between pairs of Science schools (mean ranks are reported in the order of the schools named in each comparison)
School 1 vs School 2 - Instructor: 23.42 vs 15.58 *; Student: 14.11 vs 24.89 **
School 1 vs School 3 - Instructor: 26.42 vs 12.58 **; Student: 24.21 vs 14.79 **
School 2 vs School 3 - Instructor: 26.84 vs 12.16 **; Student: 25.89 vs 13.11 **
School 1 vs School 4 - Instructor: 21.05 vs 16.83 **; Student: 21.95 vs 14.65 *
School 2 vs School 4 - Instructor: 26.22 vs 18.53 **; Student: 24.47 vs 11.82 **
School 3 vs School 4 - Instructor: 18.53 vs 19.50 (not significant); Student: 19.42 vs 17.47 (not significant)
** Significant at p < 0.001; * Significant at p < 0.05
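The comparisons reported in Tables 3 and 4 were produced in SPSS; an approximately equivalent analysis could be scripted as below, assuming per-teaching-unit interaction counts grouped by School. The sample values are invented solely so that the example runs.

from scipy import stats

# Illustrative per-teaching-unit interaction counts for four Schools (invented values).
school_1 = [410, 380, 520, 300, 450]
school_2 = [350, 620, 580, 500, 610]
school_3 = [120, 90, 160, 110, 140]
school_4 = [280, 240, 310, 260, 300]

# Kruskal-Wallis test for an overall difference among the Schools (as in Table 3).
h_stat, p_overall = stats.kruskal(school_1, school_2, school_3, school_4)
print(f"Kruskal-Wallis: H = {h_stat:.3f}, p = {p_overall:.4f}")

# Pairwise Mann-Whitney U tests between Schools (as in Table 4).
schools = {"School 1": school_1, "School 2": school_2, "School 3": school_3, "School 4": school_4}
names = list(schools)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        u_stat, p_pair = stats.mannwhitneyu(schools[a], schools[b], alternative="two-sided")
        print(f"{a} vs {b}: U = {u_stat:.1f}, p = {p_pair:.4f}")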
The analysis and presentation of data at a faculty level provide stakeholders with a method for
distinguishing differences in online behaviours across Schools and a benchmark of current activity for
future comparison. These measures are of particular interest when attempting to implement learning and
teaching plans or identifying courses that promote high levels of online engagement. The data derived
from the system in its current form cannot be used to assess the quality of these interactions. Correlations
of the harvested LMS data with other quality metrics such as standardised evaluations of teaching will be
needed to assist educators in developing more sensitive and accurate performance frameworks.
Teacher/instructor
The study examined the online behaviour trends of a large first year pre-requisite Science class (n =
1026). The analysis of data was undertaken with a view to identify potential differences between high
academically performing students and those requiring early learning support interventions. Campbell and
Oblinger (2007) discussed the application of ICT data to inform a predictive classification model of
identifying students as having high, medium or low risk of attrition or failure. This model, based on the
degree of student time online, was developed at Purdue University and serves to provide instructors with
lead indicators for early identification of students requiring additional learning support. However, the
model does not differentiate where students spend time online. For instance, time spent in content areas or
in discussion forum activities may well serve very different learning objectives and therefore outcomes.
Numerous authors have espoused the importance of developing social learning opportunities where
students actively debate, exchange and clarify ideas with their peers (Brook & Oliver, 2003; Brown &
Duguid, 2000; Lave & Wenger, 1991; Shapiro & Levine, 1999; Vygotsky, 1978). In an online context
this is commonly mediated through the discussion forum. The present study supports the notion that
teaching staff are frequently embedding discussion forum activities. However, further investigation is
required to determine if the inclusion of such discussion activities is an enabler of social learning oriented
pedagogies in general.
Figure 3 illustrates the peak periods for student online engagement in terms of the number of postings per
student in the study. Immediately apparent in the figure are four discernible peaks extending beyond the
class average of postings. These peaks directly correspond to the assessment periods: three mid-term
assessment items (pre-week 10) and a final exam after week 16.
While this finding appears to support the notion that students are likely to be most involved at times when
assessment performance is looming, the data can also be used to highlight peak periods for staff
intervention. Additionally, the qualitative analysis of postings can assist in determining the types of
questions posted, i.e. concepts, factual recall, or administration relating to assessment processes. The
identification of the types and frequency of question categories could assist instructors in the future
design and development of the online environment, such as the inclusion of 'Frequently Asked
Questions', and the design and timing of discussion forum activities and triggers.
Figure 3: Discussion forum postings per student in a large Science class over a 19 week period
To post or not to post
Prior research has foregrounded the strong relationship between student time online and academic
performance. Morris, Finnegan and Wu (2005) for example, noted that student time online in class
discussion forums was related to their overall academic performance, and that students with higher levels
of participation in online discussions have higher grades than their less interactive peers. Similarly, this
study observed that students who actively participated in discussion forum activities (i.e. made a posting
to the class discussion forum) scored a mean grade approximately 8% higher than non-participating students
(66.9% and 58.89% respectively; t = 6.870, df = 1024, mean difference = 8.02, p < 0.001). Cohen (1988) has
noted that for effect size (ES) calculations for t tests of means, 0.2 is considered small, 0.5 medium and
0.8 large. The differences observed between students who actively make postings to discussion forums
and those who do not produced an ES of 0.425. This statistical result provides a clear message to students:
those who achieve better grades also use the discussion forums, and there is therefore a greater likelihood
of achieving better grades if one actively and productively engages with forum activities. For
teaching staff, the early monitoring of discussion forum activity can assist in the timely identification of
students potentially requiring additional assistance.
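The comparison above (participating versus non-participating students) combines an independent-samples t test with Cohen's d calculated from the pooled standard deviation. The sketch below shows that calculation on small invented grade lists, so the resulting figures will not reproduce those reported in the study.

import math
import statistics
from scipy import stats

# Invented final grades (%) for students who did and did not post to the forum.
posters     = [72, 68, 75, 63, 70, 66, 74, 69]
non_posters = [61, 58, 65, 55, 60, 57, 62, 59]

# Independent-samples t-test for a difference in mean grade.
t_stat, p_value = stats.ttest_ind(posters, non_posters)

# Cohen's d: difference in means divided by the pooled standard deviation.
n1, n2 = len(posters), len(non_posters)
pooled_var = ((n1 - 1) * statistics.variance(posters) +
              (n2 - 1) * statistics.variance(non_posters)) / (n1 + n2 - 2)
cohens_d = (statistics.mean(posters) - statistics.mean(non_posters)) / math.sqrt(pooled_var)

print(f"t = {t_stat:.3f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")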
Session times and academic performance
As previously noted, the project aimed to identify potential differences in online behaviour between high
and low performing students. Thus, the student cohort was grouped into quartiles according to academic
performance. The recorded student behaviours for the quartile groupings were then tested for significant
differences. The results indicated a significant difference between low and high performing students in
terms of the number of online sessions attended during the course (Table 5). Similar findings were
observed for the total time online. Interestingly, the observed difference in the total time per session for
low and high performing students was not statistically significant. In short, low and high performing
students spend similar amounts of time in each session; however, low performing students attend fewer
online sessions.
Although further research is needed to substantiate this point, it can be argued that low-performing
students may not be optimising their learning time online in the same way their higher-performing
counterparts are able to. This could be due in part to their struggle with the level of discipline and
intrinsic motivation required for this type of learning. Ertmer and Newby (1996) have previously
described differences between expert and novice learners. They found that expert performers display a
range of learning strategies (reflection, discipline, control, playfulness) to optimise and evaluate their
individual progress. As noted by a number of researchers (e.g. Bocchi, Eastman, & Swift, 2004; Doherty,
2006; Kerr, Rynearson, & Kerr, 2006; Ross & Powell, 1990; Rovai, 2002, 2003), online learning
demands high levels of motivation, discipline, persistence and academic integration from students in
order to perform well.
In contrast to their low performing peers, high performers were also observed to spend a greater degree of
time online in both discussion forum and content pages (Table 5). Similar findings have been noted by
Morris et al. (2005), who identified significant differences in online behaviour between “successful” (i.e.
received a passing grade) and “non-successful” completers of a course. Successful students were
observed to spend a greater degree of time online in both discussion and content areas of the online site.
The tracking of student online behaviour provides clear benefits in assessing student progress and the
early identification of potential “at-risk” individuals. The identification of the point of disengagement
from learning activities or the point at which a student drops below a class average in quantity and
frequency of online sessions (content and discussion areas) may assist staff in maintaining student
motivation and overall persistence.
Table 5: Mann Whitney U test for differences between grade quartiles in the quantity of recorded online sessions, discussion postings and content views (significance at p < 0.05)
Quantity of online sessions:
25-50 vs 0-25: significant
50-75 vs 0-25: significant; 50-75 vs 25-50: not significant
75-100 vs 0-25: significant; 75-100 vs 25-50: significant; 75-100 vs 50-75: significant
Number of discussion postings:
25-50 vs 0-25: not significant
50-75 vs 0-25: not significant; 50-75 vs 25-50: not significant
75-100 vs 0-25: significant; 75-100 vs 25-50: significant; 75-100 vs 50-75: significant
Number of content views:
25-50 vs 0-25: significant
50-75 vs 0-25: significant; 50-75 vs 25-50: not significant
75-100 vs 0-25: significant; 75-100 vs 25-50: significant; 75-100 vs 50-75: significant
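The quartile analysis summarised in Table 5 can be approximated by splitting the cohort into grade quartiles and running pairwise Mann-Whitney U tests on each behaviour measure. The sketch below does this for the number of online sessions; the records, column names and quartile labels are illustrative.

from itertools import combinations
import pandas as pd
from scipy import stats

# Illustrative per-student records: final grade (%) and number of online sessions.
df = pd.DataFrame({
    "grade":    [45, 52, 58, 61, 64, 67, 70, 73, 76, 79, 82, 88],
    "sessions": [10, 14, 12, 18, 20, 17, 25, 28, 24, 35, 33, 40],
})

# Split the cohort into grade quartiles (0-25, 25-50, 50-75, 75-100).
df["quartile"] = pd.qcut(df["grade"], 4, labels=["0-25", "25-50", "50-75", "75-100"])

# Pairwise Mann-Whitney U tests on session counts between quartile groups (cf. Table 5).
for a, b in combinations(["0-25", "25-50", "50-75", "75-100"], 2):
    x = df.loc[df["quartile"] == a, "sessions"]
    y = df.loc[df["quartile"] == b, "sessions"]
    u_stat, p = stats.mannwhitneyu(x, y, alternative="two-sided")
    print(f"{a} vs {b}: U = {u_stat:.1f}, p = {p:.3f}")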
Social network visualisation
Despite the relative ease in extracting tracking data on student online interactions, the aggregation and
visualisation of this data by staff should not be presumed to happen automatically. In short, the poor
visualisation tools available to teaching staff constrain staff understanding of the linkage between student
online interactions and implemented pedagogical approach. To mitigate this problem, the results of this
project have informed the development of a prototype social networking visualisation tool [1] that extracts
discussion forum data in order to build a visual representation of the learning network. This
network visualisation can be generated at any stage of course progression, thereby providing a timeline of
engagement or insights into student online behaviour following designed learning activities. Additionally,
the identification of students actively involved in a network also provides instructors with timely
information about those students who are disconnected from the learning network.
The software provides a small link on the BB discussion forum page for extracting and tabulating
the discussion forum data. The extracted forum data is exported into NetDraw (Borgatti, 2002), a social
network visualisation tool. Figure 4 illustrates the visualisation results from a large class BB discussion
forum. Each node represents an individual student, and while student names have been removed for the
purpose of this paper, the diagram clearly illustrates, at the time of analysis, the degree of engagement,
central nodes in the discussion and students potentially isolated from the discussion
network. The identification of students involved in the discussion network allows a list of non-
participating students to be developed. It is a small step to automate this process and thus allow staff to
intervene at selected and/or critical points in time, either to mobilise a specific student to interact or to
enquire about a student's learning issues and status.
Figure 4: Sociogram illustrating the density of the social network and students isolated from the network
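The sociogram in Figure 4 was generated by a Greasemonkey extraction feeding NetDraw; the sketch below illustrates the same underlying idea with the networkx library, treating replies as directed edges and reporting enrolled students who never appear in the forum as disconnected. The post data, class list and output file name are hypothetical.

import networkx as nx

# Hypothetical forum data: (author, replied_to) pairs extracted from a discussion forum.
replies = [
    ("s01", "s02"), ("s02", "s01"), ("s03", "s01"),
    ("s04", "s02"), ("s05", "s03"), ("s01", "s04"),
]
enrolled = {"s01", "s02", "s03", "s04", "s05", "s06", "s07"}   # full class list

# Build a directed graph of who replied to whom.
graph = nx.DiGraph()
graph.add_edges_from(replies)

# Students enrolled in the unit but absent from the discussion network.
disconnected = sorted(enrolled - set(graph.nodes))
print("Students disconnected from the learning network:", disconnected)

# Simple centrality measure to highlight students at the core of the discussion.
central = sorted(nx.degree_centrality(graph).items(), key=lambda kv: kv[1], reverse=True)
print("Most central participants:", central[:3])

# Export an edge list that external visualisation tools can import.
nx.write_edgelist(graph, "forum_network.edgelist", data=False)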
Conclusion
The present study has provided strong indications that there are discrete online behaviour patterns that
are representative of students' overall engagement and final assessment grade. The visualisation of online
student engagement/effort affords instructors early opportunities for providing additional student
learning assistance and intervention – when and where it is required. The capacity to establish early
indicators of ‘at-risk’ students provides timely opportunities for instructors to re-direct or add resources to
facilitate progression towards patterns of behaviour that represent what Campbell and others (2007) have
previously termed a "low risk category" (e.g. participation in group discussions, regular access of
discipline content and review of assessment criteria).
It is clear that the future trend will be toward greater adoption of online learning in HEIs. Thus, there will
be an increased opportunity for instructors and administrators to monitor student activity and interaction
with the course content and peers. The analysis of these data sets is directly relevant to understanding student
engagement and evaluating implemented learning activities. Questions related to how and when students
are engaged, and what activities promote student engagement, can be answered through the monitoring
and analysis of student online behaviour. While the data examined in this paper are indicative at this
stage, and the interpretation of results discussed here may be influenced to some degree by a number of
exogenous variables, the findings nonetheless provide new insights into student learning that complement
the existing array of evaluative methodologies (e.g. evaluations of teaching). In this regard, the study
provides a platform for further investigation into new suites of diagnostic tools that can, in turn, provide
new opportunities and new data sets to inform instructor reflection for continuous and incremental
improvement of pedagogical practice.
Endnote
1. The social network software was developed by Aneesha Bakharia and Shane Dawson. The script was developed
using Greasemonkey, a Mozilla Firefox browser extension. The use of the Greasemonkey script avoided issues
surrounding access to a proprietary database. For further details refer to:
http://olt.ubc.ca/learning_tools/research_1/research/
Acknowledgements
The authors would like to acknowledge the contributions from Aneesha Bakharia, who provided expert
technical advice and developed the social network resource. This research has been supported by the
Australian Learning and Teaching Council.
References
Anderson, T., Rourke, L., & Garrison, D. R. (2001). Assessing teaching presence in a computer
conferencing context. Journal of Asynchronous Learning Networks, 5(2), 1-17.
Bocchi, J., Eastman, J. K., & Swift, C. O. (2004). Retaining the online learner: Profile of students in an
online MBA program and implications for teaching them. Journal of Education for Business, 79(4),
245-253.
Borgatti, S. P. (2002). NetDraw: Graph visualization software. Harvard, MA: Analytic Technologies.
Brook, C., & Oliver, R. (2003). Online learning communities: Investigating a design framework.
Australian Journal of Educational Technology, 19(2), 139-160.
http://www.ascilite.org.au/ajet/ajet19/brook.html
Brown, J. S., & Duguid, P. (2000). The social life of information. Cambridge: Harvard Business School.
Campbell, J., DeBlois, P. B., & Oblinger, D. (2007). Academic analytics: A new tool for a new era.
EDUCAUSE Review, 42(4), 42-57.
Campbell, J., & Oblinger, D. (2007). Academic analytics. Retrieved 25th October, 2007, from
http://connect.educause.edu/library/abstract/AcademicAnalytics/45275
Coates, H. (2005). The value of student engagement for higher education quality assurance. Quality in
Higher Education, 11(1), 25-36.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ:
Lawrence Erlbaum Associates.
Dawson, S. (2006a). Online forum discussion interactions as an indicator of student community.
Australasian Journal of Educational Technology, 22(4), 495-510.
http://www.ascilite.org.au/ajet/ajet22/dawson.html
Dawson, S. (2006b). Relationship between student communication interaction and sense of community in
higher education. Internet and Higher Education, 9(3), 153-162.
Dawson, S. (2008). A study of the relationship between student social networks and sense of
community. Educational Technology and Society, 11(3), 224–238.
Doherty, W. (2006). An analysis of multiple factors affecting retention in web-based community college
courses. Internet and Higher Education, 9(4), 245-255.
Ertmer, P. A., & Newby, T. J. (1996). The expert learner: Strategic, self-regulated, and reflective.
Instructional Science, 24, 1-24.
Goldstein, P. J., & Katz, R. N. (2005). Academic analytics: The uses of management information and
technology in higher education. Retrieved 25th October, 2007, from
http://www.educause.edu/ir/library/pdf/ers0508/rs/ers0508w.pdf
Kerr, M. S., Rynearson, K., & Kerr, M. C. (2006). Student characteristics for online learning success.
Internet and Higher Education, 9(2), 91-105.
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge:
Cambridge University Press.
Mazza, R., & Dimitrova, V. (2007). CourseVis: A graphical student monitoring tool for supporting
instructors in web-based distance courses. International Journal of Human-Computer Studies, 65(2),
125-139.
Morris, L. V., Finnegan, C., & Wu, S. (2005). Tracking student behavior, persistence, and achievement in
online courses. Internet and Higher Education, 8(3), 221-231.
Rogers, E. (2003). Diffusion of innovations (5th ed.). New York: Free Press.
Ross, L. R., & Powell, R. (1990). Relationships between gender and success in distance education
courses: A preliminary investigation. Research in Distance Education, 2(2), 10-11.
Rovai, A. P. (2002). Building sense of community at a distance. Retrieved 25 January, 2005, from
http://www.irrodl.org/content/v3.1/rovai.html
Rovai, A. P. (2003). In search of high persistence rates in distance education online programs. Internet
and Higher Education, 6(1), 1-16.
Ryan, J. F. (2005). Institutional expenditures and student engagement: A role for financial resources in
enhancing student learning and development. Research in Higher Education, 46(2), 235-249.
Salmon, G. (2000). E-moderating: The key to teaching and learning online. London: Kogan Page.
Shapiro, N., & Levine, J. (1999). Creating learning communities: A practical guide to winning support,
organising for change, and implementing programs. San Francisco: Jossey-Bass.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes (M. Cole, V.
John-Steiner, S. Scribner & E. Souberman, Trans.). Cambridge Mass: Harvard University Press.
Authors: Shane Dawson: [email protected], Erica McWilliam: [email protected], Jen Pei-
Ling Tan: [email protected]
Please cite as: Dawson, S., McWilliam, E. & Tan, J.P.L. (2008). Teaching smarter: How mining ICT
data can inform and improve learning and teaching practice. In Hello! Where are you in the landscape
of educational technology? Proceedings ascilite Melbourne 2008.
http://www.ascilite.org.au/conferences/melbourne08/procs/dawson.pdf
Copyright 2008 Shane Dawson, Erica McWilliam, Jen Pei-Ling Tan
The authors assign to ascilite and educational non-profit institutions a non-exclusive licence to use this
document for personal use and in courses of instruction provided that the article is used in full and this
copyright statement is reproduced. The authors also grant a non-exclusive licence to ascilite to publish this
document on the ascilite web site and in other formats for Proceedings ascilite Melbourne 2008. Any
other use is prohibited without the express permission of the authors.
