Learning Analytics: An Introduction and Critical Analysis

From ETEC 510

Because Learning Analytics is a relatively new field and a new thread on the Design Wiki, we begin with an overview of the field to provide context for the critical analysis and discussion of design affordances below.

Definitions of Learning Analytics

Learning Analytics, which is often associated with the term Academic Analytics, refers to the collection, representation, and analysis of data about groups of learners and individual learners at universities and colleges (Baepler & Murdoch, 2010). The practice of Learning Analytics often combines data collection and analysis with predictive modeling that suggests the likelihood of students succeeding in, and completing, a course (Campbell, DeBlois, & Oblinger, 2007).

The practice is also often associated with, and has been enabled by, the data collection facilitated by learning management systems (LMS), also known as content management systems (CMS) and virtual learning environments (VLEs) (Baepler & Murdoch, 2010), examples of which include Blackboard, Moodle, and Desire2Learn. However, Learning Analytics dashboards are LMS-agnostic and can therefore be integrated into many different proprietary systems.

Learning Analytics is also often associated with the phrase Action Analytics (Norris, Baer, Leonard, Pugliese, & Lefrere, 2008), which suggests that data collected, represented, and analyzed early enough can inform the approach individual instructors take to any particular student, group of students, or class. For example, if an instructor can determine that the great majority of students didn't understand a particular concept in the previous class, they may choose to dedicate a portion of the next class to a fuller exploration of that concept.

Learning Analytics is also often associated with the notion of evidence-based course redesign, which suggests that the data captured and analyzed during one course can inform decisions about course design and content for future semesters, and with a reflexive approach to one's teaching. In this vein, Learning Analytics has been defined as "an engine to make decisions or guide actions. That engine consists of five steps: capture, report, predict, act, and refine" (Campbell, DeBlois, & Oblinger, 2007).

While often used interchangeably, the terms Learning Analytics and Academic Analytics have been distinguished from one another by the Society for Learning Analytics Research (SoLAR) as follows:

Learning Analytics and Academic Analytics

Learning Analytics (LA) is "the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs. Learning analytics are largely concerned with improving learner success." (Siemens et al., 2011).

Academic Analytics is "the improvement of organizational processes, workflows, resource allocation, and institutional measurement through the use of learner, academic, and institutional data. Academic analytics, akin to business analytics, are concerned with improving organizational effectiveness." (Siemens et al., 2011).

For the purposes of this Wiki, given our audience, we will focus on Learning Analytics. The reference to "understanding and optimizing learning and the environment in which it occurs" speaks to our particular interest in understanding how learning analytics can be facilitated by, and inform, the online environment in which learning occurs.


History of Learning Analytics

Learning Analytics emerged not from educational circles but from two business needs, namely the need to understand internal organizational behavior and the need to understand external consumer behavior. Technological advances afforded businesses the opportunity to derive intelligence about internal and external behavior using data collection from a diverse range of systems, such as PeopleSoft (Buckingham Shum & Ferguson, 2012).

Mitchell and Costello first used the term Learning Analytics in 2000 in their analysis of international market opportunities for online learning products (Mitchell & Costello, 2000). The term was employed before 2005 by the company Blackboard (http://www.blackboard.com/Platforms/Analytics/Overview.aspx) to describe the data that its learning management system (LMS) can provide academic executives, decision-makers, and instructors (Baepler & Murdoch, 2010).

Underpinning the practice is the field of Data Mining, also called Knowledge Discovery in Databases (KDD), which focuses on collecting and analyzing large amounts of information. According to Romero and Ventura (2007), data mining draws on several aspects of computing, such as decision tree construction and logic programming.

Baker and Yacef (2009) defined Educational Data Mining (EDM) as "an emerging discipline, concerned with developing methods for exploring the unique types of data from educational settings, and using those methods to better understand students, and the settings in which they learn." Baker and Yacef trace the field of EDM back to 1995.

The emergence of learning management systems such as Blackboard, Moodle, and Desire2Learn, which afford the collection and visual representation of large amounts of student information, has enabled the active development of the field of Learning Analytics in the last five years (Buckingham Shum & Ferguson, 2012).

Reflective of the growth of the field is the forthcoming third annual Learning Analytics and Knowledge (LAK) conference, which will be held in Belgium in April 2013 (http://lakconference2013.wordpress.com).

LAK is overseen by the Society for Learning Analytics Research (SoLAR), an international group of researchers who critically examine and explore the impact of analytics on teaching and learning (www.solaresearch.org).

Audiences and Uses

"The goal of Learning Analytics is to enable teachers and schools to tailor educational opportunities to each student’s level of need and ability” (Johnson, 2011).

Learning Analytics may have the greatest application for learners, educators, and program coordinators. Benefits of Learning Analytics can include the personalization of learning content, enhanced student motivation through immediate feedback, early detection of at-risk students, and data-driven curriculum and content design (Siemens et al., 2011).

For instance, Purdue University's Signals project combines data obtained from its LMS with predictive modeling systems to generate a real-time red/amber/green traffic-light alert system that shows students and instructors how students are progressing through the course. In Purdue's application, students exposed to the Signals alerts received higher grades and exhibited higher retention rates than those in the control groups (Arnold, 2010). http://www.itap.purdue.edu/learning/tools/signals/

Signals provides real-time and ongoing feedback to instructors and students starting as early as the second week of class. According to Purdue’s website, instructors who follow recommended practices offered by Signals consistently see students' performance improve in their classes. http://www.itap.purdue.edu/learning/tools/signals/

Learners

Institutions that utilize Learning Analytics present tools to their students that allow them to access feedback on their development as they progress through the course. Learning Analytics dashboards can provide basic statistics on attendance at lectures, time-on-task in online activities, participation rates in forums, performance on online quizzes and tests, and marks on formal written assignments and exams. Students might find particularly useful the suggestions the system makes on how they might become more effective, focused learners. For instance, if a student exhibits poor comprehension of a particular topic area, as captured through a quiz, the dashboard may suggest that they perform a particular set of activities and readings to improve their comprehension. Students might also be presented with alternative learning paths based on the previous activities of peers or learners with similar profiles. If a student begins to fall behind, the system might recommend strategies for catching up with the rest of the class (Siemens et al., 2011).

Instructors

Instructors can use Learning Analytics tools to track and gain insight into the range of different factors known to impact sustained learner engagement in a course, and then use this information to adapt their teaching, adjust assignments, and recommend tutoring as necessary (Siemens et al., 2011, p. 6). Learning Analytics algorithms match online behavior to predictive models based on past cohort performance. The algorithms used are a combination of what a professor deems best suited to the characteristics of courses run at their university, the student demographic and approaches to learning, and the mix of technologies they use. The analytics also provide insight into student online engagement, such as the sentiments of learners in relation to a topic, the liveliness of debate around a topic, or the engagement levels of different learners (Siemens et al., 2011).


Program coordinators

Program Coordinators can use data provided by analytics to analyze the performance of a group of students, or of all students in their program. They can analyze data to determine what works and what does not in a particular classroom, and whether a particular learning intervention is effective at promoting student learning. Usually, the detailed learning data the system provides can be disaggregated by student subgroup (for example, to see how students without a course prerequisite perform or to compare males’ and females’ progress in the course), by instructor, or by year. Learning system data can support analyses of how well students learn with particular interventions and how implementation of the intervention could be improved. Using the data, program coordinators can suggest policies, new courses, and programs to the administration that could improve teaching, learning, and completion/retention/graduation rates (U.S. Department of Education, 2012).
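The kind of subgroup disaggregation described above can be sketched in a few lines. The records, field names, and scores below are invented for illustration, not drawn from any real system.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical course records: one row per student, with subgroup
# attributes and a final score.
records = [
    {"student": "s1", "has_prereq": True,  "year": 2012, "score": 78},
    {"student": "s2", "has_prereq": False, "year": 2012, "score": 61},
    {"student": "s3", "has_prereq": True,  "year": 2013, "score": 85},
    {"student": "s4", "has_prereq": False, "year": 2013, "score": 58},
    {"student": "s5", "has_prereq": True,  "year": 2013, "score": 72},
]

def mean_score_by(records, key):
    """Disaggregate records by one attribute and average the scores."""
    groups = defaultdict(list)
    for row in records:
        groups[row[key]].append(row["score"])
    return {value: mean(scores) for value, scores in groups.items()}

# Compare students with and without the prerequisite, or by year.
by_prereq = mean_score_by(records, "has_prereq")
by_year = mean_score_by(records, "year")
```

The same helper works for any of the subgroup cuts mentioned above (prerequisite status, gender, instructor, year) simply by changing the key.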

Researchers

Researchers use learner data from various systems to experiment with learning theories and to examine the effectiveness of different types of instructional practices and different course design elements. Researchers using online learning systems can do experiments in which many students are assigned at random to receive different teaching or learning approaches, and learning system developers can show alternative versions of the software to many users: version A or version B. This so-called “A/B testing” process can answer research questions about student learning such as: Do students learn more quickly if they receive a lot of practice on a given type of problem all at once (“massed practice”) or if practice on that type of problem is spaced out over time (“spaced practice”)? What about students’ retention of this skill? Which kind of practice schedule is superior for fostering retention? For what kind of students, and in what contexts (U.S. Department of Education, 2012)?
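The random-assignment step behind such A/B testing can be sketched as follows. The student IDs, scores, and helper names are illustrative assumptions, not part of any cited system.

```python
import random
from statistics import mean

def ab_assign(student_ids, seed=0):
    """Randomly assign each student to condition A (e.g., massed
    practice) or condition B (e.g., spaced practice)."""
    rng = random.Random(seed)  # seeded so the assignment is reproducible
    return {sid: rng.choice(["A", "B"]) for sid in student_ids}

def compare_conditions(assignment, scores):
    """Average the outcome scores within each condition."""
    a = [scores[s] for s, cond in assignment.items() if cond == "A"]
    b = [scores[s] for s, cond in assignment.items() if cond == "B"]
    return mean(a), mean(b)

# Hypothetical cohort of eight students.
assignment = ab_assign(["s%d" % i for i in range(1, 9)])
```

A real study would follow the comparison of means with a significance test; this sketch shows only the randomization and aggregation steps.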

In alignment with the distinction between Learning Analytics and Academic Analytics outlined above, and given the focus of our wiki on Learning Analytics, we will not elaborate on the uses of data by academic executives, fundraisers or marketers at an institution or in government except to note that these groups may also have a vested interest in educational data mining for student enrolment, retention, and funding purposes.

Implications for designing educational media

Based on what we have learned in this course, it is our view that treating the analysis of data about student online activity as an end goal informs not only the development of a dashboard that displays learner data to instructors with easily understood graphical representations (Dyckhoff, Zielke, Bültmann, Chatti, & Schroeder, 2012) but also the very types of digital objects and interactivities one develops for a course.

For instance, easily quantifiable interactivities, such as the results of multiple-choice quizzes and true-and-false quizzes, may lend themselves more easily to being captured in Learning Analytics dashboards or learning management systems than qualitative short answer or essay responses (which require careful consideration and evaluation by an instructor). Implicit in this suggestion is the unresolved question of whether the types of interactivities and data that a Learning Analytics dashboard can more easily capture are those types of interactivities or interactions that promote higher-order thinking (see the Critical Analysis section below). Is the discussion forum post that generates the most responses the most thoughtful, challenging post or the one that is easiest to respond to, perhaps because of its anecdotal nature? Are educators, programs, or institutions who value analytics potentially at risk of privileging what can be quantified over learning outcomes and how best to promote their attainment?

An educational media designer may consider the following design affordances when building a Learning Analytics dashboard:

1. Ease-of-use and relevancy (what Dyckhoff et al. refer to as "usability" and "usefulness," p. 62). A Learning Analytics dashboard should let both beginner users and more advanced users answer their teaching questions quickly. For instance, if an instructor wants to determine the correlation between students' time-on-task and their results on quizzes, the Learning Analytics toolkit should be able to quickly display that correlational data in easily interpreted visual forms (p. 62). According to Dyckhoff et al., Learning Analytics tools should be integrated into an LMS via easily analyzed and customized dashboards that allow instructors to "zoom in" on data of particular interest to them (p. 60). The current literature suggests that visual representations of data can be "an effective mean to deal with larger amounts of data in order to sustain the cognitive load of educators at an acceptable level" (Ali, Hatala, Gasevic, & Jovanovic, 2012, p. 486).

2. Interoperability. Dyckhoff et al. suggest that if one is developing what they refer to as a Learning Analytics toolkit outside of a learning management system, the toolkit should be interoperable, meaning that it should be easily adapted for various LMSs and be able to collect and analyze data from a variety of platforms (p. 62). Since most Learning Analytics currently take place through learning management systems, an educational media designer may want to consider developing digital interactivities, objects, and a Learning Analytics dashboard that are interoperable with any kind of LMS (Dyckhoff et al., 2012, p. 62).

3. Efficiency. To encourage an iterative approach to one's teaching and to support instructors in detecting at-risk students or areas/topics that a great number of students don't understand, a Learning Analytics toolkit should provide information quickly and continually to instructors and students.

4. Data privacy considerations. The current literature suggests that students' data be pseudonymized to protect users (Baepler & Murdoch, 2010; Dyckhoff et al., 2012). Designers may need to consider how to maintain the privacy of students while also giving instructors the ability to detect at-risk students or students who make outstanding contributions and could be considered as mentors or future graduate students.
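One common approach to the pseudonymization mentioned in point 4 is to replace student identifiers with keyed hashes, so that the same learner can be tracked across tables without the dashboard ever showing a real ID. The key, function name, and truncation length below are illustrative assumptions, not a prescribed scheme.

```python
import hashlib
import hmac

# Placeholder key; a real deployment would manage the key securely,
# outside the code.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymize(student_id: str) -> str:
    """Derive a stable pseudonym from a student ID using a keyed hash
    (HMAC-SHA256), so records can be linked without exposing the ID."""
    digest = hmac.new(SECRET_KEY, student_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # shortened for display
```

Because the mapping is keyed and one-way, an authorized party holding the key can re-derive a specific student's pseudonym when an at-risk intervention is warranted, while the dashboard itself displays only pseudonyms.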

Additionally, a Learning Analytics dashboard should be able to capture and store large amounts of data, present the data in ways that users who are not statisticians can understand, afford "zoom in" and "zoom out" capabilities that allow a user to look at one particular student or a group of students, and afford views of class activity from a variety of comparative perspectives, including by week (how does this week's frequency of activity and performance measure against last week's?), by assignment (how did students do, as a whole, on this quiz compared to last week's quiz?), by interactivity type (do students use the flashcards more than they use the quizzes?), or by topic (how many students answered the question about functionalism correctly?). Finally, designers and administrators may want to consider creating an "opt out" option for students, with a clear explanation of the purpose of the data capture and how the data might be used. While doing so might limit the representative sample of data, from an ethical standpoint, this might be an important consideration.
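Affordance 1 above mentions correlating students' time-on-task with quiz results; the underlying computation is a standard Pearson correlation, which a toolkit would then render visually. The per-student figures below are made up for illustration.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical per-student data: minutes on task vs. quiz score.
time_on_task = [30, 45, 60, 90, 120]
quiz_scores = [55, 60, 70, 80, 95]

r = pearson(time_on_task, quiz_scores)  # close to +1: strong positive link
```

A dashboard would display r (or a scatter plot of the two series) rather than ask the instructor to compute it; the point is that the "quick correlational view" Dyckhoff et al. call for is computationally cheap.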

Critical analysis and reflections

As SoLAR states, the availability of real-time data about a student's performance can serve to focus their studying on the areas they have yet to fully comprehend, increase their time-on-task by providing frequent feedback, and give them a sense of how their performance compares with that of their peers, both of which can arguably be motivating (p. 5).

For educators, SoLAR emphasizes that Learning Analytics can "reduce attrition through early detection of at-risk students," "personalize learning process and content," "extend and enhance learning achievement, motivation, and confidence," and enable the "more rapid achievement of learning goals" (p. 5). It is also possible that information about learners can support a reflexive approach to one's teaching (Dyckhoff et al., p. 60).

Having said this, a number of serious concerns have emerged regarding the uses of Learning Analytics (Boyd & Crawford, 2011). First, ethical and legal guidelines about the aggregation and public sharing of learner data have yet to be established. Second, Boyd and Crawford argue that students who become dependent on continuous feedback from external sources may under-develop their meta-cognitive skills. Finally, Learning Analytics may overemphasize performance indicators that do not capture or support meaningful learning (Boyd & Crawford, 2011).

In the estimation of these wiki authors, if Learning Analytics tools and practices are not coupled with widespread professional development that provides educators with access to, and the ability to judiciously select from and implement, digital objects that enable higher-order thinking and respect a multiplicity of learning preferences, we might be at significant risk of privileging easily-quantifiable interactivities (such as the answers to multiple-choice questions) that don't necessarily reflect meaningful learning.

Can interactivities and objects built for a Learning Analytics framework aptly capture students’ creativity, originality, nuanced thinking, problem-solving skills, and interpersonal communication skills, for instance? Does a Learning Analytics imperative encourage an objectivist philosophy of knowledge acquisition over a constructivist philosophy of knowledge co-construction?

Could there be a tension between the intentions of those seeking to increase student enrolment/retention and the intentions of educators who want to create, for instance, a constructivist learning environment that enables and supports deep, authentic learning?

In the following section, we consider the various concepts we have learned in ETEC 510 to date and whether and how a Learning Analytics dashboard or learning management system can support these concepts.

Learning Analytics and Constructivist Learning Environments

Can Learning Analytics be successfully implemented in a constructivist learning environment that has, at its centre, an ill-defined authentic problem or case (Jonassen, 1999, p. 215)? Is the quantification of online activity at the heart of Learning Analytics compatible with constructivist methodologies that privilege reflexivity, knowledge co-construction, and the distribution of expertise across a group? Perhaps if a learning environment includes a Google Docs type space where a group of learners can collaborate on a problem and have their instances of collaboration captured by a Learning Analytics toolkit, it might be possible to map out the frequency of contributions of each member of the group. Frequency of contribution may not equate with the quality of the contribution, but a map of the group members' interactions may give credence to peer review assessments. For example, if peer review suggests a particular group member provided an outstanding contribution and another wasn't very active in the assignment, a mapped visualization of the interactions on the Google Docs space may reinforce or put into question the peer review.
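Counting each member's contributions from a shared document's revision log is the simplest form of the mapping described above. The edit log and names below are invented for illustration.

```python
from collections import Counter

# Hypothetical revision log from a shared document: one entry per
# saved edit, tagged with its author.
edit_log = ["maya", "liam", "maya", "maya", "noor", "maya", "liam"]

contribution_counts = Counter(edit_log)
# maya made the most edits in this toy log, but, as noted above,
# frequency alone says nothing about the quality of each contribution.
```

A toolkit could render these counts alongside peer review scores, letting discrepancies between activity and perceived contribution stand out.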

Learning Analytics and Communities of Practice

Barab and Duffy suggest that the "cultural context of schools all too often emphasizes learning and grades, not participation and use, and the identity being developed is one of student in school, not contributing member of the community who uses and value the content being taught" (Barab & Duffy, 1998). They go on to write that, "instead of a culture emphasizing the contribution of the activity to the community, all too frequently, school culture accords knowledgeable skill a reified existence, commodifying it, and turning knowledge into something to be acquired" (p. 10). They support the dissolution of the boundaries between a school context and a real-world community, in which students are constituted as members of the community and develop their identity through meaningful contributions to it (p. 12).

These reflections lead to the question of whether the practice of Learning Analytics further reinforces the boundaries between institutions and the world beyond them, the very boundaries that proponents of situated learning and communities of practice seek to minimize. Can a Learning Analytics dashboard capture a student's meaningful contribution to a community outside of academia?

In the National Geographic Kids Network program described by Barab and Duffy (p. 19), students work on "real and engaging" problems with students and scientists outside of their school. Students work with scientists on relevant issues, such as acid rain and solar energy, and present their data findings to the community (p. 19). With this example in mind, perhaps a Learning Analytics dashboard can capture and present the scientific data that students collect and the research questions they ask and explore over time, leading to the creation of a rich online repository - and the common cultural heritage characteristic of a community of practice (Barab & Duffy, 1998, p. 12) - that future community members can add to and draw upon. Perhaps Learning Analytics dashboards that are not institution-specific can be employed to capture the research questions, contributions, and activities of students/community members over a long period, thereby creating an online cultural heritage.

Learning Analytics and Learn 2.0

Learn 2.0 is characterized by the creation and clipping of microcontent; participation and collaboration; content aggregation and sharing across domains; distributed intelligence and expertise; and user-driven performances (Alexander, 2006). In light of the openness and social nature of Learn 2.0, Alexander asks, "How can Higher Education respond, when it offers a complex, contradictory mix of openness and restriction, public engagement and cloistering?" (p. 42). Since the practice of Learning Analytics develops within the context of Higher Education, the same tensions and questions apply to it, in our estimation. The majority of the current Learning Analytics applications that we have found support the collection, representation, and analysis of data at the course, program, and, at its most expansive, institutional level. Having said this, new organizations like Udacity (https://www.udacity.com/) and Coursera (https://www.coursera.org/) do have lower barriers to entry than most courses (one can simply sign up and register in any course of one's choosing within a few minutes) and may offer data about a larger number of students from a wider geographical area (some courses have tens of thousands of students enrolled from all over the world, even though completion rates tend to be much lower than enrolment rates) than courses offered through traditional online or face-to-face models. However, it isn't clear what sort of Learning Analytics and dashboards these companies provide to instructors, or whether they can become effective representations of a Learn 2.0 environment.

Learning Analytics and Gaming

Gaming environments enable a "playful immersion" (De Castell & Jenson, 2003) that is facilitated by "interactivity rather than display and exposition," "negotiating an immersive environment rather than stand-alone task completion," "narrative rather than propositional organization," and "activity structures" rather than "disciplinary structures" (p. 655). From what we have seen in the use cases described below, the majority of current applications of Learning Analytics dashboards and learning management systems tend to capture the kinds of data that reflect task completion (for example, dashboards show instructors who has completed a quiz and who hasn't) and the display of knowledge acquired about a discipline (for example, dashboards show the number of multiple-choice questions students answered correctly and incorrectly). It is not evident that current applications of Learning Analytics can meaningfully capture the characteristics of successful gaming environments.

Successful gaming environments "engulf" their users in an immersive world, and learning is "stealth" (De Castell & Jenson, 2003, pp. 656-657). If students are consistently aware that their online interactions are being logged and recorded, can they willingly suspend their disbelief within the immersive environment?

Use Cases


Purdue's Signals program

Purdue University's Course Signals application, according to its website, "detects early warning signs and provides intervention to students who may not be performing to the best of their abilities before they reach a critical point. Course Signals is easy to use and works in three unique ways:

  • 1. It provides real-time feedback
  • 2. Interventions start early - as early as the second week of class
  • 3. It provides frequent and ongoing feedback."

To identify students at risk academically, Course Signals combines predictive modeling with data-mining from Blackboard. Each student is assigned a "risk group" determined by a predictive student success algorithm. One of three stoplight ratings, which correspond to the risk group, can be released on students' Blackboard homepage.
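Purdue has not published its algorithm in full, but the general shape of such a predictive rating - weighting engagement and performance measures, then thresholding the result into a traffic light - can be sketched as follows. The inputs, weights, and thresholds here are invented for illustration and are not Purdue's.

```python
def risk_signal(logins_per_week, avg_quiz_score,
                assignments_submitted, assignments_due):
    """Map simple engagement and performance measures to a hypothetical
    red/amber/green rating. Weights and cutoffs are illustrative."""
    submission_rate = (assignments_submitted / assignments_due
                       if assignments_due else 1.0)
    # Weighted composite in [0, 1]: engagement, performance, diligence.
    score = (0.3 * min(logins_per_week / 5, 1.0)
             + 0.4 * (avg_quiz_score / 100)
             + 0.3 * submission_rate)
    if score >= 0.7:
        return "green"
    if score >= 0.4:
        return "amber"
    return "red"
```

A production system would fit such weights to past cohort data rather than hand-pick them; the sketch shows only how heterogeneous signals collapse into one student-facing light.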

When instructors follow recommended Course Signals best practices, grade improvement has been consistently demonstrated at both the course and the departmental level. According to Purdue's website, overall, students in courses using Course Signals receive more Bs and Cs and fewer Ds and Fs than students in previous sections of the course that did not utilize Course Signals. As and Bs have increased by as much as 28% in some courses. In most cases, the greatest improvement is seen in students who were initially receiving Cs and Ds on early assignments, who pull up half a letter grade or more to a B or C.


Blackboard Analytics

In January 2012, Blackboard Inc. announced a new Learning Analytics solution, Blackboard Analytics™ for Blackboard Learn™, which allows institutions to transform usage patterns and data in their LMS, along with student information system data, into actionable information.

The goal of this system is to allow instructors and administrators to use the data generated by the system to help better engage students, measure and improve learning outcomes, and assess the adoption and use of online learning tools.

The system is designed to answer important questions that will help instructors and administrators inform course design and influence student success such as:

  • What student activities are correlated to desired outcomes like grades and course completion?
  • How can I easily find students who are at-risk?
  • Who are the most innovative instructors?
  • How are students performing on learning outcomes over time?
  • What strategies to improve the quality of course design and instruction result in better student performance? Instructor training courses? Course reviews?
  • Which tools are being used in courses the most? The least? Which are most effective to enhance student engagement & success?
  • How have logins, time on task, and other metrics changed over time?


SNAPP (Social Networks Adapting Pedagogical Practice)

The Social Networks Adapting Pedagogical Practice (SNAPP) tool "performs real-time social network analysis and visualization of discussion forum activity within popular commercial and open source Learning Management Systems (LMS). SNAPP essentially serves as a diagnostic instrument, allowing teaching staff to evaluate student behavioral patterns against learning activity design objectives and intervene as required in a timely manner."

The interaction among users is captured within a discussion forum, but from the default threaded display of messages it is difficult to determine the level and direction of activity between participants. SNAPP infers relationship ties from the post-reply data and renders a social network diagram below the forum thread. The social network visualization can be filtered based upon user activity. SNAPP is interoperable with a variety of Learning Management Systems (Blackboard, Moodle and Desire2Learn) and must be triggered while a forum thread is displayed in a Web browser.
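The inference step SNAPP performs - turning post-reply pairs into weighted ties, from which isolated students and facilitator-centric patterns can be read - can be sketched as follows. The forum data and participant names are hypothetical.

```python
from collections import defaultdict

# Post-reply pairs: (replier, original poster). Invented forum data.
replies = [
    ("alice", "tutor"), ("bob", "tutor"), ("carol", "tutor"),
    ("tutor", "alice"), ("tutor", "bob"), ("alice", "bob"),
]
participants = {"alice", "bob", "carol", "dave", "tutor"}

def build_tie_counts(replies):
    """Infer weighted directed ties from post-reply data."""
    ties = defaultdict(int)
    for replier, poster in replies:
        ties[(replier, poster)] += 1
    return dict(ties)

def isolated_students(participants, replies):
    """Participants who neither replied nor were replied to."""
    connected = {person for pair in replies for person in pair}
    return participants - connected

ties = build_tie_counts(replies)
isolated = isolated_students(participants, replies)
```

In this toy network, most ties run through the tutor (the facilitator-centric pattern SNAPP flags) and one student never appears in any reply pair, surfacing as isolated.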

"The social network diagrams can be used to identify:

  1. isolated students
  2. facilitator-centric network patterns where a tutor or academic is central to the network with little interaction occurring between student participants
  3. group malfunction
  4. users that bridge smaller clustered networks and serve as information brokers"


Gephi

Gephi is open-source software for network visualization and analysis. It helps data analysts intuitively reveal patterns and trends, highlight outliers, and tell stories with their data. It uses a 3D render engine to display large graphs in real-time and to speed up the exploration process. Gephi combines built-in functionality and a flexible architecture to explore, analyze, spatialize, filter, cluster, manipulate, and export all types of data.

Gephi is based on a visualize-and-manipulate paradigm that allows any user to discover networks and data properties. Moreover, it is designed to follow the chain of a case study, from data file to polished printable maps.

We found this resource particularly interesting, because designers of Learning Analytics dashboards may want to consider how to display to educators and students the information about students' engagement and performance through easily interpreted graphic visualizations.



Northern Arizona University GPS: Academic Early Alert and Retention System

Research suggests that feedback is among the most important factors in academic achievement (Baepler & Murdoch, 2010; Dyckhoff et al., 2012; Siemens et al., 2011). Northern Arizona University uses a guidance system, 'Grade Performance Status' (GPS), aimed at improving student academic success and retention. The system provides feedback to students in four areas (attendance, grades, academics, and positive feedback). Depending on the feedback given, students are given options and pointed to resources to help them improve (Johnson, 2011). The system works by allowing instructors to send direct feedback to students via GPS emails about their class performance. The student can then choose an appropriate next step once they receive a GPS message:

  • "Need further discussion? Reply directly to the instructor using the GPS email, call, or visit during office hours.
  • Need resources or information to create or execute your action plan? Use the GPS website to connect with the resources to support your needs.
  • Need advice? Contact your advisor if you have an issue with the class, you are thinking about dropping or withdrawing from a class, or personal issues are interfering with performance
  • Advisors and student support personnel can access copies of your GPS messages recorded in LOUIE to improve their ability to help you and offer personalized recommendations"
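The general shape of such an early-alert system can be sketched in a few lines. The thresholds, field names, and categories below are invented for illustration; NAU's actual GPS criteria are set by instructors and are not reproduced here.

```python
# Minimal sketch of an early-alert rule in the spirit of NAU's GPS.
# All thresholds and record fields are hypothetical.
def gps_message(student):
    """Return the feedback categories triggered for one student record."""
    alerts = []
    if student["absences"] > 3:
        alerts.append("attendance")
    if student["grade"] < 70:
        alerts.append("grade")
    if student["missing_assignments"] > 2:
        alerts.append("academics")
    # Students with no flags still receive a message, as GPS also
    # delivers positive feedback.
    if not alerts:
        alerts.append("positive feedback")
    return alerts

print(gps_message({"absences": 5, "grade": 62, "missing_assignments": 1}))
print(gps_message({"absences": 0, "grade": 91, "missing_assignments": 0}))
```

The design point is that the system does not merely record the data; it turns each record into an actionable message, which is what distinguishes "action analytics" from passive reporting.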


Inconclusive conclusion

As Learning Analytics is still a relatively new field, it is difficult to predict how the practice will evolve and how data about students' online activities will be captured, represented, analyzed, and used. Will the developers of Learning Analytics dashboards be mindful of the needs of educators who want to create constructivist learning environments, for instance? Can data capture and reflect meaningful learning? Who will determine which digital objects are created and which types of interaction are weighted more heavily in a Learning Analytics toolkit? Ideally, a Learning Analytics toolkit will pay equal attention to instructional design, the needs of educators and students, and the needs of administrators. In short, perhaps only time will reveal whether Learning Analytics tools and practices are effective means of helping students learn meaningfully and teachers teach effectively.


  1. Ali, L., Hatala, M., Gasevic, D., & Jovanovic, J. (2012). A qualitative evaluation of evolution of a learning analytics tool. Computers & Education, 58(1), 470–489.
  2. Arnold, K. E. (2010). Signals: Applying academic analytics. Educause Quarterly, 33(1), 10. Retrieved 27 February, 2013 from the Educause Review Online website: http://www.educause.edu/ero/article/signals-applying-academic-analytics
  3. Baepler, P., & Murdoch, C. J. (2010). Academic analytics and data mining in higher education. International Journal for the Scholarship of Teaching and Learning, 4(2), 1-10.
  4. Baker, R. S. J. D., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3-17.
  5. Berk, J. (2004). The state of learning analytics. Training & Development, 58 (6), 34-39.
  6. Blackboard Inc. http://www.blackboard.com/Platforms/Analytics/Products/Blackboard-Analytics-for-Learn.aspx
  7. Buckingham Shum, S., & Ferguson, R. (2012). Social learning analytics. Educational Technology & Society, 15 (3), 3–26.
  8. Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era. Educause Review, 42 (4), 40–42, 44, 46, 48, 50, 52.
  9. Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and implementation of a learning analytics toolkit for teachers. Educational Technology & Society, 15 (3), 58–76.
  10. EDUCAUSE. (2010). 7 Things you should know about analytics. Retrieved 1 March, 2013 from http://www.educause.edu/Resources/7ThingsYouShouldKnowAboutAnaly/202736
  11. Gephi: http://gephi.org
  12. Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 horizon report. Austin, Texas: The New Media Consortium. Retrieved 1 March, 2013, from http://www.nmc.org/pdf/2011-Horizon-Report.pdf
  13. Jones, S. J. (2012). Technology Review: The possibilities of learning analytics to improve learner-centered decision-making. Community College Enterprise, 18 (1).
  14. Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 31-40. Retrieved 2 March, 2013, from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education
  15. Norris, D., Baer, L., Leonard, J., Pugliese, L., & Lefrere, P. (2008). Action analytics. Educause Review, 43(1), 42-67.
  16. Northern Arizona University GPS: Academic Early Alert and Retention System: http://www4.nau.edu/ua/GPS/student
  17. PR Newswire. (2012, January 10). Blackboard opens field trial for learning analytics solution. Retrieved 28 February, 2013 from http://www.prnewswire.com/news-releases/blackboard-opens-field-trial-for-learning-analytics-solution-137020838.html
  18. Purdue University: http://www.itap.purdue.edu/learning/tools/signals/
  19. Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., Ferguson, R., . . . Baker, R. S. J. d. (2011). Open learning analytics: An integrated & modularized platform. Proposal to design, implement and evaluate an open platform to integrate heterogeneous learning analytics techniques. Retrieved 1 March, 2013, from http://solaresearch.org/OpenLearningAnalytics.pdf
  20. SNAPP Social Networks Adapting Pedagogical Practice: http://www.snappvis.org
  21. The Design and Implementation of a Learning Analytics Toolkit for Teachers: http://3.bp.blogspot.com/-3QgnEDT3cnA/UE2WOoRMKrI/AAAAAAAAAjU/VzGOI4xAs8M/s1600/
  22. The International Learning Analytics & Knowledge Conference: http://lakconference2013.wordpress.com
  23. The Society for Learning Analytics Research (SoLAR): http://www.solaresearch.org
  24. U.S. Department of Education, Office of Educational Technology. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, D.C.