There’s always a huge pile of work to be done and I find myself with less and less time to dedicate to the blog. Still, I believe it is an important tool and document for learning.
So, I will be shifting my approach over the next few months. First, I want to move from narrative to outline form. This makes it faster for me to write and faster for people to read. Second, I’ll be incorporating more videos, images, and slides to move towards a more immediate and potentially interesting experience. Third, I’ll be doing both high-level conceptual posts (as what’s been done up to now) along with more low-level technical and applied posts.
Some of the ideas from the previous post are still bouncing around in my head.
One of the challenges with analytics in an educational context is that people rarely agree on what is important. In fact, there is often vehement disagreement between instructors and institutions about what types of data are most important. So there is a real structural problem with pedagogical methods and consistency of data.
Picking up where I left off, here are some more observations:
Data quality suffers because:
– Data collection is expensive and institutions are short on money.
– Competing groups fight over statistics connected to policy and funding.
– Inertia of outdated organizational culture and a lack of a performance mindset.
Better data would help everyone since:
– Administrators need to know where to target human and financial capital at the institutional, department, and building level.
– Teachers need to know which pedagogical methods and materials are working within each class.
– Students and parents need to know what they can do to improve individual outcomes.
The optimist in me hopes that there will be better data quality in the future. However, the pessimist in me anticipates more of the status quo.
It is an oversimplification of the process, but in general learning analytics can be interpreted as another application of web analytics. Similar statistical techniques have been used in marketing to profile and target customer groups, and on the web to refine user interfaces, increase ad revenue, and improve sales. The issue with education is one of purpose: a shared understanding of, and agreement on, the purpose of learning. Marketers and businesses face a bottom line which is clearly quantifiable, and there is far less subjective opinion on these matters than there is in education, where social and political agendas diverge.
In general, analytics has had a relatively easy time in marketing and web applications because of some common shared assumptions about purpose and goals. Even though there is no comparable consensus around purpose and goals within education as an industry, the process of analytics is nonetheless relevant because it speaks to levels of user engagement and effectiveness. Both of these issues, attention and persuasion, are at stake in formal and informal learning processes.
The conversion rate of a web cart or click through rate of a website are measures that help us gain a quantitative understanding of user behavior within a specific context. The data itself provides some evidence of what is going on, and we can evaluate the situation on a more objective basis. In the same regard, the point of learning analytics is to gain some quantifiable and objective data about learners and the learning process, to provide some more objective basis for decision making in a learning process.
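The arithmetic behind these web metrics transfers directly to a learning context. A minimal sketch of that analogy in Python, with illustrative numbers that are not drawn from any real system:

```python
def rate(events: int, opportunities: int) -> float:
    """Generic rate metric: events observed divided by opportunities to act."""
    if opportunities == 0:
        return 0.0
    return events / opportunities

# Web analytics: 30 completed purchases out of 1200 cart sessions.
conversion_rate = rate(30, 1200)   # 0.025

# Learning analytics analogue: 85 of 120 enrolled students
# completed a given module.
completion_rate = rate(85, 120)
```

The point of the analogy is that both numbers summarize user behavior in a specific context and give a more objective basis for evaluation than anecdote alone.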
In terms of adoption, the issue for educators will be how soon we realize that the interaction data is already there waiting for us. We may not be collecting or analyzing the learning data in a systematic way, but it continues to flow in a steady stream by virtue of the fact that more and more of our interactions take place in a digital and connected world. At an abstract level, we need to raise awareness of how the digitization of communication networks and the constant creation of interaction records is already happening, and how we can gain insights into our current situation by collecting, organizing, and analyzing this information.
Slideshare on social learning analytics:
This slideshare outlines much of the work that’s already happening in terms of learning analytics. From the CALRG 2011 Conference.
Lots of interesting work is happening at Carnegie Mellon around the refinement of online learning materials with analytics.
I find the Digital Dashboard Project for Learning (DDL) particularly interesting. This project attempts to model a positive feedback loop between instructors, students, and content in an online learning system.
Publications page here.
After a much longer hiatus than I would have liked, I’m back chronicling my learning analytics adventures on this blog. This promises to be an interesting year, and I look forward to sharing more of my thoughts and reflections on these topics on a regular basis over the next several months.
Game and app developers are pushing the envelope to understand user engagement. This O’Reilly post highlights Flurry, a company which offers an audience insight product. The core assertions here are that analytics will help developers make better apps, which have greater levels of audience engagement, and in turn improve the outcomes of developer activity. Flurry is also doing segmentation, parametrization, and categorization to bring more detail and accuracy into the analysis.
I’ve thought for some time that mobile analytics will play a big role in learning analytics, especially as we experience greater mobile proliferation, and mobile shifts from being the secondary to the primary point of access to online communication and content. Flurry provides a nice model for what a really forward-thinking learning analytics company should be thinking about when designing its products.
“Data scientists, generally speaking, follow a simple model: Obtain – Scrub – Explore – Model – Interpret. Sometimes the biggest challenge is that the data you want is not available or even worse you cannot get to it – as you have indicated. I think the problem tends to be on what information is important to track, and who defines important.” – Schwan
Yes, the “traditional” applications of data science and analytics follow a protocol or process that assumes some agreement about the types of data to track and about what is important. As I noted in an earlier post, that agreement rarely exists in education: instructors and institutions often disagree vehemently about which types of data matter most, which creates a real structural problem for pedagogical methods and consistency of data.
Revisiting Schwan’s astute comment, we might prepend two steps to the modeling process: Debate – Decide – Obtain – Scrub – Explore – Model – Interpret. This is not so much a technical limitation as a sociocultural challenge. Once we pass that level of decision making, however, the process becomes much neater.
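The distinction can be made concrete in code: the Debate and Decide steps are human agreement that has to happen before any pipeline stage runs. This is a sketch with hypothetical stage functions and toy data, not a real implementation:

```python
def obtain(source: str) -> list:
    # In practice: pull interaction records from an LMS export, server logs, etc.
    # Hardcoded toy records stand in for that here.
    return [{"visits": 12, "grade": 88}, {"visits": 3, "grade": 71}]

def scrub(records: list) -> list:
    # Drop records missing the fields we agreed to track.
    return [r for r in records if "visits" in r and "grade" in r]

def explore(records: list) -> dict:
    # Simple summary statistics as a first exploratory pass.
    return {
        "n": len(records),
        "mean_grade": sum(r["grade"] for r in records) / len(records),
    }

def run_pipeline(decided_metrics: set, source: str) -> dict:
    # DEBATE and DECIDE happen before this function is ever written:
    # without agreement on decided_metrics, obtain() has no target.
    assert decided_metrics, "No agreed-upon metrics: still in the DEBATE stage."
    return explore(scrub(obtain(source)))

summary = run_pipeline({"visits", "grade"}, "lms_export.csv")
```

The file name and metric set are placeholders; the point is that the technical stages compose mechanically once the sociocultural stages have produced an agreed list of metrics.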
Link to the original post about how difficult it is to get useful data from current LMS vendors and why visualization can be so useful. In this case, the sheer number of visits to a class site and the grade earned appear to be only weakly correlated.
Will Pearson succeed in shifting from being a products (books) company to a platform (services) company? I’d argue that the stakes are quite high. Here’s a good and detailed post from Michael Feldstein’s e-Literate blog.
The context is set for Pearson to be more like Google or Facebook and less like print news or the music publishing industry. The question remains, and time will tell, whether the OpenClass project can gain wide adoption and legitimacy as an LMS. At a higher level of observation, one might also ask whether the LMS is in fact the primary platform for education and learning interactions. The interesting aspect of OpenClass is that we see potential for decisions about the LMS to be more centered around individuals and less determined by institutions. This mirrors the trend towards consumerization of IT across most industries. In another sense, we might speculate that the move towards platforms and services will force Pearson to rethink the openness and flexibility of its offerings in light of consumer demands, and perhaps even re-engineer the internal process by which its various educational technologies are conceptualized, selected, and developed.
This of course has huge implications for the educational technology market landscape, but on the whole I believe the expanding consumerization and increased openness of platforms to individuals to be positive trends for learners.
“The deal will be, you can use our tools if we can put your stuff onto our noncommercial public domain,” Khan said. “We don’t know how it’ll turn out, but we suspect there will be some amazing things put up.”
Big news this week from Khan Academy. The previously closed system is now being opened up to teachers and institutions who want to create and share their own content and courses. If successful, this will be a great expansion of both the Khan Academy offerings and the Open Educational Resource (OER) movement. KA is now offering all of its tools (dashboard, analytics, and reporting) for free to anyone who agrees to make their content open and free.