
I love the Tin Can community. There is a wealth of information and support everywhere, from application and tool developers to L&D professionals of all stripes, and from ADL and Rustici, who brought this new learning standard to life. We are all traveling down our own learning paths with the Tin Can API, each of us excited about how it will help us do amazing things we could never do before. As awesome and simple as that sounds, learning professionals with practical reasons for implementing and unleashing the power of Tin Can will have to make some very important decisions.

“it’s not about the LRS or the LMS…this is not a technology competition”

Once you decide to take the plunge as a Tin Can adopter you will need an LRS (Learning Record Store), and choosing one is an important decision. When the Tin Can API was developed, the LRS was defined as the central repository for storing statements (chunks of learning data) from different Tin Can enabled tools and applications. The spec outlined the LRS as a system that could either be bolted on to a Learning Management System (LMS) or stand alone, and required that all LRSs be able to share learning data with each other nicely in a standard format.
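To make those "chunks of learning data" concrete, here is a minimal sketch of a Tin Can statement built in Python. The learner, email, and activity URI are hypothetical placeholders; the verb URI follows ADL's published verb vocabulary.

```python
import json

# A minimal Tin Can (Experience API) statement: actor, verb, object.
# The learner name, mailbox, and activity ID below are illustrative only.
statement = {
    "actor": {
        "name": "Sally Learner",                 # hypothetical learner
        "mbox": "mailto:sally@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/activities/safety-course",  # hypothetical activity
        "definition": {"name": {"en-US": "Safety Course"}},
    },
}

# Every conformant LRS accepts statements in this same JSON shape,
# which is what lets different systems exchange learning data.
print(json.dumps(statement, indent=2))
```

Because the shape is standardized, a statement produced by a mobile app, a simulation, or an LMS looks the same to any LRS that receives it.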

Once that was defined, a few early-adopter LMS companies added LRS functionality to their existing solutions, but we at Saltbox chose to build Wax LRS as a stand-alone solution. We made this decision for a few reasons: a) an LRS must scale quickly to meet the storage demands of high volumes of learning records from multiple sources, so you don't have to worry about it; b) an LRS must meet the highest standards for performance and reliability; c) new features and enhancements should ship rapidly and iteratively based on customer demand, whereas changes to an installed bolt-on LRS can take months of upgrades and patches; and d) all of this should be easy, without the need for IT planning, support, and implementation (and without expensive setup, maintenance, or installation fees).

So the L&D professional has some important decisions to make, and it's not a choice between an LRS and an LMS, but rather a deliberate strategy to build an integrated environment in which multiple LRSs and LMSs work together. The LMS is not dead (heck, you probably invested six figures in that thing years ago, and it still provides value). The LRS is not some bolt-on or dumb database (there are layers of value that different LRSs can provide). At the end of the day, this is not a technology competition; every organization has specific needs, and different solutions must exist to provide different value.

Here’s one possible future that we envision for the L&D organization. Your organization will have one or two LMSs that provide different training and reporting capabilities, with an LRS built into one of them that tracks specific information that is important to you. Additionally, you will have a stand-alone LRS that provides an added layer of more specific learning analytics about learning experiences that can’t be tracked in those other systems. Heck, you may even have another LRS that tracks real performance data. The great thing is that the Tin Can API enables all these systems to talk to each other and share learning data in a simple, consistent fashion, so your business intelligence is never locked up in any one system.

Why do you need this? Simple: there is no single magical solution for your needs if you are functioning as a true Performance and Development (P&D) organization. Ultimately, you need to experiment, iterate, and build the right learning technology ecosystem for yourself (with the help of the awesome Tin Can community, of course).

To learn more about the Tin Can API and to see a stand-alone LRS in action, please register for our free webinar on November 8th. We hope to meet you in Tin Can Alley at DevLearn 2012!

What does this look like for you in your organization? Please share your comments and feedback below.