Open Source Assessment

In his Half an Hour post Open Source Assessment, Stephen Downes considered what “…the ideal open online course would look like. …[his] eventual response was that it would not look like a course at all, just the assessment.” He goes on to describe that assessment in terms of authentic assessment.
His reasoning was: “were students given the opportunity to attempt the assessment, without the requirement that they sit through lectures or otherwise proprietary forms of learning, then they would create their own learning resources.”

This ties into two threads of conversation I’m having with Theron. The first is how the university could use Web 2.0 ideas to respond to a Governor’s order to close during a flu pandemic; I wrote an emergency packet for this event. (Theron noted that the skills in my packet are the skills we’d want students to have when they leave the university, so we jokingly re-purposed the packet as “Open in case of graduation…”)

The second thread is the project we are kicking off for our ePortfolio competition. (As the links might change, here are three pointers into the work: the 2006-07 contest, the portfolio of its results, and the 2007-08 pre-announcement.) The 2006-07 contest asked students to make a portfolio documenting some aspect of their learning, and we got some very interesting (and diverse) results. The idea of the 2007-08 contest is that students are to find a problem facing WSU or their community, develop a solution to that problem, and then document their learning in a portfolio.

To help prompt students to join the contest, we are collecting short summaries of students at WSU who have already done similar work in the context of classes. In a distance degree course, Decision Science 470, students were challenged to find a problem within their workplace, form teams around the problem, and solve it. (Students saved a dairy plant from closing, found new ways to manage inventory, and improved the performance of call centers.) In Human Development 410, distance students were challenged to find a problem in their community that mattered to them and to learn about the policy and political aspects of that problem in order to understand it and become an “engaged citizen” around the issue.

What is missing from Stephen Downes’s analysis above, and what we had in these courses, is an idea of how to leverage the group of learners to enhance their learning. In the John Seely Brown notion of repair technicians keeping their radios on and thereby becoming a community of practice, the course can be a hub for a community of practice. The designs for the courses above were a set of prompts that scaffolded some open activities (e.g., interview a member of the community who has expertise in your problem but whose perspective differs from yours, and ask how they organize to advance their political goals). Students share their work on the activities (we used threaded discussion, but Web 2.0 solutions might have advantages), and feedback among peers is encouraged and structured with a rubric. As Stephen suggests, the final assessment is open, and the evidence for the assessment is some form of synthetic response (a portfolio or essay) to the course’s overarching question.

This design does not depend on a central role for the instructor; in fact, we have a growing body of evidence that students can use a rubric to provide peer evaluations that agree very well with faculty assessment (they agree as well as faculty agree among themselves). In the case of a pandemic, this means the course could proceed and succeed in the absence of the instructor.
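(For readers who want the idea of “agreement” made concrete, here is a minimal sketch of one way it could be checked: compare each portfolio’s average peer rubric score against the faculty rating. The scores, the six-point scale, and the use of a Pearson correlation are illustrative assumptions for this sketch, not our actual data or the statistic from our study.)

    # Illustrative sketch only: hypothetical rubric scores on a 1-6 scale.
    from statistics import mean

    faculty_scores = [5, 3, 4, 6, 2, 4]        # one faculty rating per portfolio
    peer_scores = [                            # several peer ratings per portfolio
        [5, 4, 5], [3, 3, 2], [4, 4, 5],
        [6, 5, 6], [2, 3, 2], [4, 4, 4],
    ]

    peer_means = [mean(ratings) for ratings in peer_scores]

    def pearson(xs, ys):
        """Pearson correlation between two equal-length lists of scores."""
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    print(f"peer-faculty agreement (r) = {pearson(peer_means, faculty_scores):.2f}")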

So, given Stephen’s open assessment model, what is the role of the university? I think its role is to make occasions for learners to form communities of practice and networks among themselves as a collection of experts. It might credential, based on these open assessments, but its graduates would have portfolios of authentic work that would be open to evaluation by employers and others outside the university. Further, because the graduates would be members of communities of practice, they would have reputations that would help third parties assess their knowledge and skills. Web 2.0 thinking lets us have conversations about how such a university might be decentralized, either in a time of crisis or to serve a distributed community.


3 Responses to “Open Source Assessment”

  1. One small step for man » Blog Archive » Cheating on online exams (a speculation) Says:

    […] One small step for man Exploring learning & technologies from outside the university’s walls « Open Source Assessment […]

  2. SC Spaeth Says:

    ‘Theron noted that the skills in my packet are the skills we’d want students to have when they leave the university, so we jokingly re-purposed the packet “Open in case of graduation…”’

    I suggest that you change it once more to “Open in case of matriculation…” and I’m not joking!

  3. SC Spaeth Says:

    You said “before graduation…”
    I replied “before matriculation…”
    And now that I see it in print, I see it should say “before accepting…” or even “before applying.” If these are what we aspire to have our students know and be able to do by the time they finish their experiences with us, doesn’t it make sense to make it clear from the very beginning of the process?
