This is an archive of the 2013 version of ocTEL.
How do course dimensions drive and influence our use of technology? Hill et al. (1) provide a model of the different factors at play here, identifying four key areas: Logistical: student numbers, class/programme duration etc. Practice-based: activity type, participant expertise, existing practice etc. Pedagogical purpose: pedagogical plan and guidance to instructor …
This activity was based on a JISC (2010) publication. I’m basing my reflection on a new undergraduate level 2 module which myself and a colleague are planning at the moment. It is based on an online business game simulation played in groups. It will have two group assessments, a business plan and a performance presentation, […]
Hoping to do more than one thing on #ocTEL this week, but here’s my starter for ten. This blog post will be discussing and critiquing assessment methods available on technology-enhanced learning courses. Method 1: Online test How does the assessment align with the course learning outcomes? Tends to be more appropriate for lower level learning […]
Here’s my recent blog post for Week 5 but it seems to have opened up memories of 1994 when I did a course in the RAAF. I dug out my old course manuals and Learning Styles results of tests…. ah the memories…. http://activatelearning.wordpress.com/2013/05/19/week-5-platforms-and-technologies-for-octel/
@AliSheph We don’t offer assessment in the library – so will be learning from you et al. Will discipline preference arise? #octel #libchat
— Elizabeth E Charles (@ElizabethECharl) May 19, 2013
The second activity of this week on “Producing Engaging and Effective Learning Materials” is about the evaluation of resources in our area. In my case, that means evaluating a resource for the learning of mathematics. However, I will start from a more general perspective. Whatever the targeted learning, the first thing to check is the validity of the content the resource claims to provide learners, with respect to the referent discipline. Only then will I assess it from a learning perspective. Indeed, there are many issues to consider, from accessibility to usability, motivation and autonomy. But three questions have a high priority in driving my evaluation:
Why would the student do or say this rather than that?
What must happen if she does it or doesn’t do it?
What meaning would the answer have if she had been given it?
I borrow these formulations from the Theory of Didactical Situations (Brousseau 1997, p. 65), but the questions are very pragmatic. The theory works here as a driver of our thinking; it is a tool to anticipate what the learning outcome could be, its likelihood, the possible limits, and hence the needed intervention of a teacher. Depending on the responses, one may have to stage the use of the resource in one way or another.
Interaction and feedback are the main objects of the evaluation. The issue is not that students will do this or that, but why they do it, because the constructed piece of knowledge must appear as the best adapted to the situation. Knowledge is something you reconstruct for yourself and appropriate because of its use value. The next issue is to verify, if the resource is interactive in some way, that it can give students feedback so that they have a chance to realise that something went wrong and then react to it. If the resource is not interactive, then the issue is whether it is possible to find out anything about the activity of students (possibly just reading) and provide the appropriate support. Ultimately, what is at stake in this inquiry is the meaning possibly constructed by the student.
All this assumes that there is enough documentation about the resource; otherwise one has to guess or invent. With just a resource and no information about its design, the intention of the designer, or indications about its use, it is hardly possible to make a proper evaluation. This may be the reason why I couldn’t do it for the proposed resource. But, in any case, I will do the exercise when completing the third task of the week.