April 23, 2013 at 11:43 am #2571
Sorry in advance, everyone. I used to work in a statistical office, so I’m a nerd on this topic…
Characteristics in common:
No collection of demographics or duration/time of day information about online habits
Don’t appear to tie in with general population surveys
Contain a lot of questions about non-technology “course completion” issues such as motivation, time-planning and distractions.
Quite a lot of questions seem so general that you wonder what the point is. If they are meant to prompt reflection, they would need to be connected to a results page that provides support resources for self-identified issues.
Mostly omit questions about use of “conventional” communications, e.g. phone and writing, which I found can be very revealing in identifying students likely to struggle online.
No open or “don’t know” options.
All MCQs – most of the options were ratings rather than rankings.
I used a simple SurveyMonkey survey with adult learners for a social media course over three years – see http://www.surveymonkey.com/s/JF7DG28 if you’d like to try it or borrow any questions.
This was adapted recently based around very good advice I got on a course run by UCC’s Economics Department on constructing self-completion questions. I think you have to start from social exchange theory – what is in it for the students in completing the survey? Are they voicing concerns, creating change, directing the course towards their learning interests?
My survey was designed around understanding what digital habitat they occupied, which communication channels they preferred, and helping them to orientate themselves or “benchmark” against general population surveys of digital technology use (which we did in week 3 of the course).
In particular I want the 30% (GPS figures for North America but a bit out of date http://venturebeat.com/2010/11/11/forrester-privacy-concerns-faceboo/ ) who are very concerned about social media and privacy to realise this is not abnormal and there is no “right answer”, along with the vast majority of lurkers who don’t want to contribute on forums. I should perhaps mention that I would mostly teach over 30s “digital visitors” with a smattering of mid-20s digital residents. There is a stark age-related digital and literacy divide in Ireland which is exacerbated by a very high number of over 60s particularly in rural areas having left school at age 12-14, compared to a large younger population of more highly educated “digital residents”. I also want to vary the course content depending on what “big questions” my digital visitors are interested in finding out more about.
Looking back at my survey and Helen Beetham’s new surveys and the four in the course materials, I found lots to incorporate and lots that I wasn’t sure about.
I do not think questions about disabilities and accessibility are appropriate, other than the San Diego one about “do you know how to access help if you need it”, as institutions should have accessibility policies and resources in place. (NB I have also taught modules for disability studies about technology – I’m very aware of practical issues like accessible assessments, but that’s a different topic to TEL readiness questions…)
None of the questionnaires asked if people keep data backups. Both as a lecturer receiving the “my hard disk died so I need an extension” excuse, and as someone who works in technical support, I think everyone starting a TEL course needs to consider data backup.
The questions about use of video (whether the learners liked it) were, I felt, too general – use of short, easily navigable clips is a different kettle of fish to tedious long lecture recordings.
Although there is controversy about “other” free-text boxes in surveys (bias against those who feel unqualified to answer, versus useful extra information for future surveys) and “don’t know” options (acquiescence bias), I like them for finding out the answers to questions you forgot to ask (which helps with the design of the next survey) and for supplementing piloting work to identify which questions don’t work or need rewording.
I felt some questions (UPenn) were so general that everyone would rate them “somewhat”. I was puzzled by the question about whether you always have a web browser open. Do you know anyone who doesn’t, except when they are doing “deep thought” writing?
There were a couple of “when did you last beat your wife” questions, e.g. San Diego’s “I am shy and do not communicate effectively with my instructor and other students”. How are the two parts of this option related to each other, or to the question?
I really liked Helen’s question about whether learners felt they could judge the quality of online resources, and her mention of using methods of study where you can access from anywhere. In the past year I’ve taken to keeping all my study notes in the cloud and I feel this is a crucial tool students need to get to grips with (and can help with data backup problems). A lot of the work I’ve done on digital literacy is around helping learners with search techniques, search results and awareness of filter bubbles http://www.youtube.com/watch?v=B8ofWFx525s.
Finding out who’s uncomfortable with online discussion is also vital, but no-one (except Helen) asked whether learners had techniques for “re-finding” online information, such as bookmarking and referencing tools – vital!
I wasn’t too sure about Helen’s question about the economic value of digital literacy in getting a job – it’s a pre-requisite, necessary capability that isn’t really domain specific. I’m really looking forward to watching her webinar recording this week.
I’ve always liked this graphic around digital literacy (page 19 – it’s so frustrating not being able to post graphics on this forum!) http://archive.futurelab.org.uk/resources/documents/handbooks/digital_literacy.pdf but, turning back to the practicalities of getting worthwhile answers from pre-course surveys, I am not very convinced that a lot of my students know what critical thinking is before the course – and suspect a lot will just tick the yes box on those questions…
April 23, 2013 at 11:44 am #2572
Even more sorry that the bullet point formatting doesn’t appear to work on this forum!
April 24, 2013 at 9:42 am #2656
I found this really enlightening.
April 24, 2013 at 12:13 pm #2671
Great post, really useful – thanks for sharing your expertise. People often think surveys are simple tools and do not put a lot of thought into the design of questions.
April 24, 2013 at 5:47 pm #2704
Top post – really useful, thanks. Please start your blog up again!
Jo
April 24, 2013 at 5:47 pm #2705
Thanks for your post – I was hoping to use some questions from the examples for a student survey in the autumn – but found them quite disappointing, and had some concerns about students knowing they should give the ‘right answers’ affecting reliability. I’ve blogged about them less eloquently than you at http://alicesadventuresinedtech.wordpress.com
Yours, however, is full of useful material, which I might take as a basis for adaptation, if that’s ok with you?
April 25, 2013 at 10:03 am #2740
Thanks for sharing your survey. Did you consider using a five-point scale on “What technology do you use at the moment?” to distinguish between occasional and regular use?
April 25, 2013 at 1:14 pm #2742
Very useful post Imogen – thanks for taking the time to share it. I didn’t realise how much I agreed with the points until I saw them all written down.
April 25, 2013 at 1:47 pm #2746
I agree with Imogen’s original assessment, and with addressing students’ skills in searching.
Ideally we would be able to assess all students’ digital literacy, whether face-to-face or distance learners, and from the results point them towards the areas they need to learn before embarking on (or while doing) their studies, emphasising that this will assist their learning overall. Not just backup, but where to back up: if using the cloud, it needs to be a cloud service where you still retain copyright – with a personal Google account you don’t, and the same goes for Flickr, Instagram, etc. In addition there is a need for file management, constructive file naming and file version control.
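The point about constructive file naming and version control can be made concrete. A minimal, hedged sketch in Python – the naming convention itself is my own illustration, not something prescribed in the post:

```python
def constructive_name(project: str, topic: str, version: int, ext: str = "docx") -> str:
    """Build a sortable, self-describing file name for draft version control.

    Zero-padding the version keeps alphabetical order aligned with version
    order: v02 sorts before v10, whereas a plain v2 would sort after v10.
    """
    return f"{project}_{topic}_v{version:02d}.{ext}"

# e.g. constructive_name("socmedia", "essay", 3) -> "socmedia_essay_v03.docx"
```

The zero-padded version field is the key design choice: it means a folder listing sorted by name is also sorted by draft version.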
This morning I taught a class of second-year students on a technology-related degree who don’t know how to use basic word-processing formatting. They have been sent away to learn it, and told why they need to learn it: if they are submitting a CV for a technology job in digital format and can’t work Word, it will show, and it will raise questions about their abilities in operating basic software.
May 12, 2013 at 7:43 pm #3609
Ali I’m so sorry!
I had not noticed that my post had got any replies until today!
You are MORE than welcome to use or adapt any questions from my survey. What I would suggest is working backwards from whatever general-population digital/information literacy work is done in the UK (are there surveys carried out by the ONS on this at all?) and seeing if you can include some of these usefully, because then your students can get a feel for where their habits lie compared to “Joe Public”. I used some questions from surveys that had been carried out for semi-state bodies by commercial marketing companies and were in the public domain (both the questions and the results).
May 19, 2013 at 10:37 am #3867
Thanks Imogen. There’s so much going on that it’s hard to remember to go back to stuff from a while ago!
June 2, 2013 at 8:49 pm #4256
This is a really enlightening look at questionnaires, the issue of those completing them over-estimating their competence/abilities, and the fact that, surprisingly, some learners still lack basic IT skills. This is why I created the web quest (during the learning design activity): to try to get a baseline of learners’ competency before a F2F research session took place. I echo Anortcliffe’s views and the frustration the learner in question may encounter in suddenly hitting this obstacle, which just adds to the layers of difficulties they have to overcome, never mind the demands of the discipline/subject.
June 3, 2013 at 7:54 am #4272
Niall, what options would you think should be given on the five-point scale here? Would it be “not at all” to “all the time”, or “never” to “often”?
Sorry for late reply… 🙂
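Whatever the labels, a five-point frequency scale like this is usually coded numerically for analysis. A minimal sketch in Python – the labels below are my own illustration, not from the thread; a usage-frequency wording avoids the vagueness of “never” to “often”:

```python
# Ordered labels for a five-point usage-frequency scale (illustrative wording).
SCALE = ["Never", "Less than monthly", "Monthly", "Weekly", "Daily"]

def code_response(label: str) -> int:
    """Map a scale label to a numeric code 0-4 by its position in the scale."""
    return SCALE.index(label)

def mean_score(responses: list) -> float:
    """Average coded score across respondents (ordinal data, so interpret with care)."""
    codes = [code_response(r) for r in responses]
    return sum(codes) / len(codes)
```

For example, `mean_score(["Daily", "Weekly", "Never"])` gives roughly 2.33 – though since the scale is ordinal, medians and frequency tables are often safer summaries than means.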
The topic ‘Surveys… the nerdy view’ is closed to new replies.