"Deep" learning is not the ideal

This topic contains 22 replies, has 17 voices, and was last updated by Simon Fokt 3 years, 3 months ago.

  • #15613
    c.collis
    Participant

See blog post at http://chcoll.wordpress.com/ for the Week 2 Approaches to Learning activity. Comments welcome.

    • This topic was modified 3 years, 4 months ago by c.collis.
  • #15877
    KeithSmyth
    Keymaster

Many thanks for beginning our ‘approaches to learning’ discussion with a very thought-provoking post.

    Lots to mull over, and then I’ll look forward to commenting via your blog.

    I hope other colleagues will too 🙂

    Keith

  • #16778
    glenn
    Participant

    I feel this is a balanced article, and raises the topic of employability and learning styles very well.

  • #16930
    dlaurillard
    Participant

    C.Collis’s post reflects the problem of the way the deep/surface/strategic categories are seen. This has always been problematic because deep/surface was an evidence-derived category from a combination of student interviews and performance, and defined two mutually exclusive approaches to a text. Of those two, academic learning aims for deep, for the obvious reason that it leads the student to the intended understanding of a text (in the broad sense), whereas a surface approach leads to a misunderstanding or misconception. There is no question that any teacher in any context would aim to help students use a deep approach.

    The strategic approach was introduced by Noel Entwistle to make the perfectly reasonable point that in some circumstances, students are more concerned with extrinsic issues, and worry more about, for example, passing the exam than understanding the concept. This was an issue that my chapter on approaches to problem-solving (in the Marton et al book) focused on, because I’d found that students studied not just in relation to the task, but its context as well:

     

    “Thus the deep/surface dichotomy does not characterise a stable characteristic of the student, but rather describes a relation between the student’s perception of a task and his approach to it. The student’s perception of a learning task encompasses a multitude of things: it depends on its form and content, on its relation to other tasks, on the student’s previous experience, on the student’s perception of the teacher who marked it and of how it will be assessed. But the operational outcome of this combination of judgements and perceptions is an intention either to understand or to memorise, and thereby to use either a deep or surface approach.” (p136).

So it showed that we could not necessarily identify a deep approach as a characteristic of the student, but rather of the student-in-context. Introducing the strategic approach muddies the original clarity of the deep/surface dichotomy because a student can be strategic and either deep or surface in relation to a particular learning task.

    While I agree that C.Collis’s post is certainly thought-provoking, and agree with everything else you argue, I think this bit overstates the case:

    “Thus, I don’t think “we” should focus on “encouraging deep learning in online contexts.” I’d prefer to encourage a combination of strategic and deep learning because both are equally valid types and outcomes of education.”

It’s important for students to be strategic, and it was certainly not considered a bad thing by Entwistle, which is why he argued for that category. But it’s our responsibility as teachers to encourage deep learning. That’s how understanding and the ability to learn and think for yourself develop. We must help students develop that capacity if they are not doing it.

    The oddest thing, for me though, is the task we are given here – why online contexts should be any different from conventional learning in this respect is a mystery. Why would it be?

    • #17223
      KeithSmyth
      Keymaster

      Hi Diana

      Many thanks for sharing your thoughts here, and for taking us into the background of the original work which suggested the deep/surface dichotomy. It’s excellent to have your input and experience in this area to share amongst the group.

      I wonder if I could pick up on your last point, asking whether online learning contexts should be any different from conventional contexts with respect to approaches to learning or striving to encourage deep learning.

      I’m personally not sure they would be different at a fundamental level, but I do wonder whether we know enough about how different approaches to learning might represent themselves in online learning contexts, i.e. what a surface or deep approach might look like online, and what the implications might be for designing online activities and supporting online learning. I wonder if there might be important differences in how learners interact with online multimedia for example – does a ‘surface’ learner simply ‘play’ a video clip and passively watch it, while a ‘deep’ learner studies the content of the clip, and repeatedly plays it until they feel they’ve got the point? And if that’s the case, what can we draw from this in helping learners in general ‘learn how to learn’ online?

      In relation to one of the questions for this discussion, I also wonder if students who might lean towards a surface approach would have a more or a less effective experience online. It seems to me that a campus-based, lecture-based course with regular timetabled classes might potentially provide a more effective learning experience for ‘surface’ learners than an online course with more asynchronous activity, and more flexibility in when they might do things (the danger here perhaps being that they come into discussion activities at the very last minute, with no time to read and reflect on what others are saying or what they themselves want to say).

      I know there is some work relating to the relationship between approaches to learning and what students do online, including by Peter Goodyear and colleagues, but I just wonder if there is more to be researched and understood about how different approaches to learning actually manifest themselves online?

      Your take on this would be extremely valuable!

      Keith

       

      • #17535
        guy saward
        Participant

        Keith, picking up your comment on why online vs offline might be different kinds of contexts, I think this could relate back to the environment and what it is designed/intended/culturally used for.

        An online course which values interaction would, I suggest, naturally promote a deep approach in which participants are intrinsically motivated by their interest to understand the subject. Learners who are motivated by an extrinsic desire to pass a course might be expected to drop out given the lack of structure, or to participate at a superficial level, e.g. making three content-lite posts to earn a badge.

        The focus then moves on to the learning design of online courses, and how you measure (or assess, as per Rose’s take below) success. Completion rate is an obvious one. But apart from getting to the end of the course (as measured in strategic badges? and/or motivated by the learning community?), how do you measure successful completion – and who does the measuring? Would a constructivist/connectivist approach do this via a measure of community engagement (and learning experience)? Would a content-driven approach do it by more traditional assessment of knowledge?

        So the use of particular technology (e.g. Twitter vs online objective testing) does, I think, drive particular styles of learning if thought is not given to overall learning design. But good design should be able to neutralise the differences in technology/context.

        • This reply was modified 3 years, 4 months ago by guy saward.
      • #19677
        ilearninguk
        Participant

        I am always interested in our assumed knowledge about assessment and interactions in learning and the observed behaviours of our learners.
        We often set up learning opportunities that suit the establishment rather than the learner. Summative end-of-year assessment is completely unrealistic and does not reflect ‘assessment’ in the workplace, so why use it? I know of very few jobs (any?) where you are graded on a single point of assessment at the end of the year in a win-or-lose situation. Appraisals look at performance against KPIs over a period of 6 months or a year, and take into account fluctuations in performance. As a manager in the past, I know the secret of a good employee appraisal is that it is continuous and ongoing, with small ‘corrections’ along the way rather than a single big ‘kick’ correction at the appraisal. Yet we continue to expect our learners to perform at summative assessment points, which is even more remarkable when a student might have 4 or 5 different topics to perform in (take exams) within a period of a few weeks!

        I advocate continual assessment strategies, with completion and awarding of grades along the way rather than at a single opportunity. Coursework, short essays, and experiential and active learning observations are all strategies that promote deep learning. (I was getting to the point of this post…)

        For a learner to move away from strategic learning to achieve a grade, and to place value on the learning and, more importantly, its application, there needs to be greater support for allowing the learner to provide the evidence they want to, in the format they want to, rather than being dictated to by an assessment strategy.

        We assume learners want structured courses and managed assessment. Learners download apps to their phones to solve problems. They do not subscribe to a single “one-app-does-it-all” approach. They want bite-sized, disposable opportunities to show us what they know.

        We need to get better at recognising that.

        • #19747

          @ilearninguk, excellent points! I agree with you *with emphasis*

          -long post warning!-

          In fact, the technology and protocols for continuous assessment of learners already exist. E-portfolios are an excellent way of organizing continuous assessment. The practice of using them is also well researched, and the technologies are improving all the time – Mahara, LiveText, PebblePad, even Google Sites.

          *Aside* In order to receive ALT’s CMALT certification, one has to produce an e-portfolio documenting and reflecting on practice.

          What is curious to me is that even though the technology exists for institutional-scale e-portfolio uptake, not many institutions have embraced the opportunity the technology presents to re-invent or re-imagine their assessment methods. I understand that changing the way assessment is carried out in higher and further education institutions is much more than simply installing new technologies and new procedures. There are cultural dimensions to how assessment is perceived, and there will be resistance to change by stakeholders within institutions.

          Despite this, however, the fact of the matter is that you are overwhelmingly correct. Our current summative assessment methods are a, dare I say it, anachronistic left-over from another time. Let me quickly explain why I say this: in higher education, students are generally assessed with timed exams. Learners are expected to write essays demonstrating their content knowledge within a specific domain. They are not allowed to carry anything into the exam room with them.

          Contrast this with today’s reality, where learners are almost never without access to the internet and the vast array of information it holds. People today are not required to remember large quantities of information; they carry smartphones, tablets and whatever device they need to give them immediate access to information. So exactly how useful is the assessment method of testing students’ recall of content, when they will likely never be in a position of having to recall information unaided?

          Even if we go further and say that students are required to demonstrate a range of cognitive skills in timed exams, the reality is that in the real world these skills are not practised in a vacuum, certainly nothing resembling an exam room. There are always cognitive aids available in the environments of the real-life situations where these cognitive skills are used. So exactly how relevant and useful are timed summative exams to the real-world practice that students will undertake when they leave school? Exactly how useful and practical are the skills that we are assessing with these exams? I believe that we are not assessing the skills that students will need or use when they leave school.

          Like the QWERTY keyboard, our higher education summative assessment methods seem to persist because we “have always done it that way”. I am a strong advocate of e-portfolios and *fingers crossed* hope that protocols can be developed to speed their widespread integration into higher education assessment.

          Best,

          Kathy

        • #21048
          meg colasante
          Participant

          @ilearninguk and @kdaniel – enjoyed reading your interchange here on continuous performance. I’m not going to add to the conversation here (being a week ‘behind’ on the discussion) but I do want to simply also ‘flag’ this issue as ‘progressing’ but ‘not yet solved’ from my small viewpoint in an Australian university. However, my institution serves Vocational Education as well as Higher Education, and we generally do much better at continuous performance in the VE context.

        • #24064
          Simon Fokt
          Participant

          I couldn’t agree more on the uselessness of end-of-year summary assessments. In philosophy, students are typically given two hours to complete three short essays on topics chosen from 5-10 options. This seems to me the most unnatural way to conduct assessment – what relevance does it bear to any actual practice students will engage in in their futures? The fact that we give students several topics to choose from additionally encourages surface learning – topics are secret, but it’s not hard to predict what areas they will cover. Every topic will roughly correspond to one area covered in the module, so students can strategically spend all their time learning only three of all the topics covered in the module, and still reasonably expect to write a perfectly acceptable exam paper.

          I think technology gives us great opportunities to deal with that – continuous assessment ensures that students have to deal with all issues discussed during the course. Forum discussions, regular reports, etc., are a great way to consolidate learning. But I see a basic, and unfortunately very sad, problem with this approach: it takes more time and resources than exams. Given the current attitudes towards teaching, and the way governments and university administrations undervalue and exploit teachers, it seems rather unlikely that they would be interested in implementing teaching methods which would require an increase in paid teaching hours. It seems to me that courses such as ocTEL should target the people who have the power to actually create conditions in which all this fantastic technology enhanced learning would be possible.

    • #23453
      worldexpos
      Participant

      Very interesting, thank you Diana.

      I shared the same question about the offline/online distinction (‘discrimination’).

      Various contributions, and my own experience of the course, help me understand how two different settings would make a difference.

      This is a really fascinating question, also in relation to ideas and practice of blended teaching & learning and flipped classroom.

  • #16989
    Gary Vear
    Participant

    “‘strategic’ learners don’t just want to please the lecturer and earn high marks just for the sake of high marks: they want to earn high marks in order to boost their employability; they want to follow assessment requirements because they have often been told that these requirements and criteria align closely to key professional attributes.”

    I feel this sums up the viewpoint of the students that in the past we would have termed ‘gifted’. It is all about strategy – be it strategic assessment choices or strategy when dividing up time and effort. ‘How can I get the best results with the least effort?’

    We live in a society where, in order for a change to take place, we have to make something easier, quicker or cheaper, and I feel this mantra has infiltrated education rather than being incorporated formally. (This point applies to staff approaches as well.) How many HE/FE tutors are assessing through oral exams and vivas because they are quicker and easier to assess than reading a 3000-word essay?

    I agree with the notion that a mixture of strategic and deep learning is the ideal that we should all aim for, but the reality is that if one path is easier than another, most of us will choose the simple solution.

    • #17242
      KeithSmyth
      Keymaster

      Hi Gary

      Your post introduces a really important point around assessment practice, and where we might be making choices of assessment methods based on convenience and economies of scale (although multiple oral exams versus multiple essays might be a close tie!). This got me thinking about the debate in recent years around assessment of learning versus assessment for learning, and how we might ensure that assessed activities are meaningful learning activities in their own right (e.g. project based work, problem and case based assignments, industry based assignments).

      I’m hoping we’ll come back to pick up on these issues, and what it means in online learning contexts, in some of the upcoming discussions and activities – particularly in the task around designing ‘authentic’ learning activities to promote effective learning.

      Would be great to know your own further views on the above!

      Cheers

      Keith

      • #17308
        Rose Heaney
        Moderator

        Hi Keith et al

        This is a great thread with lots to think about. On your point re: assessment, I would agree it is key to encouraging deep learning (or otherwise). In fact, as a recent Twitter stream suggests, the absence of any assessment or award on here is in itself an encourager of deep learning (or that is my interpretation anyway).

        So if assessment is a major contributing factor to deep learning, what should it look like on the courses that have it, and is it different in the online environment? The first time I was a serious online learner on a course for credit was quite a few years ago, when I took an M-level module in learning technologies. The assessment took the form of a portfolio consisting of evidence of participation in a series of online discussion activities, group work and a project on a relevant topic. I only needed to pass the module to satisfy a requirement for my job at the time, but I got more & more engaged as time went on through the very rich interactions that emerged on the forums and in groups, and ended up getting a distinction because I had a substantial portfolio by the end of it. However, the portfolio just emerged naturally – the project took a bit of work, but I was ready to make the effort because by that stage I was enjoying the course so much. Would an equivalent face-to-face course be able to operate in this way? One on learning technologies might, because it would have to encourage participants to operate online – but would other subjects lend themselves so easily to this type of assessment & associated deep engagement?

        Assessment may not be the only driver of deep learning, but it certainly is a big part of the equation, and in the online environment we may have more (or different) options.

        Rose

        • This reply was modified 3 years, 4 months ago by Rose Heaney.
      • #17388
        Gary Vear
        Participant

        @keithsmyth Unfortunately I feel we work in an industry where assessment methods will always be chosen for convenience when possible. The irony is that technology can offer some of the more ‘convenient’ assessment methods. I think this brings us back to one of the early points in the course regarding ‘engaging staff’. So long as practitioners are resistant to new forms of delivery, the sector cannot evolve.

        Having said that, I would be looking forward to picking up this topic again in the future.

        @RoseHeaney “Assessment may not be the only driver of deep learning but it certainly is a big part of the equation”

        I’m not sure why, but this sentence really got me thinking: is assessment part of the equation at all?

        Deep learning comes from having a passion for the subject and wanting to immerse yourself in all the knowledge of the topic at hand. This is usually sparked by a necessity to improve one’s practice in a particular field, and by having the time to allow yourself to do so. So would a deep learning equation look like this:

        Deep Learning = Want + Need + Time – Distractions (DL=W+N+T-D)

        This leads nicely to the others. We choose strategic learning when we are unable to submerge ourselves fully in a subject. We may still have a passion for the subject, but life gets in the way of all our passions, so we become selective over which bits we will explore and which we won’t.

        Strategic Learning = Want + Need + Distraction – Time (StL=W+N+D-T)

        Surface learning is just the reverse equation. There may be a need to participate, but there is no drive to succeed and delve deeper into the subject matter. We allow ourselves to become distracted to break the monotony of the course and dedicate our time to alternative activities.

        Surface Learning = (Need – Want + Distraction) ÷ Time [SuL=(N-W+D)/T]

        I’m just thinking aloud here, but have always found great clarity when theories can be placed into an equation.

        • #23459
          worldexpos
          Participant

          Hi Gary,

          really liking your equation-driven assimilation of the scholarship on teaching and learning.

          Was actually caught in spreadsheet equations recently, trying to work out a statement of assessment at Pre-Honours level, so students are clearer about the assessment regime.

          The good thing about these equations is that they are really concise and accessible, and seem to capture adequately the scholarship they originate from. Can really see using them directly to engage students on what learning is and/or can be about beyond high school.

          Thanks Gary.

  • #17316
    Santanu Vasant
    Moderator

    A very interesting thread with some really great points – I agree with Professor Laurillard’s points, but wonder how best to encourage this deeper learning, as Keith mentions too. In my view we don’t do enough of this thinking, or practise it in our teaching, and the way assessment is structured makes this difficult. I almost have this image of learning as a wave (analogue) and us as educators taking a sample (digital) of this learning for assessment. Not sure if this makes any sense. I guess reflection is a tool for deeper learning, provided the right framework exists. Interested to know what people’s views are on this. In this thread the moderator has turned learner, asking more questions and getting a little confused – but I guess this too is a part of deeper learning?!

  • #17348
    c.collis
    Participant

    Thanks everyone for rich and thoughtful dialogue. Diana, I appreciate your anatomising the roots of the “deep/strategic” distinction: I’ve added Entwistle’s and your pieces to my (growing!) ocTEL “to read” list. Gary, I often find designing and assessing what are conventionally seen as ‘deep’ activities easier than ‘strategic’ ones: to design a ‘strategic’ assessment activity I have to incorporate academic concepts and approaches, and then articulate those to professional contexts. Because essays aren’t the standard communication platform used in industry, I design the activity to more closely reflect industry practices, for example, students doing 15-minute video research presentations ‘to the board’, and then posting questions and responses to each other’s videos online. This means watching lots of videos! It involves much more detailed activity instructions and support. It would be a lot easier to just get them all to write essays: it’s subjective, but I can read and mark a pile of printed essays much more quickly than I can watch videos, find and read students’ online comments, and then assess those. But Gary and Keith, you’re right in noting that there is a tendency to “choose the simple solution”: my faculty recently implemented a policy that sessional staff (i.e. tutors) will only be paid for two items of out-of-class marking per semester; any other assessment needs to be done in class time. Suddenly, in-class oral presentations are back, whether they’re pedagogically useful and encouraging of deep learning or not.

     

    In thinking through this issue more, I think the challenge is to design opportunities for strategically deep learning (i.e. guiding students towards areas in which deep learning and learning approaches would be of value to them) and deeply strategic learning (i.e. drawing on my own deep learning in order to design activities which have strategic intended outcomes for students, such as their development as practitioners beyond university, as well as targeting students’ deep learning towards intended professional outcomes).

  • #17811
    Moira Sarsfield
    Participant

    I have just posted about my understanding of surface and deep learning at http://eforenhancing.wordpress.com/2014/05/15/surface-and-deep-learning/ based on Activity 1.3 which we did last week.

    Let me know if you agree with my description and which category you fit into!

     

     

    • #18738

      Moira,
      Thanks for your contribution; I tend to flip between the two. On this course I am in deep learning mode. I like your approach to demonstrating your understanding of this issue – using narrative feels fresh.

  • #18225
    jgriffin
    Participant

    Thank you for your initial post, c.collis; challenging the deep learning assumption has resulted in a number of interesting responses, and I also enjoyed your reflections on your approach to ocTEL. I find the inclusion of reflective journals within online courses, and the approaches to the completion of these, illuminating as to what level of learning the learner is engaging in.

     

     

  • #20904
    Alicia Vallero
    Participant

    I’m copying my comments too. There must be a way to share links? Or is copy-paste the best way to go? Anyway!…

    In my response to this question (http://octel14.blogspot.com.au/2014/05/week-2.html) I “confessed” that I was often “tempted” to opt for the strategic approach when tackling the ocTEL activities. Your comments help me appreciate the value of what I was doing and realise that I should actually encourage my students to balance deep and strategic learning. And like you I find the ocTEL dialogue very enjoyable and beneficial.

  • #17176
    KeithSmyth
    Keymaster

    Hi all

    Copy of my comments on c.collis’ opening post, which I thought I’d share here as well as via the blog post…

    Many thanks indeed for getting our discussions on approaches to learning underway with a number of very important points, and it’s good to see your thoughts have already prompted a couple of responses including a detailed consideration from Diana.

    There’s certainly a tension between viewing ‘deep learning’ as the ideal to strive for in education and the reality our students face, including the decisions they have to make. At my own institution we run a workshop exploring approaches to learning for educators who are taking their Pg Cert in Learning and Teaching. We often hear from them that while they like to think they would lean towards a deep approach, in undertaking their Pg Cert they have to be at least partially strategic to fit their formal studies around their teaching roles. That makes me think about the fact that many of our undergraduate students are effectively part-time when taking into account part-time work and other outside commitments, and will by necessity need to make decisions about where the trade-offs and priorities in their studies need to be (regardless of whether their driving motivation is around understanding, achieving, or simply getting through their course).

    I do personally think that as educators we need to strive to encourage deep learning, and think this supports a qualitatively richer learning experience, but wonder about how we do this against a range of competing demands and contextual factors – many of which you describe. How can we ensure that those who might be taking a strategic approach still have as rich a learning experience as possible, and how can we ensure those who may be leaning towards a surface approach in a particular context can be supported to go even a little deeper?

    Diana has raised the question of why online contexts should be any different from ‘conventional’ learning with respect to students’ approaches to learning. I’ll pick up on that separately.

    Very best

    Keith

    • This reply was modified 3 years, 4 months ago by KeithSmyth.

The topic ‘"Deep" learning is not the ideal’ is closed to new replies.