I’ve been thinking a little more this morning about the language used by the researchers in the SJSU Udacity report. They focus a lot on student “effort.” But it’s also pretty common in education to talk about “engagement.” From a technical perspective, the researchers chose the better word. “Effort” is meant to be an observable behavior, e.g., how many minutes students put into watching videos or how many homework problems they solved. “Engagement” is a non-observable attitude that might be a cause for differences in effort that we observe between students. But the connotations of these words tend to encourage different sorts of questions. When we talk about a problem with student effort, we tend to ask how we can get students to do more work. When we talk about a problem with student engagement, we tend to ask how we can get students to want to do more work. The former might lead us to solutions such as student reminders and alerts when they are falling behind or changes in schedule to accommodate students with jobs, while the latter might lead to ideas about increased interactivity or changes to the content.
Just a thought.
doctorkip says
There’s been a long history of measuring student engagement, which has been shown to correlate with retention and success. The National Survey of Student Engagement, a well-tested instrument, is one popular method used at hundreds of 4-year colleges and universities across the U.S. There are good ways to measure students’ perception of engagement even in specific courses, online and FTF; it might be best to think of ways to measure both engagement and effort. Clearly, the two are related in important ways, as long as we don’t confuse engagement with “entertainment.” Studies like the Harvard Assessment Seminars showed that the challenge of a course is related to students’ self-reported levels of engagement (and curiously, the amount of writing assigned is among the factors most strongly related to engagement).
Michael Feldstein says
Thanks, doctorkip. This is great information.
pwik says
Michael, I think this is an important distinction to make — thanks for calling it out. One (effort) is an operational issue, the other (engagement) is a course design issue. And as you observed, effort is clearly tied to level of engagement. However, I think there is a third piece here — motivation. What motivates me to be engaged in this course and therefore put out the effort needed to succeed? Especially in the non-credit course world, communicating that “value proposition” (to borrow a marketing concept) is at the heart of driving motivation -> engagement -> effort.
lauragibbs says
Very much the right questions to be asking, and I agree with the other commenters here. Motivation can lead to engagement which leads to effort in a nice, natural progression (ah, if only it were always thus!) – unfortunately, for students whose motivation is grades or completing a degree requirement, it can sometimes happen that they are ready to make the effort required to get the grade without engaging in a meaningful way with the class. So engagement and effort are separate but interrelated challenges. I work on both all the time, both in terms of course design (and yes, I would say engagement is very much a matter of course design… there’s not a lot I can do on a day-to-day basis that really affects student engagement in a profound way – but the choices I make in designing the class overall have a huge effect on student engagement) and also in terms of communication – in my daily announcements, generic reminders to students who are missing work, as well as every person-to-person communication with individual students, I am always soliciting more effort from the students and praising them lavishly for the effort they do invest. If they don’t make the effort, there is not even a chance of engagement – the best-case scenario is that I can help a student get into a good work routine for the class and then, glory hallelujah, they discover they actually like it! Sadly, the idea that reading and writing could be truly satisfying comes as a surprise to some students.
One of the things I like best about teaching fully online is that I have many more indicators, both direct and indirect, of student effort and engagement. In the classroom, I often had little idea of either.
Ann Doty says
Perhaps effort is to “energy” what engagement is to “mindfulness.” I am re“minded” every day of how I think about the work completed, the goals, and the sources of information I must pursue to get the result my mental model needs, wants, or expects in order to be successful in my career.
Student effort may, as you describe, equal observable and constructive movement toward learning through reading, writing or producing. Engagement, however, can also be observed as a measure of the number of “clicks” on an academically relevant website or the number of contributions to a discussion board and other sites within an online course. Thanks to social media, and the relevance of games and other ways of “being” within the teaching/learning cycle, the student/faculty interactivity has become “one” with a personal relationship. And yes, it is our job as parents, friends and faculty to pursue any way, any technique, any solution to reach and enable student learning so that they may lead the lives they envision.
Alfred Essa says
I think there is a useful distinction here between effort and engagement, but I am not getting it. I get confused as soon as we start saying one (effort) is “observable” and the other (engagement) is not.
We can imagine two students who exhibit equal effort but are unequally motivated. Is that what we mean by engagement? Both are doing the work, but one is “going through the motions,” so to speak. But if there is a distinction here, it must be observable in some way; otherwise it’s a distinction without a difference.
If two students show the same effort, but one is engaged and the other is not, how do we know the difference? There must be some “cash value” in observable differences in behavior, otherwise there is no difference in meaning.
lauragibbs says
I can imagine lots of ways to measure the difference, although most of them are not feasible (for me anyway). I would wager that the students I am characterizing as making some effort but without engagement will not remember anything at all about the course six months after it’s over. Immediate results of the effort (a quiz score, for example) might be identical, but the long-term outcome quite different. Just one example that comes to mind.
Michael Feldstein says
Al, the real distinction I’m getting at is cause and effect. Did the student stop doing the work because she had a crisis at home and didn’t have time to complete it? Because she wasn’t organized enough to keep on-task? Or because she just found the work to be so boring that she couldn’t force herself to do it? The answer is going to drive the kind of intervention we make in order to increase student effort. If we create an awesome immersive game but the student dropped out because she had organizational problems, then that was wasted effort. On the other hand, if we create an engagement early warning system to prod her when she is falling behind but the truth is that she stopped doing the work because she hated the class, then that is also wasted effort. We can observe the consequences of increased or decreased engagement. We just can’t observe that engagement directly because, unlike effort, it is a mental state.
Maha says
This is a very interesting discussion to raise, and the comments are very thoughtful. While I appreciate that engagement has been “made” measurable, I think the value of studying something should come not from its measurability but from the worth of what is studied. There is intrinsic value in engaging students… It makes them better learners, and I would argue, better individuals. Effort, on the other hand, while (only slightly) more measurable (you can’t see mental effort or work behind the scenes, not really), does not necessarily result from or indicate true engagement. I have studied many things that took little effort but with which I was very engaged. And vice versa. Placing motivation in the mix makes sense, though the relationships are nowhere near linear.
Laraine says
I think making these distinctions is really valuable, and I’m wondering whether they are, in general, being made by those marketing digital products for students. CourseSmart Analytics, for instance, claims to measure engagement from things like the amount of time spent on a page, highlighting, bookmarking…. Given how I, just like my former students, sometimes daydream for a very long time over one page if the material is either hard to process or boring, I wonder how legitimate these claims can be. Time spent does not equal engagement; ditto for underlining, which often misses key points and becomes mindless; and bookmarking does not report, or can’t report (I might be being naive, but if companies selling these products can report student visits, I find that kind of creepy), whether the student actually returns to the bookmarked site. These seem to be indicators of effort but not necessarily engagement.
However, I actually think the distinction between engagement and effort starts to blur– in good ways– when it comes to student involvement in discussion boards, number of assignments contributed, and pretty much all of the methods Laura and Ann mention above. I would say that these kinds of collaborative efforts can be a very good indicator of both effort and engagement, but I think many of the indicators relied upon by CourseSmart or other companies using the same measures are misleading and they are talking way more about effort than engagement. Thus I am wondering if they even make the distinction.
lauragibbs says
Here’s another factor to throw into the equation… the very dangerous surface similarity between engagement and COMPLIANCE. I think maybe I am wanting the difference between effort and engagement to somehow signal that distinction: you can make an effort out of compliance, but I like to think that engagement (real engagement) is not just compliance. I was prompted to add this comment by a post from Larry Ferlazzo today:
http://larryferlazzo.edublogs.org/2013/09/13/ill-take-90-student-engagement-over-100-compliance-any-day/
I’ll Take 90% Student Engagement Over 100% “Compliance” — Any Day
Kate says
I also found Laraine’s introduction of daydreaming helpful. Jonathan Smallwood writes persuasively about the science of “mind-wandering” and this has been taken up in studies of the impact of daydreaming in educational settings — not all of which is negative. But daydreaming, like engagement, is typically measured by self-reporting. Currently in higher education we seem to be favouring measurably visible activities, as though we want the gestures of learners (clicks, time spent, where they look on the page) to reveal something more authentic and useful than anything they might say about themselves.
This is a really old showdown between quantitative and narrative research methods. The risk is a bias that ends up favouring presenteeism, both online and offline. Showing up is countable; thinking really isn’t. So we’re seeing a major underlying correction in what it is that educators are meant to encourage, and for me this is why separating effort from engagement, and limiting what we think effort might demonstrate, is really helpful.
Alfred Essa says
I suspect there is a useful distinction here but, if there is, it should be observable. As Laura notes the observable consequences might come much later. Laura’s point is an important one and is what makes efforts at improving “learning outcomes” so difficult.
One of the best courses I took in college was a Shakespeare seminar. I was totally lost and struggled the entire term. If someone were to have measured my knowledge they would have surely concluded: “This dude has learned nothing.” It was only a decade or so later that I realized how much the course influenced me.
But it can’t be just a hidden “mental state”, a ghost in the machine, that never leads to anything observable. I tend to be a Peircean when it comes to clarifying the meaning of concepts. Peirce’s pragmatic maxim: “Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.”
If effort and engagement have exactly the same effects or observable consequences, then they mean the same thing.
Michael Feldstein says
Any halfway decent teacher can empirically observe the difference between effort and motivation. Whether an analytics package can do the same is another question entirely.
Alfred Essa says
Ok. We now have three fuzzy, possibly overlapping, concepts: effort, engagement, and motivation. If the assumption is that these are relevant for building effective educational technology tools, then we need to be more precise. So, how does a “halfway decent teacher” *know* the difference between effort, engagement, and motivation? What are the observable cues? And even if we are not interested in building educational technology tools but training teachers to be more effective teachers, then how do we train teachers to recognize the difference among their students?
lauragibbs says
Alfred, for humans it is simple (we may be fuzzy but we are not stupid, ha ha), but I really don’t expect a lot of progress on the machine side for these very human factors.
Effort is a component of engagement; without effort, no engagement. My interventions regarding effort involve checking up on the work that students have (or have not) completed and warning them when they are falling behind. The most important thing is to warn them when a deadline is fast approaching so that they do complete the work. Because our LMS (Desire2Learn) provides such poor support for that, I do most of it manually. Re: Michael’s observation about different kinds of interventions, I make it a point to get to know my students. Some students get very generic emails, which I send out via BCC: just a canned formula about needing to get some assignment done today before the deadline, blah blah blah, not addressing the student by name. On the other hand, if I know a student is dealing with a difficult situation (for example, managing a medical problem), I do not send the generic email but one which I write specifically for that person, written person to person, finding the right balance between the work required for the class and that student’s personal situation.
Engagement: This is a matter of course design. To improve the design, I need feedback from the students. So, how to find out if the students are engaged? Ask them! That’s one obvious way. Also, you can observe some patterns – students who write the bare minimum, students who do sloppy work, students who turn stuff in late week after week, students who skip assignments completely, etc. There is a whole series of indicators that, taken together, gives an overall picture of engagement for each component of the class, even if I would not want to label individual students for engagement. Since my engagement interventions are not so much on a student-by-student basis but instead in terms of course design, I don’t need to accurately label individual students, but I do need to get a sense overall of engagement levels for a given assignment or activity. If I see a lot of late work for a given assignment, a lot of sloppiness, a lot of bare minimum, then I know there is a problem with that assignment. I am continually revising the assignments to improve them (better instructions, more examples, more ‘fun factor,’ etc.), and I do not hesitate to discard un-engaging assignments in favor of new assignments likely to promote higher engagement. I have no idea how a machine would do any of that, although I would appreciate better analytics from the LMS (feedback collectors built into the LMS on an assignment-by-assignment basis) … but honestly, it’s not crucial. As an extremely attentive and engaged instructor, I have plenty of data to guide my course design decisions.
Motivation: I consider this a kind of raw factor, something students bring to the class. Trying to increase engagement for students with low motivation (inevitable, especially in a Gen. Ed. writing class!) is not easy, but that is what makes my job challenging in a good way.
Andrew F says
Such an interesting discussion, but I wonder whether measuring any of this would actually tell us anything. I found that these external measures, however they are carried out, simply lead to more intelligent questions being asked about the learning environment; they never gave me answers.
When I want answers, as Laura suggests, I always go to the student. But the student needs to be empowered to reflect honestly on their effort, engagement, and motivation, and their feedback needs to be valued and used.
When our students start to understand these issues in the context of their own learning then we as educators may start to understand them in the context of our own teaching.
Jan Poston Day says
One other thing to think about is the student’s level of persistence. Persistence, it turns out, is a very strong predictor of student success. Perhaps if we first measured a student’s baseline persistence as part of their pre-course placement test, then devised lessons to increase it, we would see greater improvements in the numbers of students mastering material, and at a greater depth of understanding. http://blogs.kqed.org/mindshift/2012/07/can-kids-be-taught-persistence/
@PhilOnEdTech says
My vote for most interesting & insightful comment thread of the year: on @mfeldstein67 post re Effort & Engagement http://t.co/zk8x8Ia84T
lcjshaffer says
This is fun! But I have a question – what part do demographics play in engagement and motivation? I work for a graduate institution and I teach online for a different graduate institution. Will I use strategies to engage and motivate my 35-45 year old graduate students differently than those who work in a community college engage and motivate the 18 year old students? Also my (work) graduate institution is very diverse, i.e. many non-US students.