The Monsters We Don’t See


It is Halloween week! While this year’s Halloween will look different than previous years, one of the ways we can always connect is with stories. Sure, the medium may differ depending on the times, and we may not be gathered around a room, but podcasts and blogs are a great way to share spooky stories to keep us up after we binge on candy and prepare to watch scary movies in the comfort of our own homes. This year’s story is about monsters you may not be able to see on your own.


We face these monsters every day as training professionals. They cannot be tackled the way the stories of our past have taught us to tame mysterious beasts. They lurk right in front of us: on our computer screens, on our social media accounts, in our books. Staying home cannot protect us from these monsters! This week on the Train Like You Listen podcast, Mad Scientist Clark Quinn, PhD, Executive Director at Quinnovation, joins us to spot and fight the monsters that plague so many in our industry.

Transcript of the Discussion with Clark Quinn

Brian Washburn: Welcome, everyone, to the Train Like You Listen podcast. And today we have a very special, very spooky, Halloween-themed podcast. Joining us today is Clark Quinn of Quinnovation. He is also the author of “Millennials, Goldfish, and Other Training Misconceptions – Debunking Learning Myths and Superstitions”. Clark, thank you so much for joining us today.

Clark Quinn: Thank you.

Brian Washburn: Clark, as we always do to get started with these podcasts, we'll ask our guest to introduce themselves with a six-word biography, something that sums up their career, or at least this topic, in exactly six words. For me, as I think of our conversation today about these spooky myths, these zombie myths and misconceptions, my six-word biography might be "I pay better attention than goldfish". How about you? If you could sum up your career in six words, what might it be?

Clark Quinn: Oh, wow. Let me see. Appropriate to the theme of today, “I am a mad scientist of learning and engagement”. [CHUCKLES]

Brian Washburn: I love that. So we’re talking about learning myths here. And just so we can set the baseline, and if we could just first talk about why learning myths can be harmful. You know, when I first went to an eLearning Guild conference and heard this thing about how humans now have a shorter attention span than goldfish, you know, it’s kind of a cute story. It’s kind of a funny story. It makes a point. You see a lot on LinkedIn about millennials and how they’re all digital natives, and we need to cater to the fact that they need technological learning solutions. Why are these myths harmful?

Clark Quinn: I think for several reasons. It would be nice to say, "oh, they're not really harmful; what damage could they do?" But first of all, people spend money on these things. They purchase personality instruments and use them inappropriately. They also waste time: if you develop redundant versions for different approaches, you waste time.

But to me, the most important one is that you can really limit people. You can artificially characterize somebody by their age and say, "oh, you're a millennial, therefore you're this." And there's no evidence for it: when you look at the data on what people value in the workplace, it doesn't change by generation. And the generation cut-offs vary depending on who's talking about them. It's pretty much arbitrary.

And that means we can be, essentially, discriminatory. We can end up categorizing people by something that's out of their control, when we should be categorizing people by their behavior, by what they do. And people can self-limit.

When I label myself with a personality type, that may keep me from trying anything else that might be appropriate. And one of the things we know is that people learn differently depending on what they have to learn, why they're learning it, where they're learning it, when they're learning it, the time of day, the phase of the moon. So if people limit themselves in those ways, they may keep themselves from taking advantage of opportunities.

So beyond just the time and money, we can actually be hurting people. And, of course, we may be doing bad learning design as well. So I think all these reasons suggest that this is something to be avoided.

Brian Washburn: (laughing) Yes, and the mad scientist comes out again. These myths pop up all the time. What is the spookiest learning myth that you’ve heard about recently?

Clark Quinn: You've already mentioned it, you know? That we're turning into fish people, that we have an attention span shorter than a goldfish's. That would be scary if it were true. But evolution doesn't work that quickly, and our attention span is an artifact of our cognitive architecture. And it turns out it's complex. We think we have control of our attention, but there's the cocktail party phenomenon: you're talking in a conversation at a party, and another group is talking somewhere else, but they happen to mention your name, and boom, suddenly your attention is way over there in that other conversation. And similarly, who hasn't surfaced from a book or a movie or a computer game and realized that many more hours went by than they expected? So it turns out the whole thing started with a misinterpretation of data, and then was re-promoted by people relying on advertising material instead of real science. It's just a bad thing.

Brian Washburn: Yeah, yeah. And I’ve heard, you know, kind of, the comparison to Netflix, right? And it takes maybe less than a second to flip through from one option to the next. But that isn’t our attention span. That’s an entertainment preference.

Clark Quinn: We may have more distractors than we used to, but our attention span hasn't changed. Distraction may be easier now, and it may take more effort to maintain people's attention, but that's not a bad thing. It's not that our attention span has changed.

Brian Washburn: You hear about some of these things in incredible places: at conferences, or in papers that people are writing. And it catches my attention. When somebody said our attention span is less than a goldfish's, I was like, "oh my gosh, we need to change how we do things." Or a long time ago, I was developing and delivering programs that spent hours talking about learning styles. And then I read an article by Will Thalheimer that said there's no research that actually backs up the idea that if we create activities catering to specific learning styles, we get better results. So what are some of the hardiest zombie beliefs, the beliefs that just don't die in this field? And why are they so hard to kill?

Clark Quinn: If only they were vulnerable to wooden stakes, we might be able to kill them. Some of the most persistent ones– you mentioned it. “Learning styles” is just incredible. So many people believe it, and it’s tied into the way our brain works. Another one that continues to be persistent is millennials, and that one’s easy to understand.

So with "learning styles", we like to categorize people. And anybody who's ever taught knows learners differ. It would be nice to believe we can categorize that, and it resonates with us: I prefer visual stuff versus I prefer auditory. And I get that.

But it turns out, as you said with Will pointing to the research, that first of all, the instruments that categorize us by learning style aren't reliable. In one systematic survey of a whole bunch of instruments, the researchers picked 13 representative ones and did a psychometric analysis, and found that they can't accurately measure learning styles, which fits what I mentioned about how we change depending on context, what we're learning, and why we're learning it. And they found no evidence that matching instruction to learning styles helps. But it's appealing. Similarly with generations and millennials, you know? It seems like we had these significant events in our lives, but everybody is on a continuum. At your age versus my age, which I suspect are different (you look much younger and healthier than me).

We both experienced 9/11. It was a major point in our consciousness regardless of what generation we are. It seems simple to want to do this. But way back, the ancient Greeks were saying, "oh, those kids these days, they have no respect for tradition." It has much more to do with age.

As a young person, you don't have a lot of experience, for instance, so you want more certifications; whereas when you're older, you have experience, so you don't need as many certifications. You can point to what you've done. Those things matter, but they're not categorizable by generation. And Dale's Cone is another one: that you learn 10% of what you hear, and 20% of what you read, and on up to 90% of what you teach. That's hard to kill. We like simple explanations, and our brains want to categorize. Our brains are pattern matchers and meaning makers; they look for patterns. And that means these things are appealing, and therefore hard to extinguish.

Brian Washburn: And they are very appealing, right? Especially when you see numbers tied to something, like the Dale's Cone figures you referenced. A lot of times when you see that in a Google search, there will be a little source affixed to it as well. It's not just Dale's Cone; it's oftentimes attributed to the National Training Laboratories or something like that. So people see that. They see, OK, we have some numbers, we have a source, let's use it. It's a fun story. It's a neat way to categorize things. I want to believe that it's true. And yet, when you do a little more work, you find that there's actually no research that supports it. It's deflating. But we can't go around teaching bad stuff, the wrong stuff. I want to believe in unicorns too, but they're not there. So what can people who are listening do? What's the silver bullet or the wooden stake they can use to slay some of these monsters?

Clark Quinn: Would that there were a silver bullet, but there isn't. You have to do the work. The process I recommend (and this is my process) is sort of a mash-up of Carl Sagan's steps to detect BS and Daniel Willingham's four steps.

My first step is: give it the sniff test. Does it seem causally likely? Is there a plausible story behind it? If it doesn't pass that, you can stop.

You can look up all the myths that have already been analyzed, but new ones will keep coming up, so you're going to need a process to use. The second step, and this is from Daniel Willingham, is to cut it down to the bare bones: what would this mean I'd do differently, and what different outcome would I get?

And if that doesn't matter to you, you can stop there and go, look, I don't care if it's valid or not; it's not relevant to me. If it passes those two tests, you need to take the next step, and that's to track it back. This is what you were talking about: look a little deeper at who's saying it.

One of the people Dale's Cone was attributed to was Micki Chi, and I happen to know her. Will was the one who actually contacted her; if I'd known that's who it was attributed to, I could have checked with her myself. And she would never have done anything like that. It was a total false attribution, just made up. I think that's kind of evil.

Anyway, when you track back, ask: who's saying this? What is their vested interest? Is this a company telling me this? And what research are they citing? Their own proprietary research, or articles published in a peer-reviewed journal? If it doesn't pass that test, you can stop; somebody has a vested interest in it. You should also ask: who else is saying it? Is anybody else chiming in? Is anybody saying the contrary? And what is their vested interest, and what is their credibility?

Brian Washburn: And can you... I'm sorry to interrupt. Can you talk a little about the difference between peer-reviewed material and stuff that's just out there in pop culture? If we take the goldfish myth, for example, that was actually written about in Time magazine and attributed to Microsoft, which are two pretty credible sources. And yet it's still a myth; the finding really doesn't hold up.

So what makes peer-reviewed material different? How do people access it? And how do they even know to look there when they see something cited in Time and tied to Microsoft?

Clark Quinn: It's complex. Actually, if you track that one back, you see it was Microsoft Canada. Which is still legit, but it was their marketing department. And they had grabbed the story from an advertising-material outfit called Statbrain. Statbrain, in turn, had gone to an actual peer-reviewed published study in Germany that looked at how long people spent on a web page, and then, several years later, how long they spent on a web page. Which is a perfectly legitimate thing to do. But the problem is, because people later spent less time on a web page, they inferred that our attention span had dropped. Can you think of some other reasons why, with several more years of experience with the internet, you might spend less time on a page? The page might load faster. You might be better at recognizing whether a web page is worth staying on. There are a bunch of alternative explanations.

Statbrain took this and attached the goldfish to it. And by the way, we don't really know what the attention span of a goldfish is; we can determine that they do learn. So the story tracks back and ends up at a peer-reviewed journal, but it was misinterpreted. In general, peer review means that, as a scientist, you do a study and submit it to a journal, and the journal gets a panel of your peers to look at it, to review it.

I often do this; I'm an independent reviewer for several different journals in educational technology, research and development, and information science. You review the study anonymously and give the authors feedback. The journal gets several reviewers, and it tends to take several iterations. You give feedback that says, "if you fix this, it's probably worth publishing". Sometimes you say, "this is never going to be worth publishing; throw it out".

And it's not perfect. There's politics involved; reviewers can object if you don't cite the right people. And there is resistance to new ideas. But it's still the best thing we have. There is nothing better than good peer review.

Which leads to the last check. The last step is, if it passes the track-back, you've got to go look at the research itself and look at the method. You've got to ask: what was the question they were trying to answer, and is that the same question I'm trying to answer? Did they use appropriate subjects? Did they have enough subjects? There's a whole bunch of complicated things involved in detecting whether it's a good study. And the problem is that it tends to be written up in an obscure language known as academese. It's like English, but it's impersonal, it's passive, and it's deliberately precise to the point of being obtuse. The alternative to that step is to look to the translators.

These are people who have demonstrated a track record of being able to read this stuff, translate it into real language, and cut through and recognize what's garbage. And these are the people you mentioned: Will Thalheimer, Patti Shank, Mirjam Neelen, Connie Malamed. These people have books and blog posts and podcasts and videos and more.

I have a list of them on the site for the book, Debunking Learning Myths. You can go to the resources page; there's a list of a number of the myth-busters. These are the people to turn to if you struggle, because I don't expect anybody to be particularly good at reading academese unless they've been highly trained for it, like a number of these people have.

So go find the trusted voices and listen to them. And if something new comes up, see what they have to say. And if they haven't said anything, lob it at them and ask, "what do you think about this?" I hope I'm one of those voices as well. But that's the alternative to digging into the academic research yourself.

Brian Washburn: Yeah, I think that's a great point. One of the things I preach all the time is that, as learning professionals, it's really helpful to get on social media and connect with these thought leaders who you otherwise would never have a chance to connect with. Connect with them on Twitter. Connect with them on LinkedIn. See what they're saying. Take a look at their material, because it's often a lot easier than going through peer-reviewed journals. And there are people doing this work; I really appreciate that you gave them the title "translators". And Will hosts his Debunkers Club, right? He has a whole website dedicated to this, which I think is really cool. Clark, thank you so much for giving me some time to talk about these myths, because this is a really important and dangerous part of our field. Right before Halloween, we're doing a spooky-themed podcast, but it really is a serious topic–

Clark Quinn: [SINISTER LAUGHTER]

Brian Washburn: –and I appreciate the thought that you’ve been able to share. Before we end up here, I have a few lightning-round questions so that those who may not be as familiar with you may get to know you a little bit more. Are you ready for a quick speed round?

Clark Quinn: Ah, sure.

Brian Washburn: [LAUGHS] What is your go-to pre-training presentation food?

Clark Quinn: Well, when I went to conferences, which we haven't done for a while, I always felt bad paying hotel prices for a bowl of oatmeal. $10 for a bowl of oatmeal strikes me as, are you serious? So I would bring my own breakfast bar, and I'd also try to have a banana for my mid-morning snack. A breakfast bar, a cup of juice, a cup of tea: that's my pre-conference, pre-workshop food, pretty typically.

Brian Washburn: Keep it light. How about a piece of training technology that you can’t live without?

Clark Quinn: Actually, I think it's OmniGraffle. It's a piece of software that I use to create diagrams. And diagrams, I think, are really important tools for communication, because they take conceptual relationships and map them to spatial relationships. So the ability to create diagrams easily and well, and use them in presentations, is to me a really key part of helping communicate. That would be my go-to.

Brian Washburn: And the software is called?

Clark Quinn: OmniGraffle. There’s two caveats on it. It’s Mac only, and it’s a bit expensive. But for me, it’s been extremely valuable.

Brian Washburn: How about a book or a podcast that people should be paying attention to?

Clark Quinn: For podcasts, Will and Matt Richter are doing the Truth in Learning podcast. For books, I'm really a fan of Julie Dirksen's "Design for How People Learn". And there's a blog: Mirjam Neelen and Paul Kirschner have their 3-Star Learning Experiences blog, which is really worthwhile. They do regular posts about learning science. It's really good.

Brian Washburn: And just as a side note, Julie was the first person I heard actually debunk the goldfish myth, at a conference I went to. And, of course, not only do Mirjam and Paul have their blog, but they also have the book "Evidence-Informed Learning Design", which is right here on my desk. So that might be another book for people to check out as well. Do you have any shameless plugs?

Clark Quinn: Absolutely. So, interestingly, with ATD: the reason I did the myths book is that I didn't want to be the myths guy, but they asked me to write it, and so I agreed. They have now asked me to write a new book for them on learning science, and I'm in the process of going through the final edits before it goes to copy editing. It's sort of a learning science 101 that I think people need to know.

The other thing I'm working on is a workshop, and it'll end up as a book as well. It's about the engagement side, but a little bit deeper than just pushing stuff out; it's about how we make it meaningful. Those are two things I'm working on that I think are going to be fun. And, of course, there are the rest of my books.

Brian Washburn: Clark, thank you so much for giving us some time today. I really appreciate hearing your expertise on this important topic. We took a fun angle on it, but it really is important that people look at what the learning science says, and at what is just fun stuff that maybe needs to be left out of what we're doing.

Thank you, everyone, for listening to Train Like You Listen. It has been another very interesting conversation. If you'd like to subscribe to Train Like You Listen, you can do so on Spotify, iTunes, iHeartRadio, or wherever you get your podcasts. And if you like what you hear, go ahead and give us a rating; that's how other people find us. Until next time, happy training, everyone.

Clark Quinn: And kill the myths!
