Are Online Conferences Worthwhile?

Last week I attended the Adobe MAX conference. I have wanted to attend this conference for years, but it is fairly expensive as conferences go. I am not a true graphic designer, so I have never been able to justify the cost. This year, Adobe offered this conference, free of charge, to anyone, anywhere in the world! The caveat obviously being that it was delivered 100% online. While I had some reservations about this format, I am elated to say that I got a lot out of this conference. Let’s take a look at what Adobe did right to make this online conference successful.

Continue reading

The Monsters We Don’t See

It is Halloween week! While this year’s Halloween will look different than previous years, one of the ways we can always connect is with stories. Sure, the medium may differ depending on the times, and we may not be gathered around a room, but podcasts and blogs are a great way to share spooky stories to keep us up after we binge on candy and prepare to watch scary movies in the comfort of our own homes. This year’s story is about monsters you may not be able to see on your own.

We face these monsters every day as training professionals. They cannot be tackled the same ways the stories of our past have taught us to tame mysterious beasts. They lurk right in front of us: on our computer screens, on our social media accounts, in our books. Staying home cannot protect us from these monsters! This week on the Train Like You Listen podcast, Mad Scientist Clark Quinn, Ph.D., Executive Director at Quinnovation, joins us to spot and fight the monsters that plague so many in our industry.

Transcript of the Discussion with Clark Quinn

Brian Washburn: Welcome, everyone, to the Train Like You Listen podcast. And today we have a very special, very spooky, Halloween-themed podcast. Joining us today is Clark Quinn of Quinnovation. He is also the author of “Millennials, Goldfish, and Other Training Misconceptions – Debunking Learning Myths and Superstitions”. Clark, thank you so much for joining us today.

Clark Quinn: Thank you.

Brian Washburn: Clark, as we always do when we get started with these podcasts, we’ll ask our guests to introduce themselves using a six-word biography that sums up their career, or at least this topic, in exactly six words. For me, as I think of our conversation today and these spooky myths, these zombie myths, and some misconceptions, my six-word biography might be that “I pay better attention than goldfish”. How about you? If you could sum up your career in six words, what might it be?

Clark Quinn: Oh, wow. Let me see. Appropriate to the theme of today, “I am a mad scientist of learning and engagement”. [CHUCKLES]

Brian Washburn: I love that. So we’re talking about learning myths here. And just so we can set the baseline, and if we could just first talk about why learning myths can be harmful. You know, when I first went to an eLearning Guild conference and heard this thing about how humans now have a shorter attention span than goldfish, you know, it’s kind of a cute story. It’s kind of a funny story. It makes a point. You see a lot on LinkedIn about millennials and how they’re all digital natives, and we need to cater to the fact that they need technological learning solutions. Why are these myths harmful?

Clark Quinn: I think for several reasons. It would be nice to say, “oh, they’re not really harmful; what harm are they going to cause?” But first of all, people spend money on these things. They purchase these personality instruments and use them inappropriately. They also waste time: if you develop redundantly for different approaches, you waste time.

But to me, the most important one is, really, you can limit people. You can artificially characterize somebody by their age and say, “oh, you’re a millennial, therefore you’re this”. And there is no evidence for it: when you look at the data on what people value in the workplace, it doesn’t change by generation. And the generation cut-offs vary depending on who’s talking about it. It’s pretty much arbitrary.

And that means we can be, essentially, discriminatory. We can end up categorizing people by something that’s out of their control, when we should be categorizing people by their behavior, what they do. And people can self-limit.

When I say I’m a particular personality type, that may keep me from trying anything else that might be appropriate. And one of the things we know is people, for instance, learn differently depending on what they have to learn, why they’re learning it, where they’re learning it, when they’re learning it, the time of day, the phase of the moon. And so if they limit themselves in these ways, they may keep themselves from taking advantage of opportunities.

So beyond just the time and money, we can actually be hurting people. And, of course, we may be doing bad learning design as well. So I think all these reasons suggest that this is something to be avoided.

Brian Washburn: (laughing) Yes, and the mad scientist comes out again. These myths pop up all the time. What is the spookiest learning myth that you’ve heard about recently?

Clark Quinn: You’ve already mentioned it, you know? That we’re turning into fish people. We have an attention span less than a goldfish. That would be scary if it were true. Evolution doesn’t work that quickly, and our attention span is sort of an artifact of our cognitive architecture. And it turns out it’s complex. If you think about it, we think we have control of our attention, but there is the cocktail party phenomenon where you’re talking in a conversation at a party, and another group is talking somewhere else. But they happen to mention your name, and boom, suddenly your attention is way over there in that other conversation. And similarly, I mean, who hasn’t surfaced from a book or a movie or a computer game and realized that many more hours went by than they expected? So it turns out that claim was totally off base. It started with a misinterpretation of data, and then was repromoted by people focusing on advertising material instead of real science. So it’s just a bad thing.

Brian Washburn: Yeah, yeah. And I’ve heard, you know, kind of, the comparison to Netflix, right? And it takes maybe less than a second to flip through from one option to the next. But that isn’t our attention span. That’s an entertainment preference.

Clark Quinn: We may have more distractors than we used to, but our attention span hasn’t changed. It may be easier to get distracted now, and it may take more effort to maintain people’s attention. But that’s not a bad thing. It’s not that our attention span has changed.

Brian Washburn: You hear about some of these things in incredible places. At conferences, or in papers that people are writing. And it catches my attention. When somebody said our attention span is less than a goldfish’s, I was like, “oh my gosh. We need to change how we do things.” Or a long time ago, I was developing or delivering programs that really spent hours talking about learning styles. And then I read an article by Will Thalheimer that said there’s no research that actually backs up this idea that if we create activities catering to specific learning styles, we get better results. And so what are some of the hardiest zombie beliefs, the beliefs that just don’t die in this field? And why are they so hard to kill?

Clark Quinn: If only they were vulnerable to wooden stakes, we might be able to kill them. Some of the most persistent ones– you mentioned it. “Learning styles” is just incredible. So many people believe it, and it’s tied into the way our brain works. Another one that continues to be persistent is millennials, and that one’s easy to understand.

So “learning styles”, we like to categorize people. And anybody who’s ever taught knows learners differ. And it would be nice to believe that we can categorize it, and it resonates with us. I prefer visual stuff versus I prefer auditory. And I get that.

But it turns out, as you referred to Will’s pointing to the research, first of all, the instruments that categorize us as learners aren’t reliable. When researchers did a systematic survey of a whole bunch of them, picked 13 representative instruments, and did psychometric analysis, they found the instruments can’t accurately measure learning styles, partly because of what I mentioned about how we change depending on context, what we’re learning, and why we’re learning. They also found no evidence that trying to match instruction to learning styles improves outcomes. So it’s appealing, but it doesn’t hold up. Similarly with generations and millennials, you know? It seems like, well, maybe we had these significant events in our lives, but everybody is on a continuum, you know. At your age versus my age, which I suspect are different. You look much younger and healthier than me.

We both experienced 9/11. It was a major point in our consciousness regardless of what generation we are. It seems simple to want to do this. But way back, the ancient Greeks were going, “oh, those kids these days. They have no respect for tradition.” And it has much more to do with age.

So as a young person, you don’t have a lot of experience, for instance, so you want more certifications. When you’re older, you have experience; you don’t need as many certifications because you can point to what you’ve done. Those things matter, but they’re not categorizable by generation. And Dale’s Cone is another one: the claim that you learn 10% from what you hear, and 20% from what you read, and on up to 90% of what you teach. That’s hard to kill. We like to find these simple explanations, and our brain wants to find categorization. Our brains are pattern matchers and meaning makers, and they look for patterns. And that means these things are appealing, and therefore they can be hard to extinguish.

Brian Washburn: And they are very appealing, right? Especially when you see numbers tied to something, like the Dale’s Cone research that you referenced. A lot of times when you see that in a Google search, there will be a little source affixed to it as well. It’s not just Dale’s Cone; it’s oftentimes attributed to the National Training Laboratories or something like that. And so people see that. And they see, OK, we have some numbers. We have a source. Let’s use it. And it makes things– you know, it’s a fun story. It’s a neat way to categorize things. I want to believe that it’s true. And yet, when you do a little bit more work, you find that there’s actually no research that supports it, and that’s tougher. It’s deflating. But it’s also one of those things where we can’t go around teaching bad stuff. The wrong stuff. You know, I want to believe in unicorns also, but they’re not there. They haven’t been there. What can people who are listening do? What’s the silver bullet or the wooden stake that they can use to slay some of these monsters?

Clark Quinn: Would that there were a silver bullet, but there isn’t. You have to do the work. The process I recommend (and this is my own process) is sort of a mash-up of Carl Sagan’s 13 steps to detect BS and Daniel Willingham’s four steps for doing it.

My first step is to give it the sniff test. Does it seem causally likely? Is there a plausible story behind it? If it doesn’t pass that, you can stop.

You need a process like this because you can find out about all the myths that have already been analyzed, but new ones will keep coming up. The second step is to cut it down to the bare bones. This is from Daniel Willingham. Ask: what would I do differently, and what different outcome would I get?

And if that doesn’t matter to you, you can stop there and go, look, I don’t care if it’s valid or not. It’s not relevant to me. If it passes those two tests, you need to take the next step, and that’s to track it back. Ask: who is saying this? And this is what you were talking about, looking a little deeper.

One of the people that Dale’s Cone was attributed to was Micki Chi, and I happen to know her. Will was the one who actually contacted her, but if I’d known that was who it was attributed to, I could have checked with her. He did, and she would never have done anything like that. That was a total false attribution, just made up. And I think that’s kind of evil.

Anyway, when you track back, ask: who’s saying this? What is their vested interest? Is this a company telling me this? And what research are they citing: their own proprietary research, or articles published in a peer-reviewed journal? If it doesn’t pass that test, you can stop; somebody has a vested interest in it. And you should also ask, who else is saying it? Is anybody else chiming in? Is anybody saying the contrary? And what is their vested interest, and what is their credibility?

Brian Washburn: And can you –? I’m sorry to interrupt. Can you talk a little bit about the difference between peer-reviewed material and stuff that’s just out there in pop culture? So if we take a look at the goldfish myth, for example, that was actually something that was written about in Time magazine and attributed to Microsoft, which are two pretty credible sources. And yet it’s still a myth. It really doesn’t exist.

And so what is the difference between peer-reviewed material? How do people access that? And how do people even know to look there when they see something that’s cited in Time and related to Microsoft?

Clark Quinn: It’s complex. Actually, if you track back that one, you see it was Microsoft Canada. Which is still legit, but it was their marketing department. And they had grabbed that story out of advertising material from an agency called Statbrain. And that traced back to an actual peer-reviewed published study in Germany that looked at how long people spent on a web page, and then, several years later, how long they spent on a web page. Which is a perfectly legitimate thing to do. But the problem is, they then inferred that our attention span had dropped because, in the later measurement, we spent less time on a web page. But can you think of some other reasons why, with several more years of experience with the internet, you might spend less time on a page? The page might load faster. You might have more experience recognizing whether a web page is worth staying on or not. There are a bunch of alternative explanations.

Statbrain took this and attached the goldfish to it, by the way– and we don’t really know what the attention span of a goldfish is. We can determine that they do learn. So the story there goes back, and it ends up at a peer-reviewed journal, but it was misinterpreted. In general, peer review means, as a scientist, you do a study. And then you submit it to a journal. And they get a panel of peers to look at it, to review it.

I often do this. I’m an independent reviewer for several different journals, such as Educational Technology Research and Development. You anonymously review the submission, and you give the authors feedback. With several reviewers, it tends to take several iterations. You get feedback that says, “if you fix this, it’s probably worth publishing”. Sometimes you say, “this is never going to be worth publishing. Throw it out”.

And it’s not perfect. There are politics involved in review, like if you don’t cite the right people. And there is resistance to new ideas. But it’s still the best thing we have out there. There is nothing better than good peer review.

Which leads to the last check. The last step in this is, if it passes the track-back, you’ve got to go look at the research itself and look at the method. And you’ve got to ask: what was the question they were trying to answer, and is that the same as the question I’m trying to answer? Did they use the appropriate subjects? Did they have enough subjects? There’s a whole bunch of complicated things involved in detecting whether it’s a good study. And the problem is, that tends to get written up in an obscure language known as academese. It’s like English, but it’s impersonal, and it’s passive, and it’s deliberately precise to the point of being obtuse. The alternative to that step is to look at the translators.

And we have people who have just demonstrated a track record of being able to read this stuff, translate it into real language, to be able to cut through and recognize stuff that’s garbage. And these are people you mentioned. Will Thalheimer. It’s Patti Shank. It’s Mirjam Neelen. It’s Connie Malamed. And these people have books and blog posts and podcasts and videos and more.

I have a list of them on the site for the book, Debunking Learning Myths. You can go to the resources page; there’s a list of a number of the myth-busters. These are the people to turn to if you struggle, and I don’t expect anybody to be particularly good at reading academese unless you’ve been highly trained for it, like a number of these people have.

So go find the trusted voices and listen to them. And if something new comes up, see what they have to say. And if they haven’t weighed in, lob it at them and say, “what do you think about this?” I hope I’m one of those trusted voices as well. But that’s the alternative to digging into it yourself and trying to understand academic research.

Brian Washburn: Yeah, I think that’s a great point. And one of the things that I preach all the time is, as learning professionals, it’s really helpful to get on social media. Connect with these thought leaders who you otherwise would never have a chance to connect with. Connect with them on Twitter. Connect with them on LinkedIn. See what they’re saying. Take a look at their material, because it is sometimes a lot easier than going through peer-reviewed journals. And there are people who are doing it. I really appreciate how you gave them the title “translators”. And Will hosts his Debunkers Club, right? And has a whole website dedicated to this, which I think is really cool. Clark, thank you so much for giving me some time and talking about some of these myths, because this is a really important and dangerous part of our field. Right before Halloween, we’re doing a spooky-themed podcast, but it really is a serious topic–

Clark Quinn: [SINISTER LAUGHTER]

Brian Washburn: –and I appreciate the thought that you’ve been able to share. Before we end up here, I have a few lightning-round questions so that those who may not be as familiar with you may get to know you a little bit more. Are you ready for a quick speed round?

Clark Quinn: Ah, sure.

Brian Washburn: [LAUGHS] What is your go-to pre-training presentation food?

Clark Quinn: Well, when I went to conferences, which we haven’t done for a while, I always felt bad paying hotel prices for a bowl of oatmeal. $10 for a bowl of oatmeal strikes me as– are you serious? And so I would bring my own breakfast bar. And I’d also try and have a banana for my mid-morning snack. So a breakfast bar, a cup of juice, a cup of tea. That’s my pre-conference, pre-workshop food, pretty typically.

Brian Washburn: Keep it light. How about a piece of training technology that you can’t live without?

Clark Quinn: Actually, I think it’s OmniGraffle. It’s a piece of software that I used to create diagrams. And diagrams, I think, are really important tools for communication because they take conceptual relationships and map them to spatial relationships. So the ability to create diagrams easily and well and use them in presentations, to me, is a really key part of helping communicate. So that would be my go-to.

Brian Washburn: And the software is called?

Clark Quinn: OmniGraffle. There’s two caveats on it. It’s Mac only, and it’s a bit expensive. But for me, it’s been extremely valuable.

Brian Washburn: How about a book or a podcast that people should be paying attention to?

Clark Quinn: Podcasts: Will and Matt Richter are doing the Truth in Learning podcast. Books, I’m really a fan of Julie Dirksen’s “Design for How People Learn”. And there’s a blog: Mirjam Neelen and Paul Kirschner have their 3-Star Learning Experiences blog, which is really worthwhile. They do regular posts about learning science. It’s really good.

Brian Washburn: And just as a side note, Julie was the first person that I heard actually debunk the goldfish myth, at a conference I went to. And, of course, not only do Mirjam and Paul have what they’re working on, but they also have the book “Evidence-Informed Learning Design”, which is right here on my desk. So that might be another book for people to check out as well. Do you have any shameless plugs?

Clark Quinn: Absolutely. So, interestingly, the reason I did the myths book for ATD is that they asked me to write it. I didn’t want to be the myths guy, but I agreed. They have now asked me for a new book on learning science, and I’m in the process of going through the final edits before it goes to copy editing. It’s sort of a learning science 101 that I think people need to know.

Also, the other thing I’m working on is a workshop, and it’ll end up as a book as well. It’s about the engagement side, but a little bit deeper than just “it’s harder to push stuff out”: it’s about how we make learning meaningful. Those are two things I’m working on that I think are going to be fun. And, of course, the rest of my books.

Brian Washburn: Clark, thank you so much for giving us some time today. I really appreciate hearing your expertise on this really important topic. We took a fun angle on it, but it is important that people take a look and see what the science says about learning, and what is just fun stuff that maybe needs to be left out of what we’re doing.

Thank you, everyone, for listening to Train Like You Listen. It has been another very interesting conversation. If you’d like to subscribe to Train Like You Listen, you can do so on Spotify, on iTunes, iHeartRadio, or wherever you get your podcasts. And if you like what you hear, go ahead and give us a rating. And that’s how other people find us as well. Until next time, happy training, everyone.

Clark Quinn: And kill the myths!

Taking on the Unexpected

Many of us are being asked to do things that we don’t normally do. Maybe it is working from home while the dog barks at the mailman, or coaching children through school while taking online meetings. If you are in the L&D space, you’ve probably been asked to convert a classroom session to virtual delivery, deliver a session via an online platform, or support others in your organization as they deliver sessions online.

If you are like me, requests to help others have come in more frequently. Even RFPs are asking that we build in coaching on the client’s chosen tool to get presenters comfortable with a session they’ll have to deliver. It’s a great idea! And it takes time. To help us address this hurdle, we’ve started building guides for trainers that speak directly to the challenges of using these tools while maintaining what we know to be best practices for adult learning. The latest guide is the Trainer’s Guide to Microsoft Teams.

Continue reading

Learning Campaigns (podcast)

What is it like to be on the other side of the training? In other words, do your participants have a working world that lives beyond attending your training? In all of my experience, the answer to that question is a resounding yes. In fact, I often have to account not only for meeting the training objectives, but also for making sure there are several ways for learners to access information, and for finding various ways to prompt them to engage with those tools, events, and resources.

The more we can reach our learners, the more likely we are to be successful in our training outcomes. This week on the Train Like You Listen podcast, Amy Lou Abernethy, President, Co-Founder and Chief Learning Strategist at Amp Creative, stops by to talk to us about how we can use learning campaigns to increase learner engagement and promote a learning culture.

Top 200 Tools for Learning in 2020

Last week I shared several tools that I’ve found my children’s teachers using for online school activities that I thought could be helpful for those of us in the L&D field. Today I want to continue with the theme of tools we can use by talking about Jane Hart’s annual list of top tools for learning, which was released at the beginning of September.

New Technology in L&D

I’m always intrigued by Jane Hart’s list because this is where I have a chance to see what technologies others are using, and I am sometimes inspired to bring something new into my daily practice.

I was intrigued to see both Netflix (for documentaries) and Spotify (for podcasts) break onto the top 200 tools for learning. There are also a variety of new tools on the list that may help with virtual staff meetings, strategic planning sessions and presentations, such as Mural and Miro, which are both online whiteboarding tools.

I’m kind of wishing I had written this post last week so that I could have discovered ilovepdf.com earlier. It’s a quick and easy way to convert PDF files into editable documents such as Word, PowerPoint or Excel files with, as stated on its landing page, “almost 100%” accuracy.

There are also several new mindmapping, email and game/survey tools to check out as well.

Old Favorites

When you consider that this list of the top 200 tools covers tools used by both corporate trainers and classroom educators, there is nothing among the highest-ranked, most popular 20 tools that surprised me. YouTube, Zoom, Google Search, PowerPoint, Microsoft Teams, Word, Google Docs/Drive, LinkedIn, Twitter, WhatsApp, Wikipedia, Facebook, Excel, WordPress, Google Classroom, Google Meet, Slack, Canva, Skype and Trello make the top 20.

Other tools still in popular use among the top 50 (in case you were wondering if some of your old standbys were growing out of date) include Kahoot (for games and quizzes), Prezi (it actually surprises me that this is still so popular, coming in at #39), Snagit (for screen captures) and Vyond (for animated video creation).

Further down the list, at #182, you’ll find Pixabay. It’s a site I use every week when I’m looking for imagery for this blog or for my PowerPoint decks. If you haven’t stumbled upon it yet and you’re on the lookout for free stock images, definitely give it a look.

Tools for Learning I Plan To Try

My favorite audience participation tool is PollEverywhere, though I was recently exposed to Mentimeter (which comes in at #26 on the list). I’m not sure if it’ll give me something extra, but I’d like to check it out and see why it’s so popular.

I mentioned Mural as a whiteboarding tool. When I’m in person, I love to use a flipchart, whiteboards, and sticky notes to help organize my thoughts and play with ideas during meetings. In this world of COVID and virtual meetings, this could be a handy tool.

I’ve also just downloaded Snip & Sketch, which appears at #86 on this list. It’s a free download for Windows and is Microsoft’s replacement for its Snipping Tool.

If you have a chance to check out this list of top 200 tools for learning, I’d love to hear which tools you’re using, and which tools sound like they could help you with your learning and development programs!


Want to try out a tool that can help you generate training activities – whether you’re delivering virtual sessions or you’re returning to in-person training? Perhaps Soapbox will appear on this top 200 list next year.

Resources for an eLearning Department of One (podcast)

What is the size of your training department? The biggest team I have ever worked in was ten people. I was at a worldwide non-profit, and we served thousands of employees and board members. Even so, that was considered a rather large training department. As I ramped up in the training world, attending conferences and integrating myself into the network of learning and development professionals, I quickly met people who not only had much smaller groups, but whose teams often consisted of merely one or two people.

As trainers, we often talk about wearing more than one hat at work. But how do you navigate all of the challenges you face when you don’t have a big team? Emily Wood, author of eLearning Department of One, joins us on the Train Like You Listen podcast this week to share some resources and tips for small eLearning departments.

What Can Trainers Steal from Online K-12 Instruction?

Back in March, schools abruptly closed and went online. It was a messy experience for students, teachers and parents. This fall, however, many schools and teachers have done an amazing job finding new educational technologies and navigating their classes through less-than-ideal circumstances. As I sometimes catch myself spying on my children to see what online school looks like these days, I find that some teachers are using technologies I’d never thought to use (or hadn’t even heard of).

I think there might be some lessons and technologies we, in the world of learning and development, can adopt from these online school experiences. Here are two recent examples that I’ve seen my children’s teachers use.

Continue reading

Troubleshooting for Trainers

Do you walk into every training development project knowing exactly what needs to happen to make it a success? If you are like me, probably not. As a junior trainer, a lot of my lessons were learned from failure and feedback. While those are wonderful ways to learn, it isn’t always ideal to put yourself or your team at risk of failure if it can be avoided. Is there a way to be proactive about troubleshooting your next training event?

Sophie Oberstein, author, coach, adjunct professor, and L&OD consultant, joins us on the Train Like You Listen podcast this week to discuss how you can find solutions to training problems.

Make sure to check out her book, Troubleshooting for Trainers, which is available October 6, 2020.

Listen using the player below. Please leave us your thoughts in the comment section or on Twitter @train_champion.

Give Soapbox a Try to Change the Way You Train

Trainer’s Guides for Zoom and GoToTraining

In the spring, the whole world seemed to need to move to online meetings and virtual training, almost overnight. On April 1, we asked Train Like A Champion readers how comfortable they were delivering virtual training, and the results came back, split down the middle. Half of the respondents to the poll chose “I’m super comfortable” and the other half chose “I’m pretty shaky”.

My guess is that with practice and having virtual sessions become more of the norm, people are much more comfortable now than they were six months ago.

Just like with other tools (PowerPoint comes to mind), feeling comfortable using them and feeling comfortable leveraging all of their features for maximum impact are two different things. For that reason, my colleague Lauren Wescott has spent time studying the most commonly used virtual platforms and has begun to generate a series of quick reference guides on how to get the most out of the platform you may be using. The first two guides focus on Zoom and GoToTraining.

Continue reading

Can Curiosity Be Taught?

Like many other parents right now, I have children at home who are learning online. While our school is doing a good job with this new approach to early childhood education, screen-time limits and other obvious factors have me playing the role of a part-time teacher to a fourth grader and a second grader. While my forte has always been training adults, I am noticing a lot of overlap between our young learners and adult learners.

One of these overlaps is curiosity. Facilitating and training people, young or adult, to be curious is important, but is it really an outcome that can be trained and measured? On this week’s podcast, we talk to Bethany Kline from www.Rover.com about her approach to training learners how to be curious, and how she applies her methods to scale innovation across an organization.

Continue reading