Unknown Speaker 00:00
Thank you all for not drifting away after lunch. Not yet anyway, right. And you know, talking about data can either be scary or boring or exciting, depending on where you're coming from. So, I'm Kate Haley Goldman.
Unknown Speaker 00:15
Where are you from?
Unknown Speaker 00:18
I'm from HG&Co. And we all have similar backgrounds here.
Unknown Speaker 00:24
I'm Elee Wood. I am from The Huntington Library, Art Museum, and Botanical Gardens.
Unknown Speaker 00:34
And I'm Cathy Sigmond. I'm from RK&A, which is a research and evaluation firm.
Unknown Speaker 00:39
So the genesis of this session was around data, obviously. But for us, and for me particularly, there's a large amount of exhilaration around how we're talking about data currently. When I first started coming to MCN, we never talked about visitor data. We never talked about visitors, to be frank. MCN was really about the back of the house: how we dealt with the technology in the back of the house, how we talked about collections, what metadata we would use, Dublin Core, a wide variety of things, but there was no audience within that. And in the fifteen to eighteen years since I had that first experience with MCN, this organization and these topics have changed so much. In every session I go to there is discussion of visitors and of audience, thinking about people's needs, thinking about data and what we know about our institutions, whether it's from the interactives on the floor to the people coming through the door. And that's exciting. It's really exciting for all of us. It's also a little terrifying, and that double-edged sword is the genesis of this conversation. We think we have a shared worldview in thinking about data, but we know that we're not the majority. Ideas about data and where they come from are very different when you're coming from different perspectives, and I'm not even talking cultural perspectives or types of institutions, but the critical-thinking lens that you use when you come toward data. We've heard us as a field start to slip towards jargon in the way we talk about data: well, we have data, so then we have answers. And for us, there's a large missing gap within that, and we encounter it every day within our work. We could give a long, dry piece on here's how you start a study, and here's how you define things within it.
And we decided that, while we were very capable of doing that, we would instead like to focus on the situations that we find ourselves in regularly, and that we expect you might find yourself in, and on the ways we would navigate out of those situations if we were sitting with you. You're not at the point, perhaps, of starting a study; you've got something, and you're trying to figure out what to do with it. So as a field, we'd like to up our game in terms of how we deal with data. We'd like to deal with it more ethically, more rigorously, more thoughtfully, and without wagging our fingers at you. We're going to start with talking about how to deal with it a little bit more thoughtfully. Yeah,
Unknown Speaker 03:59
I think just one point of clarification, because I know there have already been some really awesome sessions just today alone on data; this morning's plenary was on data governance and data ethics. We're thinking in this room right now primarily about data generated from visitors. So that's the lens we're going to use: research on visitors, data related to how visitors experience whatever is in your museum. And that could range
Unknown Speaker 04:26
from attendance data, to back-end data from interactives, to membership data to some extent, but less so metadata from collections, although some of the same principles apply. Okay. We've introduced ourselves. So we're going to give you six situations that we have, and we're going to start with situation one,
Unknown Speaker 04:53
just start with situation one. This is real life; people were harmed to some extent in the situation at hand, but not physically. So the first situation is that you may enter an institution that doesn't really understand why it needs to collect data in the first place. Story to tell: I arrived at an institution with the mission to develop a web-based curriculum for teachers on American history and art. And I said, well, what do you know? Why do people want this? And I got crickets, and looks like, why would we ask that? As an educator, my first thought is, well, if we want people to use it, we should figure out how and why they want to use it. If anyone was in the room earlier for the Museums for Digital Learning initiative this morning, it was super cool, because I was like, look, they asked what people want. So when we think about where your institution is in terms of wanting to know why you're doing something, that's the first place. If you are at level zero, you might want to move up a level in terms of being able to say: why do we need this? What do we want to see happening? Why does this matter? What do we need help with? Where can we take this? So jump one is actually having a reason to collect, and an interest in collecting, data. It's actually very helpful. You want to add anything? I would say that
Unknown Speaker 06:36
the small doses of data are catnip, but the big doses of data are overwhelming. So when I'm trying to get an organization interested in these pieces, I'm carefully and narrowly, regularly introducing small pieces of consumable popcorn data for leadership, things that don't take me a long time to produce, so that we slowly move toward generating further interest. Yeah.
Unknown Speaker 07:03
And I think sometimes a lack of interest in collecting data stems from worrying about how much time and resources it's going to take, not from an actual lack of wanting answers to the questions you have. So in this situation, if you ever find yourself in it, a good starting point is dropping the word data out of the conversation and just starting to talk to your colleagues or your higher-ups about what their questions are, about where they're struggling in their work. Probe them and ask: what are your big questions about how we're moving forward as an institution, either holistically or in your department? What do you feel like you need to know, or would be really nice to know, to help us move forward? And from there, propose collecting data to help solve that problem.
Unknown Speaker 07:58
We're not advocating collecting it just for the sake of collecting it, because there's no point in collecting things you're not
Unknown Speaker 08:03
going to use. Well, and when I say popcorn data, I mean someone else's: did you see this fabulous
Unknown Speaker 08:08
study? Oh, we promised we were not going to get more stories going. Next one. Okay, so situation number two: you have some data lying around, but you have no idea how good it is or how useful it is. Again, true story; people were harmed in this, but not physically. We had a conversation, and we said, gosh, it would be really great if we knew how many people came to certain ticketed events and how many people didn't come. And someone said, we have all that data. And someone else said, great, send it to me. And guess what I got: 50 or 60,000 data points, and none of it was useful. That was a very painful process to go through. So one of the problems we have is that we collect these things, we don't know why, we get the data from somewhere else, and then we don't know what to do with it, because it's just numbers, just info, just stuff. It doesn't actually tell us anything. And that's one of the other problems we have to think about: how do we actually know how to see what's here, how to understand what it means? Getting to meaning is a long process that we tend to forget as part of what data actually represents. So as we think about it: what is it that we need? What could we change? What do we need to actually fix? A lot of data requires cleaning before it's even useful. Taking my 50,000 data points, I could narrow them down; maybe I got 1,000 that were useful. And what does that mean? It's still only useful if I have a way to answer the question that I had. Oftentimes we're also collecting data for a question that we don't actually have, or we're not thinking about what we will use it for, or we're using somebody else's data to answer a different question, which is also a problem, because we don't know how or why they started collecting that information. So is it reliable? We don't know how reliable it is for our own purposes.
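To make that cleaning step concrete, here is a minimal sketch of the kind of triage that turns a raw export into usable records. The field names, rules, and sample rows are all hypothetical; a real ticketing export will have its own quirks, but the pattern of dropping incomplete rows, deduplicating, and normalizing values is the same.

```python
# Hypothetical raw export from a ticketing system: duplicates, blanks,
# and inconsistent values, which is typical of data collected without a
# question in mind.
from datetime import datetime

raw_rows = [
    {"ticket_id": "A1", "event_date": "2019-10-05", "attended": "yes"},
    {"ticket_id": "A1", "event_date": "2019-10-05", "attended": "yes"},  # duplicate
    {"ticket_id": "A2", "event_date": "", "attended": "no"},             # missing date
    {"ticket_id": "",   "event_date": "2019-10-06", "attended": "yes"},  # missing id
    {"ticket_id": "A3", "event_date": "2019-10-06", "attended": "NO"},   # inconsistent case
]

def clean(rows):
    seen, usable = set(), []
    for row in rows:
        # Drop rows missing the fields the question actually needs.
        if not row["ticket_id"] or not row["event_date"]:
            continue
        # Drop duplicates of the same ticket.
        if row["ticket_id"] in seen:
            continue
        seen.add(row["ticket_id"])
        usable.append({
            "ticket_id": row["ticket_id"],
            "event_date": datetime.strptime(row["event_date"], "%Y-%m-%d").date(),
            "attended": row["attended"].strip().lower() == "yes",
        })
    return usable

records = clean(raw_rows)
print(len(raw_rows), "raw rows ->", len(records), "usable records")
# → 5 raw rows -> 2 usable records
```

Even this toy version shows the ratio the speakers describe: most of the raw points fall away, and the survivors are only meaningful if they can answer the question you started with.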
Unknown Speaker 10:04
Yeah, exactly. And I think, you know, if you find yourself in a position where you have someone else's old report, or maybe it's not literally in report format, but you've got data in some format that's a little old, and you weren't around when it was collected, but you're being given it and told to use it to move forward: really be critical when you look at it. In what context was this data collected? You might not have all the answers, but see what you can discern from what you've got in front of you. Do all the same conditions still exist in your institution now? If everything has really changed a lot since it was collected, maybe it's not something that's that useful for you, and it's okay to realize that and to say that. Okay. So this next one is probably a situation that most of us have found ourselves in at one point or another. You have an actual project, and you know you want to collect data from visitors to answer a question or a set of questions. And really, those questions you want to answer revolve around: how well is this working? They're all different versions of that. So this is where the need for a rigorous approach comes in. From a social science background, we might call this an evaluation plan, which is a term you might or might not have heard. But whether or not you want to use that framework, really the most important thing to remember is to spend time planning, and in that planning, to think really critically about what exactly you want to know and why you want to know it. So the first step is really self-reflection on what those questions are that you want to answer.
So what are those questions you have about visitors' experiences? Articulate them and make sure they're airtight. I don't literally mean the questions that you're going to go out and ask visitors; I mean the questions that frame why you're collecting data, or what you're collecting data on, in the first place. Making sure those are really airtight questions that can actually be answered in the situation around you is really important. Those initial questions are going to be the gauge by which you measure success when you get to the point of analysis and looking at the data. Another thing, when we talk about rigor in going out and collecting data: in the social sciences we talk about triangulating data, which is a fancy way of saying having data coming in from multiple sources. It's basically looking at a phenomenon, say, what is happening with visitors on this website, or visitors' experiences with this app, or in this program, or in this exhibition, and asking: what are the different ways we could collect data on that experience to answer that question? We might collect data in some quantitative way through a survey; we might look at analytics. But we might also find qualitative ways to gather data on visitors' experiences, through observations, through interviews. Using multiple methods like this is what we call triangulation, and it's the strongest way to build a reliable study, data that you actually can use, because you've thought critically about what collecting data in different ways will tell you about the visitor experience; each will tell you different things. So as much as possible, when you're seeking to go out and collect data, it's great if you have one source of data, but push yourself to think: is there another way I could look at this?
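A toy sketch of what triangulation buys you: two independent data sources describing the same exhibits, lined up side by side so each can check the other. The exhibit names, numbers, and thresholds below are invented for illustration only.

```python
# Hypothetical data on the same phenomenon from two sources:
# mean survey rating (1-5) and median dwell time from timing-and-tracking.
survey_ratings = {"Dinosaur Hall": 4.6, "Textiles": 3.1, "Photography": 4.2}
median_dwell_minutes = {"Dinosaur Hall": 11.0, "Textiles": 12.5, "Photography": 3.0}

def triangulate(ratings, dwell):
    combined = {}
    for exhibit in ratings.keys() & dwell.keys():
        combined[exhibit] = {
            "rating": ratings[exhibit],
            "dwell": dwell[exhibit],
            # Flag exhibits where the two sources disagree (illustrative
            # thresholds): those merit qualitative follow-up, such as
            # interviews or observations.
            "disagree": (ratings[exhibit] >= 4.0) != (dwell[exhibit] >= 8.0),
        }
    return combined

for exhibit, view in sorted(triangulate(survey_ratings, median_dwell_minutes).items()):
    print(exhibit, view)
```

In this made-up example, Textiles gets low ratings but long dwell times, and Photography the reverse; neither source alone would have surfaced that tension, which is exactly why a second method is worth the effort.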
And that is a really big part of approaching things rigorously, and of feeling later on, even a few years down the road, that it holds, that it's something reliable you can look back to and keep using moving forward. Anything to add? Yeah,
Unknown Speaker 13:53
and, you know, another thing to think about when you are collecting data to answer these big questions about visitors' experiences: you want to think about how you're going to approach dealing with that data once you've got it in your hands, because it's one thing to collect it. We talk about this all the time: how much time analysis really takes. It's way longer than most people think. And we get that it's not necessarily your job to analyze data; you might never want to do that. The point is more to understand that it really takes a long time. Elee mentioned data cleaning; survey data, for example, comes back a lot messier than you think it's going to be, and it just takes a lot of time to sift through that and deal with the messiness before you can even begin to analyze what it's telling you, and then what it means for you and your work moving forward. All of this is a lengthy process. So as excited as you might be to actually have the data in your hands, the point I really want to leave you with is that analysis is a huge part of the equation and is not something to be rushed.
Unknown Speaker 15:06
Yeah, I'd say, and I think some of you have heard me say this before, that 90% of the time that I've had a project go off the rails, it's because there was not enough time spent in planning for the study, the program, or whatever else it was. We did not spend significant time on that. The actual collecting of the data is, for us, the smallest part of any piece. And so when people tell me that this can be automated: yeah, some of it can be automated, the collection of the data. But we spend, bare minimum, three days for every day that you're conducting interviews, and that doesn't even get you to the next stage in terms of cleaning and analyzing. For me, that was part of the fear when I was talking about how exhilarating this moment is, where we're becoming a data-rich society: data by itself is nothing. It's absolutely nothing if it doesn't tell a story. I get tweets and emails full of "we have data." Okay, so then what? In our mind, from a social science perspective, even after you've gone through that data, analysis and interpretation are two different phases. Analysis is really summarizing that data: running statistical tests on it, or, if it's qualitative, maybe taking all the keywords from your interviews and looking at the trends. Interpretation is something different. Interpretation is when you start to generate meaning, and that, in my view, cannot be automated. That's the richness that we have as a field, as a profession: bringing our experience to that analysis and understanding whether it's important or not. So getting to that very first step of having information is great. We as a society have an enormous amount of information right now; we are lacking in meaning.
So the next step in developing that is not simply developing skills on how to handle data, but on how to think about data critically. That's really about the thinking skills and applying them. And I would love to see that developed in our field as the next part of this wave of information we're having.
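The "analysis" half of that split, summarizing qualitative data into keyword trends, can be sketched in a few lines. The interview quotes and stopword list below are invented; the point is that this mechanical summarizing step is automatable, while the interpretation step, deciding what a trend like "crowded" actually means for the institution, is the human work the speakers describe, and it is deliberately not in this code.

```python
# Summarize invented interview snippets into keyword counts.
from collections import Counter

interviews = [
    "the exhibit felt crowded but the labels were clear",
    "labels were too small and the room was crowded",
    "loved the interactive, though it was crowded on saturday",
]

# A tiny illustrative stopword list; real qualitative coding uses far more care.
STOPWORDS = {"the", "but", "and", "was", "were", "too", "it", "on", "a"}

def keyword_counts(texts):
    words = []
    for text in texts:
        for w in text.lower().split():
            w = w.strip(",.")          # shed trailing punctuation
            if w and w not in STOPWORDS:
                words.append(w)
    return Counter(words)

counts = keyword_counts(interviews)
print(counts.most_common(3))
```

Seeing that "crowded" dominates is analysis; deciding whether crowding is a success problem or a capacity problem is interpretation.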
Unknown Speaker 17:44
Yeah, that's a good segue to the next slide. Sure.
Unknown Speaker 17:48
Okay, so situation four: you don't really have a specific question, but you need or want to understand your visitors and their experiences. You're not sure exactly what you want to focus on, but you are hungry for more information, and you know collecting data is going to get you an answer. You're excited about data, and that's super awesome; it's a good starting point to be in, but it can definitely be overwhelming. I think the best way to approach this kind of situation is really to try to back up, set aside that excitement about collecting data for a moment, and ask: why? Why collect data now? What is really driving your desire to find out more about visitors' experiences now? This may be a demand coming from your higher-ups or your colleagues; maybe you're excited too, but you're told: we don't know anything about visitors, we need to find things out, so please go find things out. I'm sure that's happened to more than one person in this room. Even working as researchers, we sometimes get those kinds of requests from potential clients, people who just want to know. And I think that's great. But really, the first thing that we do is ask a lot of questions. If someone is coming to you and saying we need to find out more, we do a lot of probing. We say: why? Why now? Why is this itch there? And we get people talking about what it is that's really driving their interest in finding out more about visitors. There's probably a lot to unpack there, and people probably do have a really specific question. Or if it's just coming from you, check yourself: do you really not have a specific question? I actually kind of doubt that that's ever the case. I mean, I think it's awesome. I love data.
I love finding out as much as we can. But there's always something driving your interest to find out more, and it's just asking yourself: why is that important? Okay, I want to know this, but why is that important to know? Going through that process, either with yourself or with the people who are asking you to collect data, is really the first step in navigating this situation.
Unknown Speaker 20:03
I think I often walk into a situation where "we need to know more about our visitors" is coming from on high, and the person talking to me does not necessarily have any authority to go up to the CEO and say, why do we need to know more about our visitors? So then we're put in the position of not having any idea what it's for, and crafting something bad, some sort of garbage-in, garbage-out, where we're not actually getting at the true need. And then the higher-ups are frustrated, and we've spent a lot of time and energy going in circles. So we work on different ways to talk about that process and its constraints without sounding challenging. It's one thing to query yourself and say, why do I want to know this? It's another thing to go to your boss's boss and say, why do you want to know? What is up with that? So this discussion about "why now" and other pieces is a way to ask softer questions that help surface the underlying need. It may be something like: is there a particular event, or report, or meeting coming up that shows this need at the moment? Who is going to be reading the results? Is this a donor? Is this a board member? These are softer questions that put you in the position of being a good person trying to find out more about what you've been tasked with. But really, what you're trying to do is suss out the motivation, because "find out more about our visitors" and "find out more about our visitors for a specific donor" are very different charges. You would think that was very basic, but we are given this charge without the context more often than with it. So the more questions you can ask that are not challenging but contextualizing, the more you will hopefully surface the true need. Yeah. This is me.
So, situation five. I think this happens to a lot of people, maybe not as often as it happens to us, but has anyone else come across this, where your institution hands you someone else's study?
Unknown Speaker 22:30
And I just want to say, we also regularly have to look at other people's studies and go, huh, what do we think of this? So we're with you,
Unknown Speaker 22:37
right? And so we need to think about how to educate ourselves. You don't necessarily need to be able to do a scientific reading of the study, but you need to make the call: is this something that I want to pass out to my team, or is this kind of sketchy, so that we really don't have any idea whether these are true findings, whether this was based on some sort of biased sample from an institution that's not like mine? There are tells, of course, with any of these pieces. If you're quickly going through things, you're not sitting there saying, well, I don't know if the effect size is large enough, because I know you're not doing that. I'm not doing that when I'm scanning these on a first pass; some of the scientists on our team might be, but I'm not going to be good at that. On my first pass, I'm going to be looking for the tells. Cathy actually brought this up when we were going back and forth, so I'm stealing her quote. I still read, yes, the canon, and my tell is: "Ginny!" said Mr. Weasley, flabbergasted. "Haven't I taught you anything? What have I always told you? Never trust anything that can think for itself if you can't see where it keeps its brain." A good study is going to articulate the workings of its brain on the face of it, even if it's a blog post, even if it's something that you're seeing and clipping. One of the more obvious pieces of that is that it will list the biases of the study. Every reputable study that's put out there should, at some place, be able to talk about what the biases were in the collection, the analysis, and the interpretation of the data, especially within the sampling and the frames. And that's separate from the methodology.
Unknown Speaker 24:37
You might also see them called limitations, because "bias" sometimes gets a weird connotation. A limitation is something we couldn't do: we couldn't talk to all the teachers in the world, and therefore this is not about all teachers everywhere; this is not about museums outside of the United States, or whatever it might be. And I
Unknown Speaker 24:56
think when Kate says it's separate from methodology, what she means is that methodology is just literally describing what you did. How did you collect the data? We did interviews with X people, from this time period, starting then and ending then, in this context, located here; here is exactly how we recruited the people, and here's our interview guide, the questions that we asked, or at least the questions we started with. It's the what-you-did, but it's not a broader framework situating that process. The limitations add a layer on top, the surrounding context that is missed if you just list what you did. So if you're not
Unknown Speaker 25:34
seeing that upfront, then you're going to have to go through and figure out what the biases are. They may or may not be buried within the report, but already, whoever has written it is not being transparent about what is there. And I think our culture is very guilty of this. We would like to advance something that is much more about transparency of the data: not just how it was gathered, but how it was analyzed, what lenses or frameworks were used, and then how it was interpreted. The AR workshop that we were in just before lunch was really talking about how one of the benefits of some of the newer ways of analyzing data is that you can have these tracks, this transparency, to be able to share the work with other folks. You should see that in the way people talk about their work in the paper, and that should give you confidence to share it with your colleagues. Yeah,
Unknown Speaker 26:37
and every study has limitations. No study is without limitations. So seeing a limitations section, or a section that describes bias in some way, is actually not a marker of a poor study; it's a marker of a really strong study, because it means the people who conducted it thought very critically about what they were doing and what they were not doing. They were as rigorous as possible about what they did do, but they acknowledged that they could not possibly look at everything. Yeah,
Unknown Speaker 27:03
the other thing I would say is that if you are not statistically inclined, and you start reading things that make your eyes go crazy, it's okay if you don't know it, but you should know enough to know when you don't know what you're looking at, and to decide how to ask someone else for help. That's part of the leveling-up process: knowing where you're starting from. If you have some basic knowledge, what more can you do? How do you start to look further into understanding, even if it's just at a super basic level? You can still do more to learn how to read things, and to be critical and question them.
Unknown Speaker 27:40
So the last one, and we went back and forth on whether this actually happens to people other than us. Do people encounter evaluation plans in grant proposals, RFPs, or panels that you may sit on, where you're asked to vote on a project or give some sort of critical review? This is not necessarily your expertise, and it may be about how you're going to handle that. Sometimes it happens when you're looking for best practices and trying to figure out where a project stands. Mostly, what I am looking for when I'm examining this, for, say, a grant panel, is the clarity of critical thinking around the strategies they develop. Sometimes I look at the sample size, but that's not the first or even the tenth thing that's important to me; if they haven't listed it, that may be a little odd, but really, it's about the clarity of thinking, of their questions as aligned to the project. For me, it's like an intellectual puzzle, which is why I enjoy what I do: does this strategy appropriately address what the project is doing? And I think we can all slowly get better at that. Does gathering this kind of data really represent well what we're trying to learn? Most of the time, when people are writing grants, they have limited space, and if I were writing a plan for these folks, it certainly wouldn't be half a page; I'm cutting out a lot to write something for a grant. But nonetheless, you should be able to see that alignment through the programs, and a rationale for why the strategies they are proposing answer that question. If you see strategies with no rationale, then you're not able to see the brain behind them; you can't see why they're collecting data in that way.
And so often I see: we're going to do a couple of focus groups, because everyone loves to call everything focus groups, and then we're going to do an online survey, and then we're going to do this, and there is no why in that. So you're looking at it and thinking: either they think I'm not smart enough to know that these things are all different, or they haven't thought it through and are just throwing ingredients together to make a plan and move it out. And that's not where we want to be.
Unknown Speaker 30:32
Yeah, I mean, take this term "evaluation plan" with a grain of salt. You might find yourself reviewing one in the context of a grant, like Kate was discussing, but really, what this point is about is: someone presents you with a strategy for collecting data. That's all an evaluation plan is: a plan for collecting data, how you go about it, and why you are going to do it. So you might be presented with one in a much less formal way; your colleague comes to you saying, hey, we want to find out more about X in our museum, and we're going to go about it this way. That's the evaluation plan they're presenting you with. So think really critically: why did they develop that strategy for collecting data? Is it really going to answer the actual questions that you have? Are they just looking to data sources that are easy to gather, versus what might actually be best? Are they triangulating data, like we talked about earlier? Ask these critical questions about the strategy that is proposed, in whatever format you receive it.
Unknown Speaker 31:33
I always like the shorthand: What's your question? What information do you need to answer that question? How are you going to get it? And how do you know that's going to help you answer your question? As long as you have those put together, you've got something.
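One way to keep that shorthand honest is to write the plan down as a structure in which every data-collection method must carry its own rationale, so that "methods without a why" jump out. The class, field names, and example plan below are all hypothetical, just a sketch of the idea.

```python
# A hypothetical structure for the panel's shorthand: a question, plus
# methods each paired with the rationale for why it answers the question.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    question: str                                # What's your question?
    methods: dict = field(default_factory=dict)  # method -> why it helps answer it

    def missing_rationale(self):
        # Methods listed without a "why" are the red flag described above.
        return [m for m, why in self.methods.items() if not why.strip()]

plan = EvaluationPlan(
    question="Do teachers use the web curriculum, and why?",
    methods={
        "web analytics": "shows which lessons are actually opened and finished",
        "teacher interviews": "surfaces why lessons are or are not usable in class",
        "focus groups": "",  # no rationale yet, so this gets flagged
    },
)
print(plan.missing_rationale())
# → ['focus groups']
```

The structure itself is trivial; the value is that it refuses to let a method into the plan without the why that the grant reviewers in this discussion keep looking for.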
Unknown Speaker 31:46
And I, of course, think of it as a cooking analogy: I wouldn't serve my CEO raw ingredients, and I would not put raw ingredients into a plan and assume you understand that I'm going to make a lasagna from them. When you have data, or just methods without rationale, you haven't done any of the cooking, which is the analysis and the interpretation. It does not become a meal until you have done that. In the planning stage, that's about the strategy; in the finished stage, it's about the interpretation. So, we are getting close to discussion. We have a manifesto, or takeaways, and it is really about data literacy. We do want to define data literacy from our point of view, what that means. We think that MCN can be the place where we're conversant around this process and all of the stages within it, and that that should be a goal for us personally, and something to model within our teams. And that comes stage by stage.
Unknown Speaker 33:12
I think, from my perspective, it is about knowing where you're starting from and where you want to get to next. You don't have to go from zero to 100; you can go from zero to one in one area, and maybe someplace else you have more developed capacity or knowledge, and that's okay. It's really just about continuing to look at where you're at and how you can get further along in your knowledge base, so that you become conversant. It's the same way you really hope, when you go into a meeting with an educator asking "but what do the people want?", that they actually understand a few things about the technological components and the strategies. We all need to be more conversant in so many different types of work in an interdisciplinary kind of world; this is the same, the give-and-take part of it. So if you are a gamer, you might think about it as when you level up: you get some new powers, you get new access to things, but you're not quite there yet. You really want to get to that next thing, but you have to keep working at it. And I think that's the message we're looking at here: you really have to practice where you want to get to and find a way to develop your skill set.
Unknown Speaker 34:23
If you want to go through the
Unknown Speaker 34:25
Yeah, well, we've done some of that.
Unknown Speaker 34:26
Yeah, I think she touched on a lot. Yeah.
Unknown Speaker 34:28
So one of the things we were talking about is that I think people feel insecure in the portions of this that are not just the collecting of data, right? It's, you know: I don't know statistics. I don't know how I'm going to do this. I don't know how to frame the right questions within that. And I would say that this group is particularly well situated for a more agile, more iterative approach to data literacy. Each study, each piece that we have, each time we're in Google Analytics, we become a little more conversant, and that allows us to move forward. Each of us regularly can look back on studies that we've done in years past and see that sort of progression, right? You should be able to see that your work gets better over time.
Unknown Speaker 35:38
Yeah, and I think another point to make, just thinking about becoming what we call data literate, really conversant in how to deal with data, how to talk about data, how it's collected, how to plan for collecting it, and what you do with it after you've got it: it's a little bit about slowing down, but it's not about slowing down to the point of not collecting data or not moving forward. There's a lot of really amazing, rapid iteration that happens on designs or products or programs, or whatever your bread and butter is in your work, and there's something to be said for moving in a very iterative and quick way. Thinking really critically about data doesn't impede that, right? You can apply this rigorous thinking, this critical lens of "is this really the right strategy for collecting data?", in any situation, and it shouldn't preclude you from doing a lot of rapid prototyping, let's say. I do a lot of work with what I call formative evaluation, or front-end evaluation, for exhibitions, for media that goes in exhibitions, but also just for text that gets written for exhibitions, and really anything that could find itself in that kind of space. And I know we've all been there as well. That kind of stuff happens: you test different versions over and over. But when we do those kinds of studies, and we do call them studies, there's a rigor to how we've decided to test each one of those pieces out, even if we're swapping out versions every couple of days based on what we learn. There's always a rationale for why we've decided to do focused observations plus interviews with this number of people: because we're trying to answer these questions, and we think that's the best way to get there. There's always a rationale.
And so I do want to stress that we're not advocating spending so much time doing all of the planning that you never move forward and actually do the work. That's not really what being data literate is. I mean, we do love to sit down and hunker down with a good study, but that's why we are researchers. When I think about being a data literate person, I certainly don't think about it as stopping all of the amazing rapid prototyping and testing work that happens in the tech space.
Unknown Speaker 37:43
And we hope that the field will slowly move toward impact, and looking at the impact of the work we do. I think that's where we as a field really lag. We spend a lot of time crafting audience needs and motivations around the products, programs, and exhibitions we develop, and we spend very little time afterwards thinking about whether that was successful in a deep and meaningful way with our visitors. So hopefully, as we get through this practice, we can do that a little bit more.
Unknown Speaker 38:16
I think some really basic questions that we used to work with in a different place were: What's working, and why? What are people doing with it as a result? What are people actually doing with this, where did they take it, and why does that matter to what we meant in the first place?
Unknown Speaker 38:37
Yeah. And I think also, the way that we think about impact is impact on your visitors, because they're who your institution serves, first and foremost. Your mission is what you do, but your impact is the result of what you do on people. Thinking really clearly about that can be applied to your whole institution, but you can also articulate it for one phenomenon, right? For one app, for one website, for one whatever. What do you hope the result of what you do is? What's the intended result on people? What do you want them to take away? Starting with that, and then backtracking to all the rest of what we've been talking about: okay, so what does that mean for the way in which I will collect data, how I will analyze it, and how I will interpret it moving forward?
Unknown Speaker 39:26
It takes practice, right? Like making a lasagna, it takes practice. So with my teams, we've worked on monthly data exercises: we will get out something and say, okay, everyone's going to sit with this, and tomorrow at lunch we're going to talk about what each of us thought it meant. So we're going to practice the interpretation part of this. Sometimes we've taken it from someone else's study; informalscience.org has a lot of studies, and there are other places to get them. We often alternate between our own studies, so we can improve our own work, and someone else's study. And it's fascinating how we can sit around a table and come to completely different conclusions based on the same paper as we're trying to pull those ideas through, right? Learning how to analyze and interpret that portion of it is done in conversation with your peers. And in our current emphasis on data, we haven't made any space for that growing portion, learning how to interpret. That's something we researchers practice all the time, right? It's not something we would expect people who are not researchers to be doing, but you have to make space for it every once in a while to become data literate. I find it fun, but you know.
Unknown Speaker 41:01
Yeah. So I think now it's time for questions.
Unknown Speaker 41:05
Now's the time for questions. Gosh, you know,
Unknown Speaker 41:07
this was back. Crickets, crickets? Yeah.
Unknown Speaker 41:12
She's got it. We should? Oh, no, that's the big one.
Unknown Speaker 41:17
Are you ready? She's poking me.
Unknown Speaker 41:21
What are you trying to impact? Yeah.
Unknown Speaker 41:22
Tell us more. Okay, well,
Unknown Speaker 41:29
So: we have this qualitative goal of inspiring curiosity. How will we know? How do we know we have the right way to collect data? Well, it comes back to a question: what does it mean to be curious? How do we know? That's the reason we keep asking those questions: well, what do you mean by that? What does curious mean? How do you know someone is inspired to curiosity? If we can break it down a little bit, then we might actually be able to say, okay, off the top of my head, curiosity looks like someone who takes one idea, applies it to something else, and wants to take that idea and go farther and deeper and read ten books about it. So what does that mean we need to know? Well, then we actually have to find out: what ideas got you excited? And what did you do with those ideas when you were done with them? Then we have to figure out: where does that data come from?
Unknown Speaker 42:31
Yeah, many places. Part of it, as you were saying, is a process of articulating really what you mean by that first, and then thinking: okay, we've articulated what we want to achieve, what we want for visitors. How might that show up? In an ideal world, what might we hear people saying, or see people doing? What will we see to let us know that that's happened? Being really clear about that, and it might be many different things. It should be many different things, again, from many different angles. Then you back up and say, all right, what data collection methods do we need to employ to find out whether or not those things happened, and go from there. So, like we talked about at the beginning, it's so much about planning. And again, it doesn't have to happen so slowly that things never get done. But that planning phase is so important: really articulating what you mean, and thinking about what you actually
Unknown Speaker 43:28
might see, and planning for it. Otherwise, in the process of doing it, at the end, when you cook, you realize: oh gosh, we could have figured out some of these things if we had planned for them at the beginning.
Unknown Speaker 43:38
Yeah. I mean, the classic question is: what do your visitors know, think, feel, do, or believe differently because they've seen your exhibition? If that is inspiring curiosity, then that is what you're talking about. And, classically, many of you are from this community, right? The classic marker in museum learning is that it takes time to manifest, and it often manifests through conversation with others. So we spend an inordinate amount of time interested in how people talk to others, and what they talk to them about. But for curiosity particularly, it could manifest in Wikipedia searches, right? So there are some very concrete pieces around that, that you may or may not want to define. And that can be some rough-and-tumble going out to the visitors and saying: so, just talk to us, what did you find engaging? And developing your metrics from there, right? Because Wikipedia sounds like a much more approachable question than "inspiring curiosity," but to many people they seem like the same thing.
Unknown Speaker 44:49
I want to jump in on this, because we're tackling something very similar to that. Our top-level strategic objective is to nurture critical thinking. So we went through a process with lots of people to codify that, to define what it means for our work, and for us that means these eight dimensions. And we've started a process of trying to evaluate for that on the floor through observation, which primarily means, for each one of those dimensions, we say: well, what might that look like? What would curiosity look like on the floor? What would asking good questions look like? What would analyzing data look like? And then we observe for that. So that's our baby step into the process. And I'm just curious who asked the question, because I would love to connect more with you later on. The Discovery Place in Charlotte has also been trying to tackle this, I think around the concept of wonder, for the past few years, and has been doing some really interesting work around that as well.
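A floor-observation pass like the one just described, where each dimension is translated into concrete behaviors and observers tally what they actually see, could be sketched roughly as follows. The dimension names and behavior codes below are invented examples; a real study would define them through the codification process the speaker describes:

```python
from collections import Counter

# Hypothetical behavior codes for two dimensions of "critical thinking".
# These mappings are made up for illustration, not an actual framework.
DIMENSIONS = {
    "asking good questions": {"asks_why", "asks_what_if"},
    "analyzing data": {"compares_results", "reads_graph", "counts_outcomes"},
}

def tally_observations(observed_codes):
    """Count how often each dimension manifested in a list of coded behaviors."""
    counts = Counter()
    for code in observed_codes:
        for dimension, codes in DIMENSIONS.items():
            if code in codes:
                counts[dimension] += 1
    return counts

# One visitor group's coded behaviors from a made-up observation session.
session = ["asks_why", "reads_graph", "asks_what_if", "reads_graph"]
print(dict(tally_observations(session)))
# -> {'asking good questions': 2, 'analyzing data': 2}
```

The value of the exercise is less in the tally itself than in being forced to write `DIMENSIONS` down at all: deciding, in advance, what "asking good questions" looks like on the floor.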
Unknown Speaker 46:03
Thank you. Curiosity is firing, like, long term; it might not have manifested yet.
Unknown Speaker 46:07
Yeah. Well, a lot of research to back that up. Yeah.
Unknown Speaker 46:13
Well, so, part of it is just knowing that, right? You're a museum, and you cannot do a longitudinal study of people throughout their lives. Well, you can, but you really probably don't want to: you probably can't afford it, and nobody will stay in the study. And so in the evaluation world, we talk about short-term outcomes versus long-term outcomes and long-term impacts. Part of your planning up front is articulating not just what you might see, but when you might see it, and knowing that there are some things you won't necessarily be able to observe manifest, because they'll happen months later, when you can potentially still contact someone, or they'll happen years later, let's say, far after the study has concluded. So acknowledge what those things are, but also think critically about what might be the inklings that it is starting to happen, because those things will happen in the shorter term, and that's what you're able to see.
Unknown Speaker 47:15
I mean, I wouldn't discourage you from doing that, right? There are a number of good studies going on about wonder. I've done studies on awe.
Unknown Speaker 47:26
They're clearly in art museums.
Unknown Speaker 47:30
Right. And those sorts of pieces: just because it's emotional or mushy doesn't mean that it isn't an impact we're looking for, and one that we could seek out.
Unknown Speaker 47:42
Yeah, we won't get into the nuances of different types of research. But some ways of collecting information are better for certain types of concepts like that.
Unknown Speaker 47:52
Definitely. And I think one thing to think about, too, is that big concepts like curiosity, wonder, awe, things that seem so nebulous, can actually be very rigorously broken down into things that you might see. It just takes a lot of time to get there. But you really can articulate, you know, maybe ten different ways you might see that manifest, that collectively translate into wonder. Some of those things you might see right away, some might be later, and they all build on one another. You actually can push yourself to really articulate what you mean and what you might see, because it's going to go a long way in terms of collecting valuable data for moving you forward. A couple of others? Yeah. Hold on, hold on.
Unknown Speaker 48:40
So it's kind of two parts, beginning and end, because I work in consultancy as well. First of all, do you have any advice for the organizations here when they're writing an RFP and asking people to help them with research, and as part of the response they require an evaluation plan when no discussion has been had at all? And then on the back side, at the end, when everything's said and done: this idea of being conversant, and how important speaking in plain English is in the way you write your reports. Like, you were talking about methodology and limitations and things like that; most of the time, the people I report to just want the executive summary, and we're like, but I went through all this trouble.
Unknown Speaker 49:28
Yeah, results are top of mind. I mean, that's why you know, you want the answer. So
Unknown Speaker 49:32
Yeah, I guess, advice on that early bit, when an organization's sending out an RFP, and then you can do the final stuff.
Unknown Speaker 49:42
I mean, my personal take, and several of you have heard me say this, is that our work, applying a social science mindset to museums and to informal learning, has changed more rapidly in the last 10 years than any of the museum technology that I see here at conferences, right? So I know the technologists' world has changed, but we have really changed, and have a lot more change to do, in how we report out useful, actionable pieces in plain English and in visual reports. Our field is changing through that. The American Evaluation Association is one of the best professional organizations I've ever come in contact with, with things like placemat reports and different ways to move forward in this type of reporting, and we have seen the field change through that. It is an issue of being actionable, for me: I think our reports have been too academic in that way, and therefore frustrate the people that we work with, as a field, my field. Yeah.
Unknown Speaker 51:02
Well, and speaking to the first part of your question, about what you can practically put into an RFP or a request of some sort: a couple of really basic things come to mind. Some sort of statement or description of how you anticipate using the data collected, regardless of the methods chosen, how you might use this in your work moving forward. We know that might change later on, but at the time of putting out the RFP, put in a line or two about how you would hope to use this, because that's the first thing we will ask when we get there, so it'd be nice to have a little bit of a heads up. Then some sort of statement saying you want multiple methods; if you have any in particular in mind for a certain reason, that's good to know, too. And then the one thing is that a lot of RFPs I've seen have a tight timeline. There are many reasons for that, a lot of them out of your control, and we get that. But having heard us talk about how long the analysis and interpretation process takes, and it's much longer than data collection, understand that and try to build it into the timeline, or at least acknowledge that you're going to give, at minimum, equal time for data collection and data analysis. I don't know exactly how you'd write that up, but thinking about those things as you're building the RFP translates well into how we would respond to it.
Unknown Speaker 52:30
But I think the other key thing goes back to the question of the hidden agenda: why is it that you are collecting evaluation data in the first place? What is it that you intend to do? Why are you doing it? It doesn't always have to be about impact, depending on what the purpose of the project is. It could very much be about: did the system, our new operating process for doing a project, work internally? So you can look at things on a lot of different levels, and it comes back to: why do you need to know it, and what are you going to do with it? And if it is because the grant requires it, if you go back and look at why grant makers require it, it is often to be sure that you have someone else helping you think about what you've done, so that you're not just making it up as you go. It's a checkpoint, a way of balancing out: have you thought about all the pieces? An advisory panel is the same idea in a different strategy. What is the purpose of the evaluation? What are you going to do with it? How are you going to use it? Where do you take it next? Is it something to inform your process, that you learn from and develop further? I think those are the kinds of questions you really want to be thinking about as you write.
Unknown Speaker 53:42
Kathy reminded me about the tight timeline piece. The institutions, I mean, my peers here included, the people that you're going to want to book are going to be booked much further out than the RFPs often allow. And so the smart institutions that we work with will call us and say, hi, I'm going to have an RFP in about two months, we're kind of thinking of going here, just a heads up, right? And they'll do that. I know they're calling various people to say that, and I'm like, oh, that's interesting, let me think about that. I might go see your exhibition, I might see this sort of piece. So then the thinking process is happening, so that the turnaround is better. Carrie?
Unknown Speaker 54:33
Carrie. There's a mic. The mic isn't there. We're going to try to be good about using that. New friends. Thank you.
Unknown Speaker 54:41
Hi, I'm Carrie. I work at the American History Museum at the Smithsonian, and I just wanted to chime in and say yes, the evaluative mindset is super important. And because we're museum technology people, we're already kind of in that mindset, and we'll find a lot of willing partners in changing culture at our museums among museum educators, because a lot of museum educators are already in this mind space and are ready to go. So it helps to have tech-savvy partners who can help them get the data they need and help us take action on it. I'm in both camps, both a digital person and an educator. And a good resource I have on my shelf as a museum educator is John Jacobsen's book Measuring Museum Impact and Performance, available on Amazon. I'm not getting paid by him. But it's wonky, it's super wonky, right? If you're looking for, like, what is a metric I could use to measure joy, there are all kinds of community impacts in there. So you might find that helpful.
Unknown Speaker 55:42
Yeah, thank you for sharing that. And I love that callback to educators. Elee and I both come originally from an education background. And I think that even if you are not an educator and don't plan to be one, but you work with them often, or just want to understand their world more: so much of what museums do is promote learning in different ways, and we articulate what that means in our own spaces, great, but there is so much education research out there. That's a huge field to look to for rigorous research. It falls under that social science umbrella we were talking about. So, thank you for that. Yeah, great educational research.
Unknown Speaker 56:19
And I have just a couple of closing comments, and then we're happy to answer questions afterwards. When I started at the Institute for Learning Innovation, and John Falk asked that I create a technology evaluation portfolio for them, I was the only one in the institution doing it. But I was very keenly aware that that would go away, that all evaluators would be involved in technology, because technology was going to be involved in everything, right? It wasn't going to stay this separate thing, the website and nothing else. And that's happening right now with data, for all of us, everyone in this room. I may be the person who analyzes data for a living, but you will all be involved in that in some way or another going forward. That's the way our world is moving. And so figuring out how you want to handle that is part of the coming piece we're facing, where we have a glut of data and a deficit of meaning. Yeah.
Unknown Speaker 57:26
That's why we call on everyone to be data literate. And I mean, even the sessions earlier this morning, that plenary panel, was another version of that, just coming at it from a different side. But how important it is to know about what data is being collected, the reasons why, where it's coming from, and what to do with it. That's really all we ask for. Do you want to put our info up? Yeah.
Unknown Speaker 57:49
I was just gonna say, if you need a lesson in this, the useless maps meme is a really great starting point for thinking about it. I don't know if everyone's seen it, and I don't have an example to show, but there are maps made of ridiculous information that's not very helpful. And I think if you can think about, why does this matter, and what does it mean for what I need to know, that's really the best place to start in making your plan for how to get to the next phase of your understanding. Because it's everywhere, and all of us need to know, because it affects our lives in ways that I think we are all aware of more than ever before. So being literate is not only good for your work, but also for yourself.
Unknown Speaker 58:38
And you mentioned a good resource. If anyone wants more specific resources, like, where do I look just to look at other studies, just come talk to us after and we're happy to tell you where to look. Yeah. Great. Thank you. So, to turn on