Eric Longo 00:00
Good morning, everyone. It is Thursday, November 19, and I think it's the last day of our six-day, two-week little experience of our first virtual conference, MCN 2020. And I hope you've enjoyed the conference so far. I'm just gonna check how many people we have in the room. Carolyn?
Unknown Speaker 00:28
We have 119. Oh, very nice.
Eric Longo 00:31
Okay, 120, good. Okay, so as people are still kind of congregating: there are a few sessions today, and I think this one is about five or six. And then we're doing closing remarks, myself with Mitch Sava and Yvonne Lee, at 2pm Eastern, so come back for that. So as I said, this is the last day, and this session is last but not least. Welcome to "Digital platforms are not neutral," an MCN roundtable about the responsibility of institutions when selecting digital platforms. The title, of course, is a riff on the now famous Museums Are Not Neutral campaign. As the COVID-19 pandemic forced people to work remotely, Zoom took the world by storm, and in spite of known privacy and security concerns, users embraced digital platforms, tacitly consenting to giving away personal data for convenience and free service. As our institutions now navigate their engagements with tech companies, are we exposing our visitors to risk? What responsibility do we as museum technologists have in using platforms ethically? The idea of the session originated in late April, when our MCN virtual Town Hall, held on Google Meet, was porn-bombed towards the 53-minute mark into the call. Of course, we ended the call. Some found that amusing, a sign of our times; I personally have a particular antipathy for Google Meet and pretty much anything else Google, and I thought it was pretty pathetic. And here we are, museum digerati who know a thing or two about technology and its known vulnerabilities. But as consumers or users of digital platforms, how often do we rationalize, oh well, at least it didn't happen five minutes into the call? So I reached out to Marty first about this idea of doing a session. And I think what we need to realize is that the agency that each one of us has to participate in certain online platforms and social media channels doesn't extend to the institutions where we work.
Personally, I decided some years ago that I would stay away from social media altogether, except for LinkedIn. However strong my dislike for Twitter and Facebook may be, as head of an organization that has a presence on these platforms, I just can't unilaterally decide not to use them. And the same goes for all of you who work in an organization, institution or business, or even a consulting practice: sometimes you have to use these platforms. So the session looks at what our responsibility is as museum technologists in using platforms ethically, or, I would say, how we can minimize some of their worst outcomes and side effects on our users. One word about the scope of our conversation this morning. As you can see on that slide, from one of our many rehearsals leading up to this session, we internally looked at the different areas of digital practice in the museum impacted by these issues of privacy and persuasive design technology, and we decided that for this session our focus, our position, really was about end users, once known, or still known, as visitors. So we'll be looking at this issue from, you know, a user-centered approach. Joining us this morning to talk about these complex and thorny, but incredibly important, issues is a group of extraordinarily talented and forward-thinking colleagues, friends and digital practitioners: Marty Spellerberg, Matt Popke, Nikhil Trivedi, Sarah Wambold and Dana Allen-Greil. I'll let them introduce themselves. One word about the format: we wanted this session to be as interactive as possible, so the panelists will dialogue with each other, and we'll leave about 20 minutes at the end for questions and answers. That's why this is a meeting, not a webinar. Please mute your mic, and I'll help facilitate the questions and answers. On that note, who wants to go first?
Marty Spellerberg 05:11
Hi, everyone. My name is Marty Spellerberg. In a shout-out to Keir, I'm an undifferentiated white guy with a beard, and I identify with he/him pronouns. So I'll kick us off by telling a story that I think encapsulates the problem. Do you remember a couple years ago, when Zoom hacked everybody's computers, and it was kind of a big deal? You know, they rooted Macs; they made it so that if you uninstalled Zoom, it didn't really uninstall Zoom, they still had root access to your computer. This bothered me at the time. And I thought, well, there's a lot of competition in this space, I don't have to use Zoom. And when Zoom took off in the spring, I was steadfast: I did not install Zoom on my laptop. But I made it about seven months. And then my university said that online classes were going to be on Zoom, and so I really had no choice but to install it. And so this is a discussion about our responsibility as leaders, or people with influence inside institutions: what choices are we making that impact our users? It's a very broad topic, and I'm sure your mind goes to a lot of things. We've been thinking a lot about the free and paid services that we adopt. This could be anything from, you know, Google Analytics putting tracking codes on our sites, to using Amazon for our hosting, all of the problems that are associated with the big internet platforms, software that we buy from vendors, and our engagement with social networks. We should also think about the data that we collect on and store about our users, and how much we are really in compliance with GDPR. But I want to bring Matt into the conversation. Matt, I know you've done writing on this topic, and I wonder about your perspective on how we got ourselves into this situation.
Matt Popke 07:58
Hi, yep. For those who don't know, my name is Matt Popke. I am a developer at the Denver Art Museum in Denver, Colorado, which sits on the unceded lands of the Arapaho, Cheyenne and Ute peoples. And how did we get here? There are so many things that we did. First, I think it's important to remember who "we" is in this case. You know, it's not just we as museums, but we as a whole technology sector and as society as a whole. And we made a lot of decisions over the last few decades that have led us to this position where we don't really have a choice anymore. We made a lot of short-term choices about using free services without asking what the real cost of those services was. You know, Google Analytics came around in the early 2000s, and we said, sure, let's start using it. Facebook came around and offered us a low-cost marketing channel for reaching people, rather than the high-cost marketing campaigns that we were used to, you know, billboards and magazines and TV and all this stuff, and we went for it. It was just a lot of little choices that we as a society made that ultimately led us here; we just kept digging this hole. And now we're at the bottom of this hole where we don't have a choice other than Facebook or Google for a lot of things. We don't have a choice other than Amazon for a lot of things. Because we allowed these companies to give us this poison apple of free and cheap services, free and cheap tools, and we didn't ask what the long-term cost was going to be. And I don't think that we as individuals, or we as institutions, are necessarily to blame for this. This was, you know, a death by a thousand cuts: a lot of little choices that didn't seem significant that in aggregate added up to become something significant. And I think so much of it is rooted in an outdated legal notion, and that's the notion of informed consent. You know, we've all seen these terms of service documents, these terms of service agreements that we've all agreed to, on everything from the services that we use to the music software that we listen to.
And I don't think that anyone really understands, even if they do read that entire document, what they're consenting to, what the full ramifications and consequences of these networks really are. Most of us don't have a really good understanding of how these things work, let alone how they work together. And to ask everyone in civilization to suddenly and systemically think about everything when deciding if they want to use Apple Music, or talk to their grandmother over Facebook, is a really big ask. And legally, we don't have the right to renegotiate these terms. You know, we agreed to them, and the only thing we can choose to do is stop using these tools. But once these tools become such an integrated part of our jobs and our lives, how do we make that decision? It's almost as though we were lured into a trap.
Eric Longo 11:22
Can I just, sorry to interrupt, but we were supposed to do introductions real quickly before we dove into the meat of it. So why don't we just do that now? When done, Matt, you can continue, you can finish your section.
Unknown Speaker 11:38
Eric, I actually wonder if maybe we should just do intros as part of the rounds. So let's let Matt and Marty keep going, and then each of us can do intros at the beginning of our rounds.
Unknown Speaker 11:55
Over to you. Sorry. Oh, well, actually, I was just gonna throw this back to Marty.
Marty Spellerberg 12:04
Well, so there's a question of how all of these concerns impact our communities. And I would be very interested to hear stories from y'all, if there are incidents or harms that you are aware of. So please bring those forward and share them. One bit of context that I'd like to bring up is that as we went into shelter in place, we all moved our programming online as much as we could, right? Institutions did. And so I want to put a big picture out there about the digital divide. In terms of harms, when we go online, who do we exclude? The FCC says that there are about 19 million Americans without broadband, and the number could actually be double that because of poor reporting. And Pew Research estimates that about 15% of households with school-aged children don't have broadband. So even in our choice of the internet as a platform, we're excluding groups of our audience. So I'm very interested to hear about first-order harms that directly impact our communities. One that I heard about recently was how data gets sold. There's an app called Muslim Pro, and it was very useful for people in their religious practice. However, that app was collecting data and selling that data, and that data was being purchased by law enforcement who wanted to surveil these communities. So that's a direct line for how a practice directly harms its users. I'd be very interested to hear about more of those. I think we're all also very aware of the second-order harms. You know, we're on Facebook, we're on social media, and we're in the news reading about how detrimental that is to our democracy and to our society. You know, Amazon hosting costs pennies.
It's fantastic, but we're also hearing about the abject working conditions in other parts of the company. Those are American examples; there are certainly many stories from companies in other countries, and I won't go too deep into them for fear of throwing rocks from a glass house, but I think we're quite familiar with those as well. Personally, I run a small art gallery, and I made the choice to get off of Facebook simply because I was doing the social media for the gallery, and I found it just quite a toxic place that I didn't want to be on. And being that it was my gallery, I could choose that. I also similarly chose to remove Google Analytics from all of the properties that I control, because I knew that I wasn't actually basing my decisions on it. Eric mentioned that we put together a list of resources, which he dropped in the chat. Yeah, and I just want to call attention to it: I put some references in there about ethical decision-making in business, and I think perhaps we could revisit the idea of corporate social responsibility in this context for museums. But okay, back over to Matt. How do we dig ourselves out of this hole that we've gotten ourselves into?
Matt Popke 16:41
Well, I think the hole analogy is really good, because I think the first thing we need to do is stop digging. When Google offers us another free tool, maybe don't use it. When Facebook or Instagram or any of the others come up with a new service, maybe ask tougher questions. We have a really deep hole that we have to climb out of, but if we just keep digging, it's only going to get harder. So I think that's the first step. I think another thing is we should provide alternatives. It's one thing to engage our audiences over Facebook; it's another thing entirely to require our audiences to engage with us over Facebook. Anything that we offer through one platform that we think might have data or privacy implications, we should try to also provide on another platform that doesn't. And if we can't provide something on a platform that doesn't, maybe come up with a different idea. I think we have three questions that we should ask when evaluating services, and then I'll be done here. One: what is the worst thing that could be done by the worst people if they got their hands on the data that you're going to collect, or that your partners are going to collect? Two: why do you want this data, or why do you want to use the service that involves the collection of data? And then three: given the answers to one and two, do we actually want to use this service? I think so often we skip the first question, of what's the worst that could happen, what are the possible consequences, and we skip straight to what the benefits to us are. And if we just ask that question and make it part of our evaluative process, I think it might change a lot of our decisions. So much right now is that we've made these choices without asking that question. And if we just start there, hopefully, I'm optimistic, I think that's a good place to start.
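[Editor's note: the three-question evaluation described above can be captured as a lightweight pre-adoption checklist. The sketch below is illustrative only, not something the panel proposed; every name in it is hypothetical. The point it encodes is Matt's: the worst-case question must be answered before the benefits question, and adoption defaults to "no."]

```python
from dataclasses import dataclass

@dataclass
class ServiceEvaluation:
    """One record per service or platform under consideration."""
    service: str
    worst_case: str   # 1. Worst thing the worst people could do with the data
    why_wanted: str   # 2. Why we want this data or service
    adopt: bool       # 3. Given 1 and 2, do we actually want to use it?

def evaluate(service: str, worst_case: str, why_wanted: str) -> ServiceEvaluation:
    """Refuse to proceed until the worst-case question has an answer."""
    if not worst_case.strip():
        raise ValueError("Answer the worst-case question before listing benefits.")
    if not why_wanted.strip():
        raise ValueError("State why the data or service is wanted.")
    # Question three is a human judgment; default to "no" until deliberately flipped.
    return ServiceEvaluation(service, worst_case, why_wanted, adopt=False)

evaluation = evaluate(
    "ExampleAnalytics",  # hypothetical vendor name
    worst_case="Visitor-level browsing data could be subpoenaed or resold.",
    why_wanted="Aggregate page stats to prioritize content work.",
)
print(evaluation.adopt)  # False until a human deliberately overrides it
```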
Eric Longo 18:34
Thank you both. So next you're in conversation with, Susan, is that right? I mean, Sarah, I forgot. Or, who are the members?
Unknown Speaker 18:45
Oh, Dana and Sarah. That's right. Yeah. To talk about social media and the choices on those platforms. Yeah.
Sarah Wambold 18:52
So I'll start. I'm Sarah Wambold. My pronouns are she/her. I am a white, awesome-aged woman with brown hair. I'm wearing a blue shirt, and I'm sitting in my living room; there's a lamp above me and a piece of art on the wall behind me. I am an executive producer at the Met, and I've been working remotely from Denver. Dana, I'll throw it to you.
Dana Allen-Greil 19:23
Hey, Sarah. Hi, everybody. I'm Dana Allen-Greil. And there's my phone listening to me. I'm the Director of Digital Strategy at the Monterey Bay Aquarium. My pronouns are she/her. I'm a white lady of awesome age, also, wearing a T-shirt in my bedroom. And I'm coming to you from the homelands of the Ohlone Costanoan Esselen Nation, past, present and future. I was interested in joining this session; I can tell from the participant list that a lot of you came to or participated in a panel last year when we did a whole deep dive into the ethics of social media in museums, so we'll touch on that in the context of neutrality today. As you saw from that initial slide, we've met several times and had some really great conversations, and one of the key areas of concern we identified is around persuasive design. So I was hoping Sarah could tell us a little bit more: what is persuasive design? Why does it concern you?
Sarah Wambold 20:34
Um, so, very quickly, it's an area of design practice that essentially is trying to influence your behavior through design choices. So, gestures that we're probably all very familiar with, like the pull-to-refresh, which we know is very reminiscent of a slot machine; the infinite scroll, in which the cues to stop consuming have literally been removed; and then my own personal peeve, which is, I think, called the typing awareness indicator, the three little dots that let you know that a message is coming to you. And, you know, those gestures are put there, and there are many others, to get you addicted to using these products and these platforms, and they're highly, highly effective. And so that is a problem kind of in and of itself. It reminds me a lot of cigarettes: cigarettes are built to addict you to the product in order to get you to buy more of the product, right? But in the case of social media, there's a supercomputer running in the background that's watching all of your moves across all of these platforms in order to predict your behavior and serve you more content, to, again, get you to use the product more, to get you to make a purchase, to get you to stay longer. And that's what the business model is really built on. So, all of those things combined, when you look at it, you might think, well, it's creepy when my phone listens to me, and that was very apropos. It's creepy when I talk to my partner about needing a new pair of shoes, and then Instagram serves me an ad for shoes that I might buy. It's also a bit convenient for me, right? And that's the kind of insidious nature that I find really disconcerting about these technologies. And, you know, the more addicted you are, the easier you are to manipulate, and the more money you're making for someone else. And that business model, I think, is what makes this so problematic.
Dana Allen-Greil 22:55
Um, can you bring that back to your work in museums? Having those concerns, how did they guide your decisions at work at the Met?
Sarah Wambold 23:06
Thank you for bringing that back to a very specific, less abstract idea. You know, at the Met, we measure ourselves using the same metrics that these companies in Silicon Valley are using, right? We're looking at pageviews, session duration, pages per session, retention rate, conversion rate, and we use those to justify the work that we're doing. At the Met, we also regularly run A/B testing to figure out what you want and what will keep you engaged. So we're not as sophisticated, but we're using the same tools. And I should say here that I am 100% part of this problem: I co-authored a big pan-institutional study with Marty that tracks thousands of users across two dozen museum sites. But what I've been really trying to think through is: museums are not capitalist enterprises, so what is all this data really adding up to? And what systems are we supporting through our participation in these activities? It kind of begs the question, why are we doing this at all? And I think that's really what Matt is posing. And, you know, Matt said something about choice, which I think is really interesting. Dana, I know you at the aquarium have recently made a choice to opt out of Facebook advertising through the Stop Hate for Profit campaign, and I wonder if you could tell us a little bit about that. Yeah, I
Dana Allen-Greil 24:42
mean, I think when it comes to social media in particular, we tend to act like we don't have a choice, and the reality is we do have a choice. As many of you have mentioned, we've made personal choices; maybe you've made institutional choices. I think on the spectrum of choices there are not a whole lot of good ones, including opting out. But we did, at the aquarium, participate in the Stop Hate for Profit campaign in July, which was organized by a number of organizations, including the NAACP and the Anti-Defamation League. And the gist was: hey, everybody paying money to Facebook, the company, for advertising, and I say Facebook, the company, because Facebook, as you know, also owns Instagram, just don't advertise in July, and we're going to make a list of demands. And those demands are things like: stop allowing lies in political ads, close down groups that are associated with violence, and better support victims of severe harassment on your platforms with real-time support. Those are three among the campaign's list of demands to the company. In July, it generated a lot of support from big brands who spend a lot more money than the Monterey Bay Aquarium on advertising. It did catch the attention of Facebook and generated meetings; I'm not gonna say it generated great solutions. I think the other nuance here is the aquarium did not, like, officially sign our name on to the campaign. So we participated, but we didn't put our name on it, and we didn't say anything publicly about it. So please don't tweet about that, by the way. Part of that is because we were not prepared to answer any subsequent questions around: so are you going to divest from Facebook? Are you going to advertise in August? We were about to reopen, we thought, we're still closed, and that is a platform that's super important for us to engage our core audiences.
And we were not prepared to answer questions about the long-term sustainability of divesting from Facebook.
Sarah Wambold 27:02
Did you stop posting, like, organic posts on Facebook during this time?
Dana Allen-Greil 27:10
That is a great question. We did not. And this is where this campaign was sort of a piece, one thing we could do, but it's part of a long conversation for us about social media overall, but particularly Facebook, and the harm that is caused by us engaging on that platform, and also the harm that would be caused by us not engaging on that platform. So no, we did continue to post, and we continue to have a relationship with Facebook, the company, in that, for example, they actually use our live streams, our webcams, and they pull that content in free to their Facebook Oculus Venues, which is a paid platform. So we are in some transactional relationships with Facebook that we have not completely shut down. But I want to give an example of what we grapple with on Facebook. For example, our mission is to inspire conservation of the ocean. It's our job to help share our research about climate change and how the ocean is involved in climate change, which is sort of the greatest potential threat to humanity. And every time we post about climate change, because that word has become so politicized, the anti-science folks come out and also the bots come out. They're literally looking for the words "climate change" and spreading disinformation on Facebook. And because Facebook's tools are not robust in terms of community moderation on Facebook pages in particular, we have to spend a ton of time responding, kind of counteracting that counterprogramming. Essentially, at the end of the day, we're spending a ton of resources on that, and wondering: is our audience ultimately seeing more of the counter-messaging, the anti-science, you know, the climate change deniers? And if we chose just not to engage on Facebook, then we're not reaching what is actually our largest social media audience, and the most diverse in terms of age in particular,
for us. So what is our responsibility as an environmental organization? Do we not talk about climate change on Facebook because the bots come out? Do we not engage on Facebook? And so I think the issues really come around to, you know: what does engaging on these platforms mean about trust in science, about what is truth? These are really troubling issues, which I don't think have a clear answer for us. But I do just want to make sure that people are thinking about the choices that they have. You do have choices.
Sarah Wambold 30:00
I mean, I think that's really my major concern. And I think the real threat of this is: what does it mean at scale? What does it mean to have a business model that's built on addiction and polarization, when people are relying on this dark infrastructure to help them make sense of the world? And, you know, I think what is important for us to think about is that this affects everyone, whether or not you are using these products, as was already mentioned. Right? It impacts elections, the adoption of vaccines; it erodes trust in institutions, including ours. And, you know, we have conspiracy theorists in Congress now. I'll end on this thought, which is maybe a little abstract, before I throw it over to Nikhil. I've been really struck by how the pandemic can be a metaphor for how we're operating in the world. So what's best right now, what's best for our communities, is for us to isolate. You don't wear a mask because it protects you; you wear a mask because it protects somebody else and prevents the hospitals from being overwhelmed. So maybe you don't mind that your data is being sold, or that you're being served ads for products that you're likely to buy, but what's best for you individually is not necessarily what's best for the world. And I think that that's the ethical issue that we need to be talking about.
Eric Longo 31:32
Perfect segue. Thank you, Sarah. Thank you, Dana. So Nikhil is going to try to wrap up some of these, really looking at, you know, larger-world and individual actions, kind of that dynamic. And then we'll go to questions and answers. Nikhil, over to you.
Nikhil Trivedi 31:49
Real quickly, Eric, do you want to introduce yourself, and then I'll introduce myself and then go into it?
Eric Longo 31:55
Of course. I'm Eric Longo, and I'm the executive director of MCN. Middle-aged white guy, French, with a little goatee, in my home. Thank you.
Nikhil Trivedi 32:10
Hi, everybody, I'm Nikhil Trivedi. I'm the Director of Engineering at the Art Institute of Chicago. My pronouns are he/him. I'm a cinnamon-colored brown man with a salt-and-pepper black beard. I'm sitting in my living room right now; my five-year-old is in kindergarten in the next room. There's a painting and some photographs on the walls behind me, and I'm wearing a pink sweatshirt with a line drawing of Malcolm X and Yuri Kochiyama that says "Asians for Black lives" on it, because I am an Asian for Black lives. And I'm talking to you all from just outside of Chicago, which is traditionally known as the homelands of the Council of Three Fires: the Ojibwe, the Odawa, and the Potawatomi nations. Chicago has the third-largest urban Native population in the country, with over 100 nations represented here. So I honor and respect past, present and future caretakers of our land and our waterways. So today, we've brought up a lot of different things that we're deeply concerned about, and we didn't want to leave you all with just a bunch of information; we wanted to try to give you some concrete actions that we can each take with everything we're learning today. So Eric and I are going to take a little bit of time to frame some actions we can all take. For myself, I try to frame actions that I take in the context of having awareness of issues, accepting reality, and then rooting my action in those two things, so that the action that I take is sustainable into the future and is grounded in connection with history and with community. So I'll talk about those three things briefly. So, awareness. At the root, everything we're talking about requires an understanding of the history of capitalism, because the history of capitalism is a history of racism, classism, sexism, and all forms of systemic oppression. Because the history of capitalism has always put profit over people, much of what we're talking about today is to challenge the notion of putting profits over people, and to center people first. The exploitative patterns that we see in big tech today can all trace their roots back to many of these same exploitative histories. So it's important that we have a good grounding in that history and an understanding of it before we start moving forward into changing the landscape today. Some examples: surveillance can be traced back to things like the panopticon, or the militarized police states that many of us live in today. Information being manipulated for the purpose of persuasion can be traced back to treaties with Indigenous nations going back four or five hundred years. The foundation of our society and the foundation of our tech world operate within these much larger contexts, so this history and these connections matter. But in a more concrete sense, how do you raise awareness about some of the initiatives and issues that we're talking about today? A few specific resources, and I'll put them in the chat as well: the Electronic Frontier Foundation at eff.org and the Free Software Foundation at fsf.org both have a lot of information about issues that they're trying to organize around, and specific actions that they're asking folks to take. So sign up to their email lists. And also, you know, let's be sure to get out of our museum tech bubble and keep our finger on the pulse of what's happening in the tech world more broadly. One email list that I've found valuable is techmeme.com. It's a daily email that presents a bunch of headlines about all sorts of different issues, from privacy to surveillance to, you know, what Apple just announced yesterday.
Nikhil Trivedi 36:41
So that's awareness. Acceptance: we need to accept the truth that surveillance of all of us and all of our families allows platforms to sell their ability to convince us. Quite simply, that's what big tech has become: businesses that sell their ability to convince us to buy things, and it's turned into voting certain ways. That's really what we're buying into when we use these products. And the harvesting of things like location data, our communications with each other, things we like, facial recognition, their partnerships with enforcement agencies, the fight against net neutrality, all of these things weave together a thick fabric that maintains an oppressive status quo. Like LaTanya said last week, normal is broken, normal is oppressive, and normal hurts. And all of these things work together to keep an oppressive norm in place. We need to acknowledge that our participation individually, and the participation of the organizations that we are connected with, matters. And our non-participation also matters, even though it might feel minuscule. So I suggest: gather resources and share resources as much as you can. Find two people within each of your organizations, sign up to those three email lists, and have biweekly conversations about what you're learning, just to build a community around yourselves to keep in touch with these issues. It just has to be two other people to start with. And from the knowledge that you're gaining, take action together and build something greater from there. So in terms of action that you can take: we've talked about awareness, we've talked about acceptance, now action. And I want to talk about the bigger, larger-world actions we can participate in, coming down to individual things that we can each do for ourselves. Coming back to the history of capitalism, I saw in the chat, "fuck capitalism," 100,000% agreed: demand people over profits.
When we're making technology choices about the platforms that we're going to move forward with, center people in the choices we're making, acknowledge that profits have historically been at the center, and actively push that dynamic around. Participate in existing campaigns that are happening in the wider technology world. The Electronic Frontier Foundation (EFF) has an action center that you can follow, with a number of campaigns they're simultaneously working on, and the FSF has a campaign specific to surveillance that you can follow. Contact your representatives about the things that you're learning about. It's been shared that the most effective ways to contact your representatives are handwritten postcards, calls, and in-person meetings. With that group of three people that you've been meeting with every two weeks, set up a meeting for all three of you with your representative. In-person conversations matter, especially for things that may fall under the radar, like really specific technology policy; it's important for us to try to get as much face time with our representatives as we can, and that sort of thing is possible with the power of numbers, if you can even pull in a few other people to have that conversation with you. Coming down to a slightly smaller level: for us individually, we might have unspoken data collection and retention policies. We need to document those and write them down so that we can be clear on what our policies are and make sure everyone around us is on the same page. So document your data collection and retention policies; if you don't have them, write them down, and if you already do have them, review them in light of everything we're talking about today to see if anything needs to change. And once you know specifically what you want, ask your vendors what their policies are.
And if there's something you want to see changed, talk to other folks that are using those same vendors about pushing for a change in policies in a specific direction. For smaller vendors, that might be more easily possible; for things like Facebook and Twitter, I would look to wider campaigns from the EFF and FSF to try to pivot those organizations. But for many of the smaller vendors we work with, we do have power to influence their policies if we gain our own understanding and then simply ask.
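Documenting a retention policy, as suggested above, pairs naturally with enforcing it in code. Below is a minimal sketch of what enforcement could look like, using a throwaway SQLite table as a stand-in for a real analytics store; the table name, fields, and 90-day window are hypothetical, not drawn from any particular vendor or institution.

```python
import sqlite3
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 90  # illustrative: use the window from your written policy document

def purge_expired(conn: sqlite3.Connection, now: datetime) -> int:
    """Delete visitor-analytics rows older than the documented retention window."""
    cutoff = (now - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute("DELETE FROM visits WHERE collected_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount  # rows removed, useful for an audit log

# Demo with an in-memory database standing in for a real store
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (visitor_id TEXT, collected_at TEXT)")
now = datetime(2020, 11, 19, tzinfo=timezone.utc)
conn.execute("INSERT INTO visits VALUES ('a', ?)",
             ((now - timedelta(days=200)).isoformat(),))  # past the window
conn.execute("INSERT INTO visits VALUES ('b', ?)",
             ((now - timedelta(days=10)).isoformat(),))   # still within it
removed = purge_expired(conn, now)
print(removed)  # 1: only the 200-day-old row is past the 90-day window
```

In a real deployment something like this would run on a schedule, with the retention window taken directly from the written policy so the code and the document cannot drift apart.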
Unknown Speaker 41:44
And so coming to us specifically: included in the resource that Eric shared, there's a link to a Data Detox Kit, which has a good number of things that each of us individually can do. A few examples: turn off location services on your phone, and ask your friends and family to do the same. Delete apps on your phone that you don't actually use every day, because you can always just redownload them if you want to try them again. Set DuckDuckGo as the default search engine on your personal browser as well as on your phone, and do all your browsing in private and incognito windows. I put a bunch out there. Eric, I want to pass it off to you now.
Unknown Speaker 42:28
Yeah, thank you, Nikhil, and everyone. We have about 15 minutes left. Obviously, these are really important issues to all of us in our sector, and we didn't want this session to just be a nice conversation and then we move on — "wasn't that session great two years ago" — and then nothing happened. So we would really like to enroll all of you who are part of this community: in addition to the suggestions that Nikhil just made, what are the opportunities for us as a sector to band together, if you will, and try, with intent, to effect change? At the global level, meaning at the museum sector level, not just a particular institution or individual practice, I think it would be interesting. Some work was done last year on a great museum tech charter, by someone who I think may be on the call, and I would love to continue and take that work to the next level — really make it a charter, something that says to vendors: if you want to work with museums, these are our minimum expectations. So I just want to open it up now to questions. I'll try to look at questions in the chat and do the best job I can in moderating those questions and answers. Is there someone who wants to go first
Unknown Speaker 44:27
with a question? You can unmute your mic. This is a meeting, this is not a webinar.
Unknown Speaker 44:34
And feel free to post questions in the chat as well. Yeah.
Unknown Speaker 44:38
Can I lift up something I'm seeing in the chat? Sharon posed that there are good things that do come out of social media, and I think we should acknowledge that. Particularly — I think it's been mentioned in many sessions — the ability to organize, the accounts that are holding institutions accountable, and even more just connection and forming communities. I think that's really important. For me personally, I feel like we need to focus in on the business model, which is also coming out in the chat: the economic systems that are driving this are really what we need to take a close look at.
Unknown Speaker 45:32
Actually, could I add on to that a little bit? I really appreciate some of the good conversations that have happened on social media. I'm glad that there are these anonymous social media accounts that are exposing racism in our institutions. I'm glad that we have these networks that can help marginalized communities organize and talk to each other, offer support to each other, and get support from others. But I question whether social media is the tool we should be using for those things. I think it's the tool we're using for these tasks because it's the tool that we have, and I sometimes wonder if what we need to do is find a platform for the same activities that's safer than social media. As I said in the chat a little bit earlier: Facebook wants to brag about the LGBTQ support groups and pride parade organizing that happen on their platform, but what they don't like to talk about are all the LGBTQ teens who they've accidentally outed to their parents before they were ready. These platforms are designed for sharing everything, all the time, with everyone. And I think a really important and essential aspect of making that necessary progress in our society is the ability of people to choose when and how and to whom they disclose their secrets. Sometimes it's the secrets of who they are inside, or the secrets of how they think their society is treating them poorly, when they don't know it's safe to say this elsewhere. And when they expose themselves to that on social media, it's not a safe and welcoming and judgment-free place for it. I think that's what we really need, and I wonder if community-oriented nonprofit organizations might work to start providing better forums for those kinds of conversations — which I know is an awful lot to ask, especially from a sector that still has so many problems.
But I think it's still fair to criticize social media for what it is, while acknowledging that we can sometimes use it for good things, even though it's probably not the best tool for those things.
Unknown Speaker 47:53
Any comments? Or does anyone want to add to what Matt just said?
Unknown Speaker 48:01
I mean, just practically, having worked in social media for the bulk of my career — so I recognize and acknowledge my complicity in that — I don't know of any tool or platform that is as accessible to as wide an audience and as cost-efficient. And I realize that we're talking about the human costs, the mental health costs, the privacy costs of engaging in social, but I am unclear on what the alternatives are. It's my job at my organization to be effective and efficient at spreading our mission, and to do it in a way that engages a community that reflects the state of California, that reflects the United States. You know, I also oversee email marketing and I oversee our website, and those are great platforms, but they don't get the reach or the engagement of social. So I just want to be a little real about the choices we have there. I don't want to be narrow-minded in envisioning alternatives, but I'm really struggling with what those are.
Unknown Speaker 49:13
One of the things that was coming to mind as we were discussing this panel in the lead-up to this: I'm a big fan of Cory Doctorow, which probably wouldn't surprise anyone on this call, and he's done a ton of work in this area. He references four areas where we can direct energy and resources to try to make change: legal, financial or economic, social pressure, and technical solutions. And I kind of think that legislation and regulation might be the area we can push on that we haven't pushed very hard as a field yet. I think it's useful to think about metaphors, and cigarettes was one that came to mind; I've already mentioned that. You know, there are warning labels on packs of cigarettes saying they're highly addictive, and there are age limitations on who can legally access those things. We have regulations there. Tristan Harris, the founder of the Center for Humane Technology, has spoken to Congress many times about regulation, making the comparison to Saturday morning cartoons and, you know, trying to be very mindful of the advertising that we're feeding children. That same level of scrutiny and regulation hasn't come into these spaces, and I think that's a real opportunity area for museums and big institutions. This is a space where we can be a leader, where we can apply some pressure. And I think we should.
Unknown Speaker 51:19
When we were discussing the session internally, one of the things we realized, from looking at all these issues and the areas they impact, is: what are the areas where we have almost no influence, on a sliding scale up to those where we have a much larger chance of influence? And protecting user data seemed to be the one area where, as individual institutions, we actually had the most opportunity to have an influence — to preserve the privacy of our users and ensure that our user data is actually secure and safe. So I'm just curious, and I know GDPR is a European regulation, I think it's probably two years old now: are many of your institutions using it even as a guideline, even though it's not enforceable in the US? If you want to put that into the chat, I'd be curious. Or if anyone wants to take the microphone and ask a question.
Unknown Speaker 52:44
I'll say about GDPR that I've been asked about it; institutions have been asking about it. I am not familiar with an institution in the US that's fully embraced it. I personally would be very interested in learning a lot more about it and what it takes to truly be compliant.
Unknown Speaker 53:06
Yeah. Has anyone in today's session fully implemented GDPR at their institution? I see Scott, and Lucas got to look into it at the Corning Museum of Glass.
Unknown Speaker 53:24
Do we have any Europeans who have done it?
Unknown Speaker 53:29
Come on, people, don't be shy. Okay, is Elizabeth Gavin here? I wonder if the V&A has done it. No? Not sure if she's here.
Unknown Speaker 54:02
Saskia Scheltjens is here.
Unknown Speaker 54:04
Yes, Saskia from the Rijksmuseum.
Unknown Speaker 54:08
Yeah, I'm about to leave so I don't have a lot of time, unfortunately, because this is really an interesting discussion and it interests me.
Unknown Speaker 54:21
Let's do it, Saskia. Your sound just went off; I think you have an issue with the sound, unfortunately.
Unknown Speaker 54:46
Can you hear me now?
Unknown Speaker 54:47
Yes, yes, yes. Yes. Go ahead.
Unknown Speaker 54:51
Sorry, I'm about to leave, so unfortunately I have very little time, but the discussion is really relevant and very interesting. At the Rijksmuseum, we implemented GDPR about two years ago. It took us about two to three FTE to implement, plus two lawyers and legal advice, so it was quite intensive, and it took about two years for everything to settle down and be implemented into services. But it made a lot of people aware of what was happening, and alert about the decisions being made.
Unknown Speaker 55:50
Okay. I see some comments here, and some sentiments expressed that, yeah, it's difficult to implement, that's for sure, and that it ends up not really fulfilling its intended purpose. I'm not super familiar with it, so I really can't say whether it's effective or not. But this is just one area of the larger issues that we've been discussing here today. Any other thoughts around protecting user data in museums? Has anyone done any focused work around this that involves GDPR, or other practices that could be homegrown at your museum, in order to preserve user or visitor data?
Unknown Speaker 56:52
This isn't quite related — I mean, it's associated with that. There are actually two European projects that we're just beginning to frame out the pieces of, and there's an interesting balance between the GDPR requirements, which are international, they're worldwide, for being able to remove data and track individual data in a way that it can be removed, and, at the same time, creating environments where users want to be able to store their data or use data in their own way. And the question of making that transparent and transportable is something I haven't seen that much in the community right now. So we're just at the early stages of trying to work through: what does it mean to be able to aggregate and keep certain things anonymous, to be able to identify a user's data and allow it to be cleared, and to gain permission from people to keep data relating to what they store, how they store it, and the way they manage it? And then, lastly, thinking more broadly about how users are able to take their data away with them, using standards-based approaches that allow them to take away their research or collected materials or annotations or anything else, and thinking about how to potentially structure value that can be given back to them, to honor what they're giving to us. These are obviously big, complicated things that not a lot of people have been dealing with outside of a few folks in the tech community, but maybe next week we'll be able to have a larger conversation about them.
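The balance described above — aggregate analytics on one side, user-controlled erasure and portability on the other — can be illustrated with a toy sketch. This is not code from the projects mentioned; the salted-hash pseudonymization, record shape, and function names are all hypothetical, just to show how counting can work without raw identities while still supporting GDPR-style export and erasure.

```python
import hashlib
import json

SALT = b"rotate-me-regularly"  # hypothetical; a real system manages salts/keys carefully

def pseudonymize(user_id: str) -> str:
    """One-way token so aggregate analytics never need the raw identity."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:12]

# Toy event store: annotations and saves keyed by pseudonymous tokens
records = [
    {"user": pseudonymize("alice"), "action": "annotate"},
    {"user": pseudonymize("bob"), "action": "annotate"},
    {"user": pseudonymize("alice"), "action": "save"},
]

def export_user_data(user_id: str) -> str:
    """GDPR-style portability: hand a user their records in a standard format."""
    token = pseudonymize(user_id)
    return json.dumps([r for r in records if r["user"] == token])

def erase_user_data(user_id: str) -> None:
    """GDPR-style right to erasure: drop every record tied to this user."""
    token = pseudonymize(user_id)
    records[:] = [r for r in records if r["user"] != token]

print(len(json.loads(export_user_data("alice"))))  # 2 records exported for alice
erase_user_data("alice")
print(len(records))  # 1 record remains (bob's)
```

The design point is that aggregation and erasure both operate on the token, so analytics queries and takedown requests can share one data model; real projects would add key rotation and consent tracking on top.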
Unknown Speaker 58:56
We're coming to a close, so thank you. I think that, if anything, this session was really intended to bubble up these important concerns — not just for us as individuals but for humankind, for the way that we use information, and for the work that we do in our institutions. The intention is that we take this work and continue to take action throughout the year. I'd love to see another session on this topic at next year's conference, and I think the SIG that talks about these issues the most is the Data and Insights SIG, so I encourage you to join them. Thank you for your time. Thank you Marty, Sarah, Dana, Matt, and Nikhil — did I forget anyone? No, I think that's it. It was great digging into this session with you, and thank you for attending.