DEEP DIVE: The ethics of #musesocial

Social media such as Facebook, Twitter, and YouTube provide an invaluable conduit to enormous global audiences, but the last few years have revealed an unseemly dark side to these platforms. What are the risks for institutions and the public? To what degree are we asking audiences to sacrifice their privacy and wellbeing when we invite them to engage with us on social media? And how can we leverage the positive aspects of social media without endangering our users? This deep dive will include a framing presentation, case studies and discussion prompts, and small group facilitated discussions to explore the evolving ethical landscape of social media. The direction of these conversations will be guided by the room and could include issues related to: privacy, negative and harassing behavior, addictive interfaces, cultural appropriation, promoting institutional values and protecting reputations, intellectual property, freedom of speech and political expression, mental health, and social photography and consent.


Unknown Speaker 00:00
Hello friends, it's 1:30. Let's get started. Good. I was asked to let you know that we have Smarties up here just in case you need a little something. And they're extra-large Smarties, which greatly increases their tastiness. Okay? This is the ethics of #musesocial session. Everybody's in the right place? Okay, we're gonna start with who we are. Then I'm going to ask you a little bit about who you are, and then we'll get started. I'm Dana Allen-Greil. I'm the Director of Digital Strategy at the Monterey Bay Aquarium. I started social media at the National Museum of American History at the Smithsonian, ran social media at the National Gallery of Art, then at the National Archives, and now I'm at the aquarium. I don't produce content anymore, but I do oversee the team that does.

Unknown Speaker 00:56
And I'm Vicki Portway. I'm at the Smithsonian National Air and Space Museum, and I'm also Director of Digital Strategy. I started our social media presence back in 2008, about the same time you guys did, and managed that up until a couple of years ago, when it moved to another area of the museum, so I'm not actively managing it at this point. But I'm still very much involved and interested in these issues.

Unknown Speaker 01:25
I'm Matt Popke. I'm a software developer at the Denver Art Museum. I'm here primarily as kind of the technical consultant today. I work with our social team a lot. We'll mostly be talking today about how social interacts with our sites, specifically in our architecture.

Unknown Speaker 01:45
Oh, hey, everyone. My name is Suse Anderson. I'm an assistant professor at George Washington University in DC, and I teach courses on museums and technology, museums and social media, and also museum ethics. So for me this is a confluence of the things that I'm thinking about and dealing with as a professor, but also just in life, I think for all of us, right?

Unknown Speaker 02:09
I'm Lori Byrd-McDevitt. Until this past summer, I was the digital content and social media manager at the Children's Museum of Indianapolis for about a decade, hard to believe, and that is the world's largest children's museum. I'm also an adjunct professor at Johns Hopkins, along with a few of us up here and probably a couple of people in the room. And now I've struck out on my own and have a little digital marketing agency.

Unknown Speaker 02:38
My name is Hillary-Morgan Watt. I do social media and video production for the Hirshhorn Museum and Sculpture Garden. It's the Smithsonian's modern and contemporary art space, and I was the first person in that role; I've been there for four years, so before that they weren't really thinking of it. And similarly, before that, I was the first person in the social media role at the Natural History Museum as well.

Unknown Speaker 03:00
And I am Mike Edson. No, I'm not; he's not here. How many of you saw Mike Edson's Ignite talk on Tuesday? Okay, that's kind of, in a nutshell, what he was going to talk about here, but not in condensed form. He did record a video with his slides, and we'll have a link to that at the end of our slides here. But we really felt like that wasn't the best way to use your time when we're all in a room together and can talk to live people who are not a video. So we encourage you to check that out after this session, very thought-provoking, and we'll include a link to that, and a link to the slides, at the end of the session. I hope that's okay with everybody. Okay, now let's hear a little bit about you. We can do a quick poll; you can tell me if there are categories not on here. Are you directly responsible for social media content or engagement? Let's see a show of hands. Okay, so half-ish. Maybe you supervise social media staff? Okay, awesome. Are you just generally a concerned user of social media? Everybody. Is there anybody who's responsible for legal or privacy issues? Okay, awesome. Thanks for being here. What am I missing? Covered it all? Okay, well, thanks for being here. This session is for everyone. The context, this is sort of our thesis statement. I'm sorry, I'm thinking I'm controlling the slide and I'm not. Okay. Social media such as Facebook, Twitter, and YouTube provide an invaluable conduit to enormous global audiences, but the last few years, maybe the entire time they've existed, have revealed an unseemly dark side to these platforms. And we kind of want to talk about that duality: social media really provides amazing opportunities for us to connect with people and connect communities, but there are also really serious downsides and consequences of engaging on social media. How do we balance those when we make decisions for our institutions?
I apologize, I have a little bit of a cold. Okay, these are the key questions we wanted to grapple with. What are the risks for institutions, and, I added, staff and the public? To what degree are we asking audiences and our staff to sacrifice their privacy and wellbeing when we invite them to engage with us on social media? And how can we leverage the positive aspects of social media without endangering our users? So we're going to figure all that out, all the solutions, in two hours. Setting the expectations high. Alright, here's what to expect in terms of the format today. Everybody sitting in this row is going to do kind of a quick case study or discussion prompt, about five to ten minutes each, so we think that'll take about 45 minutes. Then we're going to ask for topics, organize into small groups, and have a meaty discussion, around 45 minutes for that, and then do a report-out from the groups to share what bubbled to the top, both in terms of issues as well as potential actions that we can take, as individuals and as institutions. So that's the layout of what we'll be doing for the next two hours. Take it away, Vicki.

Unknown Speaker 06:37
Okay, so I just wanted to touch on basics to get started. How many of you have a code of ethics or code of conduct at your museum, particularly with reference to social media? That's great. Okay. So the kind of ethical dilemmas we get ourselves into can often come as a result of conflicting values. What we prioritize as important can sometimes drive us in directions that unwillingly or unwittingly get us into ethical dilemmas. So I like to think about it in these terms: does your organization's behavior on social media align with the mission and values of your organization? Are you acting in a way on social media that reflects those? And have there been times when your organization has encouraged unethical behavior? A lot of times, when people get into really big ethical problems, or even legal problems, it started very small and developed over time. Sometimes organizations really encourage the behavior, not knowing the direction it's going. You definitely don't want to end up in the legal arena. So I pulled out a couple of prompts or examples to think about. One that has popped up over the years for me is: your influence is your responsibility. There have been times when, for instance, a department might come up with a great idea to engage young people on social media, let's get middle schoolers to tweet each other or tweet the museum, without being aware of the laws that govern that kind of thing. Is everyone here familiar with COPPA? COPPA, COPPA, potato, potahto. Oh, not as many people as I thought. Okay, well, that's cool, because I can spend some time on that. COPPA is the Children's Online Privacy Protection Act, basically the FTC rules, and you can correct me if I get any of this wrong, that deal with collecting personal private information from children online and interacting with children online. A lot of it has to do with parental consent and what you can and can't do, and it applies to everybody.
If anyone was in the earlier session today, DNA from the Smithsonian touched on it a little bit in terms of collecting data. I think about it in terms of: what are we encouraging people to do, and is it directing them in a way that's going to make them break rules? As we all know, to have a social media account you have to be 13 years or older, and that's because of COPPA. So if you're tweeting something out that's meant to get kids excited, and they are following you online, because some parents decide that they're going to let their kids sign up for Twitter at 11 years old, then that's actually not ethical, in my opinion. You're encouraging those kids to want to sign up for social media. And so it's this fine line where you could encourage them to engage in illegal behavior even if you're not breaking the law yourself. The bottom line being: we have to be really careful about creating a safe space for kids and not get ourselves into trouble in that arena. So if you aren't familiar with COPPA, I would encourage you to become familiar with it. Another aspect, and this is really kind of old news but surprisingly still happens a lot, is being human. The issues of having a lot of resource constraints within the museum field are not unfamiliar to any of us, but automating things can really go south. Is everyone familiar with what happened with Progressive Auto back in 2012? Okay, so in 2012, Progressive had two clients, one of whom was paying her insurance premiums and died in a car wreck, and the other one was driving the car. It became a huge issue when the brother of the woman who died wrote an article that said: my sister paid Progressive her premiums, and they are defending her killer in court. It blew up on social media, and everyone started tweeting at Progressive for a response. And they gave a robo-response to every single person who tweeted them. And that, of course, blew up even further on social media.
So it was not a good look for Progressive. "A very tragic situation," as they said in their tweet, but unfortunately it was the same exact tweet to everyone. It was really not appreciated that they were not being human and giving an individual response to everyone. Surprisingly, I just found one from this year, with Snapchat doing a robo-response. Slightly different situation: they were doing it in terms of their IT support. There was a broken feature, I can't remember what it's called, because I don't use Snapchat.

Unknown Speaker 11:25
And anyone who tweeted them about that particular feature would get an automated response. And of course they were called out on it; people were smart enough to figure it out and started, you know, teasing them for this. So I've been in situations where people didn't want to be completely transparent about the reasons for something, and I've encouraged them: just be honest, because people are smart, they're gonna figure it out, they're gonna call you on it, and you don't want to be called out on it on social media. But I also just think it's unethical; it's just not the right thing to do. Another one is engaging people in controversies. So we have a very, very specific position about having landed on the moon: it happened. But about 5% of Americans deny that it happened and do not believe it happened, and they happen to be very, very vocal about their opinions. So we've, in several situations, had folks approach us on social media or in lectures or wherever about this and really want to engage in conversation. Of course, our mission and our values are that we want people to know the accurate information, the increase and diffusion of knowledge being our mission. At the same time, it's a really difficult thing to give people a platform in which to advance their denier ideas. So you're kind of, you know, between a rock and a hard place. Sometimes you try and take these conversations elsewhere; we had a guideline where, if somebody keeps engaging you too much on social, you take them to email or some other place to have that conversation. But these kinds of things can get really drawn out and difficult to deal with. And then there are other topics, like climate change or anti-vaxxers, where, you know, people actually die in those situations. This is not quite that, but it's just an example of how engaging on controversy can do harm to others if they're attacked in that arena.
Or if a denier decides they're going to latch on to someone who believes that we landed on the moon, and then we're the platform for them to go after those people. So that can be difficult. And then the last one for me, and Mike actually touched on this a little bit in the Ignite talk, is giving access to our content on social media, and only on social media, which happens a lot. I mean, it's very quick and convenient to get your content out on social platforms, and not always quick and convenient to catalog and put your images in the DAMS with, you know, all the content that goes along with it. We all face that problem from an archival perspective. But also, I think it's just an unethical thing to be putting your content in one place that isn't your branded platform, and force people to have to join Facebook or join another platform and give away their private information in order to get to your content. I just don't think it's ethical. And I think it's something that is only going to get harder and worse over time, but that we have to start dealing with. This is just one example, not social media, but Google Arts and Culture. Clearly, they want to engage with museums and put our content there. This is a case where we had content on both our platform and their platform, totally by luck. This was not because we have tons of resources. It just so happened that they did Street View of our museum and put it on the platform, and we actually had some volunteers who were doing 360s in the museum at the same time. So we have a virtual tour on our site, and we have a virtual tour on Google. So great, people can get to an equivalent experience, at least, on both platforms. But this is not the norm.
So moving forward, if social media is going to become more and more about experiences and not just individual posts, how are we going to ensure that people aren't forced into these platforms that they don't want to engage with, or that we don't just lose the opportunity to engage them because they don't go there? Something to think about. Just in closing, because I'm over a little: there are a couple of really good articles that you can Google. One's from the Harvard Business Review, and for the other one, just Google "when good people make bad decisions." Really interesting, some of the studies they've done about people getting into real legal trouble over doing bad things, and the causes for that. These are a lot of the main causes: particularly if you have a toxic organizational culture that encourages bad behavior and forces people, in a lot of cases, to feel like they have no other choice; internal pressures and resource constraints; and obviously if you're in a situation where it's not safe to speak up and say, hey, boss, should we be doing this right now?

Unknown Speaker 16:03
That's really difficult. But it can also come from just a lack of knowledge or training. You know, if you don't know what COPPA is and are engaging kids online just because you didn't know the rules, that's different. And then, of course, if you don't have a process in place to stop yourselves and think about stuff before you move forward. Generally, ethical decisions happen every single day. If you take a pen home from work, technically that's unethical. These are day-to-day decisions, and it's about really thinking about social media post by post, and asking yourself the questions that can stop you from potentially doing harm to your audience. If you just take the time to stop, you can frame a process that avoids those situations.

Unknown Speaker 16:55
Thanks. I need to take a picture of your last slide. Okay, Matt?

Unknown Speaker 17:03
Can you hear me? Okay. So it seems like there's a new privacy scandal in the news pretty much every week now. Facebook, Twitter, Instagram, LinkedIn. I don't think I have to go into too much detail about how, every other week, there's a new thing with Facebook losing a million users' data, or handing over a million users' private data to app providers, or something along those lines. And often it's not a new scandal; it's the discovery that the scandal reported last week is actually much larger than originally reported. Facebook in particular has this problem where they will report that, like, 400 apps had access to people's private information in violation of the Terms of Service. And then six weeks later, it was 5,000 apps. And then six weeks later, they reported: oh no, we're sorry, we were wrong, it was tens of thousands of apps. A lot of these companies don't seem to be taking our privacy and our security seriously. I can go to the next slide, please. What are the consequences of this? Why do we care if our privacy gets leaked? There are the obvious problems: someone has my personal data, so I'm going to start getting more spam. All the things that we think about individually. But on a larger level, one of the problems we have to worry about, with our privacy and our security not being respected by these networks, is discriminatory advertising. Things like: in 2016, Facebook was sued by HUD for allowing discriminatory housing ads to be placed on Facebook, ads that specifically targeted white people for affordable housing. You could target a segment in Facebook where you said, you know, I only want this ad to go to people who I'm pretty sure are not people of color. And that way, when I'm selling my affordable housing, I know who I'm gonna get.
In 2016, this housing suit was filed, and Facebook claimed to fix it. Two years later, HUD opened up the suit again, and Facebook claimed they fixed it. In March of 2019, HUD filed another suit against Facebook, which is still ongoing. So they don't seem to really care about fixing these problems. Another one: the ACLU sued in September 2018, where Facebook was accused of allowing gender discrimination in job ads. Then there are targeted misinformation and scam campaigns. We all know about the Cambridge Analytica debacle that happened both in the US and in the UK with the Brexit vote. More recently, and I'm surprised that this ad segment even existed, Facebook was caught selling ads to a company that was selling Nazi memorabilia online, and they were specifically targeting people who were anti-Semites. So there was a Facebook segmentation category in advertising for people who don't like Jews. And you have to wonder: how do we get this information? Why does Facebook even have that as a possible segment? They don't anymore, because the controversy caused them to remove it. But how does a machine algorithm create something along those lines? Why did no one catch it? Why did no one remove it? And then, of course, there are targeted scams and other forms of misinformation. We've mentioned anti-vaxxers before. Vulnerable populations: one of the more common scams targets migrants with ads that say, give us your information or you could lose your visa. You've probably had similar robocalls, but Facebook enables these kinds of ads on their platform as well. And, of course, there's the Rohingya genocide that occurred in Myanmar recently, the riots where racial violence was stirred up against a particular ethnic minority in Myanmar.
I think it was the UN investigation that determined that misinformation spread through Facebook was instrumental in starting those riots and causing those things to happen. Excellent. So I want to talk a little bit about network effects. How do these things happen? Where does this information come from? Why do Facebook, Twitter, Instagram, and LinkedIn have the categories that they have about people, and how are those things used? The most obvious answer is that when you're on Facebook, they track everything you do.

Unknown Speaker 22:05
But a slightly less obvious answer is that when you're not on Facebook, they're also tracking everything you do. Every Facebook Share button that we put on a website using Facebook's API is reporting to them the same information that analytics would get. Every time we use the reaction widget, every time we embed a Facebook feed, every time we embed a Facebook video. One of my favorites is ad conversion pixels. When you put an ad online, you get these pixels, these little bits of JavaScript that track conversion from that ad to your website. Sometimes those companies are owned by Facebook, and sometimes those companies are just selling their data to Facebook and Google and others. So even when you're not dealing directly with Facebook, sometimes you've got Facebook code on your site from these third-party trackers, these third-party embeds. Let's go ahead, next slide. So what can we do about this? It sounds pretty bleak. We have this massive global surveillance network that is invading the privacy of every single person online as often as it possibly can, and the end results can be pretty terrible. But we actually have some power over how widespread this is. We can't control what happens on Facebook's site, but we can control what happens on ours. And some of these things are low-hanging fruit; they're easy for us to get rid of. Ad conversion trackers: one of my favorite things about ad conversion trackers is that they're mostly useless. The data you get from them is pretty crap. And if you are using UTM links or campaign links on your marketing materials instead, you can actually get way better data from analytics than you could ever get from a conversion tracker. So there's really no reason to put that third-party code on our site. That's probably the easiest one to eliminate. Share buttons: the little share-this-on-Facebook, retweet-this-on-Twitter, or share-this-on-Instagram buttons.
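The UTM alternative to conversion pixels mentioned above can be sketched as a small helper. This is an illustrative example only; the URL and campaign values are hypothetical:

```javascript
// Build a campaign-tagged link for marketing materials. Any analytics tool
// that reads the standard UTM query parameters can then attribute visits to
// the campaign, with no third-party conversion pixel on the page.
function campaignLink(baseUrl, source, medium, campaign) {
  const url = new URL(baseUrl);
  url.searchParams.set("utm_source", source);
  url.searchParams.set("utm_medium", medium);
  url.searchParams.set("utm_campaign", campaign);
  return url.toString();
}

// Hypothetical usage: a link placed in a Facebook ad for a spring exhibit.
const link = campaignLink(
  "https://example.org/visit",
  "facebook",
  "paid-social",
  "spring-exhibit"
);
// link is "https://example.org/visit?utm_source=facebook&utm_medium=paid-social&utm_campaign=spring-exhibit"
```

The tagging lives entirely in the link itself, so the destination page needs no tracker code at all.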
It's very easy to use the Facebook and Twitter APIs in such a way that we don't actually have to put their code on our site at all. We can make those opt-in technologies. For instance, on the Denver Art Museum site, the Facebook Share button doesn't actually use Facebook's JavaScript code. You click the link, it opens up Facebook, and then you can share the page. So if you want to share something on Facebook, if you want to use that share button, you're opting into it; you're already logged into Facebook. It's not something that affects every single person who loads the site, unlike the standard Facebook Share button, which logs every single person on your site even if they don't interact with it. Embedded social feeds: there are so many APIs, so many plugins for WordPress and Drupal and other CMSes that take care of those things for us. We don't have to use the tools that Facebook and Twitter and Instagram provide to us; we can use a whole selection of other tools that are available. And if anybody wants to, we can talk about some of those alternatives after the session. So those are kind of the easy things. What's on the next slide? So, what's not easy? We kind of need to talk about analytics too, because Google is as much a part of the problem as Facebook is. What do we really get from analytics? What do we really use from analytics? And are there alternative ways to get that same information? I don't know about your institution, but in my institution, analytics is wildly underutilized. There's an absolute ocean of data in our analytics, and the overwhelming majority of it is pretty close to useless without some qualitative research guiding the questions that we're asking and helping us interpret that data, which we don't have, because we don't have time for that qualitative research. So could we get by with a product that gives us less of that information?
Could we get by with a product that doesn't spy on our users quite so much? Also, analytics is wildly inaccurate in a lot of cases. I mean, I know that I block it on all of my devices. Google Analytics never logs my activity anywhere, because my systems don't even load it. And I know I'm not the only person who does that. So are we relying on a metric that doesn't even provide us information, let alone the information we need? And I think also,

Unknown Speaker 26:40
we know that better tools can exist. We know that there are third-party analytics programs out there: commercial analytics programs, open source analytics programs, self-hosted analytics programs that we could use. But most of us aren't using them; we've all sort of defaulted to the free option that everybody uses. The key point I really want to make here is that we're never going to get the ethical tools that we want to use, the ethical alternatives to these technologies, if we don't ask for them. It might be time for us to start having those conversations with our consultants and with vendors and with the companies that provide us with these tools, or else we're never going to make any progress on this. So hopefully this is a chance for us to start those conversations.
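The opt-in share button described earlier, a plain link to each platform's public share page with none of the platform's JavaScript loaded on our site, might look something like this sketch. The page URL and text below are hypothetical; the Facebook sharer and Twitter tweet-intent URLs are the platforms' documented share endpoints:

```javascript
// Build share links that point at the platforms' own share pages.
// Nothing from Facebook or Twitter loads on our site; only a user who
// actually clicks a link is ever exposed to the platform.
function shareLink(platform, pageUrl, text) {
  const u = encodeURIComponent(pageUrl);
  const t = encodeURIComponent(text || "");
  switch (platform) {
    case "facebook":
      return "https://www.facebook.com/sharer/sharer.php?u=" + u;
    case "twitter":
      return "https://twitter.com/intent/tweet?url=" + u + "&text=" + t;
    default:
      throw new Error("unsupported platform: " + platform);
  }
}

// Hypothetical usage: render these as plain <a href="..."> tags server-side,
// so the buttons work even for visitors with JavaScript disabled.
const fb = shareLink("facebook", "https://example.org/exhibit");
// fb is "https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fexample.org%2Fexhibit"
```

Because the href is an ordinary link, the tracking trade-off is inverted: the platform learns nothing about visitors who never click.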

Unknown Speaker 27:33
All right, that is pretty heavy stuff. So I have an otter for you. But this isn't a cute story, I'm sorry. Now, this session is not under Chatham House rules, but I will ask you, personally, if you would consider waiting until I'm done with this case study before deciding if it's the right thing to do to tweet about this. That's up to you, but I will ask you to at least wait a few minutes. How many of you (I heard some sighs) saw this tweet last year? Okay. I had just started at the aquarium, I was like three weeks in, and I was actually out sick the day this happened. So that's fun. Um, there's a lot to unpack here, so I'm just gonna look at this through a couple of lenses. There are more; I've talked about this many times, I've presented to the board about this, so there's a lot that I may skip over that you want to dig into, and I'm happy to do that later. But for now, I'm just gonna dive into a couple of angles on this. As you can see, this tweet got a lot of engagement: about 57,000 people liked it, 18,000 people were talking about it. The content itself is actually a mash-up of several memes. This was what I had to explain to the board, and then I was like, well, let's talk about what a meme is. So you probably get this, but there are references to other memes, such as the "absolute unit" from the Museum of English Rural Life. There are references to the chonk chart, which is like a scale of how chunky a house cat is; house cats are really popular on the internet. So there's kind of a lot going on here, and you don't really get to the science content until the next tweet in the thread, where we shared the hashtag #bodypositivity and then talked about how awesome Abby is, and how the reason she's so bulky is that she's doing this really amazing job being a surrogate to stranded sea otter pups. So there's a science story here, but it was a little buried. That's one thing to think about.
The Washington Post thought this was newsworthy. Basically, within a couple of hours this tweet had gone really viral, and someone on Twitter, a professor in the sciences at a university, kind of called us out. Her concern was not with the scientific content; she called us out for appropriative language that was insensitive at best and potentially perceived as violent. And while we were framing our response to her, her DMs were closed and we couldn't reach out to her directly, so we knew we needed to frame a public response. In the meantime, this tweet was taking off. I think it was a really slow news time, because several media outlets, the LA Times, et cetera, were already writing about the virality of this tweet. And so we knew that our response could not be to delete the tweet, which was actually the preference of our Head of Diversity and Inclusion, because he felt that we were continuing to do violence to our community by leaving the original tweet up. We felt like that would look like a lack of transparency, and that we could use this as a learning opportunity, take ownership of what we did, and address it to the people who had seen the original tweet. Here's a quote from The Washington Post. I just want to point out here that they called out the individual by name, included her place of work, and linked to her tweet. And essentially, that's why this backlash then happened, of our community, as well as other people on Twitter who were looking for things to do that day, harassing her, saying she didn't have a sense of humor and this is PC culture. That escalated, and she ended up having to make her account private. And we felt really, really awful about that. So here's our apology. It's a lot, and maybe you've already seen it, so I'm not going to unpack it too much.
Other than just to say that what we've determined is this: the apology itself, we still think it was the right thing to do, but it actually kind of whipped people up into a frenzy and had a really negative impact, potentially even more negative than the original tweet.

Unknown Speaker 32:21
So that's just something to think about. This platform that we typically use to bring people marine biology joy and to inspire conservation of the ocean basically became a weapon of insensitivity: racism, sexism, even ageism, people attacking the people who didn't think it was funny, saying, well, you just don't get the internet. And a lot of internal strife. So you can see I have a lot of feelings about this. We've been talking about this for almost a year, and there's still a lot, I think, to learn from this case study. I'm gonna switch gears here to a different case study that I think illustrates... oh, sorry, yeah, it was all over for a while. And Fox News said we fat-shamed an otter. So there was that. And then we did have some followers coming out saying, you know, you really need to stick up for this. Go to the next slide. So, you know, this person is now being harassed and abused because of you. Okay, so the next fun topic: climate change. I actually see a lot of parallels here. The point that I wanted to talk about today, and again, there are lots of layers here, is that our platforms can be used in ways that give voice to something that we would never intend to support. We've found this a couple of times when we post about climate change. I'm sorry, that's a very, very tiny font on the left, but essentially it was Sea Otter Awareness Week, and we interrupted Sea Otter Awareness Week to say, hey, the UN IPCC just came out with a really depressing report about climate change and how badly our oceans are doing, and if we don't do something, we're all in really bad shape. But with the positive tone that we always put on those kinds of things. And then we get responses like this: "It's called weather, it changes all the time." And our team really prides itself on having really deep subject matter expertise. We spend a lot of time on social care.
And we take all of our responses really seriously. You see that 50 people interacted with Shelley's response, so it's really important to engage in responding to those kinds of comments, and you see that our response got 124. There are dozens of these on just this one post. That's a lot of time. But essentially, our initial reaction is to model a respectful, science-based response to that kind of comment. And this one's not really inflammatory. But we do that over and over so that the people who are following us can see that you can continue to have a respectful conversation, and we try to educate people on these issues. If you go to the next slide. But we've now started to think, when we're getting dozens of these: first, we started realizing some of them are bots that are just looking for posts about climate change and sticking a myth in as a response, and we're spending a lot of our staff time responding to that. But what is the net effect of this, right? It feels like, as an organization, we can't not talk about this really important international report on climate change that's really core to our mission, inspiring conservation of the ocean. But what's happening here is that our followers are now seeing more misinformation than they might have if we hadn't posted about this report at all. So what really is the benefit of engaging in this way? Is this the best use of our time? Is it the best use of the attention span of the people following us? And what is the right way to handle this kind of issue? In this particular case, it actually devolved into people calling each other names. So in addition to spreading misinformation and myths that are not based in science, there are also people personally attacking one another, and now we're in the business of saying, hey, we all need to be respectful and get along, and spending our time on that.
So these are just a couple of case studies I wanted you to think about, in terms of not just what the platforms themselves are doing, but what we're complicit in when we participate in platforms that are

Unknown Speaker 36:42
permissive of this kind of behavior. And now you can decide if you want to tweet about that.

Unknown Speaker 36:53
Continuing on the role of social media: I was thinking about what might be a gap on this super meaty panel, so I thought I'd talk about the position itself, who's in it, who manages it, and how you do content. This tweet... oh, really? Hello, I'm very sorry about that. So I'm going to talk about the role of social media itself, the ethics around it, who does it, how you're stuck with it, the hours and the schedule, and all of the demands. This was a tweet sent to the National Portrait Gallery last week; a colleague screenshotted it and sent it over to me. I'm sharing it because, for anyone who's been in the role, you receive this pretty frequently, probably once or twice a season: everyone just assumes you're the intern, the unpaid intern, that you're there temporarily, you're not qualified, you're young, it's going to be funny, like, oh, it's so cute, making the intern laugh. This happens all the time. And this is a good piece to keep in mind as we move forward, because basically the role is really isolated, with minimal support, and from an ethical framework, I'm saying that you're basically consistently set up for failure and trauma. We're going to move forward with this, so we can have the next slide. So let's start with the internal framework of how you fit in with your team, where you are in the role, in the department and the hierarchy, how you interact with the board. Basically, it's lonely, for a lot of reasons. Traditionally it's probably one person; if you have three, that's amazing, that's very exciting, and I'll come back to that too, because even that's not enough relief, depending on the content that you're dealing with.
We all know that everyone thinks they can do your job because they have a Facebook page or an Instagram account, or they like to post photos of their food. Everyone's a photographer, everyone is artistic, everyone has ability. I do agree, but not everyone's a strategist. And because of that, if you get a seat at the table, it's not equal. When you're in an exhibition team meeting, no one's really questioning the conservator or the registrar the way they'll question what you do. And that is, over time, maddening. There's really nothing you can do to combat that; in the system of where you sit with your team and your departments, nothing's really changing. If that's changing for you, I'd love to talk to you afterwards. You're also lucky to even get a seat at the table, because again, if everyone thinks they can do what you do, decisions are going to be made in other meetings without you. No one's going to loan an artwork without talking to the registrar, but someone will promise a post on social media in a development contract with a donor. Why does that happen? Again, it's maddening; you just go down this crazy spiral. And with situations like that happening over time, you're set up to be the "no" person, because you're trying to explain a larger content strategy, you're trying to explain policies, jokes that are not funny, how you can't turn a ten-page research paper instantly into an Instagram post, how things take time. And so instead of hearing "Yes, but" (Claire had this great comment yesterday about saying "Yes, but," which I'm going to try more now), they're still mostly just hearing the "no" part, or that eventually it's leading to a no. And then the role as it exists now covers a whole spectrum of qualifications. It can be three people, it can be one person, it can be highly qualified.
It could be one-fifth of a press person, or one-fifth of an educator; it can be an intern, it can be a contractor. It's all over the place. And there's a lot of variation, because it's new. There's a lot of excitement there, there's a lot of potential, but mostly, with museums, we know that means it's super underfunded and really low on the priority list. And then if you look, and I can speak again to the Smithsonian, across the different museums and research centers there's probably a difference of $50,000 between people doing the same job. That's insane. And in private museums, I'm sure it gets even worse and even lower. Again, all of that just adds to the confusion of what the role is, who's doing it, should they be in this meeting or not, can I make the decision for them. And then, building on that, the schedule is impossible if you're doing the monitoring. Everyone thinks, oh, you know, just check it once at night, or check it on the weekend. And is it fair, the hours you put in versus someone else, like people who leave at five versus people who have to monitor at night? There are a lot of assumptions: if you're in senior staff, you're making six figures, and some of that is expected to go along with all the extra hours and the responsibility, but that is not reflected for people who do social media or community management. So there's a lot of inequality, which I'm sure a lot of you are familiar with. And I say this as a privileged white woman doing social media, so I do have that in mind.

Unknown Speaker 42:07
And then it's all of the monitoring, all of the responsibility, and none of the respect; just this dark cycle that you're on. And then in terms of professional development: yay for everyone who's here, this is so great, this is training, this is professional development, but not everyone can afford to be here. The platforms constantly update, algorithms constantly update; if you're not on top of your game, you're not going to be good at your job. And so now you're the "no" person, and you're the person asking for money, because you need new tools, you need to be a video producer, you need to pay for ads to compete on Facebook. So there's just a lot going on with that. We can go to the next one; we'll jump to external forces. That was just a peek inside what you're dealing with. And then we know that you're basically a filter for the worst of humanity. You're seeing joyful moments and happy moments that are shared, but you're also seeing the absolute worst of the internet. And depending on the institution and what your museum talks about, you're going to see the worst on a day-to-day basis: the worst in racism, sexism, hate, and even violence. I mean, who has had to report threats of violence to security? Who has had to report suicidal comments to security, or to the platforms themselves? It's a lot. It's often directed at the museum, but even if it's not directed at you, it can still obviously cut you. It can be content that's hurting you at your core, fundamentally in conflict with what you believe. And, back to the schedule, there's no break from it. So even if you have a team of three, which would be so cool, you get a week off, but that doesn't really solve it. Okay, you've had a week off from hate, that's great, good month, positive month, but you still have to go back to it. There's not really a good solution to that.
For private museums, you might have a little more leeway with your terms of use and how you engage with the community. I work at an institution that receives federal funds, so there are lots of rules about transparency and access, and our lawyers right now are really struggling with whether we can block someone, or how we can hide a comment or delete a comment. We do have terms of use, we have community guidelines, but they're guidelines. So right now the rules are basically that there's nothing you can block. We had a big conversation about it this summer: there's no filter, there's no barrier. You're just supposed to be able to see it, because it's America and everyone has freedom of speech. And so right now, depending on the post, there are these comments on Instagram or Facebook just sitting there, floating there, hurting you and hurting your community, and they're just going to stay there. And again, I don't have an answer for that; it's just really hard. Next slide. I have nothing to say that's going to fix this for anyone, but I have a few framework ideas to reflect on. In terms of boundaries and unplugging, I'm not going to minimize what you can accomplish, at least with your hours. If you're in that role right now, one thing you could do after this session is reevaluate the time that you're spending on it and try to come to some agreement. If you're checking nights and weekends, track those hours; how can those be built into comp time? The internet's not going to stop, so the humans have to. We're museums; we're not solving cancer, we always say that. And so I pity the person who's doing social media for a cancer research facility, I don't know. But you need to have some boundaries, and you need to have boundaries even with your teammates.
And even if you're off, then you have the stress of: but who's covering for me? They're not really trained, right? Two years ago, when I officially got my first unplugged vacation, an hour into it I couldn't help but check, because I'm a masochist, and someone had responded "sorry, smiley face." And I was like, oh my God, I can't leave you alone for a minute. So investing time now, so that you can take time off later, would be great. And have your supervisor and your teammates support those boundaries, and keep a content calendar, so there's no "emergency" now that you should have asked about two weeks ago; plan forward.

Unknown Speaker 46:34
And then in terms of weekly reports, this is sort of an active step you might want to take. What we do at my museum is a weekly report. I'm based in the communications and marketing department, so we just send highlights, and now it's called "Comms in 60 Seconds," so you can have a quick scan of what we're doing in the news, web traffic, web highlights, and social media highlights. That's also a sneaky way to reach other people with data: this is what we're doing, this is how it's awesome, we shared this video clip and it was seen by 30,000 people on Instagram, or whatever. Over time, I think the hope is that people will become a little more familiar with what we do. So, you know, good luck with that. And then I'm also trying to embrace more weekly joy: I need to have structured into my week something that's going to make me really happy and love my job. Again, I know I'm privileged, I have a great position and great content, but I need to do something that makes me feel good and gets me back to that spark of being a new person, when everything's amazing. It could be something big, it could be something little. A really quick fix for me is just going back into Instagram and really talking to people who were posting in the last week, writing full sentences, either in an Instagram direct message or on their photo, and then they're obviously so excited. It's a simple fix that brings back some happiness. And of course, there's your community of peers, everyone here. The Museum Social Media Managers Facebook group is amazing; there are great comments on there if you go. And also just maybe seek out your own people. No one's going to be offended by a cold call or an email saying, "Hi, I really like your work, I would love to talk to you more about what you do."
Everyone feels that compliment, and I think everyone who does this work is really open to chatting more. And then another thought I had: if you're not in the role of social media manager, how can you be a better ambassador for them? If you're managing them, chances are you could already do more; maybe you're all rock stars, but there's always more you could do. If you're in a team meeting and someone's talking about a project, maybe you need to say, no, actually, we should loop in Emily to consult on that hashtag or that strategy. So, things to keep in mind.

Unknown Speaker 48:52
First of all, does anybody need a hug? Because I do, a deep breath thing, and... oh, I'm sorry, I didn't mean to steal your thunder. It's just, having done social media so long ago, and feeling like we have an understanding, and everybody here probably has an understanding because you used to do it but don't do it anymore: it's so tough, and I would not want to be doing it today, the way that social media is today. So huge applause. If you need a hug, let us know.

Unknown Speaker 49:21
We appreciate you.

Unknown Speaker 49:24
Yeah, I actually think it is time. Can we take a second and just do a good two deep breaths, like we did in the opening keynote yesterday?

Unknown Speaker 49:42
Because this is a community and we are here together. And the last speaker was, I think, talking about some really helpful strategies for dealing with stress and the challenges of what we're facing. But there's also an idea in a lot of the current thinking around self-care that it's basically a response to capitalism: the system is pretty messed up, and we're trying to apply individual solutions to a really systemic problem. And I'm going to talk about some bigger systemic problems. I'm going to talk about algorithmic discrimination and machine learning and AI, and how we may or may not be contributing to that. So let's go to the first slide. Some of you may have seen this, but I think it is a very visceral representation of algorithmic discrimination. This is a soap dispenser at Facebook. If anyone cannot see what is happening: the white-skinned hand is able to make the soap dispenser work, and the dark-skinned hand does not make the soap dispenser work. And then when the person with dark skin uses a white piece of paper, voilà, the soap dispenser works again. So we can sort of keep going from there.

Unknown Speaker 51:25
But, you know, algorithms, machine learning, all of those sorts of things are being built with certain assumptions baked into them, based on who's creating them and the things they think of as risks. In fact, with a lot of what we're dealing with now, I think about the kind of gang violence that happens in social media spaces; there is an argument to be made that a lot of that is because these platforms were not created with the safety of participants as a key priority. And Matt mentioned before the way Facebook has been dealing with the US Department of Housing and Urban Development over purposeful discrimination against protected classes. They make moves toward saying "we are no longer enabling this," but of course it still continues. There's been a study at Northeastern University that ran a series of otherwise identical ads with variations in budget, headline, text, and images, and subtle changes to that language changed who the ads would reach, as did the content. So if ads were related to jobs and were about preschool teachers and secretaries, they were shown to women; if they were about janitors and taxi drivers, they tended to be shown in the feeds of a higher proportion of minorities. Ads for sales of homes were shown to white users; ads for home rentals were shown to more people of color. So we're really thinking about complicity in a bigger sense, because we all know museum collections have their own biases, and this is building into some bigger things. Do you want to go to the next slide? We mentioned Google earlier. Many of you may have seen this: the Google Arts & Culture app asked people to take a selfie, and they would get their likeness matched in thousands of artworks. Turns out, though, museum collections have pretty significant biases.
Next slide. So people were finding that if they were non-white, they were not getting a great variety of results, they were not getting accurate results, and the results that did come up often reinforced stereotypes. Women of Asian descent were being shown as geishas, and so on and so forth. So there's a real sense of continuing and reinforcing systematic discrimination. And with the kinds of things that we've been talking about in the sector, around how we better deal with the biases in our collections, part of that becomes: when we're putting our collections online, when we're putting them into social media environments, how will we actually continue to reinforce existing oppressions? So that becomes something for us to think through. So, the next one. There are companies that are realizing this when we're thinking about machine learning and artificial intelligence, and a lot of this now comes down to things like facial recognition software as well. So in January of this year, IBM started to think about those challenges around algorithmic bias: discrimination to do with faces, where faces across gender and across race are not as well recognized, so variations in faces are not as well recognized. And so IBM put out this dataset, called the Diversity in Faces dataset, and they described it as advancing the study of fairness and accuracy in facial recognition technology. They pulled a million human facial images, primarily from Flickr.

Unknown Speaker 55:33
This is something that Kate Crawford and Trevor Paglen say about this: the dataset continues the practice of collecting hundreds of thousands of images of unsuspecting people who'd uploaded pictures to sites like Flickr, but the dataset contains a unique set of categories not previously seen in other face image datasets. The IBM DiF asks whether age, gender, and skin color are truly sufficient in generating a dataset that can ensure fairness and accuracy, and concludes that even more classifications are needed. So they move into truly strange territory, including facial symmetry and skull shapes, to build a complete picture of the face. Researchers claim that the use of craniofacial features is justified because it captures much more granular information about a person's face than gender, age, and skin color alone. The paper accompanying the dataset specifically highlights prior work done to show that skin color itself is a weak predictor of race. But this begs the question of why moving to skull shapes is appropriate. Although on Flickr these were all images available under licenses saying they could be used, when people put them online ten or fifteen years ago, they were not necessarily thinking that their faces were going to be used to train surveillance technologies. When we ask our visitors to put images of their faces online, when we encourage museum selfies, we are potentially encouraging people to train the things that are going to surveil them. When we take photographs of our visitors and put them online, we are continuing to create systems that are going to be better trained for facial recognition. Next slide. I'm going to just play the first few seconds of this; this is some work that Kate Crawford and Trevor Paglen have done.

Unknown Speaker 57:32
We can't hear it, so we can pause it there. This is an exhibition that Kate and Trevor have recently opened; it's called "Training Humans." If we go to the next slide: they've been looking at how machine learning algorithms are trained on human faces and the kinds of taxonomies that are applied to humans. If we go to the next slide: and then we've got companies like IBM using footage from NYPD CCTV cameras to develop surveillance technologies that can search for individuals based on bodily characteristics such as age and skin tone. Let's go to the next page. This is where it comes back to social media, though. We have surveillance technologies that people are using in their homes, like Ring, and then apps like Nextdoor (go to the next page) that are basically being used as places for increased racial profiling. For people who can't read it, it says: "Does anyone here use the Nextdoor app? I'm working on a story and have some questions." Someone answers, "I used it years ago, until all the times I had to remind folks they were calling the cops on legal behavior just got too much." One of the quotes is "black youth hanging out next door, calling the cops, he's casing," and then the response is, "Susan, he lives there. Also, hanging out isn't a crime." So the systems that we are creating are reinforcing systems of oppression, systems of violence, and systems of racism. So my series of questions, and they in no way diminish the seriousness of what we've been talking about, is: how are we contributing to recreating and reinforcing racist and oppressive systems as institutions? How are we using our collections? How are we participating in these social technologies and social spaces? How are we contributing to recreating and reinforcing racist and oppressive systems as individuals?
How do we reframe the way we think about risk, so that we're prioritizing risks to people over risks to our institutions? I think it becomes a different motivating question when we think about how to make sure that no one is going to be harmed by our actions, as opposed to worrying that we'll miss out on some people being aware of what we're doing. It shifts the conversation. Is it ethical of us to post photos of our visitors? Is it ethical of us to encourage visitors to post photos of themselves? And what actions can we take to dismantle surveillance capitalism? So I have no easy answers, but these are the things I'm thinking about right now.

Unknown Speaker 1:00:06
Thanks. Big questions. Thanks. So you can go to the next slide. I'd also ask, similar to Dana: you can tweet, but maybe just don't mention my museum, for the sake of not triggering my former colleagues, and you'll see why in a second. This is pretty fresh; it happened earlier this year. As I mentioned earlier, I previously worked at the Children's Museum of Indianapolis. It's a pretty large museum; they have a 130,000-object collection, which contains significant pop culture objects, including a number of Michael Jackson objects. Here on the left you can see, from one of our exhibits (it's not actually a video, it's a screengrab from our Facebook Live), where we were proudly showing his glove and fedora. In 2017 we opened an American Pop exhibit where these are featured, and we later added his jacket to the exhibit as well. On the right is our exhibit "The Power of Children," which features the story of Ryan White, the Indiana boy known for sharing his struggle with AIDS; many celebrities supported him, including Michael Jackson. That's his mother, Jeanne, there in the spotlight in the middle; she's sharing her story, which she often does. And you can see a life-sized poster of Michael Jackson there on the right. There are a number of objects related to Michael Jackson in the recreation of Ryan's room; she donated the entirety of his bedroom to the museum, and it's recreated completely in the museum, permanently on display. So, you can go ahead. When the documentaries came out earlier this year, bringing the accusations of child sexual abuse back into public attention, I was one of three staff who suggested we proactively remove the objects from display. Our CEO quickly agreed, and they were hastily removed, with the exception of one single object in Ryan's room, and that was for the sake of maintaining the integrity of the recreation of his room.
So the big giant poster stayed up, because that was added later, but there was one object that remained, and it was this little picture of Michael and his chimp. So that was still kept up. We have this ongoing relationship with his mother, Jeanne, and we did check with her, simultaneously, as these were literally being ripped from the walls, and she did understand why we were doing this. She had a very close relationship with Michael Jackson as well, so we were concerned what she might think, but thankfully she was pretty understanding. That same day, the Indy Star, which is the major newspaper in the city, contacted us and asked, "Do you have any Michael Jackson objects?" We did tell them we had just taken them down, so they were no longer on display. They pretty quickly went to our director of collections and asked for a quote. Things moved pretty fast, so we did not follow our usual process for talking points, and he provided an unfortunate quote, which went viral pretty quickly. That's what this says up here: "Obviously, we want to put stories in front of our visitors, showing people of high character." This was the main social post the Indy Star put up. And in a matter of days, things escalated really quickly, leading to things like a petition. You can go ahead and go to the next slide. In those days in between, every major media publication ran the story, and fans from around the world descended upon us and our social team. For a week we implemented our crisis communications plan (we have a robust crisis communications plan), but there was only so much that we could do. We followed the impact; we followed every major post and kept track of everything, and we decided that we would respond only to grossly inaccurate comments, things that were just plain wrong.
So if they said that Michael Jackson donated to the museum, which is not true (it was a misconception), or if they said something completely inaccurate about Ryan White, we would respond and give them the true facts. But of course we would not otherwise engage. And then, on top of that, our customer service team was receiving a flood of phone calls and emails in addition to all of this. Go ahead, and

Unknown Speaker 1:04:59
so we had no idea that what we had walked into was an organized social media fan campaign that attacks companies who come out against Michael Jackson, and they were ready to pounce on us the second this happened. Our reviews and our net promoter score just tanked. You can see here that our Facebook rating went down 0.5 points, from 4.7 to 4.2, which is really significant for our social media managers. Our net promoter score went down 28 points from 82, which is really huge for us. Over 30 major media outlets covered the story. We actually had local police prevent some local fans from picketing outside our museum, which... maybe there are some museums that get that, but the Children's Museum doesn't get that, so that was a pretty big deal for us. And then we just had a ton of shares and retweets, in the range of thousands. One of the main things as well was that our mental health was really impacted throughout this entire week. I was attacked in my personal DMs on Facebook; people were able to find my name and come after me. Our director of collections had to completely shut down his Twitter account, because he was attacked as well. Go ahead, you can. At the same time, we were distracted by our museum leadership and the board waffling on our decision. They were really distracted by the response on social media, and while we were trying to manage the reaction, we were also wanting to use our expertise as social media managers to explain to them who these people making this outcry were, why they do or don't matter, and why we should stay the course. So it was a really stressful time. Ultimately, the Children's Museum's mission is about children and families, and these documentaries resurfaced this new, triggering context that caused these objects to become charged in a new way.
And that's what was so important in this context: we had to protect our audience and our mission. The objects no longer held the same narrative context they were intended to convey, and that's what we were trying to explain to the leadership at this time, when they were just struck with fear. So we did stick to our museum's mission, in spite of the optics and in spite of the detriment to the brand on social media. We had to come back from it, but ultimately I'm really glad that we stuck to it. You can go ahead and press it one more time. And I'm really glad that our mission won out, rather than the bullies of the internet. I think the big question here is that there are a lot of situations where you won't please everyone, and on social media this is amplified. Sometimes this is just in a meeting or in a boardroom, but when it has to be posted on social media, you get the trolls, and then you have your partners, your brand, your donors, and then you have your mission. So where's that gray area, and where do you make those decisions? We have to always be thinking about that.

Unknown Speaker 1:08:28
I'm not gonna ask you to breathe in, I'm gonna ask you to breathe out. That ends the sharing of very intense stories from the panel. So now it's your turn to share your intense stories and your solutions. We thought we'd do about 30 minutes of small group and then roughly 30 minutes of report-out. I think we still have another hour or so. Okay. So here's what we thought we would do — you tell me if this sounds good. We thought we could probably do up to four or five smaller groups in here, and we could have people shout out what they want to talk about. We could write them on a big sticky — Hilary Morgan, yeah. So we'll just give them numbers, and once we hit four or five, then we'll stop. And of course, you can still try and find other people or convince them to talk about the thing you want to talk about, but that seems like a good first stab, if that sounds okay to everybody. I will give the microphone to whomever would like to pitch the topic they want to dive deeper on — or sneak out of the room, which is also fine. Okay, the other thing we could do is just Q&A and chat, since it's not a huge room. Would you rather do that? Let's vote: do you want to do small group discussion, or do you want to do Q&A and discussion as a group, and maybe end a little early? Okay, so who has a question or a comment?

Unknown Speaker 1:10:11
I'm not a social media manager — I'm an imaging person, head of imaging, so I supply content to our social media team. But during the keynote presentation, they were talking about the English culture ministry partnering with Fortnite, which is a game that basically is training young children to become the consumers of the future — basically mindless shopping — and is gathering a lot of data about children and children's behavior. It's basically, you know, programming them. And I was wondering — there are a lot of these kinds of partnerships between museums and tech companies, Google being one of them, where you had mentioned that museums have absolutely no control. A lot of museums have let them digitize their collections; the images the museum gets back are of questionable quality, and then they have no control over that data after it's put out. So I was sort of curious about that. And also about our visitors being used to train AI. We have software, for example, that helps us label event photography, and it's essentially using data from our board, our visitors, our donors, that is then gathered by these organizations — which, you know, help us tag photos, but they are taking that data and doing things with it that we don't even know. So I was kind of curious about your comments on that.

Unknown Speaker 1:11:45
Actually, we didn't play the long video, so I would recommend the long video, because this is a major part of what Mike talked about: the big concern over third-party platforms. And like Mike, I've become increasingly concerned about it — we all have. I had mentioned COPPA earlier; the FTC is actually having conversations now about making it stronger, or maybe making it weaker, so there's a lot of debate. If you Google it, you'll find recent articles just within the last month. The FTC fined — I think it was YouTube — a huge fine for breaking a couple of the rules. So they're trying. The big thing now with YouTube, I think, is targeted ads, and they're targeting young children. They're gathering data about children who use YouTube and then targeting them with ads, and they're a really vulnerable population who, you know, sometimes have mom's credit card or other ways of purchasing things. So all of this is a huge concern. I don't have any good answers, and I'm sure Mike Edson would have tons to say about it, and probably does in his video, in terms of how we deal with it. But I think one thing is having those partnerships brokered very carefully, so that we have ways out and we have our data retained in our own repositories as well as with them. I can speak from the Smithsonian side: the relationship with Google has gotten better in terms of getting hold of the content that we actually generate for ourselves. That Street View example — they did do that in a proprietary technology, and there was no way to get that data out. But in other types of interactions with Google, we have gotten the data: they've digitized our collection, and then we got a copy of it.
But it still is fraught with concerns about where that data ends up, how it's used, and how it's used to monetize for Google secondhand, thirdhand. So those are all big issues. I don't know if anybody wants to add.

Unknown Speaker 1:14:01
I think one thing to think about in any organization is who's making the decisions. So often, our partnerships with other companies happen as a consequence of someone in development making a connection at a cocktail party and seeing dollar signs. And they don't ask the same questions that someone with a little more technical expertise might ask, or that someone in a different department might wonder about — and not just to pick on development; this can happen all over the place, with different people in different departments going out, making connections with different partners, and looking for opportunities to engage audiences or do any of the great work that we want to do. But is there some central place in your organization where all these partnership deals have to be funneled through and approved before they actually happen? It's surprising how many institutions have nothing like that — there is no advisory board, there is no oversight. And maybe the first step is just pushing for that organizational change to happen, so that we can make better and more informed decisions about these things.

Unknown Speaker 1:15:09
Anything to add? Next question or comment? Did I see hands? We'll go here first and then over here.

Unknown Speaker 1:15:25
One of you mentioned having a crisis plan in place, and the Michael Jackson story really resonated with me, because I'm sure my museum is not the only one that has work by people whose personal lives are problematic in some way. As someone who's relatively new to the museum world, I find myself, you know, Googling these artists to try to make sure there's not anything problematic on the radar already. So I was curious about the crisis plan — or really, for any of you who have dealt with those sorts of PR things: have they just naturally run their course? Have those things just died down? I'm sure it all varies, but I guess I'm just curious about the crisis plan, and whether there is a graceful way to deal with it and then move on to the next thing.

Unknown Speaker 1:16:34
This is apparently on, but just not working very well — can you hear me okay? Yeah. So, as I said, we do have a pretty robust crisis communications plan, and part of it is existing templates for pretty much every really horrible scenario you could ever imagine, which is really sad. But they're templated press releases, essentially, where you fill in the blanks. And as things come up, we're like, oh, we need a template for that. For one of them, we had opened a really large outdoor sports experience with sports legends, and we were thinking about what bad things these sports figures could get into — so we had already been considering the moral issues around, you know, humans. We'd also already had a partnership with Subway — think about Jared Fogle for a second — and we're the Children's Museum. So we'd been down that road before. And I think it's just a matter of, if you have the capacity, thinking ahead: putting your brain there and getting a group together who are good with words. It's good to be ready and have a plan for social — have a backup to the backup of who's going to be monitoring, who's going to be putting the press release out. Just having that terminology ready means you're not caught off guard, and you don't say the wrong thing in the heat of the moment, because you need to be ready to say that nuanced thing that sounds human — just like what Dana did as well. You need to own up to the right thing. It might not be the exact situation you planned for, but you can at least have as much of the structure in place as possible and know who needs to be where. And you also have it in your mind: this is crisis comms mode. You're just in it, you know this is it, and people just act — super team, go. So it really is very powerful.

Unknown Speaker 1:18:50
I want to acknowledge the expertise in the room — I would love other people to add on to that. I would just also say that I've now lived through several times where social media itself becomes the story, right? So it's not even always an external thing; something that you or someone else did on social is now the story. I feel like the process for that is not as well defined in a lot of institutions, because that's still a fairly new thing. Did you want to add on to that, or did you have another? Okay — does anybody else want to add on to the crisis comms question? Anybody else dealt with artists in particular? Okay.

Unknown Speaker 1:19:23
I'm curious how many institutions represented in this room have diversity catalysts, or DEI people, who you can utilize in times of a crisis like that. Do most of you have diversity people? I guess my question is, how many actually have a fully functioning diversity catalyst? We had one, and now we don't, and we're having a hard time getting the institution to reinstate one. I'm in the Carnegie — we have four museums — and we are individually trying to strategize and create our own teams. But the times we've had these types of emergencies are exactly when we needed them. And now that we're back to not having someone, they're relying on people like myself, who are of color and who have collections of color; they're running things by us, and we'll give our opinion. But, as I think one of you said, we're not being compensated for that extra work. We may be quoted or utilized by our institution — "so-and-so said this, and she knows" — and it puts a lot of pressure on us as people who are trying to do the right thing but can't get this across, because we do need a team, and we do need people who are experts in that. And I've heard — a little birdie told me — that AAM may have something coming down the pike, that all institutions like museums will be required to have them. Especially because social media is so fast and so immediate, and does so much damage so fast, we almost need a council to say, you know, read this really fast. So that was my question-comment: the community could help just by having people to go to.

Unknown Speaker 1:21:36
And actually, to speak to that — I didn't get to that entire part of my story, that we were trying to find experts to provide us insight to go to our board with, to help decide what to do after we'd made the initial decision, when there was the waffling. There was this big back and forth of how do we find the experts, because one of the questions was around race and one was around, you know, sexual abuse. So we were trying to find experts on each of these things to get quotes and back up our decision. Having something like that would have been really helpful, because we just didn't know where to turn.

Unknown Speaker 1:22:16
Oh, and I briefly mentioned this, but we do have a diversity and inclusion committee and a person who's in charge of that. Unfortunately, in the incident that I described, there was a big disagreement between the marketing team and the DEI folks about what to do. And after that, what's happened is that members of the social team have now joined that committee to try and help build that bridge — it's a more long-term collaboration, not just a crisis collaboration. So that was definitely a learning for the institution. And there were definitely some bridges burned, both with that committee and with the staff in general, who felt embarrassed about what happened. I think — Claire, you?

Unknown Speaker 1:22:59
I don't have a question; I just wanted to share something that I thought would be useful for the group. In New York, we have a very active museum social media managers meetup, and last year we did a meetup at Twitter. This is now an infamous story. We posed the question to our Twitter rep: the platform is all well and good, but cultural institutions have a huge challenge with figuring out how we can activate Twitter in a way that's not vitriolic, doesn't get political, etc. What are your ideas for how we can use the platform effectively? And basically, her answer was like, oh, we love the controversy — lean into it, clap back, get into that space. And obviously we were all horrified, and Brittany and I in particular were very vocal about that. So it was very disheartening to see the platform itself really not have an understanding that bad PR is not something cultural institutions want to lean into. And then, related: I also recently met with someone from Pinterest and asked, what's the dark side of Pinterest? The Met has a huge collection of arms and armor, so if we post a picture of a 19th-century firearm, is their algorithm going to push that to a crazy gun nut? What does that say — how do they deal with that? And when I said, what's the dark side of Pinterest, he said, wow, I never thought about that, let me get back to you. So it's challenging that a lot of them are really not getting it. That's tough. TBD? Yeah — haven't heard back.

Unknown Speaker 1:24:44
This isn't a question — I'm doing that annoying non-question panel thing. But something that I've been thinking about increasingly over the past year is: a lot of times I'm in a meeting and we're talking about what could go wrong, how this could harm us, and how we need to have a plan in place — a crisis communication plan, or talking points, or whatever. And I keep saying, if there's a possibility that we could be harmed, does that mean that first we're doing harm? And that probably is the case — so why are we doing this? That question doesn't go over very well with curators who have a very specific point of view and may only want to talk about one specific point of view, especially with an exhibition. If you want to talk to me afterward, I can tell you a couple of specific examples. So I guess maybe my question is: maybe there's a better way for me to present that concern — what is the context that we're either not presenting, or that could come up, that could be harming our audiences? And what is our responsibility to that?

Unknown Speaker 1:26:10
A comment — helpful, not helpful, I don't know. Not to be so negative, but to what you just said: you're asking, how can I frame this better, what context could I add? It's probably not you. You're probably saying the real right words, and they're just not listening — getting back to who we are in this role, right? Here's an example, anonymous-ish: we're going to do an exhibit of this artist; this artist has said terrible things about women; this artist continues to say terrible things about women; yet you're still going to open that exhibit. The minute we promote this exhibit, this is the criticism that will follow — and you're still going to have a solo show about an artist who thinks women can't create. And then we open the show, we promote the show, everything still happens. Everyone's unhappy with the criticism, everyone's unhappy with the donors and the situation. We all knew it was happening, we all knew it was going to happen, all the things were said. So then what is it? The market and money, right, art museums?

Unknown Speaker 1:27:18
Thank you, everybody, for this panel. I'm struck by how alone you all seem in your jobs, and by the fact that a lot of the solutions that have been proposed for this sort of heartbreak of the work are largely on you to do, as opposed to structural solutions — to Susan's point about individual responses to systemic problems. I'm wondering if you're aware of museums that have attempted to address this problem in a more proactive way, or, failing that — because I'm going to guess the answer is no — what can you imagine a structural solution to that problem would look like? Is it rotating social media managers on and off duty on a regular basis? What would make the job less, obviously, emotionally taxing?

Unknown Speaker 1:28:13
I had a moment of clarity when I was writing about this. I get so frustrated when people say, oh, you should give yourself more time for self-care — they put it on you. But why doesn't the workplace help you have time for that? I feel like, for any job: why can't it be HR helping with that? There's already bias, or, you know, a stigma around mental health. So I just wish that could exist in general, across the entirety of the workforce. I feel like the big giant answer would be that it's not a "mental health day" as a kind of jokey thing — it's, no, you get to have self-care, and your workplace helps you do that proactively, rather than it being on you to create the space when you also have kids and hobbies. And you're a social media manager, so really, you don't have hobbies.

Unknown Speaker 1:29:22
So your hobby is checking the page. Right, exactly. So

Unknown Speaker 1:29:25
That was kind of my aha moment: maybe in 25 years or so there might be proactive self-care solutions in the workplace.

Unknown Speaker 1:29:34
Just a really small one: someone was asking recently on Twitter, if you wanted to deliberately build self-care provisions into a job contract, what would you do? Back in Australia, I used to work at a university that had a really robust flex-time system. You had a set number of hours you were meant to work in a week — 35 or 40, whatever it was — and you logged your hours, but not in a creepy way; you did it manually. And if you worked three hours over one day, you got three hours back at another time, so you really could control the way your time was administered, and that was really helpful. So think about it: if you have 40 hours in your week, but you know you're going to be on social media for an hour every single night, that's actively counted as part of your hours, and you get that time back in another space. I think there are provisions like flex time that you could make arguments for when you're negotiating a job contract — provisions that recognize the work you actually do, rather than it being unspoken, silent work that you're expected to do but not rewarded for. That could help treat this as not just an individual solution. The other thing with this flex-time system: you could earn up to three days owed that you could then take as days off, so if you had to get to a doctor's appointment, you could pick those days to do that. You could also slightly oversubscribe — use slightly more time than you had earned, knowing that you would pay it back. So it made it really flexible for people who had kids, people who had medical needs, all those sorts of things. I think that model would probably work really well for social media managers.

Unknown Speaker 1:31:31
When we started out, how many of you said you supervise or oversee social media? So it's a little bit on you, too. I'll just say, having been a social media manager and then been promoted up: definitely give my staff praise and make sure everybody knows the good work they're doing, but also talk to my boss and the CMO and the director about how hard this work is, recognize the challenges, and put the staff in front of the rest of the staff, or in front of leadership, to talk about what's going great and what's really hard. Because I don't think people understand unless they've done it — to Hillary Morgan's point, people think it's very easy. They don't understand what goes into it; they don't necessarily understand what's going on with mental health, or that you feel like you need to save the world and all you have is a Twitter account to do it with. So I think it's on all of us, as colleagues of our social media staff, to make sure that work is recognized — the invisible labor is recognized — and to talk about what's hard, in addition to structural policies like flex time, which are great. Were you going to add on to that?

Unknown Speaker 1:32:38
I'm going to try to be concise, but now I have so many thoughts. First and foremost, having been in the field for about a decade, I have seen some improvement, so I'm a little optimistic. We have been doing a lot of work to self-advocate and show our work — the importance of what we do, our expertise, and what it requires — and we've earned a lot of respect over the years, I feel. Social media in general has become more of a standard in our society, so it is a little more respected than it was 10 years ago, when it was like, "just tweet it," you know. And then I wanted to use this opportunity to plug our 4:30 session on burnout, because we'll be talking about a lot of the same things and elaborating on them. I think self-care is a huge part of the solution, but as was pointed out, there are probably some structures we can put into place to start to solve this systemic problem. And I'll just leave it at that for now.

Unknown Speaker 1:33:56
Just to echo what Dana said, it's up to the managers as well, and the people above them, to make sure that folks are taken care of and have options. But I would also say that this is a bigger problem writ large for digital, and for the lack of respect for the digital audience that exists within our organizations. One of the things I've been really fighting hard for — as well as advocating for the individual staff members and what they're doing — is to really try and raise the profile of the audiences that we're serving: to say that the online visitor is just as important, just as human, just as deserving of our attention and good customer service and a good experience, as every physical visitor who walks in the building. Out of sight, out of mind has always been an issue, and I think the lack of respect for the audience actually feeds into the lack of respect for the role. But that's just my theory.

Unknown Speaker 1:34:54
A lot of the issues that you're bringing up — sorry, can you hold on? Yeah — a lot of the issues that you're bringing up in terms of respect and being included at the table, that's across the board for women in technology. I think if you talked to managers who are higher up in their field at this conference, they would echo many of the same concerns that you have; I don't think it's specifically unique to you. And that's a big issue, and something this conference should really address more frequently, in terms of pay equality, hours, et cetera, because a lot of these cultural heritage institutions are powered by women, and we're oftentimes not given the respect that we're due. And to Susan's point, a lot of the things that you mentioned are available through hourly contracts, as opposed to a full-time contract. Maybe for social media, where somebody has to be basically on call, having an hourly contract is helpful: you log in, you're at work. Oftentimes museums don't want to pay overtime, so they'll become much more conscious of the time that you're spending, and you could potentially have the flexibility, through an hourly contract, to set your hours around the nine to five. So that's something that could be advocated for. Not all museums offer that option, but it's something that could potentially be negotiated, if it makes sense.

Unknown Speaker 1:36:38
So I would love to take this opportunity to get all of your help on a piece in an exhibition we have coming up in the spring, about which I have some personal consternation. I don't have too many details about the exhibition as a whole at this point, but one of the pieces is a video that depicts the slaughter of a chicken — it's related to cultural practice, and apparently it's done in as humane a way as possible. But I would love to hear what you all think. I foresee concerns from the animal rights community, of course, and from other visitors who might find the content upsetting. And of course there are many, many different perspectives to consider. So, anticipating people who might be upset, I'd love to hear what you would recommend — in particular, should we get ahead of the issue and talk about it on social media in advance of the exhibition's opening, or not draw attention to it? Like we were talking about, sometimes drawing attention to something might end up making a mountain out of a molehill. It's definitely a really important thing. But yeah.

Unknown Speaker 1:38:25
Interesting. We have so many questions — like, did the artist slaughter it just for the presentation, or was it acquired and then utilized as part of the presentation?

Unknown Speaker 1:38:37
My understanding is that the video was filmed for the installation, but that the chicken was then used in a culturally related meal. And yeah — I wish I knew. It would be helpful to have more details on the actual piece.

Unknown Speaker 1:38:58
It's really interesting that the animal cruelty act just made it through the Senate — it's good timing.

Unknown Speaker 1:39:13
I know that our curator, who listens and reaches decisions carefully, is herself a vegetarian, and I really believe in her and in the consideration she's given it. But I did hear, in discussions about it, someone say, "Oh, cool." And I found that — I was like, no, visitors might not appreciate the purpose of it.

Unknown Speaker 1:39:48
I don't know — my first thought, just off the top of my head, is to have an internal focus group of staff talk about it, because it sounds like you have concerns, and there are probably other staff that have concerns too. Talk with the curator about it, and then —

Unknown Speaker 1:40:00
Have you talked to other museums that have shown this artist's work, and how their social teams dealt with it? I would recommend that.

Unknown Speaker 1:40:08
And then external focus groups — you know, even your social media influencers, if you have some well-known people that you connect with — to get their thoughts on it, basically interacting with your community on it before it gets out there.

Unknown Speaker 1:40:22
I would definitely be prepared. I don't think you could have too much content in preparation — that's a good thing, and you may never have to use it. But I would not not do it — that would be my recommendation.

Unknown Speaker 1:40:34
Yeah, a small addition, which is not at all about social media, but about figuring out how you can give visitors a way of deciding whether this is or is not something they want to encounter — and that would probably be part of your social media plan, too. You obviously won't necessarily have influence on the exhibit design, but think about where the piece is positioned, and about things like warning labels, so that you're not just preparing people digitally but giving people on site the choice — whether they would find it personally difficult, whether they don't want their children to see it. Again, I've been thinking a lot about risk and how we assess risk, and I think risk to visitors includes kinds of retraumatization and those sorts of things. The more you can provide people ways to control their experience, the better prepared you are, I think.

Unknown Speaker 1:41:37
Because then that prevents a negative experience for them that might then show up on social media and cause trouble for you.

Unknown Speaker 1:41:45
Yeah — I would amend my recommendation to also talk to the guest experience or floor staff of other institutions that have shown it, because they may know even more. Even if it never bubbled to the surface on social, the people on the ground would know how people reacted.

Unknown Speaker 1:42:01
Yeah, I definitely echo everything everyone said, but also, for this specific piece: reach out to the comms team at the Guggenheim. They had an incident with an artwork and lots of comments online, and I believe they actually ended up not showing the artwork — it was going to be in an exhibition of contemporary Chinese artists. So I would check in with them.

Unknown Speaker 1:42:31
So my comment-slash-question is actually more about these third-party tools, but in response to the question about whether there are other models for these kinds of things — is that okay? Okay. So I can think of a couple of models where some of these partnerships with tech, and the surveillance kinds of things, have come in, and there are examples from both libraries and archives where those communities got together to combat them. The first would be when Google came knocking at the universities to digitize all their books: to make sure that they had their own data repository and weren't relying on Google, the universities didn't just individually store that data — they came up with HathiTrust, and they all went in on it together, and it became a whole research data set and all of that kind of stuff. So that's something to think about: in a very real way, pooling resources among museums as an alternative — maybe not so much about ethics as about preservation, and about faith in whether those companies will actually take that data in the right direction. The other thing I can think of is when archivists started talking about doing social media and web archiving, especially in response to contemporary events like the Ferguson protests. There's a great project out there called Documenting the Now that built its own tools, with its own ethical bent — to get at some of the stuff Susie was talking about: to not rely on the platforms to be the tools, and to try to remake similar technical tools that did not reproduce the same systems of surveillance. So there are these things that happen in those communities.
And it makes me think that, for some of these third-party platforms, it's like the earlier session I went to on open source: we just don't have the same kind of pooled technical effort going on here. But it doesn't always have to be technical. It could just be shared communities of practice, or an agreement that, as museums, we aren't going to use Google Analytics in these four ways, or something like that. So I guess I'm wondering if there are spaces we could operate in, even without these huge universities, to replicate these tools on our own terms, where we do opt out as a community and then build what is appropriate for us, without every museum doing its own thing, because that is wildly unsustainable, right?
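[Editor's note: as a concrete illustration of the kind of community-built tooling described above, here is a minimal, purely hypothetical sketch. It is not code from Documenting the Now or any real project; the field names, the agreed redaction policy, and the `redact_post` helper are all invented for this example. The idea is simply that a shared community of practice could encode its privacy norms in small, maintainable tools, stripping identifying metadata from an exported archive before a dataset is shared.]

```python
import json

# Fields a community of practice might agree to drop before sharing
# an archived dataset (a hypothetical policy, not any project's actual one).
SENSITIVE_FIELDS = {"user_id", "screen_name", "geo", "ip_address"}

def redact_post(post: dict) -> dict:
    """Return a copy of the post with the agreed-upon sensitive fields removed."""
    return {k: v for k, v in post.items() if k not in SENSITIVE_FIELDS}

def redact_archive(lines):
    """Redact an iterable of JSONL records, yielding cleaned JSON strings."""
    for line in lines:
        yield json.dumps(redact_post(json.loads(line)))

if __name__ == "__main__":
    # A one-record archive: only the "text" field survives redaction.
    raw = ['{"text": "Great exhibit!", "user_id": 42, "geo": [36.6, -121.9]}']
    for cleaned in redact_archive(raw):
        print(cleaned)
```

The point is less the code than the governance: the `SENSITIVE_FIELDS` set is where a shared agreement ("we aren't going to retain these four things") would live, versioned and maintained collectively rather than by each museum alone.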

Unknown Speaker 1:45:23
You're basically talking about collective bargaining, but instead of unionization, we're talking about how we would do collective bargaining with the platforms. I think that's a really great and interesting conversation, and if anyone wants to talk about that, let's get together in a group and have a conversation about how that might work. Because that is actually one thing: we've got people here on this panel, most of whom are from pretty large institutions, or institutions with power. I've worked at small to midsize institutions before I got to GW. At a very small institution I worked at, with ten and a half staff, we did not have any power to get contact with a Facebook rep; that was just not conceivable for us. So it's also about thinking about where we deploy the power of our bigger institutions and the relationships they have, to then help bring all of our small institutions along. Because there are, what, 30,000-some institutions in the US, and individually there is no power for most of them; but if they were all together in a group, that becomes a really different way of navigating some of these conversations. It might bring a little bit more power to those conversations, and we could work together on collectively solving things rather than individually solving them.

Unknown Speaker 1:47:00
I would add to Suse's answer: if you have access to development resources at your institution, or if you have contractors you work with regularly, ask them critical questions about the tools they're using, and ask them critical questions about alternatives that might exist. So frequently we default to the industry best practices and the tools everybody else is using, without even looking to see if there's something other than Google out there, and a lot of times there is. And if there are any technical people in the room right now, any other developers, and you're interested in maybe working together on some of these projects: I'm in the process of evaluating a lot of open-source alternatives to these things right now, for my institution's purposes. There are already existing projects out there that might not be what we want now, but they're open source, and we could make them what we want with collective action between us. So we don't just have to collectively bargain; we can collectively build, too.

Unknown Speaker 1:48:02
I also just want to make a small addition to Matt's point. A while ago, I was on a granting panel, doing some grant assessments, and there's a whole section in grant assessments about privacy and how you're going to look after participant privacy. Someone said, well, this is what our privacy policy is, but they were also partnering with another organization, and that other organization clearly had different privacy expectations. So there's this idea of interrogating who your vendors are, who your partners are, and what their values are. COVID wrote a really great piece, early this year or late last year, about partnerships and thinking about the values of the partners we're working with. Google's values and Facebook's values, etc., are going to be different from museum values, and we need to be thinking about whose values we are prioritizing, whose values we are buying into, and ways we can interrogate some of those things.

Unknown Speaker 1:49:08
I almost forgot what I was going to say, but now I want to say how grateful I am that we are such a collaborative community and that we do support each other as social media managers. Having been in house at two museums, and now transitioning to freelance consulting, I think I, and probably Laurie and other fellow consultants, have a unique position: we can see things from the outside, backed by our experience in house. My goal personally is to help museums solve a lot of these problems and to help support the social media manager who's doing their thing on their own. I'm just thinking bigger picture: how can I help enact this change and collaborate with all of you to work on this problem?

Unknown Speaker 1:50:01
let's just

Unknown Speaker 1:50:05
don't forget about the consultants.

Unknown Speaker 1:50:09
Is anyone here on the social media SIG? Would this not be a good... Is everybody in the social media SIG? Do you know what I'm talking about? Aren't you the former chair? Why don't you tell the group? Because this seems like a good platform for continuing the discussion. Sorry to put you on the spot, Amy.

Unknown Speaker 1:50:27
About what the SIG is, or? Oh, Alexis was in here, but she's the chair. Who else is out here? I would prefer you to apologize.

Unknown Speaker 1:50:42
Yes, please sign up for the social media SIG. Like many of the other SIGs, we try our best to hold monthly discussions, monthly calls. And then we have our Basecamp, where we can have discussions just like this. So I invite everybody to throw your questions out there on Basecamp and start up a thread to get things going. Sometimes things get really lively, but more often they do not, so the more discussion we can have in there, I think the better. So please, if you're not part of the social media SIG Basecamp, just let me know; I'll be happy to add you, and we can look forward to having you talk more on Basecamp.

Unknown Speaker 1:51:27
Thanks for that. All right, was there another question?

Unknown Speaker 1:51:33
So, because I'm in imaging, we're actually part of a consortium with other museums, and there are a number of open-source software packages that we use, for photogrammetry and other things. But the problem we've found is that, while open source is a great solution because it targets museum needs specifically, there's generally not the funding behind it that these private companies have. So an open-source solution gets developed, it works, and then the person who developed it, or was sort of babysitting it, leaves their position or moves on. So I think open source would require a paradigm change within museums: to collectively get together, decide what open-source solutions we need, and then figure out a long-term maintenance plan for how we can keep those things going. Because it's not enough to just start something up and get it working if, within three years, it falls apart again because the software is not being updated and there's no innovation going on. So I think it's a really great solution, but it needs to go to the next level. Yeah.

Unknown Speaker 1:52:46
Definitely. And this is a problem we have; like you say, you've run into it with digital imaging, but we've run into it with CRM and ticketing. We've had these conversations for years about how we can better utilize open source, and I think Greg Albers had an entire session about it this year. You're right that that paradigm change does have to happen; there has to be some staff time set aside for aiding in the maintenance of these projects. But I think if we get enough of us talking about it, maybe we could make that happen, at least in enough of the larger institutions that they could support the use by the other institutions. We don't have to have the entire sector shift; we just need enough manpower from a few institutions that are willing to do it, to keep some of these projects going.

Unknown Speaker 1:53:33
Sherry, you're gonna be the last person to get the mic. Just setting expectations.

Unknown Speaker 1:53:43
I mean, I guess that's what I was suggesting: for us to be serious about opting out, and really rethinking the ethics of our involvement with these companies, it would actually demand a paradigm shift, even within the museum. I think about the fact that I don't know that this work is valued, even at the upper levels, as contributing to the museum field in the same way that, say, a curator's research contributes to the historical field, right? So there are lots of staffing issues and things there. But when I worked in academic libraries, if you worked at a Stanford or a UC or a University of Michigan, you were promoted based on what you contributed to the open-source university and library community, right? So it takes saying, no, we need to actually opt out, this is what's best for the health of our sector, and then putting the paradigm behind it, you know.

Unknown Speaker 1:54:47
So I think my follow-up to a paradigm shift and that opting-out concept is: how do we get the audiences to come along to wherever we're changing to? We are on these platforms somewhat out of necessity, to be able to reach the audiences where they are. How would we convince them of the importance of having a separate platform for museums? It's a question that leads to millions more questions. Well, okay, I want to say that the discontent with

Unknown Speaker 1:55:26
social media amongst the populace is growing, in terms of the privacy issues, and that as a sector, we need to get better at SEO and the things that people are using when they are finding content.

Unknown Speaker 1:55:42
Some of those could be solved for us, though. For example, Twitter just banned political advertising, and that includes issue advertising, which I guess covers aquariums, or our issues. So now we can't advertise on Twitter, which we weren't doing anyway, but I'll be curious to see if Facebook follows. Some of these questions could potentially be solved that way, but we can also start with: maybe we don't pay to boost our posts. Maybe we don't give them the money. We're still giving them our attention and our awesome content, but maybe we start somewhere, with not funding their business model. So thank you all for coming.

Unknown Speaker 1:56:24
If you want to watch Mike Hudson's video, it is listed in the slides and all of his slides are in there as well. So if you feel like diving even deeper, it's all there for you to watch and read at your leisure.

Unknown Speaker 1:56:38
Thank you all for coming. This is fantastic.