- Welcome, everyone. I'm Kristen Olson, Vice President of Learning and Thought Leadership at the Tessitura Network. I use the pronouns she and her. I'm a White person in my 50s with short, dark, curly hair. I'm wearing glasses and a gray sweater. On the wall behind me hangs an abstract painting in blue and green brushstrokes. In today's DEAI learning series, we invite you to explore how you might recognize and address bias in the data we collect, analyze, share, and use to make decisions. I'm so pleased to have Sarah Herrera joining me today to share the important work she has done as an individual, as the assistant director of Business Intelligence at the Mondavi Center for the Performing Arts at UC Davis, and in various antiracism working groups. I will return after Sarah's presentation to moderate a Q&A session with Sarah. So if you have questions as she's presenting, please go ahead and put those into the question pane, and we will get to those questions shortly. Sarah, thank you so much for joining us today to share your work on this very important topic. Over to you.

- Thank you so much, Kristen. My name is Sarah Herrera. Thank you, everyone, so much for joining us. As Kristen mentioned, I am the assistant director of Business Intelligence at the Mondavi Center for the Performing Arts, and I use the pronouns she and her. I am a Latina woman in my late 30s. I am wearing glasses and a lime green sweater, and I have long, curly, dark brown hair. I am in my home office, where there is a white wall and a picture frame behind me.

I want to first start off by thanking you all so much for being here to learn more about how we can make our data projects more inclusive. I also want to thank the Tessitura community for creating a platform to share ideas around diversity, equity, inclusion, and accessibility. My main role at my organization is to serve as a data analyst, but I also participate in our organization's DEAI committee, and there are a few projects I will mention during this presentation where I have merged data work with DEAI initiatives. I have served on the Major University Presenters data cluster group, helping our committee gather metrics on our antiracism work. I have also been working with a UC Davis researcher on the concept of belonging and how we can measure it in our audiences. I am passionate about making sure that this work happens in an inclusive way and that the information gets shared in a way where all feel represented and heard. The lived experiences we bring to these DEAI conversations are so important.

I just want to share a little bit about myself. I grew up in a very small California farming community. The thing I love most about the arts is the experiences they create for people and the energy they create that ripples for days and sometimes years.

Today, I want to share some examples of what bias looks like, why data ethics are important to our DEAI work, an empowerment framework for getting our work started, and how we can bring concepts of belonging to our data projects. I will finish up with more resources and some examples of how we might apply these concepts to our work. At the end of the presentation, I will also include a checklist that gathers all the resources I will be sharing today and can serve as a guide to take back to your organizations and use as you work on data projects. I want to start off by setting the tone for this presentation.
I have a strong belief that we all have a role to play when it comes to reducing harm and encouraging inclusive data practices at our organizations. We can all work to bring awareness to the possibility that data can cause harm when used incorrectly. There may be some people on this webinar who have never worked directly on a data project, and they may be wondering, "What can I contribute to projects involving data?" I want to assure you that you belong here, even if you have never worked with data. In fact, I'm going to prove today that some of the best skills we need to bring more inclusion and belonging to data projects, or to reduce bias in our work, have nothing to do with technology skills at all.

In the next few slides, I'm going to show some examples of what bias looks like. So let's get started. Here's an example of a sample question about gender with some answers displayed. In this example, we had originally listed multiple-choice options starting with male, then female, before listing other options like non-binary, trans man, trans woman, an option to self-describe, and prefer not to answer. After using the randomization feature in the survey software, there is less perceived bias in the question: we no longer lead with male and create a perceived bias that that gender is most important to us because it's the first thing we think about. We can also achieve this goal of removing bias by listing the answer options in alphabetical order.

Here's a bar chart that represents audience gender. There are a few things to point out. Male is listed first even though there is a higher representation of people who identify as female in the dataset. The chart uses colors like blue for male and pink for female, which reinforce gender stereotypes. And then there is this category of "other." "Other" may appear because the survey author did not give respondents options beyond male, female, or other, or because the person creating the data visualization chose not to include the other genders on the chart.

So here's the visualization after we made some changes. We have ordered the chart by numerical significance. You can also address this by ordering the categories alphabetically: female, male, then other. We also lifted up the other people and the genders represented in this dataset. And we removed the gender stereotyping of using colors like pink and blue for the bars; in this example, we used orange.

Many of us have also seen a chart like this in our careers, one that represents the ethnicity of people using some type of graph. It is important to focus on people-first language when labeling our charts. In this next example, another way to order the data, when appropriate, is alphabetically, like we see in this chart. Here we have also identified who is included in the "other people" category by listing prefer not to answer, unknown, and mixed race. We have also reconsidered the labels and added the word "people" to them, to start viewing them as people and not just data points. One more option to use instead of "other" is "another," as in "another age group."
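To make those chart changes concrete, here is a minimal sketch of what the revised visualization might look like in code. It assumes pandas and matplotlib; the column name and the responses are hypothetical stand-ins, not the actual survey data.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical survey responses; the real data would come from your survey export.
responses = pd.DataFrame({
    "gender": ["Female", "Female", "Male", "Non-binary", "Female",
               "Male", "Prefer not to answer", "Trans woman"]
})

# value_counts() sorts from most to least common, which orders the bars by
# numerical significance instead of always leading with male.
counts = responses["gender"].value_counts()

fig, ax = plt.subplots()
# One neutral color (orange, as in the example above) for every bar avoids
# reinforcing gender stereotypes with pink and blue.
ax.bar(counts.index, counts.values, color="#e08214")
ax.set_ylabel("Number of people")
ax.set_title("Audience members by self-identified gender")
plt.xticks(rotation=30, ha="right")
plt.tight_layout()
plt.show()
```

The same idea, ordering by count (or alphabetically) and using a single neutral color with people-first labels, applies to the ethnicity chart as well.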
On this slide, there is an image of people in a crowd watching a sporting event, and some of the people have yellow squares over their faces. I saw this image in a science magazine, and it really caught my eye. As I read the story by John McQuade, it was a familiar one. He tells the story of a company setting up cameras in a conference vendor area. The cameras were scanning the attendees' faces, and the images were being fed to software that was making decisions about whether or not the attendees were enjoying the conference experience. I couldn't help but think about the post-show audience surveys that we often rely on to understand if someone enjoyed their experience at one of our events. Like we just saw, some bias may present itself in the way we write our survey questions or visualize our data, but in the future it could look drastically different in our industry. As technology evolves and gets introduced to our work, it is ever more important that we understand how bias can creep in with technology.

I want to take a moment to demystify the term machine learning. I think that's important because this term is being used a lot more, and we're hearing it more in our work. I also want to share a little bit about this term because I feel that we all have a place when it comes to discussing technologies, especially when they're going to impact us. And I first want to give a shout-out to all the data scientists here: I'm going to simplify this term, but I know the work you do is complex, and it's very much appreciated.

In this particular example, the "learning" in machine learning consists of repeatedly comparing those photos to another set of photo images that have been labeled with emotions like happy, sad, or angry. I want to point out that the algorithms, the software that learns to identify the images, can also learn biases from the people assembling the dataset. The data can become biased by the person who decides, "This picture looks like someone who's happy." But it can also be skewed by training sets whose demographics don't represent the population, or simply by the unconscious attitudes of the people putting the information together.

I'll wrap up by saying that in the article, the author also describes cases of gender bias, with women being labeled happy more often than men. Some of the datasets were also drastically skewed toward including White men more than other people. And one other thing: Black people were actually labeled angry twice as often as White people. The author says all this to show that these automations can spread bias as well.
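One simple way to look for the kind of skew described in that article is to tabulate how often each emotion label is applied within each demographic group of a labeled training set. This is only a minimal sketch in pandas; the column names, groups, and rows are hypothetical.

```python
import pandas as pd

# Hypothetical labeled training data: which demographic group each image shows
# and which emotion a human labeler assigned to it.
labels = pd.DataFrame({
    "group":   ["White", "White", "White", "Black", "Black", "Black"],
    "emotion": ["happy", "happy", "angry", "angry", "angry", "happy"],
})

# Share of each emotion label within each group. Large gaps between groups,
# such as one group being labeled "angry" far more often, are a signal to
# examine the labeling process and the makeup of the training set.
rates = (labels.groupby("group")["emotion"]
               .value_counts(normalize=True)
               .unstack(fill_value=0))
print(rates.round(2))
```

A gap like this is not proof of bias on its own, but it tells you where to go back and look at who labeled the images and who is represented in the set.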
Before we shift into some other sections, I want to share some reasons why addressing data bias in our work in the arts is important. In the past few years, we have experienced some changes: the arts have gone more digital, we are relying on more data to make decisions, and some of us have experienced reduced staffing that has left us with no choice but to outsource some of our work to make up the difference. There are also a few other important things to note when we discuss reducing data bias. I feel it's important to acknowledge the digital divide that exists in our country; it's an important preface to this discussion. As we learned through the pandemic, the digital divide is very real. Many of us saw images of small children huddled in the cold next to closed schools just to tap into the Wi-Fi so they could keep up with their classmates. And when we talk about data projects to learn more about our communities, we also have to consider how we can learn from those who may not have the ability to receive that email survey.

The digital divide is one reason why there is also a lack of diversity in the technology industry. According to the US Equal Employment Opportunity Commission, women and minorities are highly underrepresented in the technology field. And when we work on solutions for collecting and sharing data, we should also consider users of all abilities, ensuring that we have technology solutions for all.

As we just saw in a few examples, just because data is accurate does not mean it is free of bias. Once I understood more about the types of bias that exist, I really had to reevaluate how I approach data projects, and I felt that I had a responsibility to limit that bias as much as possible. It's important to understand that data bias can exist in our work. There are several types of data bias, but there are three types that I've seen the most in my work with data and in the arts: historical bias, representation bias, and measurement bias, or feedback loops.

Historical bias can exist when we use data points that have been impacted by events in history. For example, there is some bias in housing data like zip codes because of unfair housing practices that favored certain races. Sometimes, even though we don't intend to use race as an element in our analysis, we can unintentionally insert it as a proxy if we use zip code instead.

Representation bias can exist when certain groups are missing from our database. This can happen when we have free events and don't ticket them, or when we just don't have the information for everyone in our system because they were someone's guest.

And measurement bias, or feedback loops, can happen when we continue to use the same information in our database to make decisions about our DEAI efforts.
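One way a team might spot representation bias of this kind is to compare the makeup of its own database against an outside reference, such as census figures for the community it serves. Here is a minimal sketch of that comparison; the group names and percentages are entirely hypothetical.

```python
import pandas as pd

# Hypothetical shares: how each group is represented in our ticketing database
# versus in the wider community (for example, from census figures).
database_share  = pd.Series({"Group A": 0.62, "Group B": 0.18, "Group C": 0.20})
community_share = pd.Series({"Group A": 0.45, "Group B": 0.30, "Group C": 0.25})

# Negative values flag groups that are under-represented in our data relative
# to the community, for example people who attended free, unticketed events.
gap = (database_share - community_share).sort_values()
print(gap)
```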
As I reflect on some of the things I have learned in my work with antiracism and DEAI, I have to take the time to acknowledge that this isn't always the easiest work. We are human, and sometimes we get tired. For our colleagues, this fatigue and defeat can feel very real. Sometimes we're not sure if we're making a difference, and some days we may even feel like we're failing. As I reflected on this reality, I started to search for something that I hoped could help others doing this work, and more specifically, those who are starting to work on these projects to measure success using data. My search led me to areas that have a high failure rate, so that I could learn some tools to share with others. Digital transformations, which can sometimes simply be described as adding new technology to a company, fail 75% of the time. That number was introduced a few years ago in a Harvard Business Review study. Like I mentioned, I'm a firm believer that we can learn from the failures of others and apply their lessons to our work. So what I started doing over the past year is studying how some companies have been able to break past this failure rate to accomplish great things with technology. That took me to a book titled "Why Digital Transformations Fail" by Tony Saldanha. What he challenges readers to understand is that there's no magic to what makes some companies successful and some not. He stresses the importance of systems and discipline. Discipline, so that when things get tough, we push forward. And when it comes to our work in DEAI, it can get tough sometimes, but it's very important to continue to push forward. One of the systems discussed in his book is called Disruption Empowerment.

The four elements are having a massive transformative purpose, or MTP; the freedom to fail forward; public support from leadership; and small wins. This system makes sure that the people in charge of driving change are empowered to do the work successfully. Thinking about our data projects through a lens of inclusion, belonging, and diversity is a challenge that requires us to approach old practices a little differently. I want to discuss these elements because I saw a parallel with some of what I've witnessed happening in my own work in antiracism. In the book, the author says that underinvesting in empowering the transformation leaders is a mistake that often comes back to derail progress.

The first ingredient in this system is identifying a massive transformative purpose. Many of us may already have this statement at our organizations, and that's really great. If not, here is a recommendation for how to create one: set an audaciously big and inspirational goal that can cause significant transformation for the community or planet. That helps us shape a clear "why" behind the work we are doing.

The second element is creating an environment where there is room to fail forward. We will not get it right the first time, but what's important is that we keep going.

What's even more important is that we have support from leadership at our organizations, not only to fail forward, but also to pick ourselves up and keep going, because there's still work to be done. How does leadership support our work in DEAI? They declare support for our work visibly and often. They can do this in front of a board, publicly in staff meetings, or in newsletters, and sometimes even by funding those efforts.

The final element is small wins. In the second part of this presentation, I will be sharing some data project considerations and resources. I encourage you to try to pick just one of those items that you think you can implement. Trying to do everything at once will be challenging, and it may cause burnout. But small incremental steps can help this work exponentially as those steps add up over time. Now, imagine if there are five people at your organization who see this presentation and each of you picks just one thing; that would make a handful of changes at your organization.

Here is one of the most helpful tools in this work: self-care. That can include going for walks, drinking water, reading, or meditating. In my lived experience, there has been a major empowerment that has come from taking time for self-care in this work.

I want to bring the concept of belonging into this presentation on data bias because I feel that a big part of this work is increasing the diversity of the people on these teams, and creating an environment of belonging can really help with that. There are two ways to think about belonging in our data projects: we can look at belonging through the lens of our teams, and we can look at it through the lens of our audience members and community. As we think about collecting data, sharing data, and picking the teams we work on these projects with, there are some elements that improve our feelings of belonging in the workplace, or our feelings of belonging as we interact with organizations: when you are seen, when you are connected, when you are supported in the work, and when you are proud of your work and the mission of the organization.
Another great resource when it comes to the concepts of belonging in our work and data projects is the Othering and Belonging Institute at UC Berkeley. I wanted to share one resource they have, which is a very detailed guide that can help lead you through a data project. The guide is on a topic called targeted universalism, which is an approach to change and analysis that serves the greater good but also gives a framework for developing strategies for groups based on their individual needs.

Step one is to establish a universal goal based on a broadly shared recognition of a problem. The example we're going to use today to walk through these steps is making the arts more accessible to all audience members and the community.

Step two is to measure the general population's performance relative to that universal goal. We might do this by starting with a survey to understand whether there are barriers to attending a performance.

Step three is to identify the groups and places that may be performing differently with respect to the goal. In our example, we can do this by using self-identified sociodemographic information to see if groups in certain geographic locations or certain age groups might have different barriers.

Step four is to understand what some of the barriers may be that impede each group or segment from achieving the goal we set in step one. There is no one-size-fits-all. Some of the elders in our community may not be able to attend because they can't drive at night anymore. Other segments may have financial barriers. Some may feel that the space doesn't welcome families, or that they don't have a sense of belonging in our spaces. This is the part where we analyze the feedback and look for common themes.

Step five is to develop and implement targeted strategies for each group or segment to reach the goal from step one. This is the part where we may need to develop several strategies to improve on the universal goal, and this is where it's also so important to have the support to experiment and fail forward. It's also important to acknowledge that at this step, some strategies may take time as we learn more about our communities and build trust. In our example, we may decide to extend a special price offer to one segment or set up a shuttle service for another.
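Steps two and three lend themselves to a very small amount of analysis code: measure the whole audience against the universal goal, then break the same measure out by self-identified segment. This is a minimal sketch in pandas; the survey columns and values are hypothetical, not the actual Mondavi Center data.

```python
import pandas as pd

# Hypothetical survey responses: a self-identified segment and whether the
# respondent reports a barrier to attending a performance.
survey = pd.DataFrame({
    "age_group":       ["18-34", "65+", "35-64", "65+", "18-34", "35-64"],
    "reports_barrier": [False,    True,  False,   True,  True,    False],
})

# Step two: overall performance against the universal goal of "no barriers".
print("Share reporting a barrier overall:", survey["reports_barrier"].mean())

# Step three: the same measure by group, to see who is furthest from the goal.
print(survey.groupby("age_group")["reports_barrier"].mean())
```

Steps four and five then come from the qualitative side: reading the feedback from each group that falls short of the goal and designing a strategy for that group.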
I'm going to take the rest of this presentation to offer some practical tips and resources related to the process of reducing bias in data. And here's the part where I'm going to encourage you to think about that one thing you want to take back to your organization as we get through the rest of these slides.

Staff diversity is such a powerful tool in data ethics that I always feel I need to start here before moving on to the other resources. People are the best assets we have in our organizations to bring equity to data projects. Diversity of opinions and different lived experiences bring more innovation and creativity to data projects. There are many right ways to achieve a goal; we all have expertise, and we can learn so much from each other. In addition to implementing inclusive hiring practices to diversify your teams, there are other ways to diversify existing teams right now. We can invite community stakeholders into the development process of our data projects. They will almost always have great feedback on data collection methods and what to ask to make our data projects better. For example, maybe for one segment of your community it really isn't helpful to send a survey digitally, and for some groups it may be best to invite them into our spaces and conduct a focus group. I also encourage you to be open about your data projects across your organization, because you may discover expertise in other departments.

I want to share some inclusive data collection practices that can help us reduce bias, specifically the bias that comes from a feedback loop of looking at the same customer data over and over again when making decisions. These include finding ways to get feedback from people who may not have access to technology, and working with your team to figure out how to collect information from people who did not purchase tickets but came with someone else as a guest or attended a free event. Another more challenging but worthwhile endeavor is to come up with creative ways to get feedback from people who have never been to our venue.

When starting a new data project, creating a summary of the project gives you a roadmap and creates accountability. I encourage us to build this into our project timelines: include your why, a description of the information you will collect, and a detailed summary of what you will do with that data, and then honor that promise. Also consider sharing a version of this summary with the survey respondents or those participating in this work to build trust in that community. Another very important thing is to consider data privacy, setting clear expectations about who will have access to the data and how long you will retain that information. And it's really important to create a mechanism that allows our stakeholders to self-identify when we are collecting sociodemographic information.

As I shared previously, using the randomization feature in survey software can help reduce bias, or you can also list options in alphabetical order. I also encourage us to be aware of the cognitive load that can come from really complicated matrix questions. People with different learning styles may be impacted negatively by these matrix questions, and a different question style will be more useful to all.
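If your survey tool does not have a built-in randomization feature, the same idea can be approximated when a question is assembled. This is a minimal sketch only; the option list is illustrative, and keeping "prefer not to answer" fixed at the end is one design choice, not a rule.

```python
import random

# Hypothetical answer options; the wording is illustrative only.
options = ["Female", "Male", "Non-binary", "Trans man", "Trans woman",
           "Prefer to self-describe"]

def ordered_options(options, seed=None):
    """Return the options in a random order, keeping 'Prefer not to answer' last."""
    shuffled = list(options)               # copy so the original list is untouched
    random.Random(seed).shuffle(shuffled)  # a per-respondent seed varies the order per person
    return shuffled + ["Prefer not to answer"]

print(ordered_options(options))
```

Randomizing per respondent means no single option is always seen first, which is the perceived bias the randomization feature is meant to remove.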
We have implemented a DEAI committee survey review at the Mondavi Center as part of this work, and it has brought so much value to drafting our surveys. So I encourage us all to budget time for this step of using our DEAI teams or diversity committees.

Sometimes the process of analyzing data happens in different silos at our organizations. Before finalizing a data project, I encourage analysts to include others in this process. We all bring our lived experiences to how we process information. It may not be appropriate for every project, but there are projects where it's great to bring in different team members. What I've done in the past is clean up the data, share my screen, and ask people, "Share with me the things that you see in this data." It brings so much more knowledge to the process and can make our strategies better.

Transparency of information should be important in any data project. Including staff, stakeholders, and survey respondents in the sharing process can bring more equity to projects. This can be done by hosting reports online for staff to access or by providing an executive summary that is approved to share with the general public after a survey project is done.

And once you have established data collection practices for your DEAI goals, it's always great to revisit them from time to time. A data ethics audit means looking at the data methods we use to make business decisions and evaluating them regularly to see if they are still in alignment with our DEAI missions and the stakeholders we are trying to impact. We should care about data ethics as much as we care about cybersecurity: when we conduct cybersecurity audits and see no major concerns, we don't stop doing them; we continue to monitor. A data audit process should not be a one-time task that gets checked off and then forgotten.

Here are some of the audits we do at the Mondavi Center. We look at the constituent loyalty scoring that we use to make a lot of campaign decisions, and from time to time we discuss the priorities we give to senior subscribers. I also want to mention a debate that we continue to have, and I think we always will, which is merging duplicate records. There is one school of thought that when a system identifies a possible duplicate account, we should just merge it immediately. However, there are some who feel that customers create second accounts because they may be trying to purchase in an anonymous way. So, to merge or not to merge: I think this is one example of an ethics discussion that we're going to keep having in our organization in the future.
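One middle ground in that merge-or-not-merge debate is to let the system flag likely duplicates but leave the decision to a person. Here is a minimal sketch of that pattern; the matching key and field names are hypothetical, and real constituent matching is usually more involved than a single shared email address.

```python
import pandas as pd

# Hypothetical constituent records; real matching usually uses more fields.
constituents = pd.DataFrame({
    "id":    [101, 102, 103],
    "email": ["pat@example.org", "pat@example.org", "lee@example.org"],
    "name":  ["Pat Q.", "P. Quinn", "Lee R."],
})

# Flag records that share a matching key (here, email) instead of merging them.
candidates = constituents[constituents.duplicated("email", keep=False)]
for email, group in candidates.groupby("email"):
    print(f"Possible duplicates for {email} -- hold for human review:")
    print(group.to_string(index=False))
```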
Partnering with data vendors or consultants is an amazing way to bring value to our organization's DEAI efforts. It's a significant but worthwhile investment, with decisions being made that will impact our teams and constituents for several years. So I encourage all of us to be courageous when asking vendors about their data ethics practices. The Mondavi Center has adopted this practice when engaging with vendors. Here are some things we can ask when interviewing vendors or in our vendor questionnaires: how do they use inclusive practices in their data collection methods and hiring practices, what are some of their goals around diversity, who makes up their teams, and are they conducting data ethics audits?

Staff training is also very important and can take many forms. Training staff on bias in general and how it can impact our work can be very helpful, DEAI trainings can bring awareness to these elements in our work, and training for our technology teams will help them be aware of data bias in each step of processing data.

I want to wrap up with my favorite tool, which is human intervention. A lot of times, the best way to break a data feedback loop is human intervention. We've discussed the benefits of partnering with our DEAI committee, but I also want to share one more resource in this work, and that's an organization's engagement or outreach team. Breaking feedback loops, where we only reach the same people over and over again, is easier when we build relationships with the help of this engagement team. This partnership is a great way to discuss how we can learn more about the people who are not in our database. Going where they are and asking, "How can we serve you?" is a great way to do this. In this past year, the Mondavi Center has paired data with understanding the audience engagement journey, and that has been a very powerful tool for us in our DEAI work.

Now I would like to use an example to share how some of the resources I just shared can be put into action. In the Major University Presenters antiracism group, we came up with a framework for how we can implement antiracism practices into a few different categories in the performing arts. These include how we make marketing and fundraising decisions using data, how we make budget decisions, staff culture and hiring, and reducing racial bias. We decided we wanted to establish a baseline using data to understand how each organization was performing, so we created a team that would work on creating a survey for the organizations that wanted to participate. We made this an open invitation and ended up with a very diverse team, made up of different lived experiences and roles in the arts. When we met with our main stakeholder to come up with the project summary, that got everybody on the same page. Next, we worked together to create a survey that could help us measure progress in each of the main areas I mentioned previously. Our next step is to continue to meet with the entire committee and refine the approach we're taking. We have provided a draft of the survey, and like I said, this is something we'll continue to revisit with this group.

I know we discussed a lot of techniques today to bring inclusion to our work with data and to reduce bias in those projects, so I want to do a quick summary of the main points with a few checklists. And again, this checklist will be referenced at the end of this presentation. The first is the system of empowerment. Excuse me, my slide got a little ahead of me; let me back up one second. So, a system of belonging: we went over the four elements of belonging. Targeted universalism is another resource. Then we went into resources: sharing data with all; conducting data ethics audits, which means frequently revisiting the data we use for decision-making; data vendor partnerships built on shared values; staff training; and human intervention.

I want to close with a few thoughts. First, I want to encourage us all to be our true, authentic selves as we approach this work, and to practice forgiveness. Some of us may have started this work of collecting information for diversity efforts, and the projects fizzled out. It's okay that that happened. We can forgive ourselves and move past it. I encourage you to celebrate the mistakes that happen along the way as an opportunity to learn. Be agile: if something doesn't seem effective, move in another direction and keep pushing forward. And finally, listen. Isn't this why we are collecting information anyway? But don't just listen to each other; listen to your teams and learn together.

You can click on this slide to find a link to the resource I created as a checklist. Feel free to take a photo of it; it will help you bring inclusion to your data efforts and reduce harm while you do it. If you have any questions after today's session, you may also reach out to me using the email information on this slide. Thank you very much.

- Thank you, Sarah. Such a great presentation. I'm going to give you a little bit of a break by answering the first two questions for you. The first question that came in was about restating the title and author of the book that Sarah referenced. The title is "Why Digital Transformations Fail," and the author is Tony Saldanha. Quite a few people also wanted to know whether the PowerPoint and the supporting resources that Sarah mentions are going to be shared after the webinar.
In addition to the QR code that's on the page here, the Tessitura Network will be sharing a recording, the PowerPoint, and the checklist on our website. If you're a member of Tessitura, using Tessitura software, you can access those on the Learning Resources page. They will also be under the Thought Leadership section of the website for anyone who doesn't have a login. Sarah, I'm going to go next to a question about the concept of belonging. Can you talk a little bit more about that? And do you have any ideas about how it can be measured?

- Yes. The concept of belonging is one that's really exciting to me. The way we're working on measuring it is that we partnered with a researcher who has a passion for understanding belonging as well. He had access to a database of questions, and from it there are four primary questions that we're going to ask. Those questions have to do with how much someone feels they belong in a group setting. Do they feel like they see others that they can relate to? Another way we ask is whether they think the people in our spaces look like each other or are very similar to each other. So there are four really interesting questions that I'm happy to share if anyone wants to reach out to me, and I can provide that information.

- Okay. A couple of questions drilling down a little bit more on the examples you shared at the start of the webinar. Is there a reason, on the female, male, and other slide, that you chose to put trans men and trans women in the other category rather than as part of the male and female categories?

- Yeah, I think that for the presentation here, I was trying to simplify those charts. There is actually absolutely no reason why we couldn't list every single option. That is one thing that I would drive home: we really don't have to leave out any of the other people in these categories. We should feel the freedom to list them all.

- All right, excellent. What methods do you recommend to collect more inclusive data? How do you reach underrepresented groups in your database when you have little or no data on the groups themselves?

- Yeah, the example I gave at the end about engagement teams and really trying to build relationships, that is one area that I highly recommend, and we are making some really concerted efforts to start focusing on the relationship aspect of that. There are other ways one can do that too. You can do things like creating a partnership with another vendor who may have done some of that work in your community and work together on that. And that's where that understanding of their shared values comes into play, understanding the way they've done some of that data collection. Those partnerships can really help as well.

- Excellent. Going back again to the start of the presentation, a question on using race as a factor. The questioner says, "I'm challenged by using race as a factor as if, in your example, Asian is a race based on color. How might we get rid of that bias?"

- Yeah, I think that race is a very interesting one, because of the example I gave about focusing on barriers. In that example, we focus on: what is our main goal? What are we trying to achieve? Do we actually need this element? Does this data element really matter?
So I think that understanding our why can really help drive whether or not we need to include some of those questions in our surveys.

- Okay. Our next question begins with a thank you to you. "Thank you for this presentation. I appreciate the checklist and the resources. I'm wondering about titles. The system we work with is managed by a larger company who values using titles like mister, miss, and missus. But we've run into an issue where people who no longer identify with the title put in the system long ago are uncomfortable with the titles being used, or are not represented in the title options. Do you have any ideas on how to handle this when you share a system with multiple companies?"

- Yeah, that's a really good question. We actually recently discussed this in our committee as well, and we opted not to use titles internally. We decided that we weren't going to use them at all, 'cause we felt we didn't want to make any type of assumptions. Since you're sharing this system with others, I would highly recommend calling a meeting or drawing some attention to this, getting a team together, and having that discussion. I understand that when we use systems more broadly, there may be different organizations at different levels having different conversations. So I would encourage lifting that dialogue up and seeing if there might be a global solution that could help everyone.

- Great. A couple of questions about internal surveys. Do you have any suggestions or things to focus on in regards to potentially soliciting feedback from staff on why the response rate to an internal survey was low? Wondering if we should approach that at all, and hesitating to create a survey about a survey.

- Sure. So can you actually repeat that? That was a long one.

- Yeah, sorry.

- Yeah, no, that's okay, that's okay.

- Do you have any suggestions or things to focus on in regards to potentially soliciting feedback from the staff on why response to an internal survey was very low?

- Yeah, I think rather than a survey, I would wonder if it isn't worth sitting together and looking at how that survey was distributed, and setting some goals at the top of these different projects about what success looks like. I would highly recommend starting there.

- Okay. Do you think income or education is of value? And if so, should these be randomized if displayed?

- That's a really great question. I'll answer the second one first. We actually decided not to randomize those; we went with cognitive load on that one. We wanted to think about people of all learning abilities and felt that keeping those options sequential would benefit the greater audience. And we, again, consulted with our DEAI group on this. Regarding whether to ask those questions at all, again, I would challenge us to ask what our goal is, what our why is, and whether that makes them applicable to a particular study or survey.

- Through a lens of bias, what's your perspective about an employer asking for a response on sexual orientation or gender identity in an arts application?

- Yeah, really, really great question. I want to start off by saying that all of these questions are very personal, and I know that I have my own experience and my own reaction to them. And I also want to share something about another experience that I had.
It's not very often that a data analyst has the, I'll call it the opportunity, for lack of a better word, to actually go out into the community like I've talked about and really have that contact with people as you're asking some of these questions. I participated in a volunteer event, and part of that event was handing out food. Because the food was donated by the county, we had to ask for income and family size, right there in front of those people. It's very different when you have to look someone in the eyes, ask some of these questions, see their response, and see them feel like their worth is connected to a number, when really it's not; they're so much more worthy than that number. As we think about asking these questions, I highly encourage us to think about how the person filling them out might feel.

I will say that there are some examples of companies doing really fantastic things with the data they're asking for, because they truly have a mission to ensure that there is diversity of all different types in their company. When we talk about employers asking those questions, I think the value lies within the mission of that company and whether they truly want to use that information to make a difference, and not just to have some metrics that they can pat themselves on the back about.

- Great, thank you.

- Yeah.

- What are your recommendations about purchasing demographic information from a data firm and appending it to user accounts?

- Yeah, I think that we just have to be honest when we're going to do that, and we have to represent that in our visualizations, for example. So if we're going to be sourcing data and pairing it with ours, I think it's important to include a note in that study, in that information, that says this data was provided by an outside source, because we didn't give those people the opportunity to self-identify.

- All right. We don't have any more questions, but we've got one comment that I'm going to share. "This has been so valuable, so thank you, Sarah. Thank you for calling attention to 'other' also. As a non-binary person, every time I see 'other' or 'choose not to identify' as my options, I immediately feel a ping of indifference and frustration for the survey. So having 'another' hadn't ever crossed my mind, and while it's not perfect, it is a better option. Having two real categories and then 'other' just feels so lazy on the part of the data creator. I look forward to the day when I no longer see 'other' on surveys."

And so, again, Sarah, thank you so much for this very informative and thought-provoking presentation. I am going to close right now with just one announcement. This webinar is the first in a series of DEAI learning events aimed at elevating ambitious ideas and effective initiatives. Through our series, we hope to inspire advancement in diversity, equity, accessibility, and inclusion in all of our organizations and in the arts and culture sector. Please save the date for our next webinar in May, when Heath Wilder, head of CRM and Business Intelligence for the Sydney Dance Company, will join me to explore how we can become more inclusive of neurodiversity in our workplaces. In closing, I want to thank Sarah again for such a great presentation, and, as always, all of you for joining us.
I'm Kristen Olson, and on behalf of the Tessitura Network, our DEAI committee, and the community DEAI advisory committee with whom we co-curate this DEAI learning series, we hope you enjoy the rest of your day, and please, everyone, stay well.