20 Nov

#MCN2018 Recap

Most years, on my plane back from MCN, I am furiously typing up notes from sessions. This year, I was volunteer co-chair and Human-Centered Design SIG co-chair. As a result, I was ever-present but not always there when it came to sessions. However, I had a better sense of what people felt about what they heard. Here are the five ideas I heard most often:

AI, Machines, and Thoughtfulness:

Amber Case said in her keynote, “I don’t want to be a systems administrator in my own home.” She was alluding to the prevalence of digitally enabled devices in contemporary life. Museums now commonly use iBeacons, RFID tags, and other tools that collect data on patrons. This data can be incredibly useful for improving experience and operations; however, data collection is also incredibly challenging. First and foremost, data is a responsibility. Our institutions need to be thoughtful about honoring our tacit relationship with our visitors to treat them well, including by treating their data well. We also need to help visitors understand how we use data, anonymize data as often as possible, and be thoughtful in the conclusions we draw from it. Finally, visitor data is only one part of decision-making. Staff feedback is an essential tool. Most museums do a poor job of aggregating staff data on visitor experience and an even poorer job of honoring and acting on that data.

Humans make Mistakes:

No person is faultless, but many museums are still reluctant to be honest about failures. Sharing failures and working collaboratively across institutions to find better solutions could save the field money and headaches in the long run. Many museum professionals find that strictures prevent them from being honest with peers at other institutions. They also find it challenging to find places other than conferences to share their challenges, particularly places where they can publish failures.

Humans together are better than apart or against each other:

Collaboration remains a perennial topic. Collaboration with other organizations is particularly hard for many because their internal systems are in disrepair. Even when collaboration succeeds, many collaborative projects are grant-funded or time-restricted, and the lessons learned about collaboration are often not folded back into the museum’s own processes.

Bias isn’t Mitigated without Action:

Everything is biased because humans are. Data is created by humans and is therefore biased. Many of our technology projects are outcome-focused and deadline-driven, like a DAM that must launch in six weeks or an interactive for an exhibition. Timelines and ignorance have meant that these technology projects are often produced without considering and mitigating bias.

Design for Accessibility is Actually Good for All:

Accessibility and inclusion are about being thoughtful enough to accommodate the widest range of people. In doing so, everyone is helped. Accessibility, however, doesn’t happen by accident. Care must be taken to make the right choices so all patrons are included. While the upfront cost might make this seem frivolous, the increase in engagement from the broadest audience makes designing for all worth it. User Experience Design, Service Design, and Human-Centered Design are useful ways for organizations to make sure they develop accessible projects. These processes can be adopted by all types of professionals. There are many resources, including these from me and MCN’s HCD SIG workshop. (Join by DMing @artlust).

[Image: Girl surrounded by technology objects]


Overall, while the conference was called Humanizing the Digital, I felt it was really about humans and their existence in a dense digital environment. The ideal is to create digital that neither destroys nor negates our humanity, and that ideal will only come about with careful thought. When digital is seen as the medium, not the message or the meaning, people are able to have superlative experiences.


Finally, I heard over and over that MCN is attendees’ annual chance to recharge and reconnect with champions. The MCN community comes out in full force at the conference. For some of us, it remains with us during the rest of the year, like on social media. Yet many people mentioned how they wished they had more chances to share ideas, like in publications. Think of how much better the field would be next year if each of the 500-plus attendees shared one idea with a peer at their institution, one with a supervisor or director, and one broadly with the field. These ideas could be shared in emails, tweets, talks, blog posts, published articles, or books. The community of MCN is only as strong as its participants, and their strength is in their ideas. By sharing those ideas, attendees can exponentially expand the good happening in the field.

08 Mar

Looking into the Well-Reported Statistic about Museums, Starbucks & McDonalds (Data)

A well-reported statistic compares the number of museums to the number of Starbucks and McDonalds locations: there are 1.5 times more museums in the country than the caffeine and fries purveyors combined. My friend Michelle Epps got me thinking about what this statistic really means. (You might know Michelle from her tireless work on the Emerging Museum Professional network.)

In looking at the numbers, Michelle is right. While statistics about the sheer number of museums seem positive, they mask some real challenges. Museums can easily grow their reach. They have the physical space to interact with more people and the cultural capital to improve our society. But they don’t have the staffing capacity across the board. The majority of American museums have three or fewer full-time staff. Most, if not all, museums buoy their organizational capacity with volunteers. This staffing challenge is hugely detrimental to the field. Volunteers are wonderful, and I myself love volunteering with local organizations. But they also effectively subsidize work at these institutions. Starbucks, by contrast, is well known for its commitment to offering generous benefits to retain staff.

Starbucks and McDonalds combined serve roughly twenty times more people than museums, going by the figures below. These scores of patrons are also always interacting with paid staff when they are at those establishments. As Michelle pointed out to me, people don’t get a Master’s degree to volunteer at Starbucks. Instead, they work at Starbucks to be able to afford to volunteer at a museum.


Sources and Numbers:

Starbucks: 453,600,000 people served, 8,222 stores, and 238,000 staff (source does not state whether these are full-time staff only)
McDonalds: 1,266,960,000 people served, 14,140 stores, and 375,000 staff (source does not state whether these are full-time staff only)
Museums: 85,000,000 people served, 35,000 museums, and 725,000 employees
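A quick back-of-the-envelope check on the figures above (a sketch; the numbers are the ones listed, and both ratios follow directly from them):

```python
# Figures as listed above (people served, locations, staff).
museums = {"served": 85_000_000, "locations": 35_000, "staff": 725_000}
starbucks = {"served": 453_600_000, "locations": 8_222, "staff": 238_000}
mcdonalds = {"served": 1_266_960_000, "locations": 14_140, "staff": 375_000}

combined_served = starbucks["served"] + mcdonalds["served"]
combined_locations = starbucks["locations"] + mcdonalds["locations"]

# Museums outnumber the two chains' locations combined...
location_ratio = museums["locations"] / combined_locations  # ~1.57
# ...but by these figures the chains serve roughly twenty times more people.
served_ratio = combined_served / museums["served"]  # ~20.2

print(f"Museums vs. chain stores: {location_ratio:.2f}x")
print(f"Chain customers vs. museum visitors: {served_ratio:.1f}x")
```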

29 Jun

Keep Clean Data

Data seems pretty cut and dried, but don’t be fooled. There are plenty of ways to fold in bias. Here are some concrete steps to help you counteract the most common pitfalls.

Start with a clean tool/ protocol to collect data.

1. Keep data clean

There are plenty of ways to keep your data tidy. First, have everyone use the same protocol. Ideally, keep your pool of data collectors to a minimum; more people means potentially more interpretations. Train everyone the same way: take out the protocol and make sure everyone understands it. And make sure everyone uses the same data collection tool. I used to work in a team of three data collectors. We had to agree on everything, and often huddled up to make sure we were on the same page. Be vigilant.
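One concrete way to check that your collectors really are on the same page is to have them code the same sample of observations and measure how often they agree. A minimal sketch; the observation codes here are invented for illustration:

```python
# Two data collectors independently code the same ten observations.
# The codes themselves ("engaged", "passing", "reading") are hypothetical.
coder_a = ["engaged", "passing", "engaged", "reading", "engaged",
           "passing", "reading", "engaged", "passing", "reading"]
coder_b = ["engaged", "passing", "reading", "reading", "engaged",
           "passing", "reading", "engaged", "engaged", "reading"]

# Count how many observations got the same code from both collectors.
matches = sum(a == b for a, b in zip(coder_a, coder_b))
percent_agreement = matches / len(coder_a)

print(f"Agreement: {percent_agreement:.0%}")  # 8 of 10 match -> 80%
```

If agreement is low, that's your cue to huddle up, retrain on the protocol, and re-code before collecting any more data.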

2. Observe First, Interpret Later

Years ago, when I worked on hiring teachers for the public schools, I had to take a course on legal job interviews. The fear that the trainer burned into my soul always returns to me when I do interviews. Only write what people say, word for word. Do not interpret. This goes against your human nature. If you have a hard time writing fast enough, ask respondents if you can record them. Also, feel free to ask the people who provided the data whether your interpretations seem representative of their beliefs. Once all the data comes in, then you have the joy of interpretation. That said, once you get familiar with interpretation-free listening, you will also find joy in data collection.

3. Check out the competition.

After your initial interpretations, look to others to see how they are tackling this issue. What are their findings? What other issues might be surfacing in the literature? This is sometimes called triangulation. If you can find other sources of data that support your interpretations, then you can have more confidence that what you’ve found is legitimate.

4. Check for alternative explanations.

False conclusions are absolutely the most likely place that bias comes into understanding data. Jumping to conclusions can feel normal, like finishing someone’s sentence. But just as you can’t fill in the blanks for your respondents, don’t fill in the blanks for yourself too quickly. Consider whether there are other reasons why you obtained your data. If you can rule out or account for alternative explanations, your interpretations will be stronger.

5. Review findings with peers

Don’t be an island. Unless confidentiality prevents it, let others look at your data. You will only get better at your work through critical assessment. Additionally, when you let peers review your work, you might find commonalities, and you might even be able to augment your argument.

For more about data bias, here is a long read sharing more issues like confirmation bias, ingroup bias, and knowledge bias. 

27 Jun

It’s Not the Destination OR Journey Mapping for Museums



Visitor experience is everyone’s job, not just that of people with “visitor” or “experience” in their title. Picture your visitor. What is the first thing that comes to mind? What are they doing? Buying a ticket? Standing in your gallery? Reading your labels? These are the touchpoints that are the focus of many museum professionals. However, focusing only on these means you are missing important elements of your visitor’s experience. Much of the make-or-break comes at the moments in between.

Step back for a moment and think about going to the grocery store. You bought vegetables, milk, and bread. You also bought six things that were not on your list. Is that what you remember? Or do you also remember the old lady who cut you off on the way to the corn? And the sample guy trying to convince you that “pea-based false meat” is pretty good? Then there was your third-grade teacher standing in the lunch meat aisle. Many of your memories are about the moments in between destinations. As the adage goes, it’s the journey, not the destination.

Pathway Planning

Focusing on the journey requires a change from end-point planning, where you focus your energy on the galleries, to pathway planning. This shift means focusing on the visitor’s needs and actions. In doing so, the energy moves from the institution, which often places its decision-making heft in gallery-based decisions, to the visitor, whose experience is often born of the spaces in between, from the parking lot to the gallery. Mapping out people’s paths is called Journey Mapping in User Experience Design terms. But basically, you visualize what people do and why they do it.

Why use Journey Mapping?

As another old adage goes, don’t judge until you have walked a mile in someone’s shoes. The saying, trite as it is, points to the role of understanding in creating a Journey Map. In other words, an ideal pathway-planning process requires purpose and empathy to be foregrounded. Instead of just the nodes, the points of getting somewhere, you spend your energy on every moment in between. When you do that, you gain new insights into your visitors’ decision-making processes. You also learn when serendipity and/or poor planning cause reactions. In other words, you get insight into why people react to your spaces. In this way, journey mapping helps break through status quo planning, i.e., doing something the way it’s always been done.

How do you Journey Map?

  1. Just as with fiction, journey maps should draw on what you know. So, start by observing patrons, then use those observations as the base for creating your map.
  2. A journey map is not a generic map. The journey map starts with a person. Specificity is essential. This is not like Google Maps. Instead, it’s more like the map your best friend draws for you, with asides about great signs and tips about places where you will get lost. When making journey maps, take a point of view. Keep that person in mind as you work.
  3. Next, go for story. Imagine this person coming to your organization. Why are they there? What do they want out of it? That is the motivation. Write out a two-sentence story of their motivations and goals, like the plot of their visit.
  4. The map is the arc of your story, with all the tangents and eddies your character needs to be authentic. Make sure to think out the path and the stops. Be specific about the character’s motivation as well as their process.
  5. You might imagine that you start by drawing. But the best journey maps are visualizations of an experience you have thoroughly planned. They are not random. Waiting to draw allows you to be purposeful.
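The steps above can be sketched as a simple data structure. This is only an illustration of the process, not a tool the post prescribes; the persona “Maria” and every touchpoint here are invented:

```python
from dataclasses import dataclass, field

@dataclass
class Touchpoint:
    moment: str       # what the visitor is doing
    feeling: str      # their emotional state at that moment
    pain_point: bool  # does this moment need attention?

@dataclass
class JourneyMap:
    persona: str     # step 2: a specific person, not a generic "visitor"
    motivation: str  # step 3: the two-sentence story of goals
    touchpoints: list = field(default_factory=list)  # step 4: the arc, in order

# A hypothetical first-time visitor, built from observation (step 1).
maria = JourneyMap(
    persona="Maria, first-time visitor with two kids",
    motivation="Wants a rainy-day outing the kids will enjoy. "
               "Hopes to fill two hours out of the house.",
)
maria.touchpoints += [
    Touchpoint("Circles the parking lot twice", "frustrated", True),
    Touchpoint("Finds the family entrance", "relieved", False),
    Touchpoint("Kids spot the lantern slides", "delighted", False),
]

# Step 5: only now would you draw; meanwhile the structure already
# surfaces the moments in between that need work.
pain_points = [t.moment for t in maria.touchpoints if t.pain_point]
print(pain_points)
```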

Even if you choose to hire someone to do your journey maps, understanding the process is incredibly useful. It helps you understand why the maps matter: they help you understand your visitors holistically. Museum staff often prioritize decisions without a thorough understanding of their visitors. Tools like journey maps help you center your visitor in ways that draw on process and empathy.

30 May

Bias in Data Analysis #musedata #musetech #data #bias

This is the second in a series of posts about confronting bias. These #longreads use narrative to help bring up bias in an accessible manner.

As Director of the Art Museum of New South Overthere, you are constantly being asked to make decisions based on data. Sure, you had your last math class in 10th grade, and then avoided math all the way through your PhD. But you are a specialist (in lantern slides, admittedly), so you’ve got this, right?

Read these short scenarios and suss out where bias might come in.

Scenario 1

You are dying to know how much people like lantern slides. You write out a survey for your staff to deploy. You want to be direct with your visitors so that you don’t waste their time. So you ask the people in the Joe Bright Memorial Lantern Slide Gallery (and broom closet):

  • “What do you like about lantern slides?”
  • “Is there anything that you don’t like about lantern slides?”
  • “What would you like to see with the lantern slide display?”

You were happy to find out that there is nothing that they don’t like about lantern slides. They also love your lantern display as it is. The only thing they want is more information, which is what you thought. They just wished they could know more about the slides! How wonderful to know what you do for a living is so relevant for people.


In this case, you have several problems. First, your survey questions are constructed with a particular slant towards lantern slides. This is a case of interview bias. It’s as they say in legal shows: you are leading the witness. When you construct survey questions, you don’t want to tip participants off to the “right” answer. People have an inherent need to please, and so they will answer in a way that seems correct.

Additionally, there is selection bias at work here. You went to the gallery where you hope to make changes. On one hand, you are being proactive. However, you are skewing your data; you have a sampling error at play. A better study would interview not just current visitors to the lantern slide gallery but also those in the museum who are not currently going to it. In other words, you want both visitors and potential visitors, to draw a complete picture of the situation.


Scenario 2

Your first meeting this morning was with the board. A nice old lady, Sweetie Monroe, heiress to the great Marshmallow Mills fortune, was hoping you could explain why you don’t have any students in the galleries. Now, you have never seen Sweetie upright before 1:00 pm. But you also know that the school scheduling staff member, Peaches LaPew, is busy every morning. Just last week, she tried to get you to lead a kindergarten tour because there were more students than staff.

You’re not going to be able to show Ms. Monroe children in the flesh (you aren’t a miracle worker), so you ask your staff to do a little comparative analysis. Your head of Marketing/Audience Research/Programming & Security, Joe Exhaustino, has emailed you a super long report. Does he understand how busy you are? You don’t have time to go through this like you were in school. Luckily, it has a clear summary: you have plenty of kids coming. Fabulous.


In this case, I have bad news for you. You didn’t look too closely at your data. You used the data like a yes-man. This is an example of choice-supportive bias. If you looked more closely at the data, you would notice that only 4 percent of visitors are 18 and under, and most of those are in school groups. Now, I don’t know what your measure of success is, but 4 percent of total visitors seems extremely low.


Scenario 3

Before you get your chance to regale Ms. Monroe with your fabulous student tours, you find yourself stunned by numbers. You are sitting at a breakfast meeting with the directors of all the local organizations. The head of the Community Development Corporation is sharing a graphic. Apparently, the average percentage of family visitors at museums is 10 percent. Eek. You get nervous. So you turn to the guy next to you, the Director of Coffee Cups & Porcupine Baskets. He smiles and says, “Oh, yes, that’s just the average. We are at 18%.” You leave the meeting despondent and shoot off a quick email to your grant officer/gardener to get money right now for school tours.


In this case, the data was combined inappropriately, though you wouldn’t necessarily know it. You didn’t have all the information when you sat in that meeting and looked at the graph. The number crunchers decided more numbers were better. But in doing so, they didn’t use like categories: they didn’t separate out school groups. Only two of the four museums do school tours, which means that in those museums children come in throughout the week without adults. Those museums will have higher numbers of children than the ones that don’t do school tours. The lesson is that you need to be thoughtful in combining data. There are other challenges with combining data, too. If you aren’t careful when you aggregate data, you can accidentally contradict what the original data said. This is called Simpson’s Paradox.
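To make Simpson’s Paradox concrete, here is a toy example with invented numbers: Museum B has a higher share of family visitors on both weekdays and weekends, yet once the data is aggregated, Museum A comes out ahead.

```python
# Invented figures: (family visitors, total visitors) by day type.
museum_a = {"weekday": (30, 200), "weekend": (200, 800)}
museum_b = {"weekday": (160, 800), "weekend": (60, 200)}

def rate(families, total):
    return families / total

# B beats A in every subgroup...
assert rate(*museum_b["weekday"]) > rate(*museum_a["weekday"])  # 20% > 15%
assert rate(*museum_b["weekend"]) > rate(*museum_a["weekend"])  # 30% > 25%

# ...but aggregate the data and the ranking flips, because A's visitors
# are concentrated on weekends, when family rates are high everywhere.
a_total = rate(sum(f for f, _ in museum_a.values()),
               sum(t for _, t in museum_a.values()))
b_total = rate(sum(f for f, _ in museum_b.values()),
               sum(t for _, t in museum_b.values()))
print(f"Museum A overall: {a_total:.0%}, Museum B overall: {b_total:.0%}")
```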


Scenario 4

You are hoping to buy more media ads. You call on Joe Exhaustino again, this time for your demographic numbers. He gives you the classic bell curve, with mostly middle-aged people attending the new exhibition. No surprises. So you will just use digital ads to get more young people. Finally, an easy decision. After the ads go out, you take a turn through your Lantern Slide gallery. Something odd is up: the gallery is full of really old men. You go back to Joe and ask him if his numbers are right. He shows you his numbers. He had used a good-sized sample. He crunched the numbers and ended up with a graph that didn’t look right. So Joe removed the outliers, the old people.


Joe has been doing numbers for years, and he assumed that the attendance numbers should conform to a bell curve. This is called non-normality bias. But another bias was in play here: instead of investigating the outliers, he disregarded those numbers. Joe did better on his second crack at it. Along with the quantitative data, he looked at surveys. It turns out the lantern slide gallery had become a mecca for the over-95 set. Practically everyone in the state in that age demographic comes to the museum to check out those sweet slides, particularly on “free coffee Friday”.

05 May

Big Trouble in Little Data #musetech

Big data is, well, a big thing these days. Honestly, it has been for a while. We make so much data by interacting with digital tools: some 2.5 exabytes are produced every day. That is the equivalent of 5 million laptops filled to the brim with data. Imagine yourself right now attempting to find one thing in the middle of all of those computers. (You might envy the person seeking the needle in the haystack.) So, even with all of these data points, I am making a radical suggestion: we don’t have enough data. Or rather, we need to diversify the types of data we collect.

Let’s take this scenario. A visitor decides to participate in a class at your organization. They use your online ticketing system. Even the worst online ticketing system gets their name, address with zip code, their preferred class, and their method of payment. (If you don’t at least get this, choose a new system.) Right now, you have a great deal of data about that patron, and this data is basically quantitative; you can run the numbers on it. You can calculate elements that help you understand your audience. For example, this person is one of X people from a zip code, or one of X people who chose Mastercard.
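Those “1 of X” calculations are a one-liner once the ticketing data is in hand. A minimal sketch, with the order records invented for illustration:

```python
from collections import Counter

# Hypothetical ticketing records: the fields a basic system captures.
orders = [
    {"name": "A. Patron", "zip": "10001",
     "class": "Lantern Slides 101", "payment": "Mastercard"},
    {"name": "B. Patron", "zip": "10002",
     "class": "Lantern Slides 101", "payment": "Visa"},
    {"name": "C. Patron", "zip": "10001",
     "class": "Watercolor Basics", "payment": "Mastercard"},
]

# "One of X people from a zip code", "one of X who chose Mastercard":
by_zip = Counter(order["zip"] for order in orders)
by_payment = Counter(order["payment"] for order in orders)

print(by_zip["10001"], by_payment["Mastercard"])  # 2 2
```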

So, let’s get back to our patron, shall we? This person arrives at your facility. They drive up to your institution in their shiny teal Mini Cooper. They press the button for the ticket, and it doesn’t work. They press it again. Finally, a staff parking attendant comes out. He apologizes and explains that the button they are pressing says “Staff”; they need to press the larger button that says, “Please Press, Dear Visitor”. Mr. Parking R. Deck hands the patron the ticket with a smile. Mollified, the patron drives into the parking lot. In this second touchpoint, other data elements were playing out. Many questions come to mind immediately. How many times does a visitor try the wrong button before calling the staff? How often do staff get called to explain the buttons? Is there a correlation between age and misreading the button? Is there a correlation between the height of the car and misreading the button? I could go on…

Most institutions leave this type of data on the table. It remains anecdotal: the stuff of staff meetings and lunchrooms. There are several factors behind this:

  1. The first might be the word data itself. Data has a mathematical aura that is often segregated to certain fields, like finance and technology. Going into roll call at security, it might seem incredibly radical to ask the guards to track their “data”. Solution: There needs to be wider understanding across the field of what data is and what its possible sources are.
  2. Data only exists if captured. Think about that for a minute. If you are not holding on to it and aggregating it, there is no data. Data collection takes time and resources. You need tools for collection. You need to train people. Solution: Institutions need to reallocate resources and expectations to change the culture of data.
  3. Data use needs to start with a goal. Right now, many institutions are in the peer-pressure phase of data collection: collecting because everyone is doing it. Rather than employing the scientific method that underlies so much of the work of our field, they don’t collect with a goal or thesis in mind. Without a goal, it sure is hard to make a roadmap to the endpoint. Basically, institutions are often wandering in the weed fields of data. Solution: Data literacy needs to include clear education on goal setting.
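What “capturing the little data” from points 2 and 3 could look like in practice is modest: a shared log with agreed-upon categories, chosen because they serve a stated goal. A minimal sketch; the filename, categories, and staff IDs are all invented:

```python
import csv
from datetime import datetime, timezone

# Hypothetical categories the team agrees on in advance, tied to a goal
# (here, wayfinding problems) rather than collected for their own sake.
CATEGORIES = {"wrong_button", "asked_for_restroom", "lost_in_gallery"}

def log_observation(path, staff_id, category, note=""):
    """Append one structured observation; anything uncaptured is lost."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now(timezone.utc).isoformat(), staff_id, category, note]
        )

# The parking attendant logs the button mix-up instead of leaving it
# as a lunchroom anecdote.
log_observation("observations.csv", "staff-07", "wrong_button",
                "Visitor pressed the Staff button twice before calling for help")
```

The point is not this particular script but the shift: frontline observations become rows you can count, instead of stories that leave when the staff member does.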

Making data a part of institutional culture might seem costly, and it would be, but think of this. Many members of museum staff spend the bulk of their time with visitors, including guards, teaching artists, and visitor experience professionals. Collectively, this is the sector of the museum that tracks the least amount of data. If you asked any one of them whether they’ve noticed a challenge finding the restroom, they could tell you immediately. This is also one of the most transient segments of our staffing. They might move up in the field or move out of it. Either way, their observations remain anecdotes, basically little data. In other words, institutions throw away what could be the most valuable data on their clients every day.