Category Archives: Design thinking

Please stop designing for your mother

In the past 10 years of designing software, I’ve been repeatedly told by co-workers:

“Make it easy enough for my mom to use.”

“So simple my 97-year-old grandmother can figure it out.”

“Imagine you’re designing for your mother.”

The requests have good intentions – make this software easy enough that people who aren’t experts can use it. But three things bother me.

1. Women

Not once have I been asked to “make it so simple my grandfather can use it.” Or, “imagine you’re designing for your dad.” Never, “make it simple enough for a 22-year-old frat boy.”

No, it’s always making it simple enough for a woman.

Where does this assumption that women are the lowest common denominator of customers originate? When a colleague was asked to make it simple enough for her mother, she replied, “my mother teaches computer science at the University of Texas.”

It’s a small thing, but small things add up to big things. Big things like systemic sexism, and this is an example of it. Sexism isn’t always overt – it’s also subtle, and it happens when people reinforce negative stereotypes with good intentions, in ways that seem innocuous.

2. Age

When people ask me to make things simpler, it’s always for someone older. Parents. Grandparents. Old people. Like the previous example, I’ve never been asked to “make it so simple a 12-year-old can use it.”

Old = slow and physically limited
Young = fast and dexterous

Old = luddites
Young = tech savvy

Old = poor eyesight, big fonts
Young = sharp eyes, small fonts

But the world is diverse. Young people have physical limitations, too. I certainly did when I was 20 and my wrist was broken for three months.

Old people can be tech-savvy too. My grandfather introduced and explained the internet to my family in 1994.

Font size and ease of use – widely believed to be correlated with age – aren’t. Research has found that small text is just as difficult for teens as it is for older people.

How many other assumptions about age are off base?

3. Measurement

It’s tough to measure success if the only metric is “so simple my 97-year-old grandmother can use it.”

My grandmother is an artist. Does that mean it should be easy for artists? For people with short-term memory problems? For people who use a computer frequently? For people who use a product only once? Or many times in one day?

If you want something a certain way, state a goal, be specific, and make it measurable so everyone has the same understanding.

Why is this a problem?

Entrenched stereotypes invite false assumptions about people, and when those assumptions go unchallenged, they lead to faulty product decisions, which can lead to major problems in a final product. It goes like this:

  1. I think I’m smart and understand the world and everyone in it
  2. I am making a product
  3. I don’t need to validate anything with anyone because I already understand them (they’re stupid)
  4. I release my product
  5. My product has usability, desirability, or content problems

When I hear someone say “make it simple enough for your mother”, I’m really hearing, “I’m smart and people who aren’t like me are stupid.”

But they’re not stupid. What they are is completely uninterested in software. Those “stupid” people are doctors, teachers, and bakers. Aid workers, architects, and sanitation workers. Waiters. Taxi drivers. Flight attendants. People who make the world function as much as anyone else. Having no interest in the tech industry does not make someone stupid.

If you understand your product and your customers don’t, they’re not the stupid ones. You are. Because you just spent a lot of time and effort releasing a product that’s hard to use and makes people feel stupid.

It’s not about simplicity

Simplicity isn’t the goal. Rather, as Don Norman writes, it’s about managing the complexity of a system so complex things are possible. It’s about making software usable and learnable, and making people feel confident about themselves.

What I’ve found effective is using measurable outcomes that avoid stereotyping anyone. So instead of “so simple your mom can use it”, try these alternatives (a sketch of how to check them against test data follows the list):

  1. Using it for the first time requires no training
    This is applicable to anyone and implies a level of ease of use.

  2. 100% task completion when people use it for the first time
For a specific task, this helps indicate what’s important: completion rate in the context of first-time use.

  3. 100% task completion within 5 seconds
With a specific time attached, it’s obvious what counts as an acceptable result for whatever’s being measured.

  4. 100% task completion within 5 seconds and a 100% satisfaction rating
    By measuring satisfaction, the goal isn’t just efficiency, but how people feel about it in the end. People might be able to finish something in 5 seconds, but how helpful is that if they feel terrible at the end of it?
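
Goals like these are also mechanically checkable against usability-test data. Here’s a minimal sketch in Python of what checking goal 4 might look like, assuming each session is recorded as a completion flag, a time, and a satisfaction response (the data shape, field names, and 5-second threshold are illustrative, not from any particular tool):

```python
from dataclasses import dataclass

@dataclass
class Session:
    completed: bool    # did the participant finish the task?
    seconds: float     # time to completion (or abandonment)
    satisfied: bool    # post-task satisfaction response

def meets_goal_4(sessions, max_seconds=5.0):
    """Goal 4: 100% completion within 5 seconds and 100% satisfaction."""
    all_completed = all(s.completed and s.seconds <= max_seconds for s in sessions)
    all_satisfied = all(s.satisfied for s in sessions)
    return all_completed and all_satisfied

# Example: a first-time-use test with three participants
sessions = [
    Session(completed=True, seconds=3.2, satisfied=True),
    Session(completed=True, seconds=4.8, satisfied=True),
    Session(completed=True, seconds=6.1, satisfied=False),  # misses both criteria
]

print(meets_goal_4(sessions))  # False: one first-time user took too long and felt bad
```

Swap in whatever threshold matches your goal; the point is that the goal is now testable rather than a vibe about somebody’s mother.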

Clear and measurable goals make obvious what matters and what doesn’t when making something. And ideally, what you make is usable by anyone – not just an assumed stereotype of someone.

Flat is a misnomer

I worked with someone fond of saying “I reject your false dichotomy.” Any time he was presented with an either/or option, he would most often reject it. Presenting a false dichotomy usually meant I hadn’t thought integratively enough to make the best of both options work together.

In software user interface (UI) design, flat vs. skeuomorphic is a false dichotomy to me.

Skeuomorphism is the use of shadows, textures, and patterns to make things look like real-world objects. The intent is that software is easier to learn and use when it mimics real-world objects, interactions, and metaphors. Even if there isn’t a real-world equivalent of what you see on screen, an object’s styling can provide enough cues about how it should work.

“Flat” is the name that’s been given to the removal of skeuomorphism.

Right now there’s a popular belief that flat is better, as if there’s a binary choice between flat and skeuomorphic. But making it an either/or choice seems misguided. Discard everything that helps discern depth? Everything that provides affordance? Everything that supports metaphors?

Instead of flat vs. skeuomorphic, I thought it’d be interesting to look at things in different terms: Ornament vs. Metaphor.

Ornament: How much detail and embellishment, like shadows, reflections, and textures, is applied to the appearance of UI?

Metaphor: How literally does the UI represent familiar objects and concepts?

I always find visualizing things on two spectrums reveals insights, so here are some “flat” and “skeuomorphic” UI elements arbitrarily compared by ornament and metaphor.

[Image: buttons compared by ornament and metaphor]

The button metaphor is consistent and literal, but the ornament varies quite a bit.

[Image: window close buttons compared by ornament and metaphor]

Looking at the window controls, all are quite abstract concepts. The Windows icons only make sense once you know what the buttons do (hide, full screen, close). The OS X metaphors are extremely abstract, as they follow the pattern of North American traffic lights: red = stop (stop using this window), yellow = slow down (hide the window), green = go (make the window big and use it to its full potential). Regardless of the styling, the metaphors are the same.

[Image: sliders compared by ornament and metaphor]

These are all over the place and present extreme opposites: the original iOS style looks like a literal volume slider on a 1970s hi-fi amplifier, complete with a machine-milled aluminum knob. The Windows Phone slider is abstracted as far as visually possible. Even the slider knob has been removed. It’s conceptually pure, representing only the percentage of progress or used space. But does that make it more compelling?

[Image: toggles compared by ornament and metaphor]

Again, Windows Phone is extremely devoid of embellishment. The iOS 7 toggles are interesting: they look exactly like the hardware switches on the iPod shuffle and iPad (literal), yet also appear abstract.

[Image: all elements combined on one ornament vs. metaphor diagram]

And here’s everything in one diagram because it’s fun.
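
If you want to try this kind of comparison on your own UI inventory, the chart itself is easy to produce. Here’s a minimal sketch using Python and matplotlib; the element names and 0-to-1 scores are my own illustrative guesses, not measurements:

```python
import matplotlib.pyplot as plt

# Subjective 0-to-1 scores for each element: (ornament, metaphor).
elements = {
    "Windows Phone slider": (0.05, 0.10),
    "iOS 7 toggle":         (0.35, 0.50),
    "OS X window controls": (0.50, 0.20),
    "Original iOS slider":  (0.90, 0.95),
}

fig, ax = plt.subplots()
for name, (ornament, metaphor) in elements.items():
    ax.scatter(ornament, metaphor)
    ax.annotate(name, (ornament, metaphor),
                textcoords="offset points", xytext=(6, 6))

ax.set_xlabel("Ornament (detail and embellishment)")
ax.set_ylabel("Metaphor (literalness of real-world reference)")
ax.set_xlim(0, 1)
ax.set_ylim(0, 1)
plt.show()
```

Plotting your own controls this way makes it obvious which ones cluster at the stark end and which drift toward heavy rendering.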

Observations

I didn’t know what to expect from this exercise. But in doing it, I realized the aesthetic I find compelling is less about flatness and more about subtlety. Removing all ornament in a puritan idealism reveals stark, sharp, high-contrast controls that verge on brutal and are in no way subtle.

At the other end we see extremely rendered lighting effects, shadows, and reflections that (while I have a soft spot for them) are also not subtle and can be seen as distracting.

But that spot in the middle? To me, that’s where everything feels right and balanced. Notice the iOS 7 sliders and toggles are greatly reduced in literalism, yet still use shadows and highlights to convey physicality. However, the OK button uses nothing. While iOS 7 is deemed “flat”, it’s certainly not. Instead, it’s subtle.

Subtlety isn’t about not using ornament. It’s about using ornament where it’s necessary.

Milton Glaser gave a fantastic talk decrying “less is more”, articulating it better than I ever could:

Being a child of modernism I have heard this mantra all my life. Less is more. One morning upon awakening I realized that it was total nonsense, it is an absurd proposition and also fairly meaningless. But it sounds great because it contains within it a paradox that is resistant to understanding. But it simply does not obtain when you think about the visual of the history of the world. If you look at a Persian rug, you cannot say that less is more because you realize that every part of that rug, every change of colour, every shift in form is absolutely essential for its aesthetic success. You cannot prove to me that a solid blue rug is in any way superior. That also goes for the work of Gaudi, Persian miniatures, art nouveau and everything else. However, I have an alternative to the proposition that I believe is more appropriate. ‘Just enough is more.’

Conclusion

I’m not writing this to say you should use “flat” or “skeuomorphic”, or that one aesthetic is better than the other, or that you can even come to major conclusions without considering other experiential aspects like colour, motion, or sound.

Instead, the next time you’re making an either/or decision between A and B, consider whether A and B are appropriate labels and whether they’re obscuring parameters that might reveal more insights.

Then visualize the conversation on those parameters.

If you’re interested in false dichotomies and integrative thinking, I highly recommend reading The Opposable Mind by Roger Martin.

Why I’m okay with someone stealing my idea

Ten years ago, in 2004, I made The Collective Type Project, an online experiment where anyone could draw letters of the alphabet. Everyone’s input would be averaged together for each letter, and in the end a typeface (font) representing everyone’s contributions would be created and made available for free. The project completed in 2007, but you can still download the font and see all the letters.

[Image: 2 contributions averaged for A]

[Image: 255 contributions averaged for A]

[Image: the final typeface for The Collective Type Project]

Recently, The Universal Typeface Experiment was posted on my Facebook feed:

[Image: The Universal Typeface Experiment, as posted on my Facebook feed]

It was posted because I also made a globally crowdsourced, mouse-drawn, eventually downloadable font 10 years ago.

Here’s the description of Bic’s Universal Typeface Experiment:

This experiment allows individuals from all over the world to contribute their handwriting. A specially developed algorithm then calculates an average, allowing us to merge contributions into a single, ever-changing and always evolving typeface.
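
Neither project publishes its exact algorithm, but the core of any such averaging is simple: resample every contribution to the same number of points, then average point by point. Here’s a minimal sketch, assuming single-stroke letters recorded as (x, y) point lists (real letters with multiple strokes would also need stroke matching):

```python
import numpy as np

def resample(stroke, n=64):
    """Resample a polyline of (x, y) points to n evenly spaced points."""
    stroke = np.asarray(stroke, dtype=float)
    seg = np.linalg.norm(np.diff(stroke, axis=0), axis=1)  # segment lengths
    t = np.concatenate([[0.0], np.cumsum(seg)])            # cumulative arc length
    targets = np.linspace(0.0, t[-1], n)
    return np.column_stack([np.interp(targets, t, stroke[:, 0]),
                            np.interp(targets, t, stroke[:, 1])])

def average_letter(contributions, n=64):
    """Average many drawings of the same letter into one polyline."""
    return np.mean([resample(c, n) for c in contributions], axis=0)

# Two crude drawings of the same diagonal stroke, merged into one
a = [(0, 0), (5, 10)]
b = [(1, 0), (4, 9), (6, 11)]
print(average_letter([a, b], n=4))
```

With a pipeline like this, the average just keeps refining as new contributions stream in, which is what makes the “ever-changing and always evolving” part work.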

Ten years ago I’d be livid to see this because I would have thought they stole my idea.

But today, after ten years of designing products, I feel the opposite. I don’t think my idea was stolen. In fact, I’m excited for Bic’s project because I’ve learned a few lessons.

Lesson 1: Ideas are rarely unique

Before I make something, I keep the following in mind:

  1. Before starting, assume what you’re about to make has already been made
  2. While making, assume other people are actively making the same thing
  3. After you’re done, assume other people will make the same thing, whether intentionally or unintentionally

It’s just how things work. Ideas are cheap, plentiful, and tend to repeat. Don’t take it personally when they do. But some people do take it personally, and that leads to the next point.

Lesson 2: I don’t think anyone stole my idea

It’s a marvelous conceit to believe someone stole your idea.

The first time someone stole my idea was in Mrs. Small’s first grade classroom when I was seven years old. At show-and-tell, I was going to show off my Transformers toy, but Scott shared the exact same toy before I could.

He STOLE MY IDEA.

Two years later I discovered a large quartz deposit in the backyard of a house next to the schoolyard. A chain-link fence separated me from some dirt, so I used a stick to dig at it, scored some sweet quartz, and quickly became the first quartz baron of Oakwood Public Elementary School.

A week after showing everyone my quartz haul, dozens of quartz-greedy children abandoned a sweet playground to poke at the dirt with sticks through a rusty chain-link fence, allowing me free rein of the swing set.

But still, they STOLE MY IDEA.

Things like this happened for years. All the time. Even recently when I worked at Microsoft designing the thumb keyboard for Windows 8.

We were so excited to reveal it to the world at the D9 conference in 2011. Months of work led up to this moment. After the big reveal, there were like, three tweets about it. It was that monumental.

[Image: the Twitter reaction to our D9 reveal]
I remember where I was when the world was changed forever.

Four days later, Apple revealed updates to their forthcoming iOS 5, which included a thumb keyboard for iPad. What!? They must have seen our keynote, and in 96 hours scrambled, strategized, planned, designed, coded, tested, and integrated a fully functional thumb keyboard into iOS. Because there’s no way they could have had that idea without seeing my idea first.

Apple STOLE MY IDEA.

If the theme of STOLE MY IDEA doesn’t sound completely ridiculous yet, it should. Because saying someone stole your idea (without evidence) is like saying humans are incapable of independent thought – that ideas are not intuited, but only exist by stealing from others.

You may be saying THEY STOLE MY IDEA. But what everyone hears is I’M INCREDIBLY INSECURE ABOUT THIS SMALL THING THAT NO ONE BUT ME CARES ABOUT.

Ask anyone “Who released the thumb keyboard first: Apple or Microsoft?” and you’ll get a consistent answer: “Who cares?” No one cares whose idea it was. No one cares who was first.

In the end, screaming “they STOLE MY IDEA” only makes me sound like a petulant first grader in Mrs. Small’s class.

Lesson 3: Influence is inspiration

If you want to influence and inspire people, you can’t be upset when their work reminds you of your own.

Averaging many visual things into one visual thing isn’t a unique concept. I was probably influenced by Jason Salavon’s work where he averaged every Playboy centerfold into one image:

[Image: Jason Salavon’s Every Playboy Centerfold, 1988-1997]

One of my all-time favourite songs is A Warm Place by Nine Inch Nails. Interestingly, the melody is nearly identical to David Bowie’s Crystal Japan.

In an interview with the two musicians, Reznor talks about writing A Warm Place and how it sounded too good to be original. Unintentionally, he re-wrote elements of Bowie’s song. Was Bowie pissed to find out? No, because Bowie wrote it 14 years prior and had written, evolved, and released a whole pile of new work since then.

But not only was he not pissed, the two collaborated on “I’m Afraid of Americans.” Similarity and influence don’t have to end in antagonism.

Did I influence someone involved with the Bic project? Maybe someone saw my site ten years ago and was unconsciously influenced. Maybe not. But what I do know is I now have something in common with MediaMonks (the people who built the Universal Typeface Experiment).

Honestly, I’m really excited for MediaMonks because the project will reveal some really cool data, insights, and human behavior just like Collective did. It’s really fun stuff. And I hope I’ll be able to meet or at least chat with some of them to learn how they approached the project, where they’ll take it, and some of the cool stories they find in the data.

I’m sure they’d love to hang out if the first thing I said was “HEY ASSHOLES, YOU STOLE MY IDEA. We should get coffee some time!”

Lesson 4: Execution > idea

The Universal Typeface Experiment is being executed in a way Collective couldn’t be. Because I love the concept of aggregating mass participation into an unpredictable and functional end product, I’m just super stoked to see it evolve.

In 2004, no one had a touchscreen, web servers had bottlenecks, and mass participation was difficult.

In 2004, people had to “write” letters into Collective with mice connected to desktop computers. My web server limited me to storing 255 contributions per character. Rallying mass participation was difficult unless you were mentioned on a major design site like K10K or NetDiver.

When I built Collective, Facebook was only for Ivy Leaguers, Twitter didn’t exist, Reddit didn’t exist, YouTube didn’t exist, and DeviantArt was a lot of drawings with bad lens flare effects. The best I had for rallying mass participation was combining MSN Messenger, email, and Friendster and hoping it’d appear on Slashdot.

In 2014 things are different. Ubiquitous touchscreens make drawing letters easier and better. Cheap computing allows Bic to get over 1 million contributions instead of 30,000. And mass participation is easy thanks to the powerful sharing methods of the social network du jour.

On top of that, Bic is bringing in user data for things like handedness, country, age, and gender to make things even more interesting.

I had fun building Collective, it was a cool portfolio piece that helped me get a job I loved, and then I moved on to the next thing. If I cared about it that much, wouldn’t I have evolved it by now? How can I be mad about someone doing something similar when I practically abandoned the project eight years ago?

Lesson 5: Protect when necessary

Everything I’ve written so far has been lovey-dovey. But there’s always the chance someone intentionally took Collective and re-implemented it.

IP theft is horrible, frequent, and can destroy people and businesses when it happens. But if you’re in a position where someone stealing your idea results in you getting fucked, your job description has a new bullet point: protect and defend IP. If someone manages to steal and implement your IP anyway, you need to get better at your job.

In conclusion

Know what’s valuable, know what’s worth defending, and protect it. Because it will be replicated intentionally or otherwise.

The most intimidating part of creation

If you’re doing it wrong, the most intimidating part of creation is starting. An empty page, a blank canvas, or the first entry of a diary terrifies you.

I said “doing it wrong” because the potential for creation should be inspiring. An empty page, a blank canvas, and the first entry of a diary should excite you, fill you with anticipation, and spark an impatience that you can’t start soon enough because the act of creation is more alluring than the end result. If you’re doing it right, the most intimidating part of creating is stopping.

When you’re doing it wrong, the act of creation is terrifying. You feel fear. You’re anxious. You procrastinate. You’re distractible. You’ll do anything but start. The only things you create are excuses.

“I’m tired.”
“I’ll do it later.”
“Just one more.”
“I’m not in the mood.”
“I need to decompress.”
“There’s not enough time.”
“I have to ______________ instead.”

Excuses and procrastination are acts of self-preservation guarding you from something worse than death: discomfort and isolation. But let’s back up and ask “why” a few times.

Why aren’t you creating?
Because I’m procrastinating.

Why are you procrastinating?
Because the empty page is intimidating.

Why is it intimidating?
Because I don’t know where to start.

Why don’t you know where to start?
Because I need to organize my thoughts and do my research.

Why do you need to organize your thoughts and do your research?
Because I need to be prepared.

Why do you need to be prepared?
If I’m not prepared, I might be doing something wrong. I might make a mistake.

Why would you avoid making a mistake?
Because mistakes are bad.

Why are mistakes bad?
Because mistakes lead to criticism and I don’t like criticism.

Why don’t you like criticism?
Because criticism is judgement. With judgement comes the potential of failure.

Why avoid failure?
Failure leads to rejection.

Why avoid rejection?
Rejection leads to isolation, discomfort, and a harder life.


The blank page is terrifying because you’ve judged your work and witnessed failure before you’ve even done anything. And what better way to avoid failure than abstaining from activities that bring failure?

What better way? How about reframing “mistake”, “failure”, and “criticism”, and embracing serendipity instead.

1. Redefine “mistake”
We’re taught from an early age to avoid mistakes. But mistakes are how we learn. If you’re not making mistakes, you’re not learning. If you’re not learning, you’re not growing. Growth makes you better, resulting in better work.

“Mistake” can be subverted too: You’re not making mistakes, you’re experimenting.

2. Prioritize failure
Mistakes add up to failure and we’re taught from an early age to avoid failure.

Fail the test and you’ll get a bad grade. Enough bad grades and you’ll fail the class. Fail the class and you won’t get into university. Fail getting into university and you’ll fail at getting a well-paying job. Fail to make enough money and you FAIL AT LIFE.

It’s hard to shake the failure mindset because it’s taught at an early age and it helps us survive as a species. Consequences are severe when you fail at drinking water, crossing the street, or flossing and brushing your teeth. Consequences are not severe when writing a blog, drawing a picture, or journaling.

For everything you do, ask “How severe are the consequences of failing at what I’m doing?” Most times the answer will be “Not severe at all.” It’s incredibly liberating.

3. Good criticism makes you better
Good criticism doesn’t judge. It allows other perspectives to reshape your work in ways you can’t. It helps you grow.

Here’s good criticism:
“This part is working for me because ______________. That part isn’t working for me because ______________. That’s my input and it’s your choice to use it or not.”

Here’s bad criticism:
“I don’t like it. This sucks. That’s stupid. You should do ______________ instead. ______________ would be better.”

Good criticism brings new perspectives, entices conversation, and builds trust. Bad criticism is bossy, judgmental, and hinders growth.

Reject bad criticism. Especially bad self-criticism.

4. Embrace serendipity
Removing expectations and allowing yourself to be taken in new unexpected directions results in great work. Expecting to create A and ending up with Z is one of the greatest outcomes of creation. You surprise yourself, you end up with something better than you expected, and you grow along the way. This is embracing serendipity, and it’s a lot easier when you redefine “mistake”, prioritize failure, and focus on good criticism.


This is my first post for Maker Year. It took me seven weeks to actually sit down and write it because I was intimidated by the blank page for all the reasons above. I started writing this with the expectation of creating A (sharing my plan for the year ahead). But by embracing serendipity, I unexpectedly wrote Z (what you just read), grew in the process, and got excited about creating again.

What do you want to create and what’s preventing you from starting?