All posts by Jeff Weir

I'm making things while traveling the world for a year. Ideally collaborating with others. UX design, art, and dance.

Maker Year Project #11: Letter Apps

Letter apps are apps whose icon’s primary visual element is a letter. For instance, Hyperlapse, Gmail, and Groupon are letter apps.

icon-mistaken icon-abstracted icon-letter

I looked through the top 10,000 apps on the iTunes App Store on January 1, 2015. From those, I found 834 letter apps. Then I visualized them. Tap or click the visualization below to see it in its full glory.

makeryear-letter-apps
It looks really good really big. 1 MB

Observations

When using a colour background, white is almost always used for the foreground.

For the letter C, over half of the icons have white backgrounds. No other letter has such a high percentage of white icons.

Many names of the O apps don’t actually begin with O.

M and S are very popular letters (62 and 57 icons). This is probably because a lot of app names begin with M and S.

J, K, and I are least popular (6, 11, and 11). Most likely because there aren’t many apps beginning with J, K, or I. But I like to think the dots on the i and j make them particularly unpopular.

icon-hybrid

Sometimes apps like Guitar Master Class are letter apps and sometimes they’re not. In some, the icon dominates the letter. In others like Gmail, the letter dominates the icon.

icon-doubleLetter

I didn’t include apps like CamCard because there’s more than one letter. But if I did, there’d be 386 more apps in the set.

Please stop designing for your mother

In the past 10 years of designing software, I’ve been repeatedly told by co-workers:

“Make it easy enough for my mom to use.”

“So simple my 97-year-old grandmother can figure it out.”

“Imagine you’re designing for your mother.”

The requests have good intentions – make this software easy enough that people who aren’t experts can use it. But three things bother me.

1. Women

Not once have I been asked to “make it so simple my grandfather can use it.” Or, “imagine you’re designing for your dad.” Never, “make it simple enough for a 22-year-old frat boy.”

No, it’s always making it simple enough for a woman.

Where does this assumption that women are the lowest common denominator of customers originate? When a colleague was asked to make it simple enough for her mother, she replied, “my mother teaches computer science at the University of Texas.”

It’s a small thing, but small things add up to big things. Big things like systemic sexism, and this is an example of it. Sexism isn’t always overt. It’s often subtle: people reinforcing negative stereotypes with good intentions, in ways that seem innocuous.

2. Age

When people ask me to make things simpler, it’s always for someone older. Parents. Grandparents. Old people. Like the previous example, I’ve never been asked to “make it so simple a 12-year-old can use it.”

Old = slow and physically limited
Young = fast and dexterous

Old = luddites
Young = tech savvy

Old = poor eyesight, big fonts
Young = sharp eyes, small fonts

But the world is diverse. Young people have physical limitations, too. I certainly did when I was 20 and my wrist was broken for three months.

Old people can be tech-savvy too. My grandfather introduced and explained the internet to my family in 1994.

The link between font size and ease of use – widely believed to track with age – doesn’t hold up. Research has found that small text is just as difficult for teens as it is for older people.

How many other assumptions about age are off base?

3. Measurement

It’s tough to measure success if the only metric is, “so simple my 97-year-old grandmother can use it.”

My grandmother is an artist. Does that mean it should be easy for artists? For people with short-term memory problems? For people who use a computer frequently? For people who use a product only once? Or many times in one day?

If you want something a certain way, state a goal, be specific, and make it measurable so everyone has the same understanding.

Why is this a problem?

Entrenched stereotypes make it easy to form false assumptions about people. Left unchallenged, those false assumptions lead to faulty product decisions, which can become major problems in the final product. It goes like this:

  1. I think I’m smart and understand the world and everyone in it
  2. I am making a product
  3. I don’t need to validate anything with anyone because I already understand them (they’re stupid)
  4. I release my product
  5. My product has usability, desirability, or content problems

When I hear someone say “make it simple enough for your mother”, I’m really hearing, “I’m smart and people who aren’t like me are stupid.”

But they’re not stupid. What they are is completely uninterested in software. Those stupid people are doctors, teachers, and bakers. Aid workers, architects, and sanitary workers. Waiters. Taxi drivers. Flight attendants. People who make the world function as much as anyone else. Having no interest in the tech industry does not make someone stupid.

If you understand your product and your customers don’t, they’re not the stupid ones. You are. Because you just spent a lot of time and effort releasing a product that’s hard to use and makes people feel stupid.

It’s not about simplicity

Simplicity isn’t the goal. Rather, as Don Norman writes, it’s about managing the complexity of a system so complex things are possible. It’s about making software usable and learnable, and making people feel confident about themselves.

What I’ve found effective is using measurable outcomes that avoid stereotyping anyone. So instead of, “so simple your mom can use it”, try these alternatives:

  1. Using it for the first time requires no training
    This is applicable to anyone and implies a level of ease of use.

  2. 100% task completion when people use it for the first time
    For a specific task, this helps indicate what’s important: completion rate in the context of first time use.

  3. 100% task completion within 5 seconds
    With a specific time attached, it’s obvious what counts as an acceptable result for whatever’s being measured.

  4. 100% task completion within 5 seconds and a 100% satisfaction rating
    By measuring satisfaction, the goal isn’t just efficiency, but how people feel about it in the end. People might be able to finish something in 5 seconds, but how helpful is that if they feel terrible at the end of it?

Clear and measurable goals make obvious what matters and what doesn’t when making something. And ideally, what you make is usable by anyone – not just an assumed stereotype of someone.
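These goals translate directly into checks you can run against usability-test data. Here’s a minimal sketch of the fourth goal – the session fields and shape here are hypothetical, just for illustration:

```javascript
// Hypothetical sketch: does a set of first-use test sessions meet
// "100% task completion within 5 seconds and a 100% satisfaction rating"?
// Each session records completion, time taken, and a 0–1 satisfaction score.
function meetsGoal(sessions, maxSeconds = 5) {
  const allCompletedInTime = sessions.every(
    (s) => s.completed && s.seconds <= maxSeconds
  );
  const allSatisfied = sessions.every((s) => s.satisfaction === 1);
  return allCompletedInTime && allSatisfied;
}
```

Because the goal is worded as 100%, a single slow, failed, or unhappy session means the goal isn’t met – which is exactly the kind of clarity these metrics are for.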

Flat is a misnomer

I worked with someone fond of saying “I reject your false dichotomy.” Any time he was presented with an either/or option, he would most often reject it. Presenting a false dichotomy usually meant I hadn’t thought in an integrative way to make the best of both options work together.

In software user interface (UI) design, flat vs. skeuomorphic is a false dichotomy to me.

Skeuomorphism is the use of shadows, textures, and patterns to make things look like real world objects. The intent is to make software easier to learn and use if it mimics real world objects, interactions, and metaphors. Even if there isn’t a real world equivalent of what you see on screen, an object’s styling can provide enough cues on how it should work.

“Flat” is the name that’s been given to the removal of skeuomorphism.

Right now there’s a popular belief that flat is better, as if there’s a binary choice between flat and skeuomorphic. But making it an either/or choice seems misguided. Discard everything helping discern depth? Discard everything providing affordance? Discard everything supporting metaphors?

Instead of flat vs. skeuomorphic, I thought it’d be interesting to look at things in different terms: Ornament vs. Metaphor

Ornament: How much detail and embellishment, like shadows, reflections, and textures, is applied to the appearance of UI?

Metaphor: How literally does the UI represent familiar objects and concepts?

I always find visualizing things on two spectrums reveals insights, so here are some “flat” and “skeuomorphic” UI elements arbitrarily compared by ornament and metaphor.

buttons

The button metaphor is consistent and literal, but the ornament varies quite a bit.

close_buttons

Looking at the window controls, all the metaphors are quite abstract. The Windows icons only make sense once you know what the buttons do (hide, full screen, close). The OS X metaphors are extremely abstract, as they follow the pattern of North American traffic lights: red = stop (stop using this window), yellow = slow down (hide the window), green = go (make the window big and start using it to its full potential). Regardless of the styling, the metaphors are the same.

sliders

These are all over the place and present extreme opposites: The original iOS style looks like a literal volume slider on a 1970s hi-fi amplifier, complete with a machine milled aluminum knob. The Windows Phone slider is abstracted as far as visually possible. Even the slider knob has been removed. It’s conceptually pure, representing only the percentage of progress or used space. But does that make it more compelling?

toggles

Again, Windows Phone is extremely devoid of embellishment. The iOS 7 toggles are interesting: They look exactly like the hardware switches used on the iPod shuffle and iPad (literal), yet also appear abstract.

allCombined

And here’s everything in one diagram because it’s fun.

Observations

I didn’t know what to expect from this exercise. But in doing it, I realized the aesthetic I find compelling is less about flatness and more about subtlety. Removing all ornament in a puritan idealism leaves stark, sharp, high-contrast controls that verge on brutal and are in no way subtle.

At the other end we see extremely rendered lighting effects, shadows, and reflections that (while I have a soft spot for them) are also not subtle and can be seen as distracting.

But that spot in the middle? To me, that’s where everything feels right and balanced. Notice the iOS 7 sliders and toggles are greatly reduced in literalism, yet still use shadows and highlights to convey physicality. However, the OK button uses nothing. While iOS 7 is deemed “flat”, it’s certainly not. Instead, it’s subtle.

Subtlety isn’t about not using ornament. It’s about using ornament where it’s necessary.

Milton Glaser gave a fantastic talk decrying “less is more”, articulating it better than I ever could:

Being a child of modernism I have heard this mantra all my life. Less is more. One morning upon awakening I realized that it was total nonsense, it is an absurd proposition and also fairly meaningless. But it sounds great because it contains within it a paradox that is resistant to understanding. But it simply does not obtain when you think about the visual history of the world. If you look at a Persian rug, you cannot say that less is more because you realize that every part of that rug, every change of colour, every shift in form is absolutely essential for its aesthetic success. You cannot prove to me that a solid blue rug is in any way superior. That also goes for the work of Gaudi, Persian miniatures, art nouveau and everything else. However, I have an alternative to the proposition that I believe is more appropriate. ‘Just enough is more.’

Conclusion

I’m not writing this to say you should use “flat” or “skeuomorphic”, that one aesthetic is better than the other, or that you can come to major conclusions without considering other experiential aspects like colour, motion, or sound.

Instead, the next time you’re making an either/or decision between A and B, consider whether A and B are appropriate labels and whether they’re obscuring parameters that might reveal more insights.

Then visualize the conversation on those parameters.

If you’re interested in false dichotomies and integrative thinking, I highly recommend reading The Opposable Mind by Roger Martin.

Thanks, AT&T

Today I called the American Telephone and Telegraph Company (AT&T) to suspend my phone account. Since I’m traveling for a year and AT&T has awful international roaming options, my best option is suspending my account for a year and buying prepaid SIM cards in every country I visit. This ends up averaging $20 per month for data, calls, and texts. This is significantly cheaper than adding a $30 per month international package to my existing AT&T bill.

Since account suspensions have a maximum of 6 months and I’m halfway through the year, it was time to re-suspend.

I was greeted in the typical fashion.
“Hello, and thank you for calling AT&T. My name is Ebony. How can I help you today?”
“Hi Ebony, I was wondering if you can help me suspend my account for 6 months.”
“Sure, I can do that. May I ask why you’re suspending it for 6 months?”
“Well, I’m traveling the world and I won’t need my –”
“WHAT? You’re traveling the world for 6 months? That’s amazing…” her voice trailed off.
“Yeah, it’s really fun! But I’m actually traveling for a year which is why I need to re-suspend it.”
“Oh, wow… I wish I could do that. How can you even do that?”
“Well, I started saving eight years ago. And a lot of cost reduction. Like think about your cable bill. Let’s say it’s $25 per month. That doesn’t sound like much, right?”
“No.”
“But if you think about a full year of cable, that’s $300. And over eight years –”
“That’s $2400.” She was much faster than me with the arithmetic.
“Yeah, and that’s just TV. Imagine what you’re paying for a car with insurance, gas, upkeep, and the car itself. Or anything you pay for in life.”

We ended up talking about finances, travel, and other non-AT&T related things for 10 minutes. Knowing these calls are recorded for quality purposes, she started winding down because Tom from Delaware probably needed help suspending his teenage daughter’s account.

“Well, I’m inspired,” she said.
“I’m so happy to hear that! It was nice chatting with you.”

After hanging up, I got a giddy feeling. I never thought my journey for inspiration would be something that could inspire others. But this conversation made me realize some people want to do what I’m doing but can’t imagine it ever being possible.

So, thanks Ebony. You’ve inspired me to start sharing how I planned this year off and how to overcome the fear and doubt that prevents us from believing we can do what we dream of doing.

This is Maker Year

I had a habit of doing side projects outside my day job, but a few problems existed:

  1. I was more interested in side projects than my actual job
  2. I have more ideas than time to build them
  3. Working at a computer day and night leads to repetitive stress injury and poor health

Instead of working on side projects while half-assing my job, I quit and dedicated a year to working on personal projects. I’m calling this Maker Year.

The plan for Maker Year

Do 8 one-week projects. After that, work on the best project for 4 more weeks.

mkryrplan1

Do this 3 times.

mkryrplan2

Work on the best project of the year for 12 weeks.

mkryrplan3

In the end, there will be:

  • 24 seed projects
  • 3 incubated projects
  • 1 developed project

It’s a process that should provide great breadth, great depth, and great iteration. Something I’ve noticed about ideas is they’re amazing until you start working on them. One week is enough time to learn how amazing something is or isn’t. And with the 4 week and 12 week blocks for further development, I can easily abandon a project at the end of a week to start something new.
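The plan’s arithmetic can be sanity-checked in a few lines (the week counts come straight from the plan above):

```javascript
// Maker Year schedule: 3 rounds of (8 one-week seed projects + 4 weeks
// incubating that round's best project), then 12 weeks developing the
// year's best project.
const rounds = 3;
const seedsPerRound = 8;
const incubationWeeks = 4;
const developmentWeeks = 12;

const seedProjects = rounds * seedsPerRound;      // 24 seed projects
const incubatedProjects = rounds;                 // 3 incubated projects
const developedProjects = 1;                      // 1 developed project
const totalWeeks =
  rounds * (seedsPerRound + incubationWeeks) + developmentWeeks; // 48 weeks
```

48 weeks of making, which leaves a few weeks of slack in the year.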

In an ideal case, one or more of these projects can generate income and I can keep doing this. In the worst case, I won’t do any projects and look back at my year with great regret.

This is Maker Year

It’s exactly like a paid sabbatical, except there’s no pay and no job to return to.

It’s exactly like unemployment, except there’s work. Work that’s 100% self-directed, and doesn’t feel like work.

It’s exactly like school, except there are no classmates, no credentials, and no teachers.

This is Maker Year. It’s exactly the opposite of “Don’t quit your day job.”

Why I’m okay with someone stealing my idea

Ten years ago in 2004 I made The Collective Type Project, an online experiment where anyone could draw letters of the alphabet. Everyone’s input would be averaged together for each letter of the alphabet, and in the end a typeface (font) representing everyone’s contribution would be created and made available for free. The project completed in 2007, but you can still download the font and see all the letters.

2as
2 contributions averaged for A

 

255 contributions
255 contributions averaged for A

 

The final typeface for The Collective Type Project

 

Recently, The Universal Typeface Experiment was posted on my Facebook feed:

collectiveFacebookPost

It was posted because I also made a globally crowdsourced, mouse-drawn, eventually downloadable font 10 years ago.

Here’s the description of Bic’s Universal Typeface Experiment:

This experiment allows individuals from all over the world to contribute their handwriting. A specially developed algorithm then calculates an average, allowing us to merge contributions into a single, ever-changing and always evolving typeface.

Ten years ago I’d be livid to see this because I would have thought they stole my idea.

But today, after ten years of designing products, I feel the opposite. I don’t think my idea was stolen. In fact, I’m excited for Bic’s project because I’ve learned a few lessons.

Lesson 1: Ideas are rarely unique

Before I make something, I keep the following in mind:

  1. Before starting, assume what you’re about to make has already been made
  2. While making, assume other people are actively making the same thing
  3. After you’re done, assume other people will make the same thing, whether intentionally or unintentionally

It’s just how things work. Ideas are cheap, plentiful, and tend to repeat. Don’t take it personally when they do. But some people do take it personally, and that leads to the next point.

Lesson 2: I don’t think anyone stole my idea

It’s a marvelous conceit to believe someone stole your idea.

The first time someone stole my idea was in Mrs. Small’s first grade classroom when I was seven years old. At show-and-tell, I was going to show off my Transformer toy, but Scott shared the exact same toy before I could.

He STOLE MY IDEA.

Two years later I discovered a large quartz deposit in the backyard of a house next to the schoolyard. A chain-link fence separated me from some dirt, so I used a stick to dig at it, scored some sweet quartz, and quickly became the first quartz baron of Oakwood Public Elementary School.

A week after showing everyone my quartz haul, dozens of quartz-greedy children abandoned a sweet playground to poke at the dirt with sticks through a rusty chain-link fence, allowing me free rein of the swing set.

But still, they STOLE MY IDEA.

Things like this happened for years. All the time. Even recently when I worked at Microsoft designing the thumb keyboard for Windows 8.

We were so excited to reveal it to the world at the D9 conference in 2011. Months of work led up to this moment. After the big reveal, there were like, three tweets about it. It was that monumental.

I remember where I was when the world was changed forever.

Four days later, Apple revealed updates to their forthcoming iOS 5, which included a thumb keyboard for iPad. What!? They must have seen our keynote, and in 96 hours scrambled, strategized, planned, designed, coded, tested, and integrated a fully functional thumb keyboard into iOS. Because there’s no way they could have had that idea without seeing my idea first.

Apple STOLE MY IDEA.

If the theme of STOLE MY IDEA doesn’t sound completely ridiculous yet, it should. Because saying someone stole your idea (lacking evidence) is like saying humans are incapable of independent thought. That ideas are not intuited, but only exist by stealing from others.

You may be saying THEY STOLE MY IDEA. But what everyone hears is I’M INCREDIBLY INSECURE ABOUT THIS SMALL THING THAT NO ONE BUT ME CARES ABOUT.

Ask anyone “Who released the thumb keyboard first: Apple or Microsoft?” and you’ll get a consistent answer: “Who cares?” No one cares whose idea it was. No one cares who was first.

In the end, screaming “they STOLE MY IDEA” only makes me sound like a petulant first grader in Mrs. Small’s class.

Lesson 3: Influence is inspiration

If you want to influence and inspire people, you can’t be upset when their work reminds you of your own.

Averaging many visual things into one visual thing isn’t a unique concept. I was probably influenced by Jason Salavon’s work where he averaged every Playboy centerfold into one image:

salavon
Jason Salavon’s Every Playboy Centerfold, 1988-1997 

One of my all-time favourite songs is A Warm Place by Nine Inch Nails. Interestingly, the melody is nearly identical to David Bowie’s Crystal Japan.

In an interview with the two musicians, Reznor talks about writing A Warm Place and how it sounded too good to be original. Unintentionally, he re-wrote elements of Bowie’s song. Was Bowie pissed to find out? No, because Bowie wrote it 14 years prior and had written, evolved, and released a whole pile of new work since then.

But not only was he not pissed, the two collaborated on “I’m Afraid of Americans.” Similarity and influence don’t have to end in antagonism.

Did I influence someone involved with the Bic project? Maybe someone saw my site ten years ago and was unconsciously influenced. Maybe not. But what I do know is I now have something in common with MediaMonks (the people who built the Universal Typeface Experiment.)

Honestly, I’m really excited for MediaMonks because the project will reveal some really cool data, insights, and human behavior just like Collective did. It’s really fun stuff. And I hope I’ll be able to meet or at least chat with some of them to learn how they approached the project, where they’ll take it, and some of the cool stories they find in the data.

I’m sure they’d love to hang out if the first thing I said was “HEY ASSHOLES, YOU STOLE MY IDEA. We should get coffee some time!”

Lesson 4: Execution > idea

The Universal Typeface Experiment is being executed in a way Collective couldn’t. Because I love the concept of aggregating mass participation into an unpredictable and functional end product, I’m just super stoked to see it evolve.

In 2004, no one had a touchscreen, web servers had bottlenecks, and mass participation was difficult.

In 2004, people had to “write” letters into Collective with mice connected to desktop computers. My web server limited me to storing 255 contributions per character. Rallying mass participation was difficult unless you were mentioned on a major design site like K10K or NetDiver.

When I built Collective, Facebook was only for Ivy Leaguers, Twitter didn’t exist, Reddit didn’t exist, YouTube didn’t exist, and DeviantArt was a lot of drawings with bad lens flare effects. The best I had for rallying mass participation was combining MSN Messenger, email, and Friendster and hoping it’d appear on Slashdot.

In 2014 things are different. Ubiquitous touchscreens make drawing letters easier and better. Cheap computing allows Bic to get over 1 million contributions instead of 30,000. And mass participation is easy thanks to the powerful sharing methods of the social network du jour.

On top of that, Bic is bringing in user data for things like handedness, country, age, and gender to make things even more interesting.

I had fun building Collective, it was a cool portfolio piece that helped me get a job I loved, and then I moved on to the next thing. If I cared about it that much, wouldn’t I have evolved it by now? How can I be mad about someone doing something similar when I practically abandoned the project eight years ago?

Lesson 5: Protect when necessary

Everything I’ve written so far has been lovey-dovey. But there’s always the chance someone intentionally took Collective and re-implemented it.

IP theft is horrible, frequent, and can destroy people and businesses when it happens. But if you’re in a position where someone stealing your idea results in you getting fucked, your job description has a new bullet point: Protect and defend IP. If someone steals and implements your IP, you need to get better at your job.

In conclusion

Know what’s valuable, know what’s worth defending, and protect it. Because it will be replicated intentionally or otherwise.

Maker Year Project #1: The Steven Seagal Turing Test

I’m happy to share my first real project for Maker Year: The Steven Seagal Movie Title Generator. If you’re familiar with Steven Seagal or his films, I’m sure it will make you laugh.

Project Origin
On June 5, I arrived in Cambodia’s capital, Phnom Penh. Exhausted, overwhelmed, and disoriented, I sought reprieve in my hotel room after a long “short walk” through the city. While I tried not to sweat too hard in my 35° C room, the television presented three options:

  1. Watch Twilight with Khmer voiceovers
  2. Watch Force of Execution, Steven Seagal’s 2nd newest film
  3. Watch nothing and turn it off

Naturally, I chose #1.

But I was quickly confused. Why does this gentleman always have a 10,000 foot stare? Why does he sparkle in the sunlight? Why is this girl’s lip not scarred and scabby from two decades of pensive biting?

That’s when I turned to Force of Execution. A film with a Tarantino-esque title sequence followed by a 99-minute plot hole.

On learning the title of the film, I realized it means absolutely nothing. It’s just random words that conjure imagery of detectives, criminals, or courtroom proceedings. Yet it sounds intense, serious, and legitimate. Something that would impress a 12- to 15-year-old boy. So I look into Seagal’s other films to see if this naming convention is “a thing”.

Not only is it “a thing”, I quickly learn Seagal has starred in over 40 films since his 1988 debut, Above the Law. And all titles follow this theme.

And this is when I invented the Steven Seagal Turing Test. “Can a computer fool someone into thinking an algorithmically generated Steven Seagal film actually exists?”

Thus, in Southeast Asia a seed was planted. The Steven Seagal Movie Title Generator is but a seedling, ready to grow into a sturdy tree bearing sweet fruit.
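The core of a generator like this is simple: pick one intense-sounding phrase from each column and concatenate. Here’s a minimal sketch – the word lists below are my own illustrative guesses, not the project’s actual vocabulary:

```javascript
// Toy Seagal-style title generator. Each title is one phrase from each
// column; the lists here are illustrative, not the real project's data.
const COLUMNS = [
  ['Out for', 'Marked for', 'Driven to', 'Hard to', 'Born to'],
  ['Justice', 'Vengeance', 'Execution', 'Kill', 'Death'],
];

// rand is injectable so results can be made deterministic for testing.
function pick(list, rand) {
  return list[Math.floor(rand() * list.length)];
}

function generateTitle(rand = Math.random) {
  return COLUMNS.map((column) => pick(column, rand)).join(' ');
}
```

For example, generateTitle(() => 0) always yields “Out for Justice” – which, notably, is also a real Seagal film.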

What problem does this solve?
Boy, do Eric Ries and Startup Weekend love that question. This doesn’t solve anyone else’s problem at all. It solves my problem: How do I transition from building things in Flash to building things in technologies that work on any device?

I figured it was time to learn me some SVG and jQuery.

What I learned

  • SVG needs major improvement in the typography department. Word wrapping doesn’t exist.
  • Use getBBox() to find the x, y, width, and height values of SVG elements.
  • Don’t create SVG elements with jQuery. The elements won’t be recognized as SVG and things like getBBox() won’t work. Instead, I use this method provided by a helpful person on Stack Overflow.
  • At the moment, getBBox() doesn’t work with anything in a <tspan> in certain browsers, so use <text> if you’re programmatically measuring and positioning text.
  • SVG attributes like textLength are case-sensitive, so they’ll break if you define or manipulate them with jQuery’s attr() function, which converts everything to lowercase.
  • You can get the raw HTML object from a jQuery object like this: $('#elementName')[0].
  • When learning a new language or technology, the project I’m applying it to needs to be well scoped, easily understood, and devoid of ambiguity. Otherwise I’m figuring out two things at once and I’m easily overwhelmed by not knowing anything and having to learn everything. I’ll rapidly stall out and resign myself to watching Terminator 2 with Khmer voiceovers.
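The createElementNS workaround above can be sketched as a small helper. This is my own illustrative version, not the exact Stack Overflow snippet; the document is passed in as a parameter purely so the helper is easy to exercise outside a browser:

```javascript
var SVG_NS = 'http://www.w3.org/2000/svg';

// Create an SVG element in the SVG namespace. jQuery's $('<rect>') creates
// an element in the HTML namespace, so SVG APIs like getBBox() won't work
// on it; document.createElementNS avoids that.
function svgElement(doc, tag, attrs) {
  var el = doc.createElementNS(SVG_NS, tag);
  // setAttribute preserves case. jQuery's attr() lowercases attribute
  // names, which silently breaks camelCase SVG attributes like textLength.
  Object.keys(attrs || {}).forEach(function (name) {
    el.setAttribute(name, String(attrs[name]));
  });
  return el;
}

// In a browser:
//   var text = svgElement(document, 'text', { x: 10, y: 20, textLength: 180 });
//   document.querySelector('svg').appendChild(text);
//   var box = text.getBBox(); // { x, y, width, height }
```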

Next steps for this project

  1. Make the posters savable and shareable
  2. Apply the actual Steven Seagal Turing Test by letting people vote whether they think a title is real or automatically generated
  3. Create more poster variants
  4. Improve typography: layout, sizing, and typefaces

Fun facts about Steven Seagal
Steven Seagal is 62 years old and people say awful things about him “becoming old, slow, and fat”. To those people: when you reach age 62, I invite you to compare your appearance and achievements to Steven Seagal’s. You’ll beat him on total number of internet comments written. That is all.

Steven Seagal is prolific. To date, he’s been in 45 films, and not a single one involved being a kindergarten cop, acting alongside Danny DeVito, or doing a voiceover for a baby or cartoon character. Every single one involved crushing someone’s bones or joints.

Here are all his films:
Above the Law (1988)
Hard to Kill (1990)
Marked for Death (1990)
Out for Justice (1991)
Under Siege (1992)
On Deadly Ground (1994)
Under Siege 2: Dark Territory (1995)
Executive Decision (1996)
The Glimmer Man (1996)
Fire Down Below (1997)
The Patriot (1998)
Not Even the Trees (1998)
Prince of Central Park (2000)
Exit Wounds (2001)
Ticker (2001)
Half Past Dead (2002)
The Foreigner (2003)
Out for a Kill (2003)
Belly of the Beast (2003)
Clementine (2004)
Out of Reach (2004)
Into the Sun (2005)
Submerged (2005)
Today You Die (2005)
Black Dawn (2005)
Mercenary for Justice (2006)
Shadow Man (2006)
Attack Force (2006)
Flight of Fury (2007)
Urban Justice (2007)
Pistol Whipped (2008)
The Onion Movie (2008)
Kill Switch (2008)
Against the Dark (2009)
Driven to Kill (2009)
The Keeper (2009)
A Dangerous Man (2009)
Machete (2010)
Born to Raise Hell (2010)
Deadly Crossing (2011)
Sheep Impact (2011)
Maximum Conviction (2012)
Force of Execution (2013)
Gutshot Straight (2013)
Dark Vengeance (2014)

The most intimidating part of creation

If you’re doing it wrong, the most intimidating part of creation is starting. An empty page, a blank canvas, or the first entry of a diary terrifies you.

I said “doing it wrong” because the potential for creation should be inspiring. An empty page, a blank canvas, and the first entry of a diary should excite you, fill you with anticipation, and spark an impatience that you can’t start soon enough because the act of creation is more alluring than the end result. If you’re doing it right, the most intimidating part of creating is stopping.

When you’re doing it wrong, the act of creation is terrifying. You feel fear. You’re anxious. You procrastinate. You’re distractible. You’ll do anything but start. The only things you create are excuses.

“I’m tired.”
“I’ll do it later.”
“Just one more.”
“I’m not in the mood.”
“I need to decompress.”
“There’s not enough time.”
“I have to ______________ instead.”

Excuses and procrastination are acts of self-preservation guarding you from something worse than death: discomfort and isolation. But let’s back up and ask “why” a few times.

Why aren’t you creating?
Because I’m procrastinating.

Why are you procrastinating?
Because the empty page is intimidating.

Why is it intimidating?
Because I don’t know where to start.

Why don’t you know where to start?
Because I need to organize my thoughts and do my research.

Why do you need to organize your thoughts and do your research?
Because I need to be prepared.

Why do you need to be prepared?
If I’m not prepared, I might be doing something wrong. I might make a mistake.

Why would you avoid making a mistake?
Because mistakes are bad.

Why are mistakes bad?
Because mistakes lead to criticism and I don’t like criticism.

Why don’t you like criticism?
Because criticism is judgement. With judgement comes the potential of failure.

Why avoid failure?
Failure leads to rejection.

Why avoid rejection?
Rejection leads to isolation, discomfort, and a harder life.


The blank page is terrifying because you’ve judged your work and witnessed failure before you’ve even done anything. And what better way to avoid failure than abstaining from activities that bring failure?

What better way? How about reframing “mistake”, “failure”, and “criticism”, and embracing serendipity instead.

1. Redefine “mistake”
We’re taught from an early age to avoid mistakes. But mistakes are how we learn. If you’re not making mistakes, you’re not learning. If you’re not learning, you’re not growing. Growth makes you better, resulting in better work.

“Mistake” can be subverted too: You’re not making mistakes, you’re experimenting.

2. Prioritize failure
Mistakes add up to failure and we’re taught from an early age to avoid failure.

Fail the test and you’ll get a bad grade. Enough bad grades and you’ll fail the class. Fail the class and you won’t get into university. Fail getting into university and you’ll fail at getting a well-paying job. Fail to make enough money and you FAIL AT LIFE.

It’s hard to shake the failure mindset because it’s taught at an early age and it helps us survive as a species. Consequences are severe when you fail at drinking water, crossing the street, or flossing and brushing your teeth. Consequences are not severe when you fail at writing a blog post, drawing a picture, or journaling.

For everything you do, ask “How severe are the consequences of failing what I’m doing?” Most times the answer will be “Not at all.” It’s incredibly liberating.

3. Good criticism makes you better
Good criticism doesn’t judge. It allows other perspectives to reshape your work in ways you can’t. It helps you grow.

Here’s good criticism:
“This part is working for me because ______________. That part isn’t working for me because ______________. That’s my input and it’s your choice to use it or not.”

Here’s bad criticism:
“I don’t like it. This sucks. That’s stupid. You should do ______________ instead. ______________ would be better.”

Good criticism brings new perspectives, entices conversation, and builds trust. Bad criticism is bossy, judgmental, and hinders growth.

Reject bad criticism. Especially bad self-criticism.

4. Embrace serendipity
Removing expectations and allowing yourself to be taken in new unexpected directions results in great work. Expecting to create A and ending up with Z is one of the greatest outcomes of creation. You surprise yourself, you end up with something better than you expected, and you grow along the way. This is embracing serendipity, and it’s a lot easier when you redefine “mistake”, prioritize failure, and focus on good criticism.


This is my first post for Maker Year. It took me seven weeks to actually sit down and write it because I was intimidated by the blank page for all the reasons above. I started writing this with the expectation of creating A (sharing my plan for the year ahead). But by embracing serendipity, I unexpectedly wrote Z (what you just read), grew in the process, and got excited about creating again.

What do you want to create and what’s preventing you from starting?