Coming Up For Air

This morning I defended my dissertation (successfully).

Last week a part of my literature review was published in New Directions for Evaluation. Link to the whole issue here.

Later this week I’m headed to talk to the Indiana Evaluation Association about graphic design in evaluation reporting.

At the start of November, I’ll be at the American Evaluation Association’s annual conference, running around like mad. I have a full-day workshop on graphic design in evaluation slideshows, a 90-minute demonstration on the same topic, plus the business meeting for the Data Visualization and Reporting Topical Interest Group that will be run as an Ignite session. Not to be missed. And we’re hosting a Slide Clinic on Tuesday night in the Laguna Room, performing on-the-spot slide triage. Talk about my kind of fun.

So you’ll forgive my absence on the blog and catch up with me soon, right?


Evaluation Report Layout Checklist

A graphic designer, I am not. A laborer of long words and awkward sentence structures, I am. That’s why I became super fascinated by the world of report layout and formatting. Maybe the geekiest hobby, I hear you say. But so important!

I’ve detailed the importance of good communication elsewhere on this blog. For evaluators in particular, the packaging and presentation of our content are often dealbreakers. Indeed, at times our choices in font and line length actually impede our clients’ ability to comprehend our findings. Yikes! Not our goal!

After reading a bazillion books and getting input from a panel of graphic design experts (Kevin Brady, Peter Brakeman, Christy Ennis Kloote, and Chris Metzner), I’ve compiled a checklist of graphic design good practice specifically for written evaluation reports.

Want a copy? Send me an email.

But be warned, I’m about to use the checklist on roughly 90 evaluation reports as part of my dissertation. Surely in there I’ll find good reason to make a tweak or two. I’ll post the revised version then. But in the meantime, go forth and make good work!

I Think PowerPoint Just Did Something Right

It is so awesome to see Microsoft addressing the misuse of PowerPoint (I mean, did they really have a choice? One of their flagship tools has been so badly knocked in the media for undermining understanding that it was either address it or repackage!).

Though the storylines were insulting and they mistakenly referred only to males in the first sentence, good gracious, they got it right!

1. Keep your chart simple

2. Don’t overdo the animations – use them when you are changing sections or topics

3. You can kill your presentation by overdoing bullet points

4. Move supporting data you would put on the slide into the notes section or in a printout

5. Ask yourself: Can I make my point with an image? (Hint: YES!)

6. Please don’t read off your slide

7. Cut slide text wherever you can (then go back and cut more)

8. Distribute your handout at the end of your presentation so the audience stays focused on you

9. Help your slide content stand out by using simple, unpatterned backgrounds (but avoid their recommendation to use the preloaded templates – everyone recognizes them as a template!)

10. Step away from the computer to see if your text is large enough for the audience to read (Hint: Pick size 30 or larger)

See all 5 short videos here: http://www.microsoft.com/office/powerpoint-slidefest/do-and-dont.aspx?WT.mc_id=oo_enus_eml_videos

Looks like you can also submit your best slideshow for a contest starting soon. I’d love to see how this gets judged because in my view the best slideshows are completely uninterpretable without the presenter. If a slideshow can stand on its own, you need to cut text.

Nevertheless, it looks like the folks at Microsoft have finally figured out what some of us have been screaming (and begging and pleading) about for a few years now. Sweet! The more, the merrier.

 

What a Break!

It has been a quarter of a year since my last post. I was partly consumed by dissertating. I partly didn’t have much to say. Back in a flash!

Ooh, The Universal Traveler

A super weird chain of events landed this book – The Universal Traveler: A Soft-Systems Guide to Creativity, Problem-Solving, and the Process of Reaching Goals – in my hands. It was published in 1972, so you know what this means – typewritten text, rubber-stamped headings, and collage for graphics. Oh yes. And the second author, Jim Bagnall, calls himself a graphic coordinator and has this to say for his bio line: “He sees his role as that of visually simplifying the complexities of life.” Oh hell yes. Could it get better? THERE IS A SECTION ON EVALUATION. Giddiness hardly explains my state.

Travel, in this sense, is the systematic inquiry into just about anything. The authors propose that preparing oneself to travel with logic and creativity (yes, both), a good design, and some operationalization will equip one for any journey.

And the basic function of embarking on this journey is to “free us from uncertainty, anxiety, confusion and other insecurities.” Sounds A LOT like what evaluators hope to do for clients (only stated with more eloquence). In fact, the whole journey process they outline is, itself, a process of evaluation, though they only call the final step “evaluate.” And in this book, the final “evaluate” step is much more laid back and mellow than the typical evaluation journeys I am on. So the methods they list might seem less-than-rigorous, such as writing a letter to a friend. Evaluation in their use of the term comes across as more of a debrief than a systematic process in and of itself.

In fact, the seven step process of taking a journey is much more akin to the systematic process we typically think of as evaluation:

1. Accept situation

2. Analyze

3. Define

4. Ideate

5. Select [appropriate methods]

6. Implement

7. Evaluate

You’ll have to check out page 20 to see this process laid out in several types of rubber-stamp-and-pencil logic models. From 1972. So awesome.

So even if their idea of evaluation is more of a thought exercise, their process as a whole reminds us professionals to lighten up a little so there’s room for the creativity needed to push ourselves and our field to new heights. Ultimately, The Universal Traveler proves that it is possible to be both logical and cool. In fact, both appear to be critical to growth.

__

Apparently its popularity is not confined to my chair – the most recent edition was published in 2003 and Amazon carries copies.

 

3 Lessons from DADT Reporting

Yesterday’s release of the findings from the Pentagon’s study on the repeal of Don’t Ask Don’t Tell encapsulated three important lessons for evaluators regarding how we report and communicate about our work.

#1 The segment on CNN was summarized in seven words: Letting Openly Gay Troops Serve Won’t Hurt. Seven words. These seven words have been long awaited by many on each side of the issue. For many LGBT troops and their families, it is a confirmation of what they have known all along. For many conservative politicians, it was the evidence they sought before taking further action on DADT (and some are still insistent on heel-dragging *cough*McCain*cough*). Serious and extensive methodology was put into place to systematically survey multiple stakeholder groups, consuming months of staff time and energy, producing surely a report several reams in length. And yet the authors can cut to the quick with just seven words. We should all strive to be so succinct.

#2 Okay, I’m still stuck on the seven words. Because the last two are so critical – “Won’t Hurt.” These words were clearly written by someone who is not a well-schooled academic. For well-schooled academics with backgrounds in statistics know full well that it is almost always impossible to state anything with such certainty as “Won’t Hurt.” Academics are trained to couch findings with a lot of “maybe under these precise circumstances the impact would be less severe” and a grip of caveats like “of course, we recommend further investigation of the topic in a battery of studies across multiple sites before action should be taken based on these results.” You know these academic reports – they are the ones you finish reading (if you get to the end) with a sense of wonder about what they really proved. While it is fine and well to want to CYA, the caution typically deployed in academic reporting takes the wind out of the sails of action. It is clear and decisive language like “Won’t Hurt” that decision-makers need in order to proceed.

#3 The main finding was reported with one important secondary finding – that strong leadership is the key to culture change. No surprises there for anyone involved in consulting organizations. This is mantra-like in our world. But it isn’t the content of this finding that holds our lesson. The lesson lies in the design of the communication. The study authors and/or segment writers used a powerful 1-2 punch, setting out the main finding in plain wording and pairing it tightly with the necessary component for success. Granted, there was a lot of talking in there, but the listener or reader walks away with the two key ideas in unassailable language.

The Pentagon’s report on their study of the repeal of DADT was one of the best acts of communication of findings I have seen from a statistical endeavor in as long as I can remember. Kudos. So if everything we think about good reporting and communication is true, this thing should be repealed in no time.

Here is the link to the original broadcast on CNN, complete with a follow-up story sharing the details of the findings: http://www.cnn.com/2010/POLITICS/11/30/military.gay.policy/index.html?hpt=T1

Review: Napkin Sketch Workbook

An anonymous sweetheart sent me a copy of Don Moyer’s Napkin Sketch Workbook in the mail yesterday. I read it in one sitting. I’m not sure it was meant for me to review, but seeing as how this is an unsanctioned blog, let’s do it anyway.

The workbook’s overall premise is that visualizations are pretty important in conveying information and creating understanding. He also rightly distinguishes visualization of ideas (the topic of this workbook) from data visualization (the topic of lots of other books and sometimes this blog, too). For the timid, there are super helpful “classic” visualization strategies – such as showing a hierarchy or a timeline – that provide a structure to work within.

The co-big idea of the book – and the reason I like it soooo much – is this: Anyone can draw. Even you. Even me. The annoying part of the workbook is the patronizing step-by-step explanation of how to draw a stick figure (and are we still using skirts for women, really?). But Moyer does it with such humor and simplicity that I found myself chuckling along. In fact, it served as perfect inspiration for a talk I was preparing this morning. Here’s a sneak peek of a visualization I made – totally inspired by the workbook – showing how graphics assist keeping information in long-term memory.

While Moyer doesn’t specifically speak to evaluators, I can totally attest to the role of hand-drawn visualizations. They’ve been used more than once in an interview, when the interviewee had to draw her answer. I used one once to depict dose-response ideas to a prospective client, who talks about that drawing to this day. As evaluators, we deal with a lot of complex information and sophisticated (or complicated) methodologies and analysis techniques. Drawing and visualization are awesome tools for translating those concepts into a succinct and accessible format.

My Graphic Design Circa 2005

It’s so easy to knock instances of bad graphic design in evaluation. They’re ubiquitous and, like many of you, I’m really good at it.

So in the spirit of transparency (and curiosity) I dug up old flash drives to locate bad examples of graphic design and evaluation of my own. Nothing but pages of text in my upbringing as an evaluator. So I went back further – to anything I’d ever done in academia. Not a graph or chart or picture to be found. Not a single PowerPoint to pick apart, even from my days teaching undergraduates. No kidding. The closest thing I could find was a set of transparencies (did I think it was 1980 instead of 2005??) I used in a conference presentation on international child labor. Deep breath – here is page one:

I mean, oh god! And this was my dynamic opener!

If memory serves, I committed nearly every cardinal sin of presenting findings:

1. I read the text aloud, probably slower than the audience was reading in their minds.

2. This page of grey text did nothing to support my talk, when graphic representations of several of these points could have been more appropriate.

3. Even as a takeaway handout, the bullet points don’t stand on their own. “Bonded and invisible”?

4. I stood with the projection light in my face, so the audience had no choice but to focus on this boring transparency (or their shoes. Thank god smartphones weren’t popular back then).

5. Each of the subsequent transparencies looked exactly the same – and not because of ingenious use of a grid system, but because I had no idea how to be a decent presenter.

And as I recall, there was not a single question from the audience when I was done – and I’m fairly sure it wasn’t because they were dumbstruck by the enormity of the problems in child labor in Mexico. Let’s just say it was not the most engaging presentation possible.

The good news is that change is possible. Five years later I can look at this bulleted list and see several options for visualizations that would better represent the story I was trying to tell. (You are probably thinking, “Well, show us your ideas!” but I have a dissertation to write and I’d actually like to see your ideas more. Email them to me!) Progress and growth mean that I am ready to rally my fellow evaluators toward more thoughtful and intentional designs, ones that make people listen and wonder and act. Join me!

Product Evaluation of Evaluation Reports

Okay, people, I just endured another heart-wrenching conversation about the state of slideshows in evaluation. We all get it, right? Death by PowerPoint is a common affliction. Point well taken.

But if it is such common knowledge that our slideshows and our reports are an utter bore to endure, why are we still churning out the same thing like it was produced on an assembly line instead of by a team of inherently intelligent and creative people?

I propose the problem is that we really don’t know how bad our own work is. We are generally really good editors – we can critique another’s evaluation report like we are preparing for late-night stand-up. But we are nowhere near as adept when it comes to looking at our own report, fresh off the printer.

As a step toward a solution I say we speak our own language, assuming that will be the best way for evaluators to listen and learn. I say we treat our evaluation reports as the products they are and submit them to product evaluation. Perhaps we do need to treat it like an assembly line item for a moment to accurately assess – rather than conjecture about – the state of affairs.

I’m going to develop a checklist (and please don’t point me to Miron’s evaluation reports checklist) for what good use of graphic design should look like in our evaluation reporting. But I need your help. What are the graphic qualities of a good evaluation report (let’s think of strictly the written modality for now) that make it interesting and engaging for the audience? Email me, comment on this post, or send me a message on Twitter.