Tuesday, December 05, 2006

With all due respect
I promised at the end of my last post to talk more about how surveys can provide more than just a measure of communication vehicle use. I think this topic also provides a nice link to this site's most recent comment. We'll get to that in a minute.

Understanding what messages are getting through to your audience, and through what vehicles, is powerful stuff. But it doesn't tell you much about the impact of your messages on that audience. I use another set of simple questions for that:

I found Dr. Clem’s presentation/the article about pencil safety:

  • Interesting - it was entertaining and informative
  • Credible - I believed what I heard
  • Relevant - the topic is important

Respondents rate their level of agreement with these statements from “strongly agree” to “strongly disagree” or “I did not attend/read this.”

I like this approach because it gets at the impact of communication beyond the text of the message. People make many judgments based on their personal feelings about the messenger or the credibility of the vehicle. It also provides a window into how a team relates to a leader over time – is that leader gaining or losing credibility? If you ask these questions of each presenter, article or vehicle, you'll have a hierarchy to examine for key strengths and weaknesses. As the communications lead, you gain powerful information that will help you recommend actions in response to changes in audience sentiment. Even the most intractable leader will have a hard time staying the course in the face of a low credibility ranking. (Pause for laughs.) And you can coach leaders more effectively based on their communications strengths and challenges.
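
Those agreement ratings become most useful once you turn them into numbers you can rank. Here's a minimal sketch in Python, assuming a 5-point coding of the agreement scale; the presenter names and responses are invented for illustration, not from any real survey:

```python
# Sketch: turn Likert agreement responses into a per-presenter score.
# The scale coding, presenters and responses below are hypothetical.

SCALE = {
    "strongly agree": 5,
    "agree": 4,
    "neutral": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

def credibility_score(responses):
    """Average the 'Credible' ratings, skipping non-attendees."""
    rated = [SCALE[r] for r in responses if r in SCALE]  # drops "did not attend/read"
    return sum(rated) / len(rated) if rated else None

survey = {
    "Dr. Clem's presentation": ["strongly agree", "agree", "did not attend", "agree"],
    "Pencil safety article": ["disagree", "neutral", "agree", "strongly disagree"],
}

# Rank presenters/vehicles from most to least credible.
ranking = sorted(survey, key=lambda k: credibility_score(survey[k]), reverse=True)
for name in ranking:
    print(f"{name}: {credibility_score(survey[name]):.2f}")
```

Run the same calculation on the "Interesting" and "Relevant" ratings and you have the hierarchy of strengths and weaknesses described above.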

OK, let’s return to the comment I mentioned at the top, posted by Pat May. Here’s an excerpt:

With all due respect - I find it admirable that you do an effort to measure the effect of your channels. But is it enough? …We're hired to make a difference in our organizations. That difference in essence is to influence behavior. Behavior has an effect on business measurement like employee retention, customer retention, sales, new bizz etc. What I want to find out is how far can you go with pure metrics to connect input (communication) with output (organizational results that are measured on the 'bottom line').

This is the question communicators debate at IABC meetings and late at night at the bar at Ragan conferences. Are we worth a damn? I have at least 10 possible responses to this.

  1. To use a currently popular cliché, don’t make the perfect the enemy of the good. This blog was started to address the paralysis communicators often face because metrics are both difficult and imperfect. Use metrics to understand your basics first – don’t try to solve the world’s problems until the day after tomorrow. Measure what you can and see if it helps you get better at your job.
  2. Communications aren’t going to go away without you. People are told – and learn – that they have to show up in the morning for work. That’s behavior. However, they have a lot of different feelings about how they are told. That’s where you – and your metrics – come in.
  3. If you really want to move the business, focus your communication abilities on the business. If you’re doing newsletters about the latest developments in your department, and who had a birthday, stop. Instead, find out how you can use your skills to help your department do its job better. Sounds harsh, but I’m sure we can all find ways to use our skills that are more important to the bottom line. I recently did a daily newsletter tracking online buzz about a new product so we could stay on top of issues. People are now beating down my door for the same treatment for their new products.
  4. I was half-kidding above. Those little birthday notices can be valuable in a million ways. They build community and team cohesion, and may draw people to the newsletter so they are exposed to other important messaging. However, I don't know if that's true – someone should measure it!
  5. If you’ve correctly identified a behavior, you should be able to track it. Do you conduct an annual employee survey? Go increase the response percentage – that’s behavior you can measure. What behavior do you want to change? As I've written before, if it's "work harder and have a better attitude" you have not identified the behavior you want. Driving behavior change may be another blog entry, but in the meantime, go explore the SimplerWork.com site for inspiration.
  6. If you do everything you can and can’t move the needle, don't blame the metrics.
  7. Many communications teams do not drive the business. I’d even say most. I've been on communications teams that were extensions of the CEO’s ego, for good or ill. Many have other duties, such as event planning, that require some non-communication skills. (And many communications skills, too.) Many provide a sense of family and community that is extremely comforting to a sizable segment of any large employee population but of no interest to others. I’m not aware of any – there may be a few – that are revenue sources. We are cost centers. A few – these happy few – provide significant cost avoidance advantages. Anything you do of significant value to the business will be pure gravy and will blow your bosses away. Be sure to measure it. If you have metrics, you will gain influence and credibility, so you will have more opportunities to improve the business.
  8. If your problem is that you don’t know what behaviors or messages to drive, welcome to the club. Neither does your boss. Make it your job to observe and learn what your colleagues respond to and what they don’t. Metrics can help. That’s why it’s good to ask those questions about interest, credibility and relevance. People are not machines. You can’t “input (communication)” and “output (organizational results)” or even behaviors in a one-to-one manner. In our field, communications provide directional information that should improve your chances of getting through to people, with the right message, in a way that doesn’t piss them off.
  9. It takes an amazingly small amount of data to prompt well-directed change for the better. I promise that if you ask 10 good questions you'll be overwhelmed with data that drives new ideas. And, you don't need the kind of statistical certitude in communications that you need in drug manufacturing or atomic power generation. You wouldn't be satisfied with being 80 percent sure that an aspirin won't kill you, but most leaders will act quickly on communications decisions that metrics have said have an 80 percent chance of success.
  10. This job is hard. To quote Tom Hanks as Jimmy Dugan in A League of Their Own, “It's supposed to be hard. If it wasn't hard, everyone would do it. The hard... is what makes it great.” If you really want to use metrics to understand how you’re driving the business, get after it. You may be the person to crack the code.

One more thing - start anywhere! For example, this blog entry, according to the Readability Statistics built into MS Word, is at an 8th grade reading level with a Reading Ease score of 62.5 and 6 percent passive sentences. My goals are to always be below Grade 11 (I love being at Grade 9 or below!) and above 50 in Reading Ease with passive sentences below 10 percent. I really use this tool, flawed as it is, because it keeps me honest. I can show, with metrics, that my audience should be able to read my writing. Not a bad metric for a communicator.
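
If you don't have Word handy, the formulas behind those Readability Statistics are public. Here's a rough Python sketch of the Flesch Reading Ease and Flesch-Kincaid Grade Level calculations; the syllable counter is a crude vowel-run heuristic, so expect its numbers to drift a bit from Word's:

```python
# Sketch: the Flesch formulas behind Word's Readability Statistics.
# The syllable counter is a rough heuristic, not Word's exact algorithm.
import re

def count_syllables(word):
    # Count runs of vowels as a rough syllable proxy.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences          # average words per sentence
    spw = syllables / len(words)          # average syllables per word
    ease = 206.835 - 1.015 * wps - 84.6 * spw
    grade = 0.39 * wps + 11.8 * spw - 15.59
    return ease, grade

ease, grade = readability("I really use this tool. It keeps me honest.")
print(f"Reading Ease: {ease:.1f}, Grade Level: {grade:.1f}")
```

Short sentences and short words drive both numbers in the right direction, which is the whole trick.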

Can someone come over and help me off my soapbox?

Friday, October 06, 2006

Where can I find models of metrics to use for communications measurements? My company is big on metrics and measurements. As the new Communications Manager, I want to start off with benchmarks and measurement tools in place.

Hey, I got a comment! Thanks, Lawrence.

First, let me say that I have not written a book. I've made exactly one presentation at a conference. I'm not a paid measurement consultant, nor do I carefully follow the thinking of those who are. There may be great models out there, but I'm not familiar with them.

[Alright, let's do some research, by which I mean Google. I'll search "measuring internal communications." Hmmm. Melcrum will sell you a 210-page report. I'm sure it's excellent. Call me when you finish it -- next June is pretty good for me if you've finished by then.

For a change, let's NOT be snarky for a second. Check this thread out. I've not read it yet but I will. It's a discussion on isixsigma.com titled "Examples of Measuring Internal Comms Messages." I bet you get some nuts-and-bolts information there.]

But you posted your question on my blog, so I'll give you my answer.

Models are not easy to come by because there isn't a lot of standardization in this area. I think that's OK, because your metrics should be tied to your goals. If you have standard goals, you will probably be able to find standard measurements.

However, I'll bet you have goals that are unique to your role, or to your view of what's important in internal communications. If you read some of my early posts, you'll know that I'm interested, mainly, in two key questions:

One: Did you get this message?
Two: If yes, through what vehicle?

This, for me, is the blocking and tackling part of metrics. Which vehicles are effective at delivering my messages? From that, I can figure out which vehicles to promote, which to deemphasize, which are used by different audiences and so on. (I like some demographic questions in the mix, too.)

I don't believe in asking people how they want to get their messages. Everyone will tell you that front line managers are the best way to communicate. For me, it would be Angelina Jolie whispering the corporate strategy into my ear, but that's not going to happen. Probably not. OK, not. And you are probably not going to be able to control what front line managers do, either. Get over it.

So, the approach above will help you understand your vehicles. The questions go like this:

* "Did you learn lately of a new employee health plan offering?"
* "If yes, how did you hear? Check all that apply."

Then you list your main vehicles, and some others, grouping where possible. For example:
* Company newsletter
* Company home page
* Department e-mail newsletter
* Directly from my manager
* Town Hall meeting
* External news source
* Peer
* Can't remember/don't know

You will think of more options, but try to group them where possible so you don't end up with 20. More than 10 is a lot. Also, the external sources are important. (Stop me if you've heard this.) We asked "Have you heard about the increasing costs of health care?" a few years ago and more than 80 percent of respondents said yes, from external news sources. That took a big educational task off our plate in advance of increasing health care costs. The networks had done it for us.
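
Tallying a check-all-that-apply question like this is simple enough to sketch. Here's one way in Python; the responses are invented, and an empty list stands for someone who never got the message at all:

```python
# Sketch: tally a "check all that apply" vehicle question.
# Vehicle labels and responses are made up for illustration.
from collections import Counter

responses = [
    ["Company newsletter", "Directly from my manager"],
    ["External news source"],
    [],                                   # didn't hear the message at all
    ["Company newsletter", "Peer"],
]

heard = sum(1 for r in responses if r)    # respondents who got the message
reach = 100 * heard / len(responses)      # overall message penetration

by_vehicle = Counter(v for r in responses for v in r)
print(f"Message reach: {reach:.0f}%")
for vehicle, n in by_vehicle.most_common():
    print(f"  {vehicle}: {n} of {heard} who heard it")
```

The first number answers "Did you get this message?"; the per-vehicle counts answer "Through what vehicle?" – which is exactly the blocking and tackling described above.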

There is a separate question of the impact of the messages when they do get through. How do people feel about those messages? How do they feel about the leaders who delivered them? Next time, on Stand On A Box!

Wednesday, June 21, 2006

The best metrics

I've been continuing to conduct surveys to learn how our communication efforts are going over. (True to form, I've done a couple whose results I've never presented. Not too busy to do the research; just too busy to share. Hope that makes all you measurement procrastinators feel better.)

But I've had a couple of recent experiences that illustrate the power of experience and instinct. I want to share one in particular. It's a reminder that research only tells part of the story.

We hold our town hall meetings in a space designed for them. A stage with a podium, three screens for slides or video, a sound system and control booth -- the works. Because it was built to accommodate larger groups, our team tends to feel a bit sparse there. However, the town halls earn good marks on surveys, by and large.

Recently, however, we had a major organizational change. Our team and several others met in the "big room" for the key announcement. We then immediately held a meeting for our team only, crowding 50 people into a conference room. Some sat around the table, the rest stood leaning against the walls, and a few peeked in the doorway.

This second meeting was a revelation. It was a dialogue, with terrific give-and-take between the team and leadership, much humor and a general feeling of camaraderie. The informality of the setting -- the lack of distance between "presenters" and "audience" -- freed everyone.

So my next group meeting for this smaller team will be in a big conference room, with chairs at the table and more around the walls, and people crowding in, and everyone -- boss, admin, staff -- at the same level. I'll let you know how it goes.

OK, back to surveys. I've been conducting a lot of them lately. A post-Town Hall survey; a survey to measure the popularity of a newsletter I launched to support a short-term project; and another as a kind of focus group to see if we can predict the interest in a contest we're designing for employees.

The first one is part of my job -- how did people react to the meeting, the presenters, the boss and the message delivered? I like these because after you've done a few you have a nice baseline, and you can tell when interest and support are soft and when people are truly engaged.

The newsletter survey was pure self-promotion. Yes, I needed to do it as part of good, disciplined communication. But the newsletter was a smash hit, and I already knew that from comments, subscription requests and other indicators. Now I have numbers, charts and quotes that will still be there when my performance review rolls around. I put together a PowerPoint deck and sent it off to my boss just to make sure the good news arrived. Also, the newsletter had one detractor, someone who isn't a big fan of transparency and has a good sense of the power of tightly controlled information. (Which the newsletter took from him.) His responses to the survey stick out like a sore thumb. He gets to make his points, but the numbers are with me.

The third survey is a new type for me. I had to invent a bunch of new questions, redesign them when they didn't work and even toss a few out. It's always eye-opening to preview a survey you've just slaved over and realize how misleading, muddy, confusing and wrongheaded it is. So if you don't already, test your surveys yourself and send them to some friends who won't go telling everyone how lousy your drafts can be.

Tuesday, January 31, 2006

Tangent. Sort of.

This has something to do with measurement and everything to do with internal communications, so bear with me.

Between you and me, I'm against planted questions at Town Halls and other internal meetings. If people are not curious or comfortable enough to ask a question, we should know that and address it. (That's the measurement part.) I've never seen any evidence that planted questions "prime the pump" for additional questions. I have seen planted questions blow up when the setup was exposed. It can seriously undermine attendees' faith in what they are hearing and seeing.

One officer I worked for years ago would ask for questions at the end of his monthly video call with staff. If there were none or only one he would slowly say "I can't believe that there are no more questions. Just one question? No one has any questions about our company?" He didn't have to go on very long before questions started to come up. It's so much about a leader's perceived interest in actually addressing issues. Planted questions, I think, send the message that leadership is not interested in real questions, but in the perception of real questions. And let's face it -- the questions we dream up to plant just sound phony. "What can I do to help drive additional market share?" "How do you think our values contributed to our good third-quarter results?" Ugh.

If leaders really want to answer questions they can be coached to elicit them.

OK, I'm climbing down off my soapbox.

(I can't believe I haven't posted in two months...)