Wednesday, June 29, 2005

I was writing about using read receipts to measure the effectiveness of e-mail messages. The first step was getting the information into Excel, a task that looked easy, and then got a lot tougher. Here's why.

These response receipts arrive as e-mail messages in one of three format flavors:

  1. Employee name / Read: Message Title / Date-Time Received / Size
  2. Employee name / Not Read: Message Title / Date-Time Received / Size
  3. System Administrator / Message Delivered / Date-Time Received / Size
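As a rough sketch of how these flavors might be sorted out in a script (the prefixes below are simplified stand-ins; the actual subject strings Outlook generates will differ, and "Benefits Update" is a made-up title):

```python
import re

def classify(subject):
    """Return a (status, title) pair based on the receipt's subject prefix,
    or (None, None) if the subject doesn't match any of the three flavors."""
    m = re.match(r"(Read|Not Read|Delivered):\s*(.+)", subject)
    if not m:
        return None, None
    return m.group(1).lower(), m.group(2)

# Hypothetical receipt subjects, one per flavor.
for s in ["Read: Benefits Update",
          "Not Read: Benefits Update",
          "Delivered: Benefits Update"]:
    print(classify(s))
```

The point is just that the status lives in a predictable prefix, which is what makes the receipts machine-readable at all.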
First, I just tried the obvious easy way -- I copied all the messages in the folder where I'd stored them and tried pasting them into an Excel spreadsheet. It worked! The e-mail headers pasted neatly into columns with the same labels as they have in Outlook: From / Subject / Received / Size.

But.

I tried again later and it didn't work. All the message title information landed in a single cell, with no delimiters I could use to separate the text into columns. I killed myself trying before finally figuring out how to select and export the data as an Excel file (in Outlook: File>Import and Export...>Export to file>Next>Microsoft Excel>Next>Select Folder to Export From>Save Exported File As (Browse)>Next>Finish).

But.

I figured it out today. The key is to sort the e-mails by Subject in Outlook. If you do, you can simply cut and paste them into Excel. I then did a find-and-replace to cut out the title, so I had just the names, whether each message was read or deleted without being read, and when that happened.
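That find-and-replace is just stripping a constant string from every subject. Scripted, the same cleanup might look like this (the title and sample subjects are hypothetical):

```python
title = "Benefits Update"  # the one title shared by every receipt
subjects = ["Read: Benefits Update", "Not Read: Benefits Update"]

# Remove the title so only the read/not-read status remains,
# mirroring the find-and-replace done in Excel.
statuses = [s.replace(": " + title, "").strip() for s in subjects]
print(statuses)  # ['Read', 'Not Read']
```

Either way, the trick only works because the title is identical across all the receipts, so one pass removes it everywhere.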

OK, what does the data reveal?

Out of 905 read/not-read replies, 85 percent had been opened; just 15 percent were deleted without being read. Sounds pretty good, though we know what we don't know -- did the people who opened the message actually read and absorb it?

But.

What about those other responses -- the third "flavor" above? Turns out those are important. They show who actually received the message. (You may also have the same information from your distribution list.) Anyway, it turns out my message went out to 1928 addresses. Now the numbers are not so hot -- 40 percent opened it, 7 percent deleted it without opening it, and a whopping 53 percent have yet to touch it.
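The arithmetic behind those percentages can be reconstructed from the figures in the post. The "opened" count is derived from the 85 percent figure, so treat it as approximate:

```python
replies = 905       # read/not-read receipts received
delivered = 1928    # addresses the message actually went out to

opened = round(replies * 0.85)   # ~85% of replies were "read" receipts
not_read = replies - opened      # the rest were deleted unread

opened_pct = 100 * opened / delivered
not_read_pct = 100 * not_read / delivered
untouched_pct = 100 * (delivered - replies) / delivered

print(f"opened: {opened_pct:.0f}%")           # ~40%
print(f"deleted unread: {not_read_pct:.0f}%") # ~7%
print(f"untouched: {untouched_pct:.0f}%")     # ~53%
```

Same receipts, very different story once the denominator switches from replies received to messages delivered.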

Next time, an even deeper dive.

Friday, June 24, 2005

Hello and welcome. This is a new blog where communicators can discuss methods for measuring the impact of their work. While my experience is in internal corporate communications, I hope some of the measuring methods can be used for all kinds of purposes.

I intend to get geeky here. What I hear from communicators is that they don't get the nuts-and-bolts of how to undertake measurement from most sources. They get theory. I'll do my best to give step-by-step descriptions of how this stuff works.

Anyway -- latest measurement thought.

I recently sent out a broadcast e-mail to a subset of our employee population. The e-mail mistakenly had both the read- and delivery-receipt options selected. Luckily, the message went out from a group mailbox so I didn't get all the responses in my own Outlook e-mail. I saved them, though, and now I have 913 bits of data, with more coming in every day. I'm trying to see if they provide a bit of a view into what happens when we send this stuff out.

Here are a few of the things I might learn from this information:

  • How many people open these messages, and how many just delete them without reading them?
  • What is the timing of this activity? How long do they wait to take either action?

This is a start. Through a follow-up survey, I could find out if the people who opened the e-mail retained any of the message. I could find out why so many people never opened it, and if they received the same message through some other channel. Is there a better way to reach them? Are there regional differences in how they treat the data? Can I improve my "opened" numbers with clever subject lines or other tactics?

This all seems worth doing -- we rely on e-mail to an amazing degree. By looking at the first two bullets I can size the problem and decide if it's worth pursuing. I don't need any additional data to tackle those two, but I do have to figure out how to get these bounce-back responses into a spreadsheet so I can easily analyze them. I have to make the machine eat the work.

More on that process -- the nuts and bolts -- in my next post. I'd like to know what e-mail programs you use, so we can determine if this will work for systems that don't use Outlook.