Category: Measurement and Evaluation

  • The new shape of measurement in 2022

    Early in 2022, CoverageBook was kind enough to ask me to share my perspective on how communication would be measured differently in 2022.

    You can find my humble contribution along with predictions by Alex Judd, Sarah Mawji and Brian Wallace in the article ‘The new shape of measurement in 2022: PR experts predict how you’ll evaluate success this year’ here.

  • Securing your spot on the Top 100 PR Influencer Index

    What does it take to truly become an influencer in the public relations industry? The internet is awash with lists and rankings of dubious validity and methodology that tempt us to ‘game the system’ to improve our ranking – such as repeatedly autoposting the same hashtags or buying a large following (don’t do that, please).

    The Top 100 PR Influencer Index

    Recently, Commetric published their brand new PR Influencer Index, and I was intrigued because the top 3 people on the list have only a relatively tiny following of fans – fewer than 35,000 each. So what was going on? Is Reach no longer important?

    My curiosity spurred me to invite Maya Koleva, Commetric’s Head of Research & Insight, to join me in a conversation about the methodology behind the Index, hoping to glean some insight into what makes an influencer influential.

    Maya provided a lot of juicy information, most of which you will have to watch the video to get. But for starters, if you hope to make the Top 100 list in the future you need to:

    • Have a minimum of 5,000 followers (real ones). This is the minimum definition of a ‘micro influencer’, so at 2,794 I still have a ways to go (but please feel free to follow me for insights about PR measurement).
    • Have a Twitter bio in English with some of the relevant key words – this gets your foot in the door for the primary sample
    • An alternative to the bio key words is to engage frequently on Twitter with PR trade media such as PRovoke Media (former Holmes Report), PRNews or PRMoment.
    • Use a personal account – sorry, organisations, companies etc. do not make the cut
    • Focus your effort on replying and engaging others in interesting conversations. You won’t become a recognized influencer by continuously blasting out tweets that get limited engagement – or if you ignore the responses you get.

    Basically, the methodology behind Commetric’s PR Influencer Index rewards being part of a vibrant network, engaging in true conversations, sharing valuable and relevant content, and not forming ‘opinion bubbles’. This is expressed in the Centrality Score, a part of the methodology that Maya explains in the video below.

    Form your own influencer strategy

    What do you think of Commetric’s methodology and strong emphasis on network centrality and engagement rather than (massive) reach? Will it influence (pun intended) your personal strategy to achieve Influencer status?

    Not sure if you are already in the Top 100 PR Influencer Index? Revisit the full list here.

  • Podcast: I was a guest on Measurement Mashup

    Dr. Mark-Steffen Buchele and Steffen Rufenach were kind enough to invite me as a guest on their podcast, Measurement Mashup.

    The title of the episode is: Towards Measurement Maturity – Best Practices from Denmark, and in the episode I share some of the best cases that have been presented at my Measurement Day conferences over the years.

    The recording is in English and you can find it along with shownotes and more here.

    You can also find it on Spotify here.

  • Goals, objectives and KPIs are not interchangeable – how to tell the difference

    Goals, objectives and KPIs are vital in describing what an organisation does and why. Understanding how these key terms differ and when to use them in your communication evaluation and reporting is important to avoid widespread confusion.

    Goals

    Goals are somewhat similar to objectives and KPIs but differ in certain key aspects.

    A goal is aspirational, whereas an objective is operational. The goal or goals of an organisation are typically long-term and tied to its vision and mission. Goals are usually fairly general, whereas objectives are much more specific.

    That means that goals also often become very fuzzy or vague and difficult to measure. But because their nature is aspirational they can be a great motivator or ‘guiding star’.

    Examples of common strategic goals for an organisation are:

    • “We want to be the preferred supplier of X service or product to Y target audience”
    • “We want to create a world in which people no longer have to live with X” (e.g. disease, carbon emitting cars, the fear of being in danger)
    • “We want to bring a better standard of living to Y target audience” (e.g. making sure families can buy affordable homes)

    You will notice that most if not all goals are also transformational in nature – they describe a journey from position A (now) to position B (the future), or a development from situation A (now) to situation B (the future).

    Below such very lofty statements you sometimes find a more ‘internal’ sub-set of goals that help the organisation define what they are supposed to do. These statements are not specific enough to be considered objectives, nor are they inspiring enough to be used in external marketing or as the organisation’s ‘guiding star’ or vision/mission statement to motivate the employees.

    But they are more actionable and can thus provide a link or ‘bridge’ down to the objectives following in the next level of the hierarchy below them:

    • Increase profit margin
    • Increase efficiency
    • Capture a bigger market share
    • Provide better customer service
    • Improve employee training
    • Reduce carbon emissions

    (credit for examples to Don Hofstrand, Iowa State University)

    Generally speaking, if there are several ways to go about accomplishing a stated intention, it is likely a goal, not an objective. You might also say that goals are generally strategic in nature, whereas objectives are tactical.

    Objectives

    Below goals in the hierarchy we have objectives and KPIs, which are often both much more operational and specific than goals. Objectives can be confused for goals and KPIs can be confused for objectives, so we shall deal with them in that order.

    Objectives are, generally speaking, set to accomplish goals. They are more short-term, more specific, and much more clearly defined as actions or steps that need to be taken in order to succeed.

    The SMART method helps you define an objective that is: Specific, Measurable, Achievable, Relevant and Time-bound. You should always check any objective you create or are given against the SMART checklist.
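    The SMART checklist can be sketched as a simple yes/no check. This is a hypothetical illustration – the criteria names and the draft objective below are my own invention, not a formal tool:

```python
# Illustrative sketch (not a formal method): record a SMART check as
# five yes/no answers and pass only when all five criteria hold.

SMART_CRITERIA = ["specific", "measurable", "achievable", "relevant", "time_bound"]

def is_smart(checks):
    """An objective passes only if every SMART criterion is met."""
    return all(checks.get(c, False) for c in SMART_CRITERIA)

# A draft objective that has no deadline yet fails the check
draft = {"specific": True, "measurable": True, "achievable": True,
         "relevant": True, "time_bound": False}
print(is_smart(draft))  # False until a deadline is added
```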

    Compared to a goal, an objective is small-scale and dealing with a specific issue that needs to be solved. I like to use the metaphor that if a goal constitutes a journey from A to B, each ‘step’ that you take on that journey requires an objective. So, accomplishing a goal actually means accomplishing a string of underlying objectives, either in succession or simultaneously.

    Examples of (not-so-specific) (communication) objectives might be:

    • Through targeted messaging increase the audience’s understanding of how to apply correctly for funding (in order to cut down on time wasted processing wrongly filed applications)
    • Increase target audience’s understanding of how to turn leftovers into delicious meals (in order to avoid food waste and gain momentum and increased support for the Stop Food Waste movement)
    • Increase trust in the police and authorities (to make more people willing to report serious crimes)
    • Turn customers into fans by increasing their loyalty and appreciation of the product (to get them to advocate the product to their friends and peers).

    Because objectives are much more specific than goals, they are easy to measure (as long as you act SMART). Where goals describe a transformation, journey or development, an objective describes a specific desired outcome of a planned activity or task.

    KPIs

    The abbreviation KPI stands for Key Performance Indicator, meaning a metric used to measure an ongoing and continuous process. It is a term used to describe how you are going to measure how an organisation or team is performing in relation to a priority you have set strategically or tactically.

    Some very common examples of ‘performance’ related to communication or marketing / sales are:

    • Reach – how many potential new customers were exposed to our messaging in the past month?
    • Traffic – how many visitors is our web page receiving each day, week or month?
    • Average number of news stories generated in a month
    • Average PR-score for news stories generated in e.g. a month
    • Percentage of news stories that fall in the positive / neutral / negative sentiment category over the course of e.g. a month
    • Number of new leads generated each month via PR for sales to work on
    • Customer satisfaction – measured continuously via polls and surveys, NPS etc.

    What all of these KPIs have in common is that they are relative. They go up or down – but they never ‘finish’. Each time you do a status check, you update the numbers and verify that they are in between a certain desired minimum and maximum number.

    If you are below the desired minimum for your KPI, your organisation is not performing well. If you are above the desired maximum, you are performing ‘too well’ – which means you can take your foot off the gas and spend some of your resources in a better way on other activities.
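    The minimum/maximum check described above can be sketched in a few lines. The band values and the traffic figures are invented for illustration, not a prescribed implementation:

```python
# Hypothetical sketch: classify a KPI reading relative to its desired band.

def kpi_status(value, minimum, maximum):
    """Return where a KPI reading sits relative to its desired band."""
    if value < minimum:
        return "underperforming"   # below the desired minimum
    if value > maximum:
        return "overperforming"    # foot off the gas; redirect resources
    return "on track"              # within the desired band

# Example: a monthly web traffic KPI with a desired band of 10,000-15,000 visits
print(kpi_status(8_000, 10_000, 15_000))   # underperforming
print(kpi_status(12_500, 10_000, 15_000))  # on track
```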

    Telling objectives and KPIs apart

    Sadly, it is a very common mistake among professional communicators to confuse objectives with KPIs and vice versa – using the terms interchangeably when they are not interchangeable.

    To anyone who knows and understands the difference, it makes you look uneducated at best. At worst, it causes widespread confusion and possibly mistakes.

    Remembering how to tell the difference is actually very simple if you use this visual reminder:

    An objective can be visualised as a finish line. You can look at a stated objective and ask yourself: ‘Did we accomplish that?’

    • Did we get 5,000 new subscribers for our YouTube channel?
    • Did we land an interview with our CEO on national news television?
    • Did our survey show that we increased our target audience’s trust in our organisation or product by at least 30 percent?

    These are ‘yes or no’ questions, just like crossing the finish line. Because you know what success looks like, you know if and when you have achieved your objective.

    KPIs can be visualised with a speed indicator or a thermometer – really, anything with a scale on it and a preferred state of affairs between a minimum indicator and a maximum indicator. You ask yourself: ‘How are we doing?’

    • Do our ongoing polls indicate a trust level in our target audience between 67 and 83 percent?
    • Does at least 1/3 of our media coverage feature one or more of our brand messages?
    • Does at least 75 percent of our stakeholders rate our corporate reputation as ‘favourable’ or ‘very favourable’?

    If these questions indicate a state of affairs, not a one-time ‘over and done with’, it is a clear sign that you are dealing with a Key Performance Indicator, not an objective.
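    To make the ‘finish line’ versus ‘dial’ distinction concrete, here is a minimal sketch (the numbers are invented for illustration): an objective check returns a one-time yes/no answer, while a KPI check reports a position within a band and gets asked again every period:

```python
# Hedged illustration of the finish-line vs. dial distinction.

def objective_met(actual, target):
    """Finish line: 'Did we accomplish that?' - a one-time yes/no answer."""
    return actual >= target

def kpi_in_band(value, low, high):
    """Dial: 'How are we doing?' - a position between a minimum and maximum."""
    return low <= value <= high

# Objective: did we get 5,000 new subscribers? Once yes, it is done.
print(objective_met(5_230, 5_000))  # True

# KPI: is audience trust between 67 and 83 percent? Check again next period.
print(kpi_in_band(71, 67, 83))      # True this period
```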

    And that is really all there is to understanding the difference between goals, objectives and KPIs and why you should never, ever use them interchangeably.

  • When better data means smaller numbers

    Usually, getting more accurate data is a positive thing. But what do you do if better data means smaller numbers and your management have come to expect big numbers because, in their mind, it is the sign of success?

    Accurate data should always be encouraged in communication measurement and evaluation. Learning and growing is an important part of our evolution as a profession and that means constantly aiming for a better data set.

    A common problem raises its ugly head, however, when structural reward systems in organisations are geared towards ‘big’ rather than ‘accurate’. Then communication measurement becomes a numbers game and the emphasis shifts from quality to volume.

    I’ll give you three quick examples from my own past:

    1. Sorting media mentions by media relevance

    For many organisations, not all media coverage is equally important. There will be mentions in some outlets that are critical to the organisation’s stakeholders and management and therefore shape the image and brand. And there will be mentions in outlets that have absolutely no impact on the organisation’s stakeholders – and therefore are of little or no consequence.

    Rather than examining all media mentions in one big pile and treating them all as ‘equally important’, more and more organisations switch their media monitoring to a ‘tier system’ that divides media outlets into e.g. ‘tier 1’, ‘tier 2’ and ‘tier 3’ media.

    They then analyse the coverage accordingly, spending more time and resources accurately examining mentions in tier 1 media and little or no time at all evaluating mentions in tier 3 media. The result is usually a much more accurate overview of the likely consequences of their media coverage – because they focus on the sources their stakeholders consume – but the volume is consequently also smaller.
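    A tier system like the one described can be sketched as follows. The outlet names and tier assignments are invented for illustration:

```python
# Hypothetical sketch of a tier system for media mentions.
# Outlet names and tier assignments are invented examples.

TIER_MAP = {
    "National Daily": 1,
    "Industry Weekly": 2,
    "Local Blog": 3,
}

def group_by_tier(mentions):
    """Group media mentions by outlet tier; unknown outlets default to tier 3."""
    tiers = {1: [], 2: [], 3: []}
    for mention in mentions:
        tier = TIER_MAP.get(mention["outlet"], 3)
        tiers[tier].append(mention)
    return tiers

mentions = [
    {"outlet": "National Daily", "headline": "Brand lands major deal"},
    {"outlet": "Local Blog", "headline": "Brand mentioned in passing"},
]
grouped = group_by_tier(mentions)
# Tier 1 mentions get a thorough analysis; tier 3 gets little or none
print(len(grouped[1]), len(grouped[3]))
```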

    2. Cross-analysing ‘topic’ with other data

    In 2008-2013, I was the head of corporate PR for VisitDenmark and later a Nordic construction company called NCC. At both companies I organised a media monitoring and analysis routine that included tagging each media mention as belonging to one of 10 topics on a pre-made list (10 was the max).

    Because of the way the statistics and analytics system worked, no mention could go without a topic tag – which basically meant that I had nine distinct topics to prioritise; by default, everything that did not match one of the first nine topics on the list was ‘dumped’ into the 10th topic category. This category quickly became known as the ‘trash can topic’ or ‘trash can category’.

    Media coverage that ended up being coded and tagged as ‘trash’ was anything that could not reasonably be said to fall under one of the other nine topics. It could be articles where our organisation was mentioned in passing but did not feature in a relevant context or it could be e.g. obituaries that mentioned a former colleague. Whatever the content, it represented zero brand or business value for us.

    Analysing our coverage over time, in both organisations I discovered (rather horrified) that a full 15-20% of all our coverage every month fell in the ‘trash can topic’ category. Which meant that almost one in five pieces of media coverage was absolutely worthless.
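    Computing the ‘trash can’ share is straightforward. This sketch uses invented topic labels and counts:

```python
# Illustrative only: topic labels and monthly counts are invented.
# Computes the share of coverage landing in the 'trash can' topic.

def trash_share(topic_counts, trash_topic="Other (trash can)"):
    """Fraction of all mentions tagged with the trash-can topic."""
    total = sum(topic_counts.values())
    return topic_counts.get(trash_topic, 0) / total if total else 0.0

monthly = {"Construction projects": 40, "Financial results": 25,
           "Sustainability": 18, "Other (trash can)": 17}
print(f"{trash_share(monthly):.0%}")  # 17%
```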

    But the alarming insights did not stop there. Cross-analysing topics with other data points revealed a number of startling facts. For instance, at NCC my predecessor had used the report slide titled ‘Top 10 Media Outlets’ from our monitoring and analysis vendor to regularly report to his boss (who became my boss) and to management. The slide is basically just a list that shows which 10 media outlets have mentioned our brand or company name the most in a given time period – like the last quarter or year.

    My predecessor had pointed to the slide – which included eight of the largest national media outlets – and concluded that ‘all was well’ because we were enjoying tremendous ‘reach’ for our corporate message in all the best media outlets possible.

    However, the first time I cross-analysed Top 10 Media Outlets with Topics, I discovered that in some cases as much as 79% of the so-called coverage we enjoyed in these national outlets fell in the ‘trash can topic’ category and was worthless. In truth, we were not at all ‘top of mind’ for any of the key business journalists we should be looking to talk to regularly.
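    A cross-analysis of outlets and topics can be sketched like this. The mentions and outlet names are invented, and a real analysis would of course run on a much larger data set:

```python
# Hypothetical cross-analysis of outlet vs. topic; mentions are invented.
from collections import Counter

mentions = [
    {"outlet": "National Daily", "topic": "Trash can"},
    {"outlet": "National Daily", "topic": "Trash can"},
    {"outlet": "National Daily", "topic": "Financial results"},
    {"outlet": "Trade Journal", "topic": "Construction projects"},
]

def trash_ratio_per_outlet(mentions, trash="Trash can"):
    """Share of each outlet's coverage that falls in the trash-can topic."""
    pairs = Counter((m["outlet"], m["topic"]) for m in mentions)
    totals = Counter(m["outlet"] for m in mentions)
    return {o: pairs.get((o, trash), 0) / totals[o] for o in totals}

# National Daily: ~67% trash-can coverage; Trade Journal: 0%
print(trash_ratio_per_outlet(mentions))
```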

    3. Examining the ‘fingerprint’ of the team on your media coverage

    Another insightful exercise at NCC was when I decided to split our media monitoring and analysis report from our vendor into two subsets:

    • Subset 1 included every piece of media coverage that we had somehow influenced during its creation; either we had initiated it (by pitching the story to the press) or we had responded to an enquiry from the media and helped shape the reporter’s perspective on the story before it went to print.
    • Subset 2 included all the media coverage that just showed up in the papers and online without us knowing about it in advance. You might say these stories happened regardless of whether we in communication did our job or not.

    Organising every piece of media coverage into Subset 1 and Subset 2 was a manual process back then – one that I had to personally manage every week because our monitoring and analysis vendor was not able to make the distinction.

    Once the subsets were divided, we then ran the two data sets through all the usual analysis systems, in effect generating three reports: the total coverage, Subset 1 and Subset 2.

    What was really interesting (but also expected) was that the quality of the coverage in Subset 1 was superior in every way. It had a better sentiment (positive/negative tonality), a higher average PR-score, always appeared in relevant topic categories and usually featured one or more of our brand messages.

    In comparison, Subset 2 was less relevant, often had neutral sentiment, did not feature our brand messages, and included all mentions in the ‘trash can’ category.

    I used this side-by-side comparison to illustrate to my boss and to management the difference between having an active PR team and being passive. But it also meant taking credit for a much smaller portion of our total coverage, as Subset 1 usually only constituted 35-40% of our total news coverage in a given quarter.
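    A side-by-side comparison of the two subsets might be sketched like this. The fields, scores and sentiment labels are invented for illustration:

```python
# Sketch comparing influenced vs. uninfluenced coverage; data is invented.

def avg(values):
    return sum(values) / len(values) if values else 0.0

def summarise(subset):
    """Basic quality summary for a set of media mentions."""
    return {
        "count": len(subset),
        "avg_pr_score": avg([m["pr_score"] for m in subset]),
        "positive_share": avg([m["sentiment"] == "positive" for m in subset]),
    }

subset1 = [{"pr_score": 4.2, "sentiment": "positive"},
           {"pr_score": 3.8, "sentiment": "positive"}]   # coverage we influenced
subset2 = [{"pr_score": 2.1, "sentiment": "neutral"},
           {"pr_score": 1.5, "sentiment": "neutral"}]    # coverage that just appeared

for name, subset in [("Subset 1", subset1), ("Subset 2", subset2)]:
    print(name, summarise(subset))
```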

    The dangerous numbers game

    The three examples above illustrate just a few of the insights you can gain from a more accurate data set. But if you have painted yourself into a corner where your boss and management expects and demands to see ‘big numbers’ it can be incredibly difficult to transition to smaller, more relevant numbers.

    Here is my advice on how to handle that situation:

    Understand the consequences of your actions higher up in the hierarchy

    First of all, your decision to go for ‘better numbers’ might have consequences for people higher up in the organisation – especially if the change is a big one. An organisation that is used to seeing ‘big numbers = success’ is likely to have a hard time understanding why what was good yesterday is not good today.

    Even more important, some organisations base their pay and bonus incentives for managers on volume – which means your decision to focus on a smaller data set could result in a financial loss for someone higher up in the hierarchy, which is almost certain to come back to you in a bad way.

    Make sure you alert the organisation to your intended changes well in advance – and give them a chance to ask questions, object or make suggestions.

    Emphasise the gain, not the loss

    You should never take something away from anybody without replacing it with something even better. It is simple psychology. If the organisation has trouble comprehending why the big numbers it used to celebrate were actually less than ideal, it is your job to explain why.

    A great example of how that is done – in part – is Richard Bagnall’s AMEC blog post: “The Definitive Guide – Why AVEs are Invalid”. In it, he lists 22 arguments why you should never, ever use Advertising Value Equivalence to measure communication.

    But those arguments become less potent if we are not ready to tell our organisation what can take AVE’s place. That is why it is your job as the communication professional to illustrate for your organisation all that is gained by adopting the new, smaller numbers in your analysis.

    I always recommend striving for a ‘culture of evaluation’ that welcomes experimentation, learning and growing – as opposed to using measurement primarily for documenting the past and as a basis for punishment and reward. Because that leads to less learning, zero risk-taking and a stale organisation.

    Make the transition gradual

    For a while at least, it will be a good idea to let your stakeholders receive both the old format and the new format of your media report – for comparison.

    Once the organisation adjusts to using the new, more accurate data set – and adjusts its incentives and bonuses etc. accordingly – you are safe to gradually phase out the old format.

  • What will cookie changes mean for PR attribution?

    The increasing limitations imposed on the use of cookies by all major browsers look set to have far-reaching consequences, and not just for the advertising industry. PR professionals who rely on cookies to map out user journeys and identify audiences across time and platforms will likely also have to update their strategies.

    Let me be the first to admit: I know next to nothing about the technical aspects of cookie use in digital communication and advertising.

    But a few days ago, I stumbled across another fantastic episode of the PR Resolution Podcast with Stella Bayles. This time she interviews Russel McAthy – CEO and co-founder of Ringside Data and a marketing expert in the art of ‘attribution’, i.e. figuring out which initiatives actually brought about the desired effect.

    Click here to listen to the episode.

    Listening to Stella and Russel opened my eyes to a few things that I must shamefully admit I had hitherto been blissfully unaware of. Which is why I thought I’d do this post and spread the word to other unfortunates like me.

    Apple and Google are coming for your cookie!

    So, what is going on?

    Well, cookies, as you may or may not know, are used – among other things – to track you across time and different websites. (For further technical explanation go here). Cookies can be used for re-targeting (showing you ads for something you previously searched for or looked at) but also to help remember passwords, favourite settings on websites you visit frequently etc.

    In classifying them, we differentiate between first-party cookies (created by the domain you are visiting) and third-party cookies (created by other domains). (For more on that, go here).
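    The first-party/third-party distinction essentially comes down to comparing the cookie’s domain with the domain you are visiting. Here is a deliberately simplified sketch with invented domains (real browser logic is considerably more involved):

```python
# Simplified sketch of the first-party / third-party cookie distinction.
# Domains are invented examples; browsers use stricter matching rules.

def cookie_party(page_domain, cookie_domain):
    """First-party if set by the site you are on, third-party otherwise."""
    if cookie_domain == page_domain or cookie_domain.endswith("." + page_domain):
        return "first-party"
    return "third-party"

print(cookie_party("example.com", "example.com"))      # first-party
print(cookie_party("example.com", "ads.tracker.net"))  # third-party
```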

    Right now, two major changes are happening – which Russel talks a bit about in the podcast:

    1. The ‘Intelligent Tracking Prevention’ (ITP) system built into the Safari browser and iOS operating system by Apple is about to progress to version 2.3, which will limit cookie use even more than version 2.2 already has. This update is reported to take place in February 2020.
    2. On January 14th, Google announced that their Chrome browser will phase out all use of third-party cookies over the next two years.

    Why is this a big deal? Because between them, Safari and Chrome account for a very large share of the browsers used to access the internet today (and Firefox is mentioned in the podcast as following suit, making the situation even more serious).

    Tracking and attribution without cookies

    So, what are some of the consequences? Well, for starters, a lot of your analytics data is likely to already start looking different.

    When Safari reduced cookie ‘life span’ to just 7 days in version 2.1 of ITP, it meant that the same person visiting a website on Day 1 and Day 9 would get tracked as two unique visitors. Version 2.2 reduced that time span further to just 24 hours. And now version 2.3 is supposedly cracking down on some loopholes the industry has been exploiting.
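    A toy simulation shows how a short cookie lifespan inflates ‘unique visitor’ counts. The visit dates mirror the Day 1/Day 9 example above, and the logic is deliberately simplified (a real browser renews the cookie on every visit, which is modelled here, but ignores everything else):

```python
# Toy illustration: a shorter cookie lifespan means the same person
# gets counted as several 'unique visitors'. Dates are invented.
from datetime import date, timedelta

def count_unique_visitors(visit_dates, cookie_lifespan_days):
    """Each visit after the previous cookie expires gets a new visitor ID."""
    unique = 0
    cookie_expires = None
    for day in sorted(visit_dates):
        if cookie_expires is None or day > cookie_expires:
            unique += 1  # cookie gone; tracked as a brand-new visitor
        cookie_expires = day + timedelta(days=cookie_lifespan_days)
    return unique

visits = [date(2020, 1, 1), date(2020, 1, 9)]  # same person, Day 1 and Day 9
print(count_unique_visitors(visits, 7))   # 2 - counted as two visitors
print(count_unique_visitors(visits, 30))  # 1 - recognised as one visitor
```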

    With Google now following suit and eliminating third-party cookies, many marketeers and PR professionals will, in the short term, struggle to track user journeys across different websites, link the digital bread crumbs, and correctly analyse cause and effect so they can attribute effects to the relevant initiatives.

    Google has stated it will continue to support “privacy-preserving and open-standard mechanisms” that will maintain “an ad-supported web” but the marketing industry has still greeted the news with a mix of hope and fear.

    A game-changer for Earned Media attribution?

    From my own perspective, I look forward to following this development closely – the reason being that I know of at least one global media monitoring corporation that just last year made big announcements about their latest innovation: A cookie-based system designed to track readers of e.g. news articles and map out their subsequent actions, such as website visits, purchases, sign-ups etc.

    In short: They more or less said they had ‘fixed’ the decades-long problem of reliable attribution of PR to specific business goals such as conversions, sales and participation.

    I am not saying that this system or that vendor is in a ton of trouble now – I have no way of knowing that – but from having the product explained to me, I would imagine that they will have to revise how their system is going to continue working going forward.

    For the sake of the PR industry, I hope they succeed.

  • Podcast: How to measure Native Advertising

    This June, I was a guest at the Danish Business Daily, Børsen, where we recorded an episode for their podcast series about measuring Native Advertising.

    I was invited as a guest along with native advertising expert Maria Topp from Benjamin Media to talk to editor Stine Bjerre Herdel about how to measure your native advertising campaigns and efforts.

    It is in Danish, but if you are interested you can listen to the 27-minute podcast here for free (it is episode #6 of 7 in Season 1).

  • Podcast: Do you measure your communication?

    The Danish trade union Kommunikation & Sprog was very kind to invite me for a podcast interview about how and why to measure your communication.

    The podcast is about an hour long and we cover a variety of topics related to measurement and evaluation – including education, vanity metrics and where the industry as a whole might be headed.

    It is in Danish, of course, but if you are interested you can listen to the episode in full for free here.

  • How to report on your communication to the C-suite

    Producing a useful analysis or report to C-level executives is an exercise in precise, relevant and to-the-point communication.

    As communication professionals it is only natural that we want our work to matter to the C-suite. Unfortunately, a lot of us fail to recognise that C-level managers need very specific facts and insights to help them make informed decisions. Thus, it is a common mistake among communication professionals to include too much detail in our reporting, when we should be focusing on a few key elements.

    It is not about you

    Brevity is a must for busy executives and board members. You may think that they have been looking forward all week to your exciting report on the results of last quarter’s press coverage, but you’d be mistaken. They are interested in what drives business outcomes.

    Avoid the temptation to put in stuff simply because you hope it will make you look good and stop making lengthy accounts of everything you and your team have been doing. The details are not relevant – the results are.

    Know your business objectives

    The most critical step in putting together a useful report is knowing the strategic objectives of your business or organisation. Only then can you link your activities and their outcomes to the overall goals in a meaningful way.

    Helpful information to present to the C-suite includes:

    • Measured outcomes and impact of communication
    • Learnings from evaluation of past activities
    • Insights that provide opportunities for improvement
    • Imminent choices and their likely consequences
    • Predictions and forecasts
    • Recommendations
    • Critical facts and the status of KPIs

    Precise, relevant and to the point

    Your report should always follow these three important guidelines for reporting to the C-suite:

    Precise: Are you providing specific details about the how, when, what and why?

    Relevant: Is the information you are providing relevant to your audience’s needs? Will it help them make informed decisions about issues your organisation is currently facing?

    To the point: Are you presenting the information in the clearest, briefest way possible?

    Pick a clear format

    When it comes to communicating to the C-suite, less is often more. Providing a 1- or 2-page handout with an overview of everything they need to know will often prove much more useful than a lengthy report.

    A format that includes both visuals, such as graphs and charts, as well as text will likely satisfy the need for a quick overview as well as a level of detail. Tweak it until you have perfected the layout.

    Remember that your goal is to help them make informed decisions but not burden them with information overload. It is a delicate balance.

    Ask for feedback

    Finally, try to get your audience’s feedback on the way you have presented your report. Even a few comments in an email will help you home in on the perfect format and level of information. Here are a couple of useful questions you can ask:

    1. In what ways was this report helpful?
    2. Did you feel any of the information presented was redundant?
    3. Was there anything (visuals or text) you found confusing or vague?
    4. Did the report meet your needs for actionable insights and informed decision-making?

  • 8 Star Wars lessons about communication measurement

    Christmas and New Year are behind us and perhaps you spent part of the Holidays – like me – watching the new Star Wars movie ‘Rogue One’? Don’t worry if you didn’t, I am not going to spoil it for you. But I was inspired to write this blog post about 8 important lessons about communications measurement that we can learn from the Star Wars storyline.

    1. Don’t be seduced by the Dark Side of the Force (AVE)

    Darth Vader’s seduction by the Dark Side of the Force, and how he and Emperor Palpatine also try to turn Luke Skywalker, is a central theme in Star Wars. Whereas the Light requires patience and discipline to master, the Dark Side lures us with promises of quick and easy power.

    In the world of communications measurement, ‘vanity metrics’ are the Dark Side and none more so than Advertising Value Equivalency (AVE). They will try to seduce you with big numbers that you may be able to impress your boss or client with (for a while). But the Dark Side can never be as strong as mastering the Light, namely looking at output, outtake and outcome and using the full range of quantitative and qualitative methods to measure the actual effect of your communication.

    2. Search your data carefully for important insights

    Analyzing the stolen data plans enabled the Rebel Alliance to find a small design flaw that revealed a critical weakness in the defense of the dreaded Death Star. This allowed Luke Skywalker to fire a proton torpedo down a thermal exhaust port, exploding the battle station.

Measuring communication serves a dual purpose: to document your results and to create new knowledge. You do the latter by analyzing your data for insights that can help you improve your communication in the future. Make sure you set aside some time to really look at your data sets. Insights can sometimes reveal themselves in the most unlikely of places.

    3. Seek out training and learn from others

    Star Wars LEGO

    To complete his Jedi training, Luke Skywalker went to the Dagobah system to learn from the last Jedi Master, Yoda.

    While you can learn a lot about communications measurement from reading books and articles on the web, you should also consider the benefits of learning from others. Organisations like AMEC offer online certification courses in measurement and you should also be on the lookout for webinars, guest lectures and after-hours-meetings taking place near you.

    4. Sometimes small and agile is better

    Star Wars LEGO

An Imperial Star Destroyer may have superior firepower, but Han Solo proves time and again that the Millennium Falcon (which made the Kessel Run in less than 12 parsecs) can out-fly and out-maneuver them at every turn.

Managing your measurement setup, pay attention to the fact that big is not always better. An agile setup with incremental learning is usually preferable to a model that requires, say, a year of data gathering followed by analysis, because it allows you to apply what you learn much faster and adjust your strategy accordingly.

If you only look at results annually, you may be disappointed and find that you have wasted an entire year when you could have changed course sooner.

    5. Practice makes perfect

    Star Wars LEGO

When Luke Skywalker first started training as a Jedi, he didn’t know how to use the Force. Master Obi-Wan Kenobi therefore made him practice blind-fighting with his lightsaber using only his feelings.

    Like using the Force, communications measurement takes practice. Be patient while you are still learning and try to get a broad perspective. Measuring news coverage, social media, your newsletter and internal communication may all seem similar but there are subtle differences that are important to be aware of. Practice until you master them all.

    6. Test your ideas and establish a baseline

    Star Wars LEGO

    Darth Vader tested the carbon freezing facility in Cloud City on Han Solo to see if he would survive the process. The objective was to gauge whether it was a safe and reliable means of hibernation for Luke Skywalker’s capture and journey to the Emperor.

    It is always a good idea to test the validity of your plans on a small scale before any major launch. Even an unscientific sample of data can give you a clue about whether or not to proceed.

    Similarly, if you are going to measure the effect of your communication, you should always establish a baseline that will allow you to measure progress in whatever metric you are monitoring.
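Expressed as simple arithmetic, “measuring progress against a baseline” just means comparing your current reading to the value you recorded before the campaign started. Here is a minimal sketch in Python; the `lift` function and all the numbers are hypothetical, purely for illustration:

```python
# Hypothetical sketch: tracking a metric against a pre-campaign baseline.
# The function name and all figures are made up for illustration.

def lift(baseline: float, current: float) -> float:
    """Return the relative change versus the baseline (e.g. 0.5 = +50%)."""
    if baseline == 0:
        raise ValueError("Baseline must be non-zero to compute relative lift.")
    return (current - baseline) / baseline

# Baseline measured before the campaign; current measured mid-campaign.
baseline_awareness = 0.07   # 7% unaided brand awareness at the start
current_awareness = 0.10    # 10% at the latest measurement

print(f"Lift vs. baseline: {lift(baseline_awareness, current_awareness):+.0%}")
```

The point is not the code itself but the discipline it encodes: without the baseline measurement taken up front, the later reading has nothing to be compared against.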

    7. Be careful about assumptions

    Star Wars LEGO

    When the evil Emperor Palpatine laid a trap on the forest moon of Endor, he assumed that a legion of his best troops would be enough to capture Han Solo, Princess Leia and the rebel force sent to blow up the Death Star’s shield generator. But Palpatine did not anticipate that the brave Ewoks would join the fight. We all know how that went.

    Assumptions can also cause a lot of problems in communications measurement. When we try to make an audience do something by communicating to them, we are creating what is academically called ‘an intervention’. The operating mechanism (the impulse to act) is called an ‘intervention logic’ and it is always based on assumptions.

If, for example, I promise you a reward for doing something, the underlying intervention logic behind the possible success of my communication is that this reward is sufficient and relevant enough to motivate you. Other assumptions built into the intervention logic would typically include things like you understanding the language I am speaking, and so on.

    If, for some reason, the intervention logic and the assumptions it is based on are flawed, then the communication will almost always fail. And it will be very difficult for you to measure why or how. So you should always take a moment to ponder what assumptions your plan or intervention logic is based on – like Palpatine ought to have done.

    8. Remember to celebrate your success

    Star Wars LEGO

    When Luke Skywalker and Han Solo blew up the Death Star, it was not the end of the evil Galactic Empire. But it was a major victory for the Rebel Alliance, so naturally our heroes were honoured with a celebration.

    Too often, we are so preoccupied with “the next mission” that we forget to take a moment to pause and be happy when something succeeds. Measuring your communication makes it easier to know when you have accomplished something worthy of a celebration.

    May the Force be with You

    —————————————————

Photo credits: aitoff on Pixabay (header), Blockaderunner on Flickr (photos 2, 3 and 4), fullnilson on Flickr (photos 1, 5 and 6), Avanaut on Flickr (photo 7) and Legoagogo on Flickr (photo 8).

  • Make sure you set SMART communication objectives

    Make sure you set SMART communication objectives

    One of the most common yet serious mistakes in communication measurement is poorly stated objectives. It is a mistake you can easily avoid by sticking to the SMART method.

The acronym was coined by George T. Doran in a 1981 Management Review article, building on Peter Drucker’s earlier concept of management by objectives. Since then, the method has been revised and rewritten several times. The version I am presenting here is the one most commonly used. It is also, I feel, the one most relevant to communication measurement and evaluation.

    The SMART method

    S for Specific

    First and foremost, your communication objective has to be specific. Use all the common questions to drill down to the heart of the matter: What is going to happen, who is going to do it, when is it going to take place, how much is going to change, why are we doing it etc. Being specific is all about the details so you know precisely what needs to be done.

‘Increasing awareness in the market’ is not a specific objective. ‘Increase unaided recall and awareness of our brand proposition from the current 7% to no less than 15% in the target audience of females aged 17-34’ is.

    M for Measurable

Your next step is to ensure that you have a way to quantifiably measure whether you succeed or not. This may sound obvious, but have you actually considered how you are going to gauge when you are at the finish line? What quantitative or qualitative methods are you going to apply to collect data? How are you going to analyse it? How will you be able to conclude whether you achieved your stated objective? An objective that cannot be measured is actually not an objective – it is just something you wish for (and thus useless).
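To make the Specific and Measurable criteria concrete, you can think of an objective as plain data: a metric, a baseline, a target and a deadline, plus a rule for deciding success. The sketch below is a hypothetical illustration in Python – the field names and figures are my own assumptions, not part of the SMART method itself:

```python
# Hypothetical sketch: a measurable objective as plain data.
# Field names, dates and figures are illustrative assumptions only.
from dataclasses import dataclass
from datetime import date

@dataclass
class Objective:
    metric: str        # what you measure
    baseline: float    # the starting value, measured up front
    target: float      # the value that counts as success
    deadline: date     # when the target must be reached

    def achieved(self, measured: float, on: date) -> bool:
        """True only if the target is reached on or before the deadline."""
        return measured >= self.target and on <= self.deadline

obj = Objective(
    metric="unaided brand awareness, females aged 17-34",
    baseline=0.07,
    target=0.15,
    deadline=date(2020, 12, 31),
)

print(obj.achieved(measured=0.16, on=date(2020, 11, 1)))  # True
```

If you cannot fill in all four fields for your objective, that is usually a sign it is not yet Specific, Measurable or Time-bound enough.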

    A for Achievable

The third criterion in SMART is to make sure the objective is achievable. Do you have the necessary time, money and resources? Is it realistic, or are you biting off more than you can chew?

    R for Relevant

Relevance is the fourth step. This is all about ensuring alignment between your objective and everything else your organisation is trying to do. Is what you are trying to do a natural development of your strategy? Or is it a vanity project? Will success mean improvement for the business? Will it strengthen other areas or projects?

A second way to look at relevance is to ask yourself whether you are measuring what is important – or merely what is easy to measure. Make sure the metrics that make your objective Measurable are also Relevant.

    T for Time-bound

    And finally, you must make sure that your objective is time-bound. That means setting a start date, an end date and milestones along the way. Plan out when important events or decisions will take place. It is almost always better to collect data in a shorter period of time and immediately apply what you can learn incrementally. If you collect data over a long time-span, it risks turning obsolete before you get a chance to analyse it and react to the findings.

  • To grow you need to identify your Alpha audience

    To grow you need to identify your Alpha audience

    If we are only counting faceless mentions and likes, we will fail to identify the fans and customers that are truly passionate and matter to our business. That is why we have to identify who our self-motivated loyal fans are.

    On September 10th, the best-selling author Mark W. Schaefer was in Denmark, speaking at DONA‘s annual assembly. He gave a talk during which he stressed the importance of identifying and nurturing your ‘Alpha audience’ – the customers and fans that are truly your ambassadors and advocates.

Mark pointed out that he too is struggling to come up with a business model that is resilient against content piracy – in his case, piracy of his books. For him, as for many other small businesses, it is therefore critical to identify the true ambassadors and advocates. However, very often we come up short because dashboard analytics only give us an overview, not enough detail.

    If you want to watch the entire keynote by Mark W. Schaefer, it is available on Twentythree’s website.

    A warm thanks to Twentythree and their video platform for making this clip available for my blog.