Optimising social media content

Here at ONS we’ve got insight into our users from research, and an understanding from web analytics of what’s hot on the website. But what we didn’t have, until now, was clarity on what kind of message drove the most traffic or time spent on site. Knowing this should allow us to optimise future messaging.


Back in February, I ran a series of events at work on ‘writing social media @ONS’, the first of which Mark Frankel, Head of Social News at the BBC, kindly came to present, as did Catherine Toole, CEO of Associated Press’ Sticky Content. The objective was to bring statisticians and the offline media team – who draft our social content – around to the idea of writing with a user segment in mind, as well as with an eye on the content’s likely performance. A significant piece of each session was the idea of using a healthy mix of copy formats (listed below), and the hypothesis – one discussed a lot in digital circles – that different formats would see different responses from our users:

  • Headline. Subject, number and link. No messing about. Example
  • Nugget. Did you know? (without asking it). Example
  • Draws you in. Makes users *want* to click as the content’s so profound, emotional or sticky. Example

In addition to copy formats, it’s assumed that images and interactives have a positive effect, so the increasing number of infographics and data visualisations being produced here should help on that front.

How we did it

We use HootSuite to publish our social content and Webtrends on the web analytics side of things. The two can work together, so each link we publish can be tagged with a set of parameters, each with a defined set of values.

  • Parameter 1 = taxonomy / subject; values: i) econ, ii) pop, iii) lab, iv) bus, v) corp
  • Parameter 2 = format; values: i) headline, ii) nugget, iii) draws you in
  • Parameter 3 = content; values: i) image, ii) interactive, iii) post (used when the other two don’t apply)
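As a rough illustration of how each published link could carry this tagging scheme, here’s a minimal Python sketch. The query-string keys (`subject`, `format`, `content`) are placeholders for the example – the actual Webtrends parameter names may differ.

```python
from urllib.parse import urlencode

# Illustrative sketch only: append the three tagging parameters to a link
# before publishing it via HootSuite. The key names here are assumptions,
# not the real Webtrends query-string keys.
def tag_link(url, subject, copy_format, content="post"):
    """Append tracking parameters to a URL we are about to publish."""
    params = urlencode({
        "subject": subject,      # econ, pop, lab, bus or corp
        "format": copy_format,   # headline, nugget or draws-you-in
        "content": content,      # image, interactive or post
    })
    separator = "&" if "?" in url else "?"
    return url + separator + params

tagged = tag_link("https://www.ons.gov.uk/somepage", "econ", "headline", "image")
```

Building the tag in one place like this also keeps the values consistent, so the same subject never ends up spelled two ways in the analytics.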

HootSuite also works with Google Analytics in the same way and, as with ours, that data can be pulled from an API, which Rob in our Data Vis team used to build this scatter graph on Tableau Public.

ONS's social media tableau

What it shows

Very broadly…

  • Enticing copy sees better engagement on site
  • Images help a post’s performance
  • Interactives help even more

By all means play around with this, and turn filters on or off by clicking on the values on the top left. Apologies it’s not too intuitive, but this is a quick and dirty prototype after all.

Massive caveats

This stuff is all well and good but there are some massive caveats I should share / would be keen to hear your thoughts on:

  • This visualisation doesn’t represent engagement in our content off site. It’s a big ask to interrupt someone flicking through their social feed, to come to our site and spend time there. So shares and chatter around a subject could be high even if visits to or time spent on the website is nothing special.

  • This visualisation doesn’t take into account bounce rate. Following on from the point above, it’s a big ask to expect a user to consume content on our website when it isn’t optimised for mobile, which is what the lion’s share of visitors arriving from social are using.
  • Because of point 2, we’re very careful to make sure the reason someone clicks on one of our links is answered, or very clear, when they land on the site. Sometimes the destination is deep within the site, where various pages on the same subject are organised into sections with ‘#sectiontitle’ at the end of the URL. Like this. This Webtrends / HootSuite mash-up doesn’t like them, so what you see in the Tableau is a fraction of what could be there.
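One plausible reason those ‘#sectiontitle’ links trip up the mash-up (an assumption on my part, not something confirmed by either tool): everything after the ‘#’ is a URL fragment, which is handled by the browser and kept separate from the query string, so tagging parameters appended in the wrong place never reach the analytics. A quick Python illustration:

```python
from urllib.parse import urlparse

# Sketch of why '#sectiontitle' links can break link tagging: parameters
# appended after the '#' become part of the fragment, not the query.
bad = "https://www.ons.gov.uk/topic/page#sectiontitle?subject=econ"
parts = urlparse(bad)
# parts.query is '' - "?subject=econ" was swallowed into the fragment,
# so server-side analytics never sees it.

# Tagging parameters must go before the '#' to survive:
good = "https://www.ons.gov.uk/topic/page?subject=econ#sectiontitle"
# urlparse(good).query is 'subject=econ'
```

The URLs here are made up for the example; only the query/fragment behaviour is the point.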

Wrap up

This has been a useful exercise to back up the copy formats introduced at the writing for social media @ONS sessions in February. I think we should continue for a month or so to gather a little more data and try to address some of the caveats above. After that, though, we should draw a line under it. As we’re not selling products, we don’t need to labour over every word for optimum response, splitting hairs to win a few more visitors from social when, in the grand scheme of things, search engine optimisation (SEO) will have much more impact on pushing numbers up. All said and done though, it’s important we continue to have a mix of content, with or without images, told through different copy formats, so as not to dilute messaging when it really needs pushing.


MeasureFest take 2

Yesterday, a few of us from ONS’ User Insight and Engagement team went along to MeasureFest. (I talked highly of the last one, and as a result of that session in October we’re in the final throes of building a real-time scatter graph in Tableau which merges message tags appended to URLs we publish on social media with web analytics data – so we can see patterns in what works and what doesn’t.)

Anyway, whilst I’m not sure this one was *quite* as great as the last (for me anyway), I thought I’d share the notes I made as a lot of good things were brought up.

Of all the references to web analytics, Google Analytics was the only provider mentioned, highlighting its grip on the market. GA’s speed suggestion report was cited by Neil Barnes as really useful for understanding page load times. He said websites should set a KPI of getting page load time down to under 3 seconds (a cut-off beyond which, research suggests, a large proportion of users abandon). In another presentation, by Jono Alderson, there was also talk of using demographic data in GA to add value to the user personas that other research methods form. Dara Fitzgerald’s talk covered 5 considerations when migrating from standard Google Analytics to Universal, if doing it now (it’s in beta and won’t roll out fully for a couple of years).

Merging on- and offline databases into one view was mentioned by Andrew Hood, and is perhaps something more fitting for a commercial organisation or a government department that deals in transactions. ONS isn’t either of those, but overlaying data we can scrape from social with that from our call centre would make for some very robust insight into how we can better serve our users.

Craig Sullivan (the Malcolm Tucker of the conversion rate optimisation world) gave a great presentation on 20 common ‘mistakes’ made in AB testing. Here are the slides, note the language is colourful at times and I used ‘mistakes’ here rather than Craig’s original title!

External data sources were talked about as well. I really like the idea of pulling in sources such as the weather (as Simo Ahava talked about) and seeing what effect that has on behaviour.


Again, not one for ONS, and it was also suggested (by Andrew Hood) not to get too hung up on feeds like this, for risk of talking yourself into hypotheses. But where there is a clear connection between an external data source and an organisation’s proposition, I can see the benefit.

Using social data as an insight tool was the final chunk of the day. Rebecca Carson talked through Brandwatch’s take on the evolution of social media analytics.

Evolution of social media

Rebecca talked through how thinking has moved from volume and sentiment, to understanding.

  • Why sentiment may fluctuate
  • More about users and how brands fit into their lives
  • Audience segment attitudes
  • How segments compare, contrast and can be better served by brands

(This is all good, though *a lot* needs to be done by the industry to inform users that this is one way their data are used – a year old but NetBase’s study on privacy is worth a read to remind us all how big brother this stuff could get).

Rebecca also talked about the challenges of understanding:

  • What users are ‘saying’ vs ‘doing’ online
  • How users act during a brand’s listening exercise vs when asked directly
  • How research data can be modelled against conversion data

…which are, of course, age-old user research issues. In fact, it’s interesting to note that this number-crunching-focused event didn’t have a session or two on the more established methods of user insight, such as eye tracking and interviews. For me, that balance would make MeasureFest an even better event.

(My blog republished from ONS Digital Publishing’s blog.)

As Les Dennis would say, ‘Our survey said’

To get our heads around overhauling our website, we’ve had a first pass at auditing the content, looking initially at page titles and metadata as well as visitor stats. That data has helped us create an up-to-date site map, but also to colour code each item according to its traffic, creating a ‘heat map site map’.

Dorothy House site map / heat map

It shows that:

  • Users don’t delve too far into the site, either from Google or by navigating when they arrive. A common problem for an old site that hasn’t kept pace with technology.
  • ‘Home’, ‘recruitment’ and ‘contact’ are our busiest pages. Maybe because they rank first on Google and, in the case of ‘contact’, because users would rather speak on the phone.
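The colour coding behind the heat map site map can be sketched in a few lines of Python: bucket each page into a colour band by its share of total visits. The thresholds and page figures below are made up for the example, not the real Dorothy House numbers.

```python
# Illustrative sketch of a 'heat map site map' colour coding:
# assign each page a colour band from its share of total visits.
# Thresholds and visit counts are invented for the example.
def heat_colour(visits, total):
    share = visits / total
    if share >= 0.10:
        return "red"      # hottest pages
    if share >= 0.02:
        return "amber"
    return "green"        # barely visited

pages = {"home": 5200, "recruitment": 1800, "contact": 1500, "history": 40}
total = sum(pages.values())
heatmap = {page: heat_colour(v, total) for page, v in pages.items()}
```

With real analytics exports the same idea scales to the whole site map, making it easy to spot the deep pages nobody reaches.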

Now we know more about what’s hot and what’s not, we’re finding out about the why, so we’ve just launched a survey to get the ball rolling. After that we’ll do more user testing to improve the way the site is organised and designed.

Whilst responses roll in, we’re looking deeper into on- and off-page search engine optimisation and also collecting examples of websites we think are good, that we could learn from. Those examples will be stuck to that space on the right in the pic. Any ideas?

(My blog republished from Dorothy House Digital)

The magnificent #MeasureFest

Stephen Pavlovich talking conversion rate optimisation on mobile

MeasureFest is a new conference, put on by seasoned organiser @kelvinnewman. We’ve not worked together, but I’d seen numerous references to MeasureFest pop up on Twitter, so there was obvious interest in it (good sign), it was free (another good sign), and the skeleton agenda promised to cover some great stuff:

  • Analytics vs Insight
  • Attribution
  • Conversion rate optimisation
  • Customer value optimisation
  • As well as call tracking – not so much online, but still very much part of the customer experience that needs to be measured

All this was an even better sign MeasureFest was going to be good.

A handful of the takeaways for me were:

  • Outcomes at the heart of measurement and the business as a whole. This notion of outcomes (measuring users who have used our data) over output (number of visitors to a page) has been around a while but it’s always good to remember that that’s what matters over anything else
  • Incremental vs big. Or in full, incremental changes for incremental improvements vs big changes for big improvements. This is obvious really but makes you take stock of what you’re doing to assess how much impact bits of work will have and whether they’ll meet the ultimate objective if there are big expectations
  • UTM tagging and a clean structure for it are paramount for accurate goal measurement
  • Attribution only focuses on acquisition, doesn’t cover retention or the lifetime value of a user
  • Unique telephone numbers are still the way to track offline activity. Whilst we have one phone number for ONS, statisticians put their direct phone numbers on their publications. Calls to those should be tracked in the same way calls to our 0845 number are.
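The point above about UTM tagging needing a clean structure can be sketched as a small helper that normalises values before they ever reach a URL, so “Twitter”, “twitter ” and “TWITTER” can’t become three different sources in Google Analytics. The `utm_source` / `utm_medium` / `utm_campaign` parameters are GA’s standard campaign parameters; the normalisation rules here are just an illustrative convention, not a prescribed one.

```python
from urllib.parse import urlencode

# Sketch of keeping UTM tagging clean: normalise values once, in code,
# so inconsistent casing and stray spaces never fragment GA reports.
def utm_tag(url, source, medium, campaign):
    params = urlencode({
        "utm_source": source.strip().lower(),
        "utm_medium": medium.strip().lower(),
        "utm_campaign": campaign.strip().lower().replace(" ", "-"),
    })
    separator = "&" if "?" in url else "?"
    return url + separator + params
```

Routing every published link through one function like this is the “clean structure” in practice: the naming convention lives in code rather than in each author’s head.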

Look out for the next MeasureFest in June.

(My blog republished from ONS Digital Publishing blog.)