Page Revamp for Better User Experience

Page Revamp for Better User Experience cover

User experience has always been an important factor in your SEO campaign. Spending time on improving your site’s user experience can go a long way toward your long-term SEO goals. But improving it is not as easy as it sounds.

Giving up on improving a website and settling for a below-average user experience happens more often than you would think. This is problematic because users’ standards and preferences change constantly, and a website that does not adapt accordingly will lose visitors and potential customers.

Fortunately, the best way to adapt and improve your website’s user experience is to revamp your pages. It’s immensely beneficial to most businesses, and it allows your website to keep up with users’ constantly changing preferences and standards. But what does it entail?

Revamping Pages for Better User Experience

In today’s online landscape, establishing and managing a web presence is simpler than ever, but creating a website that keeps visitors coming back is not easily done. Producing a great user experience remains one of the things marketers and SEO specialists struggle with.

Revamping your pages is an integral part of conversion rate optimization, and it helps you make your website and SEO as effective as they can be.

So, some important factors to consider when revamping your pages for better user experience are:

Keyword Targeting

Your website’s keywords determine how users find and identify your website when they search for a product or service your business offers. Unfortunately, not many businesses have the capability to target keywords properly.

Some companies still try outdated, black-hat SEO tactics such as keyword stuffing in hopes of ranking highly. However, as Google rolls out algorithm updates, keyword stuffing has become an ineffective way to rank higher in the SERPs (search engine results pages). So, if you used keyword stuffing before these updates were released, you may still have pages that contain this outdated, harmful tactic.

A revamp of your pages lets you take a closer look at their targeted keywords and ensure you are targeting the right keyword for each page so that it ranks as organically as possible. It also helps you avoid potential penalties on any pages that still contain keyword stuffing, and it can help you identify more appropriate keywords for your SEO campaign.

To help you get started with proper keyword research for your site, you can use SEMrush:

  • Log in and click Keyword Analytics drop-down button

Semrush keyword analytics

  • Click Overview

Semrush keyword analytics overview

  • Enter your keyword in the search bar located above

Semrush keyword analytics overview search bar

  • Take a look at the keyword’s volume, phrase match, and related keywords with their corresponding volumes on the results page to check whether any keywords are better than the one you searched for.

Semrush keyword analytics overview results page

Website Design

Creating a great user experience requires knowing as much as possible about your target audience and designing a website that appeals to them. A study conducted by Adobe concluded that 59% of consumers prefer to engage with content that is beautifully designed.

One of the reasons 59% of consumers prefer beautifully designed content is people’s short attention span. Most of the visitors you receive will only stay if they find your content helpful or if it piques their interest. You will also need a design that is not only catered to your target audience but also helps them scan or read your content more easily. Some things you can incorporate to improve your site’s design are:

  • Headers
  • Simpler Navigation
  • Bullet Points
  • Font that is easy to read

Also, if you are branching out to a mobile website – which you should – make sure you have a responsive web design to support your mobile SEO campaign.
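As a starting point, a responsive setup usually begins with a viewport meta tag plus layout rules that adapt to screen width. Here is a minimal sketch; the class name and breakpoint are hypothetical examples, not a prescription:

```html
<!-- In the <head>: tell mobile browsers to render at the device width -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Hypothetical example: stack the sidebar below the main column on narrow screens */
  .sidebar { float: right; width: 30%; }
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

Google’s mobile-friendly test can confirm whether the resulting pages pass its responsive checks.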


Homepage Information

Once visitors arrive on your homepage, about 52% of them expect to see information about your company. Show them what your business is all about and what value you can bring. Make your visitors feel that whatever you are offering is worth their time.

Aside from information about your business, 64% of visitors will also look for your contact information on your homepage. It’s fine to have a separate “Contact Us” page; however, it is better to include a snippet of it on your homepage.

Just take a look at our business website:

seo hacker business website

All of our information is included above the fold so that our visitors can easily see what we do and how to contact us.

It is best to revamp your homepage around the information your visitors need, and if that’s not possible, revamp your navigation so visitors can reach the content they are looking for more easily.


Content Updates

Revamping your pages also allows you to update existing content and make it more valuable and relevant for your target audience. You’ll need a proper content strategy whenever you update existing content to make sure everything works accordingly.

Check your content, refresh any page that is outdated, ensure everything is properly optimized, and consider re-purposing some pages for better keywords.

As mentioned above, people prefer content presented in a simple yet aesthetically appealing manner. So remember that aside from the content itself, delivery is also an important part of a user’s experience on your website.

Key Takeaway

Revamping your pages for better user experience is an important aspect of your SEO campaign. Be mindful of the metrics you should be looking at when you revamp your pages. This will help you know which strategies work and which do not.

If your website is stagnant and does not adapt to changes in user preferences and trends, it will slowly lose rankings and online visibility – which can ruin any possibility of converting visitors.

Do you have any questions about revamping your pages for better user experience? Comment down below and let’s talk.

Page Revamp for Better User Experience was originally posted by Video And Blog Marketing


What We Learned in September 2017: The Digital Marketing Month in a Minute

While the main tech headlines have revolved around new hardware, with all the major players showing off their new products to the press and public, there have been plenty of updates all around the digital marketing and tech landscapes. Google is making changes to appease both publishers and the EU Commission, and Instagram is becoming an advertising juggernaut…

Industry news

Google responds to EU competition commission woes

Google has responded to the EU Commission’s June ruling that it has breached EU antitrust rules by giving an illegal advantage to its comparison shopping engine (CSE). Google’s new plan involves allowing competing CSEs to bid in an auction to have their listings appear alongside Google Shopping’s in the “one-box” at the top of the search results. Google Shopping will also have to bid, and will be run as a separate entity in Europe in an apparent attempt at a level playing field. We are sceptical that the playing field will truly be all that level – it’s not clear what operating costs the new Google Shopping unit will have to bear – and this also feels like a step up in complexity for individual retailers who will now need to manage competitive listings on multiple platforms. We anticipate that it will be hard for retailers to get visibility into why an individual listing on a given CSE does or does not appear in the Google one-box.

Read the full story (Search Engine Land)

First Click Free is dead

Google has recently announced that it will abandon its ‘First Click Free’ policy, whereby users are allowed to read three free articles a day on otherwise paywalled publications, or the website in question wouldn’t appear prominently in search results. Publishers have hailed the change as a victory, but many (including Distilled CEO Will Critchlow) question whether the replacement is any better.

Read the full story (The Guardian)

Apple drops Bing in favour of Google for Siri web search

Apple has announced a switch to Google as its provider of web searches from within Siri (on iOS) and Spotlight (on mac). Google already powers web searches within Safari and pays Apple a lot (an estimated $3 billion in 2017) in so-called “Traffic Acquisition Costs” (TAC) for the privilege. It’s not yet clear whether more money will be changing hands as part of the updated deal, particularly as the integration will reportedly be API-only and show only organic results, with no ads. Is it possible Apple is even paying Google (or at least reducing the TAC Google pays for the Safari integration)?

Read the full story (Tech Crunch)

Google updates AdWords to tackle Apple ITP problem

Apple has recently signalled its intention to roll out ITP (Intelligent Tracking Prevention) as part of a Safari update, which is aimed at limiting the cross-browsing data that third-party trackers can capture. This poses a headache for Google AdWords. In short, Google’s solution involves a new analytics cookie that will be used to capture conversion data in a way that conforms to ITP.

Read the full story (Search Engine Land)

The super-aggregators

Facebook’s handling of news aggregation has become a highly sensitive and politicised topic over the last 12 months. It has been criticised for censoring too much, not censoring enough and having no real solution to the fake news problem. Ben Thompson of Stratechery argues that Facebook (and others) have become super aggregators, and breaks down potential ways to effectively regulate these massively-powerful companies.

Read the full story (Stratechery)

Apple and Amazon lead new hardware launches

It’s the season for the tech giants to launch their new wares, and Amazon feels like the brand with the most innovative and interesting products being released. The e-commerce company has been in the hardware business for some years now, but its new round of Echo hardware (including the Echo 2, Echo Spot, Echo Plus and more) shows it is still experimenting to see what resonates and drives widescale adoption. And the Apple event didn’t exactly slip under the radar either…

Read the full story (QZ)

Instagram hits huge 2-million monthly advertisers mark

In September, two million businesses bought ads on Instagram. This figure is double the amount bought in March and four times the amount bought in September 2016. Facebook regularly gets five million monthly ad buyers. The purchase and subsequent integration of Instagram by Facebook has, of course, accelerated the growth of advertising with a large overlap of advertisers promoting their products and services on both social platforms.

Read the full story (Marketing Land)

Ahrefs crawlers now executing JavaScript

Ahrefs has announced it will now crawl links found in JavaScript. According to the company, it “will only execute JavaScript if a page has more than 15 referring domains pointing at it”. This currently works out to be about 30 million of the ~6 billion pages it crawls every day and results in the discovery of an additional 250 million links in JS.

Read the full story (Ahrefs)

Google searchers using location modifiers less and less

Google has personalised search based on location for a number of years, and has consistently gotten better at it. However, there has been a question mark over how well searchers understand and adopt this implicit local intent. Recent data suggests that people are finally getting used to the idea: searches including the phrase “near me” are declining, while comparable searches without the location modifier have grown by 150% this year alone.

Read the full story (Think with Google)

Distilled news

SearchLove London 2017 is now just 11 days away. For any last-minute decision making, we’ve compiled the 8 biggest reasons to join us at this year’s conference. You can pick up your tickets here. Senior Designer Leonie Wharton has compiled the most interesting creative work we’ve produced this year, while Principal Consultant Ben Estes has updated his very popular technical audit checklist.

Over on the Moz blog, Robin Lord has written an epic post on how he built a chat bot from scratch (and how you can too), and Zee Hoffman Jones has been laying down the checklist for competitive analyses (protip: Zee will cover this in even more detail at SearchLove London).

What We Learned in September 2017: The Digital Marketing Month in a Minute was originally posted by Video And Blog Marketing

How Faceted Navigation Affects SEO

How Faceted Navigation Affects SEO cover

Faceted navigation is problematic for almost all e-commerce websites. The sheer number of pages e-commerce sites generate for different versions of a single product threatens crawler efficiency – which can negatively affect your SEO. But what does faceted navigation really mean?

Faceted Navigation

Faceted navigation is typically found in the sidebars of e-commerce websites, and it contains both facets and filters. It lets users search for a desired product through a combination of attributes that narrow the listings until they find what they need.

However, facets and filters are different from each other. Their differences are:

  • Facets – These are indexed categories that narrow the listings and act as an extension of the main categories. Facets add a unique value for every selection a user makes, and since they are indexed, facet pages should send relevancy signals to the search engine by containing all the important attributes.
  • Filters – These are used to sort and refine items within the listings. They are necessary for users, but not for search engines. Filters should not be indexed because they do not change a page’s content; they merely present it in a different order – which leads to multiple URLs with duplicate content.

Facet categories and filters


Potential Problems

Every possible combination of facets normally has its own unique URL which can cause quite a few problems for your SEO:

  • Duplicate Content
  • Wasting of Crawl Budget
  • Link Equity Dilution

As your site grows, so does the number of duplicate pages. Inbound links can point to different duplicate pages, which dilutes link equity and limits the ranking capability of these pages.

This also heightens the possibility of keyword cannibalization: multiple pages try to rank for the same keywords, resulting in less stable and lower rankings – which could have been avoided if a single page targeted the keyword.

Solutions for Faceted Navigation

When choosing a solution for faceted navigation, consider what you want in the index: increase the number of pages you do want indexed and reduce the number of pages you don’t. Some solutions for faceted navigation are:

  1. AJAX

This can help with your faceted navigation because when you apply AJAX, a new URL is not created when a user clicks on a facet or a filter. Since there are no unique URLs for every possible combination of facets, the issues of duplicate content, keyword cannibalization, and wasted crawl budget are potentially eliminated.

However, AJAX is only effective when implemented before an e-commerce website launches. It cannot be used to repair old, existing e-commerce sites. You will also need to invest in development time and proper execution for it to solve your faceted navigation problems.

  2. Noindex Tag

A noindex tag tells bots to exclude a certain page from the index – so it will not be shown in Google search results. It can definitely reduce the amount of duplicate content that shows up in the index and search results. Here’s how you can block search indexing using the noindex tag.
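The tag itself is a one-liner placed in the head of each filtered page you want excluded. A minimal sketch (which page it goes on depends on your own facet setup):

```html
<!-- In the <head> of a filter page that should stay out of the index -->
<meta name="robots" content="noindex">
```

For non-HTML responses, the same directive can be sent as an `X-Robots-Tag: noindex` HTTP header instead.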

However, it does not help with wasted crawl budget, because bots will still crawl the page. It also does not help with the distribution of link equity, because a noindexed page still receives link equity.

  3. Canonicalization

Canonical tags let you tell Google which preferred page it should index and rank, and that all other similar versions of that page’s content are merely duplicates of the preferred page that should not be indexed or ranked.

You can use canonical tags to solve the duplicate content problem, and link equity will be consolidated to your preferred page. However, bots will still be able to crawl the pages – which wastes your crawl budget.
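In markup, the canonical tag is a single link element in the head of each duplicate variant. A minimal sketch, with hypothetical example URLs:

```html
<!-- On a filtered variant such as /shoes/?color=red&sort=price,
     pointing search engines to the preferred category page -->
<link rel="canonical" href="https://www.example.com/shoes/">
```

Note that canonical tags are treated as a hint rather than a directive, so keep them consistent across all variants of the same page.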

  4. Robots.txt

Disallowing sections of the site can lead to great results. It’s easy, fast, and reliable. One way to do this is to set a custom URL parameter (e.g. “noidx-2”) to mark all the combinations of facets and filters that you want blocked, appending it to the end of each such URL; alternatively, you can use a meta robots noindex tag in the headers of the pages you want blocked. You can then enter the disallowed custom parameter into your robots.txt file. It looks like this:

robots txt custom parameter
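A rule along those lines might look like the sketch below; the parameter name follows the article’s “noidx-2” example and is purely illustrative:

```
# robots.txt – block crawling of any URL carrying the custom facet/filter parameter
User-agent: *
Disallow: /*noidx-2
```

Googlebot supports the `*` wildcard in Disallow paths, so this matches the parameter anywhere in the URL.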

It is important to note that when you change the URL strings, you need to give the search engine bots 3 or 4 weeks to notice these changes before you block the URLs with the robots.txt file.

However, this also has some problems. Link equity will be trapped: a disallowed URL cannot pass on its link equity. Also, the URL can still be indexed if any “follow” links point to it.

  5. Google Search Console

This is a good method to temporarily fix your problems while you build better faceted navigation, but it should only be your last choice. You can use Google Search Console to tell Google how to crawl your website. Here’s how:

  • Log in to Google Search Console and Click the Crawl drop-down button

Google search console crawl

  • Click the URL Parameters button

Google search console URL parameters

  • Indicate the effect each of your parameters has on the page and how Google should treat these pages

It is important to remember that this only works for Googlebot – it won’t work for Bing or Yahoo.

Faceted Navigation Best Practices

Faceted Navigation Best Practices

Here are some of the best practices you can do to help you create proper faceted navigation:

  • Use AJAX
  • Remove or prevent links to category or filter pages when they have no content (no product or out-of-stock)
  • Allow indexation for certain facet combinations that have high-volume search traffic
  • Set the site hierarchy through breadcrumbs in categories and subcategories, then mark them up with microdata
  • Set a prescribed URL order for facet combinations to avoid duplicate content issues
  • Consolidate indexing properties from the component pages to the whole series through pagination markup with rel=”next” and rel=”prev”
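For the pagination markup in the last point, each page in the series links to its neighbours from the head. A minimal sketch, with hypothetical URLs:

```html
<!-- On page 2 of a paginated category -->
<link rel="prev" href="https://www.example.com/shoes/?page=1">
<link rel="next" href="https://www.example.com/shoes/?page=3">
```

The first page in the series carries only `rel="next"`, and the last page only `rel="prev"`.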

Key Takeaway

No single solution mentioned above is the best one for every site. Every business is different and will face different circumstances; there is no one approach an SEO specialist can use to fix any website. You will have to decide the best solution for your own faceted navigation problems.

An optimized faceted navigation can help your e-commerce site target a wider set of keywords, but it also exposes you to higher risk. Make sure every stage of development is properly executed so that your website prioritizes user experience while also meeting the standards of search engines.

If you have any questions or clarifications, just comment it down below and let’s talk.

How Faceted Navigation Affects SEO was originally posted by Video And Blog Marketing

Negative SEO Protective Measures

Negative SEO Protective Measures cover

Negative SEO has been around for quite some time, and it still prevails to this day. It can damage your website and make you lose valuable rankings. But are there protective measures against it? Can you protect your website from these harmful attacks? Yes, you can.

The release of the Penguin update only increased the effects of negative SEO. Fortunately, Google released some solutions that can control it, but the overall defense against negative SEO is still up to us, the SEO specialists. But what does negative SEO really mean?

Negative SEO

Negative SEO could be considered the “foul play” of the SEO industry: the practice of using black-hat SEO techniques to de-rank competitors. It can be a powerful tactic if done correctly, but it is disliked by the majority of the SEO community, and its practitioners are not doing anything helpful for the SEO industry.

Negative SEO can happen in a variety of ways. It could be through numerous spammy links, forceful crawling, and, in rare cases, plagiarism. However, if you’re diligent enough, you can prevent these attacks from causing irreversible harm to your website.

In case you are a victim, a potential target for attacks, or you simply want to protect your website, here are ways in which you can detect and prevent negative SEO from harming your website.

Link Audits

Routinely auditing links is a good practice that most SEO specialists do. Fortunately, link audits can also help you detect if you are the target of a negative SEO attack. Constant monitoring of your link profile’s growth can help you discover suspicious link activity before it causes further harm to your website.

A standard graph of your link profile growth should look something like this:

cognitive seo standard link velocity graph

However, if you notice a sudden increase or decrease and you have not worked on any link building campaigns, it should be a clear sign of a potential attack.

cognitive seo suspicious link velocity graph

This attack happened to WP Bacon, a WordPress podcast site, back in 2014. It was the target of link farm spam that gave it more than a thousand links with the anchor text “porn movie”. This caused a major drop of more than 50 spots in the rankings for the majority of its main keywords.

Luckily, they immediately disavowed the attacking domains and were able to recover their rankings and traffic.

WPBacon traffic restored screenshot

You can use a link auditing software like CognitiveSEO or manually audit your links, just as long as you are able to properly monitor your link profile’s growth. If you are already a victim of link farm spamming, immediately notify Google and disavow all the suspicious links.
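If you do need to disavow, Google’s disavow tool accepts a plain-text file with one domain or URL per line. A minimal sketch, using hypothetical spam domains:

```
# Lines starting with "#" are comments
# Disavow every link from an entire domain
domain:spammy-link-farm.example
# Or disavow a single offending page
http://another-spam-site.example/bad-page.html
```

The file is uploaded through the disavow links tool in Google Search Console, and it replaces any previously uploaded file for that property.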

Content Scraping

Content scraping is the process of taking content from a more authoritative website and publishing it as your own. It happens because not everyone is good at content creation, which leads some to plagiarize content instead.

It is worth noting that content scraping is usually done by small, underperforming websites, not by authoritative, high-performing ones. Consequently, the usual victims of scraping are high-performing websites.

If you are the victim of content scraping, you can expect consequences if the copied content gets indexed before yours: your page might lose value and, consequently, rankings.

You can use tools such as Copyscape to find copies of your content on the web. If you find a domain that has plagiarized your content, you can either ask the webmaster of the site to remove it or report them by using Google’s Copyright Removal form. You can use the form by filling out all the necessary information that Google needs and submitting it. This is what it looks like:

Google copyright removal form

Keyword CTR

Bartosz Goralewicz experienced something unnatural with a client’s site back in 2014. The site was receiving more than a thousand visits that landed on a certain page and instantly bounced. Since user experience is an important signal, the artificially high bounce rate made the page look like very bad UX.

He found out that somebody had programmed a bot to search for specific keywords his client ranked for, land on the page, and bounce right after – creating a false SERP bounce rate.

You can detect this attack by regularly monitoring the CTR of your keywords. To monitor, you can use Google Search Console. You can do this by:

  • Going to Google Search Console and logging in
  • Clicking the Search Traffic drop-down button

Google search console search traffic

  • Clicking Search Analytics to check your keywords’ CTR

Google search console search analytics

  • Bonus: Here’s a more in-depth look at Google Search Console’s Search Traffic Section


If you find any sudden spikes for no reason at all, immediately contact Google and disavow the links.

Site Speed

Another ranking factor that negative SEO can potentially attack is your site speed. Practitioners of negative SEO can forcefully crawl your website and cause a heavy server load that will lead to the slowing down or even crashing of your website.

If you notice that your site has slowed down, you can use a crawling software like Screaming Frog to discover anything suspicious. If you do find something that proves you’re the victim of an attack, immediately contact your webmaster or hosting company to find out where the heavy server load is coming from and block them using robots.txt or .htaccess.
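On an Apache server, the block itself might look like the sketch below in an .htaccess file; the user-agent string and IP address are hypothetical examples, and the access-control lines use Apache 2.2-style syntax:

```apache
# Return 403 Forbidden to a crawler identifying itself as "BadBot"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule .* - [F,L]

# Or block the offending IP address directly
Order Allow,Deny
Allow from all
Deny from 203.0.113.42
```

Blocking by IP is more reliable than by user agent, since abusive crawlers can spoof their user-agent string.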

Security Upgrades

Aside from negative SEO, cyber attacks are still prevalent today. Making sure all your software is updated, that it has all the available security patches, and that your CMS software has top-notch encryption are just some of the things you need to do to improve your security.

Another important upgrade is switching to HTTPS. Not only does it give you better site security, it also serves as a ranking signal that might help improve your rankings.

An upgrade in security can protect your site when someone tries to hack your website. If your website is hacked and an attacker with malicious intent tampers with your robots.txt file, it could lead to a complete de-indexing of your site and a massive drop in rankings. It is highly recommended that you use rank tracking software to check your site’s visibility. If there are sudden drops, here’s how you can use Google Search Console to investigate:

  • Go to Google Search Console
  • Click the Crawl drop-down button

Google search console crawl

  • Click the robots.txt Tester button to check that your robots.txt file is set up properly

Google search console robots txt tester

  • Bonus: Here’s a more in-depth look at Google Search Console’s Crawl Section

Google My Business Listing

Numerous negative reviews might be a sign that someone is trying to ruin your brand image through fake reviews. Strictly speaking, fake reviews are not negative SEO, but they can still damage your brand image and reputation.

Keep an eye on your Google My Business listing and your online reputation by using social media monitoring software. If you notice anything suspicious about the reviews, here’s how you can flag them:

  • Look for your business on Google Maps
  • Scroll down to the Review Summary on the left panel

Google maps review summary

  • Click the # Reviews button

Google maps review summary button

  • Click the three dots menu to open the Flag function

Google maps flag menu button

  • Click the Flag as Inappropriate button

Google maps flag as inappropriate

Key Takeaway

If you are a victim of negative SEO, your best course of action is to monitor your website as a whole, isolate the problem before it causes more damage, then report it to Google.

Negative SEO might not be common, but do not be complacent. It is better to be protected than to suffer irreversible damage. Take note of the things mentioned above, monitor suspicious changes to your site, and take responsibility for increasing your site’s security.

Do you know any other protective measures against negative SEO? Tell me in the comments below and let’s help each other out.

Negative SEO Protective Measures was originally posted by Video And Blog Marketing

First Click Free is Dead, but is its Replacement Really any Better for Publishers?

The publishing industry has been claiming victory recently in a long-running disagreement with Google over how subscription content (i.e. content that sits behind a paywall or registration wall) should appear in their search results.

There’s a lot of confusion around the new policy which Google has announced, and a lack of clarity in how the media, publishers, and Google itself is reporting and discussing the topic.

Google’s own announcement is typically obtuse in its framing (“Enabling more high-quality content for users”) but has plenty enough information for those of us who spend much of our professional lives interpreting the search engines’ moves to figure out what’s going on.

The short version is that what’s being reported as “ending” the first click free policy is really about extending it. There are some parts of the extension that publishers have asked for, but the key concession Google is demanding – that publishers label the paywalled content in a machine-readable way – will lead to further weakening of the publishers’ positions.

To run through the full analysis, I’m going to start with some background – but if you know all about the history, go ahead and jump ahead to my new analysis and conclusions.

The background – what was First Click Free (FCF)

In the early days of Google, they indexed only content that was publicly-available on the open web to all users and crawlers. They did this by visiting all pages on the web with their own crawler – named Googlebot. At various points, they encountered behaviour that they came to label cloaking: when websites showed different content to Googlebot than to everyone else. This was typically done to gain a ranking advantage – for example to stuff a load of text onto a page containing words and phrases that didn’t appear in the article users were shown with the objective of appearing in searches for those words and phrases.

Google disliked this practice both because it messed up their index, and – the official line – because it resulted in a poor user experience if someone clicked on one of these articles and then discovered content that was not relevant to their search. As a result, they declared cloaking to be against their guidelines.

In parallel, publishers were working to figure out their business models on the web – and while many went down the route of supporting their editorial business with advertising, many wished to charge a subscription fee and allow only paying customers to access their content.

The conundrum this presented was in acquisition of those customers – how would people find the paywalled content? If Googlebot was blocked at the paywall (like all other logged-out users) – which was the only legitimate publisher behaviour that wasn’t cloaking – then none of those pages would rank for anything significant, as Googlebot would find no real content on the page.

Google’s solution was a program they called First Click Free (FCF) which they rolled out first to news search and then to web search in 2008. This policy allowed publishers to cloak legitimately – to show Googlebot the full content of pages that would be behind the paywall for regular users by identifying the Google crawler and specifically treating it differently. It allowed this behaviour on the condition that the publishers allow any user who clicked on a Google search result to access the specific article they had clicked through to read whether they had a subscription or not. After this “first click” which had to be free, the publisher was welcome to enforce the paywall if the user chose to continue to request subsequent pages on the site.

Problems with First Click Free and the backlash

The biggest problem with FCF was that it created obvious holes in publishers’ paywalls and led to the open secret that you could access any article you wanted on many major newspaper sites simply by googling the headline and clicking through. While publishers complied with Google’s rules, there was little they could do about this (they were allowed to implement a cap, but were required to allow at least 3 articles per day – which exceeds most users’ consumption of most paywalled sites, and so effectively constituted no cap).

Many publishers began to tighten their paywalls or registration walls – often showing interstitials or adverts, or enforcing a monthly quota of “first click” articles per user – technically leaving them in breach of Google’s cloaking guidelines, and frequently providing a poor user experience.

Publishers also began to resent, more generally, that Google was effectively determining their business models. While I have always been concerned about exactly what will continue to pay for journalism, I had little sympathy for the argument that Google was forcing publishers to do anything. Google was offering a way of cloaking legitimately if publishers were prepared to enable FCF. Publishers were always welcome to reject that offer, not enable FCF, and keep Googlebot out of their paywalled content (this was the route that The Times took).

Earlier this year, the Wall Street Journal pulled out of FCF, and reportedly saw a drop in traffic, but an increase in subscriptions.

The new deal is really an expansion of FCF

The coverage has almost exclusively described what’s happening as Google ending the FCF program, whereas it really sounds more like an expansion. Whereas before Google offered only one legitimate way of compensating for what would otherwise be cloaking, they are now offering two options:

  1. Metering – which includes the option previously called FCF – requires publishers to offer Google’s users some number of free clicks per month at their own discretion, but now also allows publishers to limit how many times a single user gets free content after clicking through from Google

  2. Lead-in – which shows users some section or snippet of the full article before requiring registration or payment (this is how some publishers implement their paywalls at the moment – so under the new rules they would now legitimately be able to allow Googlebot access to the full, normally-paywalled content, subject to my important notes below)

Google is imposing a critical new condition

However, both of these options come with a new limitation: in order to take part in the expanded scheme they now call Flexible Sampling, publishers must mark up content that will be hidden from non-subscribers using machine-readable structured markup called JSON-LD. Structured markup is a machine-readable way of providing more information and context about the content on a page – and in this case it enables Google to know exactly which bits of content Googlebot is getting to see only because it’s Googlebot (and the publisher is engaging in Flexible Sampling) and what will actually be visible to users when they click through.
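To make that concrete, the markup Google describes looks roughly like the sketch below, using the documented `isAccessibleForFree` and `hasPart`/`cssSelector` properties (the headline and the `.paywalled-content` class name here are purely illustrative – the selector must point at whatever element your site actually hides from non-subscribers):

```json
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example paywalled article",
  "isAccessibleForFree": "False",
  "hasPart": {
    "@type": "WebPageElement",
    "isAccessibleForFree": "False",
    "cssSelector": ".paywalled-content"
  }
}
```

The `cssSelector` line is the key part: it tells Google exactly which portion of the page Googlebot sees only because it is Googlebot.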

And here’s the rub.

This new requirement is listed clearly in Google’s announcement but is getting little attention in the mainstream coverage – probably because it’s a bit technical, and because it isn’t obvious what difference it makes to publishers beyond a bit of development work (*).

To me, though, this requirement screams that Google wants to do the same things they’ve done with other forms of structured markup – namely:

  1. Present them differently in the search results

  2. Aggregate and filter them

(*) Incidentally, the technical requirement that the JSON-LD markup declare the CSS selector for the paywalled content is one that we at Distilled predict will present maintenance nightmares for many publishers – it essentially means that any time a publisher makes a visual change to the user interface on any of their article pages, they need to check that they haven’t broken their compliance with the new Flexible Sampling program. UI changes and markup compliance are often the responsibilities of different teams, and it is very likely that many publishers will regularly break this by accident, in ways that are not obvious to them or their users. It remains to be seen how Google will treat such violations.

1. I’m convinced Google will label paywalls in the search results

My thinking here is that:

  1. Hard paywalls are already labelled in Google News

  2. Many other forms of structured markup are used to change the display in the search results (probably the most obvious to most users is the ratings stars that appear on many product searches – which come from structured markup on the retailers’ sites)

  3. Especially in the case of a hard paywall with only a snippet accessible to most users, it’s a pretty terrible user experience to land on a snippet of content and a signup box (much like you see here if you’re not a subscriber to The Times) in response to most simple searches. Occasionally a user might be interested in taking out a new subscription – but rarely to read the single article they’re searching for right now

Point 3 is the most critical (1 & 2 simply show that Google can do this). Given how many sites on the web have a paywall, and how even the most engaged user will have a subscription to a handful at most, Google knows that unlabelled hard paywalls (even with snippets) are a terrible user experience the majority of the time.

I fully expect therefore to see results that look something like this:

This will:

  • Allow them to offer a scheme (“flexible sampling”) that is consistent with what publishers have been demanding

  • Let publishers claim a “win” against big, bad Google

  • Enable the cloaking that lets Googlebot through even hard paywalls (all but the most stringent paywalls have at least a small snippet for non-logged-in users to entice subscriptions)

  • Avoid having to remove major media sites from the search results or demote them to lower rankings

  • And yet, by labelling them clearly, get to the point that pretty much only users who already have a subscription to a specific site ever click on the paywalled results (the number of subscriptions you already have is small enough that you are always going to remember whether you have access to any specific site or not)

My prediction is that the end result of this looks more like what happened when the WSJ pulled out of FCF – reportedly good for the WSJ, but likely very bad for less-differentiated publishers – and pulling out was something publishers could already do. In other words, publishers have gained very little in this deal, while Google is getting them to expend a load of energy and development resource carefully marking up all their paywalled content so that Google can flag it clearly in the search results. (Note: Google VP for News, Richard Gingras, has already been hinting at some of the ways this could happen in the Google News onebox.)

2. What does aggregation look like?

Once Google can identify paywall content at scale across the web (see the structured markup information above) they open up a number of interesting options:

  1. Filtering subscription content out of a specific search and seeing only freely-available content

  2. Filtering to see only subscription content – perhaps from user-selected publications (subscriptions you own)

    • Possible end-game: connecting programmatically to subscription APIs in order to show you free content and content you have already got a subscription for, automatically
  3. Offering a bundle (Chris Dixon has written on why bundles make economic sense for both buyers and sellers). What if you could pay some amount that was more than a single subscription, but less than two, and get access to 5 or 6 major media sites? It’s very likely that everyone (except publishers outside the bundle!) would be better off. Very few players have the power to make such a bundle happen. It’s possible that Google is one of those players.
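To put made-up numbers on that bundle arithmetic (all prices here are invented for illustration): suppose each of 6 publications charges £10/month on its own, and the bundle is priced at £15/month – more than one subscription, less than two.

```python
single_price = 10.0   # hypothetical monthly price per publication
bundle_price = 15.0   # more than one subscription, less than two
sites_in_bundle = 6

# The reader gets access to all six titles for 1.5x the price of one...
effective_price_per_site = bundle_price / sites_in_bundle  # 2.5 per title

# ...and each publisher earns something from readers who would never have
# bought a second full-price subscription on their own.
assert single_price < bundle_price < 2 * single_price
```

This is the core of the bundling argument: the per-title price collapses for the buyer, while sellers capture revenue from readers who were otherwise worth nothing to them.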

Under scenario #3, Google would know who had access to the bundle and could change the display in the search results to emphasise the “high quality, paid” content that a particular searcher had access to – in addition to the free content and other subscription sites outside the bundle. Are we going to see a Spotify for Publishers? We should all pay close attention to the “subscription support” tools that Google announced alongside the changes to FCF. Although these are starting with easy payment mechanisms, the paths to aggregation are clear.


Ben Thompson has been writing a lot recently about aggregators (that link is outside his paywall – a subscription I wholeheartedly recommend – I look forward to seeing his approach to the new flexible sampling options on his own site, as well as his opinions). Google is the prototypical super aggregator – making huge amounts of money by aggregating others’ work with effectively zero transaction costs on both the acquisition of their raw materials and their self-service sale of advertising. Are they about to aggregate paid subscription content as well?


Publishers are calling this a win. My view is that the new Google scheme offers:

  1. Something that looks very like what was in place before (“metering”)

  2. Something that looks very like what pulling out of FCF looked like (“lead-in”)

And demands in return a huge amount of structured data which will cement Google’s position, allow them to maintain an excellent user experience without sending more traffic to publishers, and start them down a path to even more aggregation.

If paywalls are to be labelled in the search results, publishers will definitely see a drop in traffic compared to what they received under FCF. The long-term possibility of a “Spotify for Publishers” bundle will likely be little solace in the interim.

Are you a publisher?

If you’re wondering what you have to do as a result of these changes, or what you should do as the landscape shifts, don’t hesitate to get in touch and we will be happy to discuss your specific situation.

First Click Free is Dead, but is its Replacement Really any Better for Publishers? was originally posted by Video And Blog Marketing

SEO Hacker’s Content Strategy for SEO

Content Strategy for SEO cover

In today’s online landscape, an effective SEO content strategy is what most SEO professionals yearn for. But where do we find it? Is there a blueprint for an effective SEO content strategy that would improve your SEO indefinitely? Yes, there is.

Here are the main factors we monitor for our SEO content strategy:

  1. Easier Topics
  2. Content Layering and Internal Links
  3. Linkability
  4. Content Updates

Having an effective SEO content strategy can help you improve your site’s overall ranking. Let’s get started.

Easier Topics

The first step in any SEO content strategy is topic ideation. The process begins with keyword research – analyze and inspect keywords or topics that are relevant to your brand. You can use any keyword research tool that you’re comfortable with; however, there are three factors you must take note of: search volume, traffic value, and keyword difficulty.

The best thing that could happen is that all your desired keywords have high search volume with low difficulty. Those keywords, however, are few and far between. Oftentimes, keywords with low difficulty are the better choice to target.

There are numerous tools you can use, such as SEMrush or CognitiveSEO’s Keyword Tool. Here at SEO Hacker we use CognitiveSEO’s Keyword Explorer. It’s simple: you just put in your desired keyword, and it automatically shows you the data. It measures your keyword’s difficulty from zero (0), the easiest, to one hundred (100), the most difficult.

Cognitive SEO Keyword Difficulty screenshot

I would also highly advise you to do your homework and check keyword difficulty manually. How do we do it?

Simply put your desired keyword into the search bar, and analyze the results. If the first page of results contains popular business names or highly authoritative websites, it usually means the keyword is more difficult. Conversely, if you see low-authority websites, it usually means lower difficulty and much better chances to stand out with outstanding content.

Manual Google Keyword Research screenshot

Take note that if you’re a new business, one without much authority to work with yet, it may be better to go for niche keywords. Being able to stand out in a less competitive environment will give you more opportunities to rank higher and gain more links.
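The manual check above can be sketched as a crude heuristic: the stronger the sites on page one, the harder the keyword. This is our illustration only, not a formula from any tool, and the domains and authority scores below are made up.

```python
def manual_difficulty(page_one_results):
    """Rough, illustrative difficulty estimate for the manual check:
    average the authority of the page-one results.

    page_one_results: list of (domain, authority) pairs, where authority
    is a 0-100 score from whatever metric you trust.
    """
    avg = sum(score for _, score in page_one_results) / len(page_one_results)
    return round(avg)

# A page one full of big brands scores as difficult...
hard = manual_difficulty([("bigbrand.com", 92), ("majorsite.com", 88), ("news.com", 85)])
# ...while low-authority results suggest an easier, nichier target.
easy = manual_difficulty([("smallblog.net", 24), ("localshop.com", 31), ("forum.org", 18)])
```

In practice you would eyeball the SERP rather than compute anything, but the decision rule is the same: a high average on page one means look for a nichier keyword.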

Content Layering and Internal Links

Content layering basically means that you layer your content according to topic and funnel stage. You “layer” your middle-of-the-funnel content (content that gives specific solutions to a particular query) above your bottom-of-the-funnel content (your converting/landing pages) through internal linking. We do this because it is extremely beneficial to our website.

Some benefits include:

  • Authority Sharing: Your middle-funnel content has more “linkable” assets than your bottom-funnel content, so it is more likely to receive inbound links. Eventually, it passes valuable link equity down to your bottom-funnel content. This matters because almost no one will link directly to bottom-funnel content, which is promotional in nature.
  • Site Authority and Popularity: If your middle-funnel content is ranking highly in the search results, you’ll notice your site’s authority and popularity steadily improving. Your bottom-funnel content then benefits in turn, through increased conversions and more sales.
  • Link Procurement for Bottom-Funnel Content: Promoting your middle-funnel content can help you procure links for your bottom-funnel content through the strategic placement of its links inside the promoted content.

Lastly, content layering and an organized link structure help both users and search engines navigate your site more easily.


Linkability

“Linkability” is your content’s potential to earn links. And we all know that links remain one of the most important ranking factors. So, you should create content with “linkability” in mind.

Remember that people usually link to websites due to the relevance of the content. You should have an idea of the people or webmasters you want to reach out to, and build your content in such a way that your potential of getting an inbound link from them increases.

Some factors that we consider for our outreach targets are:

  • Keyword Search Volume: If the search volume for your keyword is relatively high, then it’s highly likely that more people will link to your content.

    • We normally use CognitiveSEO’s keyword tool to find out our target keyword’s monthly volume results. We just put in our target keyword in the tool’s search bar, and wait for the tool to load the data for our keyword.
      Cognitive seo keyword tool
    • Afterward, we’ll be taken to the results page, where the keyword difficulty, average content performance, and monthly volume are shown at the top of the page.
    • Bonus: Here’s a more in-depth discussion of CognitiveSEO’s keyword tool.

Cognitive seo monthly volume

  • Unique Domain Links to the Top Results: If the top results for our target keyword have numerous unique domains linking to them, our chances of strategically acquiring links from those domains increase, thanks to our content’s relevance. We usually use Ahrefs or Buzzsumo to check the backlinks of the top results.

Buzzsumo backlink checker screenshot

Aside from studying prospective outreach targets, we also have to examine our link opportunities. A simple approach is to create something better than what is currently ranking – more commonly known as the skyscraper technique. We create better content than what’s ranking, then reach out to the people who linked to the original content.

Creating better content does not necessarily entail spending countless hours trying to come up with the best version; sometimes you only need to change a few aspects to produce a much better one.

Some changes we make are:

  • Writing a more searchable title
  • Refining and organizing readability and structure
  • Improving page speed
  • Incorporating trusted sources and data
  • Explaining the topic in more depth
  • Including recent or original data/research
  • Presenting it in a different format (e.g. video, infographics, etc.)

Updating Content

Aside from creating new content, our comprehensive SEO content strategy also includes updating existing content or pages – which is easier than starting over to create something new.

For example, you can add a video to long-form guides which can add value and linkability to the page.

Another thing we do is to repurpose our underperforming pages. We inspect the search results for our targeted keyword, check if there is a prevalent format among the top results, then apply that format to our underperforming pages.

It’s important for SEO specialists to invest time, money, and energy into creating new, engaging content, but it’s just as important to improve and update existing pages.

Key Takeaway

Just to help you remember the SEO content strategy above, here’s a short recap:

  • Easier Topics:

    • Judge topics by their search volume, traffic value, and keyword difficulty.
    • Another option is to manually find easier topics through search results that show low authority sites.
  • Content Layering and Internal Links:

    • Through proper content layering and organized link structures, your bottom-funnel content can benefit from shared link equity, improved domain and site authority, and more link acquisitions.
  • Linkability:

    • Determine potential outreach prospects through keyword search volume, unique domains that link to top results, and the types of pages.
    • Use the skyscraper technique to have more link opportunities.
  • Updating Content:

    • Upgrade your existing content
    • Repurpose existing pages into something more “link worthy”.

Effective SEO content strategies are important for your SEO goals. So, read, learn, and apply the practices explained above to help your content creation and your SEO campaign.


8 Reasons You Should Join us at SearchLove London 2017

We’re closing in on SearchLove London – it’s on 16th and 17th October – in just a few short weeks’ time. We’ve been running a conference in our home city since 2009, and I’m as passionate as I’ve ever been about making our events stand out.

You can still get a ticket for under £900 – the classic all-access pass costs £899 +VAT – and get access to the whole conference as well as after-show entertainment on both nights.


Here’s why we think our show is special:

1. Quality

The combined outstanding and excellent ratings from a recent conference.

I obviously don’t generally get to see all the feedback other conferences get, but I’d bet ours is right up there. At one of our recent events, our eight best speakers were all rated outstanding or excellent by over 9 out of 10 people in the audience. Even our twelfth-best speaker was rated outstanding or excellent by 4 out of 5 people in the audience. I’ve never seen another conference where the bottom-quartile speaker ratings still reach the ~65% outstanding-or-excellent range.

Speaker quality and consistency is our top priority, and the most common complaint about conferences generally. With our conference being a single track show, we know everyone will see every speaker, so they all need to bring their A game, and they know it.

2. The speakers

We’ve invited some of the best speakers from Boston and San Diego to London 2017.

Speaking of the speakers(!) I’m so grateful to all the people who put such an incredible amount of work into preparing their talks – if you’ve never done it, you have no idea how much work and pressure it can be.

This year, we have:

  • Exceptional speakers: we often invite back speakers who do an exceptional job at our other conferences. Running events on both sides of the Atlantic might bump up our travel costs, but it lets us see great speakers with our own eyes before inviting them to our big stage:

    • To come in the top 4 at our San Diego conference this year, a speaker needed to get over 97% of the audience rating them outstanding or excellent. At this London event, we’re bringing 3 of those top 4 speakers back to wow you. (*)

    • In Boston earlier this year, our top 3 scored 94+% outstanding or excellent and we’re bringing all of them to London.

  • Returning favourites: 3 of the top 5 all-time best SearchLove speakers (looking at average scores from speakers who’ve appeared multiple times)

  • Brand new speakers: 11 of our 17 speakers have never appeared at SearchLove London before (and Paddy last spoke here in 2011 / Justin in 2012!). We’re confident they’re going to blow you away (see below for more on our prep process)

(*) the other top-4 speaker was Greg Gifford (DistilledU members can see the videos here). It looks like we need to invite him over to London soon!

3. A great venue

The Brewery adds to the SearchLove London experience.

As a speaker, I’ve rarely come across a stage as good as the one at The Brewery. It’s a huge widescreen, with extra massive screens partway back so everyone can see my slides, the stage is huge, my face is projected far too big alongside the slides giving great trolling opportunities when I pull stupid faces, and the audio / visual setup is top-notch. I trust the A/V team to make me look and sound good, and I get to concentrate on my story.

As a delegate, you get a seat with a desk, power, notebook and pen. You get wifi that works, and you get top-notch food and great coffee. Join us for structured lunchtime work at our Topic Tables staffed by the Distilled team, or just hang out and catch up with friends new and old.

4. A taste of London

Enjoy your time in one of the most vibrant cities in the world.

We know that many of our delegates travel to attend, and so we’ve picked our venues for the conference and entertainment to help you make the most of your trip to one of the greatest cities on the planet.

The Brewery is in The City of London – the historic Square Mile – so you’ll get a taste of the traditional. The entertainment is conveniently nearby, and you’re within easy walking distance of the buzzing Old Street technology hub (with its great hipster coffee) as well as Clerkenwell with its spectacular restaurants and fancy bars. Even in the time I’ve lived and worked in London, I’ve seen a dramatic improvement in the food and drink scene – all great excuses to make the trip to an incredible city.

If you want to spend a bit of time visiting London either side of the conference, you can be anywhere in the centre of London within 20-30 minutes by public transport, whether you want to see the tourist sights or do some shopping. If you want to extend your trip to the rest of the UK, you’re close to King’s Cross and Euston stations, which connect you to almost everywhere north of London (and to St Pancras for Paris and the rest of Europe).


5. Access to experts, and the chance to meet friends old and new

We work hard to make networking with fellow attendees as enjoyable as possible.

We know that much of the value in attending a conference comes from meeting speakers and other delegates so we set up plenty of opportunities to do that:

  • VIP ticket-holders join the speakers for an exclusive pre-show dinner.

  • We have chosen to have a single-track event, with every speaker getting a full-length 40-minute session – this means that every other delegate has seen the same speakers you have, and so you’ll have plenty to chat about, and all our speakers will be very familiar to you and super-approachable

  • Plenty of opportunities to mingle and meet people – including structured and unstructured lunchtime sessions, regular breaks, a fantastic party on the first night and industry meet-up on the second (which even non-delegates can attend so invite your other London friends)

6. Past delegates would urge you to come

You might have noticed that I’m a bit obsessed with feedback. As part of the conference feedback, we ask our delegates to tell us how likely they are to recommend our conference (out of 10). From this, we calculate a Net Promoter Score (NPS). NPS ranges from -100 to +100, with anything over 50 being excellent. Last year’s London conference rated a 55 with almost half the delegates surveyed (44%) giving it the top possible score of 10.
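For reference, NPS is simply the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6); passives (7-8) count only toward the total. A minimal sketch, with made-up sample scores:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6),
    on a -100..+100 scale. Passives (7-8) only dilute the total."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Made-up example: four promoters, one passive, one detractor
score = nps([10, 10, 9, 10, 8, 4])  # (4 - 1) / 6 * 100 -> 50
```

This is why a score over 50 is considered excellent: it requires promoters to outnumber detractors by more than half the entire audience.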

7. Coming from overseas? It’s cheaper than ever

Without getting too political about it all, our currency has been fluctuating a bit over the last year, and so right now, our tickets come in at only:

  • EUR 1,021

  • USD 1,212

There has to be some silver lining, right? If you’re coming from the US or Europe, the exchange rate has never been more in your favour. Your money goes further!
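Those figures are just the £899 base price converted at the approximate exchange rates of the time. The rates below are back-derived from the listed prices, so treat them as illustrative snapshots rather than live rates:

```python
GBP_PRICE = 899  # the all-access pass, ex-VAT

# Approximate GBP exchange rates implied by the listed prices
rates = {"EUR": 1.136, "USD": 1.348}

converted = {currency: round(GBP_PRICE * rate) for currency, rate in rates.items()}
# -> {'EUR': 1021, 'USD': 1212}
```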

8. We’re working hard to address all the criticisms we’ve seen of marketing conferences

It turns out good coffee is high on people’s conference priorities.

The other day, I put out a call on Twitter to ask for everyone’s common complaints about marketing conferences because I want to make sure that we are doing our very best to avoid them – whether they’re big complaints or small details:

  • We take our code of conduct very seriously and work hard to make our events welcoming and inclusive for all – and I’ve heard good private feedback about our efforts:

    • We remind speakers about it during our prep calls

    • It’s emphasised during our MC’s intro

    • All our staff know what to do in the event of witnessing or receiving a report of a violation

  • We’ve got sessions on hardcore link building and deeply technical topics – we’ve got plenty on content and social, but we haven’t forgotten our roots

  • And a load of details:

    • The food is great – delegate comments:

      • “the general organisation and food etc. were top notch”

      • “really good food”

      • “great food”

      • “great venue and food”

    • The wifi works

      • “good wifi”

    • We have great coffee

      • “coffee was awesome”

    • Complaint: lanyards can be hard to read or flip over. Our lanyards have names printed on both sides – hopefully big enough to read easily

But of course, by far the most common issue people have is with speaker and talk quality. I talked a fair bit above about our speakers but we are by no means assuming that we’ve done all we need to do – we continue to run a speaker selection and preparation process that involves:

  1. Detailed research, including watching previous footage, reviewing past decks etc

  2. Discussion of topic ideas that the speaker has new and interesting ideas about

  3. Content calls with me or a senior Distilled team member to set expectations, discuss the outline, and share information about the conference and audience

  4. Where appropriate / for any speaker that wishes: review and feedback on actual talk outlines and draft decks

We also encourage first-time speakers to review footage of past top-rated sessions and speakers.

I asked a few of our speakers for their thoughts on our speaker prep process. They said:

Emily Grossman:

“The SearchLove team really sets speakers up for success. It all starts with initial planning brainstorms where we talk about the best topic-fit for SearchLove. Will, Lynsey, and the whole team are very open about what works and doesn’t work for their audience. As a speaker, this helps shape how I’ll approach a certain subject and allows me to really tailor both my topic and my deck to the SL crowd.”

Greg Gifford:

Sam Noble:

What are you waiting for?

There’s still time to pick up your ticket, but time is running out. Click the link below and pick up your ticket today. Reply in the comments if there are any last-minute questions you’re burning to ask.

Join us for SearchLove London 2017


Communicating Value Effectively: Respecting the customer’s right to draw their own conclusions

No one likes a braggart.

When someone states that they’re the best at something, my immediate reaction is to question such a bold claim — and to get a little irritated. It’s cute when a kid does it. It’s not so cute when an adult does the bragging.

Customers feel the same way when they visit your webpage. They are bombarded daily by marketing ads that love to use that word “best.”

“We’re the best” … “We have the best” … “We make the best …”

“The goal of marketing is not to make a claim; the goal of marketing is to foster a conclusion.”

— Flint McGlaughlin, CEO and Managing Director, MECLABS Institute

But if you want to stand out in the marketplace, instead of making a claim that you are the best, show your prospective customers that you are the best — with specific, quantifiable facts. Then, let them draw their own conclusions whether you are, indeed, the best at what you do, or not.

When you allow the customer the freedom to do their own thinking — to infer from a solid list of quantifiable, credible reasons — you are valuing the customer, and, in return, the customer values you and your product or services.

In this Quick Win Clinic, Flint McGlaughlin looks at a claim made by book creator website Bookemon that states it is the “Best Book Creator,” and evaluates how well it presents the facts about said claim.


Clarifying Your Marketing Objective: The danger of asking “how?” too soon

In an earlier Quick Win Clinic, Flint McGlaughlin, CEO and Managing Director of MECLABS, talked about the importance of determining a clear objective for your webpage.

But where does the marketer go from there?

The next step is to determine the most effective way to accomplish your objective.

“What is my objective and what is the most effective way to accomplish my objective? … We have to give people a reason to invest their mental energy in going forward.”

—Flint McGlaughlin, CEO and Managing Director, MECLABS

It is at this point that we marketers should avoid making the mistake of rephrasing the question and asking, “How can I accomplish my objective?”

The “how” question is insufficient because it doesn’t force you to (1) generate options and (2) select from those options the one that promises the best way to accomplish your objective.

In this Quick Win Clinic, McGlaughlin optimizes The Recruiting Code, a webpage submitted by Bryan, whose objective is to sell a book. Watch the video to see if the objective was accomplished in the most effective way.


Marketing is Not About Making Claims; it’s About Fostering Conclusions

Imagine for a moment you are in the 10-items-or-less line at the grocery store. There is a man in front of you getting rung up. He’s wearing sunglasses and a suit. You note amusingly to yourself that he must be especially sensitive to fluorescent light. He’s talking loudly on the phone while the clerk patiently scans his only items: 11 huge containers of protein.

“I’m a closer Frank — it’s what I do,” he gabs into his late-model iPhone Plus. “I’m the best in this city. Believe me. You’ve never seen a closer as good as me, Frank. Frank? You there Frank? Yeah, did you hear what I said Frank? I’m a closer!”

Once the clerk is done ringing him up, he pays, mouths “thank you” and plops a glossy, white business card on the counter. Looking from the clerk to you he points to the card, shoots both of you a thumbs up, gathers his protein into his cart, and walks out the door continuing his deafening conversation with Frank.

His card features a typical real estate logo and a glamor shot of his bust, this time without sunglasses, though you do make another half-amused note to self that he is wearing the same tie.

Why Marketers are Just Like Frank’s Photophobic Associate

I took a while painting that picture for you because — every day — marketers do the same thing as Frank’s photophobic associate. We make wild claims about ourselves and expect people to be impressed, when, really, all we’re doing is helping them conclude that we’re not the kind of company they would want to do business with.

The worst part is that a business usually exists in the marketplace because it DOES have real value to offer customers. But most of us don’t know how to communicate that value to our customers effectively.

When we get it right, however, and foster conclusions in the mind of the customer rather than making claims, the results can be powerful.

Take this MECLABS certified experiment recently run with a single-product nutrition company.

Experiment: Background

Test Protocol: TP1798

Experiment ID: Protected

Location: MECLABS Research Library

Background: A single-product company that sells high-quality, all-natural powdered health drinks

Goal:  To increase order conversion

Primary Research Question: Which of the following pages will produce the highest conversion rate?

Approach: A/B multi-factorial split test

Experiment: Control

[Control landing page screenshot]
Now, take a moment to look at the Control in this test. Before you read any further, try to identify any photophobic-guy-like claims in the page copy; it will help you see what I mean.

Now, they aren’t as dramatic as our opening character, but they are there.

  • Boost Your Energy and Metabolism
  • Improve Digestion and Gastrointestinal Function
  • Detoxify and Alkalize Your Body at a Cellular Level
  • Save Time and Money
  • Limited Offer! Act Now!

There’s more, but let’s just focus on these for a second. At face value, this seems like good copywriting. The words are well-chosen and interesting, and they have a kind of energy to them. But at heart, they are just bragging.

As a result, a customer reading this page must approach it with a kind of suspension of disbelief in order to continue. Maybe the people who buy already know the company is trustworthy, so they go on to fill out the form and purchase.

But what about the people unfamiliar with the company? To them, this is just another fad super-food that claims it’s the best. There’s no evidence, no logical argument, no facts to back up what they are saying.

But now, consider the Treatment in this experiment as a contrast.

Experiment: Treatment

[Treatment landing page screenshot]
In the Treatment, only a small amount of copy changes, but it achieves an entirely different result in the mind of the customer. The copy now focuses not on claims but on facts, which, in turn, foster the overall conclusion that this is an excellent product worth paying for.

  • Made from 75 whole food sourced ingredients in their natural form
  • Contains probiotics and enzymes for optimal nutrient absorption and digestion
  • Carefully formulated by doctors and nutritionists to deliver essential nutrients
  • 10+ years of research to develop an easy-to-mix powder with a naturally sweet taste

What’s the result?

Experiment: Results

The result is a 34% increase in conversion. And for an ecommerce product like this one, that translates to pure revenue.
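For readers who want to sanity-check a lift like this on their own tests, a standard way to judge whether a conversion difference is real (and not noise) is a two-proportion z-test. The sketch below uses hypothetical visitor and conversion counts chosen to illustrate a 34% relative lift; they are not the actual MECLABS traffic figures, which are not disclosed in the article.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B conversion experiment.

    conv_a / n_a : conversions and visitors on the Control
    conv_b / n_b : conversions and visitors on the Treatment
    Returns (relative lift, z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, z, p_value

# Hypothetical counts: 2.00% vs 2.68% conversion, i.e. a 34% relative lift
lift, z, p = two_proportion_z(conv_a=200, n_a=10_000, conv_b=268, n_b=10_000)
print(f"lift = {lift:.0%}, z = {z:.2f}, p = {p:.4f}")
```

With these illustrative numbers, the p-value comes out well under 0.05, which is why a lift of this size on reasonable traffic is typically treated as a validated result rather than random variation.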

Foster Conclusions, Don’t Make Claims, Make More Money

In the end, people are still people. We are mostly reasonable. We hear arguments and we can change our minds. But when we hear someone making braggadocious claims, rather than trying to rationally win us over, we are naturally repulsed. Your customers are the same way. And when we foster the right conclusions in their minds about us using facts, data, and tangible evidence, we will inevitably feel better about our marketing, and make more money in the process.

You Might Also Like:

The Prospect’s Reality Gap

The Web as a Living Laboratory

Brand: The aggregate experience of the value proposition

The Boston Globe: Discovering and optimizing a value proposition for content

Marketing is Not About Making Claims; it’s About Fostering Conclusions was originally posted by Video And Blog Marketing