Month: March 2019

Distilled seeking less experienced speakers for SearchLove Boston 2019

Over the past 8 or 9 months, you might have heard us talking about the SearchLove community speaker sessions. These are 20-minute speaking slots in front of 200+ people that we ran at our London and San Diego conferences, delivered by a group of relatively new speakers. Each speaker received support and coaching from the Distilled team, with the aim of giving them a platform to showcase their knowledge, talent and opinions.


Now to the best bit. We’ve been so happy with our previous community speakers that we’re rolling the programme out to our East Coast conference, SearchLove Boston 2019. If this sounds like something you’d like to be a part of, then we’re on the lookout for speakers who:

  • Are Boston locals (no further than a 2-2.5 hour drive from the venue). We want to support the community where our conference runs and help our speakers raise their local profile.
  • Are looking for their first big stage speaking opportunity
  • Are available to join us at SearchLove Boston on June 10 & 11, 2019.

Before you apply, we recommend having a read of the details below. Once you’ve done that, grab your camera and send us your pitch to join us in Boston. You’ve got just over two weeks to apply (deadline: April 16, 2019).


What’s in it for us and our audience?

Our events team spends a huge amount of time reviewing speaker applications, visiting conferences on both sides of the Atlantic and talking to the best speakers on the circuit, so we can gather them all under one roof at our SearchLove conferences. We’ve had many speakers who have blown us away, and we love it when they ask us if they can speak again, but over the last nine months we have set ourselves the goal of finding even more emerging talent.

So far we’ve had 7 community speakers standing shoulder to shoulder with some of the best digital marketers in the world, and they’ve all held their own (some were rated in the top 3 speakers). What makes this so exciting for us is that these are marketers who work on the front line of digital marketing at their companies every single day. They take a leap of faith and stand on stage telling us about their successes and failures, so we can all learn from them.

We want to make SearchLove diverse. We all work in an industry that is dominated by white men, and we hold ourselves responsible for developing a more diverse pool of speakers who can appear on stages across the world in the future. SearchLove Boston 2018 was our first conference with over 50% women speakers, and that’s a number we want to sustain and grow.

We are aware (as with previous rounds of community speaker applications), that a lot of applicants will be white men, but if that doesn’t describe you then we’d especially encourage you to grab your phone and send us an application.

Apply now

What’s in it for you?

Here’s the full package you’ll receive if you are successful (along with your 20 minutes on stage!):

  • Introductory call with the Distilled team
  • Multiple video calls to run through your presentation with the Distilled team and get feedback
  • Deck review and content call to bounce around your session ideas
  • 1 to 1 ongoing support from a Distilled team member
  • Final in-person review with Will Critchlow and the Distilled team in Boston before the conference
  • VIP ticket to attend SearchLove Boston including attending the VIP dinner with all the other speakers the night before the conference
  • A nice bunch of Distilled and SearchLove swag

We are extremely excited to be rolling this scheme out to our second US event after the success of San Diego and London (where one of the community speakers broke into the top 3 rated speakers!). But as this is an opportunity for local folks, we’ll only be accepting pitches from applicants living within the Boston area (a 2-2.5 hour drive from the venue) for this particular conference.

A note on the video requirement

As part of the application process, we require all speakers to send us a short video. Ultimately, we are looking for speakers who get us excited, and sometimes words on paper don’t tell the whole story. We also wanted to level the playing field as newer speakers might not have access to professionally shot and edited videos. By doing this we have found that the application process is more inclusive and gives everyone their best chance to shine and stand out.

We do not expect the video to be professionally lit and shot. The most important thing is that we want to see your energy and enthusiasm about your topic and the chance to be a part of SearchLove Boston. Need an example? Have a look at Will’s short video below to see the sort of thing we’re looking for.

Seriously, a selfie video shot on your mobile phone or webcam is perfectly acceptable, but have a think about how you might stand out from the crowd.

Final words

After reading this, we hope that you’re super excited to take a leap of faith and fill in the application form. If you’re still on the fence, we’d encourage you to just give it a go.

Don’t fall prey to imposter syndrome. You’re working in the industry, doing these things day in and day out. We want to hear about your experiences and share your journey. Don’t worry about how much speaking experience you have – we promise we’ll do everything we can to make your presentation excellent. The best advice we can give you is to apply.

Let our previous community speakers convince you

I thought the feedback was the best thing about the whole programme. I was, perhaps, a little sceptical in advance but completely a fan.

[It] was comfortably uncomfortable! Having your work pulled apart is never easy, but it was done in a really encouraging way and brought the best out of all of our presentations.

Andi Jarvis, SearchLove 2018 community speaker

How to apply

You’ll need to tell us:

  • Why you’d like to speak at SearchLove Boston
  • Where you are based
  • What your speaking experience looks like so far
  • What topic you’d like to talk about – the more specific and actionable a topic you can describe, the better
  • Remember, the closing date for applications is April 16, 2019

And you’ll need to send us a short video as described above.

Apply now

Distilled seeking less experienced speakers for SearchLove Boston 2019 was originally posted by Video And Blog Marketing

Old Search Console Shutting Down: New Features You Need To Know


It’s the end of an era for the old Google Search Console, as it will finally be shut down on March 28, 2019, to make way for the complete integration of the new Search Console. This has been in the works for the past few months, as users have been gradually introduced to the newest Search Console features to give them time to become familiar with the new interface.

After a beta period during the early part of 2018, the new Google Search Console has introduced features such as the URL Inspection Tool and Domain Properties. The new version brings a refined interface and a host of additional features, so let’s take a look at each one.

New Sitemap Features

One of the latest updates to the Search Console, the new sitemap features allow users to delete sitemaps, submit RSS and Atom feeds, and review more sitemap error details. Before this update, the only way users could delete sitemaps was through the older version of the Search Console, which made the process less efficient, especially with the need to switch back from the new interface.

Sitemap Support GSC

Deleting a sitemap comes in handy, as it allows you to stop data from being recorded in the Search Console, especially if you are looking to streamline the number of websites you want to track. You can also view sitemaps in a new tab, which makes viewing multiple sitemaps possible. The new sitemap report also provides more detail than before, allowing you to monitor errors and issues more efficiently. This level of detail lets you see the specific elements that need to be fixed, making the process of optimizing sitemaps much smoother.

URL Inspection Tool

A feature introduced over the past few months, the URL Inspection Tool allows you to view your website’s HTML, a screenshot, the HTTP response, JavaScript console messages, and page resources. You can inspect any URL within the website, allowing you to view individual pages in detail. This also means that Fetch As Google is no longer included in the new Search Console, as a screenshot can now be viewed in the Inspection Tool itself. Another addition is live URL testing, which lets you assess webpages in real time so that any issues can be spotted and remedied as soon as possible.

Fetch As Google Going Away

Domain Properties

The other recently introduced feature is Domain Properties, which organizes your domain listings for each website, making subdomains and non-HTTPS properties much easier to access. While this may be a minor update compared to some of the other features, it allows users to access data more efficiently, removing the need to scroll through long lists of domains just to find the right one.

Google-selected Canonical URLs

Another feature integrated into the URL Inspection Tool shows which canonical URL Google Search has automatically selected on your website. This is important to know, as Google has a tendency to ignore rel=canonical at times. The tool lets you see which URL Google considers canonical, with the option of requesting a change if Google has selected an incorrect one.

Retiring the “info” Command

Alongside this new feature, Google has also retired the info: command, as it was rarely used. The same data can be viewed in the Search Console, so retaining the command would have been redundant. With this in mind, the URL Inspection Tool aims to be the better solution for website owners and SEOs to track their URLs.

Key Takeaway

The new Google Search Console aims to provide users with more comprehensive information than before, with features that let you view more data through an interface that is organized and user-friendly. Like all other Google services, the new Search Console should receive a slew of updates during the rest of the year, providing new features that will make one of Google’s most effective tools even better.

If you have questions about Google Search Console or SEO in general, leave a comment below and let’s talk.


Old Search Console Shutting Down: New Features You Need To Know was originally posted by Video And Blog Marketing

How to Check Your Site Speed: 5 Things You Need to Know About the Google User Experience Report

You’ve done your keyword research, your site architecture is clear and easy to navigate, and you’re giving users really obvious signals about how and why they should convert. But for some reason, conversion rates are the lowest they’ve ever been, and your rankings in Google are getting worse and worse.

You have two things in the back of your mind. First, recently a customer told your support team that the site was very slow to load. Second, Google has said that it is using site speed as part of how rankings are calculated.

It’s a common issue, and one of the biggest problems with site speed is that it is so hard to prove it’s making the difference. We often have little-to-no power to impact site speed (apart from sacrificing those juicy tracking snippets and all that content we fought so hard to add in the first place). Even worse, some fundamental speed improvements can be a huge undertaking, regardless of the size of your dev team, so you need a really strong case to get changes made.

Sure, Google has the site speed impact calculator, which gives an estimate of how much revenue you could be losing by loading more slowly, and if that gives you enough to make your case – great! Crack on. Chances are, though, that isn’t enough. A person could raise all kinds of objections, for instance:

  1. That’s not real-world data

    1. That tool is trying to access the site from one place in the world, our users live elsewhere so it will load faster for them
    2. We have no idea how the tool is trying to load our site, our users are using browsers to access our content, they will see different behaviour
  2. That tool doesn’t know our industry
  3. The site seems pretty fast to me
  4. The ranking/conversion/money problems started over the last few months – there’s no evidence that site speed got worse over that time.

Tools like these are fantastic but are usually constrained to accessing your site from a handful of locations

Pretty much any site speed checker will run into some combination of the above objections. Say we pick one of these tools (which wouldn’t be a bad choice): we give it a URL, an automated system accesses our site, tests how long it takes to load, and reports back. As I say, not a bad choice, but it’s very hard to test accessing our site from everywhere our users are, using the browsers they are using, with historic data that was being recorded even when everything was hunky-dory and site speed was far from our minds, and with comparable data for our competitors.

Or is it?

Enter the Chrome User Experience (CRUX) report

In October 2017, Google released the Chrome User Experience report. The clue is in the name – this is anonymised, domain-by-domain, country-by-country site speed data that Google has been recording from real-life Chrome users since October 2017. The data only includes records from Chrome users who have opted into syncing browser history and have usage statistic reporting enabled, but many will have this on by default (see Google post). So this resource offers you real-world data on how fast your site is.

That brings us to the first thing you should know about the CRUX report.

1. What site speed data does the Chrome User Experience report contain?

In the simplest terms, the CRUX report gives recordings of how long it took your webpages to load. But loading isn’t on-off: even if you’re not familiar with web development, you will have noticed that when you ask for a web page, it thinks a bit, some of the content appears, maybe the page shuffles around a bit, and eventually everything falls into place.

Example of a graph showing performance for a site across different metrics. Read on to understand the data and why it’s presented this way.

There are loads of reasons that different parts of that process could be slower, which means that getting recordings for different page load milestones can help us work out what needs work.

Google’s Chrome User Experience report gives readings for a few important stages of webpage load. They have given definitions here, but I’ve also written some out below:

  • First Input Delay

    • This one is more experimental: it’s the length of time between a user clicking a button and the site registering the click
    • If this is slow the user might think the screen is frozen
  • First Paint

    • The first time anything is painted on the page; if this is slow, the user will be left looking at a blank screen
  • First Contentful Paint

    • Similar to first paint, this is the first time any user-visible content is loaded onto the screen (i.e. text or images).
    • As with First Paint, if this is slow the user will be waiting, looking at a blank screen
  • DOM Content Loaded

    • This is when all the HTML has been loaded. According to Google, it doesn’t cover CSS and all images, but by and large once you reach this point the page should be usable – it’s quite an important milestone.
    • If this is slow the user will probably be waiting for content to appear on the page, piece by piece.
  • Onload

    • This is the last milestone and potentially a bit misleading. A page hits Onload when all the initial content has finished loading, which could lead you to believe users will be waiting for Onload. However, many web pages can be quite operational, as the Emperor would say, before Onload. Users might not even notice that the page hasn’t reached Onload.
    • To what extent Onload is a factor in Google ranking calculations is another question but in terms of User Experience I would prioritise the milestones before this.

All of that data is broken down by:

  • Domain (called ‘origin’)
  • Country
  • Device – desktop, tablet, mobile (called ‘client’)
  • Connection speed

So for example, you could see data for just visitors to your site, from Korea, on desktop, with a slow connection speed.

2. How can I access the Chrome User Experience report?

There are two main ways you can access Google’s Chrome user site speed data. The way I strongly recommend is getting it out using BigQuery, either by yourself or with the help of a responsible adult.


If you don’t know what BigQuery is, it’s a way of storing and accessing huge sets of data. You will need to use SQL to get the data out but that doesn’t mean you need to be able to write SQL. This tutorial from Paul Calvano is phenomenal and comes with a bunch of copy-paste code you can use to get some results. When you’re using BigQuery, you’ll ask for certain data, for instance, “give me how fast my domain and these two competitors reach First Contentful Paint”. Then you should be able to save that straight to Google Sheets or a csv file to play around with (also well demonstrated by Paul).
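If you’d like a feel for the kind of request involved, here’s a minimal sketch in Python that just builds the SQL string. The dataset and column names (`chrome-ux-report.country_us`, `first_contentful_paint.histogram.bin`) follow the public CRUX schema as I understand it, so double-check them against Paul’s tutorial before running anything against BigQuery:

```python
# Sketch of a CRUX BigQuery query: share of page loads reaching
# First Contentful Paint within 1 second, for one origin and country.
# Dataset/table/column names assume the public chrome-ux-report schema.

def build_fcp_query(origin: str, country: str, yyyymm: str) -> str:
    """Build a query summing the density of FCP bins starting under 1000 ms."""
    return f"""
SELECT
  SUM(fcp.density) AS fast_fcp_share
FROM
  `chrome-ux-report.country_{country}.{yyyymm}`,
  UNNEST(first_contentful_paint.histogram.bin) AS fcp
WHERE
  origin = '{origin}'
  AND fcp.start < 1000
""".strip()

query = build_fcp_query("https://www.example.com", "us", "201901")
print(query)
```

You’d paste the resulting query into the BigQuery console (or run it through a client library), then export the result to Google Sheets or a CSV as described above.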


The other, easier option, which I actually recommend against, is the CRUX Data Studio dashboard. On the surface, this is a fantastic way to get site speed data over time. Unfortunately, there are a couple of key gotchas to watch out for. As you can see in the screenshot below, the dashboard will give you a readout of how often your site was Fast, Average, or Slow to reach each loading point. That is actually a pretty effective way to display the data over time for a quick benchmark of performance. One thing to watch out for with Fast, Average, and Slow is that the documented thresholds for each aren’t quite right.

If you compare the percentages of Fast, Average, and Slow in that report with the data direct from BigQuery, they don’t line up. It’s an understandable documentation slip, but please don’t use those numbers without checking them. I’ve chatted with the team and submitted a bug report on the GitHub repo for this tool. I’ve also listed the true definitions below, in case you want to use Google’s report despite the compromises, or use the Fast, Average, and Slow categorisations in the reports you create (as I say, it’s a good way to present the data). The link to generate one of these reports is

Another issue is that it uses the “all” dataset – meaning data from every country in the world. That means data from US users is going to be influenced by data from Australian users. It’s an understandable choice given that this report is free, easily generated, and probably took a bunch of time to put together, but it’s taking us further away from the real-world data we were looking for. We can be certain that internet speeds vary quite a lot between countries (for instance, South Korea is well known for having very fast internet speeds), but also that expectations of performance could vary by country as well. You don’t care if your site speed looks better than your competitor’s because you’re combining countries in a convenient way; you care if your site is fast enough to make you money. By accessing the report through BigQuery, we can select data from just the country we’re interested in and get a more accurate view.

The final big problem with the Data Studio dashboard is that it lumps desktop results in with mobile and tablet. That means that, even looking at one site over time, it could look like your site speed has taken a major hit one month just because you happened to have more users on a slower connection that month. It doesn’t matter whether desktop users tend to load your pages faster than mobile, or vice versa – if your site speed dashboard can make it look like your site speed is drastically better or worse because you’ve started a Facebook advertising campaign, that’s not a useful dashboard.

The problems get even worse if you’re trying to compare two domains using this dashboard – one might naturally have more mobile traffic than the other, for example. It’s not a direct comparison and could actually be quite misleading. I’ve included a solution to this in the section below, but it will only work if you’re accessing the data with BigQuery.

Wondering why the Data Studio dashboard reports % of Fast, Average, and Slow, rather than just how long it takes your site to reach a certain load point? Read the next section!

3. Why doesn’t the CRUX report give me one number for load times?

This is important – your website does not have one amount of time that it takes to load a page. I’m not talking about the difference between First Paint or Dom Content Loaded, those numbers will of course be different. I’m talking about the differences within each metric every single time someone accesses a page.

It could take 3 seconds for someone in Tallahassee to reach DOM Content Loaded and 2 seconds for someone in London. Then another person in London loads the page on a different connection type, and DOM Content Loaded takes 1.5 seconds. Then another person in London loads the page while the server is under more stress, and it takes 4 seconds. The amount of time it takes to load a page looks less like this:


Median result from

And more like this:

Distribution of load times for different page load milestones

That chart is showing a distribution of load times. Looking at it, you could say that 95% of the time, the site reaches DOM Content Loaded in under 8 seconds. You could also look at the peak and say it most commonly loads in around 1.7 seconds. Or you might spot a strange peak at around 5 seconds and realise that something is intermittently going wrong which sometimes makes the site take much longer to load.

So you see, saying “our site loads in X seconds; it used to load in Y seconds” can be useful when you’re trying to deliver a clear number to someone who doesn’t have time for the finer points, but it’s important for you to understand that performance isn’t constant and your site is being judged by what it tends to do, not what it does under sterile testing conditions.
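To make the distribution idea concrete, here’s a small sketch (with made-up bin densities, in the general shape CRUX histograms take) of how you might turn a binned distribution into a “share of loads under X seconds” figure:

```python
# Made-up histogram for one milestone: each bin is (start_sec, end_sec, density),
# where density is the share of page loads falling in that bin.
bins = [
    (0.0, 1.0, 0.30),
    (1.0, 2.0, 0.35),
    (2.0, 4.0, 0.20),
    (4.0, 8.0, 0.10),
    (8.0, None, 0.05),  # open-ended final bin: loads slower than 8 s
]

def share_under(threshold_sec, bins):
    """Share of loads falling in bins that end at or before the threshold."""
    return sum(d for start, end, d in bins if end is not None and end <= threshold_sec)

print(f"{share_under(8.0, bins):.0%} of loads hit the milestone in under 8 seconds")
print(f"{share_under(2.0, bins):.0%} of loads hit it in under 2 seconds")
```

The same handful of bins supports both the “95% under 8 seconds” style of statement and a look at where the bulk of the distribution sits.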

4. What limitations are there in the Chrome User Experience report?

This data is fantastic (in case you hadn’t picked up before, I’m all for it) but there are certain limitations you need to bear in mind.

No raw numbers

The Chrome User Experience report will give us data on any domain contained in the data set. You don’t have to prove you own the site to look it up. That is fantastic data, but it’s also quite understandable that they can’t get away with giving actual numbers. If they did, it would take approximately 2 seconds for an SEO to sum all the numbers together and start getting monthly traffic estimates for all of their competitors.

As a result, all of the data comes as a percentage of the total for the month, expressed as decimals. A good sense check when you’re working with this data is that all of your categories should add up to 1 (or 100%), unless you’re deliberately ignoring some of the data and know the caveats.
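That sense check is easy to automate. A sketch, assuming you’ve pulled the density values for one origin and month into a list (these numbers are made up):

```python
# Densities for one origin/month pulled from CRUX (made-up values).
densities = [0.102, 0.354, 0.281, 0.163, 0.100]

total = sum(densities)
# Allow a little slack for floating-point error and rounding in the data.
assert abs(total - 1.0) < 0.01, f"Densities sum to {total:.3f} - check your filters"
print(f"OK: densities sum to {total:.3f}")
```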

Domain-level data only

The data available from BigQuery is domain-level only; we can’t break it down page by page, which means we can’t find the individual pages that load particularly slowly. Once you have confirmed you might have a problem, you could use a tool like Sitebulb to test page load times en masse and get an idea of which pages on your site are the worst culprits.

No data at all when there isn’t much data

There will be some sites which don’t appear in some of the territory data sets, or at all. That’s because Google hasn’t added their data to the dataset, potentially because they don’t get enough traffic.

Losing data for the worst load times

This data set is unlikely to be effective at telling you about very, very long load times. If you point a speed-testing tool at a page on your site, it’ll sit and wait until that page has totally finished loading, then tell you what happened.

When a user accesses a page on your site, there are all kinds of reasons they might not let it load fully. They might see the button they want to click early on and click it before too much has happened; if it’s taking a very long time, they might give up altogether.

This means that the CRUX data is a bit unbalanced – the further we look along the “load time” axis, the less likely it is that it’ll include representative data. Fortunately, it’s quite unlikely your site will be returning mostly fast load times and then a bunch of very slow load times. If performance is bad, the whole distribution will likely shift towards the bad end of the scale.

The team at Google have confirmed that if a user doesn’t reach a milestone at all (for instance Onload), the recording for that milestone is thrown out, but they won’t throw out the readings for every milestone in that load. So, for example, if the user clicks away before Onload, Onload won’t be recorded at all, but if they reached DOM Content Loaded, that will be recorded.

Combining stats for different devices

As I mentioned above, one problem with the CRUX report is that all of the data is reported as a percentage of all requests.

So, for instance, it might report that 10% of requests reached First Paint in 0.1 seconds. The problem is that response times are likely different for desktop and mobile – different connection speeds, processor power, probably even different content on the page. But desktop and mobile are lumped together for each domain in each month, which means that a difference in the proportion of mobile users between domains, or between months, can make site speed look better when it’s actually worse, or vice versa.

This is as much a problem when we’re accessing the data through BigQuery as it is if we use the auto-generated Data Studio report, but there’s a solution if we’re working with the BigQuery data. This can be a bit of a noodle-boiler, so let’s look at a table.

Device Response time (seconds) % of total
Phone 0.1 10
Desktop 0.1 20
Phone 0.2 50
Desktop 0.2 20

In the data above, 10% of total responses were for mobile, and returned a response in 0.1 seconds. 20% of responses were on desktop and returned a response in 0.1 seconds.

If we summed that all together, we would say that 30% of the time, our site gave a response in 0.1 seconds. But that’s thrown off by the fact that we’re combining desktop and mobile, which perform differently. Say we decide we are only going to look at desktop responses. If we just remove the mobile data (below), we see that, on desktop, we’re equally likely to give a response at 0.1 and at 0.2 seconds. So actually, for desktop users, there’s a 50/50 chance. Quite different to the 30% we got when combining the two.

Device Response time (seconds) % of total
Desktop 0.1 20
Desktop 0.2 20

Fortunately, this sense check also provides our solution: we need to calculate each of these percentages as a proportion of the overall volume for that device. While it’s fiddly and a bit mind-bending, it’s quite achievable. Here are the steps:

  1. Get all the data for the domain, for the month, including all devices.
  2. Sum together the total % of responses for each device, if doing this in Excel or Google Sheets, a pivot table will do this for you just fine.
  3. For each row of your original data, divide the % of total by the total amount for that device, as in the example below

Percent by device

Device % of total
Desktop 40
Phone 60

Original data with adjusted volume

Device Response time (seconds) % of total Device % (from table above) Adjusted % of total
Phone 0.1 10 60 10% / 60% = 16.7%
Desktop 0.1 20 40 20% / 40% = 50%
Phone 0.2 50 60 50% / 60% = 83.3%
Desktop 0.2 20 40 20% / 40% = 50%
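The steps above can be sketched in a few lines of Python, using the same numbers as the tables:

```python
from collections import defaultdict

# (device, response time in seconds, % of total) - the table from this section.
rows = [
    ("Phone",   0.1, 10),
    ("Desktop", 0.1, 20),
    ("Phone",   0.2, 50),
    ("Desktop", 0.2, 20),
]

# Step 2: total % of responses per device (a pivot table would do this too).
device_totals = defaultdict(float)
for device, _, pct in rows:
    device_totals[device] += pct

# Step 3: re-express each row as a share of its own device's traffic.
adjusted = [
    (device, secs, 100 * pct / device_totals[device])
    for device, secs, pct in rows
]

for device, secs, adj in adjusted:
    print(f"{device}: {adj:.1f}% of responses in {secs}s")
```

The adjusted figures match the table: desktop responses split 50/50 between 0.1s and 0.2s once mobile is factored out, and phone responses split 16.7%/83.3%.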

5. How should I present Chrome User Experience site speed data?

Because none of the milestones in the Chrome User Experience report have one number as an answer, it can be a challenge to visualise more than a small cross section of the data. Here are some visualisation types that I’ve found useful.

% of responses within “Fast”, “Average”, and “Slow” thresholds

As I mention above, the CRUX team have hit on a good way of displaying performance for these milestones over time. The automatic Data Studio dashboard shows the proportion in each category over time, which gives you a way to see whether a slowdown is the result of being Average or Slow more often, for example. Trying to visualise more than one of the milestones on one graph becomes a bit messy, so I’ve found myself splitting out Fast and Average so I can chart multiple milestones on one graph.

In the graph above, it looks like there isn’t a line for First Paint, but that’s because the data is almost identical for that and First Contentful Paint.

I’ve also used the Fast, Average, and Slow buckets to compare a few different sites during the same time period, to get a competitive overview.

Comparing competitors “Fast” responses by metric

An alternative, which Paul Calvano demonstrates so well, is histograms. These help you see how distributions break down. The Fast, Average, and Slow bandings can hide some sins, in that movement within those bands will still impact user experience. Histograms can also give you an idea of where you might be falling down in comparison to others or to your past performance, and could help you identify things like inconsistent site performance. It can be difficult to understand a graph with more than a couple of time periods or domains on it at the same time, though.

I’m sure there are many other (perhaps better) ways to display this data, so feel free to have a play around. The main thing to bear in mind is that there are so many facets to this data that it’s necessary to simplify it in some way, otherwise we just won’t be able to make sense of it on a graph.

What do you think?

Hopefully, this post gives you some ideas about how you could use the Chrome User Experience report to identify whether you should improve your site speed. Do you have any thoughts? Anything you think I’ve missed? Let me know in the comments!

If this has inspired you to dig into your site speed page-by-page, my colleague Meagan Sievers has written a post explaining how to use the Google Page Speed API and Google Sheets to bulk test pages. Happy testing.

Bonus – what are the actual thresholds in the CRUX Data Studio report?

As mentioned above, the thresholds in the CRUX Data Studio report aren’t 100% correct. I have submitted a GitHub issue, but here are the updated thresholds.

How to Check Your Site Speed: 5 Things You Need to Know About the Google User Experience Report was originally posted by Video And Blog Marketing

How to Avoid the Wrong Pages from Ranking Using These Strategies


Imagine this scenario: you have published a well-crafted landing page that provides the quality content and authoritativeness Google is looking for, containing the keyword that you want it to rank for. After a while, you check the SERPs to see if your website is ranking for it, and then you see it: a page from your site is showing up in the results, but it’s the wrong one. This is a pressing issue that SEOs face from time to time, where pages rank for the wrong keywords, leading to a loss of traffic for the intended pages and to users landing on pages whose content isn’t related to what they searched for in the first place.

The goal of using the correct keyword is for you to establish your search traffic around that keyword, allowing you to rank on Google, and establish your website as an authoritative source. This is why having the wrong page ranking in SERPs would just hinder you from reaching that goal.

This scenario can negatively affect your rankings and traffic, making any good SEO strategy less effective without the proper fixes. It is also worth noting that not all scenarios are alike, as there might be other factors in play that require a different solution. With that in mind, here’s how to keep your pages from ranking for the wrong keywords using some helpful tips.

Analyze the problem

Before formulating a solution to this problem, there are a few more questions to ask to diagnose what type of problem you have, so you can take the appropriate action. Here are the questions that you need to answer:

Does your site have content related to the keyword?

If yes:

This is the first question you should ask, as this tends to be the most common issue encountered by a lot of SEOs. It is also usually the least challenging to fix, as you already have content that is ready to be optimized. The best approach is to make sure that the related content is properly optimized. This means observing proper keyword placement and enhancing the content with additional information that improves its quality, allowing it to generate more traffic.

Optimizing the landing page is also important: you may need to reduce the number of related keywords (or replace them completely) so that it stops ranking in the wrong SERPs. Adding keywords is another option, but this might affect your keyword density, which can be another problem altogether. Taking a look at your links also matters, as you may need to adjust certain links, especially those from pages associated with the unrelated keyword.

If no:

If you do not have content related to the keyword, then it is best to create some around it. Crafting a new landing page is a good start, as it gives users another way to find your website while providing the quality content they need. After publishing the landing page, creating blog posts around the keyword establishes it as an integral part of your website, while also helping your traffic grow even further.

Are your webpages properly optimized for searchability?

Another possible cause is that your webpages are not being crawled and indexed properly, leading to inconsistent results and poor rankings. The best way to diagnose this is with crawling tools like Screaming Frog, which let you see which pages are being crawled and indexed by Google.

Screaming Frog

This approach comes in handy when doing on-site SEO, as it lets you remedy the problem more efficiently, and even prevent the issue from happening in the first place. If it has already happened, use the tool to identify the affected pages and make sure they can be crawled and indexed.

Did you check the links?

Links play a role in generating the traffic that helps your rankings grow. If your webpages are not ranking for the right keywords, links might also be a factor. This can be due to poor link building strategies and internal linking, which can greatly affect traffic and link juice. It is best to re-evaluate your link building strategy and utilize quality links from reputable sources, as that is the first step in establishing content quality and authority.

Proper internal linking is also crucial: linking related content to your webpage helps indicate which keyword you want it to rank for. Link building mistakes can be the major factor causing the wrong keywords to rank, which is why it’s important to analyze the links in your content before deciding on a solution. Redirection is another option, but it is best reserved for worst-case scenarios, as redirects can affect loading speed and traffic if not done right.

Key Takeaway

Having your webpages ranking for the wrong keywords is a big issue that can negatively affect your traffic, but with the right strategy, you can help re-establish your search rankings, while also being able to craft new content and utilize new keywords to expand your search presence.

If you have questions about content optimization and SEO in general, leave a comment below and let’s talk.

How to Avoid the Wrong Pages from Ranking Using These Strategies was originally posted by Video And Blog Marketing

Local SEO for Multi-Location Businesses: How to Guide


SEO is important for your business if you want your site to be discovered on the web. It’s also a great way to market yourself and gain site traffic. But when your business has multiple locations (like franchises), the SEO strategy works a little differently.

To get visibility and business from locals, you’ll want to optimize what’s called your local SEO. Local SEO is different from regular SEO because it takes into consideration your local area.

For instance, when a customer searches for something that has local intent or identifies a specific location (say a city or county), search engines know they’re looking for a local business within a certain area or radius.

In fact, approximately 70% of customers visit a store based on information they find online, which means you need to make your locations search friendly so customers can visit them.


In this article, I’ll go over some tips to help you optimize your local SEO so all your locations get the visibility and attention they deserve from people searching locally.

Create a Separate Landing Page for Each Location

While it’s acceptable to have one domain for all locations, you’ll still need to take a few steps to optimize local SEO for each location so it can be found in local SERPs.

Often, business owners create just one master landing page for all their locations to share. While that may seem efficient (at least time-wise), it doesn’t do much in the way of optimizing your chances of popping up during local searches.

The compromise is to have one website but allocate a different page to each business location. Each page should have information specific to that location, including your business name, phone number, and address.

And then you can take it a step further to add other customized content specific to that location, like testimonials; names, faces, and titles of staff; and any updates pertaining to that location.

Also, it’s helpful to embed a Google map on each location’s page to show exactly where it sits in relation to area streets and landmarks.


On the back end of each page, make sure you create title tags and meta descriptions that are location specific. For instance, you might put “San Francisco Hair Salon” in some form in both so the chances of showing up in local queries are higher.
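As a rough sketch, the head of a hypothetical salon’s San Francisco location page might look like this (the business name and wording are made up for illustration):

```html
<head>
  <!-- Location-specific title tag: city + service + brand -->
  <title>San Francisco Hair Salon | Shear Bliss – Union Square</title>
  <!-- Location-specific meta description with local cues -->
  <meta name="description"
        content="Shear Bliss hair salon in San Francisco's Union Square. Cuts, color, and styling. Book online or call us today.">
</head>
```

Each location page would get its own variation of these tags, swapping in that location’s city and neighborhood.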

Lastly, you want to make sure each of these pages can be discovered by Google.

Why Google?

Because Google handles approximately 60% of search queries, making it the most heavily used search engine out there.

So how can you make sure it knows your site exists? By submitting your sitemap. Simply put, a sitemap is an HTML or XML file that lists all the URLs your website has, along with details like the date you last updated each page and its metadata.

An XML sitemap lets search engines go through, or crawl, your website and take notes on each of your pages so they can feed the information they’ve gathered into the right search results. It’s basically a map of your website.

HTML sitemaps, on the other hand, help users quickly find the content they’re looking for when they visit your site. It’s pretty much an outline version of your site.
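For illustration, a minimal XML sitemap covering two hypothetical location pages (the URLs are made up) could look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One entry per location landing page -->
  <url>
    <loc>https://www.example-salon.com/locations/san-francisco</loc>
    <lastmod>2019-03-01</lastmod>
  </url>
  <url>
    <loc>https://www.example-salon.com/locations/oakland</loc>
    <lastmod>2019-02-15</lastmod>
  </url>
</urlset>
```

Once the file is live (commonly at /sitemap.xml), you can submit it to Google through Search Console.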

Use Google My Business (GMB)

Google My Business is a Google tool that helps you get on Google’s apps and search results. Again, because this is the search engine that fields the most number of queries, we’re focusing on it first.

But it’s also a good idea to look into how you display on other popular search engines like Bing and Yahoo and optimize your searchability on those as well.

On Google platforms like Search and Maps, your business profile will appear anytime your business shows up. Google My Business lets you create an individual business listing for each location so each location page can show up uniquely during searches.

Most importantly, make sure you follow the steps to verify each of your locations on Google My Business. To do this, you’ll have to select the location you want to verify and click “Verify now.”

This will confirm the accuracy of your business information to Google and let you start showing up in queries.

Next, add the hours for each location.

Finally, make sure each of your locations displays the same information across the web. The name, address and phone number for each location should be the same in all listings where the location shows up.

For instance, Gap shouldn’t show up as Gap in one online directory but Gap Inc. in another. Also, make sure your hours of operation are listed accurately and identically every place your location appears on the web.

Variations are a big deal to Google.

If Google detects inconsistencies from different sources on the web, this will affect your success in the SEO department.


Finding out whether or not your locations display the same information across the board can be challenging. If you’re not sure about your level of consistency across the web, you can use tools like Yext and Moz Local.

These tools scour the internet for mentions of your business and make all the information pertaining to it uniform, based on the details you provide. They’ll change Gap to Gap Inc. if the information you give them says Gap Inc. is correct. Or they’ll correct your hours of operation based on what you say.

The only drawback is that you have to remain subscribed to these services for as long as you want to continue seeing results. The moment you unsubscribe, they’ll revert all your changes so the information goes back to how it was originally displayed.

Manage Online Reviews for Each Location

Now that you’ve got everything set up on Google, it’s time to tackle another key step in the process: managing the reviews for each location (this should also be done on Yelp).

Why is this important?

Because when your location shows up in Google’s results, your reviews will display immediately beneath your business name.

But more importantly, potential customers make decisions based on these reviews. For example, the average customer will read approximately seven reviews before they form an opinion about any business or company.

That’s why encouraging and showing appreciation for positive reviews and using strategic tactics to resolve negative ones will work in your favor. If people see you’re active on reviews, they’ll be more motivated to leave you one, which can help your rating soar higher.

Bad reviews, on the other hand, can be tricky. If you receive one, which is pretty much inevitable, here are a few pointers to keep in mind:

Don’t Lash Out at the Customer or Fight Back

Reviews aren’t the place to get ugly. No matter what the customer says, acknowledge their feelings and apologize for the experience. Even better, try to offer to make it up to them.

Getting rude right back will only make you seem just as much at fault as them, except you have your business reputation to think of too.


Be Tactful and Positive

Fight negativity with positivity. In other words, don’t be negative right back to the client. Tell them you appreciated their visit, that you’re sorry for anything less than an exceptional experience, that you would like a chance to make it up to them, or that you’d like to collect more feedback offline.

These are all positive statements to a negative situation, which can potentially help turn things around in your favor.

Mind Your Language and Tone

People can pick up on verbal connotations even in writing. If you’re being sarcastic or insincere, they’ll be able to tell. Make sure you don’t use inappropriate or condescending language. Most of all, make sure your message sounds sincere and accepting, not defensive or rude.

Focus on Localized Link Building

It’s easy for you to boast about your own site, but it means more when others do the boasting for you. This is possible when other sites link to your site.

Backlinks are great because they boost your credibility with search engines and can help your local SEO. Ideally, you want to build links from community-specific websites or resources to your individual location pages, as this improves local signals and relevance.


Fortunately, there are several ways you can encourage people to backlink to you:

Sponsor an Event

When you pour money or resources into a local event, like one run by a college or school, you’re doing a good thing for them and for you. These institutions usually need extra funds. And when you sponsor them, they’ll usually acknowledge you on their site through an honorable mention, often linking to your website.

Host Events

Hosting local events is another way to encourage backlinks and boost your local SEO.

Say, for example, you hold an event at your office for senior citizens. You can ask the right places to help promote that event by listing its name and location on their sites.

For instance, you could approach and notify senior living communities, hospices, churches, etc. All of these places have an interest in promoting your event to their community. And in the process, a mention of your site or business on theirs will count favorably toward your local SEO efforts.

Look Into Other Neighborhood Sponsorships

Marathons and charities are everywhere. Engage with local events like these that are close to your business. You could sponsor these events or be a contributor in some way.

Not only will you be giving to a worthy cause, but you’ll also likely gain a mention on their website, which can positively impact your local rankings.

Track the Performance for Each Location

Once you follow all of these steps to optimize local SEO for your locations, your work isn’t over. Next, it’s time to step back and see if all those efforts are paying off the way they should.

Google My Business’s Insights feature can be a tremendous tool for this. Essentially this is a dashboard that gives you information on your individual pages and it’s particularly useful when you’re monitoring the site performance of several locations at once.


For instance, Insights will offer you a peek into things like the number of views your site had and information about searches that led target visitors to your site (like whether viewers found you on Google Search or Google Maps).

It also displays what actions users took when they visited your site (did they call you directly from your landing page, request directions to your location, or visit your site multiple times?).


Insights also shows you one month’s worth of queries so you can see which phrases and keywords people used to see your location pop up in their search results.

Plus, Insights lets you track the site performance of all your locations so you can compare and see which pages are performing well and which ones aren’t.

Using these comparisons, you can figure out what one page has that another doesn’t to help drive improvements.


SEO is great to get your business site to show up in any kind of search results. But Local SEO is what you want to tackle if you’re working toward gaining visibility and exposure for multiple business locations. Because local is what people will search for when they’re trying to physically pay a visit to a business.


The more of these tactics you use, like creating different pages for your locations, using Google My Business to optimize your web presence, keeping on top of reviews, and leveraging backlinks, the better the results you’ll see in terms of traffic.

With some diligence, patience and persistence, you can have each of your locations optimized for local SEO in no time at all.

Local SEO for Multi-Location Businesses: How to Guide was originally posted by Video And Blog Marketing

Email Marketing: How to engage with your current prospects and drive more traffic to your site

Before engaging in CRO ideas that drive sales opportunities deeper into the funnel, many travel brands first need to get customers from an email to their site. From our research into building a model of the customer’s mind, we’ve discovered a seemingly simple yet often overlooked email marketing tactic. (I’ve used three experiments from the travel industry as examples, but this discovery is broadly applicable to most industries.)

Move your main call-to-action to the top section of your email, and make sure it’s clear and prominent

Remember, the goal isn’t to get the vacation package booking within your email — that’s the website’s job. Your goal is to get the click. As marketers, we often want to throw the kitchen sink at our prospects to tip the odds in our favor. I mean, they are bound to click on something at that point … right?

Wrong! Keep the objective of your email focused, and construct your calls-to-action accordingly. Too many options can overwhelm or confuse your prospects, and when a prospect doesn’t know what to do, they tend to abandon the email altogether.

340% increase in CTR by adding prominent CTA

Take an experiment we helped a river cruise company run. Its email had many “clickable” areas driving to the landing page, but it lacked a clear call-to-action (CTA) to invite the prospective traveler to take action. Without a prominently emphasized CTA, it is difficult for a reader to quickly identify a primary objective of this email communication.

By adding a yellow “See Offer Details” button near the top of the email that didn’t previously exist, we saw a 340% increase in clickthrough rate (CTR).

17% increase in CTR by moving CTA button to the top

In a similar test, a vacation brand had an email that actually did have a clear call to action, but it was hidden under a paragraph of copy mid-way through the email. By moving the button to the top and pulling out the most important value claims for the vacation, this version resulted in a 17% increase in clickthrough rate.

43% increase in CTR from reducing number of CTAs

Finally, here’s another A/B test example that we ran with another vacation provider where we saw a 43% lift in clickthrough rate. We reduced the number of CTAs and focused it on the key actions we wanted the prospect to take. We also made them much more visible to a prospect who is skimming dozens of emails. We continued to refine this tactic in two additional follow-up tests to fully optimize the email, which continued to compound the lift in clickthrough rate.

Bottom Line: Make the CTA a no-brainer

In email, we must recognize that our prospects skim dozens of emails in their inbox for flight deals, cruises, hotels and vacation packages. If you’re skilled enough to get the “open” from a compelling subject line, make the next micro-yes a no-brainer for your prospect.

There are many complex things you can do with email marketing from a technological and personalization perspective. But before you dive into those, the lowest-hanging fruit may be to simply test the clarity of the email “ask.”

Related Resources

Email Messaging on-demand certification course (from MECLABS Institute, MarketingExperiments’ parent research organization) — Take this course to capture more subscribers, craft effective email copy and convert email clicks to sales

Optimizing Email Capture: 9-point checklist to grow your email marketing list by minimizing the perceived cost of opting in

Marketing Chart: How vacation booking methods are being considered and used

Email Marketing: How to engage with your current prospects and drive more traffic to your site was originally posted by Video And Blog Marketing

Link Building Mistakes and How to Fix Them

Link Building is a fundamental SEO practice that has been around for a long time. Despite a lot of people saying that it is no longer as effective as before, we believe that it is still a strong strategy that is able to help generate traffic and expand online presence.

There are many ways to create an effective link building strategy, with each website requiring you to adjust your tactics to accommodate its specific needs. However, mistakes still happen, and these can negatively impact your website, causing a loss in traffic and, in the most severe cases, getting websites penalized. It is best to avoid or fix these issues as soon as possible, and we’re here to help you do just that.

Link Buying or Selling

Link buying used to be a notorious practice during the 2000s, with some websites outright advertising this service. Their objective was to improve your Google rankings by selling you links. This was during a time when Google PageRank was a metric that everyone wanted to focus on, which resulted in an overflow of paid links across the internet.

Eventually, Google started to crack down on this practice, leading to these services being penalized. Link buying also became one of the reasons PageRank was shut down, as websites focused on the metric rather than crafting quality content that could generate organic traffic. While the practice may no longer be as blatant as before, a considerable number of websites and people are still trying to sell links.

Link buying is a black-hat strategy that should be avoided at all costs, as it can undo all of the traffic growth you have worked so long for. The best fix is simply to ignore any pitches that involve buying or selling links.

Poor or Generic Email Pitches

Having a good email pitch is a fundamental link building skill, as it is the deciding factor in whether webmasters will be interested enough to read and respond to it. From experience, I’ve seen countless emails with pitches that felt generic and templated, with barely anything to make you want to respond to the inquiry. Email pitches are the first step in building links, and a well-written email that leaves the reader invested in collaborating with you opens up more opportunities to build meaningful connections.

Personalizing your email outreach is another great way of strengthening your pitch, while also establishing good rapport from the get-go. Other great pitches are the ones that address a need of the site you are contacting or offer a service they would find useful. Answering a question is often one of the best ways to create a good email pitch that helps you establish that much-needed connection.

Link Spamming

Another notoriously common practice, link spamming usually happens in blog comments, guest posts, and online forums, where users type comments containing links to poor-quality websites or send in guest posts containing spammy links. These posts and comments are a common sight on a lot of websites and in email pitches, and this spammy content often follows recurring patterns that you can learn to recognize quickly, making it easier to avoid and fix.

Link Spam


Spammy anchor text is another issue, as this is where users commonly place a spammy or harmful link in a post or comment under the guise of promoting content. The best solution to link spamming is making sure these posts do not get published in the first place, as you can spot the bad links when you review the post. Tools like Disqus also help protect your website from link spam, as you can review, edit, and remove links that might be harmful to your site, making sure you’re not a victim of black-hat SEO.

Not establishing relationships with webmasters

Link building is not only a strategy that helps you promote and connect content to other websites; it is also a way of establishing a network and forming beneficial partnerships with other websites and services. This means link building is more than just making sure you link your content to other websites. Another great practice is supporting each other’s websites by subscribing to their content and responding to their social media posts.

Forming partnerships with other webmasters is essential, as you would be able to pitch more content to them while also getting promoted by them. This healthy relationship allows you to establish a network where you can continue building links, allowing your website to generate more traffic and gain more link juice. Good rapport brings in more benefits for both parties, fostering connections that can open up even more digital marketing opportunities.

Private Blog Networks

PBNs have been centers of controversy in the SEO industry, as people dispute whether they are a black-hat or grey-hat practice. Either way, building links through PBNs is best avoided at all costs, as they can cause your website to get penalized by Google, causing far more problems than they solve.


PBNs are for people looking for an easy way out, but they also increase the risk of getting punished and losing all of your traffic quickly. Ranking well and building links is a process that requires you to invest time and effort into growing your traffic organically, which means that any process telling you otherwise is not playing by the rules. Like other black-hat SEO strategies, a PBN is another example of taking shortcuts to gain traffic, at the risk of severe penalties from Google.

Key Takeaway

Like every other SEO strategy, link building requires a lot of time and effort to generate successful results. Mistakes will only hinder you from this goal, which is why knowing about them helps you keep your website and links from being compromised. Link building is a fundamental SEO strategy, and these pointers should help you optimize yours for more success.

If you have questions and inquiries about link building or SEO in general, leave a comment below and let’s talk.

Link Building Mistakes and How to Fix Them was originally posted by Video And Blog Marketing

Why You Should Use JSON-LD to Optimize Your Local SEO

Schema has become an integral part of optimizing your website for search, as it helps search engines have a better understanding of your website, allowing it to become more accessible for the users. When it comes to informative aspects of your website, adding schema markup provides more context for search engines for information such as location, contacts, and other crucial information that you want to provide to the user.

This makes schema crucial for business websites, especially with the expanding online market becoming increasingly competitive. With the wide variety of markup types available, you would be able to find the right one for your business and expand your search presence overall.

There are different types of schema available for your website, and when it comes to which one can help you rank better in Google SERPs, it has to be JSON-LD. With Local SEO making huge strides in growth, here’s why using JSON-LD is best for your local business, along with a few tips and steps on how to implement it.



JSON-LD (which stands for JavaScript Object Notation for Linked Data) is a form of schema markup that utilizes JavaScript for its implementation. Compared to other schema formats, JSON-LD is a simpler system that is easier to implement on a website. JSON-LD aims to list and annotate important information on your website, allowing search engines to understand it much better. One of the benefits of implementing JSON-LD is being able to optimize your featured snippets, which is very beneficial for local SEO, especially with users having a wide selection of choices.

Applying JSON-LD to your webpage


One of the best things about JSON-LD is that adding it to your website’s code is a simple process that mostly requires you to copy and paste the code into your document, which means users with little coding experience can implement it without much trouble. The most important part of implementation is identifying the right item types, like information about your business such as the address, the name of the owner, contact details, and much more.

Implementing the code into your documents starts by adding the tag <script type=’application/ld+json’>, which signals that structured data is embedded in the page itself.

Here’s an example of the schema being added into the website’s code:
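A minimal sketch of LocalBusiness markup for a hypothetical store (all names, URLs, and numbers are made up for illustration) might look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee House",
  "url": "https://www.example-coffee.com",
  "logo": "https://www.example-coffee.com/logo.png",
  "telephone": "+1-415-555-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Market St",
    "addressLocality": "San Francisco",
    "addressRegion": "CA",
    "postalCode": "94103"
  },
  "openingHours": "Mo-Fr 07:00-19:00"
}
</script>
```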


As you can see, JSON-LD has a property for each type of information on your webpage, allowing search engines to contextualize and surface this information in search results and featured snippets. In this case, pieces of information such as business hours, location, contact details, images, logos, and the type of website are listed accordingly, allowing Google to understand the page much better.

The simplicity of this method is that all you have to do is tag each piece of information with the appropriate properties, and you’re good to go. Website development and optimization have become simpler, with the latest WordPress being more accessible to users with little to no coding experience, and JSON-LD is another crucial process that has been streamlined with the user in mind. As standard procedure, it is best to validate your code by submitting it to the Google Structured Data Testing Tool before fully implementing it.

Why JSON-LD is best for your business website


JSON-LD is a versatile type of schema that allows search engines to understand your website’s information, and here are a few more reasons why it’s best for your business’ website.

It’s preferred by Google

Along with its relative accessibility and ease of use, JSON-LD is the schema format preferred by Google. That makes it a plus when you’re aiming to boost your search rankings while providing updated, correct information to users in the SERPs. Google’s John Mueller confirmed this when he answered a question about which type of schema Google prefers during a Webmaster Hangout session.

Your featured snippets look better

One of the benefits of using schema is having better search snippets, allowing you to provide more information to the user about your business. With the abundance of choice online, being able to provide quality information can sometimes be the difference in getting someone to make a purchase. Along with featured snippets, your SERP listings also look better and carry additional details, like this one below:

Schema SERP

This can give you an edge over the local SEO competition, as you have a more detailed snippet that provides more information than other brands’. Featured snippets have become a fundamental element in SEO, and optimizing your schema really helps.

Key Takeaway

Schema has become an integral part of website optimization, with search engines making use of it to further understand what users are looking for. With JSON-LD, you would be able to allow Google to do just that, while also optimizing your local SEO altogether.

If you have questions and inquiries about Local SEO and SEO in general, leave a comment below and let’s talk.

Why You Should Use JSON-LD to Optimize Your Local SEO was originally posted by Video And Blog Marketing

Google Launches March 2019 Broad Core Algorithm Update


Over the past few weeks, numerous SEOs have speculated that Google might have launched unannounced algorithm updates, especially during the early part of March. While ranking fluctuations were present, it wasn’t until Google’s latest announcement that a major algorithm update was confirmed.

As announced on Google Search Liaison’s Twitter account, a broad core algorithm update was launched on March 12, 2019, around the same time a major update was launched last year. Here are our thoughts on the latest update, and how it can affect your traffic and your strategies over the next few months.

At a glance

While Google has confirmed that an update has been launched, the details that they have provided are vague as to what changes have been made, and what type of websites will be affected. Due to last year’s Google Medic Update, many SEOs are now keeping track of any massive ranking fluctuations on various industry websites, as a majority of websites experienced massive traffic gains and drops during that time period.

Google Core Algorithm Announcement

The current algorithm update is one of Google’s routine updates that bring in specific changes, which can affect the rankings and traffic of websites. Additionally, Google stated that some of the changes aim to benefit pages that were “previously under-rewarded”, which indicates that a number of websites should experience a significant boost in their rankings and traffic.

Significant effects

While this update may not be as big as the Google Medic Update, we still saw significant boosts to some of the keywords that we are tracking. Although the update was only announced on March 12, we noticed significant fluctuations from the last week of February through the first week of March, which suggests that the update may already have been rolling out during that timeframe. Upon checking that period, one of the most significant results was seeing a keyword go from ranking in the hundreds to reaching the second and then the first page of SERPs.

March Core Algorithm Update SERP Ranking

A good number of our keywords have received boosts in their rankings, some even reaching the first page of Google seemingly in a matter of days. These websites come from a variety of industries, which suggests that the algorithm update does not target specific industries but instead rewards websites that have optimized their content to measure up to Google’s E-A-T standards.

Fixes and optimization

While the update helped numerous websites gain rankings, Google stated that there is no specific fix for pages that do not perform as well as other websites. This might confuse some SEOs looking to improve their rankings, as there is no single element they can directly point to and immediately remedy.

Despite the fact that no specific fix has been pointed out in relation to the update, it is still a viable strategy to continue improving your content in order to drive more traffic that can boost your rankings. It is also important to constantly monitor all of the websites you handle, especially if they span a variety of industries, as their ranking fluctuations can vary greatly from each other.

It is also worth looking into your websites that are not ranking well despite having quality content, as they might get a much-needed ranking boost thanks to the latest update. While more details about the update might emerge in the coming weeks, it is best to stick with your strategy if you experienced positive changes in your rankings, while continuing to look for new ways to improve.

Key Takeaway

Algorithm updates bring a host of new changes, some of which Google does not specify. This is why an update can greatly affect your site’s traffic and rankings, or have no effect at all. Google always finds new ways to update its algorithm, and if last year was any indicator, expect a slew of updates during the rest of the year.

If you have questions or inquiries about algorithm updates or SEO in general, leave a comment below and let’s talk.

Google Launches March 2019 Broad Core Algorithm Update was originally posted by Video And Blog Marketing

How to prevent your website from getting attacked by spammy and malicious links and recover


One of the worst things that could happen to any website you have put a lot of time and effort into optimizing is seeing it attacked by some form of negative SEO. Despite the many advancements in SEO and cybersecurity over the past few years, attacks still happen on a frequent basis, leading to numerous issues such as traffic loss, penalties from search engines, and even parts of your website no longer functioning.

As an SEO company, we consider it a crucial task to make sure this doesn’t happen to our websites, or to be able to solve the issue immediately. Negative SEO comes in many forms, all of which can inflict problems you wouldn’t want for any of your websites. One of the most notable examples is spammy and malicious links, which can appear seemingly out of nowhere and quickly ruin your traffic and content if not acted upon. With that in mind, it is best to find the strategies that will help you remove these links and recover your traffic before more damage is done.

Start by conducting a link audit

In our previous article, we discussed disavowing links, an action that must be taken with serious caution, as it can greatly affect your website’s traffic and content. In this case, disavowing is one of the most effective actions against spammy and malicious links. Before going to Google’s Disavow Links tool, the first step is to conduct a link audit and track all of the links that appeared during a certain time period.

Export Links

Ahrefs allows you to track and export your links into a file, which makes it easier to evaluate and mark each link, especially when the number of links reaches the hundreds or thousands.
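Once the export is in hand, a first triage pass can be automated. The sketch below is a minimal Python example, assuming a CSV export with a “Referring page URL” column (the column name, spam TLDs, and keywords are all assumptions to tune for your own audit); flagged URLs still need a human double-check before anything is disavowed.

```python
import csv  # needed only when reading a real export file
from urllib.parse import urlparse

# Hypothetical spam signals -- adjust these for your own audit.
SPAM_TLDS = (".xyz", ".top", ".icu")
SPAM_KEYWORDS = ("casino", "viagra", "payday")

def flag_suspicious(rows):
    """Return referring URLs that match simple spam heuristics."""
    flagged = []
    for row in rows:
        url = row["Referring page URL"]  # assumed Ahrefs column name
        host = urlparse(url.lower()).netloc
        if host.endswith(SPAM_TLDS) or any(k in url.lower() for k in SPAM_KEYWORDS):
            flagged.append(url)
    return flagged

# In practice, load the rows from your Ahrefs CSV export, e.g.:
# with open("backlinks.csv", newline="") as f:
#     rows = list(csv.DictReader(f))
rows = [
    {"Referring page URL": "https://spam-casino.xyz/page"},
    {"Referring page URL": "https://reputable-news.com/story"},
]
print(flag_suspicious(rows))  # only the spammy URL is flagged
```

Heuristics like these only narrow the list; the final call on each link remains a manual judgment.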


This process can take a lot of time depending on the number of links. However, it is best to allow enough time for it, as disavowing links is very difficult to undo, which means accidentally disavowing a good link is a recipe for disaster. Take this step by step and have your team double-check before applying the action.

It is also worth noting that disavowing links must only be done under the following conditions:

  • You were attacked by negative SEO – the case for this article; this can mean harmful links and other black hat techniques that involve links.
  • You received a manual link penalty from Google Search Console – this means Google has found links that need another look to see whether they should be disavowed.


The next step is to apply the action by disavowing all of the harmful links. This signals Google to ignore those links pointing to your website, protecting it from further harm. The process can take a few weeks, which makes it a lengthy process that must be monitored constantly.

Disavow Links Tool

You need to create a .txt file that contains the links that need to be disavowed and sign in to your Google Search Console account. The next step is to upload the file and select Disavow Links.
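The disavow file itself is a plain .txt file with one entry per line: individual URLs, `domain:` entries for entire domains, and `#` comment lines. A minimal example (the domains below are made up):

```text
# Spammy pages found in the March link audit
https://spam-casino.example/bad-page.html
# Disavow every link from this domain
domain:link-farm.example
```

Use `domain:` entries when a site is spammy throughout, and individual URLs when only specific pages are a problem.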

From then on, Google will process the file gradually. It is worth noting that if you have any additional links to disavow, all you need to do is reupload an updated file containing them.


The next step is to fix your website and recover any lost links, content, and traffic. Like disavowing links, this can take time, depending on the damage caused by the malicious attacks. Here, it is best to stick with your proven SEO strategies to recover your traffic.

It is best to start with content optimization to recover your search rankings. Stick to creating trustworthy and authoritative content, as that helps re-establish your presence in Google SERPs and creates pages that attract good links, which in turn pass more link juice.

Since you have lost links and disavowed unwanted ones, a solid link building strategy is also a must. Linking to authoritative sites, along with guest content from reputable sources, can help you gradually regain link juice and organic traffic.

While these may be standard optimization processes, going back to basics is the best step to take, as attacks can be so severe that you may have to start almost from scratch. The damage can vary from site to site, but the procedure for regaining your traffic should be similar, and you will be able to recover lost traffic.


Preventive measures must be taken once you have re-established your website, as you are still at risk of being attacked again. One of the first things you can do is monitor website comments, as comment sections are among the most common places where bad links appear. Suspicious reviews on Google My Business can also be areas of concern, and flagging them ASAP is the best step to take.

Regular site crawling also ensures that you can monitor issues within your website, which is why tools like Screaming Frog are a must in your arsenal. Manual actions in Google Search Console must also be acted upon immediately, as they can be another sign of an attack. The best remedy for any issue is immediate action, and regular monitoring and checks will allow you to sniff out anything that could harm your website.

Key Takeaway

Being attacked by spammy or malicious links is one of the worst things that could happen to your website. With the right strategy, along with careful and calculated processes, you will be able to recover your website from malicious and spammy links.

If you have questions and inquiries about negative SEO and SEO in general, leave a comment below and let’s talk.

How to prevent your website from getting attacked by spammy and malicious links and recover was originally posted by Video And Blog Marketing