Month: May 2019

Favicon Display Update: Everything You Need to Know and How to Properly Set It Up


Favicon Display under magnifying glass
It’s no secret that most users who search on Google do so on mobile devices, which is why Google has been constantly improving its mobile search experience. Compared to how it looked before, mobile search has seen notable changes – some for the better and some, arguably, for the worse. Does this most recent design update make the user’s mobile search experience better or worse? Let’s find out.

Google has been constantly testing new designs for mobile search over the years, and it recently started displaying favicons for all websites in the mobile SERPs. The primary reason Google rolled out this update is to bring a website’s branding in front of the user and help them better understand where the information comes from and whether the pages they visit have what they’re looking for. But this update also changes how search ads are displayed and poses potential problems for organic search results. Before we dive in deeper, let’s talk about favicons.

What is a Favicon?

A favicon is commonly associated with a website and is often regarded as its “logo” or “icon”. Historically, these small icons were, and still are, displayed in a browser’s tab to represent a website’s branding and help users identify sites useful to them. Here’s what it looks like in a browser’s tab:

favicon example in a browser tab

If your website doesn’t have a favicon, browsers and search engines will use a default or generic symbol in its place. If you want to improve your branding on the web, then having a favicon is a must.

How to Create a Favicon and Add It to Your Site

Creating a favicon is one of the simplest things you can do – especially if you already have a logo for your website. There are a variety of favicon generators available that let you create your favicon in a matter of seconds. However, it becomes a challenge when you don’t have your own logo to turn into a favicon.

After you’ve created your own favicon, adding it to your site is also an easy process. To simplify it into 2 steps:

  1. Upload the favicon.ico file to your desired folder
  2. Insert this code <link rel="shortcut icon" href="(link to the directory)/favicon.ico" /> in the <head> tag
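
The two steps above can be sketched as a tiny helper that builds the required tag; the function name and example URL here are hypothetical:

```python
def favicon_link_tag(directory_url):
    """Build the <link> tag from step 2 for a favicon.ico hosted
    in the given directory (hypothetical helper)."""
    return '<link rel="shortcut icon" href="%s/favicon.ico" />' % directory_url.rstrip("/")

# The printed tag belongs inside your page's <head> section.
print(favicon_link_tag("https://example.com/assets"))
```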

For a more comprehensive walkthrough of adding it to WordPress sites, here’s a guide.

Google’s Mobile Search Favicon Design Update

Google recently updated their mobile search design and they started showing a website’s favicon in the search results.

SEO Hacker favicon search results

Here’s a side-by-side comparison of the old look and the new one:

google favicon display update

Image from

It looks great. It helps users identify websites they know faster, and it helps us webmasters and SEOs spread awareness of our brand/website. Of course, this update has implications not just for us but also for marketers who don’t dabble in SEO, the businesses themselves, and many others focused on their website’s online presence.

An important thing to remember is that Google has guidelines for the favicons it displays in mobile search results, and even if everything in the guidelines is met, having your favicon displayed is not guaranteed. Here are Google’s favicon guidelines:

  • Both the favicon file and the home page must be crawlable by Google (that is, they cannot be blocked to Google).
  • Your favicon should be a visual representation of your website’s brand, in order to help users quickly identify your site when they scan through search results.
  • Your favicon should be a multiple of 48px square, for example: 48x48px, 96x96px, 144x144px and so on. SVG files, of course, do not have a specific size. Any valid favicon format is supported. Google will rescale your image to 16x16px for use in search results, so make sure that it looks good at that resolution. Note: do not provide a 16x16px favicon.
  • The favicon URL should be stable (don’t change the URL frequently).
  • Google will not show any favicon that it deems inappropriate, including pornography or hate symbols (for example, swastikas). If this type of imagery is discovered within a favicon, Google will replace it with a default icon.
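
The size guideline lends itself to a quick sanity check. This hypothetical helper only tests the “multiple of 48px square” rule (which automatically rules out 16x16px), not the other guidelines:

```python
def valid_favicon_size(width, height):
    """Check Google's size guideline: a square favicon whose side is a
    positive multiple of 48px (hypothetical helper). Note that 16x16
    fails automatically, since 16 is not a multiple of 48."""
    return width == height and width > 0 and width % 48 == 0

print(valid_favicon_size(96, 96))   # True: 96x96 is a multiple of 48
print(valid_favicon_size(16, 16))   # False: 16 is not a multiple of 48
print(valid_favicon_size(48, 96))   # False: not square
```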

Issues with the Favicon Display Update

This recent update isn’t without flaws. Alongside displaying favicons for organic mobile search results, Google also updated how ads look in mobile search, which a lot of SEOs have problems with. Here’s what it looks like:

google ads favicon display

The problem with how they changed the display for ads is that they used a dark pattern: the ad label sits where a brand icon would, so it looks like a regular favicon to the unsuspecting user. This makes it hard for normal users to distinguish between an organic search result and a sponsored ad. Conversely, if a webmaster wanted to confuse users, they could simply use a favicon that mimics the icon Google uses to mark a result as a sponsored ad.

Another potential problem: what if a webmaster copies an authoritative website’s favicon and uses it for their own site? Unsuspecting users could fall victim to this unethical move and enter the site thinking it’s the authoritative website they know.

Those are only some of the problems I’ve read and thought about. But since SEOs around the world are running experiments on this recent update, Google will learn which specific problems it has to mitigate or fix.

Key Takeaway

We’re all open to the idea of updating design schemes for the search results. However, Google isn’t a perfect company and search engine, and it’s up to us SEOs to find faults and issues with the updates they roll out. What do you think about this update? Let me know in the comments below and let’s talk.

Favicon Display Update: Everything You Need to Know and How to Properly Set It Up was originally posted by Video And Blog Marketing

SEO Trends 2019: The Latest & Most Current Tactics that Actually Work


SEO is more than a marketing fad. It is a necessity for online exposure in 2019. It is through search engine optimization that you stay competitive within your industry. There is a constant struggle for SEO supremacy. But the means by which you reach the peak of the search engine mountain varies year by year.

If you managed to claim first-page exposure in the SERPs, you can’t just rest on your laurels and expect to get by. One change to Google’s search algorithm could cast you down into obscurity overnight.

And make no mistake about it, Google’s algorithm changes often, and in extreme ways (think last year’s mobile-first index or previously RankBrain). If you’re not up to date on how it continues to evolve, you risk being left in the dust.

SEO is not something you just do one time and then stop. It is a living, breathing, evolving creature that must be constantly maintained. SEO in 2019 looks far different than SEO looked in 2009.

So, what are the current trends in SEO that are achieving results in 2019? Read on to find out!

User Intent

One of the keys to any SEO strategy in 2019 is to have a strong understanding of user intent. By that I mean you have to know what your audience is looking for.

For starters, you need to understand simple things like what platform your core audience is looking to use. Are they more desktop oriented? Do they like mobile platforms like cell phones and tablets? Would they prefer a regular website, accessed through a web browser, or are they looking for standalone applications?

What about the kind of content they want to see? Do they want text? Audio? Video? The way in which you primarily communicate with your core audience should match up to their wants and needs.

Likewise, understanding what your audience expects to see when they search is important. Specifically, you have to understand what questions they are most likely to ask platforms like Google.

You need to provide the answers to their questions in a way that is simple, effective, and relatable to the experience that they prefer.

Audience identity and intent are vital to your success, with or without SEO. Even perfectly optimized content is useless when it is designed with the wrong audience in mind. You might get a lot of eyes on your site, but none of them will convert into profitable ventures unless you customize the user experience for them.

On that front, Google wants your site to enhance the user journey. After all, when Google ranks your site highly on its platform it is essentially giving you its seal of approval and signing off on your content. Google is naming you an authority in your field, and they don’t do that lightly.

This should be at the forefront of your mind, both when it comes to designing your site as well as creating quality content. You have to keep the intent of the end user in mind as you write.

Another factor to keep in mind is the end user’s place in the sales funnel. It is your job to meet them where they are, not where you think they should be.

To that end, SEO is no longer all about matching keyword phrases. It has evolved and become more about answering questions and covering that topic completely.

That’s not to say that keywords are dead. Far from it!

Keyword research has evolved. That means 2019 marketers have to pay careful attention to search engine results pages to see how similar websites are ranking for the same keywords that you’re targeting.

Optimize Beyond Google

Since the dawn of SEO, everything has been about the almighty Google. That has changed in recent years. Google search continues to dominate the market, but it is not the only game in town.

No, we’re not talking about other search engines. Google has that on lockdown. While it’s nice to optimize for Bing and Yahoo, they have such a small percentage of the market, that they are not really a major factor.

Other platforms like Amazon and Apple’s App Store are becoming more important as users continue to rely more and more on e-commerce for their shopping needs.

If you’re an e-commerce business, you need to be focusing on Amazon. It gets the lion’s share of e-commerce traffic and, as we said in the last section, it is important to meet people where they are searching.

To that end, Amazon SEO and App searches are becoming far more prevalent. And optimizing content for those platforms is a whole other animal.

Amazon has its own proprietary algorithm called A9, and it takes a variety of information into account, including a heavier focus on conversions.

You also have to optimize your content based on the devices that your audience is using. That means optimizing for mobile devices, home assistants or audio devices, which are becoming increasingly popular.

How popular? Smart speaker use grew by 200% in the third quarter of 2018. This trend is expected to continue into 2022, where it is estimated that 55% of households will include some kind of smart speaker device. That is a 42% increase from today.

Voice search is rising fast, with customers now expecting answers the moment they ask them. That is part of the major shift in customer intent from search results to getting answers. As such, search queries are becoming more conversational in tone and have to be optimized that way.

Apply Structured Data

One of the key elements of modern SEO is to include structured data in the code of your website. That is because the search engine industry has begun to rely more on artificial intelligence. As the need for AI increases, so too does the need for structured data.

Structured data is code that speaks directly to search engines, showcasing who you are, what you do, and why you deserve to be ranked high. It is the key to moving forward with artificial intelligence.

Structured data makes it easier for artificial intelligence to take in all of your content and determine how your site works.

The faster the processing, the better it is.

Structured data is the key to determining the contextual relationships between the behaviors of your customers and the topics that they are searching for.

Structured data also gives search engines specific information that allows them to understand content that is based more on topical information and customer support.
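
To make this concrete: structured data is commonly embedded as JSON-LD. Here is a minimal sketch for a fictional organization; every value below is invented:

```python
import json

# A minimal JSON-LD snippet describing a fictional organization.
# Embedded in a <script type="application/ld+json"> tag, this is the
# kind of code that "speaks directly to search engines".
markup = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example SEO Agency",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/favicon-192.png",
}

print(json.dumps(markup, indent=2))
```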

Content is (Still) King

Content has always been the king when it comes to SEO. After all, optimized content is the name of the game. This is one thing that has not changed whatsoever.

In fact, optimized content has actually gotten even more important!

In a 2018 update to Google’s search algorithm, we saw a more intense focus given to the quality of a site’s content. Because of that, you want to provide depth in your quality, as opposed to just hamfistedly stuffing keywords into paragraphs.

Google placed a lot of attention on this specific issue in 2018, so it stands to reason that 2019 will be no different.

Ideally, you want to create the kind of high-quality content that will inspire links. One of the main ranking factors that go into determining SEO placement is the presence of backlinks. These are links that connect to your site placed on high value and relevant websites which Google has ranked highly. Consider it a letter of recommendation to Google. It is a respected site telling Google that it vouches for your information and your authority on the subject.

The best way to inspire such confidence on the part of other sites is to make sure that your content is stellar.

Your content needs to address and solve a relevant problem as opposed to being a random buffet of mashed together keywords. The content needs to motivate, move, and connect with your specific audience. Remember who you are writing for.

Focus on the Technical Aspects of SEO

Technical SEO has become much more important as websites continue to evolve. Websites are becoming far more advanced, and the technical functions of those sites are in turn becoming much more important.

For starters, Google rewards sites for speed. The faster your response time, the better impression you will make on the search engines. That means you shouldn’t sacrifice page speed for the sake of your site’s advanced features. One of the best ways to ensure that your SEO score won’t suffer is to make the site simpler and faster to impress Google.

Machine Learning

A huge feather in the cap of your SEO score would be a focus on machine learning. That’s because machine learning is becoming more common as artificial intelligence systems are rising in popularity.

Machine learning is already a powerful tool, and it hasn’t yet achieved the true limits of its power. As such, machine learning is set to be a major player in 2019 and beyond.

Furthermore, employing machine learning platforms throughout your digital marketing plan can play a big part in answering the question of search intent. Machine learning can be used to develop SEO content by helping you to better understand your core demographic.

Through machine learning, businesses can actually see where their approach succeeds and fails. It can determine where there are holes in your approach much more efficiently than human analysis.

Machine learning systems can accurately study your audience as a whole and determine not only what they are looking for but how you can best provide that to them.

Aim for Search Engine Results Page Features

Search engine results pages are the arena in which SEO campaigns succeed or fail. As such, you should study the layout of search engine results pages and try to ensure that your content is featured above others. One of the best ways to do that is to rank as a featured snippet.

Featured snippets are select Google results that display above the organic search results. It is a snippet of your content that is tossed up as the answer to the question asked by a user.

Featured snippets should be highly sought after as they are great for attracting new traffic. They are found on 12.29% of all search queries and siphon off a percentage of the clicks from the number one position.

To illustrate that point, search results with no featured snippet in place see the number one spot getting 26% of all clicks that come in for that search. When there is a snippet, the number one slotted result only gets 19.6% of the clicks.

Answer boxes and recipes are also big traffic items. Typically answer boxes will appear in a light gray box above the organic results, in an attempt to directly answer long-tail questions. Recipes are found for food-related searches and also appear prominently on a search engine results page.

Sometimes these boxes are known as “zero position” as they appear even before the first organic result.

They should be respected and coveted because of the exposure that comes with them. SEO is all about receiving exposure, and exposure is driven by visibility within the search engine results page.

In Conclusion

Familiarize yourself with these growing trends to ensure that your SEO efforts are not unseated in 2019 and the new years ahead. By continuing to monitor the industry and note changes to search algorithms, you will protect your SEO campaigns and maintain your well-earned spot on top.

SEO Trends 2019: The Latest & Most Current Tactics that Actually Work was originally posted by Video And Blog Marketing

PPC & SEO Synergy: How to Find Efficiencies Between the Two Channels

This post aims to cover a series of synergies between SEO and PPC that could help your business/clients run the two channels in a more efficient manner and optimise the overall spending.

This post is the first of a series of 3 articles: the last one will include a downloadable checklist.

Part one will cover some basic concepts that are important to reiterate to ensure we are all on the same page (whatever background you have, this post should be simple enough) and 3 synergy ideas that you could try yourself.

PPC and SEO have historically been seen as separate silos, no matter how much we try to think otherwise. We often see large companies struggle to promote synergies between departments, which ultimately impacts knowledge sharing in the industry. However, one thing should be really clear:

Online customers do not care what frictions may exist between different teams in your company, all they want is a seamless, simple user experience.

But why would we do so, Sam? I am already too busy in my normal job; I have no time to spend testing and experimenting on synergies which may well not be there or not worth my time. Yes, it does take time and patience to dig into this topic, but I can assure you that it will provide value to your digital activity – that is the key point I am trying to tackle with this post.

Hopefully, by the end of it, you will have some thoughts for what may work for you and pass them on to your team/business. Some of the activities we will discuss will provide clear monetary advantages that you could benefit from today if done correctly.

Before diving into the first 3 test ideas, let’s go over some basic concepts that will help you understand the rest of this post.

What is Google Ads’ quality score?

Google defines it as “an estimate of the quality of your ads, keywords and landing pages. Higher quality ads can lead to lower prices and better ad positions”.

The score goes from 1 to 10 (1 being the minimum and 10 the maximum) and is made up of 3 main elements: expected click-through rate, ad relevance and landing page experience.

The higher it is, the more relevant your ads and landing pages are to the user. It is also used by Google to evaluate your CPC (cost per click) and multiplied by your maximum bid to establish your ad rank in the ad auction process. As a consequence, a higher QS means a higher ROI (I think some of you might already know where this is going!).
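
The auction maths described above can be sketched in a deliberately simplified model; the real Ad Rank formula includes further signals, so treat the numbers as illustrative only:

```python
def ad_rank(max_bid, quality_score):
    """Simplified Ad Rank model: maximum bid multiplied by Quality Score.
    Google's real formula takes additional signals into account."""
    return max_bid * quality_score

# Two competing advertisers: a higher QS lets a lower bid
# win the better ad position.
print(ad_rank(2.00, 8))  # 16.0
print(ad_rank(3.00, 4))  # 12.0
```

Note how the higher Quality Score lets the lower bid win the better position – exactly the ROI argument made above.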

You can easily check your QS in the “Keyword Analysis” field of your Google Ads account and its components’ scores can be seen within 4 status columns: Qual. Score, Ad relevance, Expected click-through rate (CTR) and Landing page experience. For this post, I will focus mainly on the latter.

Example from Distilled’s Google Ads account

What is Google Ads’ landing page experience?

Google says it is their “measure of how well a website gives people what they’re looking for when they click your ad”. The experience you offer directly impacts your Ad Rank, which indirectly links to your CPC and position in the ad auction.

Pretty simple, right? Wrong. And here is why: Google confirms there are 5 elements you can work on to improve the landing page experience. Without going too much into details (this post is not just about landing page experience after all), let’s go through point by point and highlight a few simple considerations:

1) Offer relevant, useful and original content

Point 1 is nothing new – Google has been advocating that content is ‘king’ for years. This has been proven on a regular basis by Google’s latest major algorithm updates. The main challenge from a PPC point of view is the fact that we often use transactional category pages as landing pages for our ads, and these often display zero, thin, or not very relevant content. Something to think about!


2) Promote transparency and foster trustworthiness on your site

Point 2 clearly relates to what has been at the centre of attention since August ‘18 with the ‘Medic’ Algorithm update, where Google has been focusing heavily on EAT (expertise, authoritativeness, trustworthiness) of a site – read more on this from Google directly.

3) Make mobile and computer navigation easy

Point 3 relates to mobile-friendliness (in case you have been living under a rock, 2018 was the year of mobile-first indexing), information architecture and user-friendliness: needless to say, these have become important ‘indirect’ ranking factors nowadays. Check Tom Capper’s presentation on the topic here if you are interested in the subject.

4) Decrease your landing page loading time; 5) Make your site fast

Points 4 & 5 clearly relate to the concept of site speed. Without going down a page speed rabbit hole: Google has been assessing speed as a direct element of landing page experience, and why wouldn’t they? It makes a lot of sense: how awful would it be for a user if, after clicking on an ad, the landing page took a very long time to load? Chances are the user will bounce back to the SERP (the ‘pogo sticking’ pattern Google can evaluate – read more on it here), which tells Google that it is a ‘bad’ ad.

Without further ado, let’s start going through the checklist we created at Distilled to show you how you can make your PPC & SEO channels work harder together.

This post covers the first 3 synergy ideas of our checklist – the other synergies will be outlined in the following two posts.

Keyword research with SEO and PPC in mind: why synergy works better

The backbone of any PPC and SEO strategy is, without a doubt, keyword research, so why not try to find efficiencies from each other’s methodology?

From an SEO point of view, we tend to start our research by looking at a core group of keywords our client/site should be visible for and then expand on it, building what some of us call a keyword universe (yes, I said it, a bit of a buzzword these days). It is a pretty simple concept: by creating buckets of keywords we identify opportunities where we envision our site ranking in the near future, after an adequate amount of work on our end. Sounds familiar?

PPC Keyword Research

From a PPC standpoint, we have the great benefit of testing, in real time, what works and what does not! Google Ads provides a platform with tons of data we can use to experiment with which keywords to include in our ad groups, which helps us decide how to structure the whole campaign and use our budget efficiently. The approach tends to be slightly different: starting with broad keywords and moving to specific ones – the opposite of the SEO one.

By using broader keyword matches, we can test which keywords are bringing clicks & conversions and optimise our account accordingly, with the final goal of having as many exact match keywords as possible (highly targeted and cheaper).

Do you see where this is going? By sharing both lists between the departments/people in charge, there are clear benefits in analysing what has worked for one and what has worked for the other.

Quick recap: why is this worth it?

  • SEO/PPC might have done quite a bit of the keyword research on their own (we all know how dull this task can easily become) so there is no need to duplicate the task/spend too much time to do it again – save your company time and money by sharing the findings from both methodologies.
  • As the two approaches are different (SEO: narrow to broad vs PPC: broad to narrow), chances are that combining the data will provide keywords that a siloed approach would not have brought to the table.
  • SEO can use PPC to easily test certain keywords that have worked well and potentially implement them on metadata (more on this on following posts), page copy and so on.

PPC as a content gap tool

Remember what we said about PPC keyword research methodology? Broad to specific, as opposed to specific to broad in SEO. While running ads with broad matches you can learn a lot about your users and your site. In this particular instance, I am interested in highlighting content gap opportunities which PPC can bring to light.

As most of you might know, last year Google introduced a new exact match type (read this nice breakdown on Search Engine Land to know more) which changed things quite significantly. However, for our test we just need to focus on the least restrictive match types, Broad match & Broad match modifier. Such match types allow your ads to appear whenever a user’s search query includes any words in your key phrase, in any order, with the exception of the + sign (the modifier) which locks individual words – for a search query to trigger an ad it must include that word.

Let’s explain this with an example: imagine Distilled was running PPC ads for SearchLove London, our digital marketing conference based in London. For our keyword of choice, “seo conferences”, we could run the following match types:

| Match Type | Example keyword | Example search |
| --- | --- | --- |
| Broad match | seo conference | best marketing events 2019 |
| Broad match modifier | +seo conference | seo marketing events 2019 |
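
The match-type behaviour above can be roughly simulated. This sketch only checks the “+” rule of Broad match modifier and ignores Google’s close-variant matching, so treat it as illustrative:

```python
def broad_match_modifier_triggers(keyword, query):
    """Sketch of Broad match modifier behaviour: every word prefixed
    with '+' in the keyword must appear in the search query; words
    without '+' are left to Google's broader matching."""
    query_words = query.lower().split()
    required = [w.lstrip("+") for w in keyword.lower().split() if w.startswith("+")]
    return all(word in query_words for word in required)

print(broad_match_modifier_triggers("+seo conference", "seo marketing events 2019"))   # True
print(broad_match_modifier_triggers("+seo conference", "best marketing events 2019"))  # False
```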

How PPC broader matches can help SEO

Simply put, broader match types can match your selected keywords against the broadest possible searches – misspellings, synonyms, singular/plural forms, related searches, and other relevant variations. This exercise is extremely useful at the beginning of your PPC activity, when you are trying to figure out which keywords are driving clicks (and conversions) in order to then refine your campaigns with more specific match types – so why not use this ‘exploratory’ phase to look for potential content gaps?

By examining broad searches that are bringing clicks (even if few) to your account, you can immediately check them against your site’s content and offerings – when you see a ‘gap’, you can decide if it is worth exploring by creating a page/blog post/content accordingly. You will be surprised by the number of quick wins you might come across!

Let’s go back to our Distilled example: are users searching for ‘marketing conferences in 2019’? That gives me an idea to produce an article on this subject, talking about the best marketing conferences that are taking place in 2019, including SearchLove. My article will be topical, I know it will be relevant because I tested it with PPC and I can do a bit of self-promotion for Distilled’s event – isn’t this a win-win?

How to get started:

In case you struggle to get started, follow this process:

  1. Run, for a limited period of time, ads containing broader keyword match types: Broad match and Broad match modifier.
  2. Wait a week or so (the lower your budget, the longer you will need to wait to gather significant data) and start reviewing the list of search queries such match types are triggering.
  3. Download an SQR (search query report): identify your top opportunities and cross-reference them against your site. Ask questions, such as:
    1. Do I have a page that covers this topic/keyword?
    2. If not, is it worth creating one?
  4. Further test idea: once you have a list of search terms from running broad match for a period of time, you could pull them out into exact match keyword form and run them for another week or so to get impression share data and solid estimates of traffic potential; bear in mind that impression share needs to be above 10% to get an accurate number.
  5. Get to work: create an article/page to “fill that gap” and optimise it. You may want to use it for PPC or simply for SEO purposes.
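
Step 3’s cross-referencing can be sketched as a crude overlap check; the SQR terms and site topics below are invented:

```python
def find_content_gaps(sqr_terms, site_topics):
    """Flag search-query-report terms with no matching topic on the site.
    A term counts as covered if any known topic appears within it
    (a deliberately crude substring check)."""
    return [term for term in sqr_terms
            if not any(topic in term for topic in site_topics)]

# Hypothetical SQR terms and the topics the site already covers.
sqr = ["seo conferences 2019", "marketing conferences 2019", "searchlove tickets"]
topics = ["seo conferences", "searchlove"]
print(find_content_gaps(sqr, topics))  # ['marketing conferences 2019']
```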

Quick recap: why is this worth it?

  • By testing broader keyword matches in your PPC campaigns, you could run into interesting gaps that your site’s content is not covering.
  • Create content accordingly and capitalise on these gaps.

Google shopping ads – errors that your SEO friends can help you fix

If you are running Shopping ads, I am sure you have come across several errors that will seriously test your patience. The good news is that some of these issues that might seem hard to understand for someone not familiar with SEO, can be easily flagged and fixed.

Among the long list of issues you may encounter in your Merchant Centre, I will focus on two in particular:

1) “Product pages can’t be accessed and/or Product pages cannot be accessed from a mobile device”

Simply put, the landing page you picked cannot be accessed by Google, hence your ads will be disapproved. Why would this happen? Here is the list of reasons why you might see this:

  • Page not found (404): The landing page is not live and not found on the server.
  • Too many redirects: Your landing page is the victim of a series of redirects, 2 or more to be precise.
  • Couldn’t connect/HTTP 5xx response: The server cannot process the request for some reason.
  • Hostname not resolvable: When Google is not able to resolve the hostname.
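
The first three reasons above can be mapped from a simple URL check. This hypothetical classifier assumes you already know each URL’s status code and redirect count (e.g. from a crawler export):

```python
def shopping_ad_error(status_code, redirect_hops):
    """Map a URL check result onto the Merchant Centre reasons above
    (hypothetical helper; codes follow standard HTTP semantics)."""
    if redirect_hops >= 2:
        return "Too many redirects"
    if status_code == 404:
        return "Page not found (404)"
    if 500 <= status_code <= 599:
        return "Couldn't connect/HTTP 5xx response"
    return None  # URL looks fine for this check

print(shopping_ad_error(404, 0))  # Page not found (404)
print(shopping_ad_error(503, 0))  # Couldn't connect/HTTP 5xx response
print(shopping_ad_error(200, 3))  # Too many redirects
```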

How to prevent this from happening:

Before uploading all your Shopping ads, my advice to you is the following:

  1. Ask your SEO buddy (or do it yourself very easily – Screaming Frog is my go-to tool for any crawl type of job) to do a quick status check on all the URLs that you intend to run ads for in order to spot any anomalies (404s, 5xxs, redirect chains). If you spot these issues in advance you can chat to your devs and get them fixed quickly before running the ads: win-win situation.
  2. If you are not 100% sure that Google can crawl an individual page, then use the URL Inspection Tool in Google Search Console, which is something SEOs are very familiar with and use on a regular basis at URL level: this function will check how Google crawls or renders a URL on your site.

Side note: you have fixed a lot of issues and now your pages are eligible to appear in Shopping ads – great news! How long do you need to wait until they actually show? If you don’t want to wait a few days, Google themselves suggest increasing your crawl rate by changing the settings in Google Search Console – your SEO friends can help you with this too!

2) Images cannot be crawled because of robots.txt restriction

The image you selected is being blocked via robots.txt. In case you do not know what that is, ask your SEO friends or read Google’s explanation to understand all the details.

How to prevent this from happening:

If you want to avoid this from happening, my advice to you is the following:

  1. Run (or ask your SEO team to do so) a Screaming Frog crawl for the list of URLs you are planning to use and view which URLs have been blocked by robots.txt in two locations: ‘Response Codes’ tab and ‘Blocked by Robots.txt’ filter (read more here).
  2. If you want to inspect single URLs, you should use the URL Inspection tool on Google Search Console to see whether your pages have been blocked or not. Again, SEOs will be able to help you super easily here!
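You can also sanity-check image URLs against your robots.txt rules programmatically. Here is a small sketch using Python's standard-library robots.txt parser; the rules and URLs are made-up examples, not taken from a real site:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt contents: product images under /media/private/
# are blocked for Google's image crawler.
robots_txt = """
User-agent: Googlebot-Image
Disallow: /media/private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check each candidate image URL the way Google's image crawler would.
for image_url in ["https://example.com/media/private/shoe.jpg",
                  "https://example.com/images/shoe.jpg"]:
    allowed = parser.can_fetch("Googlebot-Image", image_url)
    print(image_url, "->", "allowed" if allowed else "blocked by robots.txt")
```

Running this over your product-feed image URLs flags any that would be disapproved for the robots.txt reason before you upload the feed.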

Quick recap: why is this worth it?

  • Instead of running into a lot of annoying yet minor issues when setting up your Google Shopping ads, get your SEO folks to help you prevent them from happening.
  • Start doing checks and using tools you normally would not use to help you with your daily work.

Part 1 of our SEO & PPC synergy series ends here. Stay tuned for the following 2 articles on the subject, the last of which will include a downloadable checklist.

If you enjoyed part one, subscribe to our email list and we’ll send parts 2 and 3 directly to your inbox.

PPC & SEO Synergy: How to Find Efficiencies Between the Two Channels was originally posted by Video And Blog Marketing

How to Create Structured Data Markup for Rich Snippets

Last updated on

Structured data has become one of the biggest things in search. Since Schema.org was launched in 2011 by the search giants Google, Bing, Yahoo, and Yandex as an initiative to create a common set of schemas for the web, there have been tons of improvements to the vocabulary.

Over time, however, the number of websites using structured data has remained small even though its advantages are clear. Google and other search engines have found ways to better track structured data and advocate its use.

Specifically, Google made better use of structured data through Rich Snippets to optimize the appearance of search results.

What are Rich Snippets?

The structured data markup placed on a webpage allows Google and other search engines to better understand the content of that page. This results in “Rich Snippets” or “Rich Results”: search results that show extra information based on the structured data of a webpage, adding extra interaction for the user.

The Schema.org vocabulary consists of 614 types, 906 properties, and 114 enumeration values – and that is a lot. With hundreds of schemas available, Google has picked out a few types that are eligible for rich snippets.

As of this writing, Google supports the following types of rich snippets:

  • Article
  • Breadcrumb
  • Book
  • Carousel
  • Corporate contact
  • Course
  • Critic Review
  • Dataset
  • Employer Aggregate Rating
  • Event
  • Fact Check
  • FAQ Page
  • How-to
  • Job Posting
  • Livestream
  • Local Business
  • Logo
  • Media
  • Occupation
  • Product
  • Q&A Page
  • Recipe
  • Review Snippet
  • Sitelinks Searchbox
  • Social Profile
  • Software App
  • Speakable
  • Subscription and paywalled content
  • Top Place list
  • Video

Why Is It Important for SEO?

Rich snippets make normal search results a lot more attractive and interactive for users. They give users a “taste” of what your actual page content looks like, which makes them more likely to click.

Structured data is NOT a direct ranking factor. Google had plans of making it one, but in 2017, Google said they don’t want to depend on structured data to better understand the web.

However, since structured data helps search engines understand the content of your website, it helps bots identify the relevance of your page and rank it for the right terms. Having structured data for rich results could also slightly increase your click-through rate.

How to Create Structured Data Markup

Even though it’s not a direct ranking factor, adding structured data to your pages and optimizing it for rich snippets is undoubtedly helpful. It can be quite intimidating for non-web developers, but Google has provided a lot of resources to make it easier for webmasters to understand.

Structured data can be written in several encodings: RDFa, Microdata, and JSON-LD. Take note that Google’s preferred format is JSON-LD.

Use Google’s Structured Data Reference

On the Google Developers website, there is a reference for all structured data types that are eligible for rich results.

If you have a page that fits any of these structured data types, click on it to see a guide about it. It will show you a short introduction about the structured data type, an example of the markup, a few content guidelines, and the structured data type definitions.

After reading what it’s all about, click on the “See Markup” button under the examples and it will redirect you to either the Rich Results Test or the Structured Data Testing Tool, where you will see a sample of valid code. The sample below is for FAQ pages.

Edit the Code to Match your Content

After generating the code from Google’s reference, all you need to do is copy and paste it to a notepad or document to make it easier to make changes.

There aren’t a lot of changes to make, and the code is pretty straightforward; you can easily identify the information you have to fill in. In some cases, like FAQs and How-Tos, the sample code might be too short. If so, just duplicate the exact section of the markup where you need to add additional information.

Here’s a snippet of the FAQs markup I did:

 {
   "@context": "https://schema.org",
   "@type": "FAQPage",
   "mainEntity": [{
     "@type": "Question",
     "name": "What are the operating hours?",
     "acceptedAnswer": {
       "@type": "Answer",
       "text": "WorkPlays operates from 9:00am to 7:00pm during weekdays. We are closed on the weekends and during all public holidays."
     }
   }, {
     "@type": "Question",
     "name": "Is there a limit to the number of employees I can bring?",
     "acceptedAnswer": {
       "@type": "Answer",
       "text": "Yes, if you are renting the space as a group, you are only allowed up to 5 members in one room. This is to ensure that you can all work comfortably while making the most out of the available resources."
     }
   }]
 }

Add the Markup to your Website

This is the hardest and trickiest part. It’s best to give the code to your resident web developer and let them handle it, since a mistake could break your website.

However, if you have to do it yourself, you need FTP (File Transfer Protocol) access to your website: copy the code and paste it into the head of the page. Note that the content described in the markup should also be visible on the page itself.
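If you generate pages programmatically, you can also build the markup with a small script instead of hand-editing JSON, which avoids the missing-comma and missing-brace errors the testing tools often catch. Here is a sketch in Python; the FAQ question and answer are placeholder text, not from a real site:

```python
import json

# Build the FAQPage structured data as a Python dict, then serialize it
# into the <script type="application/ld+json"> tag for the page's <head>.
faq = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [{
        "@type": "Question",
        "name": "What are the operating hours?",
        "acceptedAnswer": {
            "@type": "Answer",
            "text": "We operate from 9:00am to 7:00pm on weekdays."
        }
    }]
}

snippet = ('<script type="application/ld+json">\n'
           + json.dumps(faq, indent=2)   # always-valid JSON, no typos
           + '\n</script>')
print(snippet)
```

Because `json.dumps` handles the quoting and nesting, the output is guaranteed to be syntactically valid JSON-LD, ready to paste into the page.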

Test Codes for Validation and Errors

Once you’ve completed editing your structured data markup, it’s time to test it using Google’s tools to see if it’s valid or to check for errors. Google has two tools to check if the structured data markup that you made is valid. One is the Structured Data Testing Tool and the other is the Rich Results Test. I prefer using both.

Both tools allow you to check for errors, but it’s easier to make quick edits or corrections to your code in the Structured Data Testing Tool, while the Rich Results Test shows you a preview of how your page would look in the search results.

The Structured Data Testing Tool also detects other types of schemas, whereas the Rich Results Test only checks structured data that is eligible for rich results.

How to Use the Structured Data Testing Tool

First, go to the testing tool here. It will give you the option to either automatically pull up the HTML code of a page or manually copy and paste a code snippet.

If you have already applied the code to your page, just copy and paste the URL and click Run Test. If not, try it out first with the Code Snippet option. I usually use the Code Snippet first so I don’t have to re-apply the markup in case there are errors.

After you click Run Test, the left panel shows your code, where you can make quick edits, while the right panel shows the structured data type your code is valid for, as well as any Errors and Warnings.

If there are errors in your code, like missing commas or other elements, the left panel will immediately show you the line where the error is.

How to Use the Rich Results Test

Go to the Rich Results Test tool here. Just like the Structured Data Testing Tool, you have the option to fetch the code of an existing page that has structured data, or you can paste the code manually.

If you’re going to manually input just the structured data code, I recommend also adding an HTML title tag, and for image-based rich results, include the URLs of your images so you can preview exactly how your page would look in the search results.

If you want to see how your code would look as a rich snippet in the search results, just click “Preview Search Result”. Currently, the Rich Results Test is in its beta phase, and some structured data markups might not be supported yet. Note that the preview shown is the mobile version.

Resubmit your URL for Re-indexing

After you have applied the code in your page, make sure to let Google know that you made changes. Using the URL inspection tool in Google Search Console, inspect the URL of the page you made changes to and click on ‘Request Indexing’.

It didn’t take long for the changes to reflect: within 24 hours, I already saw changes in the search results, and the FAQ page I marked up was showing as a rich result.

Key Takeaway

Writing and applying structured data markup for rich snippets takes time and attention to detail, but there are no real downsides. Whether or not Google makes it a ranking factor in the future, making your website eligible for rich snippets will still benefit you in every way it can.

How to Create Structured Data Markup for Rich Snippets was originally posted by Video And Blog Marketing

Ahrefs Launches Internal Backlink Audit Feature

Last updated on

Cover Photo - Ahrefs Launches Internal Backlink Audit Feature

Having the right tool is vital for your SEO strategies and your efforts to dominate your niche; trying to gain traffic and climb the search rankings without one is largely a waste of time. Ahrefs has proven its name in the industry, evidenced by its place in most webmasters’ vocabulary. The crowd favorite is continuously evolving, as you can see in its recent Internal Backlink Audit update.

Internal linking is one of the tasks that most webmasters neglect to check, which can very well hurt your position in the SERPs. Google has mentioned time and again that links are one of the most important ingredients for ranking, which is why the right linking strategy includes an internal backlink audit.

How to Audit Backlinks Using Ahrefs

The advantage of an efficient tool such as Ahrefs is convenience. Knowing that Google crawls websites through links – both internal and external – should be reason enough for you to organize your internal linking. Providing a road map for your content by internally linking to it gives that content a hierarchy.

Ultimately, you lead users to your most important content, all while providing them with the best set of information. You will also see broken or bad links that return a 404 error. By using this tool, you can identify internal links that are not helping their parent pages gain traction in rankings and traffic.

An overview of the external backlink profile has always been part of the Ahrefs skeleton. Now, internal backlinks can also be checked with the tool. Under the Site Explorer tab in Ahrefs, start by entering your URL, then select the Internal Backlinks option in the sidebar.

As you can see, your anchor link and all your internal backlinks are grouped accordingly. If you notice a related page that you failed to link to, start internally linking to it.

Seo Hacker Ahrefs

Another important feature of the Ahrefs internal backlinks option is its ability to categorize your link data by link type. This further sharpens your linking audit, since you can decide where to focus. It also simplifies on-site link optimization, since you can check whether you are redirecting to a page properly and whether you are using dofollow/nofollow attributes correctly.

Link Type

Aside from URL Rating, Referring Domains, External Links, Total Search Traffic, and the Keywords the page is ranking for, you can also see how your content is grouped according to the message you want the user to receive. The new Ahrefs feature does this by using the text snippets surrounding your links: you will see how each one points back to your anchor link, which helps you understand how to group your content accordingly.


Importance of Internal Backlink Audit for SEO

Internal linking should follow a structure, and the Ahrefs tool will help you build one, giving your site a smoother user experience and a solid SEO profile. Organizing your web pages according to their keywords or their purpose on the site is one of the best things you can do to dominate the SEO industry. Let’s face it: you can come up with thousands of strategies, but if you fail to do the simple act of an internal link audit, all your efforts may be for nothing.

Additionally, you will not reap the benefits that Backlink Audit brings to your SEO. These advantages include but are not limited to:

  1. Utilizing anchor links to better aid user intent.
  2. User navigation becomes smoother because users would not run the risk of encountering a dead link or page.
  3. Content is displayed in a series, giving the signal that your site has a well-optimized structure.
  4. Highlights new links that point to relevant content.
  5. Link juice is passed between your web pages.

Internal linking passes on your site’s true value. In addition to running a site crawl to collect the links that no longer benefit their pages, it is important to see how your pages relate to one another. By doing this, content is better delivered and utilized according to its purpose.

Using Internal Links for Your Content Marketing Strategy

Knowing all this, how are you going to use the Ahrefs tool to your advantage? Linking deep into your site is one of the best ways to build your link architecture, so why not make a content marketing strategy out of it? You may have heard about topic clusters, and this is where the Internal Backlink Audit comes into play.

Whether you plan on repurposing content or making new content, you can use this Ahrefs tool to do so. As you may well know, stuffing tons of links into your internal pages will hurt your SEO. To avoid this, segment that content on your blog page or preferred content navigation page.

The common model below will give you a clear idea of what to do when using internal linking as a content strategy. It may seem obvious, but many tend to neglect this part of their sites.


internal link structure


Think about this: if your content is woven together to form a large body of information on your site, users will have something to look forward to. That means more traffic and a larger opportunity to rank in the SERPs for you.

Once you see that internal linking makes sense, you will have more cause to debunk the common misconception in SEO that internal links carry little weight compared to external backlinks. Internal linking improves the link flow to individual pages, which helps them rank better. And since “Content is King”, target this part of your site and watch the results improve.

Key Takeaway

Every website has internal links, but not all webmasters know how to use them. Internal links are silent players when it comes to boosting organic traffic, but once you start being mindful enough to audit them, the results will be beneficial. Keep in mind that a link audit is an important part of your regular on-site optimization efforts, and step up your game by using it to build a great content structure. What are your best link-auditing techniques? Comment down below!

Ahrefs Launches Internal Backlink Audit Feature was originally posted by Video And Blog Marketing

Ecommerce Website Builders: Best Solutions in 2019 (Compared)


If you’re a retail company, you need some kind of ecommerce platform to sustain you through the modern era.

Online shopping represents unparalleled convenience and accessibility. That means if you’re not currently selling your products online, you’re missing out on a lot of potential business.

But to get started, you’ll need a brandable domain name, along with a strong ecommerce store that is easy to use and effective. Hiring a web developer could cost thousands of dollars that don’t exist in your budget. Thankfully, there are programs designed to help business owners create professional-looking ecommerce websites for a fraction of the cost of hiring a developer.

But how can you make sure that you’re purchasing the service that’s best for you? By comparing and contrasting some of the best ecommerce website builders on the market today!

We’ve highlighted six of the top online store builders in this article to provide you with a resource that will help you make an informed and beneficial decision for the future of your business.


Shopify

Shopify is one of the most well-known ecommerce website builders in existence. It is also one of the longest-running, serving customers since 2006.

Shopify is a massive company, with merchants in more than 150 countries worldwide. On top of that, Shopify is incredibly effective: to date, it boasts an impressive $82 billion in total sales.

Shopify Pros

One of the best features of Shopify is that it allows you to sell your products across multiple channels. That means it includes social media support for platforms including Facebook and Instagram.

The importance of this feature can’t be overstated, as users enjoy shopping on social media, and Facebook and Instagram advertising are two of the most effective online retail ad tools in existence.

Shopify has add-ons that can also be used in conjunction with eBay and Amazon, giving you even more platforms on which to sell your wares.

Speaking of your wares, Shopify has a built-in inventory system which can easily manage your entire store, keeping track of every last widget you offer.

There are over 800,000 stores on Shopify’s platform, and all of them can be custom designed. Users can choose to work off of a premade template or create a store completely from scratch.

Shopify also includes the Shopify Network, which is a peer-to-peer resource featuring more than 680 experts who will answer your questions and solve your issues.

Shopify Cons

Unfortunately, Shopify isn’t a perfect system. For starters, users are forced to switch between the system’s editor and dashboard manually while creating their storefront. This makes for an inefficient experience.

Shopify also enforces a transaction fee. While this might not seem out of the ordinary, it actually is. Shopify is one of the only platforms to do this. Normally transaction fees are taken through payment platforms like Stripe which work in conjunction with your online store. Shopify, however, takes it upon itself to dip its hand into your profits. When you’re also selling via a platform like Amazon, you’re losing out on even more profit because Amazon will also take its cut.

Shopify Costs/Fees

The lowest tier of Shopify, called Shopify Lite, is only $9 per month. This service lets you sell your products on social networks, on an existing website, or in person.

Shopify’s Basic Plan comes in at $29 per month and it gives you everything from the previous tier, plus your own online store.

Shopify’s regular plan is $79 per month. Using this tier you can create gift cards and receive professional sales reporting. You also get a shipping discount of 72%, which is not too shabby.

The higher tiers of Shopify are the Advanced plan, which is a whopping $299 per month, and Shopify Plus, which is only offered through negotiated pricing.


BigCommerce

BigCommerce is one of the top ecommerce platforms in the world. It not only gives you the ability to create your own store, but is specifically designed to help you make sales.

BigCommerce Pros

As we just mentioned, BigCommerce is designed with sales in mind. It does not just give you a store and throw you into the deep end. That’s why a lot of big-name clients tend to sell on this platform, including Toyota, Ben & Jerry’s, and Kodak.

It also includes over 40 different payment gateways, including many of the tried-and-true methods users have come to expect like PayPal, Square, Apple Pay, Amazon Pay, and Stripe.

It is perhaps the most scalable ecommerce platform in existence, with built-in data tools making this platform ideal for businesses that are growing quickly.

BigCommerce also has many more built-in features than the majority of its competitors. On top of that, it is easy to implement search engine optimization into your store. The system features tools that are designed to improve your optimization efforts.

Much like Shopify, you can also sell across various social channels like Instagram, Facebook, and Pinterest, as well as popular platforms like Amazon and eBay.


BigCommerce also includes a feature called abandoned cart recovery, which is a tool that lets you send automated emails out when someone bounces from your store with items left in their cart. According to studies, retailers can win back up to 15% of customers using this service.

BigCommerce Cons

The main criticism that BigCommerce gets is that it’s not great for anyone who is not tech savvy. It’s a powerful tool, that much is for sure. But its constant use of technical jargon combined with complex terminology will sadly fly over the heads of beginners.

Another glaring check in the con category is that BigCommerce does not feature a mobile app. This makes it far more difficult to operate your business and manage sales while on the go. In today’s market, everything needs to be mobile and BigCommerce is sadly behind the 8 Ball on that.

BigCommerce Costs/Fees

BigCommerce features seven free themes that you can choose from to create your store, but you will also run the risk of looking similar to every other store out there that doesn’t want to pay to create its online presence. On top of that, the platform features over 100 paid themes ranging from $145 to $235.

The BigCommerce Standard plan is $29.95 per month. It comes with unlimited products, unlimited staff accounts, multiplatform support, coupons and gift cards, real-time shipping quotes, the ability to get product ratings and reviews, and 24/7 support.

The Plus plan costs $71.95 per month, and it includes everything in the standard plan, plus abandoned cart recovery, customer segmentation, and store credit cards.

The Pro plan is $224.95 per month, and it includes everything mentioned above, plus Google customer reviews, product search filtering, and Customer SSL.

There is also an Enterprise plan which has custom pricing that is designed for each business.

All BigCommerce plans come with a 15-day free trial.


Wix

Wix is a widely known service, primarily for its website builder. Its claim to fame is that it helps beginners design custom websites without having to know how to code. But Wix also has an ecommerce platform that is gaining in popularity.

Wix Pros

The same drag and drop interface that makes building a website a snap on Wix’s platform can also be used to create eCommerce sites. That means you don’t need any knowledge of coding or hosting to create a memorable store.

The Wix editor shows you how the edits that you’re making will impact the overall storefront design. This ensures that you’re never operating blindly and making changes that you’ll later regret.

Wix is great for beginners thanks to its simple interface and mobile-responsive templates. It gives you the freedom to focus on developing your brand with enhanced customization.

You can also incorporate product videos into your store, which are always popular with shoppers.

Wix also features an abandoned cart recovery tool. On top of that, it is a multilingual platform, giving you the ability to create different sites for different countries.

Wix is also one of the most affordable platforms on this list, which we will go into shortly.

Wix Cons

Wix’s most glaring shortcoming is that it cannot integrate with social platforms.

It’s also almost too freeing. By that, I mean that it grants you unprecedented levels of creative freedom, which can, unfortunately, impact usability when mistakes are made. This is especially dangerous for beginners.

Wix Costs/Fees

Wix’s Business Basic plan is only $20 per month. With this level, you can upload five hours of product video, see no Wix ads in your shop, and connect outside domains to your store. It also comes with unlimited bandwidth and 20 GB of storage.

The Business Unlimited plan is just a bit more, at $25 per month. It upgrades the Basic package by giving you 10 hours of video, no ads, the ability to connect your domain, unlimited bandwidth, and 35 GB of storage.

The Business VIP tier is $35 per month. With it, you get unlimited hours of video, 50 GB of storage, and priority response to customer service issues.

All Wix plans come with a 14-day money back guarantee.


Volusion

Volusion has been around since 1999, making it the oldest service on this list. It is known for including some effective tools, but the design features of the service are notoriously tricky.

Volusion includes 30,000 stores worldwide and has serviced 185 million orders in its lifetime.

Volusion Pros

One of Volusion’s best features is its solid analytics tool, which helps you get a better handle on how your store is doing.

Volusion also allows you to take a variety of payment methods and includes a mobile app for conducting business while on the go.

Volusion recently launched its all-new V2 operating system, which has improved the overall user experience. But is it enough to earn your loyalty?

Volusion Cons

One of the most glaring issues for Volusion is that it excludes digital products from being sold on the platform. That means downloadable items like e-books cannot be featured or sold.

There is also no function for blogging, which severely limits SEO reach.

While the pricing is fair, there is an additional charge for SSL certificates, which is something that is included with most other platforms at no cost.

While V2 has improved overall usability, the site building process is still clunky and forces you to constantly switch between the front and back end. On top of that, the front end is difficult to navigate.

Volusion Costs/Fees

Volusion’s Personal plan costs $29 per month and allows you to sell an unlimited number of products with unlimited storage and online support. You can make up to $50,000 in online sales per year, and take payments from Stripe, PayPal, and Volusion Payments. It also includes a tax calculator, inventory management, coupons, product variants, and manual order creation.

The Pro plan is $79 per month and features online and phone support. You can make up to $100,000 in online sales per-year, and it allows ratings and reviews, phone orders, bulk order processing, and advanced discounts.

Volusion’s Biz Tier is $299 per month and grants you priority level support and up to $500,000 per year in online sales.

The Prime level comes with VIP support and a custom limit on online sales per year. The pricing for this level is also customized and comes with Executive Slack support and a monthly site audit.

SSL Certificates, which are free on most platforms, cost between $89 and $99.

All tiers come with a 14-day free trial.


Weebly

Weebly is another drag-and-drop website builder and has a good reputation. It is known for being one of the easiest platforms to use for building an online store.

Weebly Pros

Weebly is known for being extremely effective for small businesses. Its offers are relatively basic, which can be good for someone who isn’t interested in all of the extra noise that comes with advanced site design.

Using this platform, you can build your site and set up your online store within hours.

It has some really good templates available and allows you to create dedicated product pages for every item in your store.

It also features guides for ecommerce SEO best practices and has a free option.

Weebly Cons

One of the biggest issues people seem to have with Weebly is that all of its lower-tier plans redirect customers when it’s time to check out, taking them to a Weebly checkout domain instead of remaining on the domain associated with your brand. This can make some customers nervous and cause them to abandon their cart.

There is also a transaction fee of 2.9%, plus 30 cents through its payment providers, plus an additional 3% charged to all starter and pro plans.

While the templates are good, there are not a lot of them. That means your site runs the risk of looking like other Weebly-designed stores.

There is limited drag and drop customization, which curbs your design ability. You’re also unable to restore earlier versions of your store without contacting the support team.

There is no artificial design intelligence, which would design a site for you based on your preferences; this is a popular service offered by a lot of other platforms.

Weebly Costs/Fees

Weebly’s free account includes 500 MB of storage, a subdomain, shows ads, and features SSL security.

Weebly’s Connect plan is only $5 per month. It shows ads and has only 500 MB of storage, but it allows you to connect the site to your domain (except for checkout).

The Pro plan is $12 per month and has no ads, unlimited storage, SSL security, advanced site stats, password protection, video backgrounds, HD video and audio, up to 200 members, a 3% Weebly transaction fee, and a list of up to 25 products. The checkout system still goes through Weebly on this tier.

The Business plan is $25 per month. It features everything listed above, plus unlimited members, membership registration, no transaction fees, unlimited products, check out featured on your domain, digital goods, product reviews, inventory management, shipping discount, a tax calculator, and coupons.


Squarespace

Squarespace is another very well-known platform. It has made a name for itself by allowing beginners to create professional-looking websites with little to no coding experience thanks to its ease of use. It also uses a drag-and-drop interface and has cemented itself as one of the more popular platforms with a strong advertising campaign.

Squarespace Pros

Squarespace gives you unprecedented levels of customization control without having to do any coding.

Its templates are impressive and flexible when it comes to design. Squarespace’s shopping cart blends in well with the website, creating a more natural look.

Ecommerce functions are built into all templates and users have full control over product variants. You can also sell digital and service products on top of physical items.

Squarespace integrates fully into social media platforms.

Squarespace Cons

While Squarespace has a great number of features, it is also one of the more expensive services on this list. Its interface is also not very friendly to beginners and may confuse some. It includes a transaction fee with some of its tiers.

Squarespace Costs/Fees

Squarespace Business costs $18 per month with a 3% transaction fee. It includes advanced metrics, unlimited contributors, promotional pop-ups, integrated ecommerce, unlimited products, a mobile information bar, premium blocks, and an announcement bar.

Squarespace Basic is $26 per month with no transaction fee. It features all of the business plan features, plus a mobile-optimized website and checkout. It also has unlimited contributors, commerce metrics, an inventory system, added tax, coupons, label printing, integrated accounting, check out on your domain, customer accounts, and features products on Instagram.

Squarespace Advanced is $40 per month with no transaction fee. It includes all of the Basic plan features, plus the ability to take subscriptions, abandoned cart recovery, advanced shipping, flexible discounts, and the creation of gift cards.


Not everyone is going to find what they’re looking for in an ecommerce web builder. Unfortunately, there is no perfect system and different businesses have different needs. That’s why it is sometimes better to go with a custom web design & development service like the one offered by HigherVisibility.

By going custom you aren’t tied to a set platform and everything that you need can be built according to your specifications.

Although we typically build websites in WordPress, we have the ability to work with any platform. For more information on HigherVisibility’s custom ecommerce website solution, click here.

In Conclusion

The ability to create a stunning and user-friendly ecommerce website is essential for success in today's online retail world. By committing to one of these platforms or by using a customized platform like the one offered by HigherVisibility, you give yourself a real chance to close the gap with your competitors and create a winning ecommerce business. Why wait? Start selling today!

Ecommerce Website Builders: Best Solutions in 2019 (Compared) was originally posted by Video And Blog Marketing

How to Find Orphaned Pages and What to Do with Them

Last updated on

Maintaining a website and doing SEO means putting out content regularly. Whether it’s an e-commerce website that has thousands of products or a services website that publishes blog posts regularly, a website will inevitably expand the number of pages inside as time goes by.

Whether a website has 100 pages or 10,000 pages, internal linking is a crucial on-page SEO strategy. Linking one page to another helps visitors navigate through your website. It also helps search engine bots crawl your website: the more a page is internally linked, the easier it is for a bot to reach the page and crawl it more frequently.

That means important pages for your website, such as landing pages, product categories, services, and blog posts, should be frequently linked to one another.

However, there are some cases where some pages are left out in the ecosystem. These are called Orphaned Pages.

What are Orphaned Pages?

Orphaned pages are pages of a website that are not internally linked, meaning they have zero links from other pages of your website. This makes it difficult for search engine bots to discover, crawl, and index them.

Orphaned pages occur for different reasons: old blog posts, products that are no longer sold, or services pages that are no longer offered. While some pages are purposely left out, such as testing pages and tag pages, it is critical to check whether any orphaned pages are still relevant to your users.

Does it Affect My SEO?

The answer is both yes and no. The effect of orphaned pages on a website's rankings depends on the page. If an orphaned page was created to be shown to users and has content that matters to them, it hurts your SEO: crawlers are unlikely to find the page, so it won't appear in the search results, and users won't be able to see it either.

However, if an orphaned page was created for purposes not related to users, such as testing functionality or trying out a new website design, then you can leave it as it is.

How to Find Orphaned Pages using Screaming Frog

To find orphaned pages using Screaming Frog, you have to first make sure that your Google Analytics and Google Search Console accounts are connected.

To do that, under Configuration, scroll down to API access and connect Google Analytics and Google Search Console.

Once you have them connected, make sure that under the General tab of the API window, you select Crawl New URLs Discovered in Google Analytics.

After connecting your GA and GSC accounts, under Configuration, go to Spider, and check Crawl Linked XML Sitemaps. Then check the option Crawl These Sitemaps: and input the URL of your website’s sitemap.

After setting everything up, you can now start crawling your website. Once the crawl is finished, under Crawl Analysis, click Configure and check the box beside Sitemaps. Screaming Frog will then analyze the crawl data and surface the orphaned pages.

After the analysis, in the Overview pane under Sitemaps, you can see all the orphaned pages that Screaming Frog discovered.

How to Find Orphaned Pages using SEMRush

You can also find orphaned pages by setting up a Site Audit in SEMRush. If you don't have your website set up yet, create a new project first and let SEMRush crawl your website.

Once the project setup is complete, go to the Site Audit of your website, then go to Issues. Under the Notices tab, scroll down to check whether the orphaned pages report is enabled.

If it hasn't been enabled yet, connect your Google Analytics account in the Site Audit settings. The process is similar to Screaming Frog's: it will prompt you to log in with your Google account, select the Profile, Property, and View for your website, and click Save.

Once you complete the setup, SEMRush will automatically collect data from Google Analytics. Unlike Screaming Frog, you don’t have to connect Google Search Console to get orphaned pages data in SEMRush.

After a few minutes, refresh your browser and check the Issues tab again. Click the dropdown menu Select an Issue and you will find Orphaned Pages (Google Analytics) under Notices.

Optimize or Scrap?

Once you've collected all your orphaned pages, it is up to you what to do with them. A Google Sheet is a handy place to track them.

  • If a page is still relevant, label them as ‘optimize’ and find possible pages to link to this page.
  • If a page was relevant but no longer is, such as an old product or a discontinued service, you can delete it and let it return a 404. There is no need to redirect these pages if they don't carry any link value.
  • If a page is purposely left out, you can leave it as it is.

Here’s a sample template that you could use:

Key Takeaway

While orphaned pages can be harmless to your website's overall rankings and SEO value, they become a critical issue when important pages are left out. Include orphaned-page monitoring in your regular website maintenance audit, and make sure your website has a healthy site structure and a good flow of link juice by internally linking pages to each other.

How to Find Orphaned Pages and What to Do with Them was originally posted by Video And Blog Marketing

Google Image Classification and Landmarks

Image Classification in the past

Back in 2008, I was writing about how a search engine might learn from photo databases like Flickr, and how people label images there, in a post I wrote called Community Tagging and Ranking in Images of Landmarks.

In another post that covers the Flickr image classification Landmark work, Faces and Landmarks: Two Steps Towards Smarter Image Searches, I mentioned part of what the Yahoo study uncovered:

Using automatically generated location data, and software that can cluster together similar images to learn about images again goes beyond just looking at the words associated with pictures to learn what they are about.

That approach uses metadata from images in an image collection, which is very different from what Google was doing in the post How Google May Interpret Queries Based on Locations and Entities (Tested), where it might identify landmarks based upon knowledge of their actual locations.

More Recent Image Classification of Landmarks

I mention those earlier posts because I wanted to share what I had written about landmarks, before pointing to more recent studies from Google about how they might recognize landmarks, a year apart from each other, with one being a followup to the other.

The first of these papers, Google-Landmarks: A New Dataset and Challenge for Landmark Recognition, starts out by telling us about a problem that needs solving:

Image classification technology has shown remarkable improvement over the past few years, exemplified in part by the Imagenet classification challenge, where error rates continue to drop substantially every year. In order to continue advancing the state of the art in computer vision, many researchers are now putting more focus on fine-grained and instance-level recognition problems – instead of recognizing general entities such as buildings, mountains and (of course) cats, many are designing machine learning algorithms capable of identifying the Eiffel Tower, Mount Fuji or Persian cats. However, a significant obstacle for research in this area has been the lack of large annotated datasets.

A year later, Google worked to improve the dataset used for landmark image classification, updating the dataset they had created the year before, as they tell us in Announcing Google-Landmarks-v2: An Improved Dataset for Landmark Recognition & Retrieval. Part of the effort behind that work came from getting a lot of help, as described in the blog post announcing it:

A particular problem in preparing Google-Landmarks-v2 was the generation of instance labels for the landmarks represented since it is virtually impossible for annotators to recognize all of the hundreds of thousands of landmarks that could potentially be present in a given photo. Our solution to this problem was to crowdsource the landmark labeling through the efforts of a world-spanning community of hobby photographers, each familiar with the landmarks in their region.

Google Patent for Image Classification when Identifying Landmarks in Image Collections

image classification

Google was recently granted a patent that focuses on identifying popular landmarks in large digital image collections. Considering that Google operates Google Photos, that makes a lot of sense. The landmark identification efforts at Flickr sound a little similar to this effort on Google's part. The patent targets a specific problem, which it describes as:

However, there is no known system that can automatically extract information such as the most popular tourist destinations from these large collections. As numerous new photographs are added to these digital image collections, it may not be feasible for users to manually label the photographs in a complete and consistent manner that will increase the usefulness of those digital image collections. What is needed, therefore, are systems and methods that can automatically identify and label popular landmarks in large digital image collections.

Some of it does sound similar to the Flickr efforts where it talks about working to populate and update “a database of images of landmarks including geo-clustering geo-tagged images according to geographic proximity to generate one or more geo-clusters, and visual-clustering the one or more geo-clusters according to image similarity to generate one or more visual clusters.”

How might this play into image classification and search involving landmarks?

The patent describes how it could fit into searches, with the following steps:

  • Enhancing user queries to retrieve images of landmarks, including the stages of receiving a user query
  • Identifying one or more trigger words in the user query
  • Selecting one or more corresponding tags from a landmark database corresponding to the one or more trigger words
  • Supplementing the user query with the one or more corresponding tags, generating a supplemented user query
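The query-enhancement stages above can be sketched as a simple lookup: trigger words map to landmark tags, which are appended to the user's query. The trigger words and tags below are invented purely for illustration, not taken from the patent:

```python
# Hypothetical trigger-word -> landmark-tag mapping (illustrative only)
LANDMARK_TAGS = {
    "eiffel": ["Eiffel Tower", "Paris"],
    "fuji": ["Mount Fuji", "Japan"],
}

def supplement_query(query: str) -> str:
    """Append landmark tags for any trigger word found in the query."""
    tags = []
    for word in query.lower().split():
        tags.extend(LANDMARK_TAGS.get(word, []))
    return query if not tags else query + " " + " ".join(tags)

print(supplement_query("eiffel at night"))
# -> eiffel at night Eiffel Tower Paris
```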

The idea of trigger words appearing in queries is interesting.

The patent also tells us that it could also involve a method of automatically tagging a new digital image, which would also cover:

  • Comparing the new digital image to images in a landmark image database, wherein the landmark image database comprises visual clusters of images of one or more landmarks
  • Tagging the new digital image with at least one tag based on at least one of said visual clusters

The patent is:

Automatic discovery of popular landmarks
Inventors: Fernando A. Brucher, Ulrich Buddemeier, Hartwig Adam and Hartmut Neven
Assignee: Google LLC
US Patent: 10,289,643
Granted: May 14, 2019
Filed: October 3, 2016


In one embodiment the present invention is a method for populating and updating a database of images of landmarks including geo-clustering geo-tagged images according to geographic proximity to generate one or more geo-clusters, and visual-clustering the one or more geo-clusters according to image similarity to generate one or more visual clusters. In another embodiment, the present invention is a system for identifying landmarks from digital images, including the following components: a database of geo-tagged images; a landmark database; a geo-clustering module; and a visual clustering module. In other embodiments, the present invention may be a method of enhancing user queries to retrieve images of landmarks or a method of automatically tagging a new digital image with text labels.
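The patent's first stage, geo-clustering geo-tagged images by geographic proximity, can be roughly approximated with a coordinate grid: photos whose latitude/longitude fall in the same cell form a cluster. This is only a crude sketch of the idea, not the patent's actual algorithm, and the sample coordinates are illustrative:

```python
from collections import defaultdict

def geo_clusters(photos, cell_deg=0.01):
    """Group (lat, lon) points into grid cells roughly 1 km wide,
    a crude stand-in for proximity-based geo-clustering."""
    clusters = defaultdict(list)
    for lat, lon in photos:
        key = (round(lat / cell_deg), round(lon / cell_deg))
        clusters[key].append((lat, lon))
    return list(clusters.values())

# Two photos near the Eiffel Tower, one near Mount Fuji.
photos = [(48.8584, 2.2945), (48.8585, 2.2946), (35.3606, 138.7274)]
print(len(geo_clusters(photos)))  # 2
```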

Even Smarter Image Classification of Landmarks

This system appears capable of finding very popular landmarks in photo collections across the web and storing them in a landmark database, where it might geocluster them. It's interesting to think about this effort. If Google uses those landmark images in Image Search results, it may not stop image classification at that point.

I recently wrote about Google Image Search Labels Becoming More Semantic?, where we were told in an updated Google patent that images were being labeled based upon an ontology related to their topics. A Google image search for a landmark like the Washington Monument shows a number of image classification labels at the top of the results that can be clicked to narrow the results down to specific aspects of the monument.

So, image classification may include specific monuments, and then even more narrow classifications, like having the following labels applied to the Washington Monument:

Reflecting Pool
Lincoln Memorial
Washington DC
Observation deck
National Mall

So Google may have smarter image classification when it comes to landmarks, and it is labeling them in ways that make them more meaningful, too.


Google Image Classification and Landmarks was originally posted by Video And Blog Marketing

10 Years of SearchLove and Rand Fishkin: Free Video Edition

2019 marks 10 years of Distilled hosting conferences. For those that are more recent to our conferences, you'll know them as SearchLove. For those that have been our friends, attendees, speakers & sponsors for longer, you might remember the days of LinkLove and ProSEO. One or two might even remember an event in the crypt of a church with only one or two speakers and a couple of dozen delegates from even longer ago. A good number of you might already have your tickets booked to join us in Boston in June.

In that time we have:

  • Had over 200 speakers
  • Hosted thousands of attendees from every continent around the world (except Antarctica)
  • Run our conference in London, San Diego, Boston (and in the past New York and New Orleans)

Making the Best Conference Possible

For each of those conferences, we’re always aiming to select the best speakers we can find, and when they take to the stage the competition is fierce to achieve the best possible speaker score.

At each SearchLove event, we ask our audience to rate each session Outstanding, Excellent, Good, Average or Poor. We then pay the most attention to the percentage of delegates rating a speaker Outstanding or Excellent.

Almost certainly near the top of those rankings, time and time again we see Rand Fishkin, the former co-founder and CEO of Moz, and now CEO at SparkToro.

  1. He has never scored below 80%. Ever. Meaning at least 4 out of every 5 delegates rates his talk outstanding or excellent
  2. Now is a better time than ever to see him speak. He’s coming off a 96.9% in San Diego

Rand Fishkin speaker scores over the last 6 years

Not only is Rand a fantastic speaker, but he has also always been a huge supporter of our conferences; even when he's not on the stage, you'll see him on the edge of his seat watching our other speakers. Deciding which conference to go to? Here's Rand's recommendation:

A look back at the last 5 years of Rand

To celebrate a decade of SearchLove, and the run-up to our Boston conference next month, we’ve decided to give you videos of all of Rand’s sessions from the last 5 years of SearchLove for free. Take a look for yourself at how Rand’s presentations have stood the test of time, and then go and book your place to see him in person at SearchLove Boston.

Sign up to access over 6 hours of Rand Fishkin videos for free.

Where is SearchLove going next?

Our aim is to keep delivering the best conferences we can. In 2019 we have introduced community speaker slots which have helped us identify even more talented speakers such as Andi Jarvis, Luke Carthy, Laura Hogan, Raffa Asquer, Vince Nero, James Corr and Nancy-Lee McLaughlin. And as always we’ll keep inviting back your favorite speakers like Rand.

We want it to keep being the best place to come to learn about all areas of search and digital marketing, and continue to empower people to return to their office, think differently about their work, and feel inspired to go and make a difference in their organisations.

Tickets are already on sale for this year's Boston and London conferences, and we are heading back to San Diego in spring 2020. We hope to see you there.

10 Years of SearchLove and Rand Fishkin: Free Video Edition was originally posted by Video And Blog Marketing

SEO Audit 2019: A Comprehensive Guide

Last updated on

Cover Photo - SEO Audit 2019 A Comprehensive Guide

Website optimization is made up of different facets that each need attention before your site can climb to the top spot on the first page of the search results. Onsite optimization covers the factors inside your website that need to be checked and optimized; offsite optimization deals with the links that connect other websites to yours; and technical optimization covers everything technical (code and other factors that call for IT expertise). All of these matter for your rankings and should not be disregarded. The challenge is to find the pain points across all these facets and fix them.

A website is a delicate object that needs constant maintenance and care from webmasters and SEOs. Our job is to create the most optimized site possible, containing useful, authoritative, high-quality content that assists users with their search queries and helps them find what they're looking for. So how do we do that? We audit the site to find the broken facets and fix them accordingly. Here's how:

Check your Website Traffic

Traffic is a consequence of your SEO efforts. If you improve your search visibility for a high-volume keyword, you can be almost sure that your site's traffic will also increase. However, a drop in otherwise steady traffic does not always mean that your search visibility dropped as well. Take a look at this example:

Google Analytics Screenshot

We do a weekly checkup of our traffic, and once we saw the sudden drop, we knew something was wrong. The strange part was that we hadn't changed anything: I had just published a new post and the numbers suddenly dropped. I won't go into how we investigated and fixed the cause of the drop, but this goes to show how important it is to regularly check your traffic in Google Analytics. If we hadn't done those regular checks, our traffic might have stayed that way until it became a crisis.

Make it a habit to regularly check your traffic so that you stay on top of everything. I recommend checking at least twice a week; four times a week is even better. This is an important foundation of your site audit and should never be skipped. After checking your traffic, the next step is to:

Check your Google Search Console Coverage Report

Google Search Console is probably the most crucial tool for SEOs, and learning how to use it to its full extent is a must. As an SEO professional, it is important that the pages you want Google to index appear in the search results, and that those you don't want shown stay out of the SERPs.

GSC Coverage Report 1

Google Search Console's Coverage report is the best way to learn how Google sees your pages and to monitor which pages on your website are being indexed. You can also see crawling errors so you can fix them immediately.

Check your Submitted Sitemaps

Since your sitemap contains all the URLs you want Google to crawl, make sure that all of your sitemaps are submitted to and crawled by Google. You can submit multiple sitemaps in Search Console, but that decision should be based on the size of your website.

Sitemaps GSC

If you have fewer than a thousand pages, it is better to have just one sitemap. If your website is, say, a travel booking site with 50,000 pages or more, you can divide your sitemap into multiple sitemaps to better organize your URLs.

Take note that having more than one sitemap does not change crawling priorities. It is simply a good way of structuring your website and telling crawl bots which parts of your website are important.
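When you do split a large site into multiple sitemaps, they are tied together with a sitemap index file that lists each child sitemap. A minimal sketch of generating one (the domain and file names are placeholders):

```python
import xml.etree.ElementTree as ET

def sitemap_index(sitemap_urls):
    """Build a sitemap index XML string from a list of sitemap URLs."""
    root = ET.Element("sitemapindex",
                      xmlns="")
    for url in sitemap_urls:
        sm = ET.SubElement(root, "sitemap")
        ET.SubElement(sm, "loc").text = url
    return ET.tostring(root, encoding="unicode")

index = sitemap_index([
    "",
    "",
])
print(index)
```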

Submitted and Indexed

Under this report, you should see all the URLs in your sitemap. If you see URLs here that should not be indexed, such as testing pages or images, place a noindex tag on them to tell crawl bots they should not be shown in the search results.

Indexed, Not Submitted in Sitemap

Most of the time, this report shows pages that you don't purposely want users to see in the search results. Even though you submitted a sitemap, Google can still crawl links that are not in it.

As much as possible, keep the number of pages in this report at a minimum. If there are indexed pages here that you don't want users to see, place a noindex tag on them. If you see important pages here, it would be much better to add them to your sitemap.

Check for Crawl Anomalies

Crawl Anomaly

In Google Search Console, under the Coverage report, you can also see pages that were crawled by Google but excluded from the search results. This can be because of a noindex tag, the robots.txt file, or other errors classified as crawl anomalies that make a page non-indexable.

You should check this report, as important pages may be listed here. For pages under 'Blocked by robots.txt' or 'Excluded by noindex tag', the fix is as easy as removing them from the robots.txt file or removing the noindex meta tag.

For important pages under 'Crawl Anomaly', inspect them with Search Console's URL Inspection tool to see more detail on why Google is having problems crawling and indexing them.

If you don't see any important pages here, no further action is needed; let Search Console keep the unimportant pages in this bucket.

Check the SERPs

This is one of the most important things a lot of SEOs forget. Many people are so busy doing link building and strategizing their on-page SEO that they never check what their pages look like in the search results.

Site Command

To do this, go to Google Search and use the advanced search operator "site:" to show all the results Google has for your website. This is a great way of seeing how users see you in the search results.

Check for page titles that are too long or too short, or that contain misspelled words or bad grammar. Though Google no longer uses meta descriptions as a ranking factor, they still strongly influence click-through rate and should be optimized, so make sure your pages' meta descriptions are enticing to users.
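At scale, a title-length check is easy to script over a crawl export. The column names below (`Address`, `Title`) and the 30-60 character target range are assumptions; adjust them to your crawler's CSV and your own guidelines:

```python
import csv
import io

def flag_titles(csv_text, min_len=30, max_len=60):
    """Yield (url, title, issue) for titles outside the target length range."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        title = row["Title"].strip()
        if len(title) < min_len:
            yield row["Address"], title, "too short"
        elif len(title) > max_len:
            yield row["Address"], title, "too long"

# Tiny inline export standing in for a real crawler CSV.
data = "Address,Title\n/a,Hi\n/b," + "x" * 70 + "\n/c,A perfectly sized page title for search\n"
for url, title, issue in flag_titles(data):
    print(url, issue)
```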

Robots.txt Validation

The robots.txt file, also called the robots exclusion protocol or standard, is what search engines look at to determine which pages on your site to crawl. Robots.txt is vital to your SEO audit, since even a slight misconfiguration can cause a world of problems, and all the more so if you neglect to check it during the audit entirely.

Putting your robots.txt to the test by making sure that search engines can properly access it is one of the best SEO practices in the industry. If your robots.txt file is missing, chances are all of your publicly available pages will be crawled and added to the index.

To start, just append /robots.txt to your domain's URL.


If you haven't created a robots.txt file yet, you can use a plain text editor like Notepad on Windows or TextEdit on Mac. It is important to use these basic programs, because word processors such as Office tools can insert additional code into the text.

Keep in mind that any program that saves plain .txt files will work. Robots.txt can help you block URLs that are irrelevant for crawling. Blocking pages such as author pages, administrator logins, or plugin paths helps search engines prioritize the more important pages on your site.

The search engine bot will see your robots.txt file first as it enters your site for crawling.

Of course, you shouldn't miss out on the robots crawling directives: user-agent, disallow, allow, and the sitemap.

Onsite Diagnosis and Fix

After diagnosing your site through the lens of the search engine (Google), it's time to check your website as a whole. The tool we've always used to check our site's onsite status is Screaming Frog, and we've kept using it as the websites we handle have grown larger over the months. You set the parameters, and it's even capable of crawling and compiling outbound links to let you know if you have broken links. Here's what the overview looks like:


Screaming Frog compiles all the different onsite factors and lists down errors that you might want/need to fix. Onsite factors that Screaming Frog shows are:

  • Protocol – If your pages are HTTP or HTTPS
  • Response codes – From Pages blocked by Robots.txt to server errors, these are all compiled and displayed by Screaming Frog
  • URL – If your URLs contain underscores, uppercases, duplicates, etc.
  • Page Titles – Everything that you need to know about your pages’ title tags
  • Meta Description – Missing meta descriptions, duplicate meta descriptions, the length of your meta descriptions, etc.
  • Header Tags – Although Screaming Frog only compiles and displays H1s and H2s, these are already the most valuable aspect of your page structure.
  • Images – Missing alt text, image size, etc.
  • Canonicals
  • Pagination
  • And many more facets that are important for your SEO efforts.

Knowing the current state of your website's pages is important, since we are not perfect and can easily overlook or forget to optimize one or two aspects of a page. So use a crawling tool like Screaming Frog to check the state of your pages.

Pagination Audit

Performing a pagination audit can affect your SEO efforts because it deals heavily with how organized your pages are. The task is done with the end goal of organizing sequential pages and making sure they are all contextually connected. Not only is this helpful for site visitors, it also signals to search engines that your pages have continuity.

Pagination is implemented when you need to break content into multiple pages. This is especially useful for product listings on ecommerce websites or for a blog post series. Tying your content together signals to search engines that the pages belong to one set, helping them assign indexing properties to the whole series.

How do you go about pagination? Simply place the attributes rel="prev" and rel="next" in the head of each page in the series. Perform an audit using an SEO spider tool. While doing this, make sure the attributes serve their purpose: establishing a relationship between the interconnected URLs so that users are directed to the most relevant content they need.
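A scripted sanity check for those attributes might pull the rel="prev"/rel="next" links out of each page's head and confirm the chain is consistent. A sketch using the stdlib HTML parser (the sample page and paths are invented):

```python
from html.parser import HTMLParser

class RelLinks(HTMLParser):
    """Collect rel="prev"/rel="next" <link> targets from a page."""
    def __init__(self):
        super().__init__()
        self.rels = {}

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            a = dict(attrs)
            if a.get("rel") in ("prev", "next"):
                self.rels[a["rel"]] = a.get("href")

page2 = """<head>
<link rel="prev" href="/blog/page/1/">
<link rel="next" href="/blog/page/3/">
</head>"""

p = RelLinks()
p.feed(page2)
print(p.rels)  # {'prev': '/blog/page/1/', 'next': '/blog/page/3/'}
```

Run over every page in the series, you can then assert that each page's "next" target points back to it with "prev".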

A pagination audit should not be missed, since it maximizes the content on your site and lets users have a great experience digesting these chunks of information. It is also very useful for improving navigation throughout the site.

XML Sitemap Check

XML sitemaps are especially useful because they list your site's most important pages, allowing search engines to crawl them all and better understand your website's structure. Webmasters use the XML sitemap to highlight the pages on their sites that are available for crawling. This XML file lists URLs together with additional metadata about each of them.

A proper SEO audit guide should always include an XML sitemap check, because doing so helps keep the user experience positive. To make sure search engines find your XML sitemap, add it to your Google Search Console account: click the 'Sitemaps' section and see if your XML sitemap is already listed there. If not, add it immediately.

To check your sitemap for errors, use Screaming Frog. Open the tool, select List mode, and load your sitemap.xml by choosing the "Download Sitemap" option and entering its URL. Screaming Frog will confirm the URLs found within the sitemap file. Start the crawl, and once it's done, export the data to CSV or sort it by Status Code. This will highlight errors or other potential problems that you should fix immediately.

XML Sitemap Index

Check Your International Targeting

Google places importance on delivering useful, relevant, and informative results to its users, so location is an important factor in the results it displays. If you search for "pest control Philippines", Google will give you pest control companies in the Philippines, not pest control companies in Australia or any other part of the world.

A ccTLD plays a role in stating which search market/location your site wants to rank in. Examples of ccTLDs are websites ending in .ph, .au, etc. instead of the more neutral .com. If your website uses a ccTLD such as .ph, you can expect to rank in the Philippine version of Google but have a hard time ranking in other countries' versions. If you have a neutral TLD (.com, .org, or .net), Google will determine the country where you can be displayed based on the content you publish on your site and the locations of your inbound links.

If you already have a target country in mind but you have a neutral TLD like .com, you can set your website's target country manually in Google Search Console. Here's how:

Go to the old version of Google Search Console → Click on Search Traffic → Then click on International Targeting → Manually set your target country

This is what it should look like:

International Targeting

Note that if you have a ccTLD, you won’t be able to set your target country and this option is only available for websites that have a neutral TLD.

Structured Data Audit

We all know that Google is constantly improving its algorithms to better understand content, semantics, and the purpose of websites. This is where structured data shines: it marks up specific entities on your pages to help Google and other search engines understand them better.

The most common structured data vocabulary used by webmasters around the world comes from Schema.org. After marking up the necessary entities, you can use Google's Structured Data Testing Tool to check whether your structured data has any errors. Here's what it looks like:

Structured Data Testing Tool

After you've applied your structured data, you can use Screaming Frog to crawl your website and check whether there are any errors in its implementation, or find which pages have no structured data at all.

Structured Data

After finding the errors, apply the necessary fixes.

Internal Linking Audit

Internal links are one of the most important ingredients of SEO. Your pages should be connected to one another; otherwise, crawlers and users alike will struggle to move through your site. An internal linking audit puts user experience first, since poorly connected pages can cause your site's performance to falter. Because your website is an interconnection of pages, you have to determine the most valuable content that you want users to visit, which is why linking to these pages is one of the most important optimization tactics you can apply.

Internal links can spell the difference between a casual visitor and a lead you can convert into a customer. An internal linking audit should make you well-versed in which URLs on your site are worth pointing to. Make sure you link to the correct page version (HTTP vs. HTTPS), link with absolute URLs, and don't forget to link to canonical page versions.

Check Follow/Nofollow Links

Dofollow and nofollow links have long been a subject of debate in the SEO industry. Some say nofollow links are detrimental to SEO efforts, while others stress their importance for search engines. We stand with the latter.

A nofollow link will look like this:

<a href="" rel="nofollow">Anchor Text</a>

This instructs search engines not to follow this specific link. The rel attribute defines the relationship a page or piece of content has with the link it is attached to. Nofollow is mostly used in blog or forum comments because it renders spammers powerless; it was created to make sure link insertion is not abused by those who buy or sell links for their own gain. As a webmaster, it is your job to check your pages for these links: inspect the code and see whether each link carries the appropriate follow or nofollow attribute.
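If you'd rather script that inspection than eyeball the markup, here's a sketch using Python's built-in HTML parser that lists each anchor with its rel value. The sample HTML is illustrative, and the "follow" label for links with no rel attribute is just a convention of this sketch:

```python
# List every anchor's href and rel; links without a rel attribute are
# labeled "follow" here, since that's how search engines treat them.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # (href, rel) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            a = dict(attrs)
            self.links.append((a.get("href"), a.get("rel", "follow")))

auditor = LinkAudit()
auditor.feed('<p><a href="https://example.com" rel="nofollow">spammy comment</a> '
             '<a href="/about">About us</a></p>')
print(auditor.links)  # → [('https://example.com', 'nofollow'), ('/about', 'follow')]
```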

Check for Orphaned Pages

Orphaned pages are a big issue. These are pages with no internal links pointing to them. Since bots crawl websites through links, orphaned pages will most likely never be found by crawl bots.

SEMRush Orphaned Pages

To check for orphaned pages, you could use SEMRush's Site Audit. Orphaned pages appear in the Issues report of the audit, but you will only have access to this report if your SEMRush account is connected to your Google Analytics account.

If orphaned pages are found on your website, check whether they are important. If they are, improving them and linking to them internally should be the next step; if not, redirecting or deleting them should be enough.
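Conceptually, the check is a set difference: pages your sitemap claims exist, minus pages that any internal link actually points to. A toy sketch with placeholder URLs:

```python
# Orphans are sitemap URLs that no internal link points to.
sitemap_urls = {
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/old-landing-page/",
}
# URL targets collected from a crawl of the site's internal links
internally_linked = {
    "https://example.com/",
    "https://example.com/blog/",
}
orphans = sitemap_urls - internally_linked
print(sorted(orphans))  # → ['https://example.com/old-landing-page/']
```

SEMRush automates both the crawl and this comparison; the sketch just shows what the tool is doing under the hood.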

SSL Audit

Security for users is one of Google's priorities. In 2014, Google announced HTTPS as a ranking factor.

Secure Sockets Layer, better known as SSL, certifies that the connection between the user's browser and the web server is encrypted and private. Getting an SSL certificate is easy: you can buy one online, activate it, then install it on your server.

If you already have an SSL certificate, make sure you check its status frequently. You can use SSL Shopper or other online SSL checkers.
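Online checkers aside, you can sketch the same expiry check with Python's standard library. Here `check_certificate` opens a TLS connection to a host ("example.com" is a placeholder), and the date-parsing helper is split out so it works without a network:

```python
# A sketch using only the standard library.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(not_after):
    """Parse a certificate's notAfter string, e.g. 'Jun 26 21:41:46 2019 GMT'."""
    expires = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).total_seconds() / 86400

def check_certificate(host, port=443):
    """Connect over TLS and return the days left before the cert expires."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    return days_until_expiry(cert["notAfter"])

# check_certificate("example.com")  # needs network access
```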

Site Speed Test and Improvement

Site speed is a major ranking factor. Not only does a slow site hurt rankings directly, it also hurts user experience, which is a ranking factor in its own right.

There are a lot of site speed analysis tools online. Google’s PageSpeed Insights is a great free tool to analyze your website. You should take note that website page speed scores are different for mobile and desktop.

Even if your website loads fast for desktop users, it might not be the same for mobile users, and the recommended improvements differ as well. Make sure your website is optimized for both.
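PageSpeed Insights also exposes a public v5 API, so you can script checks for both strategies. A sketch (unauthenticated calls are rate-limited, and `fetch_performance_score` needs network access):

```python
# Sketch of the PageSpeed Insights v5 API.
import json
import urllib.request
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_url(page, strategy="mobile"):
    """Build the request URL for one page and strategy ('mobile' or 'desktop')."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page, "strategy": strategy})

def fetch_performance_score(page, strategy="mobile"):
    """Return the Lighthouse performance score (0.0-1.0) for a page."""
    with urllib.request.urlopen(psi_url(page, strategy)) as resp:
        data = json.load(resp)
    return data["lighthouseResult"]["categories"]["performance"]["score"]
```

Running it once with `strategy="mobile"` and once with `strategy="desktop"` gives you both scores for the same page.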

AMP Setup and Errors

AMP is a great and easy way to make your website mobile-friendly. Creating AMP pages is as easy as installing the AMP Project's plugin if you are using a WordPress website; it automatically creates AMP versions of your selected pages.

If your website has AMP pages, Search Console will provide a report for the AMP pages it crawled on your website. You can check it under Enhancements.


To manually check how your pages look in their AMP version, just add /amp to the end of any URL. You can do this even on a desktop.
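That manual check is easy to script as well. A sketch assuming the plugin's default /amp URL pattern and a hypothetical example.com URL:

```python
# Build the AMP URL for a page; /amp is the plugin's default suffix.
def amp_url(page_url):
    return page_url.rstrip("/") + "/amp"

print(amp_url("https://example.com/blog/seo-audit/"))
# → https://example.com/blog/seo-audit/amp
```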

Key Takeaway

In order to stay afloat in the world of SEO, you have to be mindful enough to perform all of the optimization practices found here. If you fail to do so, you are not going to harvest the growth that you seek. Might as well just give up on being a webmaster and be something else entirely.

There is a myriad of search algorithm updates, erratic market trends, and growing competition, all the more reason for you to always be on the move. With the help of the different tools that are just a Google search away, all of this can be done in a snap. If you commit to these practices, maintaining your SEO rankings will feel like a light feather on your workload.

Is your website optimized well? What other SEO elements do you review daily?

SEO Audit 2019: A Comprehensive Guide was originally posted by Video And Blog Marketing