Month: February 2020

SEO Spiders: What are Search Engine Crawl Spiders & How Do They Work?

SEO

There are spiders on your website.

Don’t freak out! I’m not talking about real eight-legged spiders. I’m referring to Search Engine Optimization spiders. They’re the bots that make SEO happen. Every major search engine uses spiders to catalog the perceivable internet.

It is through the work of these spiders, sometimes referred to as crawl spiders or crawlers, that your website is ranked on popular search engines like Google, Bing, Yahoo, and others.

Of course, Google is the big dog of the search engine world, so when optimizing a website, it’s best to keep Google’s spiders in mind most of all.

But what are search engine crawl spiders?

The crux of it is simple: In order to rank highly on search engine results pages, you have to write, design, and code your website to appeal to these spiders.

That means you have to know what they are, what they’re looking for, and how they work.

Armed with that information, you’ll be able to optimize your site better, knowing what the most significant search engines in the world are seeking.

Let’s get into it.

What Are Search Engine Spiders?

Before you can understand how a web crawler works and how you can appeal to it, you first have to know what it is.

Search engine spiders are the foot soldiers of the search engine world. A search engine like Google has certain things that it wants to see from a highly ranked site. The crawler moves across the web and carries out the will of the search engine.

A crawler is simply a piece of software guided by a particular purpose. For spiders, that purpose is the cataloging of website information.


Google’s spiders crawl across websites, gathering and storing data. They have to determine not only what the page is but the quality of its content and the subject matter contained within.

They do this for every site on the web. To put that in perspective, there are 1.94 billion websites active as of 2019, and that number rises every day. Every new site that pops up has to be crawled, analyzed, and cataloged by spider bots.

The search engine crawlers then deliver gathered data to the search engine for indexing. That information is stored until it is needed. When a Google search query is initiated, the results and rankings are generated from that index.

How Does a Crawler Work?

A crawler is a complicated piece of software. It has to be if it’s going to catalog the entire web. But how does this bot work?

First, the crawler visits a web page looking for new data to include in the search engine index. That is its ultimate goal and the reason for its existence. But a lot of work goes into this search engine bot’s task.
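To make the visit-and-follow pattern concrete, here is a toy sketch of a crawl loop in Python. The “web” below is an invented in-memory stand-in; a real spider fetches pages over HTTP and obeys robots.txt, but the core loop looks roughly like this:

```python
from collections import deque

# A tiny in-memory "web": each URL maps to the links found on that page.
# A real spider would fetch and parse these pages over HTTP instead.
web = {
    "example.com/":            ["example.com/about", "example.com/blog"],
    "example.com/about":       ["example.com/"],
    "example.com/blog":        ["example.com/blog/post-1"],
    "example.com/blog/post-1": [],
}

def crawl(start):
    """Breadth-first crawl: visit each page once, queueing new links."""
    seen = {start}
    queue = deque([start])
    crawled = []
    while queue:
        url = queue.popleft()
        crawled.append(url)            # "catalog" the page
        for link in web.get(url, []):  # follow links to new pages
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return crawled

print(crawl("example.com/"))
```

Every page reachable from the start URL is visited exactly once, which is why internal links matter so much: a page no link points to never enters the queue.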

Step 1: Spiders Check Out Your Robots.txt File

When Google’s spiders arrive at a new website, they immediately download the site’s robots.txt file. The robots.txt file gives the spiders rules about what pages can and should be crawled on the site. It also lets them look through sitemaps to determine the overall layout of the pages and how they should be cataloged.
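As an illustration, a minimal robots.txt might look like this (the paths and domain here are invented for the example):

```
# Applies to all crawlers
User-agent: *
# Keep these sections out of the crawl
Disallow: /admin/
Disallow: /drafts/

# Tell spiders where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```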


Robots.txt is a valuable piece of the SEO puzzle, yet it’s something that a lot of website builders don’t give you direct control over. There are individual pages on your site that you might want to keep from Google’s spiders.

Can you block your website from getting crawled?

You absolutely can, using robots.txt.
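If you want to sanity-check how a crawler would interpret your rules, Python’s standard urllib.robotparser module applies the same matching logic that well-behaved bots use. A small sketch with made-up rules:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, parsed directly from a string
# (a real bot would fetch them from https://example.com/robots.txt).
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant crawler checks before fetching each URL.
print(parser.can_fetch("*", "https://example.com/about"))      # True
print(parser.can_fetch("*", "https://example.com/private/x"))  # False
```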

But why would you want to do this?

Let’s say you have two very similar pages with a lot of duplicate content. Google hates duplicate content, and it’s something that can negatively impact your ranking. That’s why it’s good to be able to edit your robots.txt file to blind Google to specific pages that might have an unfortunate effect on your SEO score.

Google is super particular about things like duplicate content because its business model is dedicated to providing accurate, quality search results. That’s why its search algorithm is so advanced. If it provides the best information possible, customers will continue to flock to its platform to find what they’re looking for.


By delivering quality search results, Google attracts consumers to its platform, where it can show them ads (which are responsible for 70.9% of Google’s revenue).

So, if you think that the spiders are too critical of things like duplicate content, remember that quality is the chief concern for Google:

  • Quality suggestions lead to more users.
  • More users lead to increased ad sales.
  • Increased ad sales lead to profitability.

Step 2: Spiders Check Your Links

One major factor that spiders home in on is linking. Spiders can not only recognize hyperlinks but follow them as well. They use your site’s internal links to move around and continue cataloging. Internal linking is essential for a lot of reasons, and it also creates an easy path for search bots to follow.
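Link recognition itself is mechanical: the spider parses the page’s HTML and collects every href it finds. A minimal sketch using Python’s standard html.parser (the page below is invented):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page -- the raw
    material a crawler uses to decide where to go next."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real crawler the HTML would be fetched over HTTP; here we use
# a hard-coded page to keep the sketch self-contained.
page = """
<html><body>
  <a href="/services">Services</a>
  <a href="/blog/seo-tips">SEO Tips</a>
  <a href="https://example.com/partner">Partner</a>
</body></html>
"""

extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)
```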

Spiders will also take careful note of your outbound links, along with which third-party sites are linking to yours. When we say that link building is one of the most critical elements of an SEO plan, we’re telling the truth. You have to create an internal web of links between your pages and blog posts. You also have to make sure you’re linking to outside sources.

But beyond all of that, you have to make sure that external sites that are in high favor with Google and relevant to your site are linking to you.


As we mentioned in the last section, Google needs to know that it is giving high-quality and legitimate suggestions to searchers in order to maintain its dominance and, by extension, profitability.

When a site links to you, think of it as a letter of recommendation. If you’re applying for a job as a nurse, you will come prepared with letters of recommendation from previous hospital administrators and medical professionals with whom you’ve worked.

If you show up with a short letter from your mailman and your dog groomer, they may have beautiful things to say about you, but their word is not going to carry a lot of weight in the medical field.

 

SEO is a job interview with Google.

You’re interviewing for the top spots in your industry every second that you’re online. Google’s spiders are the HR representatives conducting the interview and checking your sources before reporting back to their higher-ups and deciding your eligibility.

Step 3: Spiders Check Your Copy

A common misconception about search engine spiders is that they just come onto the page and count all of your keywords.

While keywords play a part in your rank, spiders do a lot more than that.

SEO is all about tweaks to your copy. Those tweaks are made in an attempt to impress Google’s spiders and give them what they’re looking for.


But what are search engine spiders looking for when they review your website copy?

They’re trying to determine three key factors.

  1. The relevance of your content. If you’re a dental website, are you focusing on dental information? Are you getting off-topic on random tangents or dedicating areas of your site to other unrelated themes? If so, Google’s bots will become confused as to how they should rank you.
  2. The overall quality of your content. Google spiders are sticklers for quality writing. They want to make sure that your text is in keeping with Google’s high standards. Remember, Google’s recommendation carries weight, so it’s not just about how many keywords you can stuff into a paragraph. The spiders want to see quality over quantity.
  3. The authority of your content. If you’re a dental website, Google needs to make sure that you’re an authority in your industry. If you want to rank number one for a specific keyword or phrase, then you have to prove to Google’s spiders that you are the authority on that particular topic.

If you include structured data, also known as schema markup, in the code of your site, you’ll earn extra points with Google’s spiders. This markup gives the spiders more information about your website and helps them list you more accurately.
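As a sketch of what schema markup looks like in practice, here is a minimal JSON-LD block for a hypothetical dental practice, placed in the page’s HTML; every business detail below is invented for illustration:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Dental Care",
  "url": "https://www.example.com",
  "telephone": "+1-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main St.",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```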

It’s also never a good idea to try and trick Google’s spiders. They’re not as dumb as a lot of SEO marketers seem to think. Spiders can quickly identify black-hat SEO tactics.

Black-hat SEO encompasses unethical tactics used to try to trick Google into giving a site a higher ranking without creating quality content and links.

An example of a black-hat SEO tactic is keyword stuffing, where keywords are piled nonsensically into a page. Another tactic black-hat SEO firms use is creating backlinks through dummy pages that contain a link back to your site.

And a decade ago, these tactics worked. But since then Google has gone through many updates, and its spider bots are now capable of identifying black hat tactics and punishing the perpetrators.

Spiders index black-hat SEO information, and penalties can be issued if your content is proven to be problematic.

These penalties can be something small yet effective, like downranking the site, or, something as severe as a total delisting, in which your site vanishes from Google altogether.

Step 4: Spiders Look At Your Images

Spiders will take an accounting of your site’s images as they crawl the web. However, this is an area where Google’s bots need some extra help. Spiders can’t just look at a picture and determine what it is. They understand that an image is there, but they are not advanced enough to get the actual context.

That’s why it’s so important to have alt tags and titles associated with every image. If you’re a cleaning company, you likely have pictures showing off the results of your various office cleaning techniques. Unless you specify that the image is of an office cleaning technique in the alt tag or title, the spiders aren’t going to know.
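In HTML, that description lives in the image’s alt attribute. A hypothetical example for the cleaning-company scenario above:

```html
<!-- Without alt text, spiders only know an image exists -->
<img src="office-after.jpg">

<!-- With alt text, they know what it shows -->
<img src="office-after.jpg"
     alt="Conference room after a deep office cleaning">
```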

Step 5: Spiders Do It All Again

A Google spider’s job is never done. Once it is finished cataloging a site, it moves on and will eventually recrawl your site to update Google on your content and optimization efforts.

These bots are continually crawling to find new pages and new content. You can indirectly determine the frequency with which your pages are recrawled. If you’re regularly updating your site, you’re giving Google a reason to catalog you again. That’s why consistent updates (and blog posts) should be a part of every SEO plan.

How Do You Optimize Your Site for SEO Spiders?

To review, there are several steps that you can take to make sure that your site is ready for Google’s spiders to crawl.

Step 1: Have a clear site hierarchy

Site structure is crucial to ranking well in the search engines. Making sure pages are easily accessible within a few clicks allows crawlers to access the information they need as quickly as possible.

Step 2: Do Keyword Research

Understand what kind of search terms your audience is using and find ways to work them into your content.

Step 3: Create Quality Content

Write clear content that demonstrates your authority on a subject. Remember not to keyword stuff your text. Stay on topic and prove both your relevance and expertise.

Step 4: Build Links

Create a series of internal links for Google’s bots to use when making their way through your site. Build backlinks from outside sources that are relevant to your industry to improve your authority.

Step 5: Optimize Meta Descriptions and Title Tags

Before a web crawler makes its way to your page’s content, it will first read through your page title and metadata. Make sure that these are optimized with keywords. The need for quality content extends here as well.
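Both elements live in the page’s head. A hypothetical example for a dental practice (all names and copy below are invented):

```html
<head>
  <!-- The title tag: the clickable headline in search results -->
  <title>Teeth Whitening in Springfield | Example Dental Care</title>

  <!-- The meta description: the snippet shown under the title -->
  <meta name="description"
        content="Professional teeth whitening from Example Dental Care.
                 Book a consultation with our Springfield dentists today.">
</head>
```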

Step 6: Add Alt Tags For All Images

Remember, the spiders can’t see your images. You have to describe them to Google through optimized copy. Use up the allowed characters and paint a clear picture of what your pictures are showcasing.

Step 7: Ensure NAP Consistency

If you’re a local business, you have to make sure that your Name, Address, and Phone Number not only appear on your site and throughout various third-party platforms, but that they are consistent everywhere. That means that no matter where you’re listing a NAP citation, the information should be identical.


That also applies to spelling and abbreviations. If you’re on Main Street, but you want to abbreviate to Main St., make sure you’re doing that everywhere. A crawler will notice inconsistencies, and it will hurt your brand legitimacy and SEO score.

Step 8: Update Your Site Regularly

A constant stream of new content will ensure that Google always has a reason to crawl your site again and update your score. Blog posts are a perfect way to keep a steady stream of fresh content on your website for search engine bots to crawl over.

In Conclusion

A strong understanding of SEO spiders and search engine crawling can have a positive impact on your SEO efforts. You need to know what they are, how they work, and how you can optimize your site to fit what they’re looking for.

Ignoring SEO spider crawlers can be the fastest way to ensure that your site wallows in obscurity. Every query is an opportunity. Appeal to the crawlers, and you’ll be able to use your digital marketing plan to rise up the search engine ranks, achieving the top spot in your industry and staying there for years to come.

SEO Spiders: What are Search Engine Crawl Spiders & How Do They Work? was originally posted by Video And Blog Marketing

Optimize SEO: Almost a Decade into the Panda Update

Last updated on


Google has had its fair share of makeovers over the years, which both seasoned and rookie SEO players have experienced. As the Panda update celebrates its 9th birthday, it’s worth looking back at how far it has come since its conception nine years ago.

February 24, 2011, marks the date that Panda stirred the SEO world. Black- and white-hat strategists alike restructured how they present their sites through their content. I remember the days when updates such as Panda or Penguin kept us on our toes about what our next move would be.

This is what makes SEO exciting: it keeps your mind sharp as you think of strategies and innovations that can help you build your visibility on the search engines. It is arguably harder to do SEO today with all the updates sprouting up, but in the end, SEO is still a zero-sum game, so you will be rewarded if you play it right. Having said that, let’s take a look back at how the Panda update changed SEO as we see it today.

What is the Panda update?

Panda was launched for the purpose of hunting down content farms as they grew in number. Google decided that its quality should not be compromised by this unethical method of publishing content. Its team branded the update as a way to find higher-quality sites.

The algorithmic improvement honored sites with original content and highly relevant branding. Furthermore, it tackled trust and authority. Looking at it now, this must have served as a sneak peek at the E-A-T metric for sites.

Before it was known around the globe as the “Panda” update, it was first called “Farmer”, a subtle homage to the content farms that were dramatically hurt by it. To rehash how the update shook most webmasters’ world back then: it is known as the major algorithm update that affected 12% of English search results.

That is, by and large, a heavy percentage of the listings. Businesses shut down as they were flagged for using unethical techniques to increase their sites’ visibility. There were many casualties, but this only proved that there is a right way to do SEO.

How did it affect our team?

We are a decade strong, just like the Panda update, and I have witnessed how it improved the search engine by a mile. The team I lead has always practiced what’s best for our clients, with ethical and powerful strategies that help them secure a position on the first page.

The Panda update is a blessing to us because it has helped us rise through the ranks as a team that respectably practices SEO. Personally, my blog did not suffer from this update, but we felt its effect briefly, as there was a slight dip in traffic after February 24.

Fortunately, it picked right back up again. Not bad for a site that’s not even a year old by this date.


Here is a comparison of our traffic from February to March:


Traffic was steady; February only lagged behind because it has fewer days. This is no surprise, since I have been very strict about the articles I write ever since. I make sure that the blog’s content is relevant to readers and gives them value that can help them with their goals.

The Panda update proved to be an advantage for us because it went on to help us achieve an approximately 100% increase in traffic as its version 4.0 rolled out. Practicing ethical methods to win at SEO will help you stay afloat even if you are vulnerable to getting hit by an update. It pays to be a good guy and the Panda update is a testament to that.

Optimizing for the Panda Update in 2020

There’s no such thing as stagnant SEO. It is a common misconception that optimization is a passive chore. To help you stay on the winner’s side, you must adapt and work hard together with the updates that come from the system. More than that, optimize content for user experience and it will help you as much as you are helping your audience with the content you are generating. Here are some tips to help you stay on top of things in relation to the Panda update:

Design and develop a great website

You do not need an extravagant web design; you just need a clean one. Your content would go to waste if you do not have the right structure for it. Begin by having a clear vision of how your site will look as you insert your content. Although you could call this a vanity metric, having a clean and well-maintained site helps improve the user experience of your site visitors.

Publish original content

That’s all there is to it, really. Everything goes back to content. Creating content just for the heck of traffic will bite you in the butt when you’re least expecting it. Although we have algorithm updates to guide us on the right path, always keep in mind that you should create content for people. Provide useful, evergreen, unique, and quality content and be of value to your audience. Beware of risky content strategies that veer away from helping your site build up authority against competitors.

Focus on specific topics

It is easy to fall prey to writer’s block if you focus on a distinct topic but you should keep in mind that this will help you build authority for your site. One thing that the Panda update tackled is content focus. I recommend that you plan around the content themes that you know will give value to your readers.

Key Takeaway

It’s nice to look back on this and see how far SEO has come since its early days. Today, I am leading a team of over 50 people, which is also a milestone since SEO Hacker started as a one-man team. We have come a long way from having a few thousand readers to an average of 60,000 users. This is the fruit of the team’s labor in satisfying user intent and optimizing according to Google’s standards. With powerful strategies that deliver results, we will be prepared for more updates in the years to come.

Optimize SEO: Almost a Decade into the Panda Update was originally posted by Video And Blog Marketing

Liquid Web Review: The Best Fully Managed Web Hosting Provider

Last updated on

We’ve all been there: an incompetent or below-average hosting provider that did not perform up to par, leading to website malfunctions or an intense slowdown in page speed, both detrimental to SEO and your business website’s growth. Every minute that a website is down due to server problems, you lose traffic, rankings, and profit.

When looking for a hosting provider, there is a vast choice of companies that offer and promise the same things: 100% uptime and 24/7 support. That is, until all sorts of problems come up and it takes days before you even get an initial response.

For this post, I’ll be reviewing the hands-down best hosting provider I’ve ever had: Liquid Web. I’ve tried multiple hosting services in my decade of experience, and thus far I’ve had the best experience with Liquid Web. I’ve been a long-time user; in fact, I’ve used it for both my websites and my clients’ websites. Liquid Web offers fully managed web hosting, and they are the best at it.

Before going into this review, I highly recommend that you give Liquid Web a visit and see what they offer. They have solutions that will fit your needs whether you are a freelancer, agency, or business owner.

Liquid Web: Fully Managed Hosting Provider

Liquid Web provides fully managed web hosting services and has been in business since 1997. They have more than 25,000 servers hosting more than 30,000 customers worldwide. They own three private core data centers: US-Central in Michigan, US-West in Arizona, and EU-Central in Amsterdam. Each data center has on-site security and highly trained technicians.

What is Fully Managed Hosting?

Fully managed hosting means the hosting company sets up everything for the customer, from the operating system, network infrastructure, hardware management, and security scanning to 24/7 customer support and other additional services.

Fully Managed Hosting vs Unmanaged Hosting

Unmanaged hosting is like a DIY option. You just pay for the server and everything else is done manually by you. So if you don’t have any experience with IT, you’ll have to hire someone else to do it for you.

However, with fully managed hosting, you don’t have to go through any of that. To put it simply, you pay for the services and the company takes care of everything for you, so you can spend your precious time doing something else for your business.

What Makes Liquid Web Special?

When it comes to hosting services, I always look for companies that I can trust. As an SEO agency with multiple clients and hundreds of tasks every day, we have to trust that our hosting provider will take care of everything for us to save us time. And that’s what Liquid Web has been doing for us for a long time already.

High Performance and 100% Uptime

Liquid Web promises high-performance managed web hosting and 100% uptime. Currently, we are using Liquid Web’s dedicated server plan and we are hosting 50+ high-traffic websites on it. Everything is running smoothly and I could easily say we are getting what we paid for.

24/7/365 Customer Support

One of the things that I absolutely love about Liquid Web is their 24/7/365 live chat support. They brand their customer support team as the “Most Helpful Humans in Hosting,” and I couldn’t agree more.

Liquid Web has a team of more than 250 highly-trained, knowledgeable, and friendly people who are ready to assist should you encounter any problems. They are Windows, Cisco, and Red Hat Linux certified so you are guaranteed to talk with someone who knows their stuff.

They have three options for support: phone call, live chat, and their ticketing system through email. For live chat, I usually get a response in less than 5 minutes at any time of the day. And for tickets, I get a response within 30 minutes after I send a request. This is REALLY good and an industry best.

For comparison, most hosting companies outsource their support. I have talked with support reps who had inadequate knowledge and would give generic, scripted answers.

Free Migration

I initially hosted just SEO-Hacker.com on Liquid Web to test the waters. And when we decided to transfer all of our sites, their team assisted us all the way in migrating from our old host. The best part is that it’s free.

Website Security

Prior to Liquid Web, our websites were regularly hit by cyber attacks, and rather than focusing on developing and improving the websites, our team would spend hours cleaning up malware. Liquid Web guarantees protection from cyber threats and regularly scans for vulnerabilities. This is crucial because I have seen websites get penalized due to hacked content, and that is an SEO’s nightmare.

Managed WordPress and WooCommerce

If you are building your website on WordPress or setting up an e-commerce site on WooCommerce, Liquid Web offers complete solutions for that.

Liquid Web’s Managed WordPress plan offers no traffic limits, so you don’t have to worry about people slowing down your website, plus automatic SSL and daily backups. They also provide full server access, giving you full control of your website.

For WooCommerce solutions, Liquid Web gives you access to thousands of themes and the flexibility to make your site look great, plus it’s mobile-optimized. Liquid Web also makes it easier to customize product categories and pages so you don’t waste that precious crawl budget. And since WooCommerce is built on WordPress, everything is SEO-friendly.

The best thing here, in my opinion, is speed. Liquid Web’s platform is built on PHP 7, SSL, and Nginx, which optimizes WordPress speed to its full potential. Website speed is one of the most important ranking factors for SEO, and it keeps users from leaving before the page even loads.

Liquid Web’s Products

Unlike other hosting companies, Liquid Web does not offer shared hosting plans, only advanced plans. They also run their servers on SSDs (Solid State Drives). These are more expensive than HDDs (Hard Disk Drives), but SSDs run faster and deliver better performance. Liquid Web offers:

  • Dedicated Servers
  • Virtual Private Servers
  • Cloud Dedicated Servers
  • Managed WordPress
  • Managed WooCommerce

Each product has pricing options depending on your needs and you save more with annual plans than monthly plans. To get more information about the products they offer and find what suits your needs, take a look at Liquid Web’s Hosting Plans.

Final Thoughts 

When it comes to hosting services, I would never settle for less; I always want the best. As an SEO, I depend heavily on my web hosting company to make sure that our clients’ websites are up at all times. Liquid Web has been the most reliable hosting company I have tried. If your hosting company is giving you headaches with server problems and unresponsive customer support, I highly recommend that you check out Liquid Web and let the folks there do the work for you.

Liquid Web Review: The Best Fully Managed Web Hosting Provider was originally posted by Video And Blog Marketing

Get to Yes: Three conversion lessons learned from FBI hostage negotiation

(This article was originally published in the MarketingExperiments email newsletter.)

“When it comes to data, simplify. Become an essentialist. Get beneath the data to the essence. Look for patterns. Patterns transform info into wisdom.”

— Flint McGlaughlin, Managing Director, MECLABS Institute

Not too long ago, the aunt of a prominent Haitian political figure was kidnapped and held for a $150,000 ransom. Hostage negotiator Chris Voss managed to convince the kidnappers to release the hostage for just $1,471 and a CD player. It’s a pretty incredible (true) story.

So how did he do it?

In this video, Flint McGlaughlin explains Voss’s secret, pointing out three lessons that business owners, marketers, and anyone else who needs to negotiate a deal can learn to increase conversion.

Remember, the goal of a webpage is to win a yes. A sale is simply a big yes: the sum total of a series of smaller yeses the customer makes while interacting with your business.

Just as Voss knew that one small “no” in the negotiation process could put the safety of the hostage in danger, we should be aware that a single “no” anywhere in the customer journey can jeopardize the sale.

Watch the video to get the macro YES.

If you would like your own webpage diagnosed on one of our upcoming YouTube Live sessions, you can send your website info through this form, and we’ll try to fit it in.

Related Resources

The Hypothesis and the Modern-Day Marketer 

The Power of Perceived Value: Discover how a well-marketed banana & roll of tape produced a windfall

Website Development: How a small natural foods CPG company increased revenue 18% with a site redesign

MECLABS Quick Win Consult: Get personalized, detailed conversion marketing advice at an affordable price

Get to Yes: Three conversion lessons learned from FBI hostage negotiation was originally posted by Video And Blog Marketing

What is SEO? The Beginner’s Guide to Search Engine Optimization

Guide

What is SEO?

SEO (Search Engine Optimization) is the marketing process to increase visibility within the organic search results, for the specific purpose of obtaining more traffic for a website.

How SEO Works

Every major search engine such as Google, Bing, and Yahoo provides a listing of websites that are deemed to best answer a search user’s query. Those listings are called search results. Within those search results you will find paid listings, but the bulk of the results are made up of what are called organic listings. The organic results are the free listings that the search engines have found, through their analysis, to be the most relevant to the user’s query.

Being listed at the top in the organic listings is at the root of why SEO is so important. Organic results currently receive a significantly greater share of traffic in comparison to paid ads. So, if your business isn’t being found in the organic listings, you are missing out on a large percentage of the consumer market.

Getting to the top of the results is where search engine optimization comes in. There are numerous strategies and techniques that go into creating a relevant and optimized website to truly compete for the keywords that can generate revenue for your business.

See Also:  Why is SEO Important?

How to Get Started with SEO

Basic SEO isn’t hard to get started with; it just takes time and plenty of patience. Web pages don’t typically rank overnight for competitive keywords. If you are willing to create a solid foundation of optimization for your website, and put in the time to build upon it, you will achieve success and grow your organic traffic. Businesses that cut corners may achieve short-term results, but they almost always fail in the long term, and it is very difficult for them to recover.

So how do you go about optimizing your website? You must first understand how the search engines work as well as the various techniques that make up SEO. To make it easier to navigate, we have linked each section below so that you can jump to the section that interests you.

How Does Google Rank Websites?

The search engines use automated robots, also known as web crawlers or spiders, to scan and catalog web pages, PDFs, image files, etc., for possible inclusion in their massive indexes. From there, each web page and file is evaluated by programs called algorithms that determine whether it offers enough unique value to be included in the index.

If a file is deemed valuable enough to be added to the index, it will only be displayed in search results that the algorithms have determined are relevant and meet the intent of the user’s query. This is the true secret to the success of the search engines. The better they answer users’ queries, the more likely it is that users will return in the future. It is generally accepted, as evidenced by the sheer number of its users, that Google delivers more relevant results because of its highly sophisticated algorithms, which it is constantly improving. In an effort to maintain its competitive advantage, it is believed that Google makes 500-600 updates to its algorithm each year.
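The index described above behaves, at its core, like an inverted index: a map from each term to the documents containing it. A toy sketch of the idea in Python (the pages are invented, and real search indexes also store positions, weights, and freshness data):

```python
from collections import defaultdict

# Three tiny "web pages" standing in for a crawled corpus.
pages = {
    "example.com/dental": "teeth whitening and dental implants",
    "example.com/seo":    "seo tips for dental websites",
    "example.com/blog":   "how search engines rank websites",
}

# Build the inverted index: term -> set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for term in text.split():
        index[term].add(url)

def search(query):
    """Return the pages containing every term in the query."""
    results = set(pages)
    for term in query.split():
        results &= index.get(term, set())
    return sorted(results)

print(search("dental"))           # pages mentioning "dental"
print(search("dental websites"))  # intersection of both terms
```

Because the index is built once at crawl time, answering a query is just a set intersection, which is why search engines can respond in milliseconds without re-reading the web.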

User intent is playing a greater role in how search engines rank web pages. For example, if a user is searching for “SEO companies,” is the user looking for articles on how to start an SEO company, or are they looking for a listing of SEO companies that provide the service? In this case, the latter is more likely. Even though there is a small chance that the user may be looking to start an SEO business, Google (with their expansive data on user habits) understands that the vast majority will be expecting to see a listing of companies. All of this is built into Google’s algorithm.

Search Engine Ranking Factors

As stated previously, these algorithms are highly sophisticated, so there are numerous factors that these programs consider when determining relevance. In addition, Google and the other search engines work very hard to prevent businesses from “gaming” the system and manipulating the results.

So how many ranking factors are there? Well, back on May 10, 2006, Google’s Matt Cutts declared that there are over 200 search ranking signals that Google considers.

Then, in 2010, Matt Cutts stated that those 200+ search ranking factors each had up to 50 variations. That would bring the actual number to over 1,000 unique signals. Even though there are so many signals, they don’t each carry an equal amount of weight.

For websites wanting to rank for nationally competitive keywords, the signals that matter most are related to on-page, off-page, and penalty-related factors. On-page optimization is related to content; for a website to have a chance to rank for competitive search phrases, the content needs to be very relevant and answer the search user’s query. Off-page factors are related to the link popularity of the website and how valuable authoritative external sources find your content. As for the penalty-related factors, if you are caught violating Google’s Webmaster Guidelines, you are not going to be competitive at all.

For websites that are looking to rank in their local market, the factors are very similar except for the addition of Google My Business, local listings, and reviews. Google My Business and citations help verify the actual location of the business and its service area. Reviews assist in determining the popularity of that local business.

On-Page SEO

On-page SEO is the optimization of the HTML code and content of the website as opposed to off-page SEO, which pertains to links and other external signals pointing to the website.

The overall goal behind on-page SEO is to make sure that the search engines can find your content, and that the content on the web pages is optimized to rank for the thematically relevant keywords that you would like to target. You have to go about things the right way, however. Each page should have its own unique theme, and you should not force optimization onto a web page. If the keywords don’t match the theme of your product or service, you may need to create a different page just for that particular set of keywords.

If you optimize the title, description, and headings appropriately while creating content, the end user will have a significantly better experience on your website, as they will have found the content that they were expecting to find. If you misrepresent the content in the title tag and description, it will lead to higher bounce rates, as visitors will leave quickly when they don’t find what they expect.

That is the high-level overview of on-page SEO. There are several key topics regarding on-page optimization that we are going to cover, including HTML coding, keyword research, and content optimization. These are foundational items that need to be addressed to have a chance at higher rankings in the search engines.

Website Audit

If your website is having difficulty ranking, you may want to start with an SEO audit. You need a 360-degree analysis to identify your website’s shortcomings so that you can correct them and put your site back on a more solid foundation. An audit can help identify glitches or deficiencies, both on and off page, that may be holding your website back from ranking for your targeted keywords.

There are SEO crawling tools that can help you identify basic technical issues fairly easily, making that part of the auditing process much more efficient. An audit typically would not stop at a technical analysis. Content and the link profile of a website also play a significant role in the ranking ability of a website. If you don’t have quality content or authoritative links, it will be difficult to rank for highly competitive search phrases.

See Also:  Technical SEO Audit Process + Checklist

HTTPS Security

Google, in an effort to create a safer and more secure web experience for their search users, has been pressing webmasters to secure their websites. Safe experiences are important for Google’s search engine. If search users have confidence in the safety of the results being displayed, they will be more likely to continue using Google’s search engine in the future.

While Google might have a good reason to encourage webmasters to secure their sites, webmasters have just as much incentive. When people are purchasing online, potential customers are going to feel much more confident completing a transaction with a site that is secure versus one that isn’t.

To strongly encourage webmasters to secure their websites, Google has integrated SSL as a search ranking factor. Websites that do have SSL certificates will be favored slightly above those that do not. It isn’t a big factor, but enough that if all other factors are equal, the site that is secured will win out.

Mobile-Friendly Design

Over 60% of all search queries are conducted on a mobile device now. Google is working to create the best possible search experience; therefore, they need to give preference to websites that are improving their mobile usability.

Providing a better mobile experience can be done in several ways. The most common method is a responsive design that adjusts to the size of the browser it is being viewed from. Other sites serve a mobile-specific version of their web pages when the server detects that the user is on a mobile browser.
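A responsive design typically starts with the viewport meta tag in the page’s head section. A minimal sketch (the tag itself is standard; the rest of a responsive layout lives in your CSS):

```html
<!-- Tells mobile browsers to render at the device's width
     instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Without this tag, mobile browsers typically fall back to a desktop-width layout, which mobile-friendliness tests will usually flag.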

The speed at which mobile pages load is also important. Google backs a project called Accelerated Mobile Pages (AMP) and, in 2016, they integrated AMP listings into their mobile search results.

Taking into consideration that the majority of searches are now mobile based, Google is creating a mobile-first index. This will become Google’s primary index of web pages, and it has been rolling out slowly since the end of 2017. Should you be concerned if your site isn’t mobile friendly? Maybe not in the short term, but in the coming months, yes. Websites that provide a good mobile experience will have a greater ability to rank higher in the search results. This isn’t just Google, however. Bing is moving in this direction as well.

See Also:  How Google’s Mobile-First Index Works

Website Page Speed

Page speed is the length of time it takes for a web page to load. With Google’s mobile-first index, page speed is going to play a greater role as a ranking factor.

Once again, it is about the user’s experience. Visitors to a website simply don’t want to wait for pages to load. The longer it takes for your web page to load, the greater the chance that the visitor will leave (bounce) and visit your competitor’s site. A Google study found that as page load time increases from one second to five seconds, the probability that the visitor will bounce increases by 90%.

You need to pay attention to the rate at which visitors bounce off your website, as this is also a ranking factor. Not only that, those are lost revenue opportunities. Google cares about these metrics because they need to keep their search users happy with the results being displayed.

There are several things that you can do to improve page speed. Some of these items include the following:

  • Enabling file compression
  • Minifying CSS, JavaScript, and HTML
  • Reducing page redirects
  • Removing render-blocking JavaScript
  • Leveraging browser caching
  • Improving server response time
  • Using CDNs (Content Delivery Networks)
  • Optimizing image file sizes
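As one illustration of the render-blocking JavaScript item above: a script referenced in the head without the defer or async attribute halts HTML parsing while it downloads and executes. A sketch (the file path is hypothetical):

```html
<!-- Render-blocking: parsing stops until app.js is fetched and run -->
<script src="/js/app.js"></script>

<!-- Non-blocking: app.js downloads in parallel and runs
     after the document has been parsed -->
<script src="/js/app.js" defer></script>
```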

Website Crawlability

Search engine robots visit your website and make copies of each page to consider for inclusion in their index. If they deem the content valuable, they will include it in their index so that their search users can find it. That is why it is important to make sure that the search engines can easily find the pages within your website. If they can’t find your content, it can’t be included in their index for others to find.

There are technical problems that can prevent search robots from crawling your website, though most websites don’t experience these issues. One issue is that some websites use JavaScript and Flash in ways that can hide links from the search engines. When a robot crawls a website, it moves around by following the links found within each web page. If it doesn’t see a link, it won’t crawl the page that link points to, meaning that page won’t be considered for indexing. Another common crawling issue is the accidental inclusion of a “noindex” tag on a web page, or the blocking of entire folders of a website from being crawled through the robots.txt file.
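For reference, the two accidental blocks described above look like this in practice (the folder path is hypothetical):

```html
<!-- A stray noindex tag: crawlers can visit the page,
     but it will be kept out of the index -->
<meta name="robots" content="noindex">
```

```
# robots.txt — this rule blocks all crawlers from an entire folder
User-agent: *
Disallow: /private-folder/
```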

One way to help ensure your content is seen is by creating a sitemap.xml file. This file lists out the various URLs for a website, making it easier for the search engine robots to find the content of your website. Sitemap files are especially helpful for websites with thousands of pages, as links to some content may be buried deep within your website and harder for the search engines to find right away. As a general rule, you should update this file whenever new pages are added.
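A minimal sitemap.xml, following the sitemaps.org protocol, looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blue-shoes/</loc>
    <lastmod>2020-02-01</lastmod>
  </url>
  <!-- One <url> entry per page you want the search engines to find -->
</urlset>
```

You can then reference the file in robots.txt or submit it through Google Search Console.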

Avoiding Page & Content Duplication

Duplicate content can harm the ranking ability of some of the most important pages of your website. When Google is looking at the pages of your website and finds nearly identical content between two or more pages, they may have difficulty prioritizing which one is more important. As a result, it is very possible they will all suffer in the rankings.

Here are some unintentional, yet common causes of content duplication:

  • WWW and Non-WWW versions resolving: If the website resolves for both versions instead of one redirecting to the other, Google may index both versions. Google has done a much better job about handling this, but it does remain a possibility. You can solve this issue by ensuring that you have proper 301 redirects in place.
  • Pagination: This is a very common ecommerce SEO issue, but it happens with other types of sites as well. Let’s say that you have a website that sells shoes with a category called “blue shoes.” You may sell hundreds of different “blue shoes” and you don’t want to show any more than 20 different styles at a time. In this instance, you would probably have multiple pages that you can click through. As you click, you notice you are going to a web page with a URL of “blue-shoes/page2/”. That is called pagination. If not handled properly, Google may index every single page, each consisting of nearly identical content, when in reality you want the main category to be the ranking page within the search results. You can solve this problem by using rel=canonical tags.
  • URL Parameters: Staying with our “blue shoes” example, you may want to see what is available in a size 12. When you sort by size, you stay on the page, but you notice the URL now says “blue-shoes?size=12”. The “size=12” is a URL parameter. Sometimes these parameterized URLs are indexed. This is another very common occurrence for ecommerce websites. You can solve this by using rel=canonical tags or by handling URL parameters through Google Search Console.
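For the pagination and URL parameter cases above, the canonical tag goes in the head of each duplicate variant and points at the page you want ranked. A sketch using the blue shoes example (URL is a placeholder):

```html
<!-- Placed on /blue-shoes/page2/ and /blue-shoes?size=12 alike -->
<link rel="canonical" href="https://www.example.com/blue-shoes/">
```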

Structured Data

Structured data is organized information that can help the search engines understand your website better. Through the collaboration of Google, Bing, Yandex, and Yahoo, Schema.org was created to standardize the HTML markup of this information.

While there is no conclusive proof that structured data can improve your search rankings, it can help enhance your listings, which can improve your click-through rate. Click-through rate is something that is widely accepted as a ranking factor.

The following are some enhanced listing features as a result of structured data:

  • Star ratings and review data
  • Knowledge graphs
  • Breadcrumbs within your search result
  • Inclusion in a rich snippet box for a result

Implementing structured data is also helping you prepare for how search is evolving. Search engines are trying to provide instant answers to questions, which will play an even greater role with voice search.
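As an example of the markup behind the star-rating enhancement listed above, here is a Schema.org Product snippet in JSON-LD, the format Google recommends; the product name and rating figures are, of course, placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blue Suede Shoes",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```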

See Also:  Understanding Google Rich Snippets

Role of Domain Names

There has long been a discussion regarding the value of having a keyword in the domain name. There was a time when there was a very strong correlation with ranking ability, but as the algorithms have evolved, that signal has diminished in importance.

That does not mean there isn’t value there, however. It simply means that you shouldn’t place a big focus on the domain name containing the keyword when there are so many signals that are more important.

SEO-Friendly URL Structures

URLs are one of the first things that a search user will notice. The structure of the URL not only has the ability to impact the click-through rates of your search listings, but can also impact your website’s search ranking ability.

  • Keywords in the URL: It is a best practice to include keywords in the URL as long as it accurately represents the content of the page. This will give additional confidence to the search user that the page contains the content that they are looking for while also reinforcing the important keywords to the search engines. Going with our blue shoes example, a URL would look like this: “/blue-suede-shoes/”

See Also:  How Valuable is it to Have a Keyword in Your Domain URL?

  • Don’t Keyword Stuff URLs: There is such a thing as too much of a good thing, and the same goes for keywords in a URL. When you over-optimize, search users will see a spammy-looking URL, which could hurt your click-through rates, and the search engines could conclude that you are trying to game the system. A keyword-stuffed URL would look like this: “/shoes/blue-shoes/blue-suede/best-blue-suede-shoes/”
  • Pay Attention to URL Length: Keep your URL length at a minimum. Google will truncate URLs that are too long, and that can impact your click-through rate.
  • Avoid Dynamic URL Strings: Whenever possible, try to have a static URL with keywords versus a dynamic URL that contains symbols and numbers. They aren’t descriptive and will not help your listing’s click-through rate. A dynamic URL will look something like this: “/cat/?p=3487”

Title Tags

A title tag is the HTML code that forms the clickable headline found in the search engine result pages (SERPs). The title tag is extremely important to the click-through rate of your search listing. It is the first thing that a search user will read about your page, and you only have a few brief seconds to capture their attention and convince them that they will find what they are seeking if they click on your listing.

The following are some quick tips for writing title tags:

  • Title tags should contain keywords relevant to the contents of the web page.
  • You should place important keywords near the beginning of the title.
  • Try to keep titles between 50-60 characters in length. Google actually measures titles in pixels, truncating at about 600 pixels, which 60 characters will fit within roughly 90% of the time. Beyond 600 pixels, Google will display a truncated title, and your title might not have the impact that you intended.
  • Whenever possible, use adjectives to enhance the title. Example: The Complete Guide to SEO.
  • Don’t stuff your title with keywords. It will look spammy, and users will be less likely to click on your listing.
  • Don’t have web pages with duplicate titles. Each page needs to accurately describe its own unique content.
  • If you have room, try to incorporate your brand name. This can be a great way to generate some brand recognition. If you already have a strong brand, this may even improve the click-through rate of your search listing.
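Putting those tips together, a title tag for this guide might look like the following, with the important keyword up front, an enhancing adjective, the brand at the end, and the whole thing under 60 characters (“Example Brand” is a placeholder):

```html
<title>The Complete Guide to SEO | Example Brand</title>
```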

Meta Description Tags

A meta description tag is the HTML code that describes the content of a web page to a search user. The description can be found in the search engine result pages (SERPs), underneath the title and URL of a search listing. The description, while not a ranking signal itself, plays an important role in SEO. If you provide a relevant, well-written, enticing description, the click-through rate of the search listing will most likely be higher. This can lead to a greater share of traffic and a potential improvement in search rankings, as click-through rate is a search ranking factor.

The following are some quick tips for writing description tags:

  • Use keywords, but don’t be spammy with them. Keywords within a search query will be bolded in the description.
  • The rule of thumb used to be to keep descriptions no longer than 155 characters. Google is now displaying, in many instances, up to 320 characters. This is great from a search listing real estate perspective, and you should try to utilize as many of those 320 characters as possible. When doing so, make sure you get a clear message across in the first 155 characters just in case the description gets truncated for a query.
  • Accurately describe the content of the web page. If people are misled, your bounce rate will be higher and will potentially harm the position of your listing.
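Applying those tips, a description tag for this guide might read as follows (placeholder copy, with the core message landing within the first 155 characters):

```html
<meta name="description" content="Learn how search engines rank web pages and how to optimize your site, from keyword research and on-page SEO to link building.">
```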

Meta Keywords Tags

The meta keywords tag is HTML code that was designed to allow you to provide guidance to the search engines as to the specific keywords the web page is relevant for. In 2009, both Google and Yahoo announced that they hadn’t used the tag for quite some time, and in 2014, Bing acknowledged the same. As a best practice, most SEOs don’t use the tag at all anymore.

Header Tags

A header tag is HTML code that notes the level of importance of each heading on a web page. You can use six heading elements, H1 through H6, with H1 being the most significant and H6 the least. Typically the title of an article or the name of a category, product, or service is given the H1 tag, as it is generally the most prominent and important heading.

Headings used to play an important role in the optimization of a web page, and the search engines would take them into consideration. While they might not carry the same weight they once did, it is still widely believed they offer some ranking value and it is best to include them.

As an SEO best practice, it is best that you utilize only one H1 tag, as this will signify the primary theme of the web page. All other headings should use H2-H6 depending on their level of importance. From an optimization standpoint, it is safe to use multiple H2-H6 tags.
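Following those best practices, a heading structure for a product page in the blue shoes example might look like this:

```html
<!-- One H1 signifying the primary theme of the page;
     indentation is for readability only -->
<h1>Blue Suede Shoes</h1>
  <h2>Sizing and Fit</h2>
  <h2>Customer Reviews</h2>
    <h3>Most Helpful Review</h3>
```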

Competitor Research

Conducting a competitor analysis is a strategic function of properly optimizing a website. This can help you identify areas that your business’s website might not be covering. The analysis will help drive the keyword research and content development so that improvements can be made.

The following are some quick tips for conducting competitor research for SEO:

  • Architectural Flow: Look at the flow of the content on your competitor’s websites. Analyze how they have it structured architecturally to get deeper into the site. There may be some important takeaways from a website that ranks well, and that may make you decide to plan out your website differently.
  • Keyword Gap Analysis: Use a tool like SEMRush or Ahrefs to view the keywords that your competitors are ranking for. Compare that list to what you are ranking for and determine if there are any gaps that you need to cover.
  • Backlink Analysis: Similar to a keyword gap analysis, you can do the same comparison for backlinks. Backlinks are extremely important to the algorithms of the search engines, so you might find some easy wins by seeing where your competition has links. There are great tools such as Ahrefs and Majestic that can help you find these backlinks.

Keyword Research

Keyword research is essential to any SEO campaign, and this is one area where you don’t want to cut corners. You need to understand how people search for your specific products, services, or information. Without knowing this, it will be difficult to structure your content to give your website a chance to rank for valuable search queries.

One of the biggest mistakes that a webmaster can make is focusing solely on ranking for one or two keywords. Often, there are some highly competitive, high-volume keywords that you feel you must rank for in order to be successful. Those keywords tend to be one to two words in length, and there is a pretty good chance your competitors are working hard on those very same keywords. Those make for good longer-term goals, and you should still optimize for those, but the real value is in long-tail keywords.

Long-tail keywords make up the vast majority of all search volume. These keywords are more specific, longer phrases that someone will type in, such as “blue 4-inch high heel shoes.” This is much more descriptive than “blue shoes,” even though the shorter phrase might have a higher search volume. Because these keywords are so much more descriptive, the searchers using them also tend to be closer to making a purchase.

See Also: 10 Free Keyword Research Tools

SEO Copywriting & Content Optimization

You have probably heard that “content is king” when it comes to SEO. The thing about content is that it drives a lot of factors related to optimization. You need quality content to have a chance at ranking for the keywords that you are targeting, but for you to have success, you also need to have content on your website that webmasters find interesting enough to want to link back to. Not every page can do both. Pages that are sales focused, such as product and service pages, aren’t usually the kind of content that webmasters want to link to. Link generation from content will most likely come from your blog or other informational resources on your website.

The following are some quick tips for content optimization:

  • Understand your audience and focus on writing your content for them. Don’t force something that won’t resonate. You will risk losing sales and increasing your visitor bounce rate.
  • Utilize the keyword research that you have performed. You should naturally work important keyword phrases (that you identified through your research) into your content.
  • Be certain the keywords that you are optimizing for are relevant to the overall theme of the page.
  • Don’t overdo it with keyword frequency. Important keywords should be in headings with a few mentions throughout the copy. Search engines use latent semantic indexing to process synonyms and the relationship between words, so just focus on writing naturally. If you use the same keywords over and over, it won’t sound natural, and you risk over-optimizing with the search engines.
  • Be thorough when writing. Content marketing statistics show that long-form content is nine times more effective than short form.
  • Write headlines that will capture the visitor’s attention.
  • Try to incorporate images, charts, and videos to make the content more engaging.
  • Tell a story and, whenever possible, include a call to action.

See Also: SEO Writing: Creating SEO Friendly Content in 6 Easy Steps

Breadcrumbs

Breadcrumbs can play an important role in helping Google better determine the structure of your website. They often show how content is categorized within a website and also provide built-in internal linking to category pages. When placing breadcrumbs, make sure to use structured data, as Google may also choose to include these breadcrumb links within your SERP listing.
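The structured data for breadcrumbs uses Schema.org’s BreadcrumbList type. A JSON-LD sketch for the blue shoes example (URLs are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Shoes",
      "item": "https://www.example.com/shoes/" },
    { "@type": "ListItem", "position": 2, "name": "Blue Shoes",
      "item": "https://www.example.com/shoes/blue-shoes/" }
  ]
}
</script>
```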

See Also: Bread Crumbs & SEO: What are They and How They’re Important for SEO

Internal Linking

To understand the importance of internal link building, you need to understand the concept of “link juice.” Link juice is the authority or equity passed from one web page or website to another. So with each link, internal or external, there is a certain amount of value that is passed.

When you write content for your website, you need to take into consideration whether there are any other relevant web pages on your site that would be appropriate to link to from that content. This promotes the flow of “link juice,” but it also provides users the benefit of links to other relevant information on your website.

Off-Page SEO

Off-Page SEO (also known as Off-Site SEO) refers to optimization strategies that are enacted outside of your website that may influence the rankings within the search engine results pages.

A large portion of Google’s algorithm is focused on how the outside world (the web) views the trustworthiness of your website. You may have the best-looking website with the most robust content, but if you don’t put in the work to get noticed, odds are good that your website won’t rank very well.

Since Google’s algorithm is non-human, it relies heavily on outside signals such as links from other websites, article/press mentions, social media mentions, and reviews to help determine the value of the information on your website.

Importance of Link Building

While it would be difficult to rank without quality content, you won’t have any chance on competitive keyword phrases without links. It is still universally accepted that the number and quality of inbound links to a website and/or web page is the number one influencer of search rankings.

With link building as the top search signal, you must have a solid strategy to attract and acquire links for your website. That being said, you should not even start this process without having a solid foundation built with content and design. Acquiring links is much easier when you have attractive, valuable resources that other websites will want to link to.

Link Quality

It is important to know that not all links are created equal. Google’s algorithm looks closely at the trustworthiness of the linking website. For example, a link from the New York Times, which has millions of links and tens of thousands of reputable and unique websites linking to it, will carry much more weight than a link from your friend’s free WordPress site he built a couple of weeks ago.

There are tools that estimate the authority of a website. Two popular tools are Moz’s Open Site Explorer, which calculates a Domain Authority score, and Ahrefs’ Domain Rating. These tools are great for determining which links are worth acquiring. The higher the score or rating, the more worthwhile it would be to spend time acquiring that link.

The number of links isn’t the only thing that the search engines look at from a quality perspective. They will also take into consideration how relevant the linking website and content are to your own site. If you are selling shoes, for example, a link from a fashion blog will carry more weight than a link from Joe’s Pizza Shack. It may seem crazy to even try to get a link from a pizza place for a shoe website, but in the early days, search engines focused more on the quantity of links. As webmasters caught on, some would try to acquire every link they could find to influence search results. The search engines had to shift their focus to link quality to account for this potential spam.

The Quantity of Links for a Website/Web Page

Even though the greater focus is on the quality and relevance of links, quantity still has a place. You want to continue to grow the number of authoritative links pointing to your website and its important web pages so that you have the ability to challenge your competition on highly competitive keywords.

Another aspect of the quantity discussion is that it is better to have 100 total links from 100 different websites than 100 links coming from 10 websites. There is a strong correlation here: the more unique domains linking to your website, the better your chance of ranking improvement.

Of course, once you obtain top rankings, you can’t take a break. If you aren’t growing your link profile, you can trust that your competitors are, and they will eventually surpass you. For this reason, SEO has become a necessary ongoing marketing function for most organizations.

Link Anchor Text

Anchor text is the actual highlighted hyperlink text on a web page that directs a user to a different web page upon clicking. How websites link to you does make a difference in the search engine algorithms, and anchor text is part of that equation. The search engines look to anchor text as a signal for a web page’s relevance for specific search phrases.

Example: Getting a link with the anchor text “Blue Shoes” on a fashion blog would indicate to the search engines that your web page must be relevant to “blue shoes.”

In the past, the more links containing a specific keyword that pointed to your web page, the more likely you were to rank for that keyword. There is still some truth to that, but if a high percentage of your links contain keywords, the search engines will suspect you are trying to manipulate the results.

A natural link profile generally contains a varied mix of anchor text, usually dominated by linking text that includes a website’s brand name.

Link Placement

Another factor that the search engines consider is the location of where the link is placed. This plays a role with how much weight or “link juice” will be passed to your own web page from the linking website. Footer and sidebar links are not given as much weight as they once were because of the abuse related to websites selling links in the past.

Links within the body or content of the web page generally designate greater importance to the topic being discussed and have more weight given to them. It is also widely accepted that the higher the link is within the content, the greater the weight and authority that is passed.

Do-Follow vs. No-Follow Links

A do-follow link tells the search engines that the website is essentially vouching for the web page it is linking to, and that search engines should follow that link and pass the appropriate link juice accordingly. A no-follow link contains the HTML attribute rel="nofollow" to tell the search engines not to follow the link to its destination or pass any authority or credit.
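In the HTML itself, the only difference between a do-follow and a no-follow link is the rel attribute (URLs are placeholders):

```html
<!-- Do-follow (the default): vouches for the destination and passes link juice -->
<a href="https://www.example.com/blue-shoes/">Blue Shoes</a>

<!-- No-follow: tells the search engines not to pass authority through this link -->
<a href="https://www.example.com/blue-shoes/" rel="nofollow">Blue Shoes</a>
```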

User-generated content such as comments, wikis, social bookmarks, forum posts, and even some articles have been abused by SEOs to obtain easy links for websites. No-follow links have become a very necessary part of search algorithms because of this user-generated content. Websites started placing the rel=”nofollow” tags to signify that they weren’t vouching for the links. This has led to a significant reduction in spam that was being generated on those websites, as the value of obtaining a link was no longer there.

Do-follow links are significantly more valuable because of the link juice and authority that is passed, but no-follow links are very natural to have in a link profile and can still offer value from the referral traffic that they may provide.

Earned Links vs. Buying

Purchasing links for the purpose of building the authority of your website is against Google’s guidelines and would put your website at risk of receiving a penalty. All purchased links for advertising purposes should be given a no-follow designation to keep your website out of harm’s way.

Earned links are what the search engines actually want to give credit to. For a link to be “earned,” there has to be a clear and compelling reason for one website to want to link to another. Being cited in an article or listed as a resource are both good examples, assuming there is a clear and relevant connection between your content and the linking website. Going back to the shoe website example, a link from a fashion blog, placed within an article that details the top shoe trends and pointing to your web page, has a clear connection. If your shoe trends page had a link from a pizza shop's footer, that would not appear natural at all. There would be no clear reason for them to link to that page, as it is not relevant to their business, audience, or content.

Most reputable SEO companies offer link building services that create linkable assets or content and conduct outreach on your website's behalf in order to obtain these earned links. You can also do this yourself by building well-thought-out resources or content and contacting websites that you believe would find them link-worthy.

Social Media and SEO

Even though many SEOs thought that social media was a ranking factor in Google’s algorithm, in 2014 Google’s Matt Cutts declared that it is not. In 2016, Gary Illyes, a Google Webmaster Trends Analyst, also confirmed that social media is not a ranking factor.

While social media might not be a direct contributor as a ranking factor, the following are a few ways social media can help improve SEO:

  • Link Acquisition: If you produce great content and people are sharing it through social channels, webmasters and bloggers might see that content and decide to link back to it on their actual websites. The more your content is shared, the greater your chances of naturally generating links will be.
  • Branded Searches: As your content, products, or services are discussed more frequently via social media, people will begin to seek you out on the search engines by typing in your brand name. If you offer a product or service, those search queries may contain your brand name plus the name of a product or service, like “[Brand Name] blue shoes,” for example. As Google or another search engine sees that you are relevant for a type of product or service based on actual customer queries, you may find yourself in a better position for the unbranded version of the query.
  • Positive Business Signals: Many businesses have a social media presence. Having an active profile shows that you are actually making an effort to engage your customers. The search engines look at numerous pieces of data to determine the legitimacy of a business or website, and social signals could potentially be seen as a validator.

SEO Success Metrics

Every business is different, and the same can be said for their specific goals. SEO success can be measured in a variety of different ways. A plumber may want more phone calls, a retailer may want to sell more product, a magazine publisher may want to simply increase page views, and another business may just want to rank higher than their competitor.

The following are some common metrics for gauging the success of an SEO optimization campaign:

  • Leads/Sales: Most businesses are looking for a return on their investment regardless of the marketing medium, and SEO is no different. Ultimately, the best gauge of success is whether the efforts lead to an increase in sales and leads. If sales are the goal, the campaign should focus on search phrases that attract in-market consumers rather than keywords that are more inquisitive in nature. Question-based keywords may deliver traffic, but they typically have very low conversion rates; target them as part of a larger brand strategy once your bases are covered with your money keywords.
  • Organic Search Traffic: If your organic traffic is increasing, that is a good sign that your SEO campaign is working. It shows that you are moving up in the search rankings for your chosen keywords and that users are clicking on your search listings. If you know that you are ranking well for keywords with decent search volume but aren't receiving much traffic, you may want to improve your title tag to generate a higher click-through rate. Within Google Search Console, you can view your keywords and the corresponding click-through rate for each search listing.
  • Keyword Rankings: Many businesses look at keyword rankings as an indicator of how a search campaign is performing. This can be a misleading indicator if it isn't viewed in the right context. If you focus solely on the performance of a few “high volume” keywords, you may feel discouraged even when, through the eyes of a more seasoned SEO expert, you are making great progress. Highly competitive keywords take time, sometimes a significant amount of time, to achieve first-page rankings. What you should be looking at is whether positive progress is being made on those terms, to the point that you can see a path to your desired results over time.
  • Keyword Diversity: It is easy to look at the high-volume, competitive keywords and feel that you need to rank for those to be successful. Those are great longer-term goals, but the fact remains: 70% of all search traffic goes to long-tail keywords. Tools like SEMrush and Ahrefs monitor millions of keywords and can show you how many keywords your site or web page is ranking for. If you see this number continue to increase, that is another good indicator that your campaign is going in the right direction.
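
As a small illustration of the click-through-rate check described above, here is a sketch in Python. The query rows are invented; real data would come from Google Search Console's performance report:

```python
# Hypothetical Search Console-style rows: (query, impressions, clicks)
rows = [
    ("blue running shoes", 12000, 240),
    ("best shoes 2020", 8000, 40),
    ("shoe size chart", 5000, 450),
]

def ctr(clicks, impressions):
    """Click-through rate as a percentage; 0 if the query never showed."""
    return round(100 * clicks / impressions, 2) if impressions else 0.0

# Flag queries that rank (high impressions) but rarely get clicked --
# candidates for a title tag rewrite.
flagged = [q for q, imp, clk in rows if imp >= 5000 and ctr(clk, imp) < 2.0]
print(flagged)
```

A query with thousands of impressions but a CTR under a couple of percent is usually a listing problem (title, description), not a ranking problem.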

See Also:  Best Tools to Monitor Keyword Search Rankings

Creating a Winning SEO Strategy

Most businesses today know the basic concepts around SEO. You can do a lot of individual SEO functions well, but success comes from how you put the pieces together to form a cohesive strategy. That strategy will create the foundation for organic traffic growth that will propel a business for years to come.

You need to be prepared to spend a significant amount of time mapping out the strategy because it is much better and more effective to do it right the first time. The following are some of the key components found in nearly every successful SEO strategy:

  • Identify Your Goals: For a successful campaign, you first need to have a clear idea of what you are trying to accomplish. That will drive the strategy moving forward, as everything that you do will be to achieve that goal. For example, if your goal is to generate leads or sales, the structure of your website and the content written should be focused on sending customers into your sales funnel.
  • Focus on Topics: Topics should dictate the design and flow of the website and its content. Knowing your products and services, make a list of high-level topics that will form the content pillars of your website. Typically, these topics are going to be the shorter-tail, higher-volume, and more competitive keywords.
  • Do the Research: After you settle on the topics that properly describe your products and services, you need to perform research to find the longer-tail keywords that customers would search for to find information about those topics. This type of keyword often answers a wide range of questions.
  • Create the Content: When creating the pillar content, be as thorough as possible. The trends show that search engines have been placing a higher value on long-form content. From there, you create the supportive content based on the long-tail research performed for each topic. The pillar content should link to the supportive content, and the supportive content should link back to the pillar. This helps formulate a semantic relationship that search engines love.
  • Promote: It isn’t enough to create great content. Most businesses don’t have a large audience that is following their blog closely. So if only a few are reading the content, you won’t be getting many links into it without some form of outreach. Link building is essential to a successful SEO campaign. If you have a piece of content that is a valuable resource, you should contact like-minded websites you feel would find that resource to be valuable as well. Give them the reason why you feel their readers or customers will see value in it, and they may just decide to provide a link back. Another way to promote is through social media. The more people that share your content, the more likely someone will post a link on their website to that content. If you don’t have a large following, you may want to spend a little on social media advertising to boost your social presence.
  • Make the Time: The hardest thing to do is make time. SEO is not a “set it and forget it” marketing activity. You need to dedicate time each week to achieving your goals. Once you achieve them, make new goals. If you don’t put in the effort, you will find your competition passing you by. If you don’t have the time, you should consider hiring a company that offers SEO services.

See Also:  Ecommerce SEO Best Practices Guide

See Also:  Beginner’s Guide to Amazon SEO

See Also:  Local SEO for Multi-Location Businesses

When Should I Expect to See Results?

A standard rule of thumb is that you should start to see results within 3 to 6 months of starting an SEO campaign. That does not, however, mean that you will be at the top for all of your keywords.

The following are several factors that contribute to how quickly you will see success with your SEO campaign:

  • Your Initial Starting Point: Every website can achieve results, but you need to set reasonable expectations. Every website is unique. Each site will have different content, architecture, and link profiles. A website that was created a few months ago will react very differently compared to one that has years of history behind it.
  • Keyword Competition: Not all keywords are created equal. Typically, higher-volume keywords also have the highest level of competition. If you aren't currently ranking in the first three pages for those competitive head terms, you should expect that it will take a fair amount of time (sometimes longer than a year) to really make headway. Longer-tail keywords tend to have lower levels of competition but provide upwards of 70% of all organic search traffic. However established your website and brand may be, most sites will reap the biggest early benefits from long-tail keywords.
  • SEO Strategy: Forming a solid SEO strategy is key to how your website will perform in the search results. The strategy sets the direction for all campaign activities. If executed properly, you should begin to see SEO success building throughout the campaign.
  • Campaign Investment: The old saying “you get back what you put into it” definitely applies to SEO. That is true for both time and money. If you are not dedicating time to search engine optimization activities, you aren't going to see the results. The same can be said if you are looking for an SEO company to manage the campaign. If you aren't willing to invest in SEO and are looking for a cheap option, the provider won't be able to dedicate much time and effort to deliver the results you are looking for. With that being said, some investment is better than no investment; you will just have to adjust your expectations accordingly.

Search Engine Algorithm Updates

Algorithm updates are changes to the evaluation of ranking signals and the way they are applied to the search index to deliver relevant results for search queries. Google makes roughly 500 algorithm updates per year. Of those 500, typically only a few are deemed significant, with the vast majority considered minor adjustments. Until recently, major updates were given names, either by Google or the SEO community. Now, major updates are usually rolled into the main core algorithm, making it more difficult to determine the true nature of the changes that have been made.

The following are some of the more well-known algorithm updates over the years:

  • Panda: Presenting high-quality content to search users has long been a focus for Google. The Google Panda update was created to reduce the number of low-quality, thin-content pages displayed in their results while rewarding higher quality pages that offer unique, compelling content.
  • Penguin: The number of links pointing to a website plays a very significant role in how well it performs in SERPs. Because links are such an important ranking factor, they were an easy target for abuse by unethical SEOs. To combat link webspam, Google released Penguin. The purpose of the update is to filter out spammy, manipulative links while rewarding links that are natural and relevant.
  • Hummingbird: The Google Hummingbird update was a major overhaul of the core algorithm and how Google responded to user queries. This update was focused on providing relevant results based on identifying the user’s intent using Google’s Knowledge Graph and semantic search.
  • Pigeon: The Google Pigeon update was focused around improving localized search capabilities. Google strengthened the ranking factors related to proximity and distance of a business to improve results for search queries that have localized intent. For example, if a search user types in “plumber,” the intent is to see plumbers in their particular area, not a list of companies around the country.
  • RankBrain: Google’s algorithm uses artificial intelligence to interpret search queries and deliver results that best match the user’s intent, even though the results may not display content that matches the exact query that was searched. Google continues to use this machine learning to evolve its results as the algorithm learns more about user intent related to queries.
  • E-A-T/Medic Update: Google rolled out a core algorithm update in August of 2018 that was initially coined the “Medic Update,” but it is also commonly referred to as the E-A-T update, as its primary component focuses on expertise, authoritativeness, and trustworthiness. Health and YMYL (Your Money Your Life) content that wasn't clearly written by a reputable author (doctor, financial advisor, etc.), and sites with very little or a negative online reputation, seemed to be impacted the most.

Search Engine Penalties

There are two varieties of search engine penalties that a website can receive: manual or algorithmic.

Manual penalties are a result of a Google employee manually reviewing a website and finding optimization tactics that fall outside of Google’s Webmaster Guidelines. With a manual penalty, the business will receive a manual penalty notice within Google Search Console.

Algorithmic penalties are a result of Google's algorithm detecting tactics that it believes violate Google's Webmaster Guidelines. No notice is typically provided with these penalties. The most common way a webmaster can determine whether they have an issue is by matching the timing of traffic and ranking losses with known Google algorithm updates. As Google has integrated key updates (such as Penguin) into its core algorithm, it has become much more difficult to pinpoint the exact nature of an update. In those instances, there is a heavy reliance on chatter within the SEO community to see if there are common traits shared by the sites that have been impacted.

The two most common penalties revolve around two of the more important aspects of Google’s algorithm: content and links.

With Google Panda, sites with very thin or duplicate content see significant drops in rankings, traffic, and indexed pages.

Websites that have a spammy link profile can receive an “unnatural links” penalty within Google Search Console, or be caught by Google's Penguin algorithm. Sites hit by these have an extremely difficult time ranking well for keywords until they begin the penalty recovery process and the offending links have been removed or disavowed.
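
For the disavow side of that recovery process, here is a rough sketch of assembling a disavow file in the plain-text format Google's disavow tool accepts (one domain: or full-URL entry per line, with # for comments). The domains and URL below are invented:

```python
# Hypothetical offending sources identified during a link audit.
spammy_domains = ["spam-directory.example", "paid-links.example"]
spammy_urls = ["http://forum.example/profile/seo-bot"]

lines = ["# Disavow file generated for penalty recovery"]
lines += [f"domain:{d}" for d in spammy_domains]   # disavow entire domains
lines += spammy_urls                               # disavow individual URLs
disavow_file = "\n".join(lines)
print(disavow_file)
```

The resulting text file is then uploaded through the disavow tool; domain-level entries are generally preferred when an entire site is spammy, since new pages on that domain stay covered.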

White Hat vs. Black Hat

There are two schools of thought when it comes to SEO. One has to do with long-term growth with long-term rewards (White Hat), and the other is to grow as fast as possible while reaping the rewards until you are finally caught breaking the rules (Black Hat).

Ethical SEOs employ what are considered White Hat SEO tactics. These SEOs are focused on creating positive experiences with quality content, while building natural links to the website. These tactics stand the test of time and usually experience more positive gains than negative when a Google Algorithm change occurs. These tactics build a solid foundation for sustainable growth in organic traffic.

Black Hat SEOs don’t mind violating Google’s guidelines. They are typically focused on improving search rankings quickly to profit off of the organic traffic increase while it lasts. They will use tactics such as purchasing links, creating doorway pages, cloaking, and using spun content. Google eventually detects these tactics and will penalize these websites accordingly, and once they do, Black Hat SEOs usually move on to their next short-term conquest. This is not a tactic that real businesses would want to use, as it can be very detrimental to the future of your online presence and reputation.

What is the Future of Search Engine Optimization?

The future of SEO will revolve around voice search. This is truly the next frontier that the search engines are working through, and SEOs will need to be ready.

When it comes to nationally competitive keywords, obtaining the featured snippet or top ranking will be very important. You will need to make sure that you have content that answers the questions of search users.

For local SEO, maintaining consistency in name, address, and phone (NAP) data across online directories will remain important, but you will also need to work on obtaining reviews to stand out from the crowd.

See Also:  How Popular is Voice Search?

SEO Consultants

If your company has made the determination that you will need to hire a digital marketing agency, either for a lack of available time or expertise internally, the evaluation process is very important.

The following are some key areas that you will want to evaluate when choosing an SEO firm:

  • Consultation: You don’t want an agency that runs their business like it’s on autopilot. If they aren’t actively trying to learn about your business or goals, you should eliminate them from consideration. Once you get to the presentation stage, you shouldn’t accept them just emailing over the proposal either. You should be engaged and listen to what is being proposed and any SEO advice that is being offered.
  • Plan of Action: When selecting any vendor, you need to believe in what they are going to do for your company. As part of the proposal process, the prospective SEO provider should provide you with a detailed plan of action. This plan should encompass all areas of SEO including on-page, off-page, and how success will be measured. A timeline of SEO activities is also something that most reputable firms will provide.
  • Quality over Quantity: Some SEO vendors will promise a very large number of links and other deliverables each month. That may be realistic if you are paying several thousands of dollars monthly, but if it sounds too good to be true, it probably is. Quality work takes time and usually doesn’t come in large quantities.
  • Case Studies: Some people say that you should get references, but odds are pretty good that the provider has cherry-picked a list of clients that will have nothing but glowing remarks. Case studies offer much greater value as you can read about real clients, their problems, and how the prospective company produced results for them.
  • Expertise: When it comes to SEO, it is usually better to choose a company that offers it as their primary service. Search optimization is continuously evolving and it is better to be with a company that is in the trenches every day, aware of the latest SEO trends and research. You will want to partner with a company that is plugged in. Also, ask about their recognitions and involvement within the SEO community.
  • Success Metrics: Work with a company that talks about metrics beyond just rankings. If they aren't talking about generating sales and leads, they probably aren't the right fit. Anyone can get you ranking for some keywords, but keywords that lead to conversions are what matter most.
  • Cost: Real search engine optimization takes time and effort. If you are being presented with $199 per month options, you are probably looking in the wrong place for vendors. You don’t have to mortgage everything to purchase SEO services, but you need to have enough of a budget to have actual work performed.

SEO Careers

Search engine optimization is an important skill in the fast-growing digital marketing industry, which, according to Forrester Research, will grow to $120 billion in marketing spend by 2021.

If you are considering a career as an SEO specialist, the following are some traits that would serve you well in this field:

  • Enjoy Changing Environments: Even though the core principles of SEO haven’t changed much over the years, algorithms do get updated. Aside from algorithms, each client presents their own unique challenges.
  • Love to Research: Half of the fun of being an SEO professional is the research. Test-and-learn environments can present exciting new discoveries.
  • Great Organizational Skills: To be successful, you need to be able to formulate strategies and execute them.
  • The Passion to Win: If you like the feeling that you get when you win, just imagine ranking number one for a highly competitive keyword!

See Also:  SEO Certifications: Are They Beneficial & Do They Make you an Expert?

See Also:  SEO Courses: The Best Free and Paid Training Options

See Also:  Digital Marketing Career Guide: How to Get Started

SEO Tools

Like almost any other profession, you need to be equipped with the right tools in order to succeed. The following is a list of popular search engine optimization tools that are used by industry professionals.

  • Moz SEO Products: Moz is one of the best-known SEO product providers. Moz Pro has tools for keyword research, link building, website auditing, and more. They also have a product called Moz Local that helps with local listing management.
  • Ahrefs: Ahrefs offers a wide range of tools such as a keyword explorer, a content explorer, rank tracking, and a website auditor. Their most popular feature is their backlink analysis tool, which assigns ratings to how powerful each domain and specific URL is based on the quality of links pointing into them. They are always coming out with new features.
  • SEMrush: SEMrush offers organic and paid traffic analytics. You can type in any domain or web page and see what keywords are ranking. They also offer a variety of other features such as a site auditor and a content topic research tool. This is another service known for frequent product updates and new features that has become an invaluable resource for many SEOs.
  • Majestic: Majestic offers a tool that allows you to analyze the backlink profile of any website or web page. They have crawled and indexed millions of URLs, categorizing each page based on topic. They have also assigned Trust Flow and Citation Flow ratings to each URL to represent, based on their analysis, how trustworthy and influential a web page or domain may be.
  • Screaming Frog: Screaming Frog offers a popular desktop application tool that will crawl websites, detecting potential technical and on-page issues. The software is available for PC, Mac, and Linux.
  • DeepCrawl: DeepCrawl is a website spidering tool that will analyze the architecture, content, and backlinks of a website. You can set the tool to monitor your website on a regular basis so that you are aware of issues as they arise.
  • LinkResearchTools: Well-known for their backlink auditing tool, LinkResearchTools has helped many SEOs identify high-risk links within a backlink profile. They also offer tools to identify link building outreach targets.
  • BuzzSumo: This platform gives you the ability to analyze content for any topic or competitor. You can identify content that has been performing at a high level by inbound links and social mentions. BuzzSumo also allows you to find key influencers that you can reach out to for promoting your own content.
  • Google Analytics: By far the most popular web visitor analytics tool, it allows you to analyze data from multiple channels in one location to provide a deeper understanding about user experience.
  • Google Search Console: This is an essential tool that helps a webmaster understand what Google is seeing in regard to their website. You can see web indexing status, crawl errors, and even the keywords that you are ranking for. They will also send you messages if your website has received a manual penalty.
  • Bing Webmaster Tools: Bing's equivalent to Google Search Console; you can run a variety of diagnostic and performance reports. You can also receive alerts and notifications regarding your website's performance.
  • Google PageSpeed Insights: PageSpeed Insights provides real-world data on the performance of a web page across desktop and mobile devices. A speed score is provided, along with optimization tips, to improve performance. You can lose website visitors if pages take too long to load, making this tool very useful.
  • Structured Data Testing Tool: This tool allows you to paste code or fetch a URL to verify proper coding of the structured data.
  • Mobile-Friendly Testing Tool: This tool provides an easy test to check if a website is mobile friendly. If there are page loading or other technical issues, it will provide additional details.
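
Before pasting markup into the Structured Data Testing Tool, you can at least confirm that a JSON-LD block parses and carries the keys schema.org markup needs. A minimal sketch in Python, with invented organization data:

```python
import json

# Hypothetical JSON-LD snippet of the kind the testing tool validates.
snippet = """
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Shoes",
  "url": "https://example.com"
}
"""

data = json.loads(snippet)  # raises ValueError if the markup isn't valid JSON
print(data["@type"])
```

This only catches syntax errors; the testing tool goes further and checks the vocabulary itself, such as required and recommended properties for each @type.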

Helpful Resources

Knowledge is power with SEO, as it is with most industries. You need to stay aware of the latest news, research, and trends so that you are able to adapt accordingly. The following are some educational resources that provide a wealth of knowledge on the subject of search engine optimization:

  • HigherVisibility’s Insite Blog: The Insite blog is a great resource for business owners and marketers to learn more about various aspects of digital marketing. There is a heavy focus on search engine optimization, but paid search and social media are also covered well.
  • HigherVisibility’s Resource Center: The resource center provides a collection of digital marketing best practices, research studies, and guides.
  • The Moz Blog: The Moz Blog provides insights from some of the search industry’s greatest minds. The blog provides advice, research, and how-tos.
  • Search Engine Land: One of the industry’s most popular blogs, Search Engine Land provides breaking news, research, white papers, and webinars on almost anything related to digital marketing.
  • Search Engine Watch: Search Engine Watch is a top industry blog covering topics such as search engine optimization, PPC, social media, and development.
  • Search Engine Roundtable: Search Engine Roundtable was founded by industry veteran Barry Schwartz, covering breaking news and recapping events in the world of search.
  • Search Engine Journal: SEJ is a popular blog founded by Loren Baker that provides the latest news in the search marketing industry.
  • Google Webmaster Central Blog: This blog provides official news on Google’s search index straight from the Google Webmaster Central team.

SEO Conferences & Events

If you want to learn from the greatest minds in the industry, attending conferences and events is an excellent way to do it. The following are some of the most popular conferences that have a strong focus on search marketing:

  • MozCon: MozCon is a conference that provides tactical sessions focused on SEO and other topics related to digital marketing. There is always a quality lineup of top industry veterans leading the discussions.
  • SMX (Search Marketing Expo): SMX is produced by Third Door Media, the company behind Search Engine Land and Marketing Land. This is one of the leading events for search marketing professionals, offering a wealth of knowledge with various sessions and training workshops.
  • SearchLove Conference: Bringing together some of digital marketing’s leading minds, the SearchLove Conference helps to get marketers up to date on the latest trends in online marketing.
  • PubCon: PubCon is a leading internet marketing conference and tradeshow that provides cutting-edge, advanced educational content to marketers.

What is SEO? The Beginner’s Guide to Search Engine Optimization was originally posted by Video And Blog Marketing

Google Releases Update to Change of Address Tool



As Google slowly migrates its old tools to Google Search Console, it is also releasing new updates that help webmasters and SEOs do their work, monitor changes, and accurately track their websites' performance. With the recent update to Google Search Console's Change of Address tool, we'll now be able to better manage and monitor our site moves. Here's how:

Google Search Console Change of Address Tool Update

Through the Change of Address tool, you can notify Google that you're moving your old domain to a new one. Google then prioritizes crawling and indexing the new domain over the old one. The change of address also tells Google to forward the authority and signals from the old domain to the new one.

So, aside from redirecting the pages of your old domain to the new one, you need to use the Change of Address tool to fully and securely migrate your website. And Google rolled out an update that we'll all like:

Redirect Validation

Redirection Verification screenshot

Since redirects are a massive and integral part of any site migration, it makes sense that Google added this new feature to Google Search Console's Change of Address tool. With it, you can check whether the redirects for your top URLs are valid or erroneous.

As you can see from the screenshot above, it shows the validity of your redirects and flags pages from the old site that currently don't have redirects. This is of massive importance in ensuring that all of the pages you want to redirect are redirecting properly.
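
The kind of check this feature automates can be sketched offline: given the HTTP status and Location header observed for each old URL, classify each redirect. All of the URLs and responses below are invented for illustration:

```python
NEW_DOMAIN = "https://new.example.com"

# Hypothetical (status, Location) pairs observed for each old URL.
observed = {
    "https://old.example.com/shoes": (301, "https://new.example.com/shoes"),
    "https://old.example.com/about": (200, None),  # no redirect at all
    "https://old.example.com/blog":  (301, "https://other.example.net/blog"),
}

def classify(status, location):
    """Label a redirect the way a migration audit would."""
    if status not in (301, 308):  # site moves should use permanent redirects
        return "missing or temporary redirect"
    if not location or not location.startswith(NEW_DOMAIN):
        return "redirects off the new domain"
    return "ok"

report = {url: classify(*result) for url, result in observed.items()}
for url, verdict in report.items():
    print(url, "->", verdict)
```

In a real audit the (status, Location) pairs would come from fetching each old URL; the tool now surfaces the same verdicts for your top URLs without you having to crawl them yourself.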

Prompts

Prompts in change of address tool screenshot

Once you've started your change of address request, Google Search Console will show prompts reminding you that the site is moving to a new domain. You will see these prompts when you log in to the dashboard of the website you're migrating from or the website you're migrating to.

Our Experience with the Change of Address Tool

The primary reason we're particularly excited about this update is that migrating a website to a new domain is a difficult undertaking: it involves a great deal of monitoring and quality assurance to get the best results.

Of course, we're no exception to the failures of site migrations. One such failure happened a year ago with one of our clients. To give you an idea of what happened, here's a graph of their movement for one of their most important keywords:

site migration ranking graph screenshot

8 months. 8 full months of missed opportunities for traffic, conversions, and growth. While this was happening, we did a deep dive into why, and we found that the old website's pages were still ranking for the same keywords even though we had redirected them properly and had used the Change of Address tool to signal to Google that the website was moving. Fortunately, we still recovered.

However, it makes me think that if we had had these new features, we'd have been able to track and monitor our migration far more closely, since we also had pages that weren't redirecting properly.

Key Takeaway

Although this might not seem like a large update (and it isn't), we can't deny that it's a massive help for SEOs and webmasters alike. Google also advises that site moves or migrations will take at least 180 days. Here's what they said regarding the 180-day period:

“After the 180 day period, Google does not recognize any relationship between the old and new sites, and treats the old site as an unrelated site, if still present and crawlable.”

Do you have any stories about a site migration gone wrong or are you just excited about this feature? Comment your stories and opinions down below!

Google Releases Update to Change of Address Tool was originally posted by Video And Blog Marketing

Conclusions Precede Decisions: Two case studies that teach the most effective way to amplify conversion

(This article was originally published in the MarketingExperiments email newsletter.)

Data is about working backward through a customer’s behavioral traces into their decisions. These decisions are based on conclusions customers make from observation sets that we, as marketers, present to them.

In this video, MECLABS Managing Director Flint McGlaughlin shares a deeper understanding of how people make conclusions that impact the choices they make, choices like making a purchase or completing a lead generation form.

To help you apply these principles to your own marketing, McGlaughlin shows case studies about landing page optimization for a mattress brand and pricing strategy for a software as a service. He explains how to improve the content and presentation of an observation set to get better conversion results.

If you would like your own webpage diagnosed on one of our upcoming YouTube Live sessions, you can send your website info through this form, and we’ll try to fit it in.

Related Resources

If you’d like hands-on help improving your calls-to-action, you might want to consider a Quick Win Intensive for your company.

In this session, McGlaughlin also discusses a variable cluster test. You can learn more about it here:

McGlaughlin also discusses value proposition, which you can learn more about here:

Conclusions Precede Decisions: Two case studies that teach the most effective way to amplify conversion was originally posted by Video And Blog Marketing

How to Make a Traffic-Boosting Content Structure


How to make a traffic-boosting content structure cover photo

How often do SEOs repeat time and again in their minds that content is king, yet fail to convert it into traffic? It all boils down to SEO site structure and how content plays a role in leading users through a meaningful experience on your website. I have been in the industry for a decade now, and I have learned that your content is only as good as its value for promotion. If your content is all over the place, with no foundation for how you can use it to boost traffic to your site, then your efforts may well be wasted.

You must be asking, “How do I know that content is really the key ingredient for winning at digital marketing?” Well, you have tracking tools to give you the numbers you need to justify success, but when you need a more human approach to your analysis, you have to look at the way you have structured your content for users to see.

Let’s talk about how you can start building a strategy centered around your content’s overall structure on your SEO site.

Keyword Planning for SEO Site Structure

Keyword research is a perpetual effort. You should have a well-rounded approach to discovering keywords that give you ROI. As an SEO specialist, I never stop looking for keywords that work well for my clients. Since search engines can update in the blink of an eye, you should adapt just as quickly. Even better, plan keywords that protect you from getting hit by an update; this means sticking to topically relevant keywords that match evergreen content.

keyword planning for seo site structure

Special shout out to Mangools for their new dark mode visual, looks great guys! Anyway, back to the subject, a well-planned content structure would start with your initial research for seed keywords. From there, you have to plan for your vision of content. Begin with the very purpose of your online presence. What do you plan to give to users? How are you going to provide them with value? If you’re an SEO webmaster like I am, of course, you have to search for SEO keywords. Start from there and segment your keywords per product, service, or whatever section you would be creating landing pages for.

Consider SEO Siloing for Content

Consider seo siloing
SEO siloing is one of the strategies that you should consider if you want to fix your site’s content structure. There are two kinds of siloing: Physical Siloing and Virtual Siloing.

Physical Siloing is mainly about making a subfolder for each topic you have for your content. An example would be http://www.helloworld.com/digital-marketing/seo/seo-strategies. This ensures that you have your content all segmented through its URLs. It would become as specific as you go along but you can also risk messing up your site structure if you get confused about the subfolders you create. To avoid that, just make sure to highlight the most important sections in your navigation bar. This way, you can create silos more effectively.

For this article, we are going to focus more on Virtual Siloing. This is when you use internal links to maintain connections between your content, increasing its relevance and structure. Hyperlinks can do a lot more for your site than you might think, and if you put in the hard work to organize them, they can significantly boost your traffic. This approach is advisable if you do not want to create physical silos for your site.

Make an Interlinking Database

It would be easy to keep track of your virtual silos if you are just beginning to publish content on your site but what if you have more than 50 blog posts already? You should come up with an interlinking database for your site.

Here’s an example:

interlinking database

You can organize it by anchor text like the first selection on the list or you can try organizing it by links. If you plan on optimizing anchor texts, take note of them and see if you are doing a great job of making your content rank for those keywords. This can also help you evaluate what anchor texts bring in the most traffic to your site which you can use for your linkbuilding efforts.

It would be a great tracker for the interlinks you plan to organize on your site. As you may well know, Google gives importance to your links, and not just backlinks but internal links as well. It would be advantageous to organize your content through silos: search engines will detect your site's topical relevance if you connect your pages to each other. Think of it this way: don't expect other sites to link back to you if you don't consider your own content worth interlinking. That alone should create urgency to organize your links.
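If you'd rather not build the interlinking database by hand, a short script can extract (URL, anchor text) pairs from your pages to seed it. Below is a minimal sketch using Python's standard library; the class name and the HTML snippet are made up for illustration:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect (href, anchor text) pairs to seed an interlinking database."""
    def __init__(self):
        super().__init__()
        self.links = []    # finished (href, text) pairs
        self._href = None  # href of the <a> we're currently inside
        self._text = []    # text fragments gathered inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

# Made-up page fragment for illustration
page = '<p>Read our <a href="/seo/log-analysis">log analysis guide</a> next.</p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # [('/seo/log-analysis', 'log analysis guide')]
```

Run this over each published post and dump the pairs into a spreadsheet, and you have the beginnings of the tracker described above.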

Key Takeaway

You should never leave out content when organizing your SEO site structure. More than satisfying the search quality rater guidelines, you also give your audience more knowledge about what you do. With all the chaos that happens in the SEO world, it is important that we seek security in structure. Knowing this helps you focus on your goal to help you keep winning in your digital marketing efforts.

Comment down below and let’s talk about how you trained your content to work wonders for you.

How to Make a Traffic-Boosting Content Structure was originally posted by Video And Blog Marketing

Ecommerce Conversion Rates in Google Analytics: What is it, What is a Good Number, How Do I Improve?

Conversion Optimization

When you’re running a business website, Google Analytics can be one of the most useful tools in your arsenal.

GA can monitor and track your customers and products. It can help you improve traffic and conversions while getting you more customer data.

But above all else, Google Analytics enables you to improve on the overall customer experience.

Customer experience is a big deal when it comes to Google-based SEO. Google wants to make sure that the sites it recommends to users are easy to use and provide a positive online experience.

If you’re an ecommerce business, a better customer experience will help you increase your conversion rate. As anyone who has ever sold anything online can tell you, conversion rate optimization is the bread and butter of an ecommerce store.

But what is an ecommerce conversion rate? What kind of conversion rate should you be shooting for? And once you know all of that, how can you use Google Analytics to optimize your ecommerce store's conversion rates?

What is a Conversion Rate?

If your business relies on online sales, you’re going to hear a lot of talk about conversion rates. It is, by far, the most critical metric for ecommerce stores.

A conversion rate measures what percentage of the traffic that comes in through your site is converting. If you get 1,000 visitors in a day, how many of them took some action on your website?
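The arithmetic behind the metric is simple, as this illustrative sketch shows (the function name is mine, not a Google Analytics API):

```python
def conversion_rate(conversions, visitors):
    """Percentage of visitors who completed the desired action."""
    if visitors == 0:
        return 0.0
    return conversions / visitors * 100

# 1,000 visitors, 25 of whom converted
print(f"{conversion_rate(25, 1000):.2f}%")  # 2.50%
```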

It should come as no surprise that many online sales professionals see conversion rate as priority number one.

But what is a website conversion?

That’s entirely dependent on the nature of your business. Website conversion can mean many different things for different companies.

Unsurprisingly, the purchase of a product or service is seen as the most common website conversion for an ecommerce platform. However, depending on your ecommerce’s business model or marketing campaign, a conversion could also be signing up for an email list, downloading an email-gated buyer’s guide, or any number of other actions.

How Do You Check Your Conversion Rate in Google Analytics?

We’ve established why conversion rates are so important. In many ways, they are the measure of success for your business. Tracking your conversion rate over time is essential to gauge the success or failure of particular business ventures.

Conversion rates can easily be tracked through Google Analytics ecommerce tracking. If you want to see how your business is faring over a specific period, there are a few easy steps that you can take to see what’s been going on with your site’s visitors.

Step-by-step walkthrough to reviewing your ecommerce conversion rates:

  1. Go to the section of the Google Analytics dashboard marked “Conversions.”
  2. From there, select “Ecommerce” and then “Overview.”
  3. Check out the chart featured on this page. It will detail your site’s performance over time.
  4. The actual conversion rate will be listed underneath the chart.

(Image Source)

Ideally, you should be checking your conversion rate regularly. We recommend looking at Google Analytics every week to see how you’re doing compared to past performance. You should be doing a more in-depth analysis at least once per quarter.

Why is this important?

Let’s say that you’re an online retailer running some aggressive marketing campaigns in an attempt to get users onto your site. Your SEO is on point, you are running ads on Google and Facebook, and you’re drawing in massive amounts of web traffic.

Your number of visitors is through the roof, but you’re not converting.

This means that all that ad budget — all the salary spent on a content writer and designers — is wasted.

That’s why you want to know your conversion rate at all times. When something like this happens, you’re going to need to examine where your prospects are bouncing from and why (and how to fix it).

What’s Impacting your Ecommerce Conversion Rates?

As we mentioned in the last section, knowing your conversion rate ensures that you have the opportunity to make use of the traffic you’re getting. A super low conversion rate is indicative of a significant problem at play somewhere on your site.

If something is wrong on your site, you’re going to have to fix it if you want your business to stay afloat.

One of the reasons your conversion rate is so important is that it can help you determine if you have a clear conversion path for potential customers.

When it comes to the conversion path, there are three primary (website-based) reasons you’d be losing conversions:

  1. Poor site navigation
  2. Un-optimized checkout process
  3. Payment process and security issues

That prompts you to ask three questions:

  1. Do visitors easily understand where they have to go to make a purchase?

Your users should be able to immediately know where they have to go and what they have to do to purchase on your ecommerce website. If they get confused, you’re likely going to lose them.

It’s akin to a customer walking into a store, browsing the products, but not being able to find a point of sale to check out at.

  2. Is your checkout process optimized, with features that simplify the process?

Your shopping cart is your checkout counter, and since there is no human clerk there to facilitate the checkout, you should have a simple cart that makes the sales process easy.

  3. Are there payment process or security issues?

Finally, your payment options should be standard and straightforward. Creativity is, in fact, your enemy here. Online shoppers are familiar with the traditional checkout process; veering too far from it will cause a drop in conversions.

Review your competitors, or larger corporate ecommerce sites (like H&M’s checkout below) to make sure your process lines up with what your visitors will expect:

(Image Source)

Cybercrime is also a huge issue right now.

If your ecommerce visitors don’t feel safe entering their credit card information on your site, you’re going to lose them.

That’s why trusted third-party checkout systems like PayPal come in handy for a lesser-known online store. Shoppers might not trust you yet, but they trust a name like PayPal.

Remember, the all-important user experience that we touched on before matters a lot in the customer journey and could lead to issues with your conversion rate.

Consider your low conversion rate to be like a warning sign, symptomatic of a more significant issue with the site that has to be corrected quickly.

What is a Good Conversion Rate?

Now the question quickly becomes, what is a reasonable conversion rate?

The answer to that question is frustrating: It depends.

The standard for a reasonable ecommerce conversion rate varies widely. Different business models have different average conversion rates, so it’s essential to do some research on your own to determine what’s average for your branch of the ecommerce market.

It’s rare to have a truly massive conversion rate. Typically, conversion rates average on the very low side. Don’t expect a conversion rate of even 10%.

That’s why traffic is so relevant. The more people come to your site, the more opportunities for conversion you have. An average conversion rate of 2% will be a lot more lucrative on a site that sees 100,000 visitors in a week.

The average conversion rate for ecommerce in 2019 as a whole is estimated at 1.7%.

(Image Source)

As you can see in the image below, conversion rates (even within the ecommerce industry alone) can vary wildly:

(Image Source)

While something like food and drink has a super high conversion rate of up to 7.24% for websites scoring in the top 25%, the furniture industry has a median average conversion rate of 0.68%.

Let’s say you’re an ecommerce company selling toys and games. According to the research, that industry has a median conversion rate of 2.43%. If you have 750,000 visits per month, you’re looking at 18,225 conversions per month on average if you’re keeping pace with the industry.

If you up the number of visitors to 1 million, you’re getting 24,300 conversions without your conversion rate changing.

Equally, if you stick with the 750,000 visits, optimizing your conversion rate (redesigning your site, simplifying checkout, etc) to bring it to 3.2% increases your total conversions to 24,000.

So it’s up to you: what’s easier, driving 250,000 more visitors per month, or optimizing your ecommerce site’s conversion rate?
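The comparison above can be sanity-checked in a couple of lines, using the article's own numbers:

```python
median_rate = 0.0243  # toys & games median conversion rate from the article

print(round(750_000 * median_rate))    # 18225 conversions/month at today's traffic
print(round(1_000_000 * median_rate))  # 24300: more traffic, same rate
print(round(750_000 * 0.032))          # 24000: same traffic, optimized rate
```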

How has mobile shopping affected ecommerce conversion rates?

The way people shop on ecommerce platforms has changed throughout the years.

When you look at mobile vs. desktop conversions, you’re going to see why most companies are putting a lot more effort into their mobile markets. However, this also differs industry by industry.

Mobile makes up a massive 41.32% of retail ecommerce conversions. However, when you look at the travel industry, mobile is only 30.15% of all conversions.

How Can You Improve Conversion Rates Using Information from Google Analytics?

Google Analytics can be a helpful tool for improving your conversion rate and driving sales. By examining your conversion rate over time, you’re able to see problems occurring. It also helps you by showing you your bounce rate (the rate at which people abandon your site). By looking at this data, you can better determine where the problem lies.

Before you can fix a problem, you first need to understand that there is one. That’s what Google Analytics helps you to do.

Perhaps the site or landing page is hard to understand, or the interface is not user-friendly.

Pricing could also be a factor. Visitors may balk at what you’re asking for and abandon your site in search of a more moderately priced competitor.

Using the information you’ve gathered from Google Analytics, you can figure out a game plan on how to optimize your user experience.

Ecommerce conversion optimization best practice #1: Get your copy right

Frequently, the issue is in the site’s text. You can fix it by writing more compelling copy. Remember, product copy should be snappy and persuasive while remaining relevant enough to push sales.

Headlines that are bland or unclear might drive customers away. When marketing a product, it’s important to illustrate its value right from the get-go. Check out the image below for an example of quick, snappy copy that will urge a customer to convert.

(Image Source)

Ecommerce conversion optimization best practice #2: Optimize your call to action

This is also a great time to look at your calls to action. If Google Analytics shows a large number of customers abandoning your site from its home page, then there might be an issue with your calls to action.

You can optimize these essential site elements by making your buttons larger. Play around with the wording of your calls to action in a way that gets people excited about acting. This is a perfect spot to speak about customer pain points and build trust.

Ecommerce conversion optimization best practice #3: Focus on images

Image quality is another area that might help with your conversion rate.

If Google Analytics shows people navigating away from image-heavy pages, perhaps your images aren’t drawing them in. They might be standing in the way of sales. It might be time to feature more pictures at a higher quality.

Ecommerce conversion optimization best practice #4: Feature great reviews

You should also consider using customer reviews to build trust.

Sometimes, a persuasive testimonial video or product example video can provide social proof to push prospective customers toward a conversion. Customers who leave reviews often become brand ambassadors, touting your services through social media and user-generated content. These go a long way toward driving sales.

Ecommerce conversion optimization best practice #5: A/B test everything

Consider changing some of your offers around to see what your audience is willing to pay. When you do this, monitor Google Analytics closely to watch for a change in conversion rate.

That’s a spot where an A/B test could come in handy. A/B testing is where you try out different elements on your site to see what combinations drive the most conversions.
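Before declaring a winning variant, it's worth checking that the difference isn't just noise. One common approach (not specific to Google Analytics or any tool named here) is a two-proportion z-test; a minimal sketch with made-up numbers:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference between two observed conversion rates."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (conv_b / n_b - conv_a / n_a) / se

# Made-up numbers: variant B converted 260/10,000 visitors vs A's 200/10,000
z = two_proportion_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 95% level
```

If |z| stays under the threshold, keep the test running rather than acting on an apparent lift.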

Ecommerce conversion optimization best practice #6: Give great customer service

Making customer service more readily available is also something that helps a customer trust you enough to convert. If they have questions but can’t receive answers quickly, they will look elsewhere.

And you should have more than just a contact form. Consider a live chat service, as they have a history of success. In fact, 79% of businesses report that using live chat has had a positive impact on sales and customer loyalty.

For example, Jerome’s Furniture, an ecommerce furniture business, managed to use live chat to improve conversion rates by 10x.

Implement your live chat service and then monitor Google Analytics to see how it impacts sales, user behavior, and conversion optimization.

In Conclusion

There is no magical formula for conversion rate optimization. That’s why Google Analytics is such a beneficial tool. It gives you a chance to try out new things, lets you see when problems arise and shows you the results of your efforts to drive sales and make a splash with your ecommerce website.

Ecommerce Conversion Rates in Google Analytics: What is it, What is a Good Number, How Do I Improve? was originally posted by Video And Blog Marketing

How to Write an Incredible Title Tag

The humble title tag. Probably the single most important 50-60 characters of that piece of content you’ve written. 

Perhaps you’ve found this post because you’ve spent hours pouring your soul into a piece of writing and now you’ve realised people will only read it if you write a good 50-60 characters. Or maybe your boss told you he needs quick wins for your product pages, so you’re turning in desperation to the ol’ title tag. Writing a good title tag is part art, part science. How do you do it?

We’ll start with some quick basics for beginners. If you’re looking for the split test results, fun processes & all the more advanced things, scroll down two sections. Nothing to see here.

Contents

What is a title tag?

The title tag of a page is the HTML tag which is used to summarise the content of your webpage. It’ll be used by search engines as the title in search:

Yes, I’m using my own post as an example…

In your browser tab:

And even as a fallback in social sharing posts:

It isn’t the same thing as the on-page title! An on-page title could be written as a variation of your title tag, or something completely different. If we take a look at the article I’m using as an example we can see that the brand isn’t on the on-page title.

  • Title tag: A Complete Guide to Log Analysis with Big Query | Distilled
  • On-page title: A Complete Guide to Log Analysis with Big Query

If you want a more severe example take a look at this Redbull article.

How long should a title tag be?

A title tag should typically be 50-60 characters. Technically Google’s maximum size is 600px. This usually works out at about 50-60 characters.
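Since the pixel limit is hard to measure without rendering the SERP, a plain character count is a common rough proxy. A minimal sketch (the 60-character cutoff here is an approximation, not an official Google limit):

```python
MAX_TITLE_CHARS = 60  # rough character proxy for Google's ~600px cutoff

def fits_serp(title, limit=MAX_TITLE_CHARS):
    """Flag titles likely to be truncated in search results."""
    return len(title) <= limit

title = "A Complete Guide to Log Analysis with BigQuery | Distilled"
print(len(title), fits_serp(title))  # 58 True
```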

What do we want a title tag to do?

Welcome back, experienced people. What do we want our title tags to do?

  1. Summarise our page: Our title should summarise the general thrust of our page. Google is going to use it to understand what our page is about.
  2. Get people to click: It’s what users are going to see in the SERP. We need to convince people to pick us.

And if you do just one, you usually don’t get the best results. For example, using the title from the blog post above:

  • Totally factual: A Guide on Log Analysis.
  • All click: 6 Easy Steps to Log Analysis They Don’t Want You To Know.

We want to maximise how clicky our titles are without… you know… lying, mentioning that one trick dentists hate and crucially without compromising on summarising the page.

The title is primarily for people arriving on your site from Google. We’re not trying to pull people in who are idling. Those people are on Facebook, TikTok, Youtube, Instagram etc. (I know we did mention above that the title can sometimes be for social, but you can overwrite that if you’d like!)

The audience for your title is someone searching with an intent & that always comes first.

The process is quite different now depending on if you’re writing for a single article, or a template. 

How to write a title tag for a single article

Step 1 – Write the article

Write the article. It’s far easier to write a title when you know what you’ve written about. (This is assuming you know what you’re writing about, otherwise, sometimes headline writing can be a good way to generate ideas.)

Step 2 – Summarise the primary purpose/point of the article

Pull out the primary purpose/point of the article. No clickiness yet, just the factual summary.

Example

Step 3 – Find the factual, commonly searched keywords needed to describe the topic

Try to summarise what someone might search to find your article. Aim for the simplest most basic version of it. Search that term, take the top 5-10 articles which rank for it, plug them into a tool like Ahrefs, SEMRush, Searchmetrics, Brightedge etc. and download all the keywords those articles rank for.

If the top 5-10 articles look nothing like yours either:

  • You’re first to a topic (unlikely, but possible)
  • Or your phrase is wrong, try again.

Once you’re happy with the phrase, take that big list of keywords and look for any other commonly occurring phrases you’re missing and take note.

Example

We’re going to continue using my old article on log analysis as an example. Because it doesn’t have a great title…

First search phrase pick: “log analysis” 

If we look up this keyword, these are the top articles (only 3 shown below). Clearly, none of these articles are about search log analysis, so I probably need to change my keyword:

Second search phrase pick: “seo log analysis”

Yep, that search result looks far better. We’ve still got a short phrase, but now the articles are on topic with my own:

Excellent. Now:

  • Let’s take all the URLs that rank in the top 5-10.
  • Download the keywords they rank for. (Ahrefs, SEMRush, Sistrix etc.)

And then get the most common keywords from that list. This ngrams tool is a nice way to do it. We get:

word frequency
log 164
analysis 65
file 56
analyzer 41
server 40
logs 29
grep 13
analyze 13
access 12
excel 11
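If you'd rather not paste the export into an online ngrams tool, the same frequency count takes a few lines of Python. A sketch with an invented stand-in keyword list, not the real export:

```python
from collections import Counter

# Invented stand-in for the downloaded ranking-keyword export
keywords = [
    "log file analysis", "seo log analysis", "log analyzer",
    "server log file analysis", "analyze log files",
]
counts = Counter(word for phrase in keywords for word in phrase.split())
print(counts.most_common(3))  # [('log', 5), ('analysis', 3), ('file', 2)]
```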

If we pull out the big generic words which would also apply to my article we get:

  • Log
  • Analysis
  • File

And possibly also:

  • Server

Step 4 – Writing lots of titles

Process

Now we’ve got all the factual words we’ll want in our title and brand.

What inspiration can we get for the clicky part? Let’s quickly blast through a couple:

  • Writing an emotional headline:
    • Fear
    • Surprise
    • Anger
    • Disgust
    • Affirmation
  • Adding numbers:
    • Number of items in a list
    • Price
    • Date
  • Shameless clickbait inspiration:
    • Adding in mindblowing adverbs
    • The word “actually”
    • Being unreasonably specific

Then we try to write as many headlines as we can, but without trading away our relevance and factual keywords. 

When I started I worked with Hannah Smith on several projects. I remember her beating into us – “Write 20 titles. 20 is really hard.” Most of them will suck, but you’ll force yourself to be creative and somewhere there might be gold.

Example

Back to our previous example.

We’ve got our important factual words. We also know we want SEO, as without it the intent of the results shown wasn’t correct. Together those 4 words (without server) take up 18 characters, which gives us roughly 32 characters left to play with. Let’s also look at our current title and see what we’re working with:

  • A Complete Guide to Log Analysis with BigQuery | Distilled
    • Making it clicky 
    • Factual description 
    • Brand 

We can see I’ve used “Complete Guide” to try and make it clicky and that I’ve also put the method of analysis “BigQuery” into the title. Both of these we could definitely play around with. Now we just try to write as many titles as we can.

  • “A Guide to SEO Log File Analysis | Distilled”
  • “What is a log file and why is it helpful for SEO? | Distilled”
  • “6 Stage SEO Log File Analysis – A Complete Guide | Distilled”
  • “How to do an SEO log file analysis | Distilled”
  • “SEO Log File Analysis – The most important technical analysis | Distilled”
  • “5 Ways to Analyse Log Files for SEO You Didn’t Know | Distilled”
  • “Logging in the SEO jungles of the internet | Distilled”
  • “Log analysis is the technical audit you should be doing | Distilled”
  • “Stop wasting your time crawling and look at the logs | Distilled”
  • “Log analysis for SEO in 2020 | Distilled”
  • “Server Log Analysis Guide – SEO For Large Websites | Distilled”

I started with the restrictions and gradually just ignored them in my attempt to get to 20 titles. I didn’t get there. Sorry Hannah.

Step 5 – Picking one

How do we decide which is best? 

Honestly, it’s savagely hard to pick the right title by yourself. Of all the title tag tests we’ve run at Distilled, only one in five is typically positive. When I first started in search, I thought titles were the easy win. About a year and a half of running endless title tag split tests and I’m no longer convinced.

If you can, test it. The two easiest ways for a single article are:

  • Paying for it: If you’ve got the budget, you could run paid social media campaigns and see which title performs best.
  • Friends & Colleagues: Make a poll for your friends & colleagues and get them to vote.

How to write hundreds of title tags for a template

The above process works great if all you need to write is a single title.

But if you’ve got a template with hundreds of thousands of pages, then you can’t really do that. Well, you could, but it would be exhausting. Instead, we’re going to need a format for a title that we can apply to all our pages, to make our template shine. That previous process won’t cut it.

Step 1 – Summarise the primary purpose/point of the page

We’re going to start by trying to summarise the attributes of the page in as much detail as possible. This will give us an idea of what pieces of detail we can pull into our titles across our template.

Example

I’ve pulled two page templates from rightmove.co.uk (this isn’t every page template but we’re keeping it simple):

Step 2 – Figure out what searches should return our template

Our templated page matches a specific intent. We need to figure out how to represent that in a title tag. 

Two things make this hard:

  • We might have multiple templates with similar intents.
  • The pages in our template may be similar.

We need to try and make a title which:

  • Differentiates our template from other templates.
  • Differentiates pages in our template from each other.

If we’re really struggling perhaps these pages shouldn’t even exist. But that’s a conversation for another day.

Example

We have two templates:

  • For sale
  • To rent

In this case, it’s pretty simple. “For sale” and “to rent” are clearly the important keywords that keep each template distinct. We can see that by looking at the SERPs: changing those keywords changes the results from for-sale listings to rentals.

Within our template, we have lots of different locations.

  • Properties for sale in Manchester
  • Properties for sale in Ipswich

In order to keep the pages in our template different, we’re going to need the location in the title.
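At template scale, the title becomes a fill-in-the-blanks pattern rather than a one-off. A hypothetical sketch (the function and pattern are mine, loosely mirroring Rightmove-style titles):

```python
# Hypothetical template function; the pattern loosely mirrors Rightmove's titles
def listing_title(listing_type, location, brand="Rightmove"):
    return f"Properties {listing_type} in {location} | {brand}"

for location in ["Manchester", "Ipswich"]:
    print(listing_title("for Sale", location))
# Properties for Sale in Manchester | Rightmove
# Properties for Sale in Ipswich | Rightmove
```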

Step 3 – Accept that it’s messy

But anytime you work with titles it’s going to get messy.

Take our previous example. Rightmove actually has pages for Manchester & Greater Manchester. One ranks for properties and the other for flats. Something is clearly going on there. Uh oh.

Should that change what we do?

When we’re working at scale, patterns are going to break down. There’s hopefully an underlying pattern, but look long enough and you’ll find exceptions. All we can do is our best: make a reasonable guess at what’s going on and (spoiler for step 6) test.

Step 4 – Are there any common phrases we’re missing?

This is exactly the same as step 3 for articles:

  • Take your phrase which summarises the page.
  • Search for it. Download all the keywords the top 5-10 results rank for.
  • Find the most common words.
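The counting step can be sketched with Python’s `collections.Counter`, assuming you’ve exported the keywords the top results rank for into a list (the sample keywords below are illustrative):

```python
from collections import Counter

# Hypothetical keywords exported from a rank-tracking tool
# for the top 5-10 results.
keywords = [
    "houses for sale manchester",
    "property for sale in manchester",
    "buy house manchester",
    "houses for sale in manchester city centre",
]

# Count individual words across all the ranking keywords.
word_counts = Counter(word for phrase in keywords for word in phrase.split())

for word, count in word_counts.most_common(5):
    print(word, count)
```

The most frequent words across competitors’ rankings are your candidates for inclusion in the template title.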

Example

To keep it brief, we’re going to stick with the properties-for-sale template for the rest of these steps! Running this with the top phrases for “properties for sale in manchester”, we get:

Keyword          Frequency
manchester       211
sale             122
for              107
for sale          96
houses            59
house             45
buy               42
sale manchester   40
houses for        36
property          32

Words to note here are all fairly self-explanatory:

  • Property
  • Houses
  • Buy

Step 5 – What can we add to make it more attractive?

We know what we need to include to make the intent of our page clear.

  • Property/houses
  • For sale/To rent
  • Location

Now let’s use that as a base and write as many titles as possible.  

We want to:

  1. Make them as clicky as possible.
     • Use extra attributes.
     • Get creative.
  2. Avoid using words which might change search intent.

A general difference between this and individual articles: an entirely factual title is far more acceptable for a template than for an individual article.

Generic ideas for things you can put in titles

  • Adding prices into the title.
  • Adding some sort of quantity into the title. 
  • Adding the year into the title. 
  • Put in the obvious e.g. “online” in an online shop.
  • Popular synonyms.

Words to watch out for that can change an intent

  • Comparison-style words: best, compare, etc. 
  • Deal-seeking words: cheapest, cheap, deal, affordable.

Example

Let’s have a go at writing titles for our category pages.

Our base is:

  • Properties for Sale in Manchester | Rightmove

Let’s make variants:

  • Properties & Houses for Sale in Manchester | Rightmove
  • Buy Properties & Houses for Sale in Manchester | Rightmove
  • Buy Houses & Properties for Sale in Manchester | Rightmove
  • 3,940 Houses & Properties for Sale in Manchester | Rightmove
  • 3,000+ Houses & Properties for Sale in Manchester | Rightmove
  • Properties for Sale – Houses for Sale in Manchester | Rightmove
  • 3,940 Houses & Properties for Sale Across Manchester | Rightmove
  • 3,940 Houses for Sale in Manchester – Get there first | Rightmove
  • 3,940 Properties for Sale in Manchester – Find your Happy | Rightmove

That’s a lot of variations. We even managed to fit their tagline in at the end.
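Writing variants by hand works, but you can also enumerate combinations of the optional pieces around the required base programmatically. A rough sketch (the components, count, and tagline are illustrative):

```python
from itertools import product

# Optional components to combine around the required base
# ("for Sale in Manchester"). All values here are illustrative.
prefixes = ["", "Buy ", "3,940 "]  # count pulled from the page, hypothetically
nouns = ["Properties", "Houses & Properties"]
suffixes = ["", " – Find your Happy"]

# Every combination of prefix x noun x suffix gives one candidate title.
variants = [
    f"{prefix}{noun} for Sale in Manchester{suffix} | Rightmove"
    for prefix, noun, suffix in product(prefixes, nouns, suffixes)
]

print(len(variants))  # 3 * 2 * 2 = 12 candidate titles
```

This gives you a candidate pool to pick from (or split test) rather than a single guess.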

Step 6 – Pick a title

Process

Just like with articles, we’re going to end up with a list of titles and no certainty about which one will be best. Far more than with individual title tags, it’s really important to split test.

  • Template level title tags are messy. We’ve already seen that in our example. You can make educated guesses from performing some large scale analysis, but there are going to be effects you miss. 
  • What works on one site won’t work on another & we’ve found only 1 in 5 title tags ends up being positive.
  • The stakes are often higher. We’re not changing one page, we’re changing a group of pages which is often a non-trivial amount of your search traffic.

If you can test at all, I’d highly recommend it. We’ve got plenty of resources to help you get started.

If you can’t test, you can at least lean on our tests, I’ve got results from those in the next section.

Important context for our title tag split tests

We’re lucky enough at Distilled to have access to SEO split-testing software we built. It lets us test different titles and accurately measure the impact on organic traffic. We’re about to talk about the results we’ve seen, so it’s important to briefly cover the assumptions implicit in them.

You can only run SEO split tests on large groups of similar pages (e.g. all category pages, all listing pages etc.) and that means our results are from certain types of websites:

  • The websites are mostly large and authoritative. 
  • They tend to be in competitive SERPs.
  • The companies usually have SEO teams who have done the basics. There usually isn’t anything glaringly awful like product pages without titles that we can fix.
  • The tests are more typically applied to template pages like category, product & listing pages rather than blog posts. (Although that’s not everything; we run split tests on the Moz Blog, for example!)

I think you can learn a huge amount from these tests, but it’s still important to bear those assumptions in mind.

What are the chances you write a good title tag?

Writing titles is really hard. We mentioned this above, but let’s look at our numbers in slightly more detail. We’ve run many title tag tests across different industries. Our results break down as follows:

  • Successes: 22%
  • Null: 38%
  • Failures: 40%

Oof. 78% of the time, title tag tests fall flat or actively harm the website. That makes testing super important. It’s entirely possible you could work on a website where no title tag test ever comes back positive: nothing you try will work. Without testing, you’d probably still roll out those titles. Just spotting the failures and not rolling them out will save you a huge amount of traffic.
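To see why, here’s the back-of-envelope expected value using the success/null/failure rates above. The ±10% average impact figure is purely an assumption for illustration (the article only gives a 4-15% range):

```python
# Test outcome rates from our split tests; the 10% impact figure is
# a hypothetical average, assumed symmetric for illustration.
p_success, p_null, p_failure = 0.22, 0.38, 0.40
impact = 10.0  # assumed average % traffic change, either direction

# Roll out every title change blind (no testing):
blind = p_success * impact + p_null * 0 + p_failure * -impact

# Test first, and only roll out the winners:
tested = p_success * impact

print(f"blind rollout: {blind:+.1f}% expected traffic change")
print(f"tested rollout: {tested:+.1f}% expected traffic change")
```

Under these assumptions, blind rollouts are negative in expectation, while testing flips the expected value positive simply by filtering out the failures.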

With a single article, this isn’t so worrying, you’ve got a far larger creative space to play in and if it does go wrong, it’s a far smaller proportion of your traffic.

If you’re changing titles on big page templates, please make sure you test them!

How much impact do title tag changes have?

Broadly, most title tag tests have an impact of 4-15% in either direction.

You can see a distribution of our title tag tests below.

7 learnings from title tag split tests

Most title tag changes are unique to a website, changing words and phrases which don’t generalise well from website to website. However, there are some more common patterns we’ve been able to test.

Putting in prices

50% of our title tag tests that involved adding the price into the title have been positive. Not only do we get a number into the title, it also provides more information.

Why was it null or negative the rest of the time? 

Our consultant Emily Potter thinks this is down to whether or not Google can find the price you put in the title on the rest of your page – i.e. are you being honest about price. We also think it may make a difference depending on how competitive you are on price.

Putting in year numbers

We haven’t had the chance to test this a huge number of times, but so far the change has been positive in the niches where we’ve tried it. Shamelessly putting 2019 or 2020 in the title has helped.

Shortening title tags hasn’t actually been that helpful

When you have lots of automatically generated titles, it’s common to end up with titles that are too long.

We’ve run a number of tests on shortening these titles and nearly all of them have been null (~80%). None have been positive. Our best current theory is that the templates which end up with long title tags typically attract long-tail traffic. Even when the titles are truncated, those pages are still the only relevant result and so continue to rank; perhaps for long-tail queries, keyword stuffing isn’t a problem.

Having said that, I’d still say it’s worth trying to shorten your titles. If you can cut 4-5 characters from your title with no effect, you can use that space to add the price or something else which may have an effect.
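If you do go hunting for spare characters, a quick pass to flag over-long titles is easy. Google actually truncates by pixel width, but ~60 characters is a common rule-of-thumb proxy (the titles below are illustrative):

```python
# Flag title tags likely to be truncated in the SERPs.
# 60 characters is a rough proxy; Google truncates by pixel width.
MAX_CHARS = 60

titles = [
    "Properties for Sale in Manchester | Rightmove",
    "3,940 Houses & Properties for Sale Across Greater Manchester | Rightmove",
]

for title in titles:
    if len(title) > MAX_CHARS:
        print(f"TOO LONG ({len(title)} chars): {title}")
```

Running something like this across a template’s generated titles tells you which pages have room to spare and which are candidates for trimming.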

Emojis didn’t work

We’ve run several tests to put emojis into title tags and so far it hasn’t helped. Sorry folks 😦

I mean, c’mon. Marketers can barely be trusted with FAQ schema; can you imagine what we’d do with emojis?

Eye-grabbing on category/listing pages

We’ve tried some title tags for category/listing pages which were very different, actively calling out to the user in the SERPs.

  • Standard: Ford for Sale | CarShop
  • Example of our type of test: You there! Fords for Sale at the CarShop

These did not work. 

Localising language

We tested using localised versions of phrases. These weren’t single-letter changes (like s vs. z in UK and US spelling), but entire words, e.g. “pants” instead of “trousers”.

This was notably positive (~20-25%).

Removing implied words from the title

We’ve seen mixed results from this. We ran a split test and found removing “online” from title tags had no effect for one particular client. Outside of our split-testing platform, for a different client, we removed the word “online” from the title of an online store.

Our rankings for terms including “online” dropped, and we quickly put it back in.

More detail on the split tests

If you want to hear more detail about some of these tests, or just love video and you’re signed up to DistilledU, you can see Emily Potter’s video on split testing from last year. If you’re not subscribed, you can see my slightly older talk here.

How long does it take to see the impact of a title tag change?

We usually see the impact of a title tag change in 3-5 days. We’ve had a couple which have taken longer, but that’s the majority. The previous caveats are of course important here: we typically work on larger websites, which are heavily crawled.

Summary

I genuinely thought when I started that I’d be able to get this post done in 1,000 words. Even now, I can see all the little bits of context and other considerations that go into writing a good title which I just couldn’t fit into this post. We didn’t even get started on internal politics 🙂

But hopefully, this has got you on your way. Now let’s hear some stories.

What title tag tests have you found effective? What’s the worst title tag you’ve ever tried?

How to Write an Incredible Title Tag was originally posted by Video And Blog Marketing