Month: October 2017

Boost Your Conversion Rate Through Effective Social Media Marketing

Social media is one of the most widely used marketing channels in the world, with billions of users worldwide. With an audience that large, it is crucial to tap into the power of social media for your SEO campaign if you expect a good conversion rate. Social media marketing is as complicated as it is large, and the trends and interests of users change on a daily basis. Getting things right means the reward will be immensely high, while getting it wrong means more difficulties for your team and your brand.

Social media is an important element of a successful SEO campaign, as engagement is necessary to get a good number of conversions. A solid strategy makes all the difference and goes a long way toward bringing your conversion rates up. Here are some effective and powerful strategies you can use to get the results you need.

Ensure your Links Point to Relevant Landing Pages

Take your users to where they want to go, which means ensuring that your links point to relevant landing pages. Taking people where they want to go helps foster trust and improves the overall user experience. Place the relevant social media sharing buttons on your website as well to improve your visibility and let your users share your content conveniently.

Constantly Assess your Data

Since social media trends change on an almost day-to-day basis, it is important to do regular assessments of the content that you are posting. You will be able to see what works and what does not, which helps you optimize your content quickly. Social media marketing teams should use data and analytics constantly, as these will tell you what results you are getting and where they are coming from.

Create Informative Content

Quality content is one of the main keys to an effective SEO campaign, and people will always look for meaningful and helpful content that answers their questions or tackles certain topics. Concise yet informative content keeps people engaged and positions you as a legitimate source they can rely on. Give them the right answers to their questions and go straight to the point.

Use Interesting Headlines

One of the best ways to get the attention of your users is by using interesting and eye-catching headlines. This is the first thing that social media users will see, and it is how they assess whether the article is worth the read. Think creatively when crafting headlines and you will see better conversion rates, which is good news for your social media marketing strategy.

Optimize Your Posts for Each Network

Each social media network has a different way of presenting content to its users. For example, Twitter only allows 140 characters in each post, which means that you have to rely on images to attract the user. Creating a specialized post for each network helps you learn the strengths of each one and assess how your posts will look.

Schedule Your Posts and Interactions

Interaction is the bread and butter of social media, and you will want to interact with as many people as possible. This can be a challenge, especially if your users come from different parts of the world, since time zones become a major factor. Having a proper schedule does wonders for your social media marketing campaign. There are social media tools you can use to do so, which can also help you post to different accounts at the same time.

Reshare and Repurpose

If you have a post that has received a high number of interactions, you can still use it in the future. Simply resharing it as-is is not an effective strategy, which means you should repurpose your content. This can be in the form of simple tweaks, like adding an image or a new quote. This prevents spamming and gives more visibility to your previous content.

Key Takeaway

The world of social media is vast, and being able to stand out brings better results for your social media marketing campaign. With these effective strategies, you will have the upper hand when it comes to producing quality content.

If you have questions regarding social media marketing, or SEO in general, leave a comment below and let’s talk.

Boost Your Conversion Rate Through Effective Social Media Marketing was originally posted by Video And Blog Marketing

The Right Tools to Track your Local SEO and SEM

Tracking Local SEO and SEM Using the Right Tools

Tracking your local SEO and SEM is highly important for your SEO and link building strategies. Being able to track these helps you determine which strategies work and which do not, so you can assess them and look for alternatives. While most businesses and companies focus on increasing their internet traffic using analytics trackers, more traffic does not necessarily mean you are getting the results you want. Traffic can only get you so far; what you really need are tangible results that benefit you.

This is why tracking matters, and using the right tools and steps would ensure that you have the right kind of set-up that you can access easily, which helps you see all the results you need.

Use Google Analytics

One of the most effective tools for tracking your local SEO and SEM is Google Analytics. There are many tracking tools on the market that you can purchase, but Google Analytics provides these functions for free. Once you have your account logged in and ready to go, it is time to set it all up.

Local SEO Google Analytic

The first thing you have to do is add your account and your website, and then install the tracking code in the website’s template. The best practice is to use a Google account of your own, not one belonging to a developer or agency. Instead of letting them create the account for your tracker, simply share it with their accounts.

Once that is done, the next step is to set up your time zone, currency, and the right website filters. This would help you view all the right data you need.

Setting up your Goals

Local SEO Goal

What you need, more than raw traffic, is conversions. These conversions are the tangible results that benefit your business. The best way to get them is by establishing the right goals. Here’s how you can establish goals using Google Analytics:

Local SEO New Goals

  • First, go to the Admin screen, and select Goals. Under Goals, you’ll be able to see + New Goal.
  • There are Goal templates that you can use if you are able to fit them with what you need. You can also create your own template, which is simple and very convenient.
  • The next step is to select a goal type and give your goal a name.
  • Keep in mind that you can add actual monetary values to your goals, which makes them a very important metric for evaluating how well you are doing and helps you assess your overall strategy.

Call and Form Tracking

One of the best ways to track conversions is by using call tracking tools. Some of these tools can be integrated with Google Analytics and convert your phone numbers into tracking numbers. This form of dynamic number insertion is helpful for your SEO campaign, as it lets you attribute phone-call conversions to the traffic sources that drove them without having to spend much.

Another popular tracking method is form tracking, which is already being used by various companies and businesses. This can be done through WordPress plugins like Ninja Forms and Gravity Forms. These plugins generate thank you pages that allow you to track clicks on the form submission button and give you proof that a form was completed.

These plugins are simple and easy to configure, and you can customize them to your liking. You can also use these thank you pages as ways to recommend other pages or services, which help keep users on your website. You can also use this approach when you are promoting events or during appointment booking and journal/newsletter signups.

Google Search Console

Local SEO Google Search Console dashboard

Google Search Console is a simple but useful tool that covers some basic SEO fundamentals and helps you find problems and run diagnostics that can improve your rankings. You can connect your Search Console account with your Analytics account and your Adwords account, which gives you a more seamless view of your data.

Google Adwords

Local SEO Google Adwords

Google Adwords is one of the main reasons why tracking your local SEO and SEM is so important. It gives you the conversion data you need to assess whether your overall approach and strategy is working. Using Adwords also makes clear why you should be tracking those clicks in the first place. One of the best things you can do with your Google Adwords account is link it with your Analytics and Search Console accounts, which gives you a one-stop location to view all of your crucial data.

Viewing your Data

Once you have all three tools linked together, you have a centralized tracking center where you can see all of the relevant statistics and metrics. From this data, you can compile everything into a single, well-detailed SEO report to send to your team, and even to your clients.

Local SEO Analytic Data

To be able to view your data, go to the Analytics page, and select Acquisition. Next, select All Traffic, and then Channels. You can get an overview of the relevant metrics, which include Organic Search, Conversion Rate, Referrals, and much more. Another way of viewing your data is by selecting All Traffic, and then instead of clicking Channels, click Source/Medium. This would help you view all of the URLs that you are currently tracking.

If you want a more concise version of all of these, you can simply view it in the Analytics Dashboard, which you can customize and adjust however you want it to be.

Key Takeaway

Tracking your local SEO and SEM may seem like a challenging task at first, but it is essential to increasing the effectiveness of your SEO campaign. By utilizing a free and powerful tool such as Google Analytics, you can be tracking your links and domains in no time.

If you have any questions about local SEO, or other SEO-related matters, leave a comment below and let’s talk.

The Right Tools to Track your Local SEO and SEM was originally posted by Video And Blog Marketing

All the Slides from SearchLove London 2017

As it always tends to do, SearchLove London flew by in a flurry of top-class presentations, intense marketing chat with fellow digital folks, and lots of caffeine. The 2017 edition was our 2nd sellout crowd in a row and it was really special to look across a packed auditorium as each speaker got up on stage to share with the audience some really cool stuff.

And, above everything else, that’s why people join us year after year: the quality of those 17 intimidatingly smart speakers. If you couldn’t make it along, or just need another look, you’ll find all the slides from the two days below…

Link building Case Studies, Myths and Fails by Paddy Moogan

Beyond the Reach of Keyword Targeting: The Evolution of Paid Media by Samantha Jane Noble

Reverse-Engineering Google’s Research on What Searchers are Looking for by Rob Bucci

Conquer Your Toughest Analytics Challenges and Level Up Your Marketing by Mike Arnesen

Go East, Innovators: Strategies From Asia the Rest of the World Needs to Adopt by Purna Virji

The New Era of Visual Marketing by Jes Scholz

Mobile-First Preparedness: What We’ve Learned From Crawling the Top 1 Million Websites by Jon Myers

Social Content Masterclass: Platform Specificity by David Levin

Content Distribution: How To Give Your Content More Life by Ross Simmonds

10 Steps to Make Power BI Help You Bust Silos in Search and With the C-suite by Wil Reynolds

A Competitive Analysis that Saved $337,000 by Zee Hoffman-Jones

The Campaign Flop: What to do When Your Content Fails by Kirsty Hulse

The Day After Tomorrow: When Ad Blockers Stop All Analytics Platforms by Samuel Scott

From Website to Web-App: Fantastic Optimisations and Where to Find Them by Emily Grossman

Digital Witness: Tales From the Charity Frontlines by Cheri Percy

The Why and How of Creating Video Content for Search by Justin Briggs

Seeing the Future: How to Tell the Impact of a Change Before You Make it by Will Critchlow

Be at the next SearchLove conference (in Sunny San Diego)

As one SearchLove event draws to a close, the planning for the next gets underway. We’ll be back at the beautiful Paradise Point resort in San Diego on 26-27 March 2018. Tickets are already on sale, with early bird discounts of $200 off every ticket. 

Want more great content in your inbox? Join the monthly newsletter.

All the Slides from SearchLove London 2017 was originally posted by Video And Blog Marketing

Facebook News Feed Experiments: Threat or Opportunity?

As Adam Mosseri, Head of News Feed at Facebook, noted in a post on Monday: “There have been a number of reports about a test we’re running in Sri Lanka, Bolivia, Slovakia, Serbia, Guatemala, and Cambodia.” The test he is referring to is that of moving all content posted by brand pages (not content shared by friends) from the main user News Feed into a separate tab named “Explore”.

What’s changed?

One of the first sources to write about the test was Filip Struhárik with the starkly titled “Biggest drop in Facebook organic reach we have ever seen”. The story has since been picked up by The Guardian (Facebook moving non-promoted posts out of News Feed in trial) and the BBC (Facebook explores, publishers panic). As you can tell from the titles, tensions are running high, which is understandable because the very people writing them could stand to be the hardest hit by another step of removal from their core audience. As you may also have gleaned, this trial applies to organic content only, not promoted posts. It is only a matter of time before someone comes up with an “-ageddon” nickname for the event (Explorageddon sounds like a tourist board advert), but as many have pointed out, the potential ramifications could be serious.

Purely from a publisher relations standpoint, this could perhaps have been handled better. As Mosseri mentions in his post “It’s also important to know this test in these six countries is different than the version of Explore that has rolled out to most people”. While it’s understandable that Facebook wouldn’t want to panic publishers by warning them of this planned test in advance, rolling out something so controversial in a limited geography and making it easily confused with something else far more widespread wasn’t a fantastic exercise in concern management. It’s akin to starting annual review day by firing the first few employees you meet and leaving everyone else to stew. It’s also understandable that any new release will come with its bugs, but Struhárik has reported page posts being removed from the main News Feed for users that don’t yet have the Explore section, meaning for those users all page posts are hidden in the Pages Feed section which I certainly hadn’t visited before today.

Change isn’t always a bad thing

It’s true that some changes that Facebook implement can make us better writers, marketers, and entertainers. The much-maligned algorithm update which reduced pages’ ability to reach their followers felt like it made life harder, but it allowed good publishers to get far more for their money by engaging with their communities and learning from what they like, rather than just pumping out 50 posts a day to rack up those juicy clicks. Much like AMP, Instant Articles made us consider what we can pare back and peel away to give visitors only what actually matters with as little wait time as possible, and I’m actually quite interested in some of their plans on monetising chat bots discussed in this podcast, for instance sales messages being blocked if a user hasn’t actually engaged with your messages in the last 24 hours.

A step backwards

However, it’s not always the case that these changes improve content quality. Facebook has also announced in the past that users don’t like reaching the end of their timeline; in response, it allowed individual publishers to appear multiple times in a user’s News Feed, and whether this improved satisfaction is debatable. In the same announcement, Facebook described users not wanting to see notifications of friends’ likes in their feed – after Facebook removed these notifications, low-quality pages just pivoted to “tag a friend who” memes, some of which exemplified the worst side of us on social media. More recently, Facebook has gathered that users want to see more from friends and family, which is one of the reasons it has given for this latest test.

My concern is that moving this content to a separate (currently quite hidden) section and only allowing paid content into the News Feed won’t make publishers better; it’ll quarantine the terrible content but lump it in with the good. It stands to make Facebook success more like the deep-pockets-or-black-hat game that exists elsewhere and hampers the success of small but genuinely talented content producers. It’ll also mean that publishers have even more inaccurate figures about the value of a follow, making it harder still for community managers to argue the case for investing in a community.

What’s more, I still don’t see it reducing the torrent of “Tag a mate who is s**t at golf” posts coming up in my feed because the real low-quality publishers already know how to get their content past Facebook’s net – get my friends to deliver it to me. There is even a host of “Tag a friend to make them open their phone and look at this cucumber for no reason” content – that’s content that is basing its success on mocking Facebook’s aim of showing you only what you want to see.

Want more advice like this in your inbox? Join the monthly newsletter.

Of course, Facebook has to make money but I am far happier with the current system which stands to make companies pay through the nose to distribute uninteresting and unoriginal content. While it’s far from perfect, the current method of checking content popularity leaves more of a gap for the intelligent, well-targeted, human content to run rings around generic uninspired posts, and even gives smaller publishers a better chance. It could be argued that users going to Explore will be primed to read and engage, but the number of times I open the “promotions” tab in Gmail speaks to the contrary, and that’s ignoring the fact that the Explore section currently won’t be limited to pages I subscribed to, but will include any content that Facebook deems appropriate.

The outcome

As Ziad Ramley, former social lead for Al Jazeera, suggests, this could all just be a flash in the pan. After the testing period, Facebook could well kill this experiment dead, or it might even roll out and have nothing like the negative impact we’re envisioning. Even though Facebook explicitly prioritises users over publishers, a stance that Techcrunch describes as the reason Facebook has survived so much change, there are certainly reasons why it might want to reverse this course of action. As Struhárik observed to The Guardian: when we finally get a News Feed that’s just friends, we may find out just how boring our friends are. Maybe we’ll jump into the Explore section when we get sick of hearing about Clea’s “nightmare” mole operation, or maybe we’ll just stop logging in.

One thing’s for sure: moving publishers out of the News Feed, even if it is accompanied by a reduction in the quality of experience, is bound to be far more frictionless than attempts to move organic page posts back in. If Facebook makes this change and usage goes down, I could imagine the smartest marketers playing News Feed exposure like the stock market, waiting for the drop in interest and investing heavily while Facebook tries to regain its lost momentum.

Facebook News Feed Experiments: Threat or Opportunity? was originally posted by Video And Blog Marketing

SERPed Site Management and Site Explorer Review

SERPED Site Management and Site Explorer Review cover

In SEO Hacker’s ever-growing toolbox, there are numerous tools that specialize in one specific function. Some specialize in keyword research, some in content assistance, others in tracking a site’s performance, and so on. However, we found out about SERPed and its claim to be an all-in-one SEO tool suite, and we just had to try it.

Tools like SERPed can greatly help your SEO campaign as they make things more organized, accessible, and efficient. Here’s what I think about it.

SERPed

Serped Login screen

Before logging into your account, the home page shows how much you and your company can save by using SERPed. The tool advertises itself as the do-it-all SEO tool that you can rely on, which means you would no longer have to download other SEO tools. At $79 a month, it looks like a great deal.

Serped Dashboard

Upon logging in, you instantly land on your current project. From here, you can access the different sites and tools that SERPed has to offer. Let us navigate each part.

Sites and Tools

Serped Sites

The sites section allows you to navigate through the different domains that you are currently working on. You also have the option of adding a new site or even creating a new project altogether.

Serped Tools

The tools section is where SERPed earns its money. You have everything you need for your SEO and link building campaign in one convenient location. Here is a brief rundown of all the tools that SERPed has to offer:

  • Keyword Research – This tool allows you to look for the right keywords for your SEO campaign, showing you how they rank and how competitive each keyword is.
  • Domain Research – This allows you to explore different domains on the internet, and gives you important statistics and data, like domain age, backlink data, and social metrics.
  • Domain Finding – This tool is the best way to look for the domains you would like to analyze and monitor. You can also look for available domains and see their prices, and expiring domains that would be available for purchase.
  • Site Manager – The site manager is where you would be accessing your projects, which would help you see how the domains are doing. You can also use the web analytics tool to see how they rank in search engines and use goal tracking to see if a domain is reaching certain milestones.
  • Rank Tracking – An effective SEO campaign needs results to back things up, and rank tracking allows you to see how well your keywords are doing locally and internationally. You can also see how well you are doing on sites like YouTube and Amazon.
  • Client Acquisition – If you are looking for prospective clients for your SEO and link building campaign, this tool would be able to help you contact them. You can explore and contact clients across different countries, and help them with their SEO campaign.
  • Other Tools – Along with all of the tools mentioned above, SERPed has other reliable tools that can help you, especially with content and blog management. The content curator and restorer allows you to manage and explore different articles in a blog, while the WordPress Manager gives you access to all of your blogs in an instant. Lastly, there is also a grammar checker, which comes in handy when you are writing and editing your content.

Next, let us focus on the Site Management and Site Explorer tools, which can be considered two of the best parts of this do-it-all tool.

Site Management

SERPed Sites Management

Upon entering the Site Management page, you can instantly see all of the projects that you currently have, along with information like IP address, Domain and Page Authority, Moz Rank, backlinks, and social metrics. You can also sort your projects alphabetically, or by their metrics and SEO rankings.

SERPed Projects Page

If you click one of the projects, you can see even more data and other important pieces of information, including keywords and rankings, and you can access different tabs.

SERPed Sites Statistics

On the Site Statistics tab, you get a very comprehensive list of data, which includes keyword search volume, Moz and Majestic data, and keyword count and density.

Serped Backlinks Profile

The Backlinks Profile tab shows you charts of the SERPed Rank Breakdown, Domain Age Breakdown, and Alexa Rank Breakdown. You can also view your anchor text cloud and see the sites that are linking to your domain.

SERPed Competitor

The Your Competitors tab shows how you stack up against similar companies in your field. This helps you track your competitors’ progress while assessing your own.

Serped SEO Review

The SEO Review tab is basically a detailed yet concise report on your project. It helps you work out the steps needed to take your SEO campaign further, as you can pinpoint the specific details that will help you improve.

Site Explorer

One of the more recently updated tools in SERPed is the Site Explorer, which has been streamlined to make it more user-friendly.

Serped Site Explorer

Once you enter a domain, you get an overview of the domain, along with other relevant statistics. The sidebar on the left lets you access this data without having to scroll down the whole page. Along with this, a Spam Analysis has been added, which helps you eliminate unnecessary links and evaluate potential ones.

Serped Spam Analysis

As you can see, SERPed gives each domain a spam score, which shows how at risk your domain is from “spammy” links.

Serped Link Velocity

Another important part of the Site Explorer is Link Velocity. This shows you how many links you have gained each month, which helps with your link building strategy. The Site Explorer is your quick-access tool for getting a fast read on how your SEO campaign is doing.

Verdict

With all of these tools available in a single place, SERPed is definitely a one-stop shop for your SEO and link building needs. While some tools only have a handful of functions and can only gather a limited amount of data, SERPed gives you access to many kinds of data, which helps you analyze and assess all the numbers you need to know. If you are looking for an affordable and multi-faceted SEO tool, you can’t go wrong with SERPed.

Key Takeaway

Versatility is key in today’s technology, which is why tools like SERPed are a welcome addition to the list of effective SEO software. Its features not only give you access to data, but also assist you with tasks such as site analysis and content management. With all of the SEO tools out there, I can say that SERPed is simply one of the best.

What do you think about SERPed? Tell me your thoughts in the comments section below.

SERPed Site Management and Site Explorer Review was originally posted by Video And Blog Marketing

How a nonprofit leveraged a Value Proposition Workshop to see a 136% increase in conversions

Experiment: Background

Background: Willow Creek Association is a nonprofit organization committed to transforming leaders in their respective communities. Each year, they host a two-day event (Global Leadership Summit) to provide those leaders with training from speakers in successful companies.

Goal:  To increase order conversion

Primary Research Question: How can we create more value to increase conversions?

When The Global Leadership Summit team saw a significant decline in conversions from 2015 to 2016, they established a testing culture to understand why. One of the hypotheses behind the decline was the removal of an incentive, but testing proved it was not the incentive that caused the decline; it was the value proposition. Their next step was to analyze their current page for gaps in perceived value for the prospect. The GLS team held a Value Proposition Workshop and applied their new learnings to their 2017 homepage for the summit — the results are worth sharing.

Experiment: Control

To begin, let’s focus on the value delivery of the control. At first glance, the GLS team noticed that the page held very little perceived value in the headlines and the copy. The GLS team concluded that the headline “About the GLS” did not give enough value. To a new prospect, who has never heard of The Global Leadership Summit, “GLS” might be a big jump. To assume that the prospect would understand this (or even need this information) is dangerous because it does not meet the prospect where they are in the thought sequence. As marketers, we need to ask ourselves: what is the sequential process in their minds as they enter this page? The prospect will probably ask questions more aligned to: How does this summit benefit me? What do I get out of it? Where is it located? Where do I have to travel? Who will be there? How can this improve my current career path? If marketers fail to ask the correct questions with the prospect in mind, we fail to find the correct answers.

As we journey down the page, we finally come across some useful information for the prospect. There is value in the “Location and Dates” section because it answers crucial questions the prospect might have: Where is it located? Where do I have to travel? Can this product help me? Answering these questions is great. However, its location on the page is not. What is it doing in the middle of the page? If the page fails to answer these critical questions in the first 4 inches, combined with the prospect’s impatience, the conversion could be lost. The GLS team discovered this was a problem that needed to be addressed.

And finally, after analyzing the entire page, there is absolutely no mention of the speakers in attendance. The GLS team observed that they were neglecting the other crucial questions, mentioned above, that prospects might have when entering this page.

Experiment: Treatment

Here is the new Global Leadership Summit page. The GLS team extracted the real value of the summit and transferred it to the homepage, but only after attending the Value Proposition Workshop. Let’s see how the GLS team addressed the value perception gap.

The GLS team added quantifiable claims in the headline … in the first 4 inches of the page. We can already see a stark difference between the 2016 and 2017 headlines. The larger headline reads “Two days of World Class Leadership Training,” while the smaller text above it reads: “You have Influence. Join 400,000 of your peers to learn how to maximize it with …” The smaller text quantifies the number of people in attendance and the popularity of the summit, while the larger text uses numbers to start showing instances of the Primary Value Proposition. This is an effective way to initially capture interest and build Credibility.

This headline not only holds Credibility in the numbers; there is also Specificity in the blue call-out box at the top of the page. The sub-headline under “The Global Leadership Summit” is specific about the location of the event, which erases the concern over travel arrangements (a potential pain point for prospects), thus creating value. We will continue to see the same information elaborated further below, which creates congruence.

They also added specific information about the speakers. In the control, there was virtually no information about the speakers. In this version, we can see the speakers listed, and additionally, we see that the GLS team provided vital information that fostered conclusions. The GLS team leveraged speaker headshots, names AND positions at their respective companies; this increased the prospect’s perceived value, answering the question: “What do I get out of this?”

And finally, they added value throughout the page. At MarketingExperiments, we call this Congruence. At the top of the page, there was copy that read “convenient location near you.” Although the “Location near you” section seems far from the top, the GLS team still alluded to this Primary Value Proposition in the main headline. Since this is the expanded section of the main Value Proposition, it creates congruence and reaffirms to the prospect that there is value.

Experiment: Results

So, what does the GLS team get from building credibility and being specific? Not just a forceful Value Proposition, but more than double the conversions.

Without value, you are doing nothing for the prospect

As blunt as that may seem, the truth is the truth. People do not spend time delving into webpages or emails without knowing they are receiving something at the other end. Friends, marketers, do not waste your time replicating other webpages with their nonsense information, designs and vernacular; instead, test and use the prospect’s thought sequence. Ask the right questions to get the right answers. These tools will give you the results that you want for your company.

For more about our value proposition training, click here. To watch The Global Leadership Summit webinar, click here.

How a nonprofit leveraged a Value Proposition Workshop to see a 136% increase in conversions was originally posted by Video And Blog Marketing

Proposing Better Ways to Think about Internal Linking

I’ve long thought that there was an opportunity to improve the way we think about internal links, and to make much more effective recommendations. I feel like, as an industry, we have done a decent job of making the case that internal links are important and that the information architecture of big sites, in particular, makes a massive difference to their performance in search (see: 30-minute IA audit and DistilledU IA module).

And yet we’ve struggled to dig deeper than finding particularly poorly-linked pages, and obviously-bad architectures, leading to recommendations that are hard to implement, with weak business cases.

I’m going to propose a methodology that:

  1. Incorporates external authority metrics into internal PageRank (what I’m calling “local PageRank”) so that we keep pure internal PageRank, the best data-driven approach we’ve seen for evaluating internal links, while avoiding the issues that focus its attention on the wrong areas

  2. Allows us to specify and evaluate multiple different changes in order to compare alternative approaches, figure out the scale of impact of a proposed change, and make better data-aware recommendations

Current information architecture recommendations are generally poor

Over the years, I’ve seen (and, ahem, made) many recommendations for improvements to internal linking structures and information architecture. In my experience, of all the areas we work in, this is an area of consistently weak recommendations.

I have often seen:

  • Vague recommendations – (“improve your information architecture by linking more to your product pages”) that don’t specify changes carefully enough to be actionable

  • No assessment of alternatives or trade-offs – does anything get worse if we make this change? Which page types might lose? How have we compared approach A and approach B?

  • Lack of a model – very limited assessment of the business value of making proposed changes – if everything goes to plan, what kind of improvement might we see? How do we compare the costs of what we are proposing to the anticipated benefits?

This is compounded in the case of internal linking changes because they are often tricky to specify (and to make at scale), hard to roll back, and very difficult to test (by now you know about our penchant for testing SEO changes – but internal architecture changes are among the trickiest to test because the anticipated uplift comes on pages that are not necessarily those being changed).

In my presentation at SearchLove London this year, I described different courses of action for factors in different areas of this grid:

It’s tough to make recommendations about internal links because while we have a fair amount of data about how links generally affect rankings, we have less information specifically focusing on internal links, and so while we have a high degree of control over them (in theory it’s completely within our control whether page A on our site links to page B) we need better analysis:

The current state of the art is powerful for diagnosis

If you want to get quickly up to speed on the latest thinking in this area, I’d strongly recommend reading these three articles and following their authors:

  1. Calculate internal PageRank by Paul Shapiro

  2. Using PageRank for internal link optimisation by Jan-Willem Bobbink

  3. Easy visualizations of PageRank and page groups by Patrick Stox

A load of smart people have done a ton of thinking on the subject and there are a few key areas where the state of the art is powerful:

There is no doubt that the kind of visualisations generated by techniques like those in the articles above are good for communicating problems you have found, and for convincing stakeholders of the need for action. Many people are highly visual thinkers, and it’s very often easier to explain a complex problem with a diagram. I personally find static visualisations difficult to analyse, however, and for discovering and diagnosing issues, you need data outputs and / or interactive visualisations:

But the state of the art has gaps:

The most obvious limitation is one that Paul calls out in his own article on calculating internal PageRank when he says:

“we see that our top page is our contact page. That doesn’t look right!”

This is a symptom of a wider problem which is that any algorithm looking at authority flow within the site that fails to take into account authority flow into the site from external links will be prone to getting misleading results. Less-relevant pages seem erroneously powerful, and poorly-integrated pages that have tons of external links seem unimportant in the pure internal PR calculation.

In addition, I hinted at this above, but I find visualisations very tricky – on large sites, they get too complex too quickly and have an element of the Rorschach to them:

My general attitude is to agree with O’Reilly that “Everything looks like a graph but almost nothing should ever be drawn as one”:

All of the best visualisations I’ve seen are nonetheless full link-graph visualisations – you will very often see crawl-depth charts which are in my opinion even harder to read and obscure even more information than regular link graphs. It’s not only the sampling but the inherent bias of only showing links in the order discovered from a single starting page – typically the homepage – which is useful only if that’s the only page on your site with any external links. This Sitebulb article talks about some of the challenges of drawing good crawl maps:

But by far the biggest gap I see is the almost total lack of any way of comparing current link structures to proposed ones, or for comparing multiple proposed solutions to see a) if they fix the problem, and b) which is better. The common focus on visualisations doesn’t scale well to comparisons – both because it’s hard to make a visualisation of a proposed change and because even if you can, the graphs will just look totally different because the layout is really sensitive to even fairly small tweaks in the underlying structure.

Our intuition is really bad when it comes to iterative algorithms

All of this wouldn’t be so much of a problem if our intuition was good. If we could just hold the key assumptions in our heads and make sensible recommendations from our many years of experience evaluating different sites.

Unfortunately, the same complexity that made PageRank such a breakthrough for Google in the early days makes for spectacularly hard problems for humans to evaluate. Even more unfortunately, not only are we clearly bad at calculating these things exactly, we’re surprisingly bad even at figuring them out directionally. [Long-time readers will no doubt see many parallels to the work I’ve done evaluating how bad (spoiler: really bad) SEOs are at understanding ranking factors generally].

I think that most people in the SEO field have a high-level understanding of at least the random surfer model of PR (and its extensions like reasonable surfer). Unfortunately, most of us are less good at having a mental model for the underlying eigenvector / eigenvalue problem and the infinite iteration / convergence of surfer models is troublesome to our intuition, to say the least.

I explored this intuition problem recently with a really simplified example and an unscientific poll:

The results were unsurprising – over 1 in 5 people got even a simple question wrong (the right answer is that a lot of the benefit of the link to the new page flows on to other pages in the site and it retains significantly less than an Nth of the PR of the homepage):

I followed this up with a trickier example and got a complete lack of consensus:

The right answer is that it loses (a lot) less than the PR of the new page except in some weird edge cases (I think only if the site has a very strange external link profile) where it can gain a tiny bit of PR. There is essentially zero chance that it doesn’t change, and no way for it to lose the entire PR of the new page.

Most of the wrong answers here are based on non-iterative understanding of the algorithm. It’s really hard to wrap your head around it all intuitively (I built a simulation to check my own answers – using the approach below).

All of this means that, since we don’t truly understand what’s going on, we are likely making very bad recommendations and certainly backing them up and arguing our case badly.

Doing better part 1: local PageRank solves the problems of internal PR

In order to be able to compare different proposed approaches, we need a way of re-running a data-driven calculation for different link graphs. Internal PageRank is one such re-runnable algorithm, but it suffers from the issues I highlighted above from having no concept of which pages it’s especially important to integrate well into the architecture because they have loads of external links, and it can mistakenly categorise pages as much stronger than they should be simply because they have links from many weak pages on your site.

In theory, you get a clearer picture of the performance of every page on your site – taking into account both external and internal links – by looking at internet-wide PageRank-style metrics. Unfortunately, we don’t have access to anything Google-scale here and the established link data providers have only sparse data for most websites – with data about only a fraction of all pages.

Even if they had dense data for all pages on your site, it wouldn’t solve the re-runnability problem – we wouldn’t be able to see how the metrics changed with proposed internal architecture changes.

What I’ve called “local” PageRank is an approach designed to attack this problem. It runs an internal PR calculation with what’s called a personalization vector designed to capture external authority weighting. This is not the same as re-running the whole PR calculation on a subgraph – that’s an extremely difficult problem that Google spent considerable resources to solve in their caffeine update. Instead, it’s an approximation, but it’s one that solves the major issues we had with pure internal PR of unimportant pages showing up among the most powerful pages on the site.

Here’s how to calculate it:

The next stage requires data from an external provider – I used raw mozRank – you can choose whichever provider you prefer, but make sure you are working with a raw metric rather than a logarithmically-scaled one, and make sure you are using a PageRank-like metric rather than a raw link count or ML-based metric like Moz’s page authority:

You need to normalise the external authority metric – as it will be calibrated on the entire internet while we need it to be a probability vector over our crawl – in other words to sum to 1 across our site:

We then use the NetworkX PageRank library to calculate our local PageRank – here’s some outline code:
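
A minimal sketch of that calculation looks like this (the file names and column headings are illustrative assumptions, and the choice of alpha is explained below):

import csv
import networkx as nx

# Build the internal link graph from a crawl export
# (assumed CSV columns: Source, Destination)
site = nx.DiGraph()
with open('crawl_edges.csv') as f:
    for edge in csv.DictReader(f):
        site.add_edge(edge['Source'], edge['Destination'])

# Raw (non-logarithmic) external authority values, e.g. raw mozRank
# (assumed CSV columns: URL, mozRank); pages missing from the dataset get zero
external = {}
with open('external_authority.csv') as f:
    for row in csv.DictReader(f):
        external[row['URL']] = float(row['mozRank'])
authority = {page: external.get(page, 0.0) for page in site.nodes()}

# Normalise to a probability vector over the crawl (must sum to 1,
# so at least one page needs a non-zero external value)
total = sum(authority.values())
personalization = {page: value / total for page, value in authority.items()}

# Local PageRank: personalised PageRank with a lower damping factor
local_pr = nx.pagerank(site, alpha=0.5, personalization=personalization)

# Twenty strongest pages by local PageRank
for page, score in sorted(local_pr.items(), key=lambda x: x[1], reverse=True)[:20]:
    print(round(score, 6), page)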

What’s happening here is that by setting the personalization parameter to be the normalised vector of external authorities, we are saying that every time the random surfer “jumps”, instead of returning to a page on our site with uniform random chance, they return with probabilities proportional to the external authorities of those pages. This is roughly like saying that any time someone leaves your site in the random surfer model, they return via the weighted PageRank of the external links to your site’s pages. It’s fine that your external authority data might be sparse – you can just set values to zero for any pages without external authority data – one feature of this algorithm is that it’ll “fill in” appropriate values for those pages that are missing from the big data providers’ datasets.

In order to make this work, we also need to set the alpha parameter lower than we normally would (this is the damping parameter – normally set to 0.85 in regular PageRank – one minus alpha is the jump probability at each iteration). For much of my analysis, I set it to 0.5 – roughly representing the % of site traffic from external links – approximating the idea of a reasonable surfer.

There are a few things that I need to incorporate into this model to make it more useful – if you end up building any of this before I do, please do let me know:

  • Handle nofollow correctly (see Matt Cutts’ old PageRank sculpting post)

  • Handle redirects and rel canonical sensibly

  • Include top mR pages (or even all pages with mR) – even if they’re not in the crawl that starts at the homepage

    • You could even use each of these as a seed and crawl from these pages

  • Use the weight parameter in NetworkX to weight links by type to get closer to reasonable surfer model

    • The extreme version of this would be to use actual click-data for your own site to calibrate the behaviour to approximate an actual surfer!

Doing better part 2: describing and evaluating proposed changes to internal linking

After my frustration at trying to find a way of accurately evaluating internal link structures, my other major concern has been the challenges of comparing a proposed change to the status quo, or of evaluating multiple different proposed changes. As I said above, I don’t believe that this is easy to do visually as most of the layout algorithms used in the visualisations are very sensitive to the graph structure and just look totally different under even fairly minor changes. You can obviously drill into an interactive visualisation of the proposed change to look for issues, but that’s also fraught with challenges.

So my second proposed change to the methodology is to find ways to compare the local PR distribution we’ve calculated above between different internal linking structures. There are two major components to being able to do this:

  1. Efficiently describing or specifying the proposed change or new link structure; and

  2. Effectively comparing the distributions of local PR – across what is likely tens or hundreds of thousands of pages

How to specify a change to internal linking

I have three proposed ways of specifying changes:

1. Manually adding or removing small numbers of links

Although it doesn’t scale well, if you are just looking at changes to a limited number of pages, one option is simply to manipulate the spreadsheet of crawl data before loading it into your script:

2. Programmatically adding or removing edges as you load the crawl data

Your script will have a function that loads the data from the crawl file and, as it does so, builds the graph structure (a DiGraph in NetworkX terms – which stands for Directed Graph). At this point, if you want to simulate adding a sitewide link to a particular page, you can do so – for example, if this line sat inside the loop loading edges, it would add a link from every page to our London SearchLove page:

site.add_edges_from([(edge['Source'],
'https://www.distilled.net/events/searchlove-london/')])

You don’t need to worry about adding duplicates (i.e. checking whether a page already links to the target) because a DiGraph has no concept of multiple edges in the same direction between the same nodes, so if it’s already there, adding it will do no harm.

Removing edges programmatically is a little trickier – because if you want to remove a link from global navigation, for example, you need logic that knows which pages have non-navigation links to the target, as you don’t want to remove those as well (you generally don’t want to remove all links to the target page). But in principle, you can make arbitrary changes to the link graph in this way.
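
To make that concrete, here is a rough sketch that strips a sitewide link to one page while keeping it on the pages known to link to it contextually. It operates on the site DiGraph built while loading the crawl, and the target URL and the list of pages to keep are made up for the example:

# Hypothetical example: strip the sitewide link to a newsletter page,
# but keep it where pages also link to it from within their body copy
target = 'https://www.example.com/newsletter/'
contextual_linkers = {
    'https://www.example.com/blog/october-update/',
    'https://www.example.com/about/',
}

# list() because we are removing edges while iterating over predecessors
for source in list(site.predecessors(target)):
    if source not in contextual_linkers:
        site.remove_edge(source, target)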

3. Crawl a staging site to capture more complex changes

As the changes get more complex, it can be tough to describe them in sufficient detail. For certain kinds of changes, it feels to me as though the best way to load the changed structure is to crawl a staging site with the new architecture. Of course, in general, this means having the whole thing implemented and ready to go, the effort of doing which negates a large part of the benefit of evaluating the change in advance. We have a secret weapon here which is that the “meta-CMS” nature of our ODN platform allows us to make certain changes incredibly quickly across site sections and create preview environments where we can see changes even for companies that aren’t customers of the platform yet.

For example, it looks like this to add a breadcrumb across a site section on one of our customers’ sites:

There are a few extra tweaks to the process if you’re going to crawl a staging or preview environment to capture internal link changes – because we need to make sure that the set of pages is identical in both crawls so we can’t just start at each homepage and crawl X levels deep. By definition we have changed the linking structure and therefore will discover a different set of pages. Instead, we need to:

  • Crawl both live and preview to X levels deep

  • Combine into a superset of all pages discovered on either crawl (noting that these pages exist on both sites – we haven’t created any new pages in preview)

  • Make lists of pages missing in each crawl and crawl those from lists

Once you have both crawls, and both include the same set of pages, you can re-run the algorithm described above to get the local PageRanks under each scenario and begin comparing them.

How to compare different internal link graphs

Sometimes you will have a specific problem you are looking to address (e.g. only y% of our product pages are indexed) – in which case you will likely want to check whether your change has improved the flow of authority to those target pages, compare their performance under proposed change A and proposed change B etc. Note that it is hard to evaluate losers with this approach – because the normalisation means that the local PR will always sum to 1 across your whole site so there always are losers if there are winners – in contrast to the real world where it is theoretically possible to have a structure that strictly dominates another.
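
As a sketch of that comparison, assuming live_pr and preview_pr are the two dictionaries produced by running the local PageRank calculation against the current and proposed structures, you can rank the biggest movers (keeping the caveat above about losers in mind):

def biggest_movers(live_pr, preview_pr, n=20):
    """Pages whose share of local PageRank changes most under the proposed structure."""
    pages = set(live_pr) | set(preview_pr)
    deltas = {page: preview_pr.get(page, 0.0) - live_pr.get(page, 0.0) for page in pages}
    ranked = sorted(deltas.items(), key=lambda x: x[1], reverse=True)
    return ranked[:n], ranked[-n:]  # biggest winners, biggest losers

# e.g. winners, losers = biggest_movers(live_pr, preview_pr)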

In general, if you are simply evaluating how to make the internal link architecture “better”, you are less likely to jump to evaluating specific pages. In this case, you probably want to do some evaluation of different kinds of page on your site – identified either by:

  1. Labelling them by URL – e.g. everything in /blog or with ?productId in the URL (see the sketch after this list)

  2. Labelling them as you crawl

    1. Either from crawl structure – e.g. all pages 3 levels deep from the homepage, all pages linked from the blog etc)

    2. Or based on the crawled HTML (all pages with more than x links on them, with a particular breadcrumb or piece of meta information labelling them)

  3. Using modularity to label them automatically by algorithmically grouping pages in similar “places” in the link structure
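
As a rough sketch of the first, URL-based option (the patterns are only examples), you can label each page and then total local PageRank by label, which makes it easy to compare how each page type fares under the current and proposed structures:

def label_page(url):
    # Illustrative URL-pattern labels - adapt these to your own site
    if '/blog' in url:
        return 'blog'
    if 'productId' in url:
        return 'product'
    return 'other'

def local_pr_by_label(local_pr):
    """Total local PageRank per page type, e.g. for live_pr vs preview_pr."""
    totals = {}
    for page, score in local_pr.items():
        label = label_page(page)
        totals[label] = totals.get(label, 0.0) + score
    return totals

# e.g. compare local_pr_by_label(live_pr) with local_pr_by_label(preview_pr)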

I’d like to be able to also come up with some overall “health” score for an internal linking structure – and have been playing around with scoring it based on some kind of equality metric, under the thesis that if you’ve chosen your indexable page set well, you want to distribute external authority as evenly throughout that set as possible. This thesis seems most likely to hold true for large long-tail-oriented sites that get links to pages which aren’t generally the ones looking to rank (e.g. e-commerce sites). It also builds on some of Tom Capper’s thinking (video, slides, blog post) about links being increasingly important for getting into Google’s consideration set for high-volume keywords, which is then reordered by usage metrics and ML proxies for quality.

I have more work to do here, but I hope to develop an effective metric – it’d be great if it could build on established equality metrics like the Gini Coefficient. If you’ve done any thinking about this, or have any bright ideas, I’d love to hear your thoughts in the comments, or on Twitter.
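
For what it’s worth, here is a minimal sketch of a Gini-style score over a set of local PageRank values (0 for a perfectly even distribution, approaching 1 as authority concentrates on a few pages). This is very much an exploratory measure rather than a settled one:

def gini(values):
    """Gini coefficient of non-negative values: 0 = perfectly equal, ~1 = concentrated."""
    values = sorted(values)
    n = len(values)
    total = sum(values)
    if n == 0 or total == 0:
        return 0.0
    # Standard formulation over the ranked values
    weighted_sum = sum((i + 1) * v for i, v in enumerate(values))
    return (2 * weighted_sum) / (n * total) - (n + 1) / n

# e.g. gini([local_pr[page] for page in indexable_pages]) for each scenario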

Proposing Better Ways to Think about Internal Linking was originally posted by Video And Blog Marketing

How to Get More SEO Value from your Nofollow Links

How to Get More Value from your Nofollow Links cover

The presence of nofollow links is important for your website, as they help generate more traffic to your pages. This makes them a crucial element in your link building campaign and a help to your SEO, which is why it’s worth getting the most value out of these links.

With this in mind, here are some things that you need to know about nofollow links, along with some great strategies that you can utilize to effectively improve your overall campaign.

Nofollow Links

Nofollow Links

A nofollow link is a link that carries a rel="nofollow" attribute, which tells search engines not to count it as an endorsement. This means that the authority and rank of the linked page will not be affected by those links. Nofollow links help prevent spam and low-quality web pages from spreading, and they have also helped user-generated content gain more traffic. You can use a browser plugin that detects nofollow links on a page and highlights them in a dotted box. Here’s what it looks like:

nofollow extension screenshot

Then, you can check the underlying code to see if it’s really a nofollow link.

rel nofollow screenshot
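
If you’d rather check a whole page at once than inspect the source by hand, a short script can list every nofollow link on it. Here is a rough sketch using the requests and BeautifulSoup libraries (the URL is just a placeholder):

import requests
from bs4 import BeautifulSoup

def nofollow_links(url):
    """Return the href of every link on the page whose rel attribute includes 'nofollow'."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, 'html.parser')
    return [a.get('href')
            for a in soup.find_all('a', href=True)
            if 'nofollow' in (a.get('rel') or [])]

print(nofollow_links('https://www.example.com/'))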

A good number of websites today apply nofollow to most of their external links. These links still hold value, as they allow potential visitors to enter a certain web page, which in turn increases traffic for that specific page. Some instances where nofollow links are usually used are:

  • Social Media Profiles: Any link posted on these profiles is a nofollow link, and this includes your Twitter, Facebook, Instagram, and LinkedIn accounts.
  • Article Sources: Some articles cite sources from other websites to lend proof and credibility to their content. It’s important to remember that most of these citations are nofollow links. Even so, they are effective in increasing the traffic that goes to the linked page or cited source.
  • Forums/Communities: Websites like Quora or Reddit contain a large amount of content that users submit openly. Some of this content contains links to the user’s website or to other websites they have cited. Forums like these usually nofollow such links.
  • Comments: Some websites allow users to share their insights on the content, which provides solid user input to the site. Any links in the comments section are nofollow links, which ensures that the website does not endorse unwanted links that might affect overall page quality.

Effective Strategies to Boost the Value of Nofollow Links

Nofollow links help connect users to different kinds of content on the internet. Here are some strategies and best practices you can use to give them even more value.

Utilize the Power of Social Media

Most people nowadays access the internet using mobile devices, such as phones and tablets. This has made content much more accessible and convenient, which makes nofollow links very important. For example, if you have published content that you would like other users to see, like a review or a blog post, you can post the link on a social media site like Twitter – even if the link is nofollow.

(Screenshot: a nofollow link shared in a tweet.)

You can also make use of Instagram hashtags, which help people associate your posts with your content. This works especially well for current news and the latest events. Social media has the power to spread and promote your content to a wider audience, so use it well.

Republish your Content on other Platforms

There are many platforms where you can publish and distribute content, so it pays to diversify beyond the ones you already use. Simply put, republish your content on other platforms in your niche.

For example, if you published an article on a news site, you can republish it on a different site as well, provided that no one holds exclusivity over that article. When you do, link the republished version back to the original post; this promotes your content and, through the effective use of those nofollow links, increases traffic to your page. This is similar to promoting your content across different social media accounts, and it will not hinder your SEO.

Make use of Quora Contributions

Quora hosts questions on a wide variety of topics, from entertainment and education to sports and much more. Many of these questions receive helpful, positive responses from users, some of which include external links that point readers to further resources. These external links are effective nofollow links that help users reach different kinds of content around the internet.

(Screenshot: an external nofollow link in a Quora answer.)

Low-Popularity Blogs

While they may not generate as much traffic as their high-popularity counterparts, comments on low-popularity blogs are great sources of good nofollow links. The reason is that your links stand out and come across as more authentic and reputable, which adds even more value. This can also lead to other users citing your content and treating you as an important source on their topics.

Key Takeaway

Nofollow links are essential when it comes to link building and SEO, which is why using these strategies will bring more positive results to your website and content.

Nofollow links do not transfer any link juice, but they still do an adequate job of increasing the traffic a page receives. Remember, too, that clicking the link is only the first part of the user’s journey; it’s your job to make visitors stay on your page and turn them into loyal followers. Giving your nofollow links more value ensures that you get a good amount of traffic to your webpages and brings you closer to your SEO goals.

If you have any questions or insights about nofollow links and SEO in general, leave a comment below and let’s talk.

How to Get More SEO Value from your Nofollow Links was originally posted by Video And Blog Marketing

Does Tomorrow Deliver Topical Search Results at Google?

(Photo: The Oldest Pepper Tree in California.)

At one point in time, search engines such as Google learned about topics on the Web from sources such as Yahoo! and the Open Directory Project, which provided categories of sites, within directories that people could skim through to find something that they might be interested in.

Those listings of categories included hierarchical topics and subtopics, but they were managed by human beings, and both directories have since closed down.

In addition to learning about categories and topics from such places, search engines used to use such sources to do focused crawls of the web, to make sure that they were indexing as wide a range of topics as they could.

It’s possible that we are seeing those sites replaced by sources such as Wikipedia and Wikidata and Google’s Knowledge Graph and the Microsoft Concept Graph.

Last year, I wrote a post called Google Patents Context Vectors to Improve Search. It focused upon a Google patent titled User-context-based search engine.

In that patent we learned that Google was using information from knowledge bases (sources such as Yahoo Finance, IMDB, Wikipedia, and other data-rich and well organized places) to learn about words that may have more than one meaning.

An example from that patent was that the word “horse” has different meanings in different contexts.

To an equestrian, a horse is an animal. To a carpenter, a horse is a work tool. To a gymnast, a horse is a piece of equipment they perform maneuvers upon during competitions.

A context vector takes these different meanings from knowledge bases, along with the number of times each is mentioned in those places, to catalogue how often the word is used in each context.
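
To make the idea concrete, you can picture a context vector as little more than a table of counts: for each sense of a word, how many knowledge-base passages mention it in that sense. The snippets and sense keywords in the toy sketch below are made up purely for illustration; this is not the patent’s actual method.

    # Toy illustration of the context-vector idea: count how often "horse"
    # co-occurs with terms that signal each sense. The snippets and sense
    # keywords are invented for this example.
    from collections import Counter

    snippets = [
        "the horse is a domesticated animal ridden by equestrians",
        "a sawhorse is a work tool used in carpentry to support boards",
        "the pommel horse is a piece of gymnastics equipment",
        "the animal known as the horse has been bred for riding",
    ]

    sense_keywords = {
        "animal": {"animal", "equestrians", "ridden", "riding", "bred"},
        "work tool": {"tool", "carpentry", "sawhorse", "boards"},
        "gymnastics equipment": {"gymnastics", "pommel", "equipment"},
    }

    context_vector = Counter()
    for snippet in snippets:
        words = set(snippet.split())
        for sense, keywords in sense_keywords.items():
            if words & keywords:
                context_vector[sense] += 1

    # Prints Counter({'animal': 2, 'work tool': 1, 'gymnastics equipment': 1})
    print(context_vector)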

I thought knowing about context vectors was useful for doing keyword research, but I was excited to see another patent from Google appear in which the word “context” plays a featured role. When you search for something such as a “horse”, the search results you receive are going to be mixed with horses of different types, depending upon the meaning. As this new patent tells us about such search results:

The ranked list of search results may include search results associated with a topic that the user does not find useful and/or did not intend to be included within the ranked list of search results.

If I were searching for a horse of the animal type, I might include another word in my query to better identify the context of my search. The inventors of this new patent seem to have had a similar idea. The patent mentions:

In yet another possible implementation, a system may include one or more server devices to receive a search query and context information associated with a document identified by the client; obtain search results based on the search query, the search results identifying documents relevant to the search query; analyze the context information to identify content; and generate a group of first scores for a hierarchy of topics, each first score, of the group of first scores, corresponding to a respective measure of relevance of each topic, of the hierarchy of topics, to the content.

From the pictures that accompany the patent, it looks like this context information takes the form of headings that appear above groups of search results, identifying the context those results fit within. Here’s a drawing from the patent showing topical search results (rock/music and geology/rocks):

(Patent drawing: search results in context, showing different types of ‘rock’ on a search for ‘rock’ at Google.)

This patent does remind me of the context vector patent, and the two processes in these two patents look like they could work together. This patent is:

Context-based filtering of search results
Inventors: Sarveshwar Duddu, Kuntal Loya, Minh Tue Vo Thanh and Thorsten Brants
Assignee: Google Inc.
US Patent: 9,779,139
Granted: October 3, 2017
Filed: March 15, 2016

Abstract

A server is configured to receive, from a client, a query and context information associated with a document; obtain search results, based on the query, that identify documents relevant to the query; analyze the context information to identify content; generate first scores for a hierarchy of topics, that correspond to measures of relevance of the topics to the content; select a topic that is most relevant to the context information when the topic is associated with a greatest first score; generate second scores for the search results that correspond to measures of relevance, of the search results, to the topic; select one or more of the search results as being most relevant to the topic when the search results are associated with one or more greatest second scores; generate a search result document that includes the selected search results; and send, to a client, the search result document.
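
Reading the abstract as a sequence of steps, the flow is roughly: score a hierarchy of topics against the context information, pick the topic with the greatest first score, rescore the search results against that topic, and keep the results with the greatest second scores. The patent does not publish code, so the keyword-overlap scoring in the sketch below is purely my own stand-in to make the flow runnable.

    # Highly simplified sketch of the flow described in the abstract.
    # The keyword-overlap "scoring" is a stand-in; the patent does not
    # specify this particular scoring method.

    def overlap_score(text, keywords):
        words = set(text.lower().split())
        return len(words & keywords)

    def filter_results_by_context(results, context_text, topic_keywords):
        # First scores: relevance of each topic in the hierarchy to the context.
        first_scores = {
            topic: overlap_score(context_text, keywords)
            for topic, keywords in topic_keywords.items()
        }
        best_topic = max(first_scores, key=first_scores.get)

        # Second scores: relevance of each search result to the chosen topic.
        second_scores = [
            (overlap_score(result, topic_keywords[best_topic]), result)
            for result in results
        ]
        second_scores.sort(reverse=True)
        return best_topic, [result for _, result in second_scores]

    results = [
        "rock music bands of the 1970s",
        "igneous rock formation and geology",
        "classic rock radio stations",
    ]
    context = "I was reading a geology article about minerals and sediment"
    topics = {
        "music": {"music", "band", "bands", "guitar", "radio"},
        "geology": {"geology", "minerals", "sediment", "igneous"},
    }

    topic, reordered = filter_results_by_context(results, context, topics)
    print(topic)         # geology
    print(reordered[0])  # igneous rock formation and geology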

It will be exciting to see topical search results start appearing at Google.

Does Tomorrow Deliver Topical Search Results at Google? was originally posted by Video And Blog Marketing

The SEO Apprentice’s Toolbox: Gearing Up for Analysis

Being new to SEO is tricky. As a niche market within a niche market, there are many tools and resources unfamiliar to most new professionals. And with so much to learn, it is nearly impossible to start real client work without first dedicating six months exclusively to industry training. Well…that’s how it may seem at first.

While it may be intimidating, investigating real-world problems is the best way to learn SEO. It exposes you to industry terminology, introduces you to valuable resources and gets you asking the right questions.

As a fairly new Analyst at Distilled, I know from experience how difficult it can be to get started. So here’s a list of common SEO analyses and supporting tools that may help you get off on the right foot.

Reviewing on-page elements

Page elements are essential building blocks of any web page. And pages with missing or incorrect elements risk not being eligible for search traffic. So checking these is necessary for identifying optimization opportunities and tracking changes. You can always go to the HTML source code and manually identify these problems yourself, but if you’re interested in saving a bit of time and hassle, Ayima’s Google Chrome extension Page Insights is a great resource.

This neat little tool identifies on-page problems by analyzing 24 common on-page issues for the current URL and comparing them against a set of rules and parameters. It then provides a list of all issues found, grouped into four priority levels: Errors, Warnings, Notices and Page Info. Descending from most to least severe, the first 3 categories (Errors, Warnings & Notices) identify all issues that could impact organic traffic for the page in question. The last category (Page Info) provides exact information about certain elements of the page.

For every page you visit Page Insights will give a warning next to its icon, indicating how many vulnerabilities were found on the page.

Clicking on the icon gives you a drop-down listing the vulnerabilities and page information found.

What makes this tool so useful is that it also provides details about each issue, such as how it can harm the page and how to correct it. In this example, we can see that the web page is missing an H1 tag, which could be corrected by adding an H1 tag around the page’s current heading (which is not coded as an H1).

In a practical setting, Page Insights is great for quickly identifying common on-page issues that should be fixed to ensure best SEO practice.
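
If you ever need to run the same sort of check without installing an extension, a few of these on-page elements (the title, meta description, and H1 count) can be pulled straight out of the HTML with a short script. A minimal sketch using Python’s standard library; the URL is a placeholder.

    # Minimal sketch of a quick on-page check: title, meta description and
    # H1 count. Uses only the standard library; the URL is a placeholder.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class OnPageChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.title = ""
            self.meta_description = None
            self.h1_count = 0
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "h1":
                self.h1_count += 1
            elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.meta_description = attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    checker = OnPageChecker()
    checker.feed(urlopen("https://example.com/").read().decode("utf-8", "ignore"))
    print("Title:", checker.title.strip() or "MISSING")
    print("Meta description:", checker.meta_description or "MISSING")
    print("H1 tags found:", checker.h1_count)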

Additional tools for reviewing on-page elements:

Supplemental readings:

Analyzing page performance

Measuring how a page loads and how quickly it does so is an important and common practice, since both are closely tied to user experience and highly valued by search engines. There are a handful of tools suited to this task, but because of the large number of metrics it includes, I recommend using WebPagetest.org.

Emulating various browsers, this site allows users to measure the performance of a web page from different locations. After sending a real-time page request, WebPagetest provides a sample of three tests containing request details, such as the complete load time, the load time breakdown of all page content, and a final image of the rendered page. There are various configuration settings and report types within this tool, but for most analyses I have found that running a simple test and focusing on the metrics presented in the Performance Results supplies ample information.

There are several metrics presented in this report, but the data provided in Load Time and First Byte works well for most checks. Factoring in Google’s suggestion to keep desktop load time under 2 seconds and time to first byte at 200ms or less, we can gauge whether or not a page’s speed is properly optimized.
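
WebPagetest is the right tool for the full picture, but if you just want a rough time-to-first-byte reading from your own machine, a few lines of Python will do. Treat this only as a sanity check, since it measures from your own location and network rather than from a controlled test agent; the URL is a placeholder.

    # Rough sanity check only: approximate time to first byte and total
    # download time from your own machine. Not a substitute for WebPagetest,
    # which tests from controlled locations and real browsers.
    import time
    from urllib.request import urlopen

    url = "https://example.com/"
    start = time.perf_counter()
    response = urlopen(url)
    response.read(1)   # connection, headers and the first body byte
    ttfb = time.perf_counter() - start
    response.read()    # download the rest of the body
    total = time.perf_counter() - start

    print(f"Approx. time to first byte: {ttfb * 1000:.0f} ms")
    print(f"Total download time: {total:.2f} s")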

Prioritizing page speed performance areas

Knowing if a page needs to improve its performance speed is important, but without knowing what areas need improving you can’t begin to make proper corrections. Using WebPagetest in tandem with Google’s PageSpeed Insights is a great solution for filling in this gap.

Free to use, this tool measures a page’s desktop and mobile performance to evaluate whether it has applied common performance best practices. Scored on a scale of 0-100, a page’s performance can fall into one of three categories: Good, Needs Work or Poor. However, the key feature of this tool, which makes it so useful for page speed performance analysis, is its optimization list.

Located below the review score, this list highlights details related to possible optimization areas and good optimization practices currently in place on the page. By clicking the “Show how to fix” drop down for each suggestion you will see information related to the type of optimization found, why to implement changes and specific elements to correct.

In the image above, for example, compressing two images to reduce the number of bytes that need to be loaded can improve this web page’s speed. By making this change, the page could expect a 28% reduction in image byte size.

Using WebPagetest and PageSpeed Insights together can give you a comprehensive view of a page’s speed performance and assist in identifying and executing on good optimization strategies.
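
PageSpeed Insights also exposes its results through a public API, which is handy when you need to check more than a handful of URLs. The sketch below assumes the v5 REST endpoint and the lighthouseResult response layout, both of which may differ from whatever version is current when you read this; the URL and the optional API key are placeholders.

    # Minimal sketch of querying the PageSpeed Insights API. The endpoint
    # version and response layout change over time, so the field names below
    # (lighthouseResult/categories/performance) are assumptions; inspect the
    # raw JSON for your API version. URL and key are placeholders.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    params = {
        "url": "https://example.com/",
        "strategy": "desktop",
        # "key": "YOUR_API_KEY",  # add a key for anything beyond occasional use
    }
    endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?" + urlencode(params)

    data = json.load(urlopen(endpoint))

    # Guarded lookup so the script still prints something useful if the
    # response layout differs from this assumption.
    score = (
        data.get("lighthouseResult", {})
            .get("categories", {})
            .get("performance", {})
            .get("score")
    )
    if score is not None:
        print(f"Performance score: {score * 100:.0f}/100")
    else:
        print("Top-level response keys:", sorted(data.keys()))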

Additional tools for analyzing page performance:

Supplemental readings:

Investigating rendering issues

How Googlebot (or Bingbot or MSNbot) crawls and renders a page can be completely different from what is intended, and this mismatch typically occurs when the crawler is blocked by a robots.txt file. If Google sees an incomplete or blank page, it assumes the user is having the same experience, which could affect how that page performs in the SERPs. In these instances, the Fetch as Google tool is ideal for identifying how Google renders a page.

Located in Google Search Console, Fetch as Google allows you to test whether Googlebot can access pages of a site, identify how it renders the page, and determine whether any resources are blocked from the crawler.

When you look up a specific URL (or domain) Fetch as Google gives you two tabs of information: fetching, which displays the HTTP response of the specified URL; and rendering, which runs all resources on the page, provides a visual comparison of what Googlebot sees against what (Google estimates) the user sees and lists all resources Googlebot was not able to acquire.

For an analysis application, the rendering tab is where you need to look. Begin by checking the rendering images to ensure both Google and the user are seeing the same thing. Next, look at the list to see which resources were unreachable by Googlebot and why. If the visual comparison does not show a complete page, or important page elements are being blocked from Googlebot, that is an indication the page is experiencing rendering issues and may perform poorly in the search engine.
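
Fetch as Google remains the authoritative test of what Googlebot can render, but you can catch obvious robots.txt blocks yourself before opening Search Console. A small sketch using Python’s built-in robots.txt parser; the domain, resource paths and user-agent string are placeholders.

    # Quick check for obvious robots.txt blocks before turning to Fetch as
    # Google. Domain, paths and user-agent are placeholders; Fetch as Google
    # remains the authoritative test of what Googlebot can render.
    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser()
    robots.set_url("https://example.com/robots.txt")
    robots.read()

    resources = [
        "https://example.com/",
        "https://example.com/assets/main.css",
        "https://example.com/assets/app.js",
    ]

    for url in resources:
        allowed = robots.can_fetch("Googlebot", url)
        print("allowed" if allowed else "BLOCKED", url)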

Additional tools for investigating rendering issues:

Supplemental readings:

Checking backlink trends

Quality backlinks are extremely important for a strong web page, as they signal a page’s reliability and trustworthiness to search engines. Changes to a backlink profile can easily affect how a page ranks in the SERPs, so checking this is important for any webpage or website analysis. As a testament to its importance, there are several tools dedicated to backlink analytics; I prefer Ahrefs for its comprehensive yet simple layout, which makes it great for on-the-spot research.

An SEO tool well known for its backlink reporting capabilities, Ahrefs measures several backlink performance factors and displays them in a series of dashboards and graphs. While there is plenty to review, for most analysis purposes I find the “Backlinks” metric and “New & lost backlinks” graph to be the best places to focus.

Located under the Site Explorer tab, “Backlinks” identifies the total number of backlinks pointing to a target website or URL. It also shows the quantitative changes in these links over the past 7 days with the difference represented by either a red (negative growth) or green (positive growth) subscript. In a practical setting, this information is ideal for providing quick insight into current backlink trend changes.

Under the same tab, the “New & lost backlinks” graph provides details about the total number of backlinks gained and lost by the target URL over a period of time.

The combination of these particular features works very well for common backlink analytics, such as tracking backlinks profile changes and identifying specific periods of link growth or decline.
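
If you export the new and lost backlink counts from your tool of choice (Ahrefs and most backlink tools offer CSV exports), a few lines of Python will turn them into the same net-growth picture the graph gives you. The numbers in the sketch below are made up purely for illustration.

    # Toy sketch: net backlink growth over a period from exported counts.
    # The numbers are invented; in practice you would read them from a CSV
    # export of your backlink tool.
    daily_counts = [
        # (date, new_backlinks, lost_backlinks)
        ("2017-10-01", 14, 3),
        ("2017-10-02", 9, 5),
        ("2017-10-03", 4, 11),
        ("2017-10-04", 21, 2),
        ("2017-10-05", 7, 6),
        ("2017-10-06", 12, 4),
        ("2017-10-07", 5, 9),
    ]

    net_change = sum(new - lost for _, new, lost in daily_counts)
    worst_day = min(daily_counts, key=lambda row: row[1] - row[2])

    print(f"Net change over the period: {net_change:+d} backlinks")
    print(f"Biggest single-day loss: {worst_day[0]} ({worst_day[1] - worst_day[2]:+d})")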

Additional tools for checking backlink trends:

Supplemental readings:

Creating your toolbox

This is only a sample of tools you can use for your SEO analyses and there are plenty more, with their own unique strengths and capabilities, available to you. So make sure to do your research and play around to find what works.

And if you take away only one thing from this post, remember that as you build your own personal toolbox, what you choose to include should work best for your needs and the needs of your clients.

The SEO Apprentice’s Toolbox: Gearing Up for Analysis was originally posted by Video And Blog Marketing