Month: November 2019

A Guide to Creating an SEO Strategy for Website Revamps and Migrations


Most SEO horror stories come from one thing – site revamps and migrations. A client’s failure to inform their SEO provider about a site revamp or migration often leads to disastrous SEO results. But there are also instances where an SEO provider doesn’t fully grasp the tasks that need to be done when a migration or revamp is in order. That’s why I made this guide: to remind fellow webmasters and SEOs of the important tasks that need to be done when their sites are revamped or migrated.

After almost a decade in the SEO industry, I’ve experienced my fair share of site migrations and revamps gone wrong. Through those mistakes, I’ve learned the importance of having a complete process for site migrations and an appreciation of the opportunities that site revamps offer. Let’s start.

What Are Site Migrations and Revamps?

These terms describe events where a website undergoes changes that considerably affect its performance in the search engines. These changes may include:

  • Change in domain name
  • Change in website architecture
  • Change in website design (UI/UX)
  • Change in website systems and platforms

Each of these adds to the overall difficulty of avoiding or recovering from the repercussions of a migration and/or revamp.

Common Problems with Site Migration and Revamp (And Solutions)

Since problems and mistakes are more common than perfect migrations, here are some of the problems I’ve consistently experienced while our clients’ websites were being migrated or revamped:

Redirects 

The most common forms of redirection are 301s and 302s. There’s a definite difference between the two, but it’s also common for developers and webmasters to mistake one for the other. A 301 redirect is the best redirection method for site migrations since it signals that the old page is permanently redirected to the new one. Meanwhile, a 302 redirect is only temporary – signaling that the old page may still be used in the near future.
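
Since a migration can involve hundreds of redirects, it’s worth spot-checking them in bulk. Below is a minimal sketch, using the third-party requests library, that verifies a redirect map returns 301s (not 302s) and points where you expect; the URL pairs are placeholders.

```python
# Verify that old URLs 301-redirect to the expected new URLs.
# The URL pairs below are illustrative placeholders.
import requests

redirect_map = {
    "https://old.example.com/about": "https://new.example.com/about-us",
    "https://old.example.com/blog": "https://new.example.com/blog",
}

for old_url, expected in redirect_map.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    status = resp.status_code
    target = resp.headers.get("Location", "")
    verdict = "OK" if (status == 301 and target == expected) else "CHECK"
    print(f"{old_url} -> {status} {target} [{verdict}]")
```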

After 301 redirecting the pages, most webmasters forget to use a vital tool for site migrations: Google’s Change of Address tool. Here’s how to use it:

  • Go to your Search Console property

Search Console Screenshot

  • Go to Settings 

Search Console Settings Screenshot

  • Click on Change of Address and fill in all the necessary details.

Change of Address tool Screenshot

That’s how you properly redirect and migrate your old site to the new one.

Orphaned Pages

Another problem that I’ve experienced is a massive increase in orphaned pages because the internal linking structure of our client’s pages was not retained. Many blog posts became orphaned pages and slowly disappeared from search results because I never noticed them becoming orphaned. Of course, finding orphaned pages is easily done by using crawling tools like Screaming Frog and Netpeak Spider to run a quick audit of your website.

The solution I came up with is to create an internal linking database that contains the internal links pointing to specific pages on our client’s website, so the next time a site migration or revamp happens, I know which pages our internal links point to.

Internal Linking Database Screenshot

It’s a lot of hard work, but it helps me keep track of our clients’ internal linking structure, which helps us avoid orphaned pages on a new domain or newly revamped site.
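
If you’d rather not maintain the database entirely by hand, a crawler-based check works too. Here’s a minimal sketch, assuming your sitemap lists every page that should exist: it crawls internal links from the homepage and flags sitemap URLs that nothing links to (example.com is a placeholder).

```python
# Find orphan candidates: sitemap URLs that no crawled page links to.
import urllib.request
import xml.etree.ElementTree as ET
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

SITE = "https://www.example.com/"  # placeholder

class LinkCollector(HTMLParser):
    """Collects same-host links from <a href> tags."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        href = dict(attrs).get("href") if tag == "a" else None
        if href:
            url = urljoin(SITE, href).split("#")[0]
            if urlparse(url).netloc == urlparse(SITE).netloc:
                self.links.add(url)

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="ignore")

# Breadth-first crawl of internally linked pages (capped for the example).
seen, queue, linked = set(), [SITE], set()
while queue and len(seen) < 500:
    url = queue.pop(0)
    if url in seen:
        continue
    seen.add(url)
    collector = LinkCollector()
    collector.feed(fetch(url))
    linked |= collector.links
    queue += [u for u in collector.links if u not in seen]

# Anything in the sitemap that was never linked to is an orphan candidate.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(fetch(SITE + "sitemap.xml"))
sitemap_urls = {loc.text.strip() for loc in root.findall(".//sm:loc", ns)}
for url in sorted(sitemap_urls - linked):
    print("Possible orphan:", url)
```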

Noindex Meta Tag

The last common mistake I’ve experienced is not removing the noindex meta tag while the site is transitioning from the dev site to the live site. The problem is that it causes most of a website’s pages to be removed from Google’s index, drastically decreasing the website’s visibility. In one case, it led to this:

Google Analytics No Index Screenshot

For almost a month, traffic to our client’s website flatlined, and it took my team days of investigation to find out that the site still had its noindex meta tag from the dev site.

The countermeasure I came up with is to have a dedicated quality assurance team in charge of checking a site for faults and errors before it goes live.
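
As part of that QA checklist, the noindex check itself is easy to automate. Here’s a minimal sketch using the third-party requests and BeautifulSoup libraries to scan a list of live URLs for a leftover noindex robots meta tag; the URL list is a placeholder.

```python
# Flag live pages that still carry a noindex robots meta tag from the dev site.
import requests
from bs4 import BeautifulSoup

urls = [  # placeholder list; in practice, feed in your sitemap URLs
    "https://www.example.com/",
    "https://www.example.com/blog/",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for meta in soup.find_all("meta", attrs={"name": "robots"}):
        if "noindex" in meta.get("content", "").lower():
            print(f"WARNING: {url} still has {meta}")
```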

SEO Strategy for Website Migration and Revamps

You can think of a site migration or revamp as starting out with a clean slate, but with an advantage. You’re technically optimizing a new, up-and-running site, but it’s using the value, authority, and signals of the old site to boost its way up the search engine results pages. So, even before a website migration or revamp happens, you should already have created a strategy that will benefit and align with the new site. Some of the factors you need to consider are:

Website Structure

Your website’s architecture, internal linking structure, and even each page’s onsite factors are all due for an update. This is your chance to try out things you’ve been meaning to experiment with, or you can retain your old strategy and hope that it works just as well for the new site. You can even try out new strategies such as HubSpot’s topic cluster content strategy.

Keywords

Keywords are the foundation of your SEO, so make sure that the keywords you’re targeting are extremely beneficial to your search marketing goals. Site migrations and revamps enable you to completely rework a page’s design and content to better fit the keyword you’re targeting. For example, if you’re in the lower bracket of the first page for a specific keyword, augmenting the content and structure/design of the page can help you reach the upper bracket or even the number 1 spot in the search results.

Onsite Optimization

One of the best experiences I’ve had was when a website transitioned to a new domain. One of the onsite problems that plagued my team with the old website was the lack of onsite optimization, especially in the URL structure and slugs. So, when the client wanted to transition to a new domain, my team took that chance to optimize every little detail they could find on the new domain’s pages – including the site’s URL structure and slugs.

Key Takeaway

While it is true that migrating or revamping a website poses a lot of problems, it also opens new doors of opportunity for your SEO strategy. Always do your research and be ready when problems arise – that’s the best way to avoid mishaps during revamps and migrations. Having your new SEO strategy ready when the site goes live saves you time and money, and opens you up to many more opportunities for growth and experimentation. Want to share your experience with a site migration or revamp? Leave a comment down below!


SEO Tips: Upgrade your Content Strategy with NLP


Natural Language Processing (NLP) is the field that focuses on the interaction between human language and computers. As simple as that may sound, there are many intricacies associated with it, especially if you are planning on implementing it for SEO. Before diving in headfirst on building your SEO content around NLP, it would be wise to craft a strategy that fulfills its purpose.

With the introduction of the BERT update, optimizing your content with NLP in mind bears significant weight for us SEOs. Although it is a given that you shouldn’t churn out content just because you think it will perform well in the SERPs, many content marketers are guilty of this. The Google BERT update pushes us to rethink our content strategies in pursuit of satisfying a natural language algorithm and catering to user queries better.

With that said, what are the things you need to consider in creating an NLP content strategy? What are the metrics that will help you succeed with these techniques? Read on below.

Starting a Sharp NLP Content Strategy

At the heart of it all, NLP is about straightforward content – content that is easily understood by both people and systems. Starting a content strategy also calls for thinking about its technical counterpart. Relating to how the systems work and then integrating this into your content is one of the ways to achieve success with your strategy.

Much like its programming counterpart, the way to kick off an NLP strategy is to start with a proper data set for the content that you need. This is when you need to conduct in-depth research about your topic. Say, for example, you are writing an article about automotive parts.

What you can do is break down your topic into questions. Let your whole body of content be the answer to your topic. If it would be easier for you, write direct-to-the-point questions as the subheadings of your article, each covering different information you can answer about automotive parts.

The example below shows how NLP can be favorable for your SEO. For this article, I tried to create a direct-to-the-point piece addressing metal stamped parts and their application in the automotive industry. As you can see, we have secured the featured snippet for the keyword on how metal stamped parts are used in the automotive industry. This says a lot about NLP and how you can use it in your favor.

Roberts Featured Snippet Screenshot

The proper data set will ensure the relevance of your strategy’s output with regard to your input, which means that the whole essence of the content and how it helps answer user queries will quickly follow suit. Let the separate elements of your research bind as one as you craft your content to fulfill search intent.

Although NLP in content and NLP in a programming language are implemented differently, both serve the same intent and purpose: understanding user behavior and sentiment through the information that you deliver.

Factors to Consider in Creating an NLP Strategy

According to wordlift.io, search engines are improving their systems to gain a better understanding of user intent through their linguistic AI capabilities. To make sure that you are on the right track in optimizing for NLP, think about the factors stated below and try to experiment with them in your content.

Injecting User Queries into Content

Keywords are very powerful, especially if you use them in the right context. The relevance of your content to your keyword is the recipe for SEO success. To do this, open your Google Search Console and see which queries Google is showing your site for. Maximize the use of these keywords in your content generation and you’re all set.

Search Console Queries Screenshot

This is not to say that your topics should be limited to what is listed in your search console. Explore variations of these keywords and extract elements of information related to them. As I have said before, breaking down your content into parts would entice users to learn more about these particular topics. Make use of the tools at your disposal and keep NLP in mind when constructing these sentences.
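
If you’d rather pull those queries programmatically than export them by hand, the Search Console (Webmasters) API exposes the same data. Below is a minimal sketch, assuming a service account with access to the property; the site URL and credentials file are placeholders.

```python
# Pull top search queries for a property from the Search Analytics API.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES  # placeholder credentials file
)
service = build("webmasters", "v3", credentials=creds)

response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",  # placeholder property
    body={
        "startDate": "2019-10-01",
        "endDate": "2019-10-31",
        "dimensions": ["query"],
        "rowLimit": 100,
    },
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], "clicks:", row["clicks"], "impressions:", row["impressions"])
```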

Limit Stopwords

One element of NLP that you should consider is limiting the stopwords that you use with your keywords. Creating seed and semantic keywords is the best way to satisfy this factor. What you should know about NLP is that you are not just optimizing content for the search engines but also for users. So limit the use of stopwords and maximize the use of your keyword variations, so that both of these agents understand what information you are delivering. That way, you hit two birds with one stone: answering user queries and delivering content directly.
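
To see how much of a phrase actually carries meaning once stopwords are stripped, you can lean on a standard stopword list. Here’s a minimal sketch using NLTK’s English stopwords; the sample phrase is only an illustration.

```python
# Strip stopwords from a draft phrase to see which content words remain.
import nltk
from nltk.corpus import stopwords

nltk.download("stopwords", quiet=True)  # fetch the word list on first run
stop_words = set(stopwords.words("english"))

phrase = "how are metal stamped parts used in the automotive industry"
content_words = [w for w in phrase.split() if w not in stop_words]

print(content_words)
# ['metal', 'stamped', 'parts', 'used', 'automotive', 'industry']
```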

Understanding Salience

Salience is one of the most important ingredients of an NLP strategy. Mastering content relevance is your best bet for a solid strategy. Strive to create content that does not stray far from the topic. The relationship between each part of the text plays an important role in delivering a piece that is closely related to the essence of NLP. Avoid fillers. If you are aiming to say something, do not beat around the bush. Write as you would talk to another person.
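
You can actually measure salience rather than guess at it. Google’s Cloud Natural Language API returns a salience score (0.0 to 1.0) for each entity it finds, estimating how central that entity is to the text. Here’s a minimal sketch, assuming the google-cloud-language client library and credentials are already set up; the sample sentence is an illustration.

```python
# Score entity salience for a draft sentence with the Natural Language API.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
text = "Metal stamped parts are widely used in the automotive industry."
document = language_v1.Document(
    content=text, type_=language_v1.Document.Type.PLAIN_TEXT
)

response = client.analyze_entities(document=document)
for entity in response.entities:
    print(f"{entity.name}: salience {entity.salience:.2f}")
```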

Aim for a Concise Output

Developing an NLP strategy for SEO would also mean that you should aim for a concise output. Together with relevance, you would know that your content is well-fitting for an NLP strategy if you succeed in getting your point across. Consider this especially for voice search optimization which will require you to optimize your content to cater to quick queries. Be as concise as possible. Just like with content relevance, it would be a good idea to present clear and direct information.

Key Takeaway

If you aren’t already implementing NLP in your content strategy, you are surely missing out. Of course, you should make your own strategy that is tailor-fit to your brand and audience. This brief guide is here to help you get started and to encourage you to experiment with this type of strategy. It will go a long way, especially in increasing traffic and generating growth for your blog or site. Try it today and tell me how you started out by commenting down below.


User-Specific Knowledge Graphs to Support Queries and Predictions

A recently granted patent from Google is about supporting querying and predictions, and it does this by focusing on user-specific knowledge graphs.

Each of those user-specific knowledge graphs is built around a particular user.

This means Google can use those graphs to provide results in response to one or more queries submitted by the user, and/or to surface data that might be relevant to the user.

When I saw this patent, I was reminded of another patent that I recently wrote about in the post Answering Questions Using Knowledge Graphs, where Google may perform a search on a question someone asks and build a knowledge graph from the search results returned, to use in finding the answer to that question.

So Google doesn’t just have one knowledge graph but may use many knowledge graphs.

New ones for questions that may be asked, or for different people asking those questions.

This User-Specific Knowledge Graph patent tells us that innovative aspects of the process behind it include:

  1. Receiving user-specific content
  2. The user-specific content can be associated with a user of one or more computer services
  3. That user-specific content is processed using one or more parsers to identify one or more entities and one or more relationships between those entities
  4. A parser being specific to a schema, and the one or more entities and the one or more relationships between entities being identified based on the schema
  5. This process provides one or more user-specific knowledge graphs
  6. A user-specific knowledge graph being specific to the user, which includes nodes and edges between nodes to define relationships between entities based on the schema
  7. The process includes storing the one or more user-specific knowledge graphs

Optional Features involving providing one or more user-specific knowledge graphs may also include:

  • Determining that a node representing an entity of the one or more entities and an edge representing a relationship associated with the entity are absent from a user-specific knowledge graph
  • Adding the node and the edge to the user-specific knowledge graph
  • The edge connecting the node to another node of the user-specific knowledge graph

Actions further include:

  1. Receiving a query
  2. Receiving one or more user-specific results that are responsive to the query
  3. The one or more user-specific results are provided based on the one or more user-specific knowledge graphs
  4. Providing the one or more user-specific results for display to the user
  5. An edge is associated with a weight
  6. The weight indicating a relevance of a relationship represented by the edge
  7. A value of the weight increases based on reinforcement of the relationship in subsequent user-specific content
  8. A value of the weight decreases based on lack of reinforcement of the relationship in subsequent user-specific content
  9. A number of user-specific knowledge graphs are provided based on the user-specific content
  10. Each user-specific knowledge graph being specific to a respective schema
  11. The user-specific content is provided through use of the one or more computer-implemented services by the user

Advantages of Using the User-Specific Knowledge Graph System

The patent describes the advantages of implementing the process in this patent:

  1. Enables knowledge about individual users to be captured in a structured manner
  2. Enables results to be provided in response to complex queries, e.g., series of queries, regarding a user
  3. The user-specific knowledge graph may provide a single canonical representation of the user based on user activity inferred from one or more computer-implemented services
  4. User activities could be overlapping, where reconciliation of the user-specific knowledge graph ensures a canonical entry is provided for each activity
  5. Joining these together could lead to a universal knowledge graph, e.g., non-user-specific knowledge graph, and user-specific knowledge graphs

(That Universal Knowledge Graph sounds interesting.)

Information from sources like the following may be used to create User-Specific Knowledge Graphs:

  • A user’s social network
  • Social actions or activities
  • Profession
  • A user’s preferences
  • A user’s current location

This is so that content that could be more relevant to the user is used in those knowledge graphs.

We are told also that “a user’s identity may be treated so that no personally identifiable information can be determined for the user,” and that “a user’s geographic location may be generalized so that a particular location of a user cannot be determined.”

The User-specific Knowledge Graph Patent

This patent can be found at:

Structured user graph to support querying and predictions
Inventors: Pranav Khaitan and Shobha Diwakar
Assignee: Google LLC
US Patent: 10,482,139
Granted: November 19, 2019
Filed: November 5, 2013

Abstract

Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for receiving user-specific content, the user-specific content being associated with a user of one or more computer-implemented services, processing the user-specific content using one or more parsers to identify one or more entities and one or more relationships between entities, a parser being specific to a schema, and the one or more entities and the one or more relationships between entities being identified based on the schema, providing one or more user-specific knowledge graphs, a user-specific knowledge graph being specific to the user and including nodes and edges between nodes to define relationships between entities based on the schema, and storing the one or more user-specific knowledge graphs.

What Content is in User-Specific Knowledge Graphs?

The types of services that user-specific knowledge graph information could be pulled from can include:

  • A search service
  • An electronic mail service
  • A chat service
  • A document sharing service
  • A calendar sharing service
  • A photo sharing service
  • A video sharing service
  • Blogging service
  • A micro-blogging service
  • A social networking service
  • A location (location-aware) service
  • A check-in service
  • A ratings and review service

A User-Specific Knowledge Graph System

This patent describes a search system that includes a user-specific knowledge graph system, either directly connected to the search system or connected to it over a network.

The search system may interact with the user-specific knowledge graph system to create a user-specific knowledge graph.

That user-specific knowledge graph system may provide one or more user-specific knowledge graphs, which can be stored in a data store.

Each user-specific knowledge graph is specific to a user of the one or more computer-implemented services, e.g., search services provided by the search system.

The search system may interact with the user-specific knowledge graph system to provide one or more user-specific search results in response to a search query.

Structured User Graphs For Querying and Predictions

A user-specific knowledge graph is created based on content associated with the user.

These user-specific knowledge graphs include a number of nodes and edges between nodes.

A node represents an entity and an edge represents a relationship between entities.

Nodes and/or entities of a user-specific knowledge graph can be provided based on the content associated with a respective user, to which the user-specific knowledge graph is specific.

User-Specific Knowledge Graphs and Schemas

The user-specific knowledge graphs can be created based on one or more schemas (examples follow). A schema describes how data is structured in the user-specific knowledge graph.

A schema defines a structure for information provided in the graph.

A schema structures data based on domains, types, and properties.

A domain includes one or more types that share a namespace.

A namespace is provided as a directory of uniquely named objects, where each object in the namespace has a unique name or identifier.

For example, a type denotes an “is a” relationship about a topic, and is used to hold a collection of properties.

A topic can represent an entity, such as a person, place or thing.

Each of these topics can have one or more types associated with them.

A property can be associated with a topic and defines a “has a” relationship between the topic and a value of the property.

In some examples, the value of the property can include another topic.

A user-specific knowledge graph can be created based on content associated with a respective user.

That content may be processed by one or more parsers to populate the user-specific structured graph.

A parser may be specific to a particular schema.
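
To make the schema/parser relationship concrete, here’s a minimal sketch (my own illustration, not Google’s implementation) in which a parser tied to a tiny schema scans user content for entities and relationships and emits nodes and edges; the property names and patterns are invented for the example.

```python
# A toy schema-specific parser: regex patterns stand in for real NLP parsing.
import re

SCHEMA = {
    # property name -> pattern capturing a pair of related entities (illustrative)
    "/Sport/Played_With": re.compile(r"playing (\w+) with (?:my|our) (\w+)"),
    "/Sport/Played_In": re.compile(r"playing (\w+) .*? in ([A-Z][\w ]+)"),
}

def parse(content):
    """Return (nodes, edges) identified in a piece of user-specific content."""
    nodes, edges = set(), []
    for prop, pattern in SCHEMA.items():
        for match in pattern.finditer(content):
            a, b = match.group(1), match.group(2)
            nodes.update([a, b])
            edges.append((a, prop, b))
    return nodes, edges

nodes, edges = parse("We had a great time playing tennis with our kids in Mountain View")
print(nodes)  # e.g. {'tennis', 'kids', 'Mountain View'}
print(edges)  # [('tennis', '/Sport/Played_With', 'kids'),
              #  ('tennis', '/Sport/Played_In', 'Mountain View')]
```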

Confidence or Weights in Connections

Weights that are assigned between nodes indicate a relative strength in the relationship between nodes.

The weights can be determined based on the content associated with the user, which content underlies provision of the user-specific knowledge graph.

That content can provide a single instance of a relationship between nodes, or multiple instances of a relationship between nodes.

So, a weight can reflect anything between a minimum value (a single instance) and a maximum value (repeated instances).

Weights can also be dynamic, as the sketch after this list illustrates:

  • Varying over time based on content associated with the user
  • Based on content associated with the user at a first time
  • Based on content or a lack of content associated with the user at a second time
  • The content at the first time can indicate a relationship between nodes
  • Weights can decay over time
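
Here’s a minimal sketch of that weighting behaviour, with assumed values for the reinforcement and decay amounts: each edge carries a weight clamped between a minimum and maximum, strengthened when new content repeats a relationship and decayed when it goes unreinforced.

```python
# Toy model of edge weights that reinforce on new evidence and decay over time.
MIN_W, MAX_W = 0.0, 1.0       # assumed bounds
REINFORCE, DECAY = 0.2, 0.05  # assumed step sizes

class UserGraph:
    def __init__(self):
        self.weights = {}  # (node, property, node) -> weight

    def observe(self, edge):
        """User content mentioned this relationship again: strengthen it."""
        self.weights[edge] = min(MAX_W, self.weights.get(edge, 0.0) + REINFORCE)

    def tick(self):
        """One time period with no reinforcement: every edge decays."""
        for edge, w in self.weights.items():
            self.weights[edge] = max(MIN_W, w - DECAY)

graph = UserGraph()
graph.observe(("Family", "/Sport/Played_With", "Tennis"))  # search query
graph.observe(("Family", "/Sport/Played_With", "Tennis"))  # social post repeats it
graph.tick()                                               # a quiet week
print(graph.weights)  # {('Family', '/Sport/Played_With', 'Tennis'): 0.35...}
```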

Multiple User Specific Knowledge Graphs

More than one user-specific knowledge graph can be provided for a particular user.

Each user-specific knowledge graph may be specific to a particular schema.

Generally, a user-specific knowledge graph includes knowledge about a specific user in a structured manner. (It represents a portion of the user’s world through content associated with the user through one or more services.)

Knowledge captured in the user-specific knowledge graph can include things such as:

  • Activities
  • Films
  • Food
  • Social connections, e.g., real-world and/or virtual
  • Education
  • General likes
  • General dislikes

User-Specific Knowledge Graph Versus User-Specific Social Graph

A social graph contains information about the people someone might be connected to, while a user-specific knowledge graph also offers knowledge about those connections, such as shared activities between people who are connected in the graph.

Examples of Queries and User-Specific Knowledge Graphs

User-specific Knowledge graph example

These are examples from the patent. Note that searches, emails, and social network posts may all work together to build a user-specific knowledge graph, as seen in the combined messages/actions below; taken together, they may cause the weights on edges between nodes to become stronger, and nodes and edges to be added to that knowledge graph.

Example search query: [playing tennis with my kids in mountain view] to a search service

Search results: which may provide information about playing tennis with kids in Mountain View, Calif.

Nodes can be provided, with one representing the entity “Tennis,” one representing “Mountain View,” one representing “Family,” and a couple more each representing “Child.”

An edge can be provided that represents a “/Location/Play_In” relationship between the nodes, another edge may represent a “/Sport/Played_With” relationship, and other edges may represent “/Family/Member_Of” relationships between the family node and the child nodes.

Weights may be generated for each of the edges to represent different values as well.

A person may post the example post “We had a great time playing tennis with our kids today!” on a social networking service, associated with geo-location data indicating Mountain View, Calif.

Nodes may be identified representing tennis, Mountain View, family and children, and edges between those nodes.

Weights may be generated between those edges.

Someone may receive an electronic message from a hotel, which says “Confirming your hotel reservation in Waikiki, Hi. from Oct. 15, 2014, through Oct. 20, 2014. We’re looking forward to making your family’s vacation enjoyable!”

Nodes can be added to the user-specific knowledge graph, where those nodes represent the entities “Vacation” and “Waikiki.”

Edges can be created in the user-specific knowledge graph in response to that email: one representing a “/Vacation/Travelled_With” relationship and others representing “/Vacation/CityTown” relationships between the relevant nodes.

Timing nodes may also be associated with the other nodes, such as a timing node representing October 2014, or a node representing a date range of Oct. 15, 2014, through Oct. 20, 2014.

The user can submit the example search query [kids tennis lessons in waikiki] to a search service.

Nodes may be created in the user-specific knowledge graph representing tennis, Waikiki, family, and children, as well as respective edges between at least some of the nodes.

That example search query may reinforce the relevance of the various entities and the relationships between the entities to the particular user.

That reinforcement may cause the respective weights associated with the edges to be increased.

The user can receive an email from a tennis club, which can include “Confirming tennis lessons at The Club of Tennis, Waikiki, Hi.”

Nodes represent tennis and Waikiki, with edges between them.

That email reinforces the relevance of the entities and the relationships between the entities to the particular user.

The weights between the entities could be increased, and a node could be added to represent the entity “The Club of Tennis,” which could then be connected to one or more other nodes.

User-Specific Knowledge Graphs Takeaways

This reminds me of personalized search, but it tells us that Google is looking at more than just our search history – it includes data from sources such as emails that we might send or receive, or posts that we might make to social networks. This knowledge graph may contain information about the social connections we have, but it also contains knowledge about those connections as well. The patent tells us that personally identifiable information (including location information) will be protected, as well.

And it tells us that User-specific knowledge graph information could be joined together to build a universal knowledge graph, which means that Google is building knowledge graphs to answer specific questions and for specific users that could potentially be joined together, to enable them to avoid the limitations of a knowledge graph based upon human-edited sources like Wikipedia.




Free web-based robots.txt parser based on Google’s open source C++ parser

The punchline: I’ve been playing around with a toy project recently and have deployed it as a free web-based tool for checking how Google will parse your robots.txt files, given that their own online tool does not replicate actual Googlebot behaviour. Check it out at realrobotstxt.com.

While preparing for my recent presentation at SearchLove London, I got mildly obsessed by the way that the deeper I dug into how robots.txt files work, the more surprising things I found, and the more places I found where there was conflicting information from different sources. Google’s open source robots.txt parser should have made everything easy by not only complying with their newly-published draft specification, but also by apparently being real production Google code.

Two challenges led me further down the rabbit hole that ultimately led to me building a web-based tool:

  1. It’s a C++ project, so it needs to be compiled, which requires at least some programming / code administration skills; I didn’t feel like it was especially accessible to the wider search community
  2. When I got it compiled and played with it, I discovered that it was missing crucial Google-specific functionality to enable us to see how Google crawlers like the images and video crawlers will interpret robots.txt files

Ways this tool differs from other resources

Apart from the benefit of being a web-based tool rather than requiring compilation to run locally, my realrobotstxt.com tool should be 100% compliant with the draft specification that Google released, as it is entirely powered by their open source tool except for two specific changes that I made to bring it in line with my understanding of how real Google crawlers work:

  1. Googlebot-image, Googlebot-video and Googlebot-news(*) should all fall back on obeying Googlebot directives if there are no rulesets specifically targeting their own individual user agents – we have verified that this is at least how the images bot behaves in the real world
  2. Google has a range of bots (AdsBot-Google, AdsBot-Google-Mobile, and the AdSense bot, Mediapartners-Google) which apparently ignore User-agent: * directives and only obey rulesets specifically targeting their own individual user agents – a sketch of both behaviours follows the note below

[(*) Note: unrelated to the tweaks I’ve made, but relevant because I mentioned Googlebot-news, it is very much not well-known that Googlebot-news is not a crawler and hasn’t been since 2011, apparently. If you didn’t know this, don’t worry – you’re not alone. I only learned it recently, and it’s pretty hard to discern from the documentation which regularly refers to it as a crawler. The only real official reference I can find is the blog post announcing its retirement. I mean, it makes sense to me, because having different crawlers for web and news search opens up dangerous cloaking opportunities, but why then refer to it as a crawler’s user agent throughout the docs? It seems, though I haven’t been able to test this in real life, as though rules directly targeting Googlebot-news function somewhat like a Google News-specific noindex. This is very confusing, because regular Googlebot blocking does not keep URLs out of the web index, but there you go.]
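
To make those two tweaks concrete, here’s a minimal sketch (my own illustration, not the Google parser’s code) of the user-agent group selection logic just described: the image/video/news bots fall back to plain Googlebot rules, while the ads bots ignore the wildcard group entirely.

```python
# Pick which robots.txt group a given Google user agent should obey.
FALLBACK_TO_GOOGLEBOT = {"googlebot-image", "googlebot-video", "googlebot-news"}
IGNORE_WILDCARD = {"adsbot-google", "adsbot-google-mobile", "mediapartners-google"}

def select_ruleset(rulesets, user_agent):
    """`rulesets` maps a lowercased user-agent token (or "*") to its rules."""
    ua = user_agent.lower()
    if ua in rulesets:                    # a group names this bot directly
        return rulesets[ua]
    if ua in FALLBACK_TO_GOOGLEBOT and "googlebot" in rulesets:
        return rulesets["googlebot"]      # fall back to plain Googlebot rules
    if ua in IGNORE_WILDCARD:
        return None                       # ads bots never use the wildcard group
    return rulesets.get("*")              # everyone else uses the wildcard

rules = {"*": ["Disallow: /"], "googlebot": ["Allow: /"]}
print(select_ruleset(rules, "Googlebot-Image"))  # ['Allow: /']
print(select_ruleset(rules, "AdsBot-Google"))    # None -> unrestricted
```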

I expect to see the Search Console robots.txt checker retired soon

We have seen a gradual move to turn off old Search Console features, and I expect that the robots.txt checker will be retired soon. Googlers have recently been referring to it as being out of step with how their actual crawlers work – and we can see differences in our own testing:

Google Search Console robots.txt checker is wrong

These cases seem to be handled correctly by the open source parser – here’s my web-based tool on the exact same scenario:

This felt like all the more reason for me to release my web-based version, as the only official web-based tool we have is out of date and likely going away. Who knows whether Google will release an updated version based on their open source parser – but until they do, my tool might prove useful to some people.

I’d like to see the documentation updated

Unfortunately, while I can make a pull request against the open source code, I can’t do the same with Google documentation. Despite indications from Google that the old Search Console checker isn’t in sync with real Googlebot, and hence shouldn’t be trusted as the authoritative answer on how Google will parse a robots.txt file, references to it remain widespread in the documentation:

In addition, although it’s natural that old blog posts might not be updated with new information, these are still prominently ranking for some related searches:

Who knows. Maybe they’ll update the docs with links to my tool 😉

Let me know if it’s useful to you

Anyway. I hope you find my tool useful – I enjoyed hacking around with a bit of C++ and Python to make it – it’s good to have a “maker” project on the go sometimes when your day job doesn’t involve shipping code. If you spot any weirdness, have questions, or just find it useful, please drop me a note to let me know. You can find me on Twitter.


Black Friday SEO Tools Sale 2019


It’s that time of the year again when products from around the world go on crazy sale, and we consumers love it: Black Friday. Customers flock to retail stores to grab the cheapest deals they can get on smartphones, clothes, and other products. Lucky for us SEOs, we don’t have to leave our homes. Just sit back, relax, open our laptops or computers, and look for huge sales on SEO tools.

If you’re looking for the best deals on SEO tools this Black Friday, then this blog post is for you. Our partners at the SEO Hacker Toolbox will be having a big sale. I highly recommend these tools as they are tried and tested by our team, and I can say that these tools made our operations 100% better. Check out the tools that are in the sale:

  1. Netpeak Software
  2. SE Ranking
  3. Linkody
  4. JetOctopus
  5. Wordlift
  6. Mangools

Netpeak Spider & Checker

Netpeak Software is a company that develops desktop tools for SEO specialists and webmasters. Netpeak has two tools: Netpeak Spider and Netpeak Checker.

Netpeak Spider is a desktop tool for your day-to-day SEO audit, fast issue check, comprehensive analysis, and website scraping. One of the things that I love about Netpeak Spider is that it focuses directly on the issues that need your attention. You don’t have to navigate through a lot as it highlights the pages that are in need of fixing. You can also set your preferred Crawling Settings and filter your crawl results with Netpeak’s 65 parameters.

Need a quick website report? Netpeak Spider can export your data into a PDF file in 2 clicks. The best part here is that you can customize it for your clients without mentioning Netpeak Software.

Netpeak Checker is also a desktop tool; it scrapes SERP data from top SEO services for analysis and comparison. You can research backlink profiles for your link building campaign, compare the quality of websites using parameters from Ahrefs, Moz, Serpstat, Majestic, and SEMrush, monitor your branding by collecting brand mentions, and, most importantly, analyze your competitors’ SEO strategy to keep you on top of the game.

This Black Friday 2019, the Netpeak Software team is offering 40% off the Netpeak Spider Pro plan and Netpeak Checker. This is their biggest sale of the year, and the offer only runs until December 5. Use the promo code SEO-Hacker-BF-19 upon checkout. Visit Netpeak Software here.

SE Ranking

SE Ranking is an all-in-one cloud-based SEO and digital marketing platform for business owners, SEO pros, and digital agencies. This is one of the best rank tracking tools out there, and we regularly use it in our company. SE Ranking offers a complete set of tools that enables small and midsized businesses to run comprehensive on- and off-page website audits, scout competitors, track rankings, monitor backlinks, and generate automated SEO reports.

For SE Ranking’s Black Friday sale, they are offering their lowest price this year with 30% off any subscription plan using the code BLACKFRIDAY2019. By using the code, you could also be one of 3 lucky winners to double your subscription for free!

The offer lasts until December 2 at midnight EST. For more details on SE Ranking’s Black Friday deal, check out this link.

Linkody

Here’s a fact for you. Backlinks are still important.

As link building gets more difficult along with an ever-evolving SEO landscape, you need a reliable backlink tracker tool to make sure your link building campaign is going the way you want it to.

Linkody is a backlink checker tool that monitors your backlinks for you so you don’t have to go back to each website you got a link from. It gives you an easy analysis of your links using different metrics such as Moz’s Page and Domain Authority, Majestic’s Trust and Citation Flow, social shares, and your website’s Alexa ranking.

Linkody crawls the web to check for new backlinks 24/7 and alerts you through email notifications. It will also notify you of your competitors’ new links so you can get a heads-up on your competition’s strategies.

Get 20% off for life on any monthly subscription using the code BLACKFRIDAY2019 at checkout. This deal is exclusive! Linkody does not promote any Black Friday sale, but they were happy to give SEO Hacker readers this discount. Check out Linkody’s plans and pricing here.

JetOctopus


JetOctopus is a cloud-based crawler and the fastest SEO crawler on the market. It can deliver SEO audits in a matter of minutes! JetOctopus can crawl 200 pages per second – your 30,000-page website will be done crawling before you even finish reading this article.

This is a must-have tool for SEO agencies and SEO professionals that need to crawl multiple websites every day. It delivers results fast and gives you a detailed list of your websites’ issues and problems in its problem-centric dashboard. It also allows Google Search Console integration without extra fees.

JetOctopus also has an amazing log analyzer that allows you to check how bots crawl your pages and optimize your crawl budget. It has no log-line limit, and it doesn’t affect your website’s speed.
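
If you’ve never looked at log files this way, the core idea is simple: filter your server access log to Googlebot requests and count hits per URL. Here’s a minimal sketch for a combined-format Apache/Nginx log; the log path is a placeholder, and a real analysis should also verify Googlebot by reverse DNS.

```python
# Count which URLs Googlebot requests most often in an access log.
import re
from collections import Counter

LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[^"]*" \d{3} .* "(?P<agent>[^"]*)"$')

hits = Counter()
with open("access.log") as log:  # placeholder path
    for line in log:
        match = LINE.search(line)
        if match and "Googlebot" in match.group("agent"):
            hits[match.group("path")] += 1

for path, count in hits.most_common(10):
    print(f"{count:6d}  {path}")
```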

JetOctopus will have a special offer for Black Friday 2019, so stay tuned. You can check their website here and get a 7-day free trial.

Wordlift

Wordlift is an SEO plugin that helps search engines better understand the words you use inside your content and the relationships between them. This tool allows you to create more engaging content that will produce the best results for your SEO endeavors.

It uses schema to mark up the important terms inside the body of your content that both search engines and users will want to know more about. For a more in-depth and detailed tutorial on how the tool works and how you can use it, here’s my Wordlift review.

Wordlift is having a Black Friday sale and you can check out their offer here.

Mangools

Mangools is a set of SEO tools for a variety of SEO tasks that is great for both SEO agencies and businesses who want to be on top of the SEO game. 

Inside Mangools are 5 juicy SEO tools that are powerful but easy to use. 

LinkMiner is a backlink explorer tool for your link building campaign that finds links from your competitors which you can reverse engineer and grab for yourself. You can pair it with SiteProfiler, a tool for checking the authority of your website or any other domain.

KWFinder is the only keyword research tool you’ll ever need. It pulls data from Google, and Mangools gives you an analysis of which keywords are easy to rank for, along with other keyword suggestions. You can then use SERPChecker to scout who you are up against, and track your ranking progress using SERPWatcher.

Mangools will be offering a 25% discount on all of their plans. No discount code needed. Just visit the Mangools Black Friday Sale page and get notified about the deal.

 


Introducing the SEO A/B Testing Whitepaper

Just over four years ago, we broke the news that we were working on the early stages of what would become DistilledODN. Back then we wanted to help clients get changes implemented faster by separating marketing / SEO-driven changes from the engineering backlog. Not only that, we wanted to help businesses understand the impact individual changes were having on SEO traffic and the bottom line.

Since then, we’ve run tests on all kinds of sites across different industries and sectors, and processed millions of dollars of online revenue for some of the world’s biggest sites.

One part of our work has been to create a clear understanding of how SEO A/B testing differs from traditional CRO testing. We’ve written a number of blog posts, and the likes of Will Critchlow, Dominic Woodman, Emily Potter and Tom Anthony have been speaking at conferences around the world, explaining how ODN works and telling the stories of our wins and of what are now known as dodged bullets: changes that SEO teams would have just rolled out if they’d been unable to test, but that turned out, under testing, to negatively impact organic traffic.

Over the last four years, ODN has helped SEO teams from around the world.

To continue spreading that word we’ve pulled together our A/B Testing for Search Engine Optimization whitepaper to help you understand the benefits of running these tests and share some of our findings.

Everything you need to know about SEO A/B Testing

Here are just a few things we’ll cover in this whitepaper:

  • Why do we need SEO A/B testing?
  • How does SEO A/B testing help?
  • What exactly is SEO A/B testing?
  • A year of lessons from running SEO A/B tests
  • Case studies
  • How SEO testing can help avoid marginal losses

Come and join us on our journey by grabbing your free copy below.

Download your copy of the SEO A/B testing whitepaper for free.


Dibz Review: Advanced Link Prospecting and Link Building Tool


It’s already a well-accepted fact in the SEO industry that link building is one of the best tactics for improving a website’s organic search rankings. Link building as a tactic involves many strategies and executions that often overwhelm beginners. One such strategy that is vital to link building is prospecting. I’ve been building links to SEO Hacker for quite a while now and finding the perfect prospect for guest posting or simple outreach often still eludes me. But that’s not the case anymore.

Among the hundreds of tools available in the market, I’ve just found one of the best to assist in my link prospecting endeavors. Dibz is a tool created by the Four Dots company to help SEOs and webmasters around the world stop wasting their time looking for the perfect prospect. Here’s how they helped me make the most of (and effectively lessen) the time I spend looking for prospects on the world wide web:

Dibz: One of the Best Link Prospecting Tools

Dibz is so easy to use. Even if you’re a beginner, you’ll know how to navigate the ins and outs of this tool in no time. Here’s how to start:

Log in first and you’ll automatically be taken to the New Search dashboard:

Dibz Dashboard Screenshot

Click on the “Create a New Search” button and you’ll start with the specifics of creating your campaign. Fill in the name of your search, the client (you should first create a client folder), the campaign, and the keywords you wish to find prospects for.

I used 3 keywords and followed their instructions on looking for guest blogging opportunities for the sake of giving a more accurate example.

Dibz Campaign Dashboard Screenshot

After filling out all the campaign details, you need to choose the type of research that Dibz will run. This is one of the best parts of Dibz since they automate one of the link prospecting processes that eat up a lot of our time. Another point is that they have multiple types of research to choose from. To line it up with the keywords I’ve already put in, I’ll choose Guest Blogging. You have to remember that Dibz isn’t just for guest blogging. You have the choice to look for websites that accept blog comments, forums, reviews, donations, giveaways, and many more. This is what it looks like:

Dibz Research Dashboard Screenshot

So, how does Dibz automate the link prospecting process? They do the searching for you. Historically, almost all SEOs I know that do their own link prospecting resort to the manual act of looking for them. But with Dibz, they do it for you – less hassle, less time wasted, greater prospects.

Dibz Operators Screenshot

The last step before Dibz finds the best prospects for you is to specify the scope of your search: which language you want Dibz to search in, the domain extension (.au, .uk, .es, and many more), the search depth, and any additional data you need. Do note that descriptions of these parameters can be found on the right side of the page.

Dibz Parameters Dashboard Screenshot

And there you have it! Dibz will begin displaying potential prospects which you can reach out to for guest blogging (and other opportunities).

Dibz results page screenshot

How Dibz Helps Your Link Building

Links remain one of the top ranking factors, without a doubt. But the thing with link building today is that it takes too much time. You spend half of your time prospecting and the other half doing the content and outreach. And sometimes, I spend so much time filtering the search results in Google that it eats into the time I need for other tasks.

I’ve mentioned how Dibz makes my link prospecting process more efficient and more effective, and I’m going to mention it again because Dibz is THAT GREAT. What took me multiple hours just to find the best prospects now takes me around 10-20 minutes, which enables me to focus more on producing better content and the other tasks that I have to do.

I also love that you can spend a few credits and it will show you important metrics about the prospects without having to go to other tools and check them manually one by one. Dibz is simple, fast, and easy to use, and in link building you’ll need all the automation you can get to run a better and more efficient campaign.

Key Takeaway

If there’s one thing all SEOs and webmasters can agree on, it’s that we run a lot of strategies and processes that take up too much of our time. Tools like Dibz are important for us since they do the work for us, which allows us to allot our time and effort to other strategies and processes. When I find tools like Dibz, I immediately get excited, since the anticipation of doing more and creating more is an exhilarating and, oftentimes, enjoyable feeling. So, I recommend you try out Dibz for yourself and experience how great a tool it is. Do you have any questions? Leave them down below!


SearchLove San Diego 2020: Community Speaker Applications Now Open

Are you looking to secure your first speaking gig? Maybe you’ve done a couple of local meetups and are looking to share your knowledge in front of a bigger audience. Well good news, applications for community speaker sessions for SearchLove San Diego 2020 are now open! Not sure what our community speaker sessions are about? Read on.

If you would love the opportunity to speak in front of 250 people in sunny San Diego then we’d love to hear from you. We are on the lookout for speakers who:

  • Are San Diego locals (no further than 2-2.5 hours drive from the venue). We want to support the community where our conference runs and help speakers raise their local profile.
  • Are looking to get more speaking experience and willing to work with the Distilled team to take their presentation skills to the next level. Over the last year, we’ve had first time speakers, those who have done one or two meetups and those that are looking to push themselves to new levels.
  • Are available to join us at SearchLove San Diego on March 26 & 27, 2020.

When you’ve had a read of everything that follows, swing by our form and apply for a space. You’ve got until December 17, 2019, to apply.

Apply to speak at SearchLove San Diego

What’s in it for us and our audience?

Our events team spends a huge amount of time looking for the best speakers across the industry. We often invite back speakers who wow our audience, but we also want to find speakers that no one has seen before. This could be a great speaker in a related field who is underexposed in the search industry, or it could be a talented individual who is looking for their first speaking opportunity.

We’re looking to build long-term relationships with all of our speakers, and that means inviting community speakers back for full speaking slots. We’ve already seen this in action with Luke Carthy and Andi Jarvis in London, and we have invited Francine Rodriguez to join us in San Diego (she was a community speaker at SearchLove Boston 2019).

Additionally, the shorter 20-minute sessions bring a little less pressure and allow our community speakers to hyper-focus their talks, bringing new perspectives and viewpoints that we might not be able to get from our more experienced speakers. We want to see hands-on advanced advice from practitioners who are doing the work every day.

What’s in it for you?

We’ve already seen some of 2019’s community speakers quickly secure slots on major industry stages including MozCon and Inbound. Even better, they’ve come back and said they wouldn’t have been able to do it without the help and advice from our team. Rather than taking our word for what you will get, we asked our previous community speakers what they took from applying.

Laura Hogan – Milos Mail

“This experience completely changed the way I deliver my talks. No one had ever critiqued my decks to this extent before or given such detailed feedback – and it was invaluable.  Because of the advice from the Distilled team, I’m now speaking at two events I’ve never done before this year!”

James Corr – Seer Interactive

“I knew speaking at SearchLove would be valuable for me, but I really underestimated the amount of time, effort and direction that the Distilled team provided. They were beyond instrumental in making my presentation a success for both myself and the audience. While they let me drive and own the presentation/content their willingness to meet with me, go through dry runs and give me feedback proved to be a huge learning experience for myself that I’ll take with me for the rest of my career.” 

Francine Rodriguez – WordStream

“This experience was like going through speaker bootcamp. You think you know your stuff until you get this team of experts dissecting everything. Now I know it was what I needed! I have picked up on great advice about deck design and presentation delivery that I will continue to apply in future talks. Out of all the advice, I think the one that hit home the most is how you start out a talk. You need to create a moment that draws the audience in and makes them want to listen.”

Here’s the full package you’ll receive if you are successful (along with your 20 minutes on stage!):

  1. Introduction call with the Distilled team
  2. Multiple video calls to run through your presentation with the Distilled team and get feedback
  3. Deck review and content call to bounce around your session ideas
  4. 1 to 1 ongoing support from a Distilled team member
  5. Final in-person review with myself and the Distilled team in San Diego before the conference
  6. VIP ticket to attend SearchLove San Diego including attending the VIP dinner with all the other speakers the night before the conference
  7. A nice bunch of Distilled and SearchLove swag

Apply to speak at SearchLove San Diego

We are extremely excited to be bringing this scheme back to our US events in 2020. We want to give this opportunity to local folks, and so we’ll only be accepting pitches from applicants living within the San Diego area (2-2.5 hours’ drive from the venue) for this particular conference.

Through this process, we are hopeful that we can tap into a more diverse pool of speakers with different backgrounds and experiences. We work hard to ensure our conferences are diverse, with all of our conferences in 2019 represented by a 50/50 split between male and female speakers.

We’re also serious about building a safe, inclusive and welcoming environment for our speakers and delegates and have a code of conduct in place at all of our events.

We aim to continue to lead the way on both of these fronts with our conferences. So, regardless of gender or background, we encourage you to submit an application to join us.

A note on the video requirement

You will note that the application form asks for a video. We are very aware that not everyone has professionally-shot footage of themselves speaking on stage. That’s okay. It’s also impossible for us to see how someone will perform on stage through a form submission. This video is an opportunity for you to show your personality and a little bit of stagecraft. 

To be clear, we are not expecting you to put together a professionally-lit and shot promo video. We want to see your enthusiasm, public speaking capability, and maybe a bit of your depth of knowledge. A selfie video shot on a mobile phone can totally do the job, but think about how you are going to stand out from the crowd. Once you have recorded the video, upload it to a hosting platform such as Google Drive, Wistia, YouTube or Vimeo and share the URL in your application form (make sure you change the viewing permissions so we can see the video!)

To avoid asking you to do something I wouldn’t be prepared to do myself, I’ve recorded a short pitch of my own. You can see that it’s shot on a phone and didn’t use any editing:

A personal note

I’ve seen in my own career how powerful it has been to get better at public speaking and I’ve had tons of opportunities come my way from my talks. Having run successful Community Speaker programs in all of our SearchLove cities now, and seen our speakers go on to all the top conferences in the industry and even return to full SearchLove speaking gigs, I know that we can help more people on this journey.

The biggest things I’ve seen people need to be successful with public speaking are:

  1. Getting started. I was lucky enough to get started at our own (then tiny) events, and this is one of our ways of paying it forward.
  2. Practice and repetition. I used to go to a weekly breakfast networking event that forced a weekly 1-minute presentation and regular 10-minute presentations. Do anything a lot and you’ll find yourself improving.
  3. Someone to give you honest feedback. Our team has done this many times over for each other, and they’re ready to do it for you too.

I would strongly encourage you to think about the actual requirements. Don’t fall prey to imposter syndrome: are there things you are passionate about, where you have deep hands-on knowledge, and where you can teach even an experienced audience new things? If so, don’t sweat your speaking experience – let us be the judge of potential and get your application in.

How to apply

You’ll need to tell us:

  • Why you’d like to speak at SearchLove San Diego
  • Where you are based
  • What your speaking experience looks like so far (don’t worry if you have none or relatively little; in fact, this could be the perfect opportunity for you)
  • What topic you’d like to talk about – the more specific and actionable a topic you can describe, the better

Remember, the closing date for applications is December 17, 2019.

Apply to speak at SearchLove San Diego

SearchLove San Diego 2020: Community Speaker Applications Now Open was originally posted by Video And Blog Marketing

How to Discover Exactly What the Customer Wants to See on their Next Click: 3 Critical Skills Every Marketer Must Master

(This article was originally published in the MarketingExperiments email newsletter.)

If you, the marketer, can discover what visitors want to see when they’re visiting your website and moving toward their next click, you can better serve your customers by making sure there is no disconnect between their expectation and their experience.

There are three concrete skills you can use to achieve this:

  • Prioritize where to fix your focus
  • Identify with the customer
  • Deduce where they want to go by looking back at where they came from

Flint McGlaughlin explains various ways to apply these concepts as he examines and reviews audience-submitted webpages in this YouTube live replay.

Watch the video to get some ideas on how you can revamp your own website and get more clicks.

If you would like to receive more detailed advice from a MECLABS conversion marketing expert via a video conference, visit our Quick Win Consult page to learn more.

Here are some key points in the video:

  • 3:43 The goal of this session
  • 6:36 3 observations we can make from tightrope walker Charles Blondin that can help our marketing efforts.
  • 15:50 Case study: A national bank
  • 17:50 3 critical skills you must use for maximum conversion: prioritization (of attention), identification (with the customer), deduction (from where they came from)
  • 25:04 The importance of having empathy
  • 27:33 Summary of the successful treatment on a banking webpage
  • 28:42 Case study: Windows and door replacement specialist
  • 43:31 Gathering data is extremely important, BUT …
  • 45:16 Live optimization: Health and wellness marketing site

Related Resources

Conversion Lifts in 10 Words or Less

Customer Theory: How to leverage empathy in your marketing (with free tool)

The End of Web Design: Don’t design for the web, design for the mind

How to Discover Exactly What the Customer Wants to See on their Next Click: 3 Critical Skills Every Marketer Must Master was originally posted by Video And Blog Marketing

Content Generation with Google Question Hub: Brief Overview of the Beta Tool

Last updated on

In the world of search, Google is at the forefront of delivering results in the form of fresh content for people seeking answers. With thousands and even millions of search results, you may think that every question in the world can already be matched with a satisfying answer. Enter Google Question Hub.

It’s a beta tool that promises to be useful for information seekers and content creators in the digital sphere. According to its own description, it is “a tool that can enable creators or bloggers to generate richer content by leveraging unanswered questions.”

It seeks to be a place where people can have an exchange around questions that aren’t answered on Google’s search engine pages. With this tool at their disposal, people can dig deeper for information instead of relying solely on the blue links. If you are a webmaster pushing for meaty content, read on to see how Google Question Hub can help you do just that.

Get the answer that you’re looking for added to the web

While we’re on the topic of Google Question Hub, it’s worth discussing a related SERP feature that Google has been testing.

As the search engine giant, Google readily acknowledges that even its algorithms can’t always surface adequate content. Asking people to help fill those gaps is a nod to the power of human intervention in search. This is evidenced by its Quality Raters program and the ongoing test of a box alongside the search results that asks people to manually submit a question.

The box encourages users to ask questions to “Get the answer you’re looking for added to the web”. In May of last year, Manashjyoti Athparia posted in the Gulshan Kumar Forums about his encounter with this test section. I have also seen it in my recent searches, and the question box looks like this:

Get the answer that you’re looking for added to the web

Using this box, users initiate an exchange with anyone who wants to take a shot at answering their query. That said, Google says it collects user questions through a variety of methods, so it’s safe to assume this box isn’t the only source.

Content Generation with Google Question Hub

The tool is not yet available in the Philippines, but it is already getting its fair share of anticipation. Deriving topics from user questions reinforces the role of user search intent: seeing queries in real time shows you exactly what information your target market expects from your niche.

According to Google, the benefits of the tool can be summarized in three steps:

  1. Find the right questions
  2. Create richer content
  3. Track your impact

To get you started, you just need two things: a Google account and access to your Google Search Console.

According to the official Google India blog, “To access Question Hub, publishers need to link their account to verified properties in Search Console. For publishers without a Search Console account, other options are available. Once they’ve created an account, they can explore topics relevant to their work by either searching for keywords or browsing categories. Once a topic is added, they can view unanswered questions asked by real people. Publishers can then use their editorial judgment to review unanswered questions, and expand on them when creating content.”

Additionally, you can track how well your content is performing through the feedback you receive from your audience. By taking a more personalized approach to answering user questions, you stay ahead of your competition in terms of content relevance.

What does this mean for Google Question Hub?

Google question hub

Currently, Google Question Hub is still in its beta-testing stage, with availability limited to residents of India, Indonesia, and Nigeria. Google has stated that it hopes to expand to other countries as the tool matures. On entering the site, you are welcomed by an animation and the words, “You publish better content. The web improves for everyone.”

For bloggers and other content creators alike, it is essential to keep developing insightful and informative topics for their audience. What if you could explore topics with the help of Google? And not just any topics, but those that real users are actively seeking answers to. Having a front-row seat to these queries can do a lot for your content generation, helping you tap into unique ideas that other content creators fail to capitalize on.

User intent is largely about knowing which queries lead people to your brand. With Question Hub, you don’t need to guess what is going on in a user’s head; you know from the very questions they ask Google. Search engine results, in turn, take a more human approach to answering people’s questions.

To be clear, Google says that answering questions on Question Hub does not, by itself, make you rank for those particular queries. However, the content you publish is still eligible to be displayed in Google Search, which is why you should still consider the tool if you want to stay competitive in the search results.

Improvements Google should also consider for the beta tool

Aside from finding questions, creating content from relevant topics, and tracking your impact, it would be better if there were an option to categorize queries by intent (see the sketch below). The per-industry categories are fine, but the hub could get cluttered, especially when some users are only looking for information rather than a conversation about a transaction or the like.
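
To make the idea concrete, here is a minimal sketch of what intent-based categorization could look like. This is purely hypothetical: Question Hub exposes no such feature or API, and the keyword cues below are illustrative assumptions, not anything Google has published.

```python
# Hypothetical sketch: bucketing user-submitted questions by intent
# using naive keyword heuristics. The cue lists are illustrative only.

INFORMATIONAL_CUES = ("what", "why", "how", "who", "when", "where")
TRANSACTIONAL_CUES = ("buy", "price", "cost", "cheap", "discount", "order")

def classify_intent(question: str) -> str:
    """Return a rough intent label for a user-submitted question."""
    words = [w.strip("?,.") for w in question.lower().split()]
    if any(cue in words for cue in TRANSACTIONAL_CUES):
        return "transactional"
    if words and words[0] in INFORMATIONAL_CUES:
        return "informational"
    return "other"

# Example: filter a batch of questions before planning content
questions = [
    "What is schema markup?",
    "Where can I buy a standing desk?",
    "Best CRM for small agencies",
]
for q in questions:
    print(f"{classify_intent(q):>13}  {q}")
```

In a real workflow, you might run each new batch of Question Hub questions through a filter like this before deciding which ones deserve full articles.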

Additionally, the topics you add should come with more options, such as a recommended set of related topics beyond the one you selected. Ideally, the questions would always be relevant, but since this is essentially a user portal, you cannot avoid submissions that fall short of that standard.

That is why there is a reject button for questions. Still, it would be great if there were an archive of rejected questions, because what is irrelevant now may be relevant tomorrow.

These are just some of the things I expect, since I am still on the waitlist for the tool. It should be a great asset for content marketing efforts once it has been fully rolled out. I hope that when it is up and running in more countries, it lives up to the anticipation, because this is something I look forward to. Find out more about it and sign up here.

Content Generation with Google Question Hub: Brief Overview of the Beta Tool was originally posted by Video And Blog Marketing