Semantic Markup Gets A Little More Detailed


Many of you who know me know that I have been waving the semantic flag for years now. So when news like this comes along I am all too excited to share it with everyone (who will listen).

Google recently announced new semantic markup tags for businesses that have varying points of contact. What does that mean? It means businesses will now be able to specify preferred phone numbers for the following departments:

  • Customer Service
  • Technical Support
  • Billing Support
  • Bill Payment

Since contact information is a popular query for most businesses, using these tags will help to surface that information in the SERPs.
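To make this concrete, the new markup builds on schema.org's Organization and ContactPoint types. Here is a hedged sketch of what it might look like as JSON-LD; the URL and phone numbers are placeholders, not real values:

```json
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "url": "http://www.example.com",
  "contactPoint": [
    {
      "@type": "ContactPoint",
      "telephone": "+1-800-555-1212",
      "contactType": "customer service"
    },
    {
      "@type": "ContactPoint",
      "telephone": "+1-800-555-2121",
      "contactType": "technical support"
    }
  ]
}
```

Each department you support gets its own ContactPoint entry, so Google can surface the right number for the right query.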

Additional Recommendations

Along with this update, Google also released additional recommendations on how to craft the location page on your website. Listing business hours and contact numbers is always good, but you can also provide a little extra value. Let visitors know what parking is like in the area, whether there is an ATM nearby, and whether your pharmacy hours differ from your store hours. Google goes on to provide further information here regarding schema for local business location pages.
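As an illustration of what a marked-up location page could contain, here is a rough sketch using schema.org's LocalBusiness type in microdata; every name, address, and hours value below is a placeholder:

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Pharmacy</span>
  <span itemprop="telephone">555-555-5555</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Anytown</span>
  </div>
  <!-- Store hours; pharmacy hours may differ and can be listed separately -->
  <meta itemprop="openingHours" content="Mo-Sa 09:00-21:00">
</div>
```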

Conclusion

If you're not using any rich snippets on your website, then you may be missing out on business; it's that simple. Since all of the major search engines now support semantic markup, this is the new low-hanging fruit as far as SEO is concerned.

Posted in SEO

Google Webmaster Tools – A Comprehensive Guide – Part 7 1/2


Hello and welcome to the final section of my 7 1/2 part series on Google Webmaster Tools. If you’ve made it this far you should feel pretty comfortable with using all the tools available.  In this post I will be discussing the Disavow Tool.

Disavow Links

Before I get into how to use this feature I first want to describe what disavow means. Disavow is defined as denying any responsibility towards something. So, how does this apply to our backlinks? When we disavow a link, it sends a signal to Google that we are not responsible for creating that link and would not like to be associated with it. This could be due to low-quality links that were created pre-Penguin, links that are not related to our content or, in some cases, negative SEO.

Not available through the standard menu in Webmaster Tools, the disavow tool can be found here. Since this feature has the ability to cause a lot of damage if not used properly, I’m guessing Google doesn’t want to make it too easy for everyone to find.

webmaster tools disavow links screenshot 1

Google Webmaster Tools Disavow Links Screenshot 1

When we first click the link above we come across this screen. There is a warning notice from Google about using this tool. We are advised to try removing the links before disavowing them, as disavowing alone will not solve all of our issues. Some ways we can achieve this are manually removing the links ourselves, if we have access to them, or reaching out to other webmasters and asking them to remove the links to our sites.

Depending on what account we’re logged into, we will see a series of URLs in the drop-down menu to the left of the red (should I press this) button.

Let’s select the account we want to clean up and click DISAVOW LINKS.

webmaster tools disavow links screenshot 2

Google Webmaster Tools Disavow Links Screenshot 2

The next screen we see contains another disclaimer from Google urging us to use caution when using this feature. Once we are certain that we have a complete list of URLs and/or domains that we want to disavow, we can click Disavow Links.

webmaster tools disavow links screenshot 3

Google Webmaster Tools Disavow Links Screenshot 3

We see the same cautionary notice on this pop-up that we saw on the previous screen. We are asked to upload a *.txt file containing all of the links we would like to disavow.

Remember, to get a complete list of URLs that are linking to us we can use the tool under Search Traffic, Links to Your Site.

We can export that list and, after deeply researching the URLs, determine which links are safe and which we would like to disavow.

The file we upload needs to be formatted a certain way otherwise it won’t be accepted. Google provides us with an example of how it should look here.

A hash symbol (#) at the start of a line will comment it out. This means that those lines will not be processed as part of the disavow file. Comments are a good way to document your link-removal efforts; you can note each time you contacted a webmaster to ask for a link's removal.
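For illustration, here is a sketch of what a disavow file might look like; the domains and URLs below are hypothetical examples, and the commented lines are ignored by Google:

```text
# Contacted owner of spamdomain1.com on 3/1/2014
# to request link removal; no response.
domain:spamdomain1.com

# Owner of spamdomain2.com removed most links,
# but missed these pages:
http://www.spamdomain2.com/content/page1.html
http://www.spamdomain2.com/content/page2.html
```

The domain: prefix disavows every link from that domain, while a bare URL disavows only that single page.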

Conclusion

If you do need to disavow any links or domains pointing to your website, please be sure that you are careful and fully research the domain before you submit a disavow request.

On a personal note I would like to thank you all for reading this and taking this journey with me through every area of Google Webmaster Tools. I can only hope that I explained everything thoroughly and extensively enough that you all feel comfortable and confident every time you use these available tools. I look forward to any questions and comments you may have regarding any of this subject matter. Good luck!

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 7


Hello and welcome to part 7 of my 7 1/2 part guide to Google Webmaster Tools. In this post I will be covering the Labs section of Google Webmaster Tools.

Labs

There are not many parts to this section but the ones that are here are very useful. The Labs section contains experimental tools that have not yet been pushed live to the other sections of Webmaster Tools. The two areas we'll cover today are Author Stats and Instant Previews.

Author Stats

Those of us who have Authorship set up on our blogs may have noticed some different things about our websites. For starters, a thumbnail image from our Google+ profiles may be showing up next to our website in the SERPs. Authorship associates our identities with those websites and helps to establish us as authorities on that subject matter.

The benefit of implementing authorship is that it leads to a higher CTR, because there is a certain trust factor associated with tying an identity to a website.
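Setting up authorship typically means linking a page to a Google+ profile. Here is a minimal sketch of one common approach; the profile ID in the URL is a placeholder, not a real account:

```html
<head>
  <!-- Link this page to the author's Google+ profile (placeholder ID) -->
  <link rel="author" href="https://plus.google.com/100000000000000000000/posts">
</head>
```

The Google+ profile must also link back to the site in its "Contributor to" section for verification to complete.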

In Webmaster Tools, we are able to see a chart of how frequently our work is showing up in the SERPs. The chart, similar to the one seen under Search Appearance and Search Traffic, gives us a link to the page we wrote, the number of impressions it received, the number of clicks that post has, as well as the CTR percentage and average position.

webmaster tools author stats screenshot 1

Google Webmaster Tools Author Stats Screenshot 1

It’s pretty cool to see all of your work in one central location like this.

Notice there are two links at the top of the image above. "You" is hyperlinked and points to your Google+ page. "Learn more about verifying authorship" takes us to a page where Google describes what authorship is and how to achieve it.

Instant Previews

This area in Webmaster Tools lets us see a preview of how our page would look in the preview window in the SERPs. Although Google is no longer showing previews in the SERPs, we can still see how it would look here.

webmaster tools instant preview screenshot 2

Google Webmaster Tools Instant Preview Screenshot 2

Conclusion

Although the Labs section of Google Webmaster Tools contains experimental tools that are not completely out of beta, it’s definitely worth checking out because we can only learn more about our websites by exploring.

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 6


Hello and welcome to part 6 of my 7 1/2 part guide to Google Webmaster Tools. In this post I will be covering the sections Site Messages, Security Issues and Other Resources.

Site Messages

Here we are able to see any and all communication directly from Google regarding our websites. Don't get too excited though; it's not an open dialog. Rather, it's more along the lines of notifications. For example, if we were to connect our Webmaster Tools accounts to Google Analytics, we would see notifications (like the ones below).

webmaster tools site messages screenshot 1

Google Webmaster Tools Site Messages Screenshot 1

Notice there are three options on the top menu when we first get to this screen. All allows us to see all messages regarding our website, Starred shows us messages that we manually star (which is like bookmarking them), and finally we have Alerts. Under Alerts are the important messages, such as whether we have received a manual action from Google or some sort of penalty. Clicking on a message subject will provide us with more detail and, in some cases, a link.

The two messages I have displayed are letting me know that I have chosen a preferred version of my domain, joerega.com sans the www subdomain, and that I have linked my Webmaster Tools profile to my Google Analytics account. This allows for more information in Analytics, such as keyword data, landing pages and geographical data. One of the messages we may receive regarding our websites is about Security Issues.

Security Issues

This section in Google Webmaster Tools shows us any warnings we may receive regarding an attack or hacked site. If you have ever seen a website in the SERPs display a message underneath a URL that says this site may harm your computer or this site may be hacked, chances are the webmasters would head over to this section to see what's wrong. Google goes into greater detail here. Fortunately, there are no issues with my site and I hope that all of you reading this see the following when you log in to this section:

webmaster tools security issues screenshot 1

Google Webmaster Tools Security Issues Screenshot 1

Other Resources

Here we have access to some more powerful tools in Webmaster Tools.

webmaster tools other resources screenshot 1

Google Webmaster Tools Other Resources Screenshot 1

First up is the Structured Data Testing Tool. This will let us test any rich snippets we have on our website by entering either a URL or the source code directly.

webmaster tools structured data testing tool screenshot 1

Google Webmaster Tools Structured Data Testing Tool Screenshot 1

webmaster tools structured data testing tool screenshot 2

Google Webmaster Tools Structured Data Testing Tool Screenshot 2

If any schema are used to mark up your website, this is the tool to use to test it. Since structured data is not yet widely implemented, there is also a tool to help us make sure our configuration is correct. Enter the Structured Data Markup Helper. This area allows us to select the type of data we are trying to mark up and enter the URL or HTML. This works for both websites and email.

By selecting the type of data then entering the information below, Google will know what data to be looking for and therefore tell us if our markup is correct.

webmaster tools structured data markup helper screenshot 1

Google Webmaster Tools Structured Data Markup Helper Screenshot 1

The email screen has a different set of data types to choose from. Rich snippets in email allow companies to send dynamic, enhanced emails, which may include flight departure and duration times or a countdown to a specific event. Here are the current options at the time of this writing:

webmaster tools structured data markup helper screenshot 2

Google Webmaster Tools Structured Data Markup Helper Screenshot 2

For structured data in email, the only way to verify that the coding is correct is by pasting in the HTML.
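To picture what one of these dynamic emails contains, here is a rough sketch of flight markup embedded in an email as JSON-LD, using schema.org's FlightReservation type; all names, codes and times below are placeholder values:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "FlightReservation",
  "reservationNumber": "ABC123",
  "underName": { "@type": "Person", "name": "Jane Doe" },
  "reservationFor": {
    "@type": "Flight",
    "flightNumber": "110",
    "airline": { "@type": "Airline", "name": "Example Air", "iataCode": "EA" },
    "departureAirport": { "@type": "Airport", "iataCode": "JFK" },
    "departureTime": "2014-03-06T20:15:00-05:00",
    "arrivalAirport": { "@type": "Airport", "iataCode": "SFO" },
    "arrivalTime": "2014-03-06T23:45:00-08:00"
  }
}
</script>
```

This script block sits inside the email's HTML body, where supporting mail clients can read it and render the flight details inline.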

The next link down is the Email Markup Tester. This is just another place to test the rich snippets in your email, similar to the one above, just without the data type selection.

The following link is for Google Places. This is not a part of Webmaster Tools but an external link to the Google Places website. Since it's not a part of Webmaster Tools, maybe I'll cover it in a future blog post.

After that we see Google Merchant Center. This is another external link, to the Merchant Center website. Merchant Center is used to upload product data to Google Shopping and other Google services where available.

The second-to-last link is PageSpeed Insights. Another external link, this time to Google Developers, PageSpeed Insights allows us to see how fast our websites are and what areas, if any, we can adjust to make them faster. It's also a great way to gauge the user experience. There are two options to measure speed: one for mobile and one for desktop.

The final link in this section is Custom Search. Google Custom Search allows us to add a search engine to our website. This will display search results that favor our webpages first and can be configured a number of different ways. Google continues to improve upon this feature as it is now available with different schema.org types.

Conclusion

We touched upon some more features of Google Webmaster Tools today, from the different types of site messages we can receive to the security issues that warn us when our sites may be hacked. We also covered the Other Resources section, which contains some advanced tools regarding rich snippets, Google Places, Google Merchant Center, PageSpeed Insights and creating our own custom search engine.

In my next post I will cover the Labs section of Google Webmaster Tools.

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 5


Welcome to part 5 of my 7 1/2 part guide to Google Webmaster Tools. In this post I will be covering everything under the Crawl section.

Crawl

This area of Google Webmaster Tools contains a lot of advanced features used to control the search visibility of your website and provides you with information on how often your site is crawled. The first area in this section is called Crawl Errors.

Crawl Errors

I’m not going to spend too much time on this section since I covered  it in Part 1, so I will just do a quick review. This section shows us a list of errors that Google has found when crawling our websites within the last 90 days. We are able to download that list which includes server codes, URLs and the date detected, providing you with more information to diagnose and solve those issues. Recently added features include showing errors from Smartphones and Googlebot-mobile crawl errors. From here you can also see data from DNS, Server connectivity and Robots.txt.

Crawl Stats

This area under Crawl shows all Googlebot activity, on our website, for the past 90 days. There are three graphs we are presented with when we first land on this page: Pages crawled per day, Kilobytes downloaded per day and Time spent downloading a page (in milliseconds).

Webmaster Tools Crawl Stats

Google Webmaster Tools Crawl Stats Screenshot

Rolling over a graph will provide us with the date it was crawled and information pertaining to that day: how many pages were crawled, how many kilobytes were downloaded or how much time was spent downloading a page. To the right of these charts are information highlights such as High, Average and Low. This makes it a lot easier if we need to diagnose any issues with site load time. If there are any issues with our site, Webmaster Tools allows you to Fetch as Google to see how Googlebot sees your page.

Fetch as Google

The third link down under Crawl is Fetch as Google. Here we have the ability to submit pages to Google’s index as well as see how Google’s spider sees the page.

Webmaster Tools Fetch as Google

Google Webmaster Tools Fetch as Google Screenshot

From here we have a few options to choose from when we want to see how Google sees our website. The first step is to enter a URL path into the text box. For example, if your website is http://www.example.com, you only need to enter the page you are interested in seeing; this is what comes after .com/. Let's say we are interested in /sample.html; we would enter sample.html in the space provided and click FETCH. After we click fetch we are prompted with a pending message, then Your request was completed successfully. Here is what it looks like:

webmaster tools fetch as Google screenshot 2

Google Webmaster Tools Fetch as Google Screenshot 2

Since I don’t currently have the page /sample.html on my website, this tool is going to return an error message of Not found. From doing SEO we know that not found is a 404 error.

webmaster tools fetch as Google screenshot 3

Google Webmaster Tools Fetch as Google Screenshot 3

Messages in the Fetch Status column are hyperlinks, clicking them will take us to a page that outlines the status in more detail. Here is a small sample of what you would see on a not found error page:

Fetch as Google

This is how Googlebot fetched the page.
URL: http://joerega.com/sample.html
Date: Thursday, March 6, 2014 at 4:23:54 AM PST
Googlebot Type: Web
Download Time (in milliseconds): 2016
HTTP/1.1 404 Not Found
Date: Thu, 06 Mar 2014 12:24:20 GMT

Notice the 404 Not Found message above. Since this page does not exist, there is no HTML source code to download. It's safe to assume that this page never existed, or was removed, and should be properly redirected.

Jumping back to the main Fetch as Google page, there are a few other areas to discuss. First, you may have noticed the number under Fetches remaining; this number is 500, meaning you are allowed to use this feature for 500 URLs per week. After entering a URL (that exists) and clicking Fetch as Google, the message that appears under Fetch status will say Success, and a button will appear next to it that says Submit to index. This allows us to submit a page to Google's index faster than waiting for their spiders to crawl our site. If we select the option to submit a URL to Google's index, we are prompted with two options: Submit just this URL and Submit this URL and all linked URLs.

webmaster tools fetch as Google screenshot 5

Google Webmaster Tools Fetch as Google Screenshot 5

Notice that the Fetch status on the new URL now says Success and has Submit to index next to it. Clicking Submit to index will cause a pop-up to appear giving us two options. Here we can choose to submit just this one URL, or this URL plus all linked pages. The latter should only be used if there were major updates to your website, which is why that option is limited to 10 per week.

webmaster tools fetch as Google screenshot 4

Google Webmaster Tools Fetch as Google Screenshot 4

Once submitted, the Submit to index button will say URL submitted to index. We can then explore the other fetch options that are available. Entering another page or directory into the fetch bar, we see that there are five options in total: Web, Mobile XHTML/WML, Mobile cHTML, Mobile Smartphone – New and Mobile Smartphone – Deprecated. We already saw how the first option worked by entering the URL above and submitting it to the index. The other options are used to see how your page will perform on various mobile devices. In an effort to save time and not get too far off course, I will just mention that they are used to test the different types of markup used for mobile development. Now, to do a complete 180, let's discuss another area in Webmaster Tools where we can tell if Google and other search engines are ignoring URLs or directories on your site.

Blocked URLs

Blocked URLs in Webmaster Tools allows us to see if our robots.txt is working properly and correctly blocking content we don't want indexed. Be careful how you configure your robots.txt because it's very easy to overlook something and block an entire section of your website, or your entire website, from Google (yes, I've seen this happen). Here is what a basic robots.txt looks like:

User-agent: *
Disallow:

The asterisk (*) after User-agent: says that all search engines are welcome to crawl the site. Disallow: on the next line is currently saying that we are not disallowing search engines from crawling anything. If we were to include a forward slash after Disallow:, it would block our entire website from all search engines.

User-agent: *
Disallow: /

This is what a robots.txt looks like when blocking an entire site.

In this section we are able to test our robots.txt before it’s uploaded to check for any errors. Here is a screenshot of how it looks:

webmaster tools blocked urls screenshot 1

Google Webmaster Tools Blocked URLs Screenshot 1

Currently we can see that no search engines are blocked and my website is open for indexing. Further down this page we see an area where we can test blocking pages before they go live in our robots.txt. Here I used the page /test.html (which doesn't really exist on my site, by the way) to show the results.

webmaster tools blocked urls screenshot 2

Google Webmaster Tools Blocked URLs Screenshot 2

The page /test.html would be successfully blocked if I were to add this to my robots.txt. Similar to the Fetch as Google section mentioned prior to this, we are able to test our robots.txt against other Google user-agents such as Googlebot-Mobile, Googlebot-Image, Mediapartners-Google (used to determine AdSense content) and Adsbot-Google (used to determine AdWords landing page quality).

Some people choose to add their XML sitemap to the robots.txt as well; while it's not required, it's also not harmful to do so. Picture the robots.txt as a roadmap to your website. Search engines will stop here first to determine what pages can be crawled and indexed. They already know to look for an XML sitemap, but adding the line Sitemap: http://www.example.com/sitemap.xml may bring peace of mind to the webmaster. Sitemaps themselves can become very complicated if there is a mistake, which is why I'm glad that Webmaster Tools provides a section for that as well.
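Putting those pieces together, a robots.txt that blocks one directory and also advertises the sitemap might read as follows; the /private/ path and example.com URL are placeholders:

```text
User-agent: *
Disallow: /private/

Sitemap: http://www.example.com/sitemap.xml
```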

Sitemaps

Sitemaps are an important part of a website. While HTML sitemaps are geared towards humans, XML sitemaps cater towards machines; more specifically, search engines. Google Webmaster Tools allows us to upload multiple sitemaps and will alert us to any errors they may contain or develop. We are able to see, in this section as well as in Google Index, how many pages are indexed.

Without spending too much time on sitemaps themselves, I will only cover the basics here today and how we upload them to Webmaster Tools.

The second to last option in the left-nav under Crawl is Sitemaps. Clicking on it will bring us to the following page:

webmaster tools sitemaps screenshot 1

Google Webmaster Tools Sitemaps Screenshot 1

This is easy to visualize since I currently don't have many pages on my website. The first step in entering a sitemap is to click the red button in the upper-left corner that says ADD/TEST SITEMAP. As soon as that button is clicked we see the following window appear:

webmaster tools sitemaps screenshot 3

Google Webmaster Tools Sitemaps Screenshot 3

An XML sitemap generally lives as a file in the root of the domain. What does that mean? It means it can be found here: http://www.example.com/sitemap.xml. Some websites will have site-map.xml or sitemap.xml.gz; these formats work just the same. After you enter the file name in the space provided, click Submit Sitemap. This will add your sitemap to Webmaster Tools and provide you with the image you see above: how many pages are indexed versus how many have been submitted.
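For reference, here is a minimal sketch of a sitemap.xml following the sitemaps.org protocol; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-03-06</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Only the loc element is required per URL; lastmod and changefreq are optional hints to crawlers.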

If your website is new and recently launched, you may only see one column for Submitted. That’s perfectly normal as Google has not indexed any of your pages yet.

Before submitting your sitemap you may want to test it to see if there are any errors. In lieu of clicking Submit Sitemap, click Test Sitemap and you will be provided feedback on the health of your XML sitemap.

Once your Sitemap is submitted and the URLs are indexed, you can click on the second tab that says Web Pages.

webmaster tools sitemaps screenshot 2

Google Webmaster Tools Sitemaps Screenshot 2

This page will let you know if there are any issues with your Sitemaps. I say Sitemaps because you can submit more than one in this section. Many webmasters will upload separate Sitemaps for different pieces of content: one for images, one for videos, etc.

URL Parameters

This is another technical section of Google Webmaster Tools and should be used very carefully. Changing the parameters of our URLs may severely affect how our sites are crawled and indexed.

URL Parameters work like this: let's say we have an eCommerce website that sells shoes. One of our URLs may look like this: http://www.example.com/mens?category=sneakers&nike. There may also be another version of the URL that looks like this: http://www.example.com/mens-nike-sneakers.html. To avoid any duplicate content issues with our websites, we can use this section to show Google the main URL we would like indexed. Again, I am going to stress the importance of being extremely careful in this section. Google goes into greater detail about this and you can read more here.

Conclusion

We covered a lot of material in this post today, from diagnosing crawl errors to learning how Google sees our website. The Crawl section of Webmaster Tools is very useful when we need to dig deeper into our site's health.

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 4


Welcome to part 4 of my 7 1/2 part series on Google Webmaster Tools. In this post I am going to discuss the next section of Google Webmaster Tools, Google Index.

Google Index

This section of Google Webmaster Tools lets us see

  • How many pages of our website are indexed
  • The most commonly used keywords on our website and how often they're used
  • How to remove a URL or set of URLs from Google's index

Let’s take a closer look at each section and how it can be used.

Index Status

The first link under Google Index is Index Status. On this page you are able to see how many pages of your website are indexed. Right away we are presented with two options: Basic and Advanced. The Basic chart shows a linear graph of how many pages were indexed over the past year. Rolling over the line will present you with a pop-up that has a date and how many pages were indexed on that day.

Index Status Chart

Google Webmaster Tools Index Status Chart

Clicking the Advanced tab gives us more options to review our historical index status. The extra options are: Total Indexed, Ever Crawled, Blocked by robots and Removed. Let's look a little closer at this:

Advanced Index Status Screenshot

Google Webmaster Tools Advanced Index Status Screenshot

Here we see the extra options above the graph. Checking them off and hitting update provides us with four lines on the chart. The red line represents all pages that were ever crawled. Remember, just because a page is crawled does not mean that it will be indexed. Google may crawl development pages, noindex,nofollow pages, pages blocked by robots and so forth, but those pages will not show up in a search.

The next line is the blue line which represents total indexed. This line is what the basic chart shows when we first click on Index Status. The third line is purple and represents URLs that have been removed. These are URLs that are no longer indexed and may have been manually removed by yourself (which is the case here). We will cover that more later in this post. A fourth line which is not represented here is URLs blocked by the robots.txt file. Since I’m not blocking any pages on mine we don’t see that data available but this just reinforces what I mentioned earlier about Google being able to crawl, but not index all URLs.
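The noindex,nofollow pages mentioned above are typically marked with a meta robots tag; here is a minimal sketch of what that looks like:

```html
<head>
  <!-- Tell search engines not to index this page or follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>
```

Pages carrying this tag can still be crawled, which is exactly why they show up in the Ever Crawled line but not in Total Indexed.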

Content Keywords

Content Keywords allows us to see how often a keyword is used throughout our website. The list we are shown below contains keywords, in descending order, based on how often they're used on our site.

webmaster tools content keywords screenshot 1

Google Webmaster Tools Content Keywords

The blue bar indicates how often that keyword is used; hovering over it will give us the percentage. Clicking on a word will bring us to a new page that shows us specifics such as variants, significance and occurrences, as well as a list of the URLs that contain that particular word.

webmaster tools content keywords screenshot 2

Google Webmaster Tools Content Keywords

On this screen we are provided a list of URLs that each contain an instance of this particular keyword. Hovering over a URL will show us a preview of that page in an on-page pop-up; clicking the URL will bring us to that page in a new tab. If there is a URL that you no longer wish to use as part of your website, you have the option to remove it via the Remove URLs section.

Remove URLs

Please use this section with extreme caution, as you can potentially harm your website by selecting the wrong option or deleting the wrong URL.

webmaster tools remove urls screenshot 1

Google Webmaster Tools Remove URLs

The first thing we see when navigating to this page is a message from Google. The message states that we should use our robots.txt to instruct them on how to crawl our website. This means if there are any directories or URLs we do not wish to be crawled or indexed, we should disallow them. If that was already done and a URL or directory is still indexing, or if you want to speed up the process, you can use this tool. Please make sure you read Google’s requirements for removing content before you proceed.

webmaster tools remove urls screenshot 2

Google Webmaster Tools Remove URLs

By clicking Create a new removal request we get a pop-up asking us to enter the URL that we would like to remove. Please note that URLs entered here are case-sensitive; this means that www.example.com/Sample.html and www.example.com/sample.html are viewed as two separate URLs. When we enter a URL it can be just the file name, /sample.html, or a directory, /sample/. Clicking Continue brings us to the following screen:

webmaster tools remove urls screenshot 3

Google Webmaster Tools Remove URLs screenshot 3

On this final page we get to choose from three different options. The first option is Remove page from search results and cache. Selecting this option will remove a URL from Google's search results, meaning it will no longer show up when a search is performed for a particular keyword on that page. It will also remove cached versions of the page.

The second option is  Remove from cache only. This option is used to remove an older version or versions of the page. You would use this if you have made major changes to a page or layout and no longer want the older version to be seen.

The third option is Remove directory. This option is used if you need to remove an entire section of your website. This can be useful if your site was hacked, you deleted a lot of content, have redirected content to another section or are just no longer using this section of your website. BE CAREFUL when selecting this option since you do not want to accidentally delete an entire section of your website from Google.

The Google Index section is a very powerful part of Webmaster Tools and can be very beneficial to your marketing strategy. In my next post I will be covering the Crawl section and all of the tools it has to offer.

Conclusion

The Google Index section in Webmaster Tools provides a lot of information as to how your website is indexed, as well as how often specific keywords are used within your site. This section also provides you with ways to remove content pages, versions of pages and directories from your website.

 

 

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 3


In this post I am going to discuss the next section of Google Webmaster Tools, Search Traffic.

Search Traffic

This area of Google Webmaster Tools shows us how traffic is coming to our sites: Search Queries, the different Links to our sites, as well as Internal Links and Manual Actions.

Search Queries

The first link under Search Traffic is Search Queries. This section is broken down into two separate categories, Top Queries and Top Pages. Under Top Queries we are shown a graph of impressions vs. clicks, as well as a chart listing the keywords that are driving traffic to our sites.

The screenshot below shows the impressions vs. clicks graph for my website.

Google Webmaster Tools Search Queries Chart

Impressions vs. Clicks chart in Google Webmaster Tools
The first thing you may notice is the small amount of search traffic my website is getting. At the time of this writing, I had only recently launched this new version of my website, and since this is my personal website, not a client’s, I’m not concerned with traffic.

Notice that the blue line indicates impressions, how often your website is appearing in the SERPs, and the red line indicates clicks for those terms.

If we scroll further down this page we are shown a chart of what those keywords are, the number of impressions we’re receiving for them, the number of times they’re clicked, the CTR percentage and the average position (where a website is appearing for that query).

Webmaster Tools Search Query Data

In-depth data for search queries related to your site.
You are able to download this information in two separate formats: the table itself or the chart data. Both are available as either a CSV or a Google Spreadsheet. This data wasn’t always available to this extent; Google only recently started showing the number of clicks that a particular query is receiving.

Clicking With Change will show you how much your website has improved, or declined, within a time frame you select.

The next tab under Search Queries, next to Top Queries, is Top Pages. Here you’ll see a chart similar to the one above but, in lieu of search terms, you’ll be shown your top-performing URLs.

Another recently added feature to this section is really neat. It’s a filter that allows you to parse data by Search (All, Image, Mobile, Video and Web), by Location (view traffic from a specific country that has visited your site recently) and by Traffic (All Queries, or Queries with 10 or more impressions/clicks).

Search Queries Filter Options

Google Webmaster Tools Search Queries Filter Options
The next section down in the left-nav under Search Traffic is Links to Your Site.

Links To Your Site

This section lists all websites that link back to your site and also shows your most linked content (pages). There is also a section, How your data is linked, that provides the phrases used to link to your website.

Links to your site

Google Webmaster Tools Links to Your Site Section
Clicking the more>> button on either section will bring you to an expanded table that provides a list of websites that are linking to you. Here you will be able to download a table of all URLs that link to you, how many links each source points at your site and how many of your pages are linked from that source. Let me break it down a bit further.

In the screenshot below, I clicked the more link on Who links the most and was brought to this page:

Who Links The Most

Google Webmaster Tools Who Links The Most
Here we are provided with a list of URLs that are linking back to our website. The results are hyperlinked, and if we click a domain name we are brought to the following page:

Who Links The Most

Google Webmaster Tools Who Links The Most – Domain Level
This page in Webmaster Tools lists individual pages and posts on your website that are linked from that specific domain as well as how many times each page/post is linked. All of the charts within these pages are available for download as either a CSV or Google Docs format. If you check this page frequently there is also an option to download the latest links so you are able to see any new links pointing to you.

Keeping on the topic of links, the next section under Search Traffic is Internal Links.

Internal Links

This section in Google Webmaster Tools lets us see the internal linking structure of our website. This is a great way to check if important pages are linked properly and how many times each page is linked.

Internal Links

Google Webmaster Tools Internal Links

Under Target Pages is a list of URLs on your website. If we roll over them we will see a pop-up that shows a preview of that particular page. If we click on the link we are brought to another page that lists every URL (internally) that is linking to that page. Both pages, Target Pages and Links, are available for download as either a CSV file or a Google Doc.
Manual Actions

This is a very important section in Webmaster Tools. Here you will (hopefully never) see a notification from Google that there is an issue with your website that has forced them to take action. For example, there may be a malicious link, that you may or may not be aware of, pointing to your website. If this is deemed spam then your website may incur a penalty. You will also be notified in the Site Messages section if this happens.

If your website does receive one of these messages, Google may provide an example of some of the malicious content or links as well as links to ways to fix the issue.

Conclusion

The Search Traffic section is a great area to find out more about your inbound traffic, whether that’s search queries or inbound and internal links. This post also covered receiving a manual action notice from Google.

In my next post I will be covering the following section, Google Index.

Go back to Part 2

Proceed to Part 4

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 2


Welcome to part 2 of my 7 1/2 part series on Google Webmaster Tools. In my last post I covered the Site Dashboard in Google Webmaster Tools as well as how to obtain a list of all errors on your site via the crawl errors tab. In this post I will move down to the next section in the navigation, Search Appearance.

Technically the next link down is Site Messages but since I’ll be covering that more in-depth in a later post we’re talking about Search Appearance today.

Search Appearance

The third link down in the left-hand navigation of Google Webmaster Tools is also the first drop-down. Here you’ll find links for Structured Data, Data Highlighter, HTML Improvements and Sitelinks.

You can access these items by clicking the Search Appearance text.

Google Webmaster Tools Search Appearance Menu Screenshot

This is a screenshot of the Search Appearance menu in Google Webmaster Tools
Clicking on the “i” to the right of Search Appearance will provide you with the following pop-up:

Google Webmaster Tools Site Appearance Overview Screenshot

This is a screenshot taken from Google Webmaster Tools that shows how structured data affects your website’s display in the SERPs.
This image shows how rich snippets, or structured data, change how a website is displayed in the SERPs (search engine result pages). You can see that one website is displaying a review, one is displaying authorship and another is showing sitelinks. Clicking each of the topics on the left will bring up ways to achieve each of those fields. This brings us to our next topic, Structured Data.

Structured Data

Without getting too technical or wordy, the best way to describe Structured Data is that it helps search engines show relationships between objects. What does that mean? It means that by using as much detail as possible to describe the products or services or whatever is listed on your website, search engines will be able to better serve up your information to related queries.
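As a concrete illustration, here is roughly what schema.org Product markup looks like using microdata. This is a hypothetical sketch; the product name, price and currency below are placeholders, not taken from any real site:

```html
<!-- Hypothetical example: schema.org Product markup via microdata.
     The product name and price are placeholders. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="19.99">$19.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```

With markup like this in place, a search engine can understand that “Acme Widget” is a product and “$19.99” is its price, rather than just strings of text on a page.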

In the graph below you’ll see that Google can detect how much structured data I currently have on my website. Not that much, I know.

Webmaster Tools Structured Data Graph Screenshot

This is a graph in Webmaster Tools depicting how much structured data is on a website.

There are currently two data types marked up on my website: hatom and breadcrumbs. Clicking on either one will bring you to a new chart that lists each page containing the markup. The first one, hatom, is a microformat. Microformats are typically used in publishing for content such as news articles and blogs. This makes sense since my website uses WordPress as a CMS.
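For reference, hAtom markup is applied through class names on your existing HTML. A minimal sketch (the post title, date and author below are placeholders) of the kind of markup many WordPress themes output automatically might look like this:

```html
<!-- Sketch of hAtom microformat classes. All values are placeholders. -->
<div class="hentry">
  <h2 class="entry-title">My Blog Post Title</h2>
  <abbr class="published" title="2014-01-15T09:00:00Z">January 15, 2014</abbr>
  <span class="author vcard"><span class="fn">Author Name</span></span>
  <div class="entry-content">The body of the post goes here.</div>
</div>
```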

hatom structured data chart

Structured Data chart in Google Webmaster Tools showing hatom markup

Clicking on hatom brings us to the page shown above. Here we see a list of URLs that are marked up with microformats. This chart gives us insight into any errors that may be occurring (I currently have 3): the date each was detected, the title of the page and what the errors are.

Above the graph we see four options: All Data followed by three different Error Types. Clicking on each of those will provide you with a different chart and set of pages, depending on how many, if any, errors are on your site.

If we head back to the main Structured Data page, we see that the second item listed under Data Type is Breadcrumb. Breadcrumbs are a miniature navigation, used on publishing and retail sites, that shows users where they are in a website. Breadcrumbs are particularly useful for websites that have a lot of categories and require more than three clicks to drill down to a product. You will see a linear layout of the path you took to get to the page you’re on. The screenshot below shows a list of several pages that have breadcrumbs on them and what their titles are.

webmaster tools structured data graph breadcrumb screenshot

Google Webmaster Tools Structured Data Graph Breadcrumb Screenshot

This page lists the URLs that are marked up on your website. Here you will be able to see how many items per page are marked up, whether there are any errors with the markup, the date the markup was last detected and the title of the breadcrumb. If there are any errors with the breadcrumb markup, you can click the tab above (which currently says 0 Errors) and be provided with a list of URLs and an explanation of the errors.
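If you want to add breadcrumb markup yourself, one common pattern uses data-vocabulary.org microdata; the sketch below uses placeholder URLs and titles:

```html
<!-- Sketch of breadcrumb markup using data-vocabulary.org microdata.
     URLs and titles are placeholders. -->
<span itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/pets/" itemprop="url">
    <span itemprop="title">Pets</span>
  </a>
</span> ›
<span itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/pets/dogs/" itemprop="url">
    <span itemprop="title">Dogs</span>
  </a>
</span>
```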

The next item under Search Appearance is the Data Highlighter.

Data Highlighter

The Data Highlighter is a great feature that was introduced to Webmaster Tools in December 2012. This useful tool allows you to highlight specific data on your website so you don’t have to change any code. Initially created just for events, you are now able to highlight different content such as: articles, book reviews, events, local businesses, movies, products, restaurants, software applications and TV episodes.

Selecting different information to highlight will yield different options on the following screen. For example, choosing article will provide you with options related to publishing whereas choosing product will provide you with price and other descriptions.

The screenshots below show the options you have when using the Data Highlighter:

Data Highlighter Menu Options

Google Webmaster Tools Data Highlighter Menu Options
Data Highlighter Pages To Highlight

Google Webmaster Tools Data Highlighter Pages To Highlight
Once you decide which option you want to use, the next screen is relatively easy to work with; you just highlight a section on whatever page you choose, then select the options on the right.

Here is a screenshot of an article on my website. I entered the URL, selected Articles from the drop-down, then chose Tag just this page for this example. If I had chosen Tag this page and others like it, I would have been provided with a URL set. This would allow me to completely tag one page and automatically tag other pages that are similar. I would still need to review and approve them, but at least the work would be done.

Data Highlighter Tagging Options

Data Highlighter options when tagging content on your website
Notice that once I highlight the title of a blog entry, I’m presented with a new menu. Here I would select Title, as what I have highlighted is the title of this particular article. The other options are Author, where I would highlight my name, Date Published, Image, Category and Average Rating.

After all chosen content is properly tagged the next step is to review everything then click publish.

Highlighting content this way is a great way to have it presented in a more attractive fashion in the SERPs and potentially in Google’s Knowledge Graph.

HTML Improvements

This is one of my favorite sections in Webmaster Tools because it gives you a bird’s-eye view of the meta data used on your website. If there are any discrepancies with your meta data, one of the options will be highlighted to provide you with deeper insight into where the issue is.

Webmaster Tools HTML Improvements

Webmaster Tools HTML Improvements Screenshot
This is the main screen that appears when you click on HTML Improvements. You’ll notice that Short meta descriptions is highlighted. Clicking that link will bring us to the next screen that shows details on the pages containing errors; in this case it’s short meta descriptions.

Webmaster Tools HTML Improvements Short Meta Descriptions

Webmaster Tools HTML Improvements Short Meta Descriptions
Fortunately, there are only two pages that require attention (and one of those pages no longer exists).

If any of these sections contained more URLs with problems, we would be able to download a CSV file or Google Doc version of this list. That is extremely useful when working with and tracking hundreds of URLs that need to be updated.

Other issues reported in this section involve title tags and non-indexable content. Errors with title tags could be that they’re too long, too short or duplicated (which I’ve found to be most common). Non-indexable content consists of files, such as images and video, that cannot be indexed for whatever reason.

Sitelinks

This section of Webmaster Tools is reserved for demoting, or removing, sitelinks from a search result. Sitelinks generally only show up when a user conducts a branded search for a name or website. Unfortunately there is no way to control which URLs appear as sitelinks; however, you are able to demote the ones you do not wish to advertise. Please exercise caution when using this tool, as you do not want to demote deep links that may be responsible for driving a lot of traffic.

Webmaster Tools Sitelinks Screenshot

Google Webmaster Tools Sitelinks Screenshot
In the space provided next to Demote this sitelink URL you can enter the URL that you no longer wish to appear in Google’s search results. Just to reiterate, you are not able to control which URLs appear in the SERPs, only demote the ones you don’t want to appear.

Conclusion

We covered a lot today! There are a lot of advanced features in the Search Appearance section of Webmaster Tools, so make sure you are comfortable with the changes you wish to make before you make them.

In my next post I will be covering the following section, Search Traffic.

Go back to Part 1

Proceed to Part 3

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 1


Hello and welcome to my Google Webmaster Tools 7 1/2 part tutorial. Here I will walk you through each and every area of Google Webmaster Tools and how you can utilize them to better understand what is “technically” going on with your website. Google released Webmaster Tools in 2006 in order to help webmasters “create more search engine-friendly sites.”

Adding A New Website

Entering a URL

The website, https://www.google.com/webmasters/, allows you to submit your website to Google’s index, then see a technical overview. To do this, you must first add a site to Webmaster Tools.  When you click the red button in the upper-right hand side of the screen, you will be prompted to enter a domain name. Once you enter a domain name you will then need to verify that you own, or manage, that domain.

webmaster-tools-add-a-site

Add A Site icon found in upper right-hand side of Google Webmaster Tools.
webmaster-tools-add-a-website-screenshot

Enter a URL in this pop-up box and click continue to add a site to Google Webmaster Tools.
Verifying Your Website

Depending on where your domain is registered and hosted you may be asked to verify ownership of a domain by changing the DNS. Below is an example of something you may see (taken directly from Google Webmaster Tools):

  1. Log in to your account for test.com at www.example.com by clicking the Manage Account icon.

  2. In the left navigation bar, open the nsWebAddress (Domains) menu by clicking the + icon.

  3. Click Manage Domain Names.

  4. On the Domain Details page for the domain you’re using, select the Designated DNS radio button (to the right of Change domain to point to) and click the Apply Changes button. If you’ve previously modified your advanced DNS settings, click Edit (to the right of Domain currently points to).

  5. Under the Advanced DNS Manager heading, click Manage Advanced DNS Records.

  6. Under the Text (TXT Records) heading, click Add/Edit.

  7. In the Host field, enter @.

  8. Leave the TTL field set to the default value.

  9. In the Text field, copy and paste the following unique security token: a security code will be provided here

  10. Click Continue.

  11. Review your changes and click Save Changes.

  12. When you’ve done saving the TXT record, click the Verify button below on this page.

If you are not comfortable verifying your website this way, there are several other alternate methods for you to verify your site.

  1. Upload an HTML file. Google will provide you with a file to download that you can then upload to your website via your CMS or FTP. This file will be appended to your domain name like this: http://www.example.com/googlexxxxxxxxxxxxxxx.html. If you do not wish to upload a file to your website or are not comfortable doing so, you may choose to add a meta tag to your home page.
  2. Add an HTML tag. Google will provide you with the appropriate meta tag to add to the home page of your website. This will be placed in the <head> section of your website, along with the other meta data. It will look something like this: <meta name="google-site-verification" content="xxx-XXXxxxXxxXXXXxxxXxxxxXXXxxxxXXxx-xxXxXXxx" />. If you do not wish to touch the code of your website, you can verify it using Google Analytics, provided your website already has this installed.
  3. Google Analytics. Select this option if your website is using the Google Analytics asynchronous tracking code. This is the JavaScript version of the Analytics tracking code that was used prior to Universal Analytics. The tracking code MUST be in the header section of your website, just before the closing </head> tag. This is my preferred verification method and also the quickest.
  4. Google Tag Manager. This option will allow you to verify your website if you are using Google’s Tag Manager. To do this you must be using the container snippet. Make sure you have permission to “manage” the tag prior to using this method.
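To make the HTML tag method concrete, here is roughly where the verification tag sits. The content token below is a placeholder; Google generates the real one for you:

```html
<!-- Sketch of method 2: the verification meta tag belongs in the <head>
     of your home page. The content value is a placeholder token. -->
<head>
  <meta name="google-site-verification" content="your-unique-token-here" />
  <title>Example Home Page</title>
</head>
```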

After you complete the verification process you will be brought to the Site Dashboard.

Site Dashboard

The site dashboard in Google Webmaster Tools is a 10,000-foot view of what’s going on with your website. Here you’ll be able to see the current status of your website, including crawl errors, search queries and sitemaps.

Crawl Errors

The Crawl Errors section in Webmaster Tools will show you if there are any issues with your website, such as 404 errors, 500 errors and so on. Here you will be shown a complete list of errors found as well as a link back to each specific URL. Below is a screenshot of what I saw when I logged into my site dashboard:

webmaster-tools-crawl-errors-screenshot-1

This is a screenshot taken from Google Webmaster Tools showing crawl errors on my website.
The green check marks underneath each category indicate that there are no issues with my DNS, Server or Robots.txt. When I click on Crawl Errors under Current Status I am presented with the following screen:

webmaster-tools-url-errors-screenshot-2

Screenshot taken from my Google Webmaster Tools account showing a chart of some crawl errors.
This chart allows me to see a 90-day history of any errors that occurred on my site. You can see that on 12/10/2013 I corrected some issues, but more quickly arose. This is a common occurrence on most websites and should not be cause for alarm. Most errors can be dealt with easily, and I will go more in-depth on this topic in a later entry. Underneath the chart is a list of all the errors and their server response codes.

webmaster-tools-crawl-errors-screenshot-3

Google Webmaster Tools screenshot of crawl errors and server response codes.
As you can see from this list, I have several 404 errors. This means there are pages that cannot be found by Google, which could indicate a couple of different things: there was an older version of the page that I deleted and never redirected, or a page was moved and not properly redirected. From here we are able to select one or multiple URLs and mark them as fixed. You are also given the option to download a complete version of this list as a spreadsheet or Google Doc, along with the server response codes and dates detected. Regardless of the issue, once we click on an item in the list under URL, we see a pop-up screen that shows us a bit more detail about that specific issue.

webmaster-tools-crawl-errors-screenshot-4
You can see that Webmaster Tools tells us the date this issue was first detected as well as the last time Google crawled the URL. Underneath the dates is a brief description of what a 404 error is. If we click around we can see if this URL is in our XML sitemap and what other pages on our website link to this erroneous URL. Here we can click on the link provided to see what the page currently looks like. If we know for a fact that this page is no longer an issue, we can mark it as fixed. But since we’re SEOs and want to know more about why the issue occurred, we would click the Fetch as Google button. This allows us to see the page as Google sees it, with the server response code and all of the source code. We will discuss that in a later post as well.

Conclusion

We covered a lot of ground today on how to set up your website with Google Webmaster Tools. Stay tuned for Part 2, where I’ll be discussing Search Appearance and all of the areas that are part of that section.

Posted in SEO, SEO - Google Webmaster Tools

On-Page SEO Checklist


The old methods of doing SEO no longer apply. Many tactics are outdated and should be revised.

Trying to diagnose what is wrong with a particular webpage can be incredibly difficult. There are a number of issues that could be causing a drop in rankings and traffic. Below is a checklist of what you should review to determine if there is an issue with your on-page SEO.

1) Review Your Title Tags

The title tag is one of the most important areas of your page. It serves as the attention grabber in the SERPs and is used by search engines to determine the relevance of that page. If you just list a bunch of keywords that you’re hoping to drive traffic for, then you’re not doing it right. This is valuable ad space and should be treated as such. Put yourself in the position of the person doing the search: would you click on what’s there?

There are various schools of thought as to the proper way to use a title tag; here are just a few ways you can organize it:

  • Many still favor the pipe ” | ” method of separating first and secondary keywords such as this:
    • Keyword 1 | Keyword 2 | Company Name
  • You can use a hyphen ” – ” to separate keywords like this:
    • Keyword 1 – Keyword 2 – Company Name
  • The third method is a mixture of the above techniques and uses both hyphen and pipe
    • Keyword 1 – Keyword 2 | Company Name

A fourth approach that we’re starting to see more of is borrowing from the AdWords mindset and looks a lot like ad copy:

  • Keyword 1 is a great way to relax | Company Name
  • Browse Hundreds of Keyword 1 Online | Company Name

I have seen many different combinations of title tags work over the years; the main takeaway should be that you avoid keyword stuffing and keep your main keyword as close to the left (the beginning) as possible.
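Putting the hyphen/pipe pattern into actual markup, a title tag would look like this; the keywords and company name below are placeholders:

```html
<!-- Sketch of a title tag using the hyphen + pipe pattern above.
     Keywords and company name are placeholders. -->
<head>
  <title>Termite Control - Rodent Removal | Acme Exterminators</title>
</head>
```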

2) Check The Header Tags

The H1 Tag

After you have chosen a suitable title tag for your page, the next step is choosing an H1 tag that further highlights what the page is about. Second in importance only to the title tag, the H1 tag lets readers (and search engines) know what they’re about to read. You don’t necessarily need to copy the title tag verbatim, but you should include your focus keyword here.

  •  Keyword 1 Is A Great Way To Relax When You’re On Vacation
  • Hundreds of Keyword 1 Are Available Online, Shop Now

There are different mindsets as to the optimal length of an H1 tag; I have seen shorter ones rank well and I have seen full sentences rank well. I personally try to keep them under 80 characters and include the keyword as close to the left (beginning) as possible. Also, there should only be one H1 tag per page.

The H2 Tag

If you have a particular page with a lot of content on it, you’ll want to break that up into sections to make the information easier to digest. The H1 tag lets someone know what the entire page is about, while the H2 tags highlight each section of that page. Take this page that you’re reading, for example; the H1 tag is On-Page SEO Checklist and there are several H2 tags underneath it, each one calling attention to a particular area you should look at when reviewing your website. There are even some H3 tags, which serve to further organize the data on this page.

Here is an example of how you can use these different tags:

  • <h1>Your Keyword 1 Title Will Go Here</h1>
    • <h2>Your Supporting Keyword or First Section Will Go Here</h2>
    • <h2>Further Supporting Keyword Sections May Go Here</h2>

The main idea with header tags is that you want to use them to separate your content into sections so it’s easier to read and reference.
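Applied to a real page, that outline translates into markup like this; the headings are illustrative, based on this article’s own structure:

```html
<!-- Sketch of a heading hierarchy: one H1, H2s per section,
     H3s for sub-sections. Text is illustrative. -->
<h1>On-Page SEO Checklist</h1>
<h2>Review Your Title Tags</h2>
<p>Section content…</p>
<h2>Check The Header Tags</h2>
<h3>The H1 Tag</h3>
<p>Sub-section content…</p>
```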

3) Take A Closer Look At Your Images

Pictures are a great way both to attract visitors and to visually describe what a particular page is talking about. Your image selection may include multiple variations, including screenshots, cartoons and photographs. Whatever you decide to adorn your page with, just make sure you take the following steps to fully optimize them.

  • Check to see that the image’s file name describes what the image actually is. For example, you may choose to add a picture of construction workers building a house as a clever play on words for manual link building. The image should be titled construction-workers-building-a-house.jpg, or whatever file extension you decide to use.
  • Add meta data to the image
    • The title and description elements of an image help to further discern what the image is about and how it’s related to your pages content.
  • Add a caption to the image
    • This is a brief one to two sentences that appear under the image and is another chance to include your keyword.
  • Hyperlink
    • Sometimes it may be easier to link an image to a particular page rather than using anchor text. The visual cue may yield a higher CTR (click through rate) than words alone.
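The image checklist above can be sketched as markup. The file name comes from the example earlier in this section, while the link target, alt text and caption are placeholders:

```html
<!-- Sketch of a fully optimized image: descriptive file name,
     title/alt meta data, caption and hyperlink. Values are placeholders. -->
<figure>
  <a href="/link-building-services/">
    <img src="construction-workers-building-a-house.jpg"
         alt="Construction workers building a house"
         title="Manual link building">
  </a>
  <figcaption>Like building a house, good links are placed by hand.</figcaption>
</figure>
```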

4) What Does Your Internal Linking Structure Look Like?

Links are a great way to navigate within a website and the internet itself. The general rule of thumb for websites is that you should be able to get to any page on a site within three clicks, any more than that and you risk losing the interest of your visitor. Here is a breakdown of how I try to arrange links on a website:

Navigation

Whether you’re using top or side bar navigation, you should make sure that it’s consistent and appears on every page. The main goal is to make it easy for a visitor to navigate your website from any page that they’re on.

On-Page Links

If you write a page of content that references another page of yours, you can link to that page within the text using keyword-rich anchor text. Be careful not to abuse this, as anything that looks spammy will not rank well. The only time it’s acceptable to link out to multiple pages is when you have a category landing page.

Category Landing Page

These pages serve as an overview for particular categories which include a brief description and links to other pages that fall in that category. For example, an exterminator may have a services page that gives an overview of what they do and the different services they provide and link to those individual service pages; Termite Control, Rodent Removal, etc.

5) What Information Should Go In The Footer?

Think of the footer as another chance to help your visitor navigate your site. The footer of a website usually contains navigational links, contact information or NAP (name, address, phone number) and copyright info. Some websites also include links to a disclaimer and an HTML sitemap. Do not use this area to stuff keywords or hide text, as that is a shady practice and you will be penalized for it.

Navigational Links

Here you can put navigation links (sans drop-downs) to category level pages so visitors at the bottom of your website don’t have to scroll up to get to another page on your site.

NAP

Listing your Name, Address and Phone Number is essential when you own a business. Providing methods of contact such as a phone number and an email address gives users more ways to reach you. Listing your address shows Google that you have a physical brick-and-mortar location and will help you in terms of local search.
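A footer NAP block can be sketched like this. The hCard microformat class names are one optional way to mark it up, and all of the business details below are placeholders:

```html
<!-- Sketch of footer NAP details with optional hCard microformat
     classes. All business details are placeholders. -->
<footer>
  <p class="vcard">
    <span class="fn org">Acme Exterminators</span><br>
    <span class="adr">
      <span class="street-address">123 Main Street</span>,
      <span class="locality">Springfield</span>,
      <span class="region">NY</span>
    </span><br>
    <span class="tel">(555) 555-0199</span>
  </p>
</footer>
```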

Summary

With so many different types of websites and businesses online today it can be difficult to find exactly what it is you’re looking for. When you decide to build a website, make sure you follow these steps for clarity and consistency:

  • Make sure that the keyword you choose is close to the left in your title tag and H1 tag. Search engines tend to put more importance on terms that appear at the beginning.
  • Images should have proper file names that describe what the picture is about
  • Internal links should be used consistently for navigation and cautiously within page content
  • A good footer will most likely reflect the header in terms of contact information.

There you have it, five steps to review when considering the health of your on-page SEO. What are some things you look for when reviewing a website?

Posted in SEO