249 ways to increase traffic to your blog


Right off the bat, I’d like to apologize for the obvious link-bait title; however, I promise that by the end of the post I will have made it up to you.

What I’ve done here is compile a list of techniques that can be implemented to drive more traffic to your blog. There are plenty of popular “Top Ways to Increase Blog Traffic” posts, and numbered lists of every other kind, available online.

The numbered posts are very attractive because they promise us something and immediately set our expectations. But I have to be honest: although they may have a higher click-through rate than other article titles, I’m tired of seeing them.

Why I Rarely Read Them

The market is flooded right now with numbered post titles such as “Top 10 Things…,” “3 Best Ways to…” and “6 Great Reasons You Should….” Some bring up very good points, examples and tips while others are just link bait designed to get you to visit a site. In addition, a lot of them mention the same techniques throughout each post; I mean, how many different ways could there be to bring in new readers?

How I did My Research

Google’s search operators are very popular among SEOs and are often used to fine-tune the results we’re looking for. These codes, added to search queries, reduce the number of results returned, making it easier to find something specific. The following is the list of search operators and queries I used to find the data for this blog post:

  • increase traffic to my blog
  • “increase blog traffic|traffic to your blog”
  • * ways to “increase blog traffic|traffic to your blog”
  • * ways to “increase blog traffic” with
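To make the operators concrete, here is a small sketch of how those queries are built: the `|` inside quotes matches either phrase, and `*` stands in for any word. (The Python below just assembles the query strings; nothing about it is specific to Google.)

```python
# Build Google query strings using the | (OR) and * (wildcard) operators.
base_phrases = ["increase blog traffic", "traffic to your blog"]

# The | inside quotes tells Google to match either exact phrase.
or_group = '"' + "|".join(base_phrases) + '"'

queries = [
    "increase traffic to my blog",  # plain query, no operators
    or_group,                       # exact-match either phrase
    "* ways to " + or_group,        # * matches any word: "5 ways to", "50 ways to", ...
]
print(queries[2])
```

Running the last line prints the third query from the list above, ready to paste into the search box.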

Using only the results found on the first page of Google, I found the following 19 titles:

  • 15 Tips to Increase Blog Traffic
  • 21 tactics to increase blog traffic
  • 5 Ways to Increase Blog Traffic
  • 50 Tips To Increase Traffic To Your Blog
  • 5 Creative Tips to Increase Blog Traffic and Boost Your Business
  • 10 Tips to Increase Blog Traffic with Facebook
  • 6 Tips To Increase Blog Traffic
  • 6 Ways To Increase Blog Traffic From Facebook
  • 50 Ways to Increase Blog Traffic…
  • 10 Tips to Increase Blog Traffic with LinkedIn
  • 8 Ways to Increase Blog Traffic
  • 5 Tips to Increase Traffic to Your Blog With Pinterest
  • 5 Creative Ways to Drive More Traffic to Your Blog Posts
  • 10 Tips to Increase Blog Traffic with Google+
  • 10 powerful ways to increase blog traffic to your blog
  • 10 Tips to Increase Blog Traffic with YouTube
  • Increase Blog Traffic! 8 Great Ways That Really Work
  • 10 Tips to Use SlideShare to Increase Blog Traffic
  • 5 tips to help increase traffic to your blog

Once I had the URLs and titles from each of the above, I compiled a list of the total number of ways to increase traffic to a blog, which is how I got the number 249. After I read each post and finished my list, the next step was to fine-tune the results and remove any duplicate or similar techniques.

I removed suggestions such as submit press releases and create a posting schedule, as press releases should be reserved for product launches and big announcements, and while posting schedules will help with organization, they will not directly impact traffic to your blog. Some other suggestions I removed were non-specific ones such as post your link throughout the internet.

While Google Analytics is very important for measuring our blog’s success, it too will not directly affect traffic to our blogs. But seriously, add Google Analytics, people! It is important to see why and how one blog outperforms another, so that is another reason to set up tracking.

What was left was the following list, so without further ado, I present 125 ways to increase traffic to your blog.

 

You can stop scrolling now

As promised, here is the complete list of 125 ways to increase traffic to your blog.

  1. Add Graphics, Photos and Illustrations (with link-back licensing)
  2. Add Value to a Popular Conversation
  3. Add your blog URL to your email signature
  4. Add Your Blog Website to Your Pin’s Descriptions
  5. Aggregate the Best of Your Niche
  6. Allow People to Print, Share, Embed, and Download Your Presentations
  7. Allow Sharing and Embedding of Your Videos
  8. Also include guest posts on your blog
  9. Always follow those who re-tweet your content.
  10. Always use relevant alt tags and titles for your images.
  11. Answer emails!
  12. Ask and Answer Questions
  13. Attend and Host Events
  14. Attend (blogging) conferences and any “gatherings” related to your niche. Throw business cards around with wild abandon. Network like crazy.
  15. Be Active
  16. Be Human
  17. Be interviewed
  18. Begin commenting on other blogs
    • Be careful with this one, don’t overdo it. Too much of a good thing…
  19. Break news
    • Create a post about something that just happened
  20. Check Google Trends
  21. Comment on other people’s Facebook pages to drive traffic back to your own.
  22. Complete Your Profile Strategically
  23. Connect with People
  24. Connect Your Web Profiles
  25. Consider Guest Blogging
  26. Consider Including Annotations
    • This will keep your content relevant and fresh
  27. Contact the press directly
  28. Create “link love” posts and let each blogger know that you are linking to them.
  29. Create a “Frequently Asked Questions” page
  30. Create a 6-Second Preview of Your Post with Vine
  31. Create a Facebook fan page
  32. Create a SlideShare Profile Overview of Your Post
  33. Create a YouTube channel.
  34. Create an infographic.
  35. Create an interactive experience
  36. Create and post to your RSS feed frequently
  37. Create and Upload Videos that are Relevant to Your Blog Audience
  38. Create articles on controversial topics
  39. Create both a “For Experts” and a “For Dummies” section
  40. Create Circles (Google+)
  41. Create pages on Squidoo and Hubpages
  42. Create Playlists (Podcasts)
  43. Create the Best Content you Possibly Can
  44. Create Your Own Groups
  45. Create your own local periodical meet up for bloggers or for your blog’s niche.
  46. Don’t Be Shy – Talk to everyone about your blog
  47. Embed Your Presentations on Your Blog
  48. Enable Subscriptions via Feed + Email (and track them!)
  49. Ensure All of Your Pages are Being Indexed
  50. Find Friends – Ask them to share your content
  51. Find out who’s linking to you
  52. Focus on One Specific Topic
  53. Focus on problems – offer a solution
  54. Frequently Reference Your Own Posts and Those of Others
  55. Get on Blog Catalog.
  56. Get on the RSS feeds of the top 3 blogs
  57. Get Rich Pins for Your Blog
  58. Give and Request Recommendations
  59. Give Your Readers a Reason to Visit Your Blog
  60. Guest Blog (and Accept the Guest Posts of Others)
  61. Hashtag Optimization
  62. Hold a competition or contest. The prize doesn’t have to be huge!
  63. Host Hangouts
  64. Host webinars with other bloggers – tap into their audience.
  65. Host your images on flickr
  66. Include a link to your blog in any and all online profiles you complete.
  67. Include Keywords in Your Video Descriptions
  68. Include Presentation Transcriptions
  69. Include Your Blog URL at the Beginning and End of Each Video
  70. Incorporate Great Design Into Your Site
  71. Instagram an Image From Your Post
  72. Install the related posts plugin.
  73. Integrate and Cross-Promote
  74. Interact on Other Blogs’ Comments
  75. Interact with peers on forums – Join & Post
  76. Interlink your pages.
    • Link to older posts as well as static HTML pages
  77. Interview other bloggers and tap into their audience.
  78. Invite guest posters.
  79. Jump on the Google+ bandwagon and create your own page.
  80. Keep Consistent With Your Posts
  81. Keep in touch with bloggers you know – nurture relationships.
  82. Keep Your Blog Current to Increase Blog Traffic
  83. Keyword Research
  84. Leave trackbacks on other blogs.
  85. Leverage the power of “versus”
  86. Link Your Uploaded Pins Back to Your Website
  87. Make It Interesting
  88. Make sharing simple – include highly visible social media buttons on your posts.
  89. Make Sure Your Presentations Include Your Blog URL
  90. Make your blog beautiful.
  91. Make your blog easy to navigate.
  92. Make Your Videos Public and Allow Comments
  93. Mention other Facebook pages on your wall and you will show up in their feeds.
  94. Never stop trying new things.
  95. Nominate Yourself and Other Blogs for Blog Awards
  96. Offer Free E-books
  97. Offer yourself up for interview on other bloggers’ podcasts.
  98. Participate in Q+A Sites
  99. Participate in Social Sharing Communities Like Reddit + StumbleUpon
  100. Participate in the Communities Where Your Audience Already Gathers
  101. Pin Your Post to a Pinterest Group Board
    • Provide amazing content that influencers want to repin
  102. Print some business cards. Hand them to everyone and anyone.
  103. Produce a video course.
  104. Promote Outside Your Blog
  105. Promote Your Facebook Link
  106. Provide Full Presentation Descriptions with Keywords
  107. Reach out to bloggers you like – build relationships.
  108. Referring To Your Posts And That Of Others Frequently
  109. Release a free product and ask your readers to “pay with a tweet”.
  110. Remember Search Engine Optimization
  111. Repurpose Your Presentations Whenever Possible
  112. Respond to your blog comments!
  113. Reveal pricing
  114. Search out and follow likeminded people on Twitter (search by keyword, or by hashtags, etc).
  115. Sell Things via Your Blog
  116. Share Blog Posts On Your Fan Page
  117. Share Your Best Blog Content Links
  118. Share Your LinkedIn Profile Link
  119. Showcase your top articles
  120. Start a newsletter
  121. Start a periodical newsletter to bring visitors back to your blog on a regular basis.
  122. Start producing a podcast and tap into the huge iTunes market.
  123. Submit Your Blog to Search Engines
  124. Submit Your Posts to Social Bookmarking Sites
  125. Submit your site to major directories

There you have it, 125 different techniques we should all start using to increase traffic and readership to our blogs.

My Two Cents

Using all of these methods may not be an option for some but there are plenty of new ideas here to try. I for one am excited to try several of these starting today! I sincerely hope that whatever methods you decide to use, they are beneficial to you and bring you success!

Please let me know if this was helpful to you and which new techniques you plan on using.

Posted in SEO

SEO Title Tag Length and Excel


Recently, Google changed the way title tags are displayed in the SERPs, causing most tags to be truncated. Prior to this change, the rule of thumb for title tags was 65-70 characters, including any spaces and special characters. With the update, title tags are now truncated at a pixel width rather than a character length.

What Does This Mean For Me?

Should I go through my website and change all of my title tags? No. Your website may be ranking well in its current format, so don’t go getting nervous just yet. If, however, your titles are obscenely long and not performing as well as expected, then I would advise switching some things up a bit. Web Shop Optimizer has created a great tool that measures the width of your title tags and compares it to other versions you have entered, as pictured below:

Screenshot of web shop optimizer title tag tool

Moz also offers a great tool (accessible from the first link in this post) that will provide a mock preview of the SERP result when you enter a title tag and target keyword.

Screenshot of the title tag tool offered by Moz

How Can I Test This on a Large Scale?

The tools listed above are great but limit us to one, or at most a few, title tags at a time. If your website has more than a dozen or so URLs, those options may not be the best use of your time.

A quicker way to test your title tags

The other day I found myself working on a website with a few thousand URLs and couldn’t fathom checking the page title width for each page, one at a time. So what I did was find ONE title tag that was 512 pixels wide and pasted it into Excel. The column width came out to 60.75.

Here is a screenshot of what it looks like:

Column width dialog box in Microsoft Excel

The easiest way to test the current width of all page titles on our site is to run the website through Screaming Frog. This will provide us with a list of page titles, H1 tags, Meta Descriptions and so on that we can then export. Once in Excel, right-click on the column that contains the title tags and choose Column Width.

The newly formatted column allows us to see if our title tags are too long and which ones can be adjusted accordingly.
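For those comfortable with a little scripting, the same check can be approximated programmatically. This is a rough sketch, not exact SERP rendering: the per-character pixel widths below are approximations I’ve assumed for a ~16px sans-serif font, and the “Title 1” column name assumes a Screaming Frog CSV export.

```python
import csv

# Very rough per-character pixel widths (assumed for a ~16px sans-serif
# font; real SERP rendering will differ).
CHAR_WIDTHS = {"i": 4, "l": 4, "j": 4, "f": 5, "t": 5, "r": 6, " ": 5,
               "m": 14, "w": 12, "M": 14, "W": 15}
DEFAULT_WIDTH = 9  # fallback for any character not in the table


def estimate_pixels(title):
    """Estimate the rendered pixel width of a title tag."""
    return sum(CHAR_WIDTHS.get(ch, DEFAULT_WIDTH) for ch in title)


def flag_long_titles(csv_path, limit=512):
    """Yield (title, estimated width) for titles over the pixel limit.

    Assumes a crawl export with a 'Title 1' column, as Screaming Frog
    produces.
    """
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row.get("Title 1", "")
            width = estimate_pixels(title)
            if width > limit:
                yield title, width
```

Like the Excel trick, this only flags candidates; you would still want to eyeball anything borderline in a SERP preview tool.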

My Two Cents

This is a great way to present a technical audit to our clients when they ask about best practices regarding title tags. I’d love to hear any success stories from anyone who has tried this method and presented it to their clients.

Posted in SEO

The Demise of Foursquare?


An article on Search Engine Watch the other day spoke of major reform coming to the location-based check-in app Foursquare. The company will cease to exist in its current form as it looks to split into two services. Foursquare is aiming to restructure itself as a local discovery service, which it will continue to explore under its current name, and will launch a new service called Swarm for checking in.

So…what does that mean? It means more personalization and deeper local search functionality. According to their blog, the company compares the current state of local search to the Yellow Pages and they are looking to add “expert reviews,” not reviews from strangers.

Personalization seems to be the big focus for this change and may also be their undoing. Personalization does have its benefits but can be limiting when trying to discover anything new. If you’re constantly shown things based on “your tastes” and by “people you trust,” that limits the discovery of anything new outside of your happy little world.

Will the company survive?

Since Foursquare is so closely identified with its check-in service, it’s hard to say how people will react to a drastic change like this. Introducing a new app just for checking in, when every other social networking service already has that feature, is going to be rough. They’re essentially going to have to reinvent the wheel if they plan on keeping engagement up.

My Two Cents

Initially, I think the new Foursquare will do great, as they already have the public’s interest; however, I don’t think they’ll experience the longevity they’re looking for. People will swarm to it out of curiosity, but I don’t think they’ll continue to use the service unless there is a huge benefit to doing so. I really like Foursquare and hope that I’m wrong, but I’m predicting a dismal road ahead.

 

Posted in SEO

Is Google Targeting HARO Now?

Broken NYC Pedestrian Traffic Light

Is Google sending us mixed signals about backlinks?

2014 has seen a lot of changes with how Google views backlinks. The search giant has taken action on several link networks in Germany, Spain, Italy, Greece and France this year and is showing no signs of slowing down.

An article posted yesterday by +Bill Hartzer mentions how Google is now targeting HARO and press releases as bad links. For those of you who don’t know, HARO, which stands for Help A Reporter Out, is a resource for bloggers, reporters and other content-centric individuals to promote their brands, services, products, etc. Until today, HARO was very beneficial in this regard, as it helped many people connect and share their content.

I first heard about this story in a post by Rand Fishkin on Google+ earlier this morning, where the debate continues. It may not be HARO itself that’s targeted but rather low-quality sites linking to Bill’s content that triggered his investigation. Long story short, some of the syndicated content was scraped by low-quality sites that engage in “fishy or spammy” activity, as Bill mentions.

No matter which way you look at it, link building is becoming increasingly difficult as SEO evolves. This can be viewed as a good thing, in the sense that only high-quality content will prevail, but the uncertainty over which strategies are allowed is alarming.

My Two Cents

Be careful how and where you market your content. Bill’s story shows us that even trusted sites like HARO are still subject to scrutiny. Research and continually monitor backlinks to your site to prevent something like this from happening to you. Set up Google or Talkwalker alerts to receive notifications any time your brand or content is mentioned. This will help you see where mentions and links are coming from so you can determine whether or not you need to disavow them.

 

UPDATED 5/1/2014

A followup story by +Matt McGee highlights some examples suggesting Google may not be targeting HARO. Matt points out that low-quality sites scraped content from the original articles, which did not have any backlinks, and that those scraper sites duplicated the content and modified it to include backlinks to Bill’s clients. I will continue to monitor this and update it with any new evidence that surfaces.

Posted in SEO

Google Makes Up Its Own Title Tags


Sounds funny, right? But it’s true. The search giant confirmed that it alters or ignores the title tags on our sites to show what it feels may be more relevant to a user’s search query. A recent video by Matt Cutts explains how they display different content than what a webmaster has provided. In the video, Matt says that Google may use the content on our page, anchor text from backlinks pointing to that page, or a description from the Open Directory Project (DMOZ) in lieu of the original page title.

It almost seems unfair that this is done in the first place. If the search algorithm is designed to surface better sites based on a webpage’s content, and title tags are a part of that content, then Google shouldn’t step in to change anything. The video further explains what they look for when reviewing title tags: keep them short, include a good description of the site, and make sure they’re relevant to the query. Let’s dissect this a little further. Keeping them short, I understand. Though recent findings suggest that character limit is no longer the concern; it’s more so pixel width. Next is a good description of the site. Does this mean that we’re supposed to include the brand in the title tag? What about a company slogan; wouldn’t this add to the size of the title tag?

Lastly, we have relevance to the query. This is the one I’m having the most trouble with. If the title tag is relevant to the query to begin with, why does Google need to intervene? I get it: sometimes meta data (yes, I’m including title tags in this) is overlooked when launching a website, so it may not be fully optimized. I also understand that a website may have really good on-page content while overlooking those criteria, and Google may still want to serve that information. But it just seems unfair that webmasters can ignore certain guidelines and possibly rank better than the competition.

Why Not Allow Dynamic Keyword Insertion

In AdWords, there is an advanced feature that allows different keywords to be shown in an ad based on a user’s query. Why not allow something similar for organic search? That would show keywords and phrases related to the search query in the title.
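For context, AdWords’ dynamic keyword insertion works with a placeholder in the ad text; the keyword and default below are hypothetical examples:

```text
Headline: Learn {KeyWord:Blog Traffic Tips} Today
```

If the user’s search matches a keyword in the ad group, AdWords swaps it into the headline; otherwise the default text “Blog Traffic Tips” is shown.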

My Two Cents

I will continue to create and implement quality page titles, descriptions and content until I retire but I’m concerned with how small changes like this will continue to change in the future. Are we going to have less control over how our content is displayed in the SERPs? Is this pushing us all to use semantic markup more frequently? There are so many questions that only time will answer.

Posted in SEO

Semantic Markup Gets A Little More Detailed


Many of you who know me know that I have been waving the semantic flag for years now. So when news like this comes along I am all too excited to share it with everyone (who will listen).

Google recently announced new semantic markup for businesses that have varying points of contact. What does that mean? It means businesses will now be able to specify preferred phone numbers for the following departments:

  • Customer Service
  • Technical Support
  • Billing Support
  • Bill payment

Since contact information is a popular query for most businesses, using these tags will help to surface that information in the SERPs.
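As a sketch of what this can look like (the domain and phone numbers here are placeholders), the markup builds on schema.org’s Organization and ContactPoint types:

```json
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "url": "http://www.example.com",
  "contactPoint": [
    {
      "@type": "ContactPoint",
      "telephone": "+1-800-555-1212",
      "contactType": "customer service"
    },
    {
      "@type": "ContactPoint",
      "telephone": "+1-800-555-2121",
      "contactType": "technical support"
    }
  ]
}
```

Each department gets its own ContactPoint entry, so Google can surface the right number for the right query.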

Additional Recommendations

Along with this update, Google also released additional recommendations on how to craft the location page on your website. Listing business hours and contact numbers is always good, but also provide a little extra value: let visitors know what parking is like in the area, whether there is an ATM nearby, and whether your pharmacy hours differ from your store’s hours. Google provides further information here regarding schema for local business location pages.

Conclusion

If you’re not using any rich snippets on your website, then you may be missing out on business; it’s that simple. Since all of the major search engines now support semantic markup, this is the new low-hanging fruit as far as SEO is concerned.

Posted in SEO

Google Webmaster Tools – A Comprehensive Guide – Part 7 1/2


Hello and welcome to the final section of my 7 1/2 part series on Google Webmaster Tools. If you’ve made it this far, you should feel pretty comfortable using all the tools available. In this post I will be discussing the Disavow Tool.

Disavow Links

Before I get into how to use this feature, I first want to describe what disavow means. To disavow is to deny any responsibility for something. So, how does this apply to our backlinks? When we disavow a link, it sends a signal to Google that we are not responsible for creating that link and do not want to be associated with it. This could be due to low-quality links that were created pre-Penguin, links that are not related to our content or, in some cases, negative SEO.

Not available through the standard menu in Webmaster Tools, the disavow tool can be found here. Since this feature has the ability to cause a lot of damage if not used properly, I’m guessing Google doesn’t want to make it too easy for everyone to find.

 

Google Webmaster Tools Disavow Links Screenshot 1

When we first click the link above, we come across this screen. There is a warning notice from Google about using this tool. We are advised to try removing the links prior to disavowing them, as this tool will not solve all of our issues. Some ways we can achieve this are manually removing the links ourselves, if we have access to them, or reaching out to other webmasters and asking them to remove the links to our sites.

Depending on which account we’re logged into, we will see a series of URLs in the drop-down menu to the left of the red (should I press this?) button.

Let’s select the account we want to clean up and click DISAVOW LINKS.

 

Google Webmaster Tools Disavow Links Screenshot 2

The next screen we see contains another disclaimer from Google urging us to use caution when using this feature. Once we are certain that we have a complete list of URLs and/or domains that we want to disavow, we can click Disavow Links.

 

Google Webmaster Tools Disavow Links Screenshot 3

We see the same cautionary notice on this pop-up that we do on the previous screen. We are asked to upload a *.txt file that has all of the links we would like to disavow.

Remember, to get a complete list of URLs that are linking to us, we can use the tool under Search Traffic, Links to Your Site.

We can export that list and, after deeply researching the URLs, determine which links are safe and which we would like to disavow.

The file we upload needs to be formatted a certain way; otherwise it won’t be accepted. Google provides us with an example of how it should look here.

A hash symbol (#) at the start of a line comments it out, meaning that line will not be processed as part of the disavow file. Comments are a good way to document your efforts when removing links; for example, you can note when you contacted a webmaster to ask for your link to be removed.
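For illustration, a disavow file follows this shape (the domains here are placeholders, modeled on Google’s documented example):

```text
# Contacted owner of spamdomain1.com on 6/1/2014 to ask for
# link removal but received no response
domain:spamdomain1.com

# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
```

The `domain:` prefix disavows every link from that domain, while a bare URL disavows only that specific page.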

Conclusion

If you do need to disavow any links or domains pointing to your website, please be sure that you are careful and fully research the domain before you submit a disavow request.

On a personal note I would like to thank you all for reading this and taking this journey with me through every area of Google Webmaster Tools. I can only hope that I explained everything thoroughly and extensively enough that you all feel comfortable and confident every time you use these available tools. I look forward to any questions and comments you may have regarding any of this subject matter. Good luck!

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 7


Hello and welcome to part 7 of my 7 1/2 part guide to Google Webmaster Tools. In this post I will be covering the Labs section of Google Webmaster Tools.

Labs

There are not many parts to this section, but the ones that are here are very useful. The Labs section contains experimental tools that have not yet been pushed live to the other sections of Webmaster Tools. The two areas we’ll cover today are Author Stats and Instant Previews.

Author Stats

Those of us who have Authorship set up on our blogs may have noticed some different things about our websites. For starters, a thumbnail image from our Google+ profiles may be showing up next to our website in the SERPs. Authorship associates our identities with those websites and helps to establish us as authorities on that subject matter.

The benefit of implementing authorship is that it leads to a higher CTR, because there is a certain trust factor associated with tying an identity to a website.

In Webmaster Tools, we are able to see a chart of how frequently our work is showing up in the SERPs. The chart, similar to the ones seen under Search Appearance and Search Traffic, gives us a link to each page we wrote, the number of impressions it received, the number of clicks, the CTR percentage and the average position.

Google Webmaster Tools Author Stats Screenshot 1

It’s pretty cool to see all of your work in one central location like this.

Notice there are two links at the top of the image above. “You” is hyperlinked and points to your Google+ page. “Learn more about verifying authorship” takes us to a page where Google describes what authorship is and how to achieve it.

Instant Previews

This area in Webmaster Tools lets us see a preview of how our page would look in the preview window in the SERPs. Although Google is no longer showing previews in the SERPs, we can still see how one would look here.

Google Webmaster Tools Instant Preview Screenshot 2

Conclusion

Although the Labs section of Google Webmaster Tools contains experimental tools that are not completely out of beta, it’s definitely worth checking out because we can only learn more about our websites by exploring.

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 6


Hello and welcome to part 6 of my 7 1/2 part guide to Google Webmaster Tools. In this post I will be covering the Site Messages, Security Issues and Other Resources sections.

Site Messages

Here we are able to see any and all communication directly from Google regarding our websites. Don’t get too excited though; it’s not an open dialog but rather more along the lines of notifications. For example, if we were to connect our Webmaster Tools accounts to Google Analytics, we would see notifications like the ones below.

Google Webmaster Tools Site Messages Screenshot 1

Notice there are three options on the top menu when we first get to this screen. All allows us to see all messages regarding our website, Starred shows us messages that we manually star, which is like bookmarking them, and finally we have Alerts. Under Alerts are the important messages, such as whether we have received any manual action from Google or some sort of penalty. Clicking on a message subject will provide us with more detail and, in some cases, a link.

The two messages I have displayed are letting me know that I have chosen a preferred version of my domain, joerega.com sans the www subdomain, and that I have linked my Webmaster Tools profile to my Google Analytics account. This allows for more information in Analytics, such as keyword data, landing pages and geographical data. One of the messages we may receive regarding our websites is about Security Issues.

Security Issues

This section in Google Webmaster Tools shows us any warnings we may receive regarding an attack or a hacked site. If you have ever seen a website in the SERPs display a message underneath its URL that says this site may harm your computer or this site may be hacked, chances are the webmasters would head over to this section to see what’s wrong. Google goes into greater detail here. Fortunately, there are no issues with my site, and I hope that all of you reading this see the following when you log in to this section:

Google Webmaster Tools Security Issues Screenshot 1

Other Resources

Here we have access to some more powerful tools in Webmaster Tools.

Google Webmaster Tools Other Resources Screenshot 1

First up is the Structured Data Testing Tool. This will let us test any rich snippets we have on our website, either by URL or from the source code directly.

Google Webmaster Tools Structured Data Testing Tool Screenshot 1

Google Webmaster Tools Structured Data Testing Tool Screenshot 2

If any schema are used to mark up your website, this is the section to use to test it. Since structured data is not yet widely implemented, there is also a tool to help us make sure our configuration is correct: the Structured Data Markup Helper. This area allows us to select the type of data we are trying to mark up and enter the URL or HTML. It works for both websites and email.

 

By selecting the type of data and then entering the information below, Google will know what data to look for and can therefore tell us if our markup is correct.

Google Webmaster Tools Structured Data Markup Helper Screenshot 1

The email screen has a different set of data types to choose from. Rich snippets in email allow companies to send dynamic, enhanced emails. These may include flight departure and duration times as well as a countdown to a specific event. Here are the current options at the time of this writing:

 

webmaster tools structured data markup helper screenshot 2

Google Webmaster Tools Structured Data Markup Helper Screenshot 2

For structured data in email, the only way to verify that the coding is correct is by HTML.
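To give a feel for what these email snippets look like, here is a minimal, hypothetical flight example using schema.org's FlightReservation type in JSON-LD, embedded in the email's HTML (all names, codes and times are made up):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "FlightReservation",
  "reservationNumber": "RXJ34P",
  "underName": { "@type": "Person", "name": "Jane Doe" },
  "reservationFor": {
    "@type": "Flight",
    "flightNumber": "110",
    "airline": { "@type": "Airline", "name": "Example Air", "iataCode": "EA" },
    "departureAirport": { "@type": "Airport", "iataCode": "JFK" },
    "departureTime": "2014-03-06T09:00:00-05:00",
    "arrivalAirport": { "@type": "Airport", "iataCode": "SFO" },
    "arrivalTime": "2014-03-06T12:30:00-08:00"
  }
}
</script>
```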

The next link down is the Email Markup Tester. This is just another place to test the rich snippets in your email, similar to the one above, just without the data type selection.

The following link is for Google Places. This is not a part of Webmaster Tools but an external link to the Google Places website. Since it's not a part of Webmaster Tools, maybe I'll cover it in a future blog post.

After that we see Google Merchant Center. This is another external link, this time to the Merchant Center website. Merchant Center is used to upload product data to Google Shopping and other Google services wherever available.

The second-to-last link is PageSpeed Insights. Another external link, this time to Google Developers, PageSpeed Insights allows us to see how fast our websites are and what areas, if any, we can adjust to make them faster. It's also a great way to gauge the user experience. There are two options to measure speed: one for mobile and one for desktop.

The final link in this section is Custom Search. Google Custom Search allows us to add a search engine to our website. It will display search results that favor our webpages first and can be configured in a number of different ways. Google continues to improve upon this feature, as it is now available with different schema.org types.

Conclusion

We touched upon some more features of Google Webmaster Tools today, from the different types of site messages we can receive to the security issues that warn us when our sites may be hacked. We also covered the Other Resources section, which contains some advanced tools regarding rich snippets, Google Places, Google Merchant Center, PageSpeed Insights and how to create our own custom search engine.

In my next post I will cover the Labs section of Google Webmaster Tools.

Posted in SEO, SEO - Google Webmaster Tools

Google Webmaster Tools – A Comprehensive Guide – Part 5


Welcome to part 5 of my 7 1/2 part guide to Google Webmaster Tools. In this post I will be covering everything under the Crawl section.

Crawl

This area of Google Webmaster Tools contains a lot of advanced features used to control the search visibility of your website and provides you with information on how often your site is crawled. The first area in this section is called Crawl Errors.

Crawl Errors

I’m not going to spend too much time on this section since I covered it in Part 1, so I will just do a quick review. This section shows us a list of errors that Google has found when crawling our websites within the last 90 days. We are able to download that list, which includes server codes, URLs and the dates detected, giving us more information to diagnose and solve those issues. Recently added features include showing errors from Smartphone and Googlebot-Mobile crawls. From here you can also see data on DNS, server connectivity and robots.txt.
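As a quick sketch of what you can do with that downloaded list, here is a short Python snippet that tallies errors by response code. The column names and sample rows are assumptions for illustration; the actual export may differ:

```python
import csv
import io
from collections import Counter

# Hypothetical sample of a crawl-errors export; real column names may differ.
data = """\
URL,Response Code,Detected
http://example.com/old-page.html,404,3/1/14
http://example.com/broken.html,404,3/2/14
http://example.com/down.html,503,3/2/14
"""

def count_errors_by_code(csv_text):
    """Tally crawl errors by HTTP response code."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["Response Code"] for row in reader)

print(count_errors_by_code(data))  # Counter({'404': 2, '503': 1})
```

Sorting the tally like this makes it easy to see at a glance whether you are dealing mostly with missing pages (404s) or server problems (5xx errors).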

Crawl Stats

This area under Crawl shows all Googlebot activity on our website for the past 90 days. There are three graphs we are presented with when we first land on this page: Pages crawled per day, Kilobytes downloaded per day and Time spent downloading a page (in milliseconds).

Webmaster Tools Crawl Stats

Google Webmaster Tools Crawl Stats Screenshot

Rolling over a graph will provide us with a date and the information pertaining to that day: how many pages were crawled, how many kilobytes were downloaded or how much time was spent downloading a page. To the right of these charts are information highlights such as High, Average and Low. This makes it a lot easier to diagnose any issues with site load time. If there are any issues with our site, Webmaster Tools allows us to use Fetch as Google to see how Googlebot sees the page.

Fetch as Google

The third link down under Crawl is Fetch as Google. Here we have the ability to submit pages to Google’s index as well as see how Google’s spider sees the page.

Webmaster Tools Fetch as Google

Google Webmaster Tools Fetch as Google Screenshot

From here we have a few options to choose from when we want to see how Google sees our website. The first step is to enter a URL path into the text box. For example, if your website is http://www.example.com, you only need to enter the page you are interested in seeing; this is what comes after .com/. Let's say we are interested in /sample.html: we would enter sample.html in the space provided and click FETCH. After we click Fetch we are prompted with a pending message, then Your request was completed successfully. Here is what it looks like:

webmaster tools fetch as Google screenshot 2

Google Webmaster Tools Fetch as Google Screenshot 2

Since I don’t currently have the page /sample.html on my website, this tool is going to return an error message of Not found. From doing SEO we know that not found is a 404 error.

webmaster tools fetch as Google screenshot 3

Google Webmaster Tools Fetch as Google Screenshot 3

Messages in the Fetch Status column are hyperlinks; clicking them will take us to a page that outlines the status in more detail. Here is a small sample of what you would see on a Not found error page:

Fetch as Google

This is how Googlebot fetched the page.
URL: http://joerega.com/sample.html
Date: Thursday, March 6, 2014 at 4:23:54 AM PST
Googlebot Type: Web
Download Time (in milliseconds): 2016
HTTP/1.1 404 Not Found
Date: Thu, 06 Mar 2014 12:24:20 GMT

Notice the 404 Not Found message above. Since this page does not exist, there is no HTML source code to download. It's safe to assume that this page never existed, or was removed, and should be properly redirected. Jumping back to the main Fetch as Google page, there are a few other areas to discuss. First, you may have noticed the number under Fetches remaining; this number is 500, meaning you are allowed to use this feature for 500 URLs per week. After entering a URL (that exists) and clicking Fetch as Google, the message that appears under Fetch status will say Success, and a button will appear next to it that says Submit to index. This allows us to submit a page to Google's index faster than waiting for their spiders to crawl our site. If we select the option to submit a URL to Google's index, we are prompted with two options: Submit just this URL and Submit this URL and all linked URLs.
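For example, if /sample.html had once existed and was replaced by another page, a permanent redirect would typically be added at the server. A hypothetical Apache .htaccess rule (assuming an Apache server and a made-up destination page) might look like:

```apache
# Hypothetical rule: permanently redirect the removed page
# to a live replacement so visitors and crawlers land somewhere useful.
Redirect 301 /sample.html http://www.example.com/replacement-page.html
```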

webmaster tools fetch as Google screenshot 5

Google Webmaster Tools Fetch as Google Screenshot 5

Notice that the Fetch status on the new URL now says Success and has Submit to index next to it. Clicking Submit to index will cause a pop-up to appear giving us two options: submit just this one URL, or this URL plus all linked pages. The latter should only be used if there were major updates to your website; this is why that option is limited to 10 per week.

webmaster tools fetch as Google screenshot 4

Google Webmaster Tools Fetch as Google Screenshot 4

Once submitted, the Submit to index button will change to say URL submitted to index. We can then explore the other fetch options that are available. Entering another page or directory into the fetch bar, we see that there are five options total: Web, Mobile XHTML/WML, Mobile cHTML, Mobile Smartphone – New and Mobile Smartphone – Deprecated. We already saw how the first option, Web, works when we entered the URL above and submitted it to the index. The other options are used to see how your page will perform on various mobile devices. In an effort to save time and not get too far off course, I will just mention that they test the different types of markup used for mobile development. Now, to do a complete 180, let's discuss another area in Webmaster Tools where we can tell if Google and other search engines are ignoring URLs or directories on your site.

Blocked URLs

Blocked URLs in Webmaster Tools allows us to see if our robots.txt is working properly and blocking the content we don't want indexed. Be careful how you configure your robots.txt because it's very easy to overlook something and block an entire section of your website, or your entire website, from Google (yes, I've seen this happen). Here is what a basic robots.txt looks like:

User-agent: *
Disallow:

The asterisk (*) after User-agent: says that all search engines are welcome to crawl the site. Disallow: on the next line is currently saying that we are not disallowing search engines from crawling anything. If we were to include a forward slash after Disallow:, this would block our entire website from all search engines.

User-agent: *
Disallow: /

This is what a robots.txt looks like when blocking an entire site.
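If you'd rather test rules outside of Webmaster Tools, Python's standard library ships a robots.txt parser that applies the same Disallow logic. A small sketch, using a made-up rule set:

```python
import urllib.robotparser

# Parse a hypothetical robots.txt in memory (no network needed)
# and check which URLs it would block.
rp = urllib.robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /test.html",
])

print(rp.can_fetch("*", "http://example.com/test.html"))   # False: blocked
print(rp.can_fetch("*", "http://example.com/about.html"))  # True: allowed
```

This is handy for checking a new rule against a list of important URLs before the file ever goes live.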

In this section we are able to test our robots.txt before it’s uploaded to check for any errors. Here is a screenshot of how it looks:

webmaster tools blocked urls screenshot 1

Google Webmaster Tools Blocked URLs Screenshot 1

Currently we can see that no search engines are blocked and my website is open for indexing. Further down this page we see an area where we can test blocking pages before the change goes live in our robots.txt. Here I used the page /test.html (which doesn't actually exist on my site) to show the results.

webmaster tools blocked urls screenshot 2

Google Webmaster Tools Blocked URLs Screenshot 2

The page /test.html would be successfully blocked if I were to add this to my robots.txt. Similar to the Fetch as Google section mentioned above, we are able to test our robots.txt against other Google user-agents such as Googlebot-Mobile, Googlebot-Image, Mediapartners-Google (used to determine AdSense content) and AdsBot-Google (used to determine AdWords landing page quality). Some people choose to add their XML sitemap to the robots.txt as well; while it's not required, it's also not harmful to do so. Picture the robots.txt as a roadmap to your website: search engines will stop here first to determine what pages can be crawled and indexed. They already know to look for an XML sitemap, but adding the line Sitemap: http://www.example.com/sitemap.xml may bring peace of mind to the webmaster. Sitemaps themselves can become very complicated if there is a mistake, which is why I'm glad that Webmaster Tools provides a section for that as well.

Sitemaps

Sitemaps are an important part of a website. While HTML sitemaps are geared towards humans, XML sitemaps cater towards machines; more specifically, search engines. Google Webmaster Tools allows us to upload multiple sitemaps and will alert us to any errors they may contain or develop. We are able to see, in this section as well as in Google Index, how many pages are indexed.

Without spending too much time on sitemaps themselves, I will only cover the basics here today and how we upload them to Webmaster Tools.

The second to last option in the left-nav under Crawl is Sitemaps. Clicking on it will bring us to the following page:

webmaster tools sitemaps screenshot 1

Google Webmaster Tools Sitemaps Screenshot 1

This is easy to visualize since I currently don't have many pages on my website. The first step in entering a sitemap is to click the red button in the upper-left corner that says ADD/TEST SITEMAP. As soon as that button is clicked we see the following window appear:

webmaster tools sitemaps screenshot 3

Google Webmaster Tools Sitemaps Screenshot 3

An XML sitemap generally lives at the root of the domain as a file. What does that mean? It means it can be found here: http://www.example.com/sitemap.xml. Some websites will have site-map.xml or sitemap.xml.gz; these formats work just the same. After you enter the file name in the space provided, click Submit Sitemap. This will add your sitemap to Webmaster Tools and provide you with the image you see above: how many pages are indexed versus how many have been submitted.
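For reference, a minimal sitemap.xml is just a urlset with one url entry per page. This skeleton (with placeholder URLs) is all that's required:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-03-06</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Only the loc element is mandatory for each url; lastmod and the other optional elements simply give search engines extra hints.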

If your website is new and recently launched, you may only see one column for Submitted. That’s perfectly normal as Google has not indexed any of your pages yet.

Before submitting your sitemap you may want to test it to see if there are any errors. Instead of clicking Submit Sitemap, click Test Sitemap and you will be given feedback on the health of your XML sitemap.

Once your Sitemap is submitted and the URLs are indexed, you can click on the second tab that says Web Pages.

webmaster tools sitemaps screenshot 2

Google Webmaster Tools Sitemaps Screenshot 2

This page will let you know if there are any issues with your Sitemaps. I say Sitemaps because you can submit more than one in this section. Many webmasters upload separate Sitemaps for different types of content: one for images, one for videos, etc.

URL Parameters

This is another technical section of Google Webmaster Tools and should be used very carefully. Changing how Google handles our URL parameters may severely affect how our sites are crawled and indexed.

URL Parameters work like this: let's say we have an eCommerce website that sells shoes. One of our URLs may look like this: http://www.example.com/mens?category=sneakers&nike. There may also be another version of the URL that looks like this: http://www.example.com/mens-nike-sneakers.html. To avoid any duplicate content issues with our websites, we can use this section to show Google the main URL we would like indexed. Again, I am going to stress the importance of being extremely careful in this section. Google goes into greater detail about this and you can read more here.
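Outside of Webmaster Tools, you can also reduce this kind of duplication at the source by normalizing your URLs. A small Python sketch (the parameter names here are made up) that drops ignorable parameters and sorts the rest, so the same page always produces the same URL:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def canonicalize(url, ignore_params=("sessionid", "ref")):
    """Rebuild a URL with ignorable parameters dropped and the rest
    sorted, so parameter order no longer creates duplicate URLs."""
    parts = urlsplit(url)
    params = [(k, v) for k, v in parse_qsl(parts.query) if k not in ignore_params]
    query = urlencode(sorted(params))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

print(canonicalize("http://www.example.com/mens?category=sneakers&brand=nike&sessionid=abc123"))
# http://www.example.com/mens?brand=nike&category=sneakers
```

Pointing a rel="canonical" tag at the normalized form accomplishes the same goal from the page side.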

 

Conclusion

We covered a lot of material in this post today, from diagnosing crawl errors to learning how Google sees our website. The Crawl section of Webmaster Tools is very useful when we need to dig deeper into our site's health.
