My Client’s Site Isn’t Ranking, Now What?

How to clear up some technical errors

It happens hundreds of times a day. You head over to Google to manually search for your client’s position and they are no longer on page 1. Sometimes they may even disappear from the first 3 pages of Google. This can be very confusing because you may not have altered the techniques you were using when they were ranking better just hours earlier. Needless to say, your client will not be happy either.

So what can you do to make sure your website is properly optimized? Perform a technical SEO audit.

Step 1: Inspect The Layout and Usability

Search engines seem to have a preference for information being laid out a certain way, and straying from this may be detrimental to your rankings and traffic. Here are several steps I take when presented with an issue:

  • Make sure the NAP (Name, Address & Phone Number) is clearly listed in the header as text and not an image. Text seems to rank better than images in terms of information.
  • Make sure the file name for your logo is company-name-logo.xxx. This will help with branding when someone executes a branded search.
  • Review the internal linking structure of your website.
    • Does your website default to .com/ while your home button links to .com/index.html? If yes, then update it.
    • Are there any links that lead to 404 pages? If yes, then redirect them.
    • Is your most important content or product easily accessible from the home page?
  • Are you taking full advantage of your footer? Make sure that your NAP and important links are reiterated here.
  • Does your website redirect to the www version of the URL, or are both the non-www and www versions open? (See the sketch after this list.)
  • Is the information on all content pages easily broken down? Does it have a hierarchy of H1, H2’s and H3’s?
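
For the redirect items above, here is a minimal .htaccess sketch; it assumes an Apache server with mod_rewrite enabled, and example.com is a placeholder domain:

# Send direct requests for index.html back to the root URL
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ / [R=301,L]

# Force the www version of the domain
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Point a link that now leads to a 404 at its replacement
Redirect 301 /old-page/ http://www.example.com/new-page/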

Step 2: Get More Technical – Review the Robots.txt and XML Sitemaps

Every website has, or at least should have, a robots.txt file and an XML sitemap. The robots.txt file is used to give information, or instructions, to search engine spiders, such as which pages to crawl and which should be ignored. It can also point to the XML sitemap. But what should it look like?

The most common robots.txt will look something like this:

User-agent: *
Disallow:

The asterisk is a wildcard indicating that all user agents, or search engines, are welcome to crawl this site. Since there is no directory specified after the Disallow command, all pages are open for crawling.
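
A slightly fuller robots.txt might block a private directory and point spiders to the sitemap. Here is a sketch, with example.com standing in as a placeholder domain:

User-agent: *
Disallow: /admin/

Sitemap: http://www.example.com/sitemap.xml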

In its simplest form, an XML sitemap contains a list of the URLs on your website. This list may also contain information about when each page was last updated or how frequently it is updated. You also have the ability to indicate the most important pages by setting a priority.
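
Here is a minimal sketch of what such a sitemap might look like, following the sitemaps.org protocol; the URL and date shown are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>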

Step 3: Get Even MORE Technical – Is Everything Where It’s Supposed To Be?

At this point you may want to review the source code of your website. This will show you the code used to build your site, such as HTML, CSS and JavaScript. Here are some things to look out for when reviewing it:

  • Is the Google Analytics tracking code where it’s supposed to be?
    • This should be placed in the header, just before the closing head tag </head>.
  • Do you have the proper meta data?
    • Meta Description, Meta Keywords, ICBM coordinates
  • Are you using Authorship?
    • Set up the proper rel="author" and rel="publisher" tags. (A sample head section follows this list.)
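
To put those pieces together, here is a sketch of a head section. Every value shown is a placeholder, and the Google Analytics snippet itself should be the one generated in your own Analytics account:

<head>
  <title>Company Name | What You Do</title>
  <meta name="description" content="A short, compelling summary of this page.">
  <meta name="keywords" content="keyword one, keyword two">
  <meta name="ICBM" content="40.7484, -73.9857">
  <!-- Authorship: point these at your Google+ profile and page -->
  <link rel="author" href="https://plus.google.com/your-profile-id">
  <link rel="publisher" href="https://plus.google.com/your-page-id">
  <!-- Google Analytics tracking code goes here, just before the closing tag -->
</head>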

Step 4: Review Google Webmaster Tools to Find Deeper Issues

Google Webmaster Tools is a free service offered by Google that allows someone to see how their site is operating from a technical standpoint. If you think your website may be performing poorly in the search engines, the answer may lie here.

What should you be looking for? From the dashboard you can review the following:

  • Check the number of 404 errors you are showing. If you have a lot, then you should set up 301 redirects to correct them.
  • Check how many of the pages submitted in your XML sitemap are indexed. If the submitted number is larger than the indexed number, then you have a problem.
  • You may need to set canonical tags on some pages, set up 301 redirects, or check to make sure that certain pages don’t have a noindex,nofollow tag on them. This sometimes happens during development: the noindex,nofollow tags are not removed and the pages are pushed live. (See the sketch after this list.)
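
Here is a sketch of the tags and directives involved; the URLs are placeholders, and the .htaccess line assumes an Apache server:

<!-- In the head of a duplicate page: point search engines at the preferred URL -->
<link rel="canonical" href="http://www.example.com/preferred-page/">

<!-- The tag to look for (and remove) if a development page was pushed live -->
<meta name="robots" content="noindex,nofollow">

# In .htaccess: a 301 redirect for a URL that now returns a 404
Redirect 301 /old-page/ http://www.example.com/new-page/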

Some other features available in Webmaster Tools allow you to check your backlink profile to see which pages and websites are linking to you. This is useful for determining the quality of those sites as well as the number of links you’re receiving. Another great feature is the query data that Google provides. Until recently, this section only displayed an average number of clicks your website was receiving, but as of last week, Google announced an update that will now display more accurate impressions and clicks for the keywords related to your site.

There are many different approaches to take when performing a technical audit of a website; the trick is to quickly find the issue and resolve it so you can get your rankings and traffic back. You may also want to look at your social profiles and make sure they are optimized and correctly sending traffic to your website.

What are some things you look for when your site’s rankings and traffic drop?

About the Author:

An SEO in NYC with a penchant for the technical side of things. Father, Husband, Novice Photographer and Music Lover.
