26 Sep, 2012  |  Written by  |  under Analytics, SEO

A client’s internal team revised their web site a few months ago, and they noticed that their “sitelinks” went away in the process. Sitelinks are those indented listings below a regular Google result. For example:

[Image: examples of sitelinks from Habitat for Humanity]

These are a handy way to get more traffic to your site, especially for branded searches where folks may want to click directly to a specific page on your site instead of going to your homepage and then clicking around.

In our case, the client lost their sitelinks because they installed WordPress in a folder instead of at the root of their site. So their URL structure was like:

  • www.site.com/Wordpress
  • www.site.com/Wordpress/about-us
  • www.site.com/Wordpress/products
  • www.site.com/Wordpress/support
  • www.site.com/Wordpress/contact-us

Initially we thought the problem was just that their web root (www.site.com/) needed a 301 redirect to www.site.com/Wordpress, but adding that didn’t get their sitelinks back. Instead, we had to move WordPress up to the root. Instructions are here: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory

We also used the Redirection plugin to 301 their old URLs to their root locations. An .htaccess redirect would have worked just as well.
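If you prefer the .htaccess route, a minimal sketch might look like this (assuming Apache with mod_alias enabled; the folder and domain are placeholders matching the example above, so adjust to your setup):

```apache
# Hypothetical example: 301 everything under the old /Wordpress/ folder
# to the matching path at the site root.
RedirectMatch 301 ^/Wordpress/(.*)$ http://www.site.com/$1
```

Either way, the important thing is that the redirects are permanent (301), so search engines transfer credit to the new root-level URLs.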

One caveat: Be sure to update your Settings: General: Site Address (URL) and not your Settings: General: WordPress Address. The latter is your admin URL and must explicitly point at the real folder for your admin area to work. If you break it, you can fix it by adding this to your wp-config.php (note that the constant name is odd relative to the naming used in the admin area):

define('WP_SITEURL', 'http://www.yoursite.com/Yoursite/');

We made this update, and a week later the site had its sitelinks back!

 

29 Aug, 2012  |  Written by  |  under Analytics, SEO

I’ve had a few clients ask about Google’s Panda and Penguin updates this year, and at least one I think has been impacted by each. Here’s what a “Penguin” smackdown looks like, with a SHARP decrease in inbound Search Engine traffic on April 24, 2012:

So, what’s next for these guys? In this case it was an “extra” site that doesn’t have much impact on the business, which is why we didn’t notice the traffic drop-off immediately. The solutions are the same as always:

  1. Remove the old duplicate content
  2. Produce new unique content
  3. Promote the new content in social networks and generate real buzz

That’s a lot harder than re-using syndicated content and cheap link building, but the results will last a lot longer.

23 Feb, 2012  |  Written by  |  under Analytics

A frequent problem I see in evaluating web site traffic is that some site pages are missing Google Analytics code, preventing them from being tracked. Very often, these are form submission “thank you” pages (which are ideal Goal Pages in Analytics!) or other pages that have “funny” templates relative to the rest of a site.

The procedure I use to track these down works like this:

  1. Spider the site to capture all files, ignoring images:
    wget --recursive --random-wait -w 1 --force-html -R "gif,png,jpg,pdf,css,js,mov" http://www.yoursite.com/
  2. Look for pages that DON’T have Analytics, again ignoring image files, css, etc.:
    find . -type f -print | xargs grep -c UA-12341234 | grep -v png | grep -v gif | grep -v jpg | grep -v images | grep -v pdf | grep -v css | grep -v robots.txt | grep -v '\.js' | grep :0

Works like a champ to ferret out weird pages. You can of course use the same approach to grep for other strings that need to be on all site pages.
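A simpler variant of step 2 (a sketch, assuming GNU or BSD grep and that UA-12341234 stands in for your own tracking ID) is grep’s -L flag, which prints only the files that do NOT contain the pattern:

```shell
# Hypothetical demo: build a tiny two-page site copy, then list the
# HTML files that are missing the Analytics ID.
mkdir -p /tmp/ga-audit && cd /tmp/ga-audit
printf '<html>ga("create", "UA-12341234");</html>\n' > tracked.html
printf '<html>no tracking code here</html>\n' > untracked.html

# -r recurses, -L prints non-matching files, and --include limits the
# search to HTML so images, css, js, etc. never enter the picture.
grep -rL 'UA-12341234' . --include='*.html'
# Prints ./untracked.html
```

This collapses the whole chain of grep -v filters into one --include, at the cost of requiring a grep new enough to support it.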

 

I was emailing with scrum expert and all-around great guy Dan Greening and thought this might be useful for others as well.

When considering search engine optimization, remember to focus first on goals, then traffic, then phrases, then rank. Ranking for irrelevant phrases won’t get you more or better leads.

Early on, be sure to set up goal/conversion tracking in Analytics. Most sites have several goals:

  1. Site visit duration. Long visits (1 minute longer than average?) = goal worth perhaps $5?
  2. Site visit pages. Many pages (1 or 2 more than average?) = goal worth another $5?
  3. Newsletter subscribes. $20 value?
  4. Contact Us submission. These are probably worth $100?

Here’s a related post if you’re having trouble setting goal values.

Once you have goal and conversion info, you can trace it back to phrases and find out which ones work best for you. This is a bit of a rosy outlook in that many sites won’t have enough high-value conversions to be statistically significant, which is why I emphasize soft goals like time-on-site and visit-depth.

Folks who spend a lot of time on your site or look at lots of your pages are pretty likely to subscribe to your newsletter or submit your contact form later on, so look at the phrases driving that kind of traffic and put your SEO time and money into those.

Most of the sites I work on these days are on the small end of the scale, but a few are large news portals. One of them went through a site redesign a few months ago (before I got involved), and they’ve seen a large drop-off in traffic. I’m starting to look into why that happened, and wanted to share some ideas about how to fix the problem.

Continue Reading ->
16 Mar, 2009  |  Written by  |  under Analytics

I’m a big fan of Google Analytics, and check it often for clients’ sites, to see how things are going and get ideas about how to make improvements. One of the most frustrating stats for me is the “Bounce”. Today, I’ll show you a new tool for getting inside what happens during a bounce.


Continue Reading ->

12 Mar, 2009  |  Written by  |  under Analytics

One of the great mysteries of Google Analytics is just why they won’t let us see referring URLs of our users. Probably some privacy concern of Google’s, but it’s easy as pie to get from web server logs, so I don’t understand the issue.

I’m so dependent on Analytics for day-to-day work that I really would prefer to have all the info in one place (Analytics) instead of having to integrate log-based reporting (and deal with clients asking why the two give different numbers!).  Here’s how to do it:

Continue Reading ->

11 Mar, 2009  |  Written by  |  under Analytics

I normally set up sites to just log 404s and check the log files for problems, which is fine for me, but hard for clients.  Here’s how I recently set up a site so the client’s marketing folks could spot 404s on their own:

Continue Reading ->