Monday, November 30, 2009

External Links for SEO: Link Quantity vs Quality vs Diversity

Now that you've learned how building up inbound links to your web page from external websites is probably the single-most important element in SEO, you're embarking on a campaign to foster inbound links to your site. But how do you decide what links to go after and how to prioritize across various opportunities?

As a general rule of thumb, you want to have as many links from a diverse set of the most reputable sites possible.

First let's take quantity versus quality. You can think of the PageRank algorithm as a summation of all the links you receive, weighted by their authority. So, for example, if you receive 5 different links from 5 different sites with "authorities" of 2, 3, 6, 7, and 7, you could say that the total "authority" gained by your site is 2+3+6+7+7, or 25. On the other hand, if you received 3 different links from 3 different sites with authorities of 9, 9, and 9, you would have received 9+9+9, or 27 "authority points." In this example, you would have been better off with fewer, but higher quality, links.
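If it helps to see the arithmetic, here's a tiny sketch of the comparison. The "authority" numbers are just the made-up values from the example above; real PageRank is of course not a simple sum like this.

```python
# Toy comparison of total "authority points" from two hypothetical link profiles.
# The numbers are the made-up values from the example above; real PageRank is
# far more involved than a simple sum.
many_weaker_links = [2, 3, 6, 7, 7]   # five links from less authoritative sites
few_stronger_links = [9, 9, 9]        # three links from highly authoritative sites

print("Five weaker links:   ", sum(many_weaker_links))    # 25
print("Three stronger links:", sum(few_stronger_links))   # 27
```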

That tradeoff between link quantity and link quality should be fairly clear. Of course, keep in mind that a link from a more authoritative site is oftentimes more difficult to obtain, and thus may require more time and effort on your part.

The third variable I'll throw in here is link diversity. Take the 3-link example above: if all 3 links come from a single site (from different pages, or from different parts of the same page), that's worth less than if each of the 3 links comes from a different site. The more diverse set of linking sites is better and thus more likely to rank your site higher.

It's even said that search engines may grade sites as being more or less diverse from each other based on past link patterns and other factors, so there are varying degrees of "diversity." Keep link diversity in mind when building your inbound link profile - it's yet another factor the engines look at.

Tuesday, November 24, 2009

Which URL Shortener to Use for SEO

The explosion of Twitter in 2009 has led to a surge in the use of URL shorteners, as well as a surge in the number of different URL shorteners out there.

There are a few things, from the SEO perspective, you need to know when selecting which URL shortener to use.

First, you want to pick a URL shortener that is going to exist into the future. If shortened links to your site are being posted throughout the web, those are all links that could be in jeopardy in the event that the URL shortener used to shorten them becomes defunct. This happened most recently with tr.im and a lot of controversy followed. Some efforts (most notably 301works) have arisen to remedy this risk, but regardless, I'd suggest picking the most robust URL shortener when shortening your own URLs (if others are shortening your URLs there's little you can do there).

The other thing you need to make sure of is that you use a shortening service that issues a 301 redirect to your URL (we covered when to use a 301 redirect previously). Search Engine Land had a good post on this topic, with a chart covering the main shorteners and which ones use proper redirects (as well as other factors you may want to consider).
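If you want to spot-check a shortener yourself, here's a rough sketch using Python's standard library that asks a short URL what status code it returns, without following the redirect. The short URL below is just a placeholder, not a real link.

```python
# Check what HTTP status code a shortened URL returns (301 vs 302), without
# following the redirect. The short URL here is only a placeholder.
import http.client
from urllib.parse import urlparse

def redirect_status(short_url):
    parts = urlparse(short_url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    response = conn.getresponse()
    location = response.getheader("Location")
    conn.close()
    return response.status, location

status, target = redirect_status("http://bit.ly/example")  # placeholder short link
print(status, "->", target)  # ideally you see 301 here, not 302
```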

In short, to make your decision easier, I'd say go with the biggest shortener. They're likely to be using the redirects correctly (or else not so many people would use them) and they're also likely to be a robust solution that will be around into the future.

Use Google Toolbar to View PageRank - SEO Tools

We've talked about the concept of PageRank before - the numerical measure that Google assigns for how important a given page is in the context of the overall web. PageRank is a critical determining factor in how search results are ranked by Google, and therefore a critical component of SEO.

We know how important PageRank is, but how do we know what PageRank is assigned to a given web page? There are a number of tools out there, but one that I find to be useful and convenient is the Google search toolbar. Not only can you easily search from the toolbar, but, among other things, it also shows you what the PageRank is for the page you're currently on.

This is a great way to quickly estimate how valuable a link you're receiving is, since it tells you how reputable Google thinks the linking page is. That said, the PageRank shown in the toolbar is not always up-to-date, and it also doesn't speak to how relevant a specific page might be for a given topic (a page might have a lot of authority, but only for a single topic, and that nuance is lost in the single PageRank number).

Hope you can add Google Toolbar to your SEO toolbox as yet another tool that makes your analysis quick and effective.

Sunday, November 22, 2009

Design Your Pages for Humans not for Search Engines

I decided to kick off this week of SEO tips with a little piece of general wisdom for how to approach your overall SEO strategy: Design your site for humans not for search engines.

If you start compromising the user experience in order to drive more organic search traffic, you'll soon find yourself with a site that is great for bots to read but terrible for users. In the long run you won't be able to retain users - or even hold them through their first visit (driving up the bounce rate as people land on your page and don't get your value proposition).

The reverse is also true - if you build a site that delights real users, you'll soon gain the respect of other sites, earning you valuable inbound links that will give you far greater value than the SEO tweaks you might be doing that destroy the user experience. Hope you enjoy that bit of Monday morning wisdom.

Thursday, November 19, 2009

Re-directs and SEO: When to Use 301 and 302 Redirects

URL redirection, or URL forwarding, is basically when a website tells a browser that a page has moved. For example, the page formerly hosted at http://OldURL.com may now be at http://NewURL.com. Thus, as the webmaster you'd want to tell anyone who goes to OldURL what the new address is - something you can do automatically via a browser redirect (an HTTP status code starting with a 3).

The two most common types of redirects are the 301 redirect and the 302 redirect. A 301 redirect means that a page has been moved permanently, while a 302 redirect means that a page has been moved temporarily (or for an unknown reason).

If your page is moving to a new URL for whatever reason, and you want to maintain the SEO authority that you've built up (e.g., from multiple inbound links pointing to the old URL), you'll want to use a 301 redirect. The 301 redirect tells search engines that whatever authority they were previously ascribing to the old URL should now be passed on to the new URL, and therefore you shouldn't lose any of your SEO authority.
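To make the mechanics concrete, here's a minimal sketch of what serving a 301 looks like, using Python's built-in http.server purely for illustration. In practice you'd configure this in your web server or application framework, and the destination URL here is just the placeholder from the example above.

```python
# Minimal illustration of a permanent redirect: any request to this toy server
# is answered with a 301 pointing at the new URL. A real site would configure
# this in its web server or application framework instead.
from http.server import BaseHTTPRequestHandler, HTTPServer

NEW_URL = "http://NewURL.com"  # placeholder destination from the example above

class PermanentRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)                # 301 = moved permanently
        self.send_header("Location", NEW_URL)  # where the page lives now
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), PermanentRedirectHandler).serve_forever()
```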

That said, there have been occasions where people have attempted to use redirects to game search engines. For example, one could buy another site with a good deal of inbound links and then 301 redirect that site to one's own site to transfer its link credit. Search engines can often spot these maneuvers and will see through them, removing the inbound link credit. If you're going to purchase a site and want to maintain its credibility, make sure to tread carefully and read up on best practices before doing it.

Wednesday, November 18, 2009

Site Speed and SEO - Does a Slow Site Get Penalized or a Fast Site Rewarded?

Folks often ask me whether site load time affects their SEO, and the short answer is: Yes!

The importance of having a site that loads fast (or at least adequately fast) can be imputed from the fact that search engines want to point users towards reputable and high-quality sites that match their search queries. A slow site that hangs forever is not a good experience for the user, and by extension, for the search engine that led that user there.

The fact that Google's Webmaster Tools reports on crawl speed (time spent downloading a page) shows that they're watching (aren't they always?). Also, I've noticed a direct correlation between faster load times and more pages crawled per day (which should mean better indexing of your content).

More recently though, Google's top SEO oracle (technically the head of its web spam team), Matt Cutts, laid it out for us (covered here by SEOmoz): although slow load times will NOT adversely affect your rankings, fast load times may have a positive effect. There you have it, straight from the horse's mouth. So, keep those sites scaling and those pages loading as fast as you can!

Monday, November 16, 2009

Clickthrough Rates (CTR) by Position in Search Engine Results Page (SERP)

So, how much of a difference does it make to move up an extra spot in the search results rankings? Should you be happy with #2 or should you go the extra mile to be #1? What about being on the first page of results versus on the second page?

The short answer is that every spot really matters - especially the closer you get to the top of the rankings in the search engine results page (SERP).

I've seen this data shown on a number of sites, but here's one I most recently looked at (they're all roughly similar). What this data says is that the #1 search result gets nearly 4 times as many clicks as the #2 result, which in turn gets 50% more than #3. We can clearly see how your placement in the SERP makes a world of difference.

Not only that, but position 11 (the first result on the second page) only receives 0.7% of clickthroughs, or a quarter of what the last position on the first page receives. This pretty much means that if you're not on the first page for a search result, you're getting a minuscule share of the overall traffic for that keyword. In fact, 90% of all search traffic goes to a result on the first page.

Number of Pages Indexed: How to Find out How Many Pages You Have Indexed

Ensuring the search engines are properly indexing pages from your website is critical for SEO - after all, if the search engines aren't indexing your pages, those pages will never come up for search results!

So, how do you find out how many pages from your site are getting indexed? The easiest way is to search on Google for:
site:url.com (with no space after the colon)

SEO Book notes that searching for site:url.com and site:www.url.com can lead to slightly different results. Also, you can use Google Webmaster Tools to get some additional information on the pages Google is crawling on your site and how that changes over time. We'll go over Webmaster Tools in a future post.

Thursday, November 12, 2009

Robots.txt and SEO - How to Use Robots.txt

Tonight I ran into the same friend who had previously asked me about hiding text using the same text and background colors, and this time he had another very pertinent question: how should robots.txt be used for SEO?

Honestly, I have always thought of robots.txt as a way to tell search engines which pages on a website not to index, but hadn't thought about it more broadly than that. So, I figured I'd do some quick web research and let you all know what I learned about using robots.txt for SEO.

First of all, the robots.txt file is used to provide cooperating web robots with instructions on how to crawl the site. If the file is not present, robots will assume no specific instructions are being given. Also, you need a separate robots.txt file for each subdomain (e.g., subdomain.domain.com/robots.txt).

The main instruction is the "Disallow: /directory/page" line, which tells robots not to crawl those pages. You can apply an instruction to all robots (with a *) or to specific robots (user-agents) - check out this example from CNN.

Another common instruction points robots to your sitemap file, via a "Sitemap:" line followed by the sitemap's URL. This helps ensure robots find all the pages on your site (more on sitemaps in a later post), but again, it's not required.
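To tie the two instructions together, here's a small sketch of how a cooperating crawler interprets a robots.txt file, using Python's urllib.robotparser. The directives and URLs are made up for illustration.

```python
# How a cooperating crawler reads robots.txt rules; the directives and URLs
# here are invented for illustration.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /search

Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("*", "http://www.example.com/private/report.html"))  # False
print(parser.can_fetch("*", "http://www.example.com/articles/seo-tips"))    # True
```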

Wednesday, November 11, 2009

Use Google Analytics as a Tool for SEO

Part of what we want to do here at Daily SEO Tip is to showcase various tools that make you better able to optimize for search engines, and to do so in less time. One such tool that we absolutely love is Google Analytics.

First off, you should have Google Analytics enabled on your site in order to track overall traffic. If you're driving blind it can be quite difficult to steer.

But Google Analytics offers so much more when it comes to SEO. It gives you access to all the keywords that drive search traffic to your website, as well as summary statistics on the value of each of those keywords (bounce rate, pageviews per visit, etc.). You can also define custom goals (e.g., registration, filled out lead form, made purchase) and see how traffic from different keywords performs. Of course, you can also monitor your overall level of search traffic, and do so for all search engines (not just Google).

Google Analytics also has a very useful blog which I recommend following as they often have great case studies, updates, and tips.

I could dedicate an entire blog just to Google Analytics and how to use it (and in fact, many do!), but the best way to get acquainted with it is to read the product description and then very quickly just start using it. As with most things, the best way to learn is by doing, so go on and try it.

Internal Cross-Linking of Keywords (a la Wikipedia) for SEO

Have you ever noticed how every article on Wikipedia is full of links to other Wikipedia articles? For example, the first sentence of the article on SEO reads (including links):

"...is the process of improving the volume or quality of traffic to a web site from search engines via 'natural' or un-paid..."

Linking internally serves two main purposes. First, it allows readers to easily access other reference information that is pertinent to the article they're reading, rather than having to look up that information themselves (possibly losing that reader to another site). Second, it helps a lot with SEO by providing the target articles (in this case the articles on "web site" and "search engines") with both relevance and authority.

Relevance comes from the association of the anchor text (the underlined and linked text, like "web site" above) with the target page: it tells the search engine that the page the link points to is about those keywords. Authority comes from the link authority transferred from the current article to the linked-to article. Of course, all the articles inter-link throughout the site, so that authority is spread throughout.

Wikipedia is a powerful demonstration of internal cross-linking in effect. Think about how you can do this on your own web site in a programmatic fashion that both adds value for the user (letting them access related content quickly, and thus keeping them around for more pageviews) and improves your own SEO.
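As a starting point, here's a very naive sketch of programmatic cross-linking. The keyword-to-URL map and article text are invented, and a real implementation would need to be smarter about HTML, word boundaries, and over-linking.

```python
# Naive internal cross-linking: link the first occurrence of each keyword in an
# article's text to a related page on the same site. The keyword-to-URL map and
# the article text are made-up examples.
import re

INTERNAL_LINKS = {
    "search engines": "/articles/how-search-engines-work",
    "web site": "/articles/building-a-web-site",
}

def cross_link(text, link_map):
    for keyword, url in link_map.items():
        pattern = re.compile(re.escape(keyword), re.IGNORECASE)
        replacement = r'<a href="{0}">\g<0></a>'.format(url)
        text = pattern.sub(replacement, text, count=1)  # only link the first occurrence
    return text

article = "SEO is the process of improving traffic to a web site from search engines."
print(cross_link(article, INTERNAL_LINKS))
```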

Monday, November 9, 2009

Hiding Text and Keywords for SEO: A Big No-No

I want to do a series of posts on some commonly-known "black hat" tactics (black hat refers to practices that are deceptive and try to game search algorithms). Many folks that are starting out with SEO (and also some experienced SEOs) mistakenly assume that they can trick search engine algorithms and get their sites ranked higher.

I had lunch today with a friend who brought up the practice of hiding keywords on a page by making the text the same color as the background. This once-common way of tricking the search engines was quickly detected and added to the engines' ever-growing list of spam tactics to look for. Those attempting the practice are usually punished by being pushed down in the rankings, or given the ultimate punishment of all - being removed from the index altogether.

And if you think you can get around this by using similar (but not exact) shades of color, or hiding the text off the page, or any other thing of that sort -- think again, the engines will catch on to you pretty quickly.

Sunday, November 8, 2009

Keyword Density and SEO - How Search Engines Parse Keywords on Your Page for Context

In order to determine a web page's relevancy for a particular search query, search engines look at a variety of different factors to understand what a web page is about. We've talked about how title tags matter, as do URLs, and meta descriptions (but much less so). Another very big factor in how a search engine determines what your web page is about is the actual text on that page - the various words in it and their relative frequencies. The concept of "keyword density" relates to how frequently various keywords appear on a given page. If a keyword appears a lot, it follows that the page must be talking about a topic very relevant to that keyword.

In practical terms, what this means is that using the right keywords in your post matters and you should be cognizant of that. I would caveat that with a warning that "keyword stuffing," or artificially including excess keywords on your page, can not only create a bad experience for users but also potentially raise red flags with the search engines that you may be trying to game their algorithms (and subsequently incur a punishment from them). So, make sure to keep an eye on the keywords on your page but do so in a way that preserves the user experience and doesn't try to trick the search engines in any way.
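If you're curious how keyword density might be measured, here's a back-of-the-envelope sketch. The page text is made up, and search engines certainly compute something far more sophisticated than this.

```python
# Rough keyword-density calculation over a page's visible text. This is only a
# back-of-the-envelope measure; search engines weigh many more signals.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    keyword_words = keyword.lower().split()
    n = len(keyword_words)
    if not words or n == 0:
        return 0.0
    # Count occurrences of the (possibly multi-word) keyword as a phrase.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == keyword_words)
    return hits / len(words)

page_text = "URL shorteners and SEO: which URL shortener should you use for SEO?"
print(round(keyword_density(page_text, "SEO") * 100, 1), "%")
```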

Thursday, November 5, 2009

Breadcrumbing for SEO - Getting your Pages Indexed Properly Via On-Page Links

Sitemaps are one way of getting search engines to understand your site and all the pages you have, but they're not always totally effective, especially as your site grows (in number of pages) and those sitemap files start getting larger and larger.

One way to "show" search engines all the pages on your site as well as to tell them what those pages are about (playing into their relevance for targeted keywords), is by using a practice commonly referred to as breadcrumbing.

Breadcrumbing refers to leading the search engine throughout your site by making it follow specific paths you set up through links between pages. Ensuring that every page on your site is linked to from somewhere is a key component to making sure search engines know that page exists and can index it properly. Not only that, but as you lay breadcrumbs for the search engine to follow, you can use the anchor text in the links (the text that is underlined and links to the other page) to tell the search engine what the destination page is about (giving it context/relevancy).

Breadcrumbing could be done fairly 'dumbly' by simply including a couple of links from every page to the next couple pieces of content in some sort of sequential order - perhaps a "Next Article" or "Next Products" link at the bottom of the current page. You can also leverage breadcrumbing to improve the user navigability of your site: if the links are chosen in a logical manner that helps the user discover related content (for example, "Related Articles" or "Similar Products"), that helps both SEO and navigation. To ensure you're linking to every page (rather than your algorithm inadvertently leaving certain pages out of the link structure), you may want to mix relevance-based linking with sequential linking - perhaps a "Next Article" link coupled with a "Related Articles" link, as in the sketch below.
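Here's a toy sketch of generating those footer links. The article list is invented, and a real site would pull this from its CMS or database.

```python
# Toy generation of "Next Article" and "Related Articles" links so that every
# page links onward to other pages. The article list is invented for illustration.
articles = [
    {"slug": "/articles/title-tag-length", "title": "Title Tag Length", "topic": "on-page"},
    {"slug": "/articles/meta-descriptions", "title": "Meta Descriptions", "topic": "on-page"},
    {"slug": "/articles/301-redirects", "title": "301 Redirects", "topic": "technical"},
    {"slug": "/articles/robots-txt", "title": "Robots.txt", "topic": "technical"},
]

def footer_links(index, articles, max_related=2):
    current = articles[index]
    links = []
    # Sequential breadcrumb: always link to the next article so nothing is orphaned.
    nxt = articles[(index + 1) % len(articles)]
    links.append('Next Article: <a href="{}">{}</a>'.format(nxt["slug"], nxt["title"]))
    # Topical links: related content is better for users and adds relevant anchor text.
    related = [a for a in articles if a["topic"] == current["topic"] and a is not current]
    for a in related[:max_related]:
        links.append('Related: <a href="{}">{}</a>'.format(a["slug"], a["title"]))
    return links

for line in footer_links(0, articles):
    print(line)
```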

Wednesday, November 4, 2009

Meta Description Length - What Should the Maximum Length Be For SEO?

The meta description tag is an interesting one from an SEO standpoint. It is a very important tag given its potential impact on clickthrough rates in the SERPs, as well as its (albeit limited and unclear) impact on your site's relevancy (and ranking) for targeted keywords. However, search engines choose whether or not to show what you include in the meta description tag, versus some other snippet they may deem better for the user. This can be frustrating at times, and we'll deal with it more in-depth in a later (and more advanced) post.

Today's post seeks to answer the question: "Now that I'm going to go through the trouble of creating quality meta descriptions, what should their target length and maximum length be?"

Again, the answer is not easy (at least not as easy as with the maximum length for title tags), but I'll go ahead and say that you should target 150 characters (which is also the maximum length that Google will show), and do your best to stay within 250 characters as a maximum (I've heard the 255 and 260 figures thrown around a lot). Google says publicly that there is no max length, but guidelines are always good to have and these are the ones I go by.
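If you'd like to check your own pages, here's a small sketch that pulls the meta description out of a page's HTML with Python's built-in parser and flags it against the guidelines above. The HTML is a made-up example.

```python
# Flag meta descriptions that run past the (informal) length guidelines above.
# The HTML here is a made-up example.
from html.parser import HTMLParser

class MetaDescriptionFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

html = ('<html><head><meta name="description" content="Daily SEO tips on links, '
        'titles, redirects and more."></head><body></body></html>')
finder = MetaDescriptionFinder()
finder.feed(html)

desc = finder.description or ""
print(len(desc), "characters")
if len(desc) > 150:
    print("Longer than what Google typically displays (~150 characters)")
if len(desc) > 250:
    print("Past the ~250-character ceiling suggested above")
```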

Tuesday, November 3, 2009

Include Keywords in the URL to Improve the Page's SEO

Including your target keywords in the URL of a specific page will help that page's relevance towards those keywords. So, make sure your URLs look more like http://domain.com/keyword rather than http://domain.com/?content=1234.

Not only is it good to include the keywords in the URL, but keep in mind that the more to the left in the URL they show up, the better (and the higher up in the folder structure). For example, http://keyword.com is better than http://domain.com/keyword which is better than http://domain.com/folder/keyword.
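One common way to get keyword-rich URLs is to generate the path from the page title rather than from an opaque ID. Here's a tiny sketch of that idea; the title and domain are just examples.

```python
# Turn a page title into a keyword-bearing URL path instead of an opaque ID
# like /?content=1234. The title and domain are just examples.
import re

def slugify(title):
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse punctuation/spaces into hyphens
    return slug.strip("-")

title = "Which URL Shortener to Use for SEO"
print("http://domain.com/" + slugify(title))
# http://domain.com/which-url-shortener-to-use-for-seo
```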

Getting your URLs correct right off the bat will save you a lot of time (and site performance headaches) later when you have to re-write the URLs (more on that in a later, and more advanced, post).

Content to Code Ratio (Code to Text) - Reduce Code Bloat on Your Page to Improve SEO

The content to code ratio (or code to text ratio) refers to how much code your page has relative to how much content (text). An easy way to see this is by viewing the HTML source (e.g., View > Page Source in Firefox) and eyeballing how much code there is relative to how much readable text. Given that search engines look at the code for a given page, bloating the page with a lot of code (which carries no context) simply dilutes the text that is there, lowering your page's relevancy for the keywords you may be targeting.

Why might you have a low content to code ratio? Three typical causes and their fixes:
  • You include JavaScript code in the page - move your JS to an external file; if you can't do that, move it to the bottom of the page (the lower something sits in the page, the less weight it carries with the search engines)
  • You use a lot of in-line styling - use Cascading Style Sheets (CSS), located in an external file, to reduce code
  • You format using tables - use CSS to create grid layouts instead, as it requires less code
Here's a good tool I found to test the content to code ratio.
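If you'd rather compute a rough version of the ratio yourself, here's a sketch using Python's built-in HTML parser. Different tools calculate this slightly differently, and the HTML is a made-up snippet.

```python
# Rough text-to-code ratio: visible text length divided by total HTML length.
# Different tools compute this slightly differently; the HTML is a made-up snippet.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_ignored = 0
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_ignored += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.in_ignored:
            self.in_ignored -= 1

    def handle_data(self, data):
        if not self.in_ignored:
            self.text_parts.append(data)

def content_to_code_ratio(html):
    extractor = TextExtractor()
    extractor.feed(html)
    visible_text = "".join(extractor.text_parts).strip()
    return len(visible_text) / len(html)

html = ("<html><head><style>p{color:red}</style></head>"
        "<body><p>Short article text here.</p><script>var x=1;</script></body></html>")
print(round(content_to_code_ratio(html) * 100, 1), "% text")
```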

Sunday, November 1, 2009

Keep Titles to 65 Characters - Maximum Length for a Title Tag

As you think about your title tags, making sure that you get your point across in the first 65 characters is a pretty good target. That's roughly what Google will display in search engine results pages (Yahoo displays roughly twice as many characters).

Since the titles you pick are what users see in SERPs and what drives clickthrough rates, make sure your title tag -- with all its relevant keywords -- fits within that 65-character limit.
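A quick sketch of the same kind of check for title tags; the title string is just an example.

```python
# Quick check that a title tag stays within the ~65 characters Google displays.
# The title string is just an example.
TITLE_DISPLAY_LIMIT = 65

title = "Keep Titles to 65 Characters - Maximum Length for a Title Tag"
print(len(title), "characters")
if len(title) > TITLE_DISPLAY_LIMIT:
    print("Likely to be truncated in Google's search results")
```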