Wednesday, December 9, 2009

Dashes Are Better Than Underscores for Separating Keywords in the URL

So, you already know that including your keywords in your URL is an SEO best practice and a very strong driver of good search engine rankings. But what if you have more than one keyword in your URL? How do you separate them? Should you push them together, or use dashes, underscores, or other delimiters?

The short answer is to use dashes to separate keywords. Google treats a dash as a word separator, whereas underscores have historically been treated as joining words together (so "chocolate_cake" may be read as the single token "chocolatecake"). Matt Cutts, Google's head of web spam and SEO king, has come out and said this explicitly, so I'll leave it at that. Plus, I agree that dashes are more readable from a user perspective as well.
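If you generate URLs programmatically, a small helper can enforce this convention. Here's a sketch (the function name and exact behavior are just illustrative):

```python
import re

def slugify(title):
    """Turn a page title into a lowercase, dash-separated URL slug."""
    slug = title.lower()
    # Replace runs of anything that isn't a letter or digit with a dash.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

slugify("Chocolate Cake Recipes!")  # → "chocolate-cake-recipes"
```

This keeps each keyword visible as its own word, both to search engines and to users scanning the URL.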

Tuesday, December 8, 2009

Roll Out Big Changes Slowly to Avoid the Sandbox

It pays to think about SEO when first building your site so as to avoid the pain of having to fix things later. However, none of us ever get things 100% right from the get-go, and optimization changes are always in the cards. What's important to keep in mind is that if you're going to be changing a large number of pages or changing important elements of pages (e.g., title tags or URLs), you should roll out the changes slowly if possible. Search engines will often notice large-scale changes and raise red flags if they think you're over-optimizing. As a rule of thumb, if you're changing more than 10,000 pages, try to roll things out in smaller chunks (perhaps 10-20% at a time) and give the engines time to digest things and adjust rankings. That way you can see what the expected outcome might be and continue with the changes if things look ok.

Some time ago I changed around 100,000 pages at once (both the title tags and meta descriptions), only to see a fairly rapid drop in the search traffic they drew. The changes were all for the better, but the sheer number of pages changed at once probably drew attention from the search engines. After 6 weeks the traffic bounced back (and higher than before, as we had hoped), most likely as the engines noticed that the pages had stabilized and indexed them properly again. In all likelihood, had I changed the pages in 10k-page chunks, I would have avoided the temporary search engine penalty / sandboxing (and a whole lot of grief and nervousness hoping that the traffic would bounce back!).

Monday, December 7, 2009

Using Images for SEO: Leverage the Image URL

We recently covered SEO for Images so I figured I'd write a follow-up post about another tactic you can use to help search engines "understand" what particular images are of. The alt tag is still your most powerful ally in this battle, but don't forget the image URL. The search engines most certainly look at the name of the image file for additional clues as to what that image is of. For example, an image file called chocolate.jpg is much more descriptive than IMG003482.jpg.

So, when naming your image files, first make sure that you use names for the files that are descriptive, and second, ensure that those descriptions are mindful of the keywords you are targeting. Every bit counts when helping search engines properly interpret your pages, especially when it comes to images.

Thursday, December 3, 2009

Finding Links to Your Web Page and Those of Your Competitors

We've talked all about the importance of link-building and how fostering many, high-quality, and diverse inbound links is probably the single most powerful thing you can do to improve your page's search engine rankings. But how do you track who is linking to your page? Also, what if you want to find out who is linking to your competitors? Here are two quick-and-easy tips:

The first and simplest way to get a read on links is by using Google's "link:" operator. Search for "link:yourdomain.com" and Google will return pages that are linking to the page you include. Even though this is not a complete list, you can use it as a good representative sample.

Second, if you're using Google Webmaster Tools (more on that in a later post), it'll give you a fairly comprehensive analysis of the links your site is receiving. The drawback here is that you can only do this for sites you have control over, and not those of your competitors.

A lot of SEO sites out there have their own backlink analysis tools, but at the end of the day I find that many of them are based on the Google link: operator and its equivalents in Yahoo and MSN, and just using one of those is good enough for 90% of use cases.

Heading Tags and SEO - H1 Tag FTW!

I can't believe we haven't included Heading Tags H1, H2, H3, etc. in our daily SEO tips yet! I'll make sure to rectify that right now.

Heading tags are one of the most important things you can use to tell search engines what a web page is about. Putting text inside an H1 tag (the most important of heading tags) is almost as important/effective for SEO as including that text and keywords in the title tag or URL. So, ensure that your keywords are included within the H1 tag of the page. You can always use CSS to override any styling that the H1 tag imposes on your page, so don't let that stop you.

Also, note that including keywords inside an H1 tag is much more relevant/important than within an H2 tag (which is more important than an H3 tag, and so on).

Thus, include your primary keywords in the H1 tag and any secondary keywords within "lower-level" heading tags.
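As a sketch, a page targeting a primary keyword phrase plus a couple of secondary ones might be structured like this (the keywords here are made up):

```html
<!-- Primary keyword phrase in the single H1 -->
<h1>Chocolate Cake Recipes</h1>

<!-- Secondary keywords in lower-level headings -->
<h2>Flourless Chocolate Cake</h2>
<h2>Vegan Chocolate Cake</h2>
```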

As with everything - make sure you're not using heading tags to stuff keywords, especially ones that are nonsensical for a user. Simply think about how to best craft the content within an H1 tag so that it has SEO in mind.

Finally, if you have pages that have headings but don't use any heading tags -- change that! Make sure that your headings (h1) and sub-headings (h2) are properly denoted using the correct HTML tags and this best practice of web design will also benefit your search rankings.

Tuesday, December 1, 2009

SEO for Images: The Alt Tag and Image Search

We've talked a good deal about how to ensure your content is properly indexed by search engines and ranks well for keywords that are important to you.

However, what about when your content is not "readable," as with an image on a page? How do you ensure that the search engine is able to understand what the contents of the image are, and in turn drives relevant search traffic to that image and the page that contains it?

The answer to this question is fairly straightforward yet often overlooked: use the "alternative text" or ALT attribute on the image tag to describe the image. Not only is this a good thing to do from an accessibility and usability standpoint, but also for SEO.

The syntax is as follows: <img src="image.jpg" alt="alternative description here">

As you decide what to include in the alt attribute, make sure that the attribute accurately describes the image and also that the way it is described is such that it optimizes for the keywords that are important to you.

Finally, search engines like Google have image-focused search options (e.g., Google Image Search) which can actually drive quite a good deal of traffic by itself. Make sure to keep that in mind as you create appropriate and relevant alt tags for your images.

SEOBook's Rank Checker - A Great SEO Tool

If you're wondering how to quickly track how your website ranks for multiple keywords across multiple search engines, look no further than SEOBook's Rank Checker tool.

This Firefox plugin allows you to input various keywords you want to track, and it then automatically checks where you rank for those words. Here's a video that shows you how it works:

Hopefully this saves you time and allows you to track your performance over time (the export-to-.csv functionality helps you store the historical data).

Monday, November 30, 2009

External Links for SEO: Link Quantity vs Quality vs Diversity

Now that you've learned how building up inbound links to your web page from external websites is probably the single most important element in SEO, you're embarking on a campaign to foster inbound links to your site. But how do you decide which links to go after and how to prioritize across various opportunities?

As a general rule of thumb, you want to have as many links from a diverse set of the most reputable sites possible.

First let's take quantity versus quality. You can think of the PageRank algorithm as a summation of all the links you receive, weighted by their authority. So, for example, if you receive 5 different links from 5 different sites with "authorities" 2, 3, 6, 7, and 7, you could say that the total "authority" gained by your site is 2+3+6+7+7, or 25. On the other hand, if you received 3 different links from 3 different sites with authorities 9, 9, and 9, you would have received 9+9+9, or 27 "authority points." In our example, you would have been better off with fewer, but higher quality, links.
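Here's that toy calculation spelled out (remember, these "authority points" are illustrative; real PageRank is computed iteratively across the whole web graph, not as a simple sum):

```python
def total_authority(link_authorities):
    """Toy model: total authority gained is the sum of each
    inbound link's authority. Illustrative only."""
    return sum(link_authorities)

five_weaker_links = total_authority([2, 3, 6, 7, 7])   # → 25
three_stronger_links = total_authority([9, 9, 9])      # → 27
```

Three strong links beat five weaker ones, which is the quantity-vs-quality tradeoff in a nutshell.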

That tradeoff between link quantity and link quality should be fairly clear. Of course, keep in mind that obtaining a link from a more authoritative site is oftentimes more difficult, and thus may require more time and effort on your part to obtain.

The third variable I'll throw in here is link diversity. Take the example above with the 3 links. If all 3 links come from the same single site (from different pages, or from different parts of the same page), that is worth less than if each of the 3 links comes from a different site; the option where the links come from more diverse sites is better and thus likely to rank your site higher.

It's even said that search engines may grade sites as being more or less diverse from each other based on past link patterns and other factors. Thus, there are also varying grades of "diversity." Make sure to keep in mind link diversity when building your inbound link profile as it's yet another factor that engines look at which you should take into account!

Tuesday, November 24, 2009

Which URL Shortener to Use for SEO

The explosion of Twitter in 2009 has led to a surge in the use of URL shorteners, as well as a surge in the number of different URL shorteners out there.

There are a few things, from the SEO perspective, you need to know when selecting which URL shortener to use.

First, you want to pick a URL shortener that is going to exist into the future. If shortened links to your site are being posted throughout the web, those are all links that could be in jeopardy in the event that the URL shortener used to create them becomes defunct. This happened recently when a popular shortener announced it was shutting down, and a lot of controversy followed. Some efforts (most notably 301works) have arisen to remedy this risk, but regardless, I'd suggest picking the most robust URL shortener when shortening your own URLs (if others are shortening your URLs there's little you can do there).

The other thing you need to make sure of is that you use a shortening service that issues a 301 redirect to your URL (we covered when to use a 301 redirect previously). Search Engine Land had a good post on this topic, with a chart that goes over the main shorteners and which ones use proper redirects (as well as other factors you may want to consider).

In short, to make your decision easier, I'd say go with the biggest shortener. They're likely to be using the redirects correctly (or else not so many people would use them) and they're also likely to be a robust solution that will be around into the future.

Use Google Toolbar to View PageRank - SEO Tools

We've talked about the concept of PageRank before - the numerical measure that Google assigns for how important a given page is in the context of the overall web. PageRank is a critical determining factor in how search results are ranked by Google, and therefore a critical component of SEO.

We know how important PageRank is, but how do we know what PageRank is assigned to a given web page? There are a number of tools out there, but one that I find to be useful and convenient is the Google search toolbar. Not only can you easily search from the toolbar, but, among other things, it also shows you what the PageRank is for the page you're currently on.

This is a great way to quickly estimate how valuable a link you're receiving is, by knowing how reputable Google thinks a given page is. That said, the PageRank shown in the toolbar is not always up to date, and it also doesn't speak to how relevant a specific page might be for a given topic (a page might have a lot of authority for only a single topic, and that might be understated in the single measure of PageRank).

Hope you can add Google Toolbar to your SEO toolbox as yet another valuable SEO tool that makes your analysis quick and effective.

Sunday, November 22, 2009

Design Your Pages for Humans not for Search Engines

I decided to kick off this week of SEO tips with a little piece of general wisdom for how to approach your overall SEO strategy: Design your site for humans not for search engines.

If you start compromising the user experience in order to drive more organic search traffic, you'll soon find yourself with a site that is great for bots to read but terrible for users. In the long run you won't be able to retain users, not even through their first visit (driving up the bounce rate as people land on your page and don't get your value proposition).

The reverse is also true - if you build a site that delights real users, you'll soon gain the respect of other sites, earning you valuable inbound links that will give you far greater value than the SEO tweaks you might be doing that destroy the user experience. Hope you enjoy that bit of Monday morning wisdom.

Thursday, November 19, 2009

Re-directs and SEO: When to Use 301 and 302 Redirects

URL redirection or URL forwarding is basically when a website tells a browser that a page has moved. For example, the page formerly hosted at OldURL may now be at NewURL. Thus, as the webmaster you'd want to tell anyone who goes to OldURL what the new address is, something you can do automatically via a browser redirect (an HTTP status code starting with a 3).

The two most common types of redirects are the 301 redirect and the 302 redirect. A 301 redirect means that a page has been moved permanently, while a 302 redirect means that a page has been moved temporarily (or for an unknown reason).

If your page is moving to a new URL for whatever reason, and you want to maintain the SEO authority that you've built up (e.g., from multiple inbound links pointing to the old URL), you'll want to use a 301 redirect. The 301 redirect tells search engines that whatever authority they were previously ascribing to the old URL should now be passed on to the new URL, and therefore you shouldn't lose any of your SEO authority.
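As a sketch, on an Apache server a 301 can be set up with a single directive in your .htaccess or server config (the paths and domain here are placeholders; other servers have equivalent mechanisms):

```
# Apache (mod_alias): permanently redirect the old URL to the new one
Redirect 301 /old-page.html http://www.example.com/new-page.html
```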

That said, there have been occasions where people have attempted to use redirects to game search engines. For example, one could buy another site with a good deal of inbound links and then attempt to 301 redirect that site to one's own site to transfer its link credit. Search engines can often spot these maneuvers and will see through them, thus removing the inbound link credit. If you're going to purchase a site and want to maintain its credibility, make sure to tread carefully and read up on best practices before doing it.

Wednesday, November 18, 2009

Site Speed and SEO - Does a Slow Site Get Penalized or a Fast Site Rewarded?

Folks often ask me whether site load time affects their SEO, and the short answer is: Yes!

The importance of having a site that loads fast (or at least adequately fast) can be imputed from the fact that search engines want to point users towards reputable and high-quality sites that match their search queries. A slow site that hangs forever is not a good experience for the user, and by extension, for the search engine that led that user there.

The fact that Google's Webmaster Tools reports on crawl speed (time spent downloading a page) shows that they're watching (aren't they always?). Also, I've noticed a direct correlation between faster load times and more pages crawled per day (which should mean better indexing of your content).

More recently though, Google's top SEO oracle (technically the head of its web spam team), Matt Cutts, laid it out for us (covered here by SEOmoz): although slow load times will NOT adversely affect your rankings, fast load times may have a positive effect. So there you have it, straight from the horse's mouth. Keep those sites scaling and those pages loading as fast as you can!

Monday, November 16, 2009

Clickthrough Rates (CTR) by Position in Search Engine Results Page (SERP)

So, how much of a difference does it make to move up an extra spot in the search results rankings? Should you be happy with #2 or should you go the extra mile to be #1? What about being on the first page of results versus on the second page?

The short answer is that every spot really matters - especially the closer you get to the top of the rankings in the search engine results page (SERP).

I've seen this data shown on a number of sites, but here's one I most recently looked at (they're all roughly similar). What this data says is that the #1 search result gets nearly 4 times as many clicks as the #2 result, which in turn gets 50% more than #3. Clearly, your placement in the SERP makes a world of difference.

Not only that, but position 11 (the first result on the second page) only receives 0.7% of clickthroughs, or about 1/4 of what the last position on the first page receives. This pretty much means that if you're not on the first page for a search result, you're getting a minuscule share of the overall traffic for that keyword. In fact, 90% of all search traffic goes to a result on the first page.

Number of Pages Indexed: How to Find out How Many Pages You Have Indexed

Ensuring the search engines are properly indexing pages from your website is critical for SEO - after all, if the search engines aren't indexing your pages, those pages will never come up for search results!

So, how do you find out how many pages from your site are getting indexed? The easiest way is to search for the following on Google:
"site:yourdomain.com" (no spaces)

SEO Book notes that searching with and without the www prefix (site:yourdomain.com vs. site:www.yourdomain.com) can lead to slightly different results. Also, you can use Google Webmaster Tools to get some additional information on the pages Google is crawling on your site and how that changes over time. We'll go over Webmaster Tools in a future post.

Thursday, November 12, 2009

Robots.txt and SEO - How to Use Robots.txt

Tonight I ran into the same friend who had previously asked me about hiding text using the same text and background colors, and this time he had another very pertinent question: how should robots.txt be used for SEO?

Honestly, I have always thought of robots.txt as a way to tell search engines which pages on a website not to index, but hadn't thought about it more broadly than that. So, I figured I'd do some quick web research and let you all know what I learned about using robots.txt for SEO.

First of all, the robots.txt file is used to provide cooperating web robots with instructions on how to crawl the site. If the file is not present, the robot will assume no specific instructions are being given. Also, you need a separate robots.txt file for each subdomain (e.g., example.com and blog.example.com each need their own).

The main instruction is the "Disallow: /directory/page" line, which tells the robot not to crawl those pages. You can apply an instruction to all robots (with a *) or to specific robots (user-agents) (check out this example from CNN).

Another common instruction points the robot to your sitemap file. This is done through the "Sitemap: [URL of sitemap]" line. It is helpful in making sure the robot finds all the pages on your site (more on sitemaps in a later post), but again, it's not required.
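Putting those pieces together, a minimal robots.txt might look like this (the paths and sitemap URL are placeholders):

```
# Rules for all robots
User-agent: *
Disallow: /private/
Disallow: /search-results/

# Rules for one specific robot
User-agent: Googlebot
Disallow: /staging/

# Optional: point robots at your sitemap
Sitemap: http://www.example.com/sitemap.xml
```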

Wednesday, November 11, 2009

Use Google Analytics as a Tool for SEO

Part of what we want to do here at Daily SEO Tip is to showcase various tools that make you better able to optimize for search engines, and to do so in less time. One such tool that we absolutely love is Google Analytics.

First off, you should have Google Analytics enabled on your site in order to track overall traffic. If you're driving blind it can be quite difficult to steer.

But Google Analytics is so much more when it comes to SEO. It gives you access to all the keywords that drive search traffic to your website as well as summary statistics on the value of each of those keywords (bounce rate, pageviews per visit, etc.). You can also define custom goals (e.g., registration, filled out lead form, made purchase, etc.) and see how traffic from different keywords performs. Of course, you can also monitor your overall level of search traffic and do so for all search engines (not just Google).

Google Analytics also has a very useful blog which I recommend following as they often have great case studies, updates, and tips.

I could dedicate an entire blog just to Google Analytics and how to use it (and in fact, many do!), but the best way to get acquainted with it is to read the product description and then very quickly just start using it. As with most things, the best way to learn is by doing, so go on and try it.

Internal Cross-Linking of Keywords (a la Wikipedia) for SEO

Have you ever noticed how every article on Wikipedia is full of links to other wikipedia articles? For example, the first sentence of the article on SEO reads (including links):

"Search engine optimization (SEO) is the process of improving the volume or quality of traffic to a web site from search engines via 'natural' or un-paid..."

Linking internally serves two main purposes. First, it allows readers to easily access other reference information that is pertinent to the article they're reading rather than having to look up that information themselves (possibly leading to the loss of that reader to another site). Second, it helps a lot with SEO by providing the target articles (in this case the article on "web site" and the one on "search engines") with both relevance and authority.

Relevance is conveyed by the association of the anchor text (the underlined, linked text, like "web site" above) with the target page: it tells the search engine that the page the link points to is about the keywords in that anchor text. Authority is conveyed by the link authority transferred from the current article to the linked-to article. Of course, all the articles inter-link throughout the site, so the authority is spread throughout.

The example of Wikipedia is a powerful example of internal cross-linking in effect. Think about how you can do this on your web site in a programmatic fashion that both adds value to the user (allowing them to access related content quickly - thus keeping them around for more pageviews) as well as improves your own SEO.

Monday, November 9, 2009

Hiding Text and Keywords for SEO: A Big No-No

I want to do a series of posts on some commonly-known "black hat" tactics (black hat refers to practices that are deceptive and try to game search algorithms). Many folks that are starting out with SEO (and also some experienced SEOs) mistakenly assume that they can trick search engine algorithms and get their sites ranked higher.

I had lunch with a friend today who was referring to the practice of hiding keywords on a page by making the text the same color as the background. This once-common way of tricking the search engines was quickly detected and added to search engines' ever-growing list of spam tactics they look for. Those attempting the practice are usually punished by being pushed down in the rankings, or by the ultimate punishment of all: getting removed from the index altogether.

And if you think you can get around this by using similar (but not exact) shades of color, or hiding the text off the page, or any other thing of that sort -- think again, the engines will catch on to you pretty quickly.

Sunday, November 8, 2009

Keyword Density and SEO - How Search Engines Parse Keywords on Your Page for Context

In order to determine a web page's relevancy for a particular search query, search engines look at a variety of factors to understand what the page is about. We've talked about how title tags matter, as do URLs and meta descriptions (though much less so). Additionally, one very big factor is the actual text on the page: the various words in it and their relative frequencies. The concept of "keyword density" relates to how frequently a given keyword appears on a page. If the keyword appears a lot, then it follows that the page must be talking about a topic very relevant to that keyword.

In practical terms, what this means is that using the right keywords in your post matters and you should be cognizant of that. I would caveat that with a warning that "keyword stuffing," or artificially including excess keywords on your page, can not only create a bad experience for users but also potentially raise red flags with the search engines that you may be trying to game their algorithms (and subsequently incur a punishment from them). So, make sure to keep an eye on the keywords on your page but do so in a way that preserves the user experience and doesn't try to trick the search engines in any way.

Thursday, November 5, 2009

Breadcrumbing for SEO - Getting your Pages Indexed Properly Via On-Page Links

Sitemaps are one way of getting search engines to understand your site and all the pages you have, but they're not always totally effective, especially as your site grows (in number of pages) and those sitemap files get larger and larger.

One way to "show" search engines all the pages on your site as well as to tell them what those pages are about (playing into their relevance for targeted keywords), is by using a practice commonly referred to as breadcrumbing.

Breadcrumbing refers to leading the search engine throughout your site by making it follow specific paths you set up through links between pages. Ensuring that every page on your site is linked to from somewhere is a key component to making sure search engines know that page exists and can index it properly. Not only that, but as you lay breadcrumbs for the search engine to follow, you can use the anchor text in the links (the text that is underlined and links to the other page) to tell the search engine what the destination page is about (giving it context/relevancy).

Breadcrumbing could be done fairly 'dumbly' by simply including a couple of links from every page to the next couple of pieces of content in some sort of sequential order, perhaps via "Next Article" or "Next Product" link(s) at the bottom of the current page. You can also leverage breadcrumbing to improve user navigability of your site. If the links are done in a logical manner that helps the user discover related content, that helps both SEO and navigation. For example, "Related Articles" or "Similar Products." To ensure you're linking to all the pages (rather than your algorithm inadvertently leaving certain pages out of the link structure), you may want to inter-mix relevance-based linking with sequential linking: perhaps a "Next Article" link coupled with a "Related Articles" link.
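As a sketch, a product page might mix a sequential link with related links (the URLs and class name here are made up); note how the anchor text describes each destination page:

```html
<!-- Sequential link: guarantees every page is linked from somewhere -->
<a href="/products/chocolate-truffles">Next Product: Chocolate Truffles</a>

<!-- Related links: good for users, and for keyword-rich anchor text -->
<ul class="related-products">
  <li><a href="/products/dark-chocolate-bars">Dark Chocolate Bars</a></li>
  <li><a href="/products/chocolate-gift-boxes">Chocolate Gift Boxes</a></li>
</ul>
```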

Wednesday, November 4, 2009

Meta Description Length - What Should the Maximum Length Be For SEO?

The meta description tag is an interesting one from an SEO standpoint. It is a very important tag given its potential impact on clickthrough rates in the SERPs as well as its (albeit limited and unclear) impact on your site's relevancy (and ranking) for targeted keywords. However, search engines choose whether or not to show what you include in the meta description tag, versus another snippet they may deem better for the user. This fact can be frustrating at times, and we'll deal with it more in-depth in a later (and more advanced) post.

Today's post seeks to answer the question: "Now that I'm going to go through the trouble of creating quality meta descriptions, what should their target length and maximum length be?"

Again, the answer is not easy (at least not as easy as the maximum length for title tags), but I'll go ahead and say that you should target 150 characters (which is also roughly the maximum that Google will show), and do your best to stay within 250 characters as a maximum (I've heard the 255 and 260 figures thrown around a lot). Google says publicly that there is no max length, but guidelines are always good to have and these are the ones I go by.
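If you generate titles and descriptions programmatically, a simple length check against these guideline figures can catch problems early (65, 150, and 250 are the rules of thumb from these posts, not official limits):

```python
def check_snippet_lengths(title, meta_description):
    """Warn when a title or meta description exceeds the guideline lengths."""
    warnings = []
    if len(title) > 65:
        warnings.append("title may be truncated in the SERP")
    if len(meta_description) > 150:
        warnings.append("meta description may be truncated by Google")
    if len(meta_description) > 250:
        warnings.append("meta description exceeds the suggested maximum")
    return warnings

check_snippet_lengths("Chocolate Cake Recipes", "Easy chocolate cake recipes.")  # → []
```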

Tuesday, November 3, 2009

Include Keywords in the URL to Improve the Page's SEO

Including your target keywords in the URL of a specific page will help that page's relevance for those keywords. So, make sure your URLs look more like example.com/chocolate-cake-recipes rather than example.com/products/12345.

Not only is it good to include the keywords in the URL, but keep in mind that the further to the left in the URL they show up, the better (and the higher up in the folder structure). For example, example.com/chocolate-cake is better than example.com/recipes/chocolate-cake, which is better than example.com/recipes/desserts/chocolate-cake.

Getting your URLs correct right off the bat will save you a lot of time (and site performance headaches) later when you have to re-write the URLs (more on that in a later, and more advanced, post).

Content to Code Ratio (Code to Text) - Reduce Code Bloat on Your Page to Improve SEO

The content to code ratio (or code to text ratio, etc.) refers to how much code your page has relative to how much content (or text). An easy way to see this is by viewing the html source (e.g., View > Page Source in Firefox) and looking at how much code there is relative to how much readable text. Given that search engines look at the code for a given page, bloating that page with a lot of code (which is not contextual), simply dilutes the text that is there, thus lowering your page's relevancy to various keywords you may be targeting.

Why might you have a low content to code ratio? Three typical reasons, and their solutions:
  • You include JavaScript code in the page - move your JS to an external file; if you can't do that, move it to the bottom of the page (the lower something is in the page, the less weight it carries with the search engines)
  • You use a lot of in-line styling - use Cascading Style Sheets (CSS), located in an external file, to reduce code
  • You format using tables - use CSS to create grid layouts instead, as it requires less code
Here's a good tool I found to test the content to code ratio.
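As a rough sketch of what such tools measure (this regex-based version is approximate; real tools parse the HTML properly):

```python
import re

def content_to_code_ratio(html):
    """Approximate visible-text length divided by total page length."""
    # Script and style contents aren't visible text, so drop them first.
    stripped = re.sub(r"(?is)<(script|style).*?</\1>", "", html)
    # Then strip the remaining tags to approximate the visible text.
    text = " ".join(re.sub(r"(?s)<[^>]+>", " ", stripped).split())
    return len(text) / len(html) if html else 0.0

page = "<html><body><h1>Chocolate</h1><script>var x=1;</script></body></html>"
content_to_code_ratio(page)  # ≈ 0.13 (9 characters of text in a 69-character page)
```

A low ratio means your keywords are diluted by markup; moving JS and styling out of the page raises it.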

Sunday, November 1, 2009

Keep Titles to 65 Characters - Maximum Length for a Title Tag

As you think about your title tags, making sure that you get your point across in the first 65 characters is a pretty good target. That's roughly what Google will display in search engine results pages (Yahoo displays roughly twice that).

Since the titles you pick are what users see in SERPs and what drives clickthrough rates, make sure your title tag, with all its relevant keywords, fits within that 65-character limit.

Thursday, October 29, 2009

Include Keywords in Meta Description and Increase Your Page's Relevancy

Are you updating the meta description tag for your pages? Make sure you don't ignore this very important tag, and also that you include your target keywords in that description.

The meta description is a way for a webmaster to tell search engines (and through that to people searching) what a particular page is about. Given that, this is one of the factors that search engines look to when assessing what a particular page is about and therefore what the relevancy of that page is to various search terms. If you want the page to be highly-relevant to certain target keywords, make sure to include them in that description. All that said, although meta description helps, it has a "light" effect on overall rankings.

Also, remember that the meta description oftentimes shows up in the search engine results page (SERP) and therefore is a big driver of the clickthrough rate to that result. So, if your description is just keywords stuffed together rather than logical sentences, a human being won't be able to make any sense of it and will be less likely to click. So remember: build your site and pages for humans, not for search engines.

Wednesday, October 28, 2009

Using Nofollows in SEO: Controlling Page Rank Passthrough and Avoiding Links to Spam and "Bad Neighborhoods"

Nofollows are a powerful and important tool when thinking about SEO. They are used by adding a rel="nofollow" to a link, for example (courtesy of Wikipedia):
<a href="..." rel="nofollow">link text</a>

Nofollows should be used to indicate to the search engine that you do not want to pass any authority on to the page the link points to. That either means the page is of little importance, or that it's one you're not sure you trust and therefore don't want to endorse with a link.

The first case is often applied to footer links on your site, or to limit the number of followed links on a page. This is oftentimes mixed up with the practice of "PageRank sculpting," which has unclear value and which we'll talk about another time.

The second case is great to employ if you have a site with user-generated content and user-generated links. By no-following things like comments and posts, you discourage spammers from posting links on your site pointing to theirs in an effort to boost their own credibility with the search engines. It also prevents you from getting penalized by the search engines for (unknowingly) linking to "bad neighborhoods" - sites that are known to be malicious or spammy.

The bottom line? Use nofollows when you point to pages on your site you don't want to pass authority to as well as for most out-going user-generated links that you cannot reliably trust.
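As a sketch of the user-generated-content case, assuming a hypothetical whitelist of hosts you're willing to vouch for, outgoing links could be rendered like this:

```python
from urllib.parse import urlparse

# Hypothetical whitelist of domains you trust enough to pass authority to.
TRUSTED_HOSTS = {"example.com"}

def render_link(url, text):
    """Render a link, adding rel="nofollow" unless the target host is explicitly trusted."""
    host = urlparse(url).hostname or ""
    rel = "" if host in TRUSTED_HOSTS else ' rel="nofollow"'
    return '<a href="%s"%s>%s</a>' % (url, rel, text)

print(render_link("http://unknown-site.example.net/", "their post"))
```

Defaulting to nofollow and whitelisting exceptions is the safer posture for comment sections, since you can't vet every link a user submits.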

Tuesday, October 27, 2009

Titles and Meta Descriptions Drive Clickthrough Rate on Search Engine Results Page

When a user searches for a keyword, the search engine displays the results on the Search Engine Results Page (SERP). The information that is displayed on that page is generally formatted to include a title and a description for each of the results (among other things). The title is underlined as the actual link to the indexed page and the description is right below it to provide the user with some more context as to the contents of that page. Also, the search engine shows the keyword bolded whenever it appears in the title and/or description.

The search engine decides which title to show for a page by looking at the title tag for that page and displaying that. The description below it is generally what's included in the meta description tag (<meta name="description" content="...">). The search engine may not show the description if it finds it inadequate in some fashion (e.g., if it's too short).

For this reason, what you define as the title and description of a page is not only important for SEO - it also drives clickthrough rates from the SERP. A title and description written for a search engine (e.g., a nonsensical string of keywords) may help you rank better, but it oftentimes reduces the potential clickthrough rate, as users can't understand what the page is about and skip over it. Make sure to keep that tradeoff in mind as you craft those two fields.
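The keyword-bolding behavior described above can be mimicked in a few lines of Python (a toy sketch - real SERPs also bold word variants and stemmed forms):

```python
import re

def serp_snippet(title, description, query):
    """Bold every occurrence of the query term in the title and description,
    case-insensitively, the way a results page highlights matches."""
    def bold(s):
        return re.sub("(%s)" % re.escape(query), r"<b>\1</b>", s, flags=re.IGNORECASE)
    return bold(title), bold(description)

t, d = serp_snippet("Daily SEO Tip", "One seo tip per day.", "seo")
print(t)  # Daily <b>SEO</b> Tip
print(d)  # One <b>seo</b> tip per day.
```

Seeing your own title and description with the query bolded is a quick sanity check on whether the snippet will catch a searcher's eye.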

Monday, October 26, 2009

Title Tags and SEO

The title tag for a page is one of the most important ranking factors you can control.

Ensure that the title contains the keywords you are targeting in it. Go back and review the titles for your pages and ensure that they're optimized.

Keep in mind, however, that titles are what shows on Google search results (as the title of the page), so make sure that they're readable to humans (and not just stuffing keywords!) or else no one will click on your result.
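A quick way to audit existing pages, as suggested above, is a small helper that reports which target keywords a title is missing (a sketch; keyword lists here are illustrative):

```python
def keywords_missing_from_title(title, keywords):
    """Return the target keywords that do not appear in the title (case-insensitive)."""
    return [kw for kw in keywords if kw.lower() not in title.lower()]

print(keywords_missing_from_title("Title Tags and SEO", ["SEO", "title tag"]))  # []
print(keywords_missing_from_title("Welcome to our homepage", ["SEO"]))          # ['SEO']
```

Run something like this across your page titles and you get an instant worklist of pages to revisit.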

Thursday, October 15, 2009

Advanced SEO Tips

Here are all the tips that fall under the Advanced category.

Intermediate SEO Tips

Here are all the tips that fall under the Intermediate category.

Beginner SEO Tips

Here is a list of all the SEO Tips that fall under the Beginner category.

SEO Tools

Need tools to make SEO easier? Here are some (coming soon).

SEO Blogs and SEO Resources

Want to see what else is out there? Here's my list (coming soon)

SEO Training and Learning SEO

So you want to learn SEO? What's the best way to do that? Here are some resources (coming soon).

Contributors to Daily SEO Tip

Daily SEO Tip is the product of many people's contributions. If you have a tip to submit you can do so in this web form (coming soon).

What is SEO? An Overview of Search Engine Optimization

Coming soon...