Thursday, April 13, 2006

SEO and Usability

Our insanely talented creative director Gina Hamm and I just finished up a usability report. One of our larger SEO clients had asked us to provide an analysis of the site's traffic reports and usability. Naturally, Gina handled most of the design and usability aspects, and I spent most of my time interpreting the site metrics. Together we were able to provide a report with a ton of useful information and recommendations to improve the website experience for every user. For a large website, it only takes a small improvement to make a drastic difference. (Example: raising the conversion rate of a site by just a fraction of a percentage point could equal tens of thousands of dollars in revenue.)
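To put some numbers on that last point, here's a quick back-of-the-envelope calculation. The traffic volume, order value, and conversion rates below are entirely hypothetical, but they show how a tenth of a percentage point can move revenue:

```python
# Hypothetical numbers for illustration: a site with 500,000 monthly
# visitors, a $50 average order value, and a 2.0% conversion rate.
visitors = 500_000
avg_order_value = 50.00

def monthly_revenue(conversion_rate: float) -> float:
    """Revenue from the visitors who convert at the given rate."""
    return visitors * conversion_rate * avg_order_value

baseline = monthly_revenue(0.020)   # 2.0% conversion
improved = monthly_revenue(0.021)   # lifted by just 0.1 points

print(f"Baseline:   ${baseline:,.0f}")
print(f"Improved:   ${improved:,.0f}")
print(f"Difference: ${improved - baseline:,.0f} per month")
```

A tenth of a point on a site that size is $25,000 a month in this made-up scenario, which is why small usability wins matter so much at scale.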

I was also reminded how effective SEO and good usability are sometimes one and the same. Focusing the content of a page and adding textual navigation links were among the items included in Gina's report. Of course these are helpful to a typical user, but they can also bring great benefits to an SEO program.

Working together to make sites successful...that's what Zunch is all about!

Monday, April 10, 2006

SEO 101 Refresher Part 6: Page Optimization Best Practices

In my experience, many people with a general level of exposure to search engine optimization believe on-page optimization is where the 'magic' happens. I'm here to tell you there isn't any magic, nor a silver bullet that will gain your website the best search engine visibility. It's adhering to best practices and having a strategy formulated to cover your keywords, combined with the first five segments of this series, that will make the difference in the long run.

Best Practices:

Try to use no more than 3-5 keywords on a given page.

Prioritize the keywords you have selected for the page.

Use the keywords in your title tag in a coherent fashion (your title tags play a large part in bringing in visitors from the search engines).

Use the keywords in your meta description tag in a coherent fashion (sometimes a search engine may choose to use the meta description you have provided in conjunction with your title tag).

Use the keywords in your meta keywords (do not spam your keywords tag with keywords that are not amongst those chosen for this page).

Use an h1 header tag to begin the content of your page, containing as many of your keywords as it makes sense to use (remember your prioritization).

The content on the page needs to use the keywords you have chosen. There are no density percentages to target; however, it should be clear that the content is about the keywords you have chosen for this page. If you find that you're becoming too repetitive with certain keywords, use synonyms.

Break up the logical flow of your content into segments and give each segment a sub-header (i.e., h2, h3, etc.) where appropriate. If possible, you could aim different paragraphs of text at the keywords selected for the page and use sub-headings containing those keywords. This format is very similar to college English term paper writing.

Whenever possible and prudent, link to other pages from within your content and use keywords for those pages when possible (avoid 'click here' links).

These are the fundamentals of SEO page optimization; when carried out, they help make clear to the search engines, and especially your users, what the page is about.
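Several of the practices above can be spot-checked mechanically. Here's a minimal sketch (the keyword list and the sample page are invented for illustration) that parses a page with Python's standard-library html.parser and reports which of the chosen keywords appear in the title, meta description, and h1:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects the title, meta description, and h1 text of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1 = ""
        self._current = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def audit(html: str, keywords: list[str]) -> dict:
    """Report which keywords appear in each key on-page element."""
    parser = PageAudit()
    parser.feed(html)
    report = {}
    for field in ("title", "meta_description", "h1"):
        text = getattr(parser, field).lower()
        report[field] = [kw for kw in keywords if kw.lower() in text]
    return report

# A made-up example page following the practices above.
page = """<html><head>
<title>Widget Repair Services - Acme Widgets</title>
<meta name="description" content="Fast widget repair services from Acme.">
</head><body><h1>Widget Repair Services</h1></body></html>"""

print(audit(page, ["widget repair", "widget repair services", "acme"]))
```

This only checks presence, not the coherence the practices call for; that part still takes a human reading the page.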

Friday, April 07, 2006

Home Page PR lower than Internal pages

We've seen a recent PageRank update this week. One thing people have been noticing is that some sites' home page PageRank is lower than that of their internal pages.

There is a discussion underway at Search Engine Watch on the topic. View Thread >>>

I personally brought up three points that could be the reason. They are as follows:

  • Advertising links not using the rel="nofollow" attribute, and being off topic
  • Better Link Popularity on the internal pages
  • Just another one of those wacky glitches
Whatever it is, it hasn't seemed to be messing with rankings on any of my personal sites, Webforgers.net being one of them.
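The first point above is easy to check for on your own pages. Here's a rough sketch (the domain and sample markup are invented for illustration) that flags external links missing rel="nofollow", such as paid advertising links that might pass PageRank off-site:

```python
from html.parser import HTMLParser

class LinkChecker(HTMLParser):
    """Flags external links that lack rel="nofollow"."""
    def __init__(self, own_domain: str):
        super().__init__()
        self.own_domain = own_domain
        self.missing_nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        rel = attrs.get("rel", "")
        # Crude externality test for illustration: absolute URL not on our domain.
        external = href.startswith("http") and self.own_domain not in href
        if external and "nofollow" not in rel:
            self.missing_nofollow.append(href)

# Made-up markup: one internal link, one nofollowed ad, one bare ad link.
html = """<a href="/about">About</a>
<a href="http://sponsor.example.com" rel="nofollow">Sponsor</a>
<a href="http://ads.example.org">Off-topic ad</a>"""

checker = LinkChecker("webforgers.net")
checker.feed(html)
print(checker.missing_nofollow)
```

Whether untagged ad links actually explain the PR shuffle is pure speculation on my part, but at least this makes the audit mechanical.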

Google Banning Sites That Use DMOZ Data...How Could They?

While perusing the WMW forum this morning...I came across a thread that piqued my interest - being that I am an active DMOZ editor...Google Banning Sites That Use Open Directory (DMOZ) Data

I am in favor of Google doing their best to "weed out" sites that have scraped DMOZ and skinned them for their own benefit (typically for SEO/link building purposes) to improve their SERPs...HOWEVER...how can Google ban someone's site when they do this themselves?

For example...Google Directory / DMOZ Directory

Makes no sense to me!

Wednesday, April 05, 2006

Privacy and Search Behavior

The collection and storage of personal information, both with regard to statistical demographic data and with regard to search history, has been of concern to privacy advocates for some time – but was brought into sharp focus when the government asked for URLs and search data this January. Here is an article on the initial subpoena.

This request was part of an attempt to begin enforcement of the 1998 Child Online Protection Act, but sets a precedent which could be applied to the enforcement of other laws. In this case, the government did not request personal data, nothing that could link a particular search to a particular individual - but there is nothing really preventing them from doing so, or from search data being requested in a criminal court case. And as tools like Google Desktop, toolbars, and MSN’s Open Platform become more widely used, the degree to which search engines are integrated into people’s personal lives will increase.

Before you dismiss this as irrelevant to you - don’t be so sure pornography laws don’t apply to you – the recent case with photographer Barbara Nitke established that the most conservative communities in the US can determine the standards by which your Website is judged. Are you sure that your online maternity clothing store won’t be considered obscene by a community in Utah, even though you are in New York? But search information could easily be collected in other types of cases by the government – such as national security. Such a possible use was specifically mentioned by the government. Do you sell anything which might be used by terrorists? That’s pretty broad – anything from books to box cutters could qualify.

But in regard to SEO specifically:
The demographic information that helps you find your customers and clients can also be used against them. As people become increasingly aware of this, they will become more cautious about actively protecting their privacy. They may become more hesitant to reveal personal information or sign up for memberships, wary of efforts to collect information about them, however harmless the intentions. They will clean off cookies and clear caches more often, making data collection and statistical analysis more difficult. The most sophisticated and wealthiest visitors to your site will be the most difficult to track.

In short, being aware of how the law affects search engines can help you understand and keep in touch with the concerns of your customers and clients. You will need to establish yourself as trustworthy and secure – while at the same time being open enough that search engines can find you. It's a balancing act.

A lot of people see the ruling regarding the subpoena as a victory for privacy. Google was only required to comply with it partially. However, as the judge pointed out, information collected by search engines with regard to web surfing and search behavior is not protected, private information. What he did determine is that Google's users had enough of an expectation of privacy that demanding search data would risk the company's reputation in the eyes of users, so that it placed an undue burden on Google to provide information that was available to the government through other means. This was not a right that was upheld, but a commercial interest. Which is still a good thing, but not quite the same.

A .pdf of the actual ruling is here.

SEO from the Outset

I was reminded this morning how important it is to involve an SEO consultant in the very beginning stages of a website. If you're building a large website and you expect to hire an SEO once the site launches to optimize it....STOP! Spend the money upfront to have the consultant give recommendations on engineering the database, sitemap, and site structure to work for you, not against you, with regard to SEO. With an optimized website structure already in place, adding the proper keywords and content in the right places is easy.

Trust me, having to reconstruct a website after it's built is no fun. And you'll be thankful when you have rankings in three months instead of nine. ;-)

Monday, April 03, 2006

Google's Different Stances on Click Fraud

It seems Google has been confused internally about its ability to protect pay-per-click advertisers from click fraud.

In a recent Bloomberg article, CEO Eric Schmidt says:

Believe me, as a computer scientist, we have the ability to detect the invalid clicks before they reach advertisers.

But the Google Adwords FAQ says:

If we find that invalid clicks have escaped automatic detection, you'll receive a credit for those clicks.

And the Google blog says:

When we believe those clicks are invalid, we reimburse advertisers for them. Some invalid clicks do make it through our filters, but we believe the amount is very small.

I have to admit I find it humorous that Schmidt says to trust him, because of his PhD in computer science, that invalid clicks don't reach advertisers, while other PhDs at Google say that invalid clicks do reach advertisers. Also, does that mean Google thinks there aren't PhDs working against them? Or how about just really smart people who know how the system works and know enough about web technology to be dangerous?

It may have taken PhDs to create the first atomic bomb for 'peace', but it doesn't take one to turn it into a weapon of war.