ZEN-SEM

Monday, February 27, 2006

More and More Advertisers Worried About Click Fraud

Highlights of SEMPO's December 2005 "The State of Search Engine Marketing" survey:
Of the 553 respondents:
  • 3 times as many advertisers/agencies this year said click fraud was a serious issue.
  • 16% are tracking click fraud in some fashion and believe it to be a serious problem (up 6% vs. last year).
  • 23% of advertisers were tracking click fraud and believe it to be a moderate problem.
  • 33% of agencies were tracking it and believe it to be a moderate problem.
  • One-third of advertisers and agencies aren't tracking click fraud but are worried about it.
  • One-quarter of advertisers and 18% of agencies said click fraud wasn't a problem (too bad they won't say which agencies).
  • 2% of all advertisers (all in large companies) hadn't heard of click fraud.
Over half of all advertisers and 41% of agencies said they had experienced "competitive click fraud."

Of course click fraud is more damaging and prevalent in different verticals, but make no mistake - it will happen to most PPC campaigns. As affordable as Click Fraud Detective is, there is no sensible reason not to be protecting your online advertising investments.

Google Base

I think the best way to describe this service is Froogle combined with Yahoo Store & PayPal. It's not open to the public yet, but some vendors are beta testing.

We're starting with a very small number of sellers and we expect to include more over the next several months. If you're a seller and you're interested in getting an announcement when this feature is generally available, let us know. And if you want to know how this functionality relates to Google's broader work in payments, read this update. We hope this feature will make it even easier for people to use Google Base to post and distribute a wide range of content, whether information for sharing or goods for sale.
You can read the official blog post here.

Ask Loses Jeeves, Adds Features Galore

I just spent a half-hour listening to Barry Diller from Ask.com give his keynote at SES (thanks to www.webmasterradio.fm). I've got to tell you, I'm really impressed with the upgrades Ask.com received last night. I'm sure other Zunchers will go into more detail throughout the week, but here's a quick list of changes:

- Simple homepage
- AJAXed toolbox bar
- Web-based desktop search
- Fewer PPC ads (organic above the fold!)
- Nice mapping at http://maps.ask.com, complete with driving/walking directions and aerial fly-by photographs (not satellite)

Read Chris Sherman's complete SearchDay write-up of Ask.com here.

Thursday, February 23, 2006

Why IE is still a great browser

Many of us in the Web design industry loathe Internet Explorer 6 due to its quirkiness and many rendering bugs. Sure, IE is unique in that it supports ActiveX (who still uses that anyway?), but there's more to it than that.

Having designed and developed several sites recently, I constantly ran into the issues IE6 gave me with margin and padding settings, text sizes, etc. What it forced me to learn was to write better stylesheets that eliminate the need for browser detects and so-called "IE fixes" - separate stylesheets written to address the large number of rendering errors.

The perfect stylesheet is, of course, one that renders a site the same in both Internet Explorer and Firefox without any additional fiddling about.
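
One common way to head off IE6's margin and padding inconsistencies is the widely used "global reset" pattern - zero everything out, then set your own values explicitly instead of relying on browser defaults. The sketch below is illustrative, not a complete stylesheet:

```css
/* Global reset: wipe out the differing default margins and padding
   so IE6 and Firefox both start from zero. */
* {
  margin: 0;
  padding: 0;
}

/* Then declare your own values explicitly instead of relying on
   defaults that differ between rendering engines. */
body {
  font-size: 100%;  /* a percentage base sidesteps IE6 text-resize quirks */
  line-height: 1.4;
}
```

With defaults neutralized, most of the "IE fix" stylesheets become unnecessary.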

I'm not saying that IE is a better browser, but it's a great reference tool for web designers and developers to write better and more compatible stylesheets.

Personally, I'm excited to see how the final release of IE7 handles CSS, and I hope Microsoft takes to heart all the discoveries made by the ever-expanding community of CSS gurus.

Wednesday, February 22, 2006

MSN Yo-Yos with Search Index

What a wild ride! Late last week MSN rolled out a new index that included major changes across the board. Initial reaction was mixed, but it didn't take long for folks across the 'Net to voice their disappointment (WMW, SEW). Two days later, MSN sucked it back in and reverted to the older, more accepted index.

From MSN:

We rolled back to the old net after only about 48 hours. The new one might have been up considerably longer if you guys hadn’t been so vocal so quickly.

So...we've spent the morning emailing some clients, explaining why their MSN traffic plummeted at the end of last week, why blogs, subdomains and press releases were overtaking the rankings, and how everything should be status quo again.

Interesting to note that the MSN quality testers apparently missed this one.

Monday, February 20, 2006

SEO 101 Refresher Part 4: Link Architecture

Link architecture is the skeletal structure of the content of your site.

Search engines crawl the web a page at a time (although they have many, many crawlers). When they visit a page they record its metadata and visible content as well as the links from that page. It's those links that guide the search engines through the rest of your site. Obviously it's important that the search engines find each page as easily as possible.

Sites with content buried more than a few clicks deep risk having that content crawled slowly or not crawled at all. It's important to note that by depth of content we are not referring to directory structure but to linking structure. If you have a web page at http://www.site.com/folder1/folder2/folder3/page.html that is linked to from the home page, then that page is only one level deep in the site. The distinction matters: just because a page sits on the root of a website (http://www.site.com/page.html) doesn't mean it is easily crawlable if it is only linked to from the third level of your site.
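
The distinction can be made concrete with a tiny sketch: treat the site as a link graph and measure each page's distance in clicks from the home page. The URLs and link graph here are made up purely for illustration:

```python
from collections import deque

def click_depth(links, start="/"):
    """Breadth-first walk of an internal link graph, returning each
    page's distance in clicks from the home page -- the 'depth' that
    matters to crawlers, regardless of folder structure."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depth:  # first discovery = shortest click path
                depth[target] = depth[page] + 1
                queue.append(target)
    return depth

# Hypothetical site: a deeply foldered URL linked from the home page
# is one click deep, while a root-level file reached only via two
# intermediate pages is three clicks deep.
links = {
    "/": ["/folder1/folder2/folder3/page.html", "/a.html"],
    "/a.html": ["/b.html"],
    "/b.html": ["/page.html"],
}
depths = click_depth(links)
print(depths["/folder1/folder2/folder3/page.html"])  # 1
print(depths["/page.html"])                          # 3
```

The folder count never enters the calculation; only the link structure does.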

For any site it is important to have a sitemap as they aid search engines with crawling the second and third levels of your site by making them accessible through one page closer to the home page. Sitemaps should at least be linked to from the home page, if not every page. Very large sites may need several sitemaps broken up logically.

Another important aspect of link architecture is cross-linking similar or related topics. Done properly, your internal pages can help the search engines understand what a page should rank for. For example, on a shoe site your walking shoes page should link to your running shoes and dress shoes category pages, as well as accessory pages for socks, laces and insoles, using descriptive anchor text. It not only helps your optimization but also helps users find new merchandise on your site and provides a better chance at an up-sell or add-on sale.

Proper link architecture will ensure your site is crawled easily and indexed regularly.

Friday, February 17, 2006

Eric Ward's Linking Webinar

Thursday afternoon the Zunch crew gathered in the conference room, beamed Eric Ward's Linking Webinar up on the screen, dimmed the lights and popped open the soda. An hour and a half later we shut down and chatted about what we already knew, what we didn't know, and what, if anything, was wrong.

Let me state up front that we thought this would be an advanced linking strategy webinar, and that it was marketed to other firms/professionals in the SEO/SEM industry as such.

The webinar covered all the basics and gave useful information for those just starting out and trying to understand what linking and link popularity are all about. Even the seasoned SEO, however, may pick up one or two things not seen in the popular SEO forums.

Here are the things I found helpful:
  • Sources for free one way links from authoritative sites
  • Email subject and body copy suggestions that keep you out of the deleted folder
  • Link Software Screen shots and personal analysis
  • How to use blogs to further empower your core site's links
Here are a few things that I thought that may have been confusing to the attendees:
  • It was stated that buying links could penalize your site, yet later in the webinar a strategy for buying links was discussed. Eric stated that he had bought links for clients, and no clarification was given on how buying links squares with the penalty risk, so it sounded like a contradiction.
  • Someone asked what an RSS feed was. I thought the answer was drawn out and confusing, and probably should have been addressed in the Q&A time after the webinar.
Here are some things that I know were stated and are actually wrong:
  • It was stated that spiders (specifically Googlebot) will not crawl deep folder structures, therefore don't waste time getting links to a file buried in your site. His analogy: if the page is in a hundred-foot-deep lake and the link is only 50 feet of rope, you can't reach the page. This is wrong. If the linking page has high enough PageRank to justify a crawl to the file it links to, the spider WILL index the page and it CAN be reached.
  • It was stated that "undiscovered links" or pages behind forms are unreachable by spiders. In Google's case this is wrong. I recently built a site with a simple one-variable form containing 50 values. Google took the URL in the action="" attribute and, for each of the 50 values, appended the form's field name as a query string ("?state_id=") to the action URL, completing the URL, then crawled all 50 variations of the landing page. I posted this over at spider-food.net.

    Currently you cannot see the results of this in Google, but I think the forum post and the responses to it are proof that Google did in fact index those pages, and that there were NO links to those variable pages - the mod_rewrite links at the bottom were for bots/users who wanted to click and link to a static version of the form page. The form was there for convenience/usability and was NOT supposed to be crawled, YET it was.
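
The behavior described above is easy to sketch: given the form's action URL, its field name, and the list of option values, enumerating every URL a crawler could reach by submitting the GET form is one line. The URL and field values below are hypothetical:

```python
def enumerate_form_urls(action, field, values):
    """Build every URL reachable by submitting a simple one-field GET
    form: the action URL plus '?' plus field=value for each option."""
    return [f"{action}?{field}={v}" for v in values]

# A hypothetical one-variable form with a few of the 50 state values.
urls = enumerate_form_urls("http://www.example.com/landing", "state_id",
                           ["TX", "CA", "NY"])
print(urls[0])  # http://www.example.com/landing?state_id=TX
```

Whether a given engine actually performs this enumeration is its own call; the point is that nothing about a simple GET form hides the resulting pages.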
I've tried to give a fair evaluation for both new and experienced webmasters. If you are just getting into links, I recommend this webinar. If you are experienced with linking, the price of the webinar may not be worth your time for the few new tidbits of information.

Eric has had a lot of success with his current clients. I think he did a good job explaining linking, with great examples of successes and of how powerful links can be to an SEO campaign. He thinks outside the box and has done his linking homework; I would definitely heed any free advice you find on his website.

Eric Ward's Website >>

Tuesday, February 14, 2006

Search to Win?

Well this is an interesting idea.

Actually, I love it! Last night I built a web spider script and it's been working that engine over for a few hours now. So far I've won a $50 Amex card, two Best Buy gift cards, an Xbox game and a Canon PowerShot 5.0MP digital camera.

OK, not really, but I wonder if I could?

OK, follow up time. It looks like quite a few people have been hammering MSN, possibly causing MSN to change the way the Search and Win program works. From the website...

If a link appears on the search results page with the words MSN Search & Win, click the link to see if you instantly won.


But it looks like MSN is mixing up the wording on the SERPs to foil automated queries. Here are two screenshots with examples:

As you can see, one search has the words "Win what you search for" and the other has "Use MSN Search...and Win!".

Friday, February 10, 2006

The ReRight Way to Do It

I see it every day: someone proposing a URL structure in the forums and asking for advice on whether it'll get crawled or if there are any visual flaws. So let this be an end-all post on the subject.

Out of all my experimenting with URLs, I've found that linking from
.com/ straight to
.com/folder/file.htm
works best at being crawled regularly and deeply. PERIOD.

Now let me go over the problems with the other structures I see every day as I cruise the forums.

The Directory Structure & PR Dither Rewrite
This is like the Camaro mullet (see 2nd image) of mod_rewrites: old school, and it looked cool back in the day.

.com/ (PR5)
/folder/file.html (PR4)
/folder/folder/file.html (PR3)
/folder/folder/folder/file.html (PR0)

In the directory tree structure the root links to the first folder, the file in the first folder links to the file in the 2nd folder, and so on. Once you get three folders deep you lose your PR in most cases, because a puny PR 3 isn't strong enough to warrant a crawl that deep into a site. You could even see this in some of the deeper sections of the Yahoo! directory: the further you went, the lower the PR, and some sections were so deep off the root they didn't even warrant a cache.

Of course the solution to the above would be to link to every page in the site on every tier, or use the high PR of a root sitemap to feed spiders deeper. This also dissolves if you have a large site. A good example would be a country: USA (1 page) >> State (50) >> County (~3250) >> City (17,500*)

Obviously no single page could hold 20,000+ links and be crawled. Plus, browsers would strain to render that much code, and users would struggle to decipher the navigation. It's just not logical.

Junk Rewrites
Like the Tron Guy, it should be avoided at all costs.

These are the cases where the developer stuffs every variable that's not needed into the URL. The effect is a nonsensical jump from the root to a file 3-5 folders deep.

.com/ (PR 5) | (PR 3)
/folder/folder/folder/folder/folder/file.html (PR 2-3) | (PR 0)

The problem is that the site has to gain a significant amount of PR on the home page just to push the spiders into the rest of the site. This is why you see so many complaints from developers who have switched to mod_rewrite static URLs: "I can't get my new URLs indexed." You could be waiting months or years, depending on how fast you can get inbound links.

It's not that they can't get indexed; it's just that the home/root pages aren't powerful enough to warrant a deep crawl. I see this a lot with shopping cart/CMS add-ons for Mambo & osCommerce. For Windows servers I would suggest ISAPI_Rewrite, which gives you the same functionality and control as mod_rewrite does on Apache.

The Tried and True Solution

Short and simple, and one step away from the root at all times. I've come to this because I did all of the above and learned the hard way.

For example, the site ~www.sbdpro.com was patterned after the Yahoo! Directory. For the last two years it has consistently had a PR 4 home page, but with that structure it could never get the spiders deeper than two categories, or two folders deep. Since the deepest sections of that directory are 8 tiers down, there really was no point in keeping that URL structure.

It was changed about a year ago. All links from the root now go to /directory/file-name.htm. No matter how many tiers down you went, every category and subcategory was now one tier away from the root. The cross-linking all stayed in place, and the site had no indexing problems at all.

The effect was astonishing: the PR still drops out at the 3rd tier with a PR 2, yet the spiders followed the links through 5 more tiers of cross-linked, PR 0 pages to reach the 8th tier down.

My advice for all new mod_rewrite projects:

1. Don't get married to your first try at mod-rewritten URLs. Change them if you need to - you're in it for the long run (hopefully). The above site consistently ranked on page 1 for "small business directory" in Yahoo and MSN during the change. It even jumped to page 2 in Google for that term and held it for a while.

2. Keep your URLs simple and close to the root. You will see more spider activity and have fewer headaches.

3. Make sure your RewriteRule syntax is optimal and you are not bogging down your server. See this thread here >>

Other Resources

Webforgers.net - Mod Rewrite Tutorials
Ilovejackdaniels.com - Mod Rewrite Cheat Sheets

~ = 3rd party database of counties I bought.
* = From spidering Yahoo directory for city names under the state sections.

Thursday, February 09, 2006

Link Theory, Key points in the Patent Filed by Google

In Michael Martinez's latest post over at Spider-food.net he runs through a few factors related to link theory. As Big Daddy rolls out and rankings shift around, keep these factors in mind.

  • Document Inception Date (see sections 0034-0044)
  • Content Updates/Changes (see sections 0046-0056)
  • Query analysis (see sections 0058-0065)
  • Link-based Criteria (see sections 0067-0080)
  • Anchor text (see sections 0082-0086)
  • Traffic (see sections 0088-0091)
  • User behavior (see sections 0093-0095)
  • Domain-related information (see sections 0097-0102)
  • Ranking history (see sections 0104-0112)
  • User maintained/generated data (see sections 0114-0117)
  • Unique words, bigrams, phrases in anchor text (see sections 0119-0121)
  • Linkage of independent peers (see sections 0123-0125)
  • Document topics (see sections 0126-0129)


Source: forums.spider-food.net

Wednesday, February 08, 2006

"Black" Internet Marketing

Next week is “Black Family Technology Awareness Week” – a much too long way of saying, pay attention, race is a part of the internet. The week is promoted as a way to “provide Black families with technology access and training, and to promote the importance and value of technology in the educational and career preparation of Black youth.” It is a recognition of the vast economic influence of black youth, particularly in a time when Hip Hop and African-American trends are gold in media and advertising, but also of the fact that overall, Black Americans trail whites in Internet usage and access.

Internet use by the Black community trails that of non-Hispanic Whites by nearly 20 percent, and those who do have access are less likely to have broadband – something to keep in mind if you are developing a site which targets these consumers.

There are several portals and directories which are focused on African-Americans – most of which simply offer up Google results wrapped in African colors, but some of which, like the small searchblack.com, have independent results and focus on black owned and operated websites. On this site, the most popular search is for hair salons – people looking for places that specifically work on black hair. Sites like this, whether for Blacks or Hispanics or other ethnic groups, are a useful niche for websites which offer something that is truly targeting a need of that community, but not as useful for general sites trying to capitalize on a possible overlap. One of the most well-developed directories is BOBO Business, which lists black owned and operated businesses online. They strengthen their own presence by writing press releases for members.

Another way to leverage ethnicity on the Internet is to keep in mind that you have a culture yearning for a voice online. Sites like MySpace let independent musicians get their work in front of potential fans. Blogs create a political, and marketing, sphere of their own. After Hurricane Katrina, dissatisfaction with the government's response, and the feeling that prejudice played a factor, was expressed in blogs and chain e-mails all over the net. One popular post included many photos, a song, and a message of support – saying the government may not care, but we do. Viral marketing may operate in a similar manner – a musician can have a MySpace profile and a website, and encourage fans to e-mail others about them. If the message is passionate and exciting, that e-mail may end up being circulated for months. Just the other day, I got one forwarded to me that was originally an April Fools' Day prank from last year, and it included a link to a post on a website.

If your client provides a service that targets African-Americans, or any ethnic community, remember that there are ways to specifically market to that group on the internet – from specialty search portals and directories to blogs to targeted e-mail campaigns. No, this is not something that applies only to Black websites, but to any business. Race and ethnicity are a factor on the Internet. Remember that, and find the resources that you can use to your benefit.

Monday, February 06, 2006

SEO 101 Refresher Part 3: Dynamic URLs

A dynamic URL is a URL that has values appended to it after a question mark (?).

Static URL:
www.zunch.com/services.aspx

Dynamic URL:
www.zunch.com/services.aspx?id=123&cat=car

In the above example id and cat are variables with 123 & car being their values, respectively.

Most content management systems and shopping carts use dynamic URLs to efficiently display content from a back-end, which lets users enter content and create new pages easily without web development skills.

The downside is that, generally, when search engines see a URL with a '?' they run. Google has made progress in indexing simpler dynamic URLs; however, it's best to avoid them altogether if possible. Content management systems and shopping carts can be expensive to implement, so how do we do this with an established site? A URL rewrite.

A URL rewrite can take a dynamic URL and rewrite it so that it appears static to the search engines and users.

Before rewrite:
www.zunch.com/sitemap.aspx?id=123&cat=car

After rewrite:
www.zunch.com/sitemap~123~car.htm

The URL is cleaner and the dynamic pieces are reduced to only what we need - the values. In addition, we now have a static-looking web page that appears to sit on the web root.
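
On an Apache server, the rewrite above might be sketched roughly like this (a hypothetical .htaccess fragment - the exact pattern and flags would need testing against your real URLs, and ISAPI_Rewrite uses a very similar syntax on IIS):

```apache
# Internally map the static-looking URL to the real dynamic one:
#   /sitemap~123~car.htm  ->  /sitemap.aspx?id=123&cat=car
RewriteEngine On
RewriteRule ^sitemap~([0-9]+)~([a-z]+)\.htm$ /sitemap.aspx?id=$1&cat=$2 [L]
```

The two parenthesized groups capture the id and category values, and the [L] flag stops further rule processing once the rewrite fires.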

Depending on what web server you use there are different solutions available commercially or you can develop one internally.


Good resource for Apache & Microsoft IIS URL rewriting:
http://en.wikipedia.org/wiki/Mod_rewrite

Saturday, February 04, 2006

Zunch Communications, Inc. Once Again Ranked #1 on TopSEOs.com

Zunch Communications, Inc. has once again been ranked the top search engine optimization firm worldwide by topseos.com (www.topseos.com). The leading search engine optimization, Website design and Microsoft Certified application development company continues to receive accolades for its search engine optimization work. In January, it was named Best SEO Company by PromotionWorld (www.promotionworld.com).

“Zunch Communications continues to give its clients the competitive edge in the Internet marketing arena,” topseos.com cofounder Bill Penden said. “It deserves to be saluted for its amazing search engine optimization work.”

The list of top search engine optimization firms aims to help users narrow their search when looking for the right firm. Currently, there are more than 10,000 Internet marketing firms worldwide. The top search engine optimization firm awards are based on competitive advantages, offered services, customer service, innovation and overall performance.

“The dedicated team at Zunch continues to work hard and put forth the extra effort,” Zunch Chairman & CEO John Sanchez said. “We are very much obliged by the recognition.”

In December, the Interactive agency was named the top search engine optimization company by topseos.com and most requested search engine marketing company by seoconsultants.com.

__

I am so proud of our SEO team of specialists and account executives! Thanks for all you do ...keep up the great work!

"SEO Mike" Joins Zunch SEO Team

Search engine optimization guru Mike Waltman - aka "SEO Mike" has joined our ever growing SEO team as account executive --> MORE

Wednesday, February 01, 2006

URLs with Anchors - Affiliate Opportunity?

Barry points out a thread this morning from WMW that goes back to a discussion Jeff and I had earlier this year about anchor tags in URLs (www.domain.com/index.html#footer). How do the search engines handle such links? Our experience has been that Google drops the pound sign and everything after it from URLs.

Wizard from WMW seems to agree:
I'd say #anchor is not a part of URL actually, browser doesn't send it to server with HTTP request.


Google doesn't treat /page.html#anchor as different URL than /page.html. It might be possible that keywords after # mark matter a little, but in Google links database everything after # is stripped.


This seems like the logical thing to do in my opinion, since otherwise there would be huge duplicate content problems.
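
Python's standard library makes the stripping behavior easy to see: the fragment is split off, and (as the quote notes) it is never sent in the HTTP request, so the crawler and the server only ever see the bare URL.

```python
from urllib.parse import urldefrag

# The fragment is split off; what a crawler (or server) sees is the
# bare URL, so /index.html#footer and /index.html are the same page.
url, fragment = urldefrag("http://www.domain.com/index.html#footer")
print(url)       # http://www.domain.com/index.html
print(fragment)  # footer
```

Any number of distinct fragments all collapse to one canonical URL, which is exactly why engines can safely ignore them.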

So, what? Here's what: Need an affiliate program? Build it to craft your affiliate links like http://www.zunch.com#affid1074 instead of using question marks. Since the browser never sends the fragment to the server, read it on the client side (a bit of JavaScript reading location.hash), pull out the ID, and set the cookie or whatever it is you do. It should work just as well as using a question mark, and you get the added benefit of every affiliate link pointing directly to your home page. That is, to the search engines, the link looks like a typical backlink, with no affiliate tracking variable. Nice!