Can you Please Them All? Universal Search Engine Ranking Algorithms. SES San Jose, 2006
Copyright August 8, 2006 by Mike Banks Valentine
Search engine specialists used to spend inordinate amounts of time creating pages that ranked well at just one search engine, exploiting that engine's known and very specific ranking factors. But with duplicate content penalties and a growing number of strongly emphasized factors converging, most SEOs are moving toward tweaking important pages rather than building what were once known as "doorway pages" (or alternately, "hallway pages"), built for a single engine and targeting dozens of search phrases apiece.
Most SEO firms now realize that the vast majority of referred search traffic comes from Google, followed by Yahoo (often at less than one-third of Google's referral traffic), then MSN at roughly half of Yahoo's share, with Ask trailing far behind at fractional percentages of the referrals brought by the others. Most optimization effort is therefore spent toward making Google happy; the other engines mostly fall in behind, bestowing rankings at positions similar to those achieved at Google. Still, many are interested in improving positions at Yahoo, MSN and Ask once they have achieved their best rank at the big G.
First speaker Aaron Wall of SEObook.com emphasized that algorithms are always in a state of evolution and offered a brief overview comparing observed ranking factors of each of the top search engines. Wall elicited a chuckle from the audience with the webmaster's quip, "A good search engine is one that ranks my sites well; a bad engine is one in which my site does badly." He suggested there is "no such thing as a perfect algorithm" and asserted that, of necessity, SEO techniques evolve with the algorithms. Ranking well in one engine does not mean you'll do well in all.
Infrastructure or algorithmic changes may have unintended side effects. Wall mentioned the Google Sandbox effect and suggested it was really a side effect of an aging factor added to the algorithm, but that its effect on the index was positive overall, so it was kept. He moved on to discuss the "Big Daddy" infrastructure changes, which for many webmasters meant large numbers of pages temporarily dropping from search results. That was widely discussed in webmaster forums as results swung wildly for many sites until the index readjusted and settled over a few weeks. Many sites never regained their pre-Big Daddy positions because their optimization emphasized factors that the major overhaul to the Google infrastructure downgraded.
Wall mentioned that new publishing formats can create algorithmic “holes” and gave two examples – Wikipedia and blogs. This was an “advanced” session, according to the conference schedule, so terms were not defined and it was assumed that most in attendance understood how different those publishing formats are. He also suggested that many will always attempt to game the system as new formats emerge.
Yahoo's matching was quite literal for years, but it recently changed to be more like Google; "nepotistic links" still work there, and its algorithms show a bias toward commercial sites. MSN is the newest to search and entered when spam was already heavily gaming the system. Google is biased toward informational resources such as .gov and .edu sites, is best at determining true link quality (bad links can hurt crawl depth), places a lot of weight on domain-level trust, and applies aggressive duplicate content filters. Google also looks much more closely at linguistic patterns than the others do, filtering out some hyper-focused pages that some have called "over-optimized."
He mentioned that Ask is not studied as much as the others due to its small size, so less is actually known about its algorithms.
Dave Davies of Beanstalk SEO then took the stand, emphasizing items such as site architecture and URL importance on his first slide, showing standard SEO factors such as key content appearing higher on the page, above the fold. Heads around the room nodded as attendees agreed with the basics as he reviewed each item on the standard SEO checklist.
As he moved to the "code to content ratio," he claimed it was a sizable weighting factor for his clients. While not revealing client names, he showed several slides of example sites with keyword phrases highlighted and circled on the pages. He claims that nested tables and complex table structures hurt many sites, and that when Beanstalk switched from table structures to "table-less design," his own site saw immediate ranking increases with absolutely no changes to content. He went on to discuss search-engine-friendly URLs and "flat filing" of dynamic content.
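Davies did not define how he measures "code to content ratio." A minimal sketch of one common interpretation, visible text as a fraction of total page bytes, using only Python's standard library (the sample page and function names are illustrative, not Beanstalk's method):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects only the visible text nodes from an HTML document."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def code_to_content_ratio(html: str) -> float:
    """Return visible-text length divided by total page length (0..1).
    A low value means markup dominates the page."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.chunks).strip()
    return len(text) / max(len(html), 1)

# A table-heavy page: 15 characters of content inside 74 characters of page.
page = "<html><body><table><tr><td>Hello SEO world</td></tr></table></body></html>"
print(round(code_to_content_ratio(page), 2))  # prints 0.2
```

Stripping the nested table markup in favor of, say, a single `<p>` would raise the ratio without changing the content, which is the kind of shift Davies described.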
Davies said that when optimizing for separate engines, MSN is by far the easiest, then Yahoo, then Google. But then he said, "Ranking on MSN is essentially useless though, so I'd rather be on page three of Google than on page one of MSN." Even though few searchers go past the first two pages of results, Google's referral numbers are so much higher that page three of Google results is worth more traffic than page one of MSN. Determining which engine to target first for top ranking is therefore quite easy; ranking on MSN search is still not as valuable as ranking well on Yahoo.
Other factors he cited include the age of a page, content adjustments, freshness, keyword density, how a listing fares in search results (clicks from result pages out to the site), backlinks, visitor stats and user analysis. He stressed referrer analysis: where are visitors coming from, and which keyword phrases are actually converting? Path analysis is important to determine what users do on the site, comparing each engine to see which one's referred users take the most desired action. Do users referred from MSN result in sales or quote requests? He recommends doing that same analysis for each of the engines referring search traffic.
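The per-engine conversion analysis Davies describes needs nothing more than a log of referred visits. A small sketch, where the visit records, engine names and "converted" flags are all hypothetical sample data:

```python
from collections import defaultdict

# Hypothetical visit records: (referring engine, keyword phrase, converted?)
visits = [
    ("google", "widget repair", True),
    ("google", "widget repair", False),
    ("yahoo",  "widget parts",  True),
    ("msn",    "widget repair", False),
    ("google", "cheap widgets", False),
]

def conversion_by_engine(records):
    """Tally referred visits and conversions per engine, the comparison
    Davies recommends running for every engine sending search traffic."""
    totals = defaultdict(lambda: [0, 0])  # engine -> [visits, conversions]
    for engine, phrase, converted in records:
        totals[engine][0] += 1
        totals[engine][1] += int(converted)
    return {e: (v, c, c / v) for e, (v, c) in totals.items()}

for engine, (n, conv, rate) in conversion_by_engine(visits).items():
    print(f"{engine}: {n} visits, {conv} conversions ({rate:.0%})")
```

The same grouping, keyed on the phrase instead of the engine, answers his other question of which keyword phrases are actually converting.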
Next up on the panel of experts for the "Can You Please Them All" session was SEO Michael Murray, VP of Fathom SEO, who suggested that audience members make slow, subtle changes to their important pages and to the site overall, and recommended against major reworking. Don't do a complete overhaul; make some baby changes first to get rankings and go from there. He warned attendees, "You can't get all the rankings with one page. Think multiple pages to achieve results for different engines."
Murray provided a fun example of what it takes to woo a search engine when he said, "MSN is easy, it just takes a little kiss." The PowerPoint showed the MSN name and a photo of a Hershey's Kiss. The next slide showed the Yahoo name along with a heart-shaped box of chocolates and a couple talking; Yahoo, he said, is fickle, sometimes slow, but it comes around. The Google slide showed a box of long-stemmed roses, a box of the best Godiva chocolates and a wedding ring: you've got to completely romance them.
He emphasized important page structure issues and classified them as "tiers: first, second and third in on-page architecture," showing page titles, headings and first-paragraph use of keyword phrases. Make good use of the pages you have: don't abandon longstanding pages ranked for lower-value terms; add more to existing pages rather than swapping in new keyword phrases targeting a different term. He gave examples of sites switching content management systems, resulting in complete URL changes site-wide, then asking him, "What happened to our traffic?" It's a question many SEOs have heard, from both new clients coming for initial SEO and long-time clients who neglected to tell their SEO they planned a site remake. Few understand the ramifications of a site redesign on ranking.
Murray then listed keyword development and assessment tools such as Google 300, Wordtracker KEI, web analytics, sales data, charting performance, and the influence of root words and their derivatives. Don't put corporate names or bylines in title tags; use of keywords is far more important than the corporate name or catch phrase. He discussed a site ranking 3rd or 4th on Google until the board asked for the company name in the title tag, and it dropped to #22. "Which actually made me happy!" he said; he had told them rank would drop if the title tag was used for branding, and he reported the new position to them.
Get ranking for important keyword phrases first, then adjust tags to include any additional info. He recommended that sites not show "breadcrumbs" for sections of the site in title tags, as is often done: Home > Brand > Model > Product is far less important than the actual item description. He recommended what he called a "page freeze" as a backup of previous tags whenever ranking is lost: save the earlier tags on well-ranked pages so they can be reverted to those versions if rankings fall following a change. The home page is the best bet for your best words. Test tactics, and go slowly.
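Murray's "page freeze" is just versioned backups of a page's tags. A minimal sketch of the idea, snapshot the tags before each change so the last well-ranked version can be restored (the URL, tag values and function names are illustrative, not Fathom SEO's tooling):

```python
import time

def freeze_tags(page_url: str, tags: dict, archive: dict) -> None:
    """Append a dated snapshot of a page's tags before changing them,
    so a well-ranked version can be restored if rankings fall."""
    archive.setdefault(page_url, []).append(
        {"saved_at": time.strftime("%Y-%m-%d"), "tags": dict(tags)}
    )

def revert_tags(page_url: str, archive: dict) -> dict:
    """Return the snapshot taken just before the current tags."""
    history = archive.get(page_url, [])
    if len(history) < 2:
        raise LookupError("no earlier snapshot to revert to")
    return history[-2]["tags"]

archive = {}
# Freeze the well-ranked title, then the branded rewrite that tanked.
freeze_tags("/widgets.html", {"title": "Widget Repair Guide"}, archive)
freeze_tags("/widgets.html", {"title": "Acme Corp | Widget Repair"}, archive)
print(revert_tags("/widgets.html", archive))  # the pre-branding title
```

In practice the archive would live on disk (JSON, or the CMS's own revision history) rather than in memory, but the discipline is the same: never overwrite a ranking page's tags without a frozen copy to fall back on.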
The session turned into a list of best practices and SEO basics by the time all were done, but it proved that it never hurts to pay close attention to what matters most. When top SEOs present that laundry list of issues, it emphasizes how important the basics are to top ranking. The title "Can You Please Them All" was essentially answered: "Yes, if you use best practices and target Google."