Maximizing the pages indexed on your site

I recently had a client come to me and say, ‘We just launched a new site, now what SEO should we do?’ My first thought was that they had missed the boat, but I didn’t tell them that. That’s a lesson they’ll learn before they launch their next site or feature.

Minutes after running the YSlow Firefox/Firebug add-on, I noticed some low-hanging fruit: no HTTP compression on their server.

I told them, ‘Just do it.’ They had to ask why. 
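On Apache, for example, turning it on is only a few lines. This is a minimal mod_deflate sketch, not their actual config; the directive requires that the module be loaded, and the MIME type list is just a sensible starting point:

```apache
# Compress common text responses (Apache 2.x with mod_deflate loaded)
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
</IfModule>
```

Binary formats like JPEG and PNG are already compressed, so there’s no point running them through DEFLATE.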

First, it will save you bandwidth, and bandwidth is money, right? Here’s a nice case study from IBM that details the impact server compression has on overall bandwidth.
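You can see the scale of the savings locally. This is a rough illustration on a synthetic repetitive file, not a real page, so the ratio is better than you’d see in practice; real HTML still commonly shrinks by well over half:

```shell
# Build a ~190 KB file of repetitive markup, then compare raw vs gzipped size
printf '<p>hello world</p>\n%.0s' $(seq 1 10000) > sample.html
raw=$(wc -c < sample.html)
gzip -kf sample.html          # -k keeps the original, writes sample.html.gz
gz=$(wc -c < sample.html.gz)
echo "raw=${raw} bytes gzipped=${gz} bytes"
```
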

Second, and more importantly, search engines only index a page up to a certain size; everything past that point is lost. Compression shrinks the download, so more of each page fits under the cutoff. Implementing it is a change that not only cuts costs, but opens up room for growth into new areas of SEO.

According to Sitepoint, the engines’ stop points are:
     Yahoo:  210 KB
     Google: 520 KB
     MSN:    1,020 KB
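A quick pre-launch sanity check against the strictest of those cutoffs can be scripted. The file below is a synthetic stand-in for a saved copy of a real page; in practice you’d substitute a download of your own URL:

```shell
# Synthetic page well over 210 KB (base64 expands the 250,000 input bytes)
head -c 250000 /dev/urandom | base64 > page.html
LIMIT=$((210 * 1024))            # Yahoo's 210 KB stop point, in bytes
size=$(wc -c < page.html)
if [ "$size" -gt "$LIMIT" ]; then
  echo "over Yahoo's 210 KB stop point"
else
  echo "under the cutoff"
fi
```
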

That means any page with a download footprint over 210 KB will not be indexed completely by Yahoo, and that’s something that could be tragic.
