I think I'm being penalized by Google for accidentally having duplicate pages. I contacted Google via their reinclusion form and I'm waiting to hear back.

Our link list CGI program offers dynamic and static options. We had always used dynamic pages, but about three months ago I changed the links to static to make the pages load faster. I now realize I probably should have put the unused dynamic pages in an exclusion in a robots.txt file, but at the time I forgot all about it.

I was wondering if you guys had any ideas as to what I should do. All of our pages that are indexed on Google are dynamic pages, so if I add them to the exclusion it may hurt the standing I have now on Google, but I would rather have the static pages indexed. Should I continue to just use the dynamic URLs, since that is what Google originally indexed? Or should I change back to the static pages and hope Google picks them up the same as the old dynamic pages? I know now that I need a robots exclusion for one or the other; I'm just not sure which would be best.
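Just so it's clear what I mean, the exclusion I'm thinking of would look something like this in robots.txt (the /cgi-bin/links.cgi path is only an example, the real path would depend on how our link list program is set up):

    # Block the unused dynamic pages so only the static copies get crawled
    # (path below is just an example of what our CGI URLs might look like)
    User-agent: *
    Disallow: /cgi-bin/links.cgi

If I went the other way and kept the dynamic pages, I'd point the Disallow line at the directory holding the static copies instead.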