The duplicate content penalty is automatic. Don't do it.
ap.
Thanks, but I know I shouldn't have duplicate content; it was just an oversight that I didn't block the duplicate static pages in a robots.txt file.
My question is: Google indexed my dynamic pages years ago, but I'd prefer it index the static pages. Should I block the dynamic pages, or, since the dynamic pages are already indexed, should I just block the static pages?
Also, how do I write the robots exclusion rule for dynamic pages?
Would this block my dynamic pages, since all of my dynamic pages have a ? in them?
User-agent: *
Disallow: /?
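One caveat worth noting: under the original robots.txt standard, Disallow rules are simple prefix matches, so `Disallow: /?` only blocks URLs whose path begins with "/?" (e.g. the site root with a query string), not a URL like /page.php?id=1. Googlebot, however, supports the nonstandard * wildcard, so a sketch of a rule that blocks any URL containing a "?" for Google specifically might look like:

```
User-agent: Googlebot
Disallow: /*?
```

Other crawlers that don't support wildcards will ignore the * and treat this as a literal prefix, so test the rule before relying on it site-wide.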
Originally Posted by Northstar
That's a tough call. If Google has had a set of your pages in its index for years, and it's working, I would be inclined to leave it alone. On the other hand, it could reindex you and integrate your new pages into the engine, but it may take a year for all the new dust to settle.
Also - consider using Google Sitemaps
Steve
Yep - google sitemap is a great tool!
What is Google Sitemaps, and does it help your position in Google?
Google Sitemaps doesn't help you get better rankings per se; it's a tool that helps Google index your site more thoroughly. It helps make sure that everything you want crawled gets crawled.
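For reference, a Google sitemap is just an XML file listing the URLs you want crawled, following the public sitemap protocol. A minimal sketch (the example.com URL and date are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> entry per page you want Google to crawl -->
    <loc>http://www.example.com/page.html</loc>
    <!-- optional: when the page last changed -->
    <lastmod>2006-01-01</lastmod>
  </url>
</urlset>
```

You upload the file to your site and submit its location through your Google Sitemaps account; listing only your static pages there is one more signal about which versions you want indexed.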
Try this tool to see if you're banned:
http://www.123promotion.co.uk/tools/googlebanned.php
If you want to hit me up on ICQ or AIM and give me your URL, I can do some poking around the SE's with you to give you my assessment.
Here's a good article on the issue:
http://www.webconfs.com/dynamic-urls...-article-3.php
And here's another with specific quotes from Google:
http://www.vbseo.com/f2/google-com-d...tic-urls-2466/
Cheers,
Michael
Thanks Dzinerbear, I will take you up on that offer. It would be good to have an outside person look at the site to see if I overlooked anything.