Thanks, but I know I shouldn't have duplicate content; it was just an oversight that I didn't block the duplicate static pages with a robots.txt file.

My question is this: Google indexed my dynamic pages years ago, but I'd prefer it to index the static pages instead. Should I block the dynamic pages, or, since those are already indexed, should I just block the static pages?

Also, how do I write the robots exclusion for dynamic pages?
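
For what it's worth, here's a minimal sketch of what I think that exclusion could look like, assuming the dynamic pages are the ones served with query strings (Googlebot supports the * wildcard in robots.txt patterns):

    User-agent: *
    # block every URL that contains a query string
    Disallow: /*?

If only one script generates the dynamic pages, a narrower rule like "Disallow: /products.php?" (products.php is just a placeholder; substitute the actual script name) would block its query-string URLs while leaving everything else crawlable.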