Instead of editing the robots.txt
[and thanks to naz and acorn for previously explaining how simple it is to do. and yes, i did it and it is simple]
Would it accomplish the same thing to set, in the SEO options:
How frequently the page may change: never.
The priority of that URL relative to other URLs on the site: 0 (zero).
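For reference, those two SEO options correspond to fields in the sitemap protocol, so the sitemap entry they generate would look something like this (the URL here is just a made-up example):

```xml
<url>
  <loc>https://example.com/some-page.html</loc>
  <!-- hint that the page never changes -->
  <changefreq>never</changefreq>
  <!-- lowest relative priority on the site -->
  <priority>0.0</priority>
</url>
```

One thing worth noting: unlike a robots.txt Disallow rule, which actually blocks crawling, changefreq and priority are only hints that search engines are free to ignore, so the page can still be crawled and indexed.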

I think technically that would mean Google would index it once and never again, but if it found duplicate content on another of my sites, it might stop showing all results from the site.
Or would the duplicate content simply never be found by Google?

Thanks, leonard
LeonardSlates.com