I have a site whose articles are syndicated by a few really strong sites. These are well-known newspapers and old, established portals within my niche. Every time I publish an article, it automatically shows up on their sites as a two-paragraph preview with a backlink to the full article, buried a few layers deep in their site structure.

I almost never get these pages with the backlinks to me indexed by Google. I understand Google's perspective that it's duplicate content, but I really should be getting credit for the links. We write about 5 to 10 articles per day, which translates to 25 to 50 backlinks that I want Googlebot to count. Either Googlebot never gets to those pages, or it reaches them and decides they aren't important enough to include in the index because it knows they're dupes.

So my question is: what do you think is the safest way to make sure Googlebot crawls and indexes each of these pages with the backlinks to my site? Any input would be greatly appreciated!