
Global duplicates

edited October 2012 in Feature Requests
A while ago I enabled global site list population in the Advanced options.
I have multiple projects that all use the same URL (my money site).
I have "do not post to the same URL" checked in all projects.
Is it possible to add these options to projects:
Do not post to a duplicate URL (use the entire global list to check for dupes)
Do not post to a duplicate URL (use only the project's list to check for dupes)

When "Use global site list for URL" is checked in a project, does it already do this (look globally to see if there is a dupe from one of your different projects), or is it only looking for dupes within that specific project?

I've had some complaints from moderators saying I'm spamming them multiple times.
I'm trying to avoid reworking all of my projects just to prevent duplicate submissions between existing and new projects that use the same money site URL.
Let me add that my money site involves a highly professional subject matter (not a web sales type), and I want to NEVER post a dupe across any of my projects, so I don't upset the wrong people or competitors.
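The check being asked for here amounts to a set lookup against every project's already-posted URLs. A minimal sketch of that workaround, assuming plain-text URL lists exported per project (the file names are hypothetical, not SER's actual layout):

```python
# Hypothetical workaround sketch: aggregate URLs already posted by ANY
# project, then filter a new project's target list against that set.
# File names/paths here are assumptions for illustration only.

def load_posted(list_files):
    """Collect every URL found in the given per-project list files."""
    posted = set()
    for path in list_files:
        with open(path, encoding="utf-8") as f:
            posted.update(line.strip() for line in f if line.strip())
    return posted

def dedupe_targets(targets, posted):
    """Keep only targets that no project has posted to yet."""
    return [url for url in targets if url not in posted]
```

The deduped list could then be imported back as the new project's target URLs, giving the "check globally for dupes" behavior by hand.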

Best Answer

  • Accepted Answer
    Hi @Mike. It appears you are running into trouble with SER finding the same sites and then posting to them multiple times across projects, because you are using the same keywords in each project.

    Why not use the built-in scraper in SER, or use ScrapeBox, and then, once you have your huge list of sites to post to, split it up and assign the pieces to each project? So if you have 10 projects and you want each to post to 1,000 sites, you would need a list of 10,000 sites; just split it into 10 chunks of 1,000 and import each chunk as target URLs.

    An easy way to split large lists with ScrapeBox is to import your list, then use Export URL List > Export as txt and split list, and enter the number of entries you want in each file.

    Hope this helps, Mike.
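The split step described above can also be sketched outside ScrapeBox. A minimal example, assuming one flat list of scraped URLs broken into fixed-size chunks, one per project (the URLs are placeholders):

```python
# Minimal sketch of the split step: break a scraped URL list into
# fixed-size chunks, one chunk per project.

def split_list(urls, chunk_size):
    """Split a list of URLs into consecutive chunks of chunk_size."""
    return [urls[i:i + chunk_size] for i in range(0, len(urls), chunk_size)]

# Example: 10,000 scraped URLs split into 10 chunks of 1,000,
# matching the 10-projects-of-1,000-sites scenario above.
scraped = [f"http://site{n}.example/" for n in range(10_000)]
chunks = split_list(scraped, 1_000)
```

Each chunk would then be saved to its own text file and imported into one project as its target URLs.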

Answers

  • Sven www.GSA-Online.de

    All the options you have are limited to the project only, and this will stay that way, as projects do not communicate with each other (except for the option to use the verified URLs of another project).

    And I am not changing this behavior in any way. If you want to limit the posting of some things, you can use the filters.

  • edited October 2012

    I need about 500 anchor texts to target my URL. Four months ago, my choice was to create one project with all 500 anchor texts, or 30 projects dividing the anchor texts between them. I figured putting all 500 anchor texts in one project would take forever to cycle through and get good results, so I went with the multiple-projects option. This, of course, is why my multiple projects are producing duplicate backlinks.
    Now, about three times a week, I add a URL to the filter because somebody complains. Unfortunately it is usually too late. As revenge, people will write nasty comments about my business on forums, in reviews, et cetera, before I have a chance to filter them out with SER. If I could prevent the dupes, I don't think people would complain at all.

    Since you say you can't add the option to check globally for dupes, I assume the only thing I can do is delete all of my projects and create one project with 500 anchor texts. I guess it will take a very long time for SER to post links using all 500 anchor texts, but this seems to be the only way to prevent dupes from being produced across all of my projects.

    Any chance you will ever consider this option for people who want to promote a lot of anchors quickly with multiple projects targeting the same URL (and prevent duplicates)? I don't want to have to wait many months for all of my anchors to get posted as backlinks because I can only run one project for them; this program has the power to do it much faster.

    Sorry for the grammar and spelling; I'm using my cell phone.
  • The only reason multiple projects may seem faster than one large project is that they are posting to the same sites multiple times, which reduces the time spent finding targets.

    The option you are requesting would perform exactly the same as providing all anchors in one project.
  • Thanks theorbital for your response. I was referring to the anchor texts, which are completely different from my keyword list, but I'm sure you know what I'm referring to. I will try some of your techniques in the future. I was really hoping Sven could make it easy to avoid dupes across all projects with a check mark, so I could keep running my existing projects as they are, but I understand if he can't.

    M3, I'm not sure I understand the reasoning of how 30 projects with 30 different anchors each would post out those anchors as quickly as one project with 500 anchors, but so be it; I will agree with you.
    I will do what I have to do to prevent dupes across all my projects. I'm definitely not an SEO expert, but I'm learning more each day. Thanks again.
  • AlexR Cape Town
    "I'm not sure I understand the reasoning, how 30 projects with 30 different anchors would post out those anchors as quickly as one project with 500 anchors" - there are only so many threads your machine can run. When they're maxed out, it's at max power. Having more or fewer projects doesn't make a difference.

    i.e. 30 projects with 10 threads each or 1 project with 300 threads will take the same time. 
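The arithmetic behind that claim can be checked with a back-of-envelope calculation: wall-clock time depends only on the total thread count, not on how projects divide it. This sketch assumes a uniform per-thread posting rate (the rate number is an illustrative assumption, not a SER benchmark):

```python
# Back-of-envelope check: total run time depends only on total threads,
# not on how they are divided between projects. The per-thread rate of
# 50 targets/hour is an assumed, illustrative figure.

def run_time_hours(total_targets, projects, threads_per_project,
                   targets_per_thread_hour=50):
    """Hours to work through total_targets at a uniform per-thread rate."""
    total_threads = projects * threads_per_project
    return total_targets / (total_threads * targets_per_thread_hour)

# 30 projects x 10 threads vs. 1 project x 300 threads:
# both give 300 total threads, so the same wall-clock time.
many = run_time_hours(30_000, projects=30, threads_per_project=10)
one = run_time_hours(30_000, projects=1, threads_per_project=300)
```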