
Best practice after you submit to the last target page.

How do you usually move on to the next phase of submission with the same project?
I mean, what do you do once you have submitted to all the target sites?

Is it necessary to start a completely new project for the same website, or should I just replace the email address in the current project and start submission again?
Should I also clear the history together with all account data?

Today most of my projects have probably reached their limits, because nothing is happening: everything is green but there are no submissions, and LPM dropped from near 100 to 1.5, so I am wondering what I should do now.

It's not a proxy problem; I have probably just used up most of the target sites already.

Comments

  • If you are considering creating a new project for the same website, then why don't you just allow SER to post multiple times to the same domains? Also, you should try scraping new targets. When one of my projects has almost finished a list of target sites, I simply import another list or change its settings to scrape a lot more than before.
  • I thought about that (: but I am not sure whether posting several times under the same accounts and linking back to the same website is safe for tier 1. I thought it might be better and safer to create new accounts on the same target sites, which is why I am asking what the best way to do it is.

    Of course, scraping new sites is something I do on a daily basis (: but the problem in my case is that I can't find as many PR1+ sites to post to as I want; I have at most 500-600 verified PR1+ sites, and I am talking about contextual ones, of course.

    Do you think it is safe to include PR0 target sites in tier 1?
    I know it will probably look more natural, but I am not sure it is safe. I am using PR0+ for tier 2 only.


  • Then it's awesome that there's an option to create multiple accounts rather than post multiple times, right?

    To my understanding, blackhat linkbuilding is all about simulating a natural backlink profile as much as possible, so it seems rather obvious to also include PR0 sites: a lot of websites have PR0, so it's natural to get quite a few of these backlinks. People who find a site they like and want to share it are not going to check the root domain PR before writing about it. But I can't give you the yes or no answer you're looking for. I know what I'd do, but I guess it's up to you to decide.

    Also, you should keep on scraping, because I know there are a lot of possible targets you have not uncovered yet.
  • You are right, of course. But the problem is that looking natural is not always the same as the high quality that Google expects (((:
    But that is another story ...
  • Google expects to see a natural backlink profile. If they deem your backlink profile unnatural, they're going to penalize you. It's your choice.
  • Yeah, limiting to PR1+ is probably not a good idea ... mixing everything up like hell is the right solution (:
  • Some people are getting great results with a PR filter, while others are getting great results without one. It's all about testing what works for you and how you do things.