
No Target To Post To + Very Low Success Rate

edited June 2014 in Need Help
I have been getting a lot of "no target to post to" messages and an extremely low posting success rate, especially with social bookmarks. It wasn't like this on the previous five updates.

My settings:
I don't scrape for new sites to post to.
I use verified link lists from Trevor, Blue List, and tons of other purchased lists.
I set PR to 1 and above.
My lists have been compiled for more than 5 months now so each month there are more and more lists.
GSA SER runs through the entire list until it shows the "no target to post to" message.

Again, the posting success rate used to be way higher than this. I could easily get 300+ posts across all platforms within 3-5 hours, while now I am getting fewer than 30 in 24 hours.

Why is this so?

Comments

  • Edit: My lists have been compiled for more than 5 months now; each month I purchase a new set of links from almost all the service providers I can find here. So my lists are growing every month.
  • Anyone please?
  • ron SERLists.com
    edited June 2014

    I would never mix an older list into a newer list. It is tempting to do that, but it hurts productivity by a lot. Older lists get beaten to a pulp, so if you plan on mixing lists, I would run the older lists and 'distill' them into a new verified folder to capture whatever good links come out. Then maybe take that list and add it to a newer list.

    Personally, I would take each list, run it through a bunch of projects, and then channel the verifieds that come from this exercise into a separate folder for each, and then rename each list something like Revised XXX List 1, Revised XXX List 2, etc.

    Then I would still keep these lists separated, and then run one out of identified, another one out of submitted, and another one out of failed. Keep them separate, but have your projects read three of the revised lists. Your LPM will shoot through the roof if you take the time to do what I say.

  • @ron Thanks for the answer, mate! In this case, my lists are all 'contaminated' now. What should I do to get my normal LPM back? Please help! :(
  • @ron Also, ron, I've realized my posting success rate for Social Bookmark platforms is very low. I think it probably has something to do with my content. When it says 250 words for the description, is that 250 words or 250 characters? I've never seen any of the posted social bookmark links actually carry 250 words; they only take the first or second sentence, that's it. I usually just put a 300-word article in the description field. Is that too long? Any advice on this part?
  • ron SERLists.com

    @rabbit23 - Just do what I said. I personally just run the most recent list because the older ones bog everything down in SER. So in my case I just use the most recent one for two weeks, and then use the next list. I just don't have the time or the motivation to keep distilling the good links out of older lists.

    Bookmarks are a PITA. If you are using just CB for captcha solving, don't waste your time with that platform, as they are all hard captchas. If you have hard-captcha services, then that will be a big help. It is all based on character count, not word count. But the issue is the captchas, not the number of characters.
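
    If it helps, here is a minimal Python sketch of pre-trimming a description to a character limit before feeding it to SER, assuming a 250-character cap and plain-text descriptions; the helper name and the limit are illustrative, not anything SER itself exposes:

        # Hypothetical helper: trim a bookmark description to a character
        # limit (250 here, per the thread) at a word boundary, so the site
        # doesn't cut the text off mid-word.
        def trim_description(text: str, limit: int = 250) -> str:
            if len(text) <= limit:
                return text
            cut = text[:limit]
            # Back up to the last complete word if possible.
            if " " in cut:
                cut = cut.rsplit(" ", 1)[0]
            return cut

        description = "A 300-word article pasted into the description field. " * 20
        print(len(trim_description(description)))  # always <= 250 characters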

  • @ron, thank you for the tips, man! Last question: what if my SB description has more than 250 characters? Will SER still be able to post it?
  • RuFFCuT UK
    edited June 2014
    @Ron is right. If you want to build SBs, I would use DBC or a similar service; what I personally do is send all ReCaptchas to DBC from within CB.

    Oh my god, all these abbreviations...

    As for your low LPM, I would suggest the following. It's similar to what Ron has said, but I would make use of your old lists; it will take longer, but from my personal experience it's worth it.

    Take all of your lists (including your verified lists in SER) and merge them into one file using the ScrapeBox dupe remover tool, then load that large file back into the dupe remover tool and... remove all the duplicates. Since you have merged several lists together, I imagine there are a lot of duplicates, which will damage your LPM if you haven't got 'allow posting to same site again' checked.
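
    If you don't have ScrapeBox handy, a rough Python equivalent of that merge-and-dedupe step might look like this; the folder and file paths are assumptions, so point them at wherever your lists actually live:

        # Minimal stand-in for the ScrapeBox merge + dupe-remove step: read
        # every .txt list in a folder, merge the URLs, and write one
        # de-duplicated output file.
        from pathlib import Path

        lists_dir = Path("C:/SER/purchased_lists")   # assumed folder of .txt lists
        merged_file = Path("C:/SER/merged_dedup.txt")

        seen = set()
        with merged_file.open("w", encoding="utf-8") as out:
            for list_file in sorted(lists_dir.glob("*.txt")):
                for line in list_file.read_text(encoding="utf-8", errors="ignore").splitlines():
                    url = line.strip()
                    if url and url not in seen:  # drop duplicates across all lists
                        seen.add(url)
                        out.write(url + "\n")

        print(f"kept {len(seen)} unique URLs")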

    Delete all of the files in your verified lists folder (take a backup first in case you mess something up; see the sketch below), then start a new project for a random URL, and uncheck all search engines and all the boxes below the search engines box, like this:

    [screenshot: project options with all search engines and the boxes below them unchecked]

    Check all the link types and populate the data fields, then import the large list you have created as target URLs and let it run until you get 'no targets to post to'. Check that your lists have been created in your verified lists folder, and from then on check the 'use verified lists' box; you should get a decent LPM.
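
    For the backup step, something like this sketch would snapshot the verified folder before you delete anything; the verified path is an assumption, use whatever folder you have set in SER's options:

        # Snapshot the verified site-list folder before clearing it out.
        import shutil
        from datetime import datetime
        from pathlib import Path

        verified_dir = Path("C:/GSA/site_lists/verified")  # assumed location
        backup_dir = verified_dir.with_name(
            f"verified_backup_{datetime.now():%Y%m%d_%H%M%S}")

        shutil.copytree(verified_dir, backup_dir)
        print(f"backed up to {backup_dir}")

        # Only after confirming the backup, clear the originals:
        for f in verified_dir.glob("*.txt"):
            f.unlink()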

    I also do this for my identified lists, but this might just be me going OTT :)

    I'm not saying this is the best way to do it; this is just what I do, and I seem to get a good LPM. I used to do it like Ron mentioned, I just found it a bit long-winded when this, as far as I can tell, achieves the same thing.


  • ron SERLists.com
    edited June 2014
    ^^ @RuFFCuT - We are actually on the same page, or a similar page. Personally I assign a different list to each project. I will have multiple projects - one for each list. That way it just processes everything faster.

    This is probably a good time to mention that it pays to set these projects up as test projects, pointing at bing or yahoo or whatnot. That way you don't get hung up on link limits or the speed at which these lists process - you can basically turn them up to 'no limits'. And like you said, all engines and platforms are checked (I forgot to add that, so good catch).

    To me the deduping doesn't matter much, mainly because it is our responsibility to dedupe the verified list on a continuous basis anyway. So that takes care of that issue regardless.
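
    For that continuous dedupe, a rough Python sketch like this would rewrite each verified list file with duplicate entries removed while keeping the original order; the folder path is an assumption:

        # Rewrite every .txt site-list file in the verified folder with
        # duplicate lines removed, preserving order.
        from pathlib import Path

        verified_dir = Path("C:/GSA/site_lists/verified")  # assumed location

        for list_file in verified_dir.glob("*.txt"):
            lines = list_file.read_text(encoding="utf-8", errors="ignore").splitlines()
            unique = list(dict.fromkeys(l.strip() for l in lines if l.strip()))
            list_file.write_text("\n".join(unique) + "\n", encoding="utf-8")
            print(f"{list_file.name}: {len(lines)} -> {len(unique)} entries")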
  • I used to get ~150 LPM with SER, now it's "choked" for some reason. I let it scrape and I get poor results even with 100k+ keywords imported (that got me ~15 LPM). So I import a scraped list (GScraper/SB, SER footprints) and it's even worse: 5 LPM. I maybe get a "couple" (3k, used to be ~100k) of submissions and 800 verified in 12 hours. I used to have projects on paused status just an hour or two in, as they had reached their limits. Now I have to run it the whole day with no guarantee of hitting my goal (pretty much 50 verifications per project).

    I even tried running projects for T1 contextuals like Social Network with manually typed ReCaptchas and still no luck with submissions. Don't know what's going on. I can't build decent T1 links and can't even blast links like I did 3 months back. Weird shit!
  • From what I've found, lists scraped with ScrapeBox usually have a low LPM anyway. This could be for a number of reasons, but they will never have an LPM like your verified list because, well, you have already posted to those sites before.

    What settings do you have checked under search engines? 