No Target To Post To + Very Low Success Rate
I have been getting a lot of "no target to post to" messages and an extremely low posting success rate, especially with social bookmarks. It wasn't like this in the five previous updates.
My settings:
I don't scrape for new sites to post to.
I use verified link lists from Trevor, Blue List, and tons of other purchased lists.
I set PR to 1 and above.
My lists have been compiled for more than 5 months now, so each month there are more and more lists.
GSA SER runs through the entire list until it shows the "no target to post to" message.
Again, the posting success rate used to be way higher than this. I could easily get 300+ posts across all platforms within 3-5 hours, while now I am getting fewer than 30 in 24 hours.
Why is this so?
Comments
I would never mix an older list into a newer list. It is tempting to do that, but it hurts productivity by a lot. Older lists get beat up to a pulp, so if you plan on mixing lists, I would run older lists and 'distill' them into a new verified folder to capture whatever good links come out. Then maybe take that list and add it to a newer list.
Personally, I would take each list, run that list through a bunch of projects, and then channel the verifieds that come from this exercise into separate folders for each, and then rename each list something like Revised XXX List 1, Revised XXX List2, etc.
Then I would still keep these lists separated, and run one out of identified, another out of submitted, and another out of failed. Keep them separate, but have your projects read all three of the revised lists. Your LPM will shoot through the roof if you take the time to do what I say.
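If you want a quick sanity check on how "beat up" an older list really is before deciding whether it's worth distilling, you can measure how many of its domains still produce verified links after a run. This is just a rough sketch of my own; the file paths are placeholders and the domain-level comparison is my assumption (the verified URL rarely matches the submitted URL exactly, so comparing full URLs would undercount):

```python
from urllib.parse import urlparse

def load_domains(path):
    """Collect the set of domains (netlocs) from a target-URL list file,
    one URL per line."""
    domains = set()
    with open(path, encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            netloc = urlparse(url).netloc.lower()
            if netloc:
                domains.add(netloc)
    return domains

def survival_rate(old_list, verified_list):
    """Rough fraction of domains from an old list that still yielded
    verified links after a distilling run."""
    old = load_domains(old_list)
    kept = old & load_domains(verified_list)
    return len(kept) / len(old) if old else 0.0

# Hypothetical usage:
# survival_rate("trevor_list_march.txt", "verified_after_run.txt")
```

If the rate comes back very low, that backs up the point above: the old list is mostly dead weight and only the distilled survivors are worth merging forward.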
@rabbit23 - Just do what I said. I personally just run the most recent list because the older ones bog everything down in SER. So in my case I just use the most recent one for two weeks, and then use the next list. I just don't have the time or the motivation to keep distilling the good links out of older lists.
Bookmarks are a PITA. If you are using just CB for captcha solving, don't waste your time with that platform, as they are all hard captchas. If you have hard-captcha services, then that will be a big help. Success there is based on character count, not word count, but the real issue is the captchas, not the number of characters.
Oh my god, all these abbreviations...
As for your low LPM, I would suggest the following. It's similar to what Ron has said, but I would make use of your old lists; it will take longer, but from my personal experience it's worth it.
Take all of your lists (including your verified lists in SER) and merge them into one file using the ScrapeBox dupe remover tool, then load that large file back into the dupe remover and remove all the duplicates. Since you have merged several lists together, I imagine there are a lot of duplicates, which will damage your LPM if you haven't got 'allow posting to same site again' checked.
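The merge-and-dedupe step doesn't have to be ScrapeBox; anything that strips duplicate lines gives the same end result. A minimal sketch (the file names are placeholders, and I'm assuming plain one-URL-per-line list files):

```python
def merge_dedupe(list_files, out_file):
    """Merge several target-URL list files into one, dropping duplicate
    lines while keeping first-seen order -- same end result as running
    a merged file through a dupe remover."""
    seen = set()
    with open(out_file, "w", encoding="utf-8") as out:
        for path in list_files:
            with open(path, encoding="utf-8", errors="ignore") as f:
                for line in f:
                    url = line.strip()
                    if url and url not in seen:
                        seen.add(url)
                        out.write(url + "\n")
    return len(seen)  # number of unique targets written

# Hypothetical usage:
# merge_dedupe(["blue_list.txt", "trevor_list.txt"], "master_list.txt")
```

Either way, the point is the same: one deduplicated master file, so SER isn't burning time re-hitting targets it has already tried.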
Delete all of the files in your verified lists folder (take a backup just in case you mess something up), then start a new project for a random URL and uncheck all search engines and all the boxes below the search engines box.
I also do this for my identified lists, but this might just be me going OTT.
I'm not saying this is the best way to do it; this is just what I do, and I seem to get a good LPM. I used to do it like Ron mentioned, but I found it a bit long-winded when this, as far as I can tell, achieves the same thing.