
Low success rate on verified lists

So I've set up a campaign and deleted the target URL history, the cache, and all account data. I've selected the option to keep trying to post even if it failed before.

Then I imported target URLs: I browsed to my GSA verified lists folder and selected my BuddyPress article file, about 1 MB in size.

I have 10 fresh unused emails in the project.
Captcha breaking is running perfectly
I have 30 dedicated proxies from buyproxies.org
I have 200 threads

So it starts posting, but I'm getting a HUGE number of "wrong engine type" errors.

This is a VERIFIED list, in that yes I've successfully posted to these links before.

I'd expect maybe a 30% fail rate or so due to sites changing their registration pages etc., but as of right now, after about 3 hours of running, I have 30,048 submitted and only 1,648 verified (roughly a 5% verification rate).

This seems insanely low, what could I be missing here?

Thanks!

Comments

  • edited July 2013
    is that for real?

    30k submitted and 1.6k verified in just 3 hours?

    whoa, that's big, sir. Even though I'm an old user of SER, I didn't achieve that in just 3 hours o.O
    (I think it's because I'm not using CB :( )




  • Yes CB makes a huge difference :)

    Still running, and getting a lot of verified links, but seriously, 80-90% of my "verified list" is getting no engine matches.

    Isn't there a way for GSA to remove any link that doesn't work, so that the verified list is actually PURE, 100% linkable sites?

    I'd rather have a site list with 100 URLs than 100k unworkable ones.

    Thanks
  • image

    maybe you could use this button? try it out :D
  • I don't think lrichard2112 fully understands what you are asking for (no offense).

    Two options come to mind: one with SER and one with Scrapebox, and each has its flaws.

    SER:
    1.) Change the destination of your current verified folder.
    2.) Import the verified list from your old folder and run it through your project(s) (a rough file-merge sketch for this step is below).
    3.) All working URLs are now stored in a fresh list in your new folder.

    Disadvantage of this method:
    If you are using the global list option for verified URLs for all your projects, then it won't search the list in your old folder for URLs. You can work around this to a point by just using the (old) submitted URLs list in your projects for a while, and importing the verified list by hand when you create new projects.
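
    For the import in step 2, the old verified folder is (as far as I can tell) just a set of per-engine .txt files with one URL per line, so one quick way to merge them into a single file you can import into the projects is something like this Python sketch. The folder paths and file names here are just placeholders for illustration, adjust them to your own setup:

        # merge all per-engine site list files from the old verified folder
        # into one deduplicated text file that can be imported as targets
        from pathlib import Path

        OLD_VERIFIED = Path(r"C:\GSA\site_lists\verified_old")        # placeholder old folder
        IMPORT_FILE = Path(r"C:\GSA\import\old_verified_merged.txt")  # placeholder output file

        urls = set()
        for list_file in OLD_VERIFIED.glob("*.txt"):
            for line in list_file.read_text(encoding="utf-8", errors="ignore").splitlines():
                line = line.strip()
                if line:
                    urls.add(line)

        IMPORT_FILE.parent.mkdir(parents=True, exist_ok=True)
        IMPORT_FILE.write_text("\n".join(sorted(urls)), encoding="utf-8")
        print(f"merged {len(urls)} unique URLs ready for import")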

    Scrapebox:
    1.) Import the list into SB and do an alive check.
    2.) Use the Page Scanner addon with the "page must have=" footprints you find in the engine scripts of SER. This makes sure that the alive URLs will be identified correctly by SER and that nothing has changed on the site owner's end (domain expired, under maintenance, etc.). A rough script version of this check is sketched at the end of this comment.

    Disadvantage:
    This just ensures that your list is cleaned up a bit, but it doesn't guarantee that SER is actually able to post to it.
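
    If you want a rough script version of that Scrapebox-style check, the sketch below does an alive check and a "page must have=" footprint test in one pass. The footprint string and file names are only made-up examples, take the real footprints from the engine scripts of SER:

        # keep only URLs that still respond and still contain a SER footprint
        import concurrent.futures
        import urllib.request

        FOOTPRINTS = ["powered by buddypress"]   # placeholder, use real "page must have=" strings

        def alive_and_matching(url, timeout=15):
            try:
                req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
                with urllib.request.urlopen(req, timeout=timeout) as resp:
                    html = resp.read(200000).decode("utf-8", errors="ignore").lower()
                return any(fp in html for fp in FOOTPRINTS)
            except Exception:
                return False

        with open("old_verified_merged.txt") as f:     # placeholder input list
            urls = [u.strip() for u in f if u.strip()]

        with concurrent.futures.ThreadPoolExecutor(max_workers=30) as pool:
            kept = [u for u, ok in zip(urls, pool.map(alive_and_matching, urls)) if ok]

        with open("cleaned_list.txt", "w") as f:       # placeholder output list
            f.write("\n".join(kept))

        print(f"{len(kept)} of {len(urls)} URLs are alive and still show a footprint")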
  • I have this problem too. I think it's necessary to scrape new targets on a weekly basis to keep SER powerful, but I'm seeing a lot fewer targets than in the past (which is no surprise, since a lot of people are spamming with SER now and webmasters are installing reCAPTCHA/Mollom or moderating everything by hand...).
  • Success rate is low here too and overall effectiveness seems well down. It has definitely hit its peak already.
  • edited July 2013
    Success rate is low here too when scraping new targets...
  • vijvij
    edited July 2013
    OP, it's because I use a similar list to yours on N servers, and also because I sold it on Fiverr to NNN users, and NNNN of us have been using this affordable magical software to whore out those already whored-out blogs.
    Hope that answers your question.

    If only Sven would sell GSA for $10 - the Fiverr guys could have made more sales and we could have more of these threads.
  • Lol, nice answer vij. Yeah gotta find some new keywords.
  • GSA has become a go-to tool in nearly every SEO's arsenal. There are only so many postable targets out there, with hundreds or thousands of SEOs spamming them on a daily basis. The rate at which sites go down, for whatever reason, is going to be higher than the rate at which new sites pop up. I think we're all going to see GSA become less effective as time goes on.

    That's generally what happens when there's such a low barrier to entry. IMO, I'd much rather see either a higher price point for GSA or a monthly cost associated with it. That would keep the number of users lower, and most likely keep targets alive longer as a result. @sven Personally, I'd pay $197/month for this program without even thinking about it. But hey, that's just me.
  • I called back in March that this would start to come up. It happens with every single tool out there that is effective.

    https://forum.gsa-online.de/discussion/2527/some-concerns-i-m-having-with-ser#latest
  • edited July 2013
    @sven we need that platform trainer!
  • @Ozz

    I took your advice and created new folders for the Verified/Submitted/Identified lists, then spent the weekend merging lists and running them through GSA to get really fresh lists.

    Here is my LPM today from using my verified only list :)

    image
  • Today? You mean the first 5 minutes, maybe.
  • It held 400+ LPM for the 30 minutes it took me to run through a selection of my verified list (Wiki/Social network/Articles).

    No rubbish/kitchen-sink platforms.
  • I get 200+ LPM on my global lists; the problem is the verified rate is only 5% now. I've noticed a sharp drop-off.

  • @Digicept I've got a big complex right here seeing your LPM, lol.