
Low lpm while posting at Identified site list

The title speaks for itself: I get ~2-3 LPM (~0.2 VPM) while posting to my Identified site list. I'm using 10 semi-dedicated proxies from (100 threads running, only Articles and Blog Comments checked). I collected the sites for the Identified list myself.
So here's the question: how do I optimise the submission process and thereby increase LPM/VPM?


  • shaun
    That could be a decent LPM for your rig depending on your VPS and how you got your identified URLs.
  • I'm running a Core i7-6700K with 32 GB of RAM.
    The URLs were scraped from different search engines.
  • shaun
    Are you scraping with SER and the 10 proxies, or with something such as Scrapebox and public proxies?
  • Shwepp UA
    edited January 2017
    I scraped with Hrefer (free for Xrumer users) and 10 semi-dedicated proxies, if I'm not mistaken (it was quite a while ago). Then I imported the list of URLs into GSA SER using the "Import URLs (identify platform and sort in)" tool.
  • shaun
    Are you using custom footprints or the default ones? Read this post on how to optimise the default footprints for scraping, as there are a bunch of dead ones included.

    Try upping your threads to 300 and report back with any changes :).
  • Default ones.
    I've read your article, but I still don't get what this has to do with low LPM. It might be helpful for scraping, but it doesn't speed up submission.
    I've set it to 200 threads (since at the same time I'm scraping two engines with two Hrefer instances at 100 threads each) and I'll post back on how it affected the speed.
  • shaun
    Pretty much the first example in the article, “Powered by Drupal”.

    For example, say your scrape returns 1,000,000 targets.
    Because “Powered by Drupal” is such a broad term and appears in a bunch of other contexts, say 750,000 of them are Drupal blogs.
    As that footprint appears on the bulk of Drupal blogs, for argument's sake say only 10,000 of those allow submissions.
    Out of those 10,000, say only 10 are verified.

    So out of 1,000,000 targets you were only able to actually submit to 10,000, meaning your LPM could be low because of the sheer number of useless URLs the tool has to process.

    Now say we use the made-up example footprint “Powered by Drupal Leave Us A Guest Post”.
    It is less broad and gives an indication that the site accepts submissions.
    Say the scrape returns only 50,000 results.
    As these sites accept submissions from users, 45,000 of the targets are submitted to.
    A much higher submission count with a much lower useless-target count means your LPM goes up.
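
    The made-up numbers above boil down to a simple yield comparison. Here is a quick sketch of that arithmetic (the figures are the illustrative ones from this post, not real measurements, and `yield_stats` is just a helper name for this example):

    ```python
    # Compare the submission yield of a broad footprint vs a targeted one,
    # using the illustrative numbers from the discussion above.
    def yield_stats(scraped, submittable):
        """Return what fraction of scraped URLs actually accept submissions."""
        return {"scraped": scraped,
                "submittable": submittable,
                "submit_rate": submittable / scraped}

    broad = yield_stats(1_000_000, 10_000)   # "Powered by Drupal"
    targeted = yield_stats(50_000, 45_000)   # "...Leave Us A Guest Post"

    print(f"broad footprint:    {broad['submit_rate']:.1%} submittable")     # 1.0%
    print(f"targeted footprint: {targeted['submit_rate']:.1%} submittable")  # 90.0%
    ```

    The point: even though the targeted scrape returns 20x fewer raw URLs, SER wastes far less time on targets it can never submit to, which is why LPM rises.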

  • Got you.
  • Shwepp UA
    edited January 2017
    An extra question: do you use the default footprints or come up with your own?
    Also, it seems I'm having problems with my proxies (they don't always work properly).
  • shaun
    I use my own and filter the defaults as I explain in the article.

    Yeah, those are probably soft bans on your proxies.
  • It seems the problem occurs with Articles only. The other engine types are doing much better.