
What LPM do you get with only Article, Web 2.0 and Wiki submissions selected?

Hi,

I only get about 2-3 LPM with these options selected.

Regards

Comments

  • gooner SERLists.com
    Are you scraping with SER?
  • gooner SERLists.com
    Well, Web 2.0s only work well if you have SEREngines.

    Wikis are quite hard to find, and articles work best with a Recaptcha OCR.

    So all in all it's not that easy to get those links.

    You could get SEREngines and CaptchaTronix or ReverseProxy OCR, or alternatively you could buy a verified list to save you the scraping and increase your LpM.
  • Hi @shanrocks666, scraping for contextual sites is not an easy task. If you just started using SER, I strongly suggest you buy a good list first. Then slowly learn how to scrape effectively, either with SER or an external program like Scrapebox or Gscraper. I just heard SERList.com is going to release a new list with 10,000 contextual links! Also comes with great tutorials to help you get started. Check the Buy/Sell section...
  • edited August 2014
    Well,

    Yes, I also think it is difficult to get a better LPM with them. I have SEREngines and CaptchaTronix subscriptions.

    But I have cancelled my CaptchaTronix subscription because they are very bad: only about a 25% solving rate, and most of the solves are wrong.

    Also, with other selections I get an LPM of 80-90.
  • @Olve1954, buying a list is my last option. I want to build my own list, so I am trying different ways of getting more LPM and more verified links.

    LPM is really low.
  • gooner SERLists.com
    Assuming you are using tiers... you could try using CaptchaTronix or another OCR only for tier 1.

    I have been doing that recently and it makes a big difference, because it doesn't stress the OCR or your proxies by trying to deal with Recaptcha for all projects.

    Oh and thanks @Olve1954 for the recommendation :)
  • Ok @shanrocks666, good luck then. Do search this forum using Google,

    site:gsa-online.de scrape list

    There are plenty of tips and here's one of mine,

    https://forum.gsa-online.de/discussion/comment/88295/#Comment_88295

  • Shan... keep in mind that LpM is just speed. I, for one, prefer quality over quantity/speed.

    But if you want a higher LpM, I suggest you scrape with Gscraper/Hrefer and import the list to SER. That way your LpM will rise, but I believe this is the only option you can actually use to speed up article submissions.
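
    If it helps, here's a rough Python sketch of the "import the list" step (the filenames are just examples, nothing Gscraper- or SER-specific): it dedupes the scraped URLs by root domain before you feed them to SER, so you don't burn submissions on the same site twice.

        from urllib.parse import urlparse

        seen_domains = set()
        kept = []

        # Scraped output, one URL per line (example filename)
        with open("gscraper_output.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                domain = urlparse(url).netloc.lower()
                if domain and domain not in seen_domains:
                    seen_domains.add(domain)
                    kept.append(url)

        # Plain text file you can then import into SER (example filename)
        with open("import_into_ser.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(kept))

        print(f"Kept {len(kept)} URLs on unique domains")

    Deduping by domain rather than by full URL keeps the import small and stops SER hitting the same site over and over.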


  • @shanrocks666: I'd hate to tell you how much it costs in server capacity alone to build a list. The SERLists team has 6 dedicated servers going 24/7. You can go down that road (building your own lists) if you want to, but if we weren't building lists, we would be buying them.

    Contextual links also consume a ton of processing power. Even with a clean verified list, your LPM will be really low. That's also why it's so hard to build a good list of contextuals. The processing power required is enormous.
  • @satan apprentice, I'm glad you made it clear to everyone :) @shanrocks666, you can check out my contextual link list service; I think you'll love it, honestly.
  • edited September 2014
    @gooner

    Hi gooner, you mentioned running OCR for only the tier 1 project.

    Can you please let me know how I could run OCR for only the tier 1 project while running all tier projects at once?


  • loopline autoapprovemarketplace.com
    @shanrocks666
    In the project options you can choose to use the 1st captcha service, the 2nd, all of them, ask user, or a combo of those.

    So you can just set your Recaptcha service as service 1, and then in tier 1 tell it to use all captcha services.

    Then for tier 2+ you set your project options to only use the 2nd captcha service, which you could set as Captcha Breaker or whatever. 

    You could also reverse that, but the method by which you do it is to use that option in project options and then set your captcha services appropriately. 