👉 GSA SEO and Marketing Forum 👈
GSA Search Engine Ranker
What LPM do you get with only Article, Web 2.0, and Wiki submissions selected?
I only get about 2-3 LPM with these options selected.
, good luck then. Do search this forum using Google:
site:gsa-online.de scrape list
There are plenty of tips, and here's one of mine:
In the project options you can choose to use the 1st captcha service, the 2nd, all of them, ask user, or a combination of those.
So you can set your ReCaptcha service as service 1, and then in Tier 1 tell the project to use all captcha services.
Then for Tier 2+ you set the project options to only use the 2nd captcha service, which you could set as Captcha Breaker or whatever.
You could also reverse that, but either way the method is the same: use that option in the project options and then assign your captcha services appropriately.
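The per-tier routing described above can be sketched roughly like this (a hypothetical illustration only, not SER's actual internals; the service names and mode labels are assumptions):

```python
# Hypothetical sketch of per-tier captcha routing.
# Service 1 = a ReCaptcha-capable OCR, service 2 = GSA Captcha Breaker.
SERVICES = {1: "ReCaptcha OCR", 2: "Captcha Breaker"}

def services_for_tier(tier, mode_by_tier):
    """Return which captcha services a given project tier will try."""
    mode = mode_by_tier[tier]  # "all", "first", or "second"
    if mode == "all":
        return [SERVICES[1], SERVICES[2]]
    if mode == "first":
        return [SERVICES[1]]
    return [SERVICES[2]]

# Tier 1 uses every service; tiers 2+ use only the cheap solver.
modes = {1: "all", 2: "second", 3: "second"}
print(services_for_tier(1, modes))  # ['ReCaptcha OCR', 'Captcha Breaker']
print(services_for_tier(2, modes))  # ['Captcha Breaker']
```

The point of this split is that the expensive ReCaptcha OCR only ever sees Tier 1 captchas, while the high-volume lower tiers stay on the cheap solver.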
Are you scraping with SER?
Well, Web 2.0s only work well if you have SEREngines.
Wikis are quite hard to find, and articles work best with a ReCaptcha OCR.
So all in all, it's not that easy to get those links.
You could get SEREngines plus CaptchaTronix or ReverseProxy OCR, or alternatively you could buy a verified list to save yourself the scraping and increase LpM.
, scraping for contextual sites is not an easy task. If you just started using SER, I strongly suggest you buy a good list first, then slowly learn how to scrape effectively, either with SER or an external program like Scrapebox or GScraper. I just heard SERList.com is going to release a new list with 10,000 contextual links! It also comes with great tutorials to help you get started. Check the Buy/Sell section...
edited August 2014
Yes, I also think it is difficult to get a better LPM with them. I have SEREngines and CaptchaTronix subscriptions.
But I have cancelled my CaptchaTronix subscription because they are very bad: only a 25% solving rate, and most of the solves are wrong.
Also, with other selections I get an LPM of 80-90.
, buying a list is my last option. I want to build my own list, so I am trying different ways of getting more LPM and more verified links.
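For perspective, the gap between 2-3 LpM and 80-90 LpM is easy to quantify (a back-of-the-envelope sketch; the 10,000-link target is just an example figure, not a number from this thread):

```python
# Rough time-to-build estimate at different LpM (links per minute) rates.
target_links = 10_000  # example list size

for lpm in (2.5, 85):  # ~2-3 LpM contextual-only vs. 80-90 LpM mixed
    minutes = target_links / lpm
    print(f"{lpm:>5} LpM -> {minutes / 60 / 24:.1f} days of continuous running")
```

At roughly 2.5 LpM a 10k list takes close to three days of nonstop running; at 85 LpM it takes under two hours.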
LPM is really low.
Assuming you are using tiers... you could try using CaptchaTronix or another OCR only for Tier 1.
I have been doing that recently and it makes a big difference, because it doesn't stress the OCR or your proxies trying to deal with ReCaptcha for all projects.
Oh, and thanks for the recommendation.
Shan... keep in mind that LpM is just speed. I for one prefer quality over quantity/speed.
But if you want a higher LpM, I suggest you scrape with GScraper/Hrefer and import the list into SER. That way your LpM will rise, but I believe this is the only option you can actually use to speed up article submissions.
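Before importing a scraped list into SER, it usually pays to dedupe and normalize it first; a minimal sketch of that cleanup step (the sample URLs are placeholders, and a real pass would often also trim to root domains):

```python
# Minimal dedupe/normalize pass over a scraped URL list before import.
raw = [
    "http://Example.com/wiki/Page",
    "http://example.com/wiki/Page",   # duplicate after lowercasing
    "http://blog.example.org/post-1",
]

seen, clean = set(), []
for url in raw:
    url = url.strip().lower()
    if url and url not in seen:
        seen.add(url)
        clean.append(url)

print(clean)  # two unique URLs remain
```

Feeding SER fewer duplicate targets means less wasted submission time per verified link, which is where the LpM gain comes from.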
: I'd hate to tell you how much it costs in server capacity alone to build a list. The SERLists team has 6 dedicated servers going 24/7. You can go down that road (building your own lists) if you want to, but if we weren't building lists, we would be buying them.
Contextual links also consume a ton of processing power. Even with a clean verified list, your LPM will be really low. That's also why it's so hard to build a good list of contextuals. The processing power required is enormous.
apprentice, I am glad you made it clear to everyone.
You can check out my contextual link list service; I think you'll love it, honestly.
edited September 2014
Hi gooner, you mentioned running OCR for only Tier 1 projects.
Can you please tell me how I could run OCR for only the Tier 1 project while running all tiers at once?