There are a few different ways you can do this. Personally, I only have the "verified" folder ticked, and its location needs to be an empty folder, as this is where GSA will save your newly verified links.
The other 3 folders (identified, submitted, failed) can be used to store any site lists you've purchased or are wanting to test. I only save the verified links, hence why it is the only folder I have ticked above.
In project settings, I use ONE project set to run with the identified, submitted and failed folders (depending on which of the three you've filled with different site lists). I do not run this project with the verified folder. This project will run super duper slow, as the list of sites is untested with your setup.
ALL my other projects are set to run with the verified folder only. These projects will run super duper fast, as they are using only the freshly verified link list that the project above is creating.
If you follow that, you will get a higher VPM.
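The two-tier setup above is essentially a producer/consumer split: one slow project tests raw lists and feeds the verified folder, and everything else consumes only proven targets. A minimal sketch of that idea (the list sizes and success rates below are made-up assumptions purely for illustration; GSA SER manages these folders itself):

```python
# Illustrative sketch only -- not GSA's actual implementation.
import random

random.seed(42)  # deterministic for the example

# An untested site list, e.g. something purchased or freshly scraped.
raw_list = [f"http://site-{i}.example" for i in range(1000)]

# The ONE "tester" project posts to the raw list: slow, low hit rate
# (5% is an assumed figure for illustration).
verified = [url for url in raw_list if random.random() < 0.05]

# Every other project reads only the verified folder: near-100% hit rate,
# so the same effort yields far more verified links per minute (VPM).
reusable = [url for url in verified if random.random() < 0.9]
```

The point is that the expensive filtering work happens once, in one project, while all the others skip straight to targets already known to accept links.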
Beyond the site list, it's proxies, emails and internet speed that can make things run super quick or super slow.
I'm getting over 800 VPM on a Hetzner AX51 dedi with 200 Mexela dedicated proxies and a private catch-all email, with XEvil and Captcha Breaker running as well.
Using your settings, my VPM has improved to 80, but my LPM is still only 120+. Can you share a copy of your project settings? Thank you very much.
These settings don't really change much unless I'm processing a scraped list, using article sites with re-posting, or using the entire site list for indexing. Each scenario has slightly different settings.
@sickseo I checked my settings; the only differences between ours are "Ask all services/user to fill captchas" and "Ask user/services in custom order". I think the problem must be with my verified list. Thank you very much.
The list makes a big difference. I've tried all the list sellers and haven't had good results with any of them. I do my own scraping, which is the way to go. I use Hrefer and Platform Identifier and have a few projects processing the results continuously. It takes ages to find working sites, but it's worth building your own list.
For scraping you need a lot of proxies to do it at any speed. If you are trying to scrape Google, public proxies won't work. Residential proxies from services like Smartproxy would be best. These residential/rotating proxy services are very expensive, but they produce excellent results and speed. The cheaper ones like Storm Proxies are already blocked by Google but will be fine for scraping Bing.
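The core trick with a proxy pool is simply rotating through it so no single IP hammers the search engine. A minimal sketch of that rotation (the proxy gateway addresses here are placeholders, not a real provider's endpoints):

```python
# Hypothetical rotating-proxy fetcher -- gateway URLs are placeholders.
from itertools import cycle
import urllib.request

proxies = cycle([
    "http://user:pass@gw1.example:8000",
    "http://user:pass@gw2.example:8000",
])

def fetch(url: str) -> bytes:
    """Fetch a URL, switching to the next proxy in the pool each call."""
    proxy = next(proxies)  # rotate on every request to spread the load
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    )
    return opener.open(url, timeout=10).read()
```

Rotating residential services do this gateway-side for you, which is why one endpoint from them behaves like thousands of IPs.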
Scraping Google clones will be much faster, as will scraping Bing/Yahoo, since they are not as trigger-happy as Google when it comes to blocking IPs. I'm using public proxies from GSA Proxy Scraper with Hrefer.
It's found 42,000 URLs using GSA's built-in footprints in just over 10 hours. I'm running multiple Hrefer installs on different machines; you can run up to 5 installs with the business license. It comes with XEvil, which you'll need for reCAPTCHA v2/v3 solving.
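What tools like Hrefer do with those footprints is cross them with your keyword list to generate search queries. A small sketch of that idea (the footprints and keywords below are invented examples, not GSA's actual footprint file):

```python
# Illustrative footprint x keyword query builder -- example data only.
from itertools import product

footprints = ['"powered by wordpress"', 'inurl:guestbook']
keywords = ["gardening", "fitness"]

# Every footprint is paired with every keyword, so the query count
# grows multiplicatively: 2 footprints x 2 keywords = 4 queries.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
```

This multiplicative growth is why even a modest keyword list keeps multiple scraper installs busy for days.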
Comments
Is that what you were wanting to see?
I want to see these options. Sorry to trouble you, thank you.