
Need help with verifying scraped contextual links

I am currently trying to scrape my own list of contextual links, but my results are not too encouraging, so I need a little help.

For the scraping itself, I am using GSA's default footprints together with GScraper's keyword list, and I am only scraping for targets that allow contextual links.

In GSA, I am running 800 threads, with 15 test projects doing submissions to build the verified list. Each project has the following settings (the ones I presume make a difference):
- I don't allow posting on the same site again.
- I use only 2 email accounts per test project.
- Link types to create: anchor text, article, article-wiki, PDF upload.
- I keep adding target URLs to the projects before they run out, so each project has a long queue of 100k+ targets to run through continuously.

But my results from the last 24 hours of running have been rather discouraging, and I'm not sure whether they are typical. After 24 hours I have 3022 submitted and 1317 verified, but after removing duplicate URLs and duplicate domains I end up with only 300 or so unique verified links.
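To be clear about what I mean by removing duplicates, this is roughly the dedupe I'm doing (just a sketch done outside GSA for illustration; the file name "verified.txt" is hypothetical, one URL per line):

```python
from urllib.parse import urlparse

with open("verified.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Remove duplicate URLs while keeping the original order.
unique_urls = list(dict.fromkeys(urls))

# Remove duplicate domains: keep only the first URL seen per domain.
seen_domains = set()
unique_domain_urls = []
for url in unique_urls:
    domain = urlparse(url).netloc.lower()
    if domain not in seen_domains:
        seen_domains.add(domain)
        unique_domain_urls.append(url)

print(f"{len(urls)} total, {len(unique_urls)} unique URLs, "
      f"{len(unique_domain_urls)} unique domains")
```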

Are my results typical, bad, or good?

How can I improve upon this?

Do I need to set the projects to do verification work to get more verified links? Since I am continuously adding target URLs to the projects, does that mean they have no room to do verification work, or does it not work that way?