
Any chance of a blacklist removal?

shaun https://www.youtube.com/ShaunMarrs
@Sven

This is my current process.

Scrape (Scrapebox) -> Identify (GSA PI) -> Verify (GSA SER) -> Live link building projects (GSA SER)

A massive number of duplicate target URLs make it through the system each time and end up being re-run through verification, wasting a fair amount of system resources that could be used elsewhere.

Is there a way you could add a folder where the user puts all the scraped URLs from past scrapes? Then, when a new project is made in PI for a new scrape, it would scan this folder first and remove every URL that has been run in the past before identifying the rest. Another way would be to run the new project and then, at the end, compare the list it has just produced against the old scrape folder and remove the duplicates. A third way, which is probably the easiest, is some type of mini feature where the tool compares two folders and removes every entry from folder B that is seen in folder A, leaving folder B as fresh targets only, ready to be identified.
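
For what it's worth, that third option is just a set difference between the two folders. Here is a minimal sketch in Python, assuming plain-text lists with one URL per line; the folder and output file names are placeholders, not anything PI actually uses:

```python
import os

OLD_SCRAPES = "old_scrapes"   # folder A: URLs already run (placeholder name)
NEW_SCRAPE = "new_scrape"     # folder B: the fresh scrape dump (placeholder)

def load_urls(folder):
    """Read every URL from every .txt file in a folder into a set."""
    urls = set()
    for name in os.listdir(folder):
        if not name.endswith(".txt"):
            continue
        with open(os.path.join(folder, name), encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if url:
                    urls.add(url)
    return urls

# Set difference: keep only folder-B entries never seen in folder A.
fresh = load_urls(NEW_SCRAPE) - load_urls(OLD_SCRAPES)

with open("fresh_targets.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(fresh)))
```

To compare on the domain level instead, normalize each line with urllib.parse.urlsplit(url).netloc before adding it to the set.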

It's late here and I'm crazy tired, so if I'm not making myself clear just let me know and I will try to explain it some other way.

Cheers

Shaun.

Comments

  • Scrapebox v2 has this feature:

    Import URL List -> Select the URL lists to compare (or on domain level)

    It can help you remove the URLs (or domains) you have already processed.

    Do note that many URLs will fail due to incorrect captchas or pages being temporarily unavailable. If those get into your blacklist, you'll never give them another chance to be verified (a hypothetical sketch of a retry counter for exactly this is after the comments below).
  • shaun https://www.youtube.com/ShaunMarrs
    Ideally I want this as hands-off as possible. I know it only takes a few seconds, but I'd rather just set the automator up and let it run over and over, dumping its URLs into a folder PI monitors.

    Once in the process, target URLs are pushed through multiple times so they have a decent chance of getting verified.
  • s4nt0s Houston, Texas
    @shaun - Thanks for the suggestion. Pretty good ideas; I'll look into them and see which would be best to add.
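
As far as I can tell, neither SER nor PI exposes a retry threshold like the one the first comment argues for, so the following is only a hypothetical sketch of that policy: count each URL's failures across runs and blacklist it only after several separate misses. The file name, threshold, and function are all made up for illustration:

```python
import json
import os

FAIL_COUNTS = "fail_counts.json"  # hypothetical persistent failure tally
MAX_FAILS = 3                     # blacklist only after 3 separate failures

def record_failures(failed_urls):
    """Bump each failed URL's count; return those past the threshold."""
    counts = {}
    if os.path.exists(FAIL_COUNTS):
        with open(FAIL_COUNTS, encoding="utf-8") as f:
            counts = json.load(f)
    ready = []
    for url in failed_urls:
        counts[url] = counts.get(url, 0) + 1
        if counts[url] >= MAX_FAILS:
            ready.append(url)
    with open(FAIL_COUNTS, "w", encoding="utf-8") as f:
        json.dump(counts, f)
    return ready

# URLs that failed this run (captcha errors, timeouts, ...) get another
# chance until they have failed MAX_FAILS times in total.
print(record_failures(["http://example.com/page1"]))
```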