
Monitor Folders (Not Working)


When I try to start monitoring an ongoing scrape folder from ScrapeBox (the "Harvester Sessions" folder), it starts to load, but after two or three seconds it just halts; in the Request tab the status shows "idle".



PS: I'm running the trial version; I'll purchase as soon as I get this sorted out.


  • s4nt0s Houston, Texas
    Yes, it will halt at the interval you set, but keep in mind that while ScrapeBox is scraping and writing to a file, it locks the file while it's in use; then, when ScrapeBox starts on the next file, it frees up the previous one by default. 

    I guess that's just how Scrapebox works.

    You can write some URLs in a test file and drag it into the folder you're monitoring, and you should see them get sorted on the next interval check.
  • Is there a feature that automatically deletes duplicate URLs without manually running the Remove Duplicates tool all the time? Also, exporting to GSA is not as smooth as I expected it to be. I bought almost all your software, and sometimes multiple licenses; is there a way I can get a discounted price for this? Thanks!
  • s4nt0s Houston, Texas
    The automatic duplicate removal is already added to the latest version; I just haven't pushed that version yet because I've been testing it. I'm not sure what you mean about exporting to GSA not being as smooth as you expected. There are currently no discounts available, but maybe in the future.
  • If, for example, the software has identified 30% of the list and I export it to GSA, will it overwrite the links or just add to them when I'm done processing the whole list?
  • s4nt0s Houston, Texas
    You can right-click on the project > Export to .SL.

    Then in SER you can go to Options > Advanced > Tools > Import Site Lists, and import it into the folder you want (usually Identified).
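    For anyone curious how the monitoring behavior described above fits together, here is a minimal Python sketch of the same idea: poll a folder at an interval, skip any file the scraper still has locked (retrying it on the next pass), and drop duplicate URLs automatically. This is an illustration of the concept, not the tool's actual code; the function names and the PermissionError-based lock check are my own assumptions.

    ```python
    import os
    import time

    def collect_new_urls(folder, seen, processed):
        """Scan the monitored folder once; skip files the scraper still has open."""
        new_urls = []
        for name in sorted(os.listdir(folder)):
            path = os.path.join(folder, name)
            if not name.endswith(".txt") or path in processed:
                continue
            try:
                # Assumption: on Windows, a file ScrapeBox is still writing to
                # raises PermissionError here; we just retry it next interval.
                with open(path, "r", encoding="utf-8", errors="ignore") as f:
                    lines = f.read().splitlines()
            except (PermissionError, OSError):
                continue  # file still locked by the writer
            processed.add(path)
            for url in lines:
                url = url.strip()
                if url and url not in seen:  # automatic duplicate removal
                    seen.add(url)
                    new_urls.append(url)
        return new_urls

    def monitor(folder, interval=10, cycles=1):
        """Run the folder check for a number of interval cycles."""
        seen, processed = set(), set()
        results = []
        for i in range(cycles):
            results.extend(collect_new_urls(folder, seen, processed))
            if i < cycles - 1:
                time.sleep(interval)  # wait for the next interval check
        return results
    ```

    Dropping a test file of URLs into the folder, as suggested above, would show them picked up (minus duplicates) on the next cycle.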