
Automating Scraping + Testing URLs?

To ensure that my verified list keeps growing, I want to automate scraping and posting in SER.

I use Scrapebox for scraping, so I can easily set up the SB Automator plugin in advance with lots of scraping projects using different keywords, each exporting to a separate file in a folder.

Now how do I set up SER to import these new raw URL lists into a test project periodically? Is this possible?

I currently have a very basic VPS ($20 from Solid SEO VPS), and it can run GSA SER and Scrapebox simultaneously at 150 and 30 threads respectively.

Comments

  • SvenSven www.GSA-Online.de
You can write yourself a little tool that adds new URLs to the <projectname>.new_targets file, and SER will read them.
  • Oh? Will this be similar to those tools (like loopline's Bluebox Sync) that feed daily URL lists into projects?

    So does SER have an API for doing this? Any guides, tutorials? :)
  • SvenSven www.GSA-Online.de
No API, but the format of the files should be easy to understand. You can simply modify them with an external program, and SER will detect the changes and reload.
  • Oh I see. How do I set up SER to read this file automatically?
  • So if I simply write target URLs to a file named like <projectname>.new_targets, SER will automatically detect changes to the file and add only the *new* URLs to the project?
  • Oh right. I see the file is located in SER's AppData.

    Now I'm not much of a coder; if only someone would write a small tool for this. It would be useful for everyone who wants a truly set-and-forget SER setup. :) (A sketch of such a tool follows below.)
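
Along those lines, here is a minimal sketch of such a tool in Python. It is an assumption-laden illustration, not a tested integration: the export folder path, the project name (my-test-project), and the dedupe-cache file (already_pushed.txt) are hypothetical placeholders, and the thread only establishes that SER watches <projectname>.new_targets somewhere under its AppData folder, so verify the exact projects path on your machine. The script scans the Scrapebox export folder, skips URLs it has pushed before, and appends anything new to the project's .new_targets file so SER can detect the change and reload.

    # Minimal sketch, assuming the paths below; adjust for your setup.
    import os
    import time

    SCRAPE_DIR = r"C:\Scrapebox\exports"  # hypothetical: where the SB Automator drops URL lists
    PROJECTS_DIR = os.path.join(
        os.environ["APPDATA"], "GSA Search Engine Ranker", "projects"
    )  # assumption: verify where SER actually keeps project files in AppData
    PROJECT = "my-test-project"  # hypothetical SER project name
    TARGETS = os.path.join(PROJECTS_DIR, PROJECT + ".new_targets")
    SEEN_FILE = os.path.join(SCRAPE_DIR, "already_pushed.txt")  # this tool's own dedupe cache

    def load_seen():
        # URLs already pushed in earlier runs.
        if not os.path.exists(SEEN_FILE):
            return set()
        with open(SEEN_FILE, encoding="utf-8", errors="ignore") as f:
            return {line.strip() for line in f if line.strip()}

    def push_new_urls():
        seen = load_seen()
        fresh = []
        for name in os.listdir(SCRAPE_DIR):
            # Only read Scrapebox's plain-text exports; skip our own cache file.
            if not name.endswith(".txt") or name == os.path.basename(SEEN_FILE):
                continue
            with open(os.path.join(SCRAPE_DIR, name), encoding="utf-8", errors="ignore") as f:
                for line in f:
                    url = line.strip()
                    if url.startswith("http") and url not in seen:
                        fresh.append(url)
                        seen.add(url)
        if fresh:
            # Append rather than overwrite, so entries SER may not have read yet are not clobbered.
            with open(TARGETS, "a", encoding="utf-8") as f:
                f.write("\n".join(fresh) + "\n")
            with open(SEEN_FILE, "a", encoding="utf-8") as f:
                f.write("\n".join(fresh) + "\n")
            print("pushed %d new URLs to %s" % (len(fresh), TARGETS))

    if __name__ == "__main__":
        while True:
            push_new_urls()
            time.sleep(600)  # re-scan the export folder every 10 minutes

You could run it in a loop as shown, or drop the while loop and schedule the script with Windows Task Scheduler to fire between Scrapebox Automator runs.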