
Auto import urls

edited February 2020 in GSA Search Engine Ranker
Hi (sorry for the noob question, but I haven't used SER in a long time):
I have a question: if I scrape URLs with Scrapebox or Gscraper and want to use them directly in SER (raw, still being scraped, not filtered), how do I get a URL list that is refreshed continuously to be imported without interruption?
I remember you had to use the site list. But since it is a single file, I have a doubt: if I import it directly into my SER projects, will that do the trick (the process shouldn't stop)?

Thank you
Thanked by barnacerrajero


  • Sven
    If you save it to the project file as <prj>.new_targets, it will be picked up directly/instantly.
  • Kaine
    Thank you, but where do I save the file?
  • Sven
    c:\users\<login>\appdata\roaming\gsa search engine ranker\projects\<project name>.new_targets
  • Kaine
    Thank you Sven
  • Kaine
    @Sven It works well, but how can I make all projects take their targets from a single file?
  • Sven
    Impossible, as each project would want to modify the file on its own.
  • Kaine
    Maybe the file could be locked while one project takes a batch of URLs? Just a thought...
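Since SER can't share one target file across projects, a common workaround is to keep a single master list from the scraper and have an external script fan it out into each project's own `.new_targets` file. The sketch below assumes this approach; the function name `distribute_targets` and all paths are hypothetical, and it does not lock the file, so you would run it between scraper write cycles (e.g. as a scheduled task) rather than while the scraper holds the file open:

```python
import os

def distribute_targets(master_file, projects_dir, project_names):
    """Copy every URL from the scraper's master file into each project's
    <project name>.new_targets file, then empty the master file so the
    same URLs are not distributed twice on the next run."""
    with open(master_file, "r", encoding="utf-8", errors="ignore") as f:
        urls = [line.strip() for line in f if line.strip()]
    if not urls:
        return 0
    for name in project_names:
        target = os.path.join(projects_dir, name + ".new_targets")
        # Append rather than overwrite, so targets SER has not yet
        # consumed from an earlier run are preserved.
        with open(target, "a", encoding="utf-8") as out:
            out.write("\n".join(urls) + "\n")
    # Truncate the master list; the scraper keeps appending fresh URLs.
    open(master_file, "w").close()
    return len(urls)
```

Note the caveat Kaine raises above: if the scraper appends to the master file between the read and the truncate, those URLs are lost, so scheduling the script when the scraper is idle (or rotating output files) is the safer setup.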