Would it be possible to create an independent tool for
importing and sorting URLs, rather than doing it via GSA SER as it's done
currently? I frequently scrape with Scrapebox, so I'm constantly
importing the newly discovered URLs into GSA SER to be sorted, and
this has become a real pain. During an import, the program
freezes until the importing and sorting process has finished (which
takes a while, given the size of the files I import and how often I import them), and
it's becoming a nightmare to use alongside actually creating
backlinks with GSA SER.
At the moment, I pretty much have to stop my projects from
running and set aside a specific time to import and sort my URLs, which
obviously cuts into the time I'm actually able to use GSA SER to create
backlinks. I then have to remove duplicate URLs, which again takes time.
Would it not be possible to create an independent tool, in
the same way as GSA Indexer, which can run alongside GSA SER? You'd specify
where you want the identified URLs stored, and it would then import and sort them within that folder.
And if you have GSA SER running at the same time, it could also make use of
the newly imported URLs that have been identified, assuming you're making use of
the global list. It would also be great if, at the end of an import and sort
run, you could set it up to automatically remove duplicate URLs, rather than me
having to do this manually.