
Use Latest URLs From Lists First?

Okay, as far as I know, a project set to get URLs from lists takes a random batch of URLs and tries to post to them until the batch is finished, then takes another random batch, and so on.

As my URL list is being built, the newest URLs are added at the end of each platform .txt file. My list has become so huge that I'm not even sure about the success rate of some of the URLs that were scraped at the very beginning. I know I should clean up my list from time to time, and I do, but I would still like to try posting to the newest URLs first. I also use "Google 24 hours" in my URL scraping, so I'd like to take advantage of posting to fresh URLs as soon as possible.

So would it be possible to add an option in GSA SER to make a project post to the newest added URLs first (i.e. the last lines of the txt file)?

Comments

  • SvenSven www.GSA-Online.de
    SER reads the next batch from the end of the file and truncates the file from the position where it started reading to the end, so it gets the newest added links first (see the sketch after the comments for the general idea).
  • Oh, I didn't know that. I don't know where I got this "random batch" idea from. Thanks @Sven, this can be moved from feature requests to questions then. :)
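
For anyone curious, here is a minimal Python sketch of the "read a batch from the end of the file, then truncate" behaviour Sven describes. This is not GSA SER's actual code; the file name, batch size, and function name are made up for illustration.

```python
# Sketch only: take a batch of URLs from the END of a site-list file,
# then truncate the file so those URLs are removed. Newest URLs (the
# last lines of the .txt) come back first.

def take_batch_from_end(path: str, batch_size: int = 100) -> list[str]:
    """Return up to `batch_size` URLs from the end of the file and
    truncate the file at the point where the batch started."""
    with open(path, "r+", encoding="utf-8", errors="ignore") as f:
        lines = f.readlines()          # simple approach; a real tool might seek from the end
        batch = [ln.strip() for ln in lines[-batch_size:] if ln.strip()]
        cutoff = max(len(lines) - batch_size, 0)
        f.seek(0)
        f.truncate()                   # drop everything...
        f.writelines(lines[:cutoff])   # ...then write back only the older, unread URLs
    return batch

# Hypothetical usage: grab the 50 most recently added URLs from one platform file.
# batch = take_batch_from_end("sitelist_Articles.txt", 50)
```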