
GSA Wishlist and questions

Hello, thank you for checking out my wishlist and questions. I'd like to start by saying GSA is the best.

Would it be possible to set a delay for when tier links are sent? For instance, I'd like to delay it by an hour or two so as not to send second tiers to links that have since been removed.

For the proxy modules, would it be possible to set a constant state of testing? Mostly for scraped and port-scanned proxies, to ensure there is always a ready pool of fresh, tested proxies to use.

For deleting emails, could we set the interval to less than a day, i.e. in hours?

For the Proxy Scanner, could we tag proxies by source as well? Some tools work better with proxies from different sources, and mixing them up is a pain. When I set the output files, I can only tag them by the tests they passed.

When deleting duplicate URLs, I see we can choose which folders to process. Could we have the option to remove dupes across all folders? There could be a priority for which folder gets to keep the dupe and which ones lose it.

When working with the tools and folders, would it be possible to merge folders from outside the id/ver/sub/fail folders? I see the tools can move and merge id/ver/sub/fail from within SER, but it's not possible to merge with lists outside.

When a project is running and it's set to only post to a certain country, but it identifies a suitable platform from the wrong country, does it still save it to the identified folder for later?

Within a project we can choose which countries to post to; could we also set the country for individual engines within the project?

Could we set up projects to read URLs from a single txt file instead of getting URLs from search or system folders? Sort of like when we right-click to import URLs into a project. My idea is that I could set my scraper up to drop txt files of URLs into a folder, and SER could pick them up and bring them in as if SER had found them by searching, to be sorted out. And then an option for SER to delete the file after reading (or keep it, I guess).

I'd better stop there; I have a ton of other stuff to ask about, but I think I'm starting to go too far haha.

Thanks for reading.

Comments

  • SvenSven www.GSA-Online.de
    Would it be possible to set a delay for when tier links are sent? For instance, I'd like to delay it by an hour or two so as not to send second tiers to links that have since been removed.

    That's possible already. You can set this on the tier project itself: set up a tier filter there and only take URLs from the main project that are xyz days old.

    For the proxy modules, would it be possible to set a constant state of testing? Mostly for scraped and port-scanned proxies, to ensure there is always a ready pool of fresh, tested proxies to use.

    Hmm, I don't understand this request. You can set it up to test whenever you want, no?

    For deleting emails, could we set the interval to less than a day, i.e. in hours?

    That would not make much sense, as some engines deliver their emails with a long delay before removing that link again.

    For the Proxy Scanner, could we tag proxies by source as well? Some tools work better with proxies from different sources, and mixing them up is a pain. When I set the output files, I can only tag them by the tests they passed.

    That would not make much sense, to me at least. A proxy is either good or bad, reliable or not; I would not base that on the source it was found from. You have many filters for the export, e.g. to only export proxies with high speed or a reliability of xyz%.

    When deleting duplicate URLs, I see we can choose which folders to process. Could we have the option to remove dupes across all folders? There could be a priority for which folder gets to keep the dupe and which ones lose it.

    You mean for the site lists? It's not a good idea to remove duplicates across the different site list types, as there is a reason to have a site in both submitted and verified: to later determine which one is worth submitting to.
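
    That said, if you wanted to try the priority-based dedupe on exported copies of your lists outside SER, here is a minimal sketch in Python. It assumes each folder holds plain-text files with one URL per line; the folder paths are hypothetical, and you should run it on copies rather than the live site lists, since SER may write to those while running:

        import os

        # Priority order: the first folder keeps a duplicate URL,
        # every later folder loses it. Paths are hypothetical.
        FOLDERS = [r"C:\lists\verified", r"C:\lists\submitted", r"C:\lists\identified"]

        seen = set()
        for folder in FOLDERS:
            for name in os.listdir(folder):
                path = os.path.join(folder, name)
                if not os.path.isfile(path):
                    continue
                with open(path, encoding="utf-8", errors="ignore") as f:
                    urls = [line.strip() for line in f if line.strip()]
                # Keep only URLs not already seen in this or a higher-priority folder
                kept = []
                for url in urls:
                    if url not in seen:
                        seen.add(url)
                        kept.append(url)
                with open(path, "w", encoding="utf-8") as f:
                    f.write("\n".join(kept) + "\n")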

    When working with the tools and folders, would it be possible to merge folders from outside the id/ver/sub/fail folders? I see the tools can move and merge id/ver/sub/fail from within SER, but it's not possible to merge with lists outside.

    You can import *.sl files from outside as well when choosing that file type instead of plain text files. If you have just folders, simply zip them into a file and rename it to *.sl.
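
    A small sketch of that zip-and-rename step in Python, per Sven's note that an .sl file is just a zip archive with a different extension; it assumes a flat folder of site-list text files, and both paths are hypothetical:

        import os
        import zipfile

        # Hypothetical paths: a folder of site-list text files, and the .sl to create.
        SRC_DIR = r"C:\lists\external"
        DEST_SL = r"C:\lists\external.sl"

        # Writing the archive directly under the .sl name does the
        # zip-and-rename in one step.
        with zipfile.ZipFile(DEST_SL, "w", zipfile.ZIP_DEFLATED) as z:
            for name in os.listdir(SRC_DIR):
                path = os.path.join(SRC_DIR, name)
                if os.path.isfile(path):
                    z.write(path, arcname=name)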

    When a project is running and it's set to only post to a certain country, but it identifies a suitable platform from the wrong country, does it still save it to the identified folder for later?

    Yes

    Within a project we can choose which countries to post to; could we also set the country for individual engines within the project?

    No, you'd need to split the project up for that.

    Could we set up projects to read URLs from a single txt file instead of getting URLs from search or system folders? Sort of like when we right-click to import URLs into a project. My idea is that I could set my scraper up to drop txt files of URLs into a folder, and SER could pick them up and bring them in as if SER had found them by searching, to be sorted out. And then an option for SER to delete the file after reading (or keep it, I guess).

    You can export that to the project itself in...
    c:\users\<login>\appdata\roaming\gsa search engine ranker\projects\<project name>.new_targets
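
    Building on that, a minimal sketch of the scraper drop-folder idea in Python: it sweeps a folder for txt files, appends their URLs to the project's .new_targets file, and deletes each file after reading. The drop folder and project name are hypothetical, and it assumes appending to the file while SER runs is safe, which I have not verified:

        import glob
        import os

        # Hypothetical drop folder that the scraper writes txt files into.
        DROP_DIR = r"C:\scraper\out"
        # Path from Sven's answer; "myproject" is a placeholder project name.
        NEW_TARGETS = os.path.expandvars(
            r"%APPDATA%\gsa search engine ranker\projects\myproject.new_targets"
        )

        for txt in glob.glob(os.path.join(DROP_DIR, "*.txt")):
            with open(txt, encoding="utf-8", errors="ignore") as f:
                urls = [line.strip() for line in f if line.strip()]
            # Append the URLs to the project's target file; untested assumption
            # that SER tolerates appends to this file while it is running.
            with open(NEW_TARGETS, "a", encoding="utf-8") as out:
                out.write("\n".join(urls) + "\n")
            os.remove(txt)  # delete after reading, per the wishlist idea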