
Performance Suggestion: For Larger Lists

Hi

I love this software. It's one of the best and I use it daily, but I'm finding it a bit sluggish with larger lists.
1) It spends ages 'loading a project' when the project contains many URLs.
2) I auto-load a bunch of URLs (rather than filling it with keywords).

It seems that the more URLs a project contains, the slower it gets. I've also found that only about 10% of URLs get posted to successfully.
If you have 1M URLs in 'sent' alone:
100K will be successful.
900K will have 'failed' for whatever reason.
Thus 90% of a project's URL data is just slowing it down and doesn't need to be there.


Now imagine that multiplied by 5 over 5 projects:
500K successful and 4.5 million failed!

My Suggestion:
1) Add a feature that automatically exports all failed submissions to a TXT file for the project and then removes them from the project.
- This way, projects stay neat and tidy, containing only successful submissions.

We can always re-import the failed ones later, but they don't need to sit in projects, slowing everything down and bloating them, especially when the failed-to-success ratio is 9:1, i.e. 900K failed URLs to 100K successful.

I'm currently doing it manually for each project, but it's such slow work; a rough sketch of that filtering step is below.
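
To make the manual step concrete, here is a minimal Python sketch of the filtering I do by hand, assuming the project's URL list can be exported as a plain text file with a tab-separated status at the end of each line. The file names and the 'failed' status string are placeholders, not GSA SER's actual export format.

    # Minimal sketch (assumed export format: one "URL<TAB>status" entry per line).
    # File names and the "failed" status string are hypothetical placeholders.
    from pathlib import Path

    def split_export(export_file: str, success_file: str, failed_file: str) -> None:
        """Archive failed entries to one file, keep successful ones in another."""
        success_lines, failed_lines = [], []
        for line in Path(export_file).read_text(encoding="utf-8").splitlines():
            if not line.strip():
                continue  # skip blank lines
            # Assume the status is the last tab-separated field on the line.
            status = line.rsplit("\t", 1)[-1].strip().lower()
            (failed_lines if status == "failed" else success_lines).append(line)
        Path(failed_file).write_text("\n".join(failed_lines) + "\n", encoding="utf-8")
        Path(success_file).write_text("\n".join(success_lines) + "\n", encoding="utf-8")

    if __name__ == "__main__":
        # Example: archive the failed URLs and keep a clean list for the project.
        split_export("project_export.txt", "project_success.txt", "project_failed.txt")

The archived failed file can always be re-imported later if needed; the point is just that it no longer sits inside the project.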

Comments

  • Sven www.GSA-Online.de
    But the failed URLs are still undetected at first, and it would only speed up over time. So not much value, really.
  • AlexR Cape Town
    "But the failed URLs are still undetected at first, and it would only speed up over time."

    Exactly! Why are the failed URLs kept in a project after we know they've failed?

    I.e. if you have a project with 1M URLs in it and 900K have failed.

    Now when you try to edit it or add any new URLs, it's super slow because it has to show the 900K failed URLs in the display.

    So instead of the project holding only the 100K successful ones, it has 900K failed ones that just make editing or adding to the project very slow. If it held only the 100K successful ones, it would be super fast.

    Do this for 10 projects and it's super slow!

    Not sure if I'm explaining this correctly, but after I remove the 900K failed URLs from a project, adding new URLs or tweaking settings is super fast rather than painfully slow. It would be nice if this could be automated in some way.

  • AlexR Cape Town
    I think the .dat file size has something to do with the speed.
    1M URLs in a .dat is about 100MB,
    but if you remove the failed URLs it drops to around 10MB (only the ~100K successful entries remain).
  • Hi AlexR, how do you scrape such a huge list? Because GSA can't scrape a huge list that fast.