
Feature request - time cap in seconds per site

Hello. As a whole I am very happy with GSA Email Spider, however the most annoying thing that happens to me is that on some sites (http://www.eeco.ro/ for example) it spends A LOT of time. And by a lot I mean 10-20 or more minutes. I would like an option to set the maximum time allowed to be spent on a site. I would gladly skip a site and lose 1-2 emails if, in the time it takes to fully parse that site, it could collect 100 other emails. Thank you.
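
Something like this, in rough Python terms, is what I mean (all names here are made up just to illustrate the requested behavior, this is of course not how GSA implements it):

```python
import time
from urllib.parse import urlparse

MAX_SECONDS_PER_SITE = 60  # hypothetical cap; would be a user setting

def crawl_site(start_url, fetch, extract_links, extract_emails):
    """Crawl one site, giving up once the time budget is spent.
    fetch/extract_links/extract_emails are caller-supplied helpers."""
    deadline = time.monotonic() + MAX_SECONDS_PER_SITE
    host = urlparse(start_url).netloc
    queue, seen, emails = [start_url], {start_url}, set()
    while queue:
        if time.monotonic() > deadline:
            break  # time cap hit: skip the rest of this site
        url = queue.pop(0)
        page = fetch(url)
        emails.update(extract_emails(page))
        for link in extract_links(page):
            # stay on the same domain/host, never revisit a URL
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return emails
```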

Comments

  • And I DO use the option "Skip whole domain when no item ..." but it does not seem to work, or the time limit is too big
  • Sven (www.GSA-Online.de)
    May I ask what you are trying to do? Searching for emails on imported URLs? If it's the contact/imprint page you're trying to find, and then the email, you can simply parse just 1 sublink. This will increase speed.
  • edited November 2014
    I am using a keyword list and "Use search engines" to generate URLs. I use lists of max 200-300 keywords.

    I have checked "parse results for new sublinks" and "Extra check for keywords".  I have not checked "Check also on sublinks".

    What is the benefit of "parse results for new sublinks"?

  • Sven (www.GSA-Online.de)
    That option goes to the URL delivered by the search engines and parses every link on that one (but it has to be on the same domain/host).
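
    Roughly, it behaves like this simplified sketch (hypothetical Python just to illustrate, not the exact implementation): take one search-engine result, collect every link on it, and keep only those on the same domain/host.

    ```python
    import re
    from urllib.parse import urljoin, urlparse

    HREF_RE = re.compile(r'href=["\'](.*?)["\']', re.IGNORECASE)

    def same_host_sublinks(result_url, html):
        """Return links found on a result page, keeping only
        those on the same domain/host as the result itself."""
        host = urlparse(result_url).netloc
        links = set()
        for href in HREF_RE.findall(html):
            absolute = urljoin(result_url, href)    # resolve relative links
            if urlparse(absolute).netloc == host:   # same domain/host only
                links.add(absolute)
        return links
    ```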
  • edited November 2014
    And if the email address is not on the first URL or its immediate sublinks, will it fail to harvest the email for that site?
  • Thanks Sven for the great support. I described the issue in greater detail by mail and an update was released within 2-3 working days. Starting with 7.15 I no longer have this issue.

    Thanks Sven, great support. Keep up the good work.
    :-bd
  • Sven (www.GSA-Online.de)
    You're welcome :)