
very slow parsing sometimes

Sometimes, Spider takes way too long to parse a site or some links. And by way too long I mean several minutes per link, which really adds up if a site has tens of similar pages.
Is this a bug? Can an option be introduced, like a time cap per site or per link?

thank you
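For what it's worth, the per-link time cap asked about above can be sketched outside the tool. This is just a minimal illustration of the idea, not how the spider actually works: `parse_link`, the URLs, and the cap value are all hypothetical stand-ins.

```python
import concurrent.futures
import time

def parse_link(url):
    # Hypothetical stand-in for the spider's real fetch-and-parse step;
    # a URL containing "slow" simulates a page that takes too long.
    if "slow" in url:
        time.sleep(1.0)
    return f"parsed {url}"

# A shared worker pool, so a timed-out task does not block the caller.
_pool = concurrent.futures.ThreadPoolExecutor(max_workers=4)

def parse_with_cap(url, cap_seconds=0.2):
    """Return the parse result, or None if it exceeds cap_seconds."""
    future = _pool.submit(parse_link, url)
    try:
        return future.result(timeout=cap_seconds)
    except concurrent.futures.TimeoutError:
        # Give up on this link and move on to the next one.
        return None

print(parse_with_cap("http://example.com/fast"))  # parsed normally
print(parse_with_cap("http://example.com/slow"))  # None: cap exceeded
```

One caveat with this approach: the timed-out worker thread keeps running in the background, so truly aborting a stuck parse would need cooperative cancellation inside the parsing code itself.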

Comments

  • I have the same problem.

I'm scraping Google and Bing for the top 10 results. Most of the sites are parsed very quickly, but "google.us" takes 10-50 seconds (and sometimes other sites do too).

Is it a bug, some kind of protection against Google (I'm using 100+ fast proxies), or is there a config change I should make?

    Thanks
  • SvenSven www.GSA-Online.de
@lone_promoter there is already an option to skip certain sites/domains if no result was found for a longer time.
@dbrick sorry, there is no bug known to me here.
  • edited December 2014
Thanks, Sven, for the great support. I described the issue in greater detail by mail, and an update was released within 2-3 working days. Starting with 7.15 I no longer have this problem.

Keep up the good work. :-bd

  • SvenSven www.GSA-Online.de
You're welcome :)