
Huge Queue

@Sven


I have this huge project and am at about 2.7 GB of RAM. I'm OK for now, but since the program is 32-bit I know I may run out as this project spiders and scrapes.

Any thoughts on the best way to handle this if the project bloats up with more pages to scrape?

Comments

  • Sven (www.GSA-Online.de)
    Well, that is indeed close to the edge. Can't you split this up, e.g. by country or some other criterion in the data?
  • gsatree (US)
    edited October 25
    Yeah, unfortunately I can't. All I can think to do is turn off the sublink digging for a while. I'll probably miss some stuff, but it is what it is, I'm thinking?
  • Sven (www.GSA-Online.de)
    Why is splitting by country not working?
  • gsatree (US)
    Sven said:
    Why is splitting by country not working?
    It doesn't apply to what I'm specifically scraping.

    In my use case I figured out the answer, though.

    Just save my results, then clear them, since that count was huge too. That massively decreases memory usage.

    Then continue autosaving under a new file name.

    When done, I can merge the files and dedupe if there are any duplicates.
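    The final merge-and-dedupe step above can be sketched in a few lines. This is a minimal illustration, not part of the program itself: it assumes the autosaved results are plain-text files with one entry (e.g. one URL) per line, and the file names are hypothetical.

    ```python
    from pathlib import Path

    def merge_and_dedupe(parts, out_path):
        """Merge several autosaved result files into one, dropping duplicate lines.

        Preserves first-seen order and returns the number of unique entries written.
        """
        seen = set()
        merged = []
        for part in parts:
            for line in Path(part).read_text(encoding="utf-8").splitlines():
                line = line.strip()
                if line and line not in seen:   # skip blanks and duplicates
                    seen.add(line)
                    merged.append(line)
        Path(out_path).write_text("\n".join(merged) + "\n", encoding="utf-8")
        return len(merged)

    # Example (hypothetical file names):
    # merge_and_dedupe(["results_part1.txt", "results_part2.txt"], "results_all.txt")
    ```

    Using a set for the seen-check keeps the merge fast even when the combined result lists are large; for files too big to hold in memory, a sorted external merge would be the next step.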