
Program Error while not at 3.2 GB limit

I am running into this problem while scraping with multiple projects, even though memory usage is at less than 1 GB. It makes the program crash and messes up the projects, so once I get this error I have to revert to a backup.


Comments

  • I have noticed this sometimes when I keep messing with the proxy harvester while running the projects. Are you doing that?

    It may also be possible that this is happening because you are running concurrent projects with a high number of threads for scraping. Try scheduling them using the project scheduler.

    What's your memory running at with those 3736 threads?
  • dp001 said:
    I have noticed this sometimes when I keep messing with the proxy harvester while running the projects. Are you doing that?

    It may also be possible that this is happening because you are running concurrent projects with a high number of threads for scraping. Try scheduling them using the project scheduler.

    What's your memory running at with those 3736 threads?
    I'm using private rotating proxies, so it's not the harvester. The highest stable number of threads I can get is 2000, and at 2000 threads it uses only 750-800 MB of memory.