@Sven Queue:
I have this huge project and I'm at about 2.7 GB of RAM. I'm OK for now, but since the program is 32-bit I know I may run out as this project spiders and scrapes.
Any thoughts on the best way to handle this if the project bloats up to many more pages to scrape?
Comments
In my use case I figured out the answer, though.
Just save my results to disk, then clear them from memory (the result count was huge too) = massively decreased mem usage.
Then continue autosaving under a new file name.
When done I can merge the files and dedupe any duplicates.
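For anyone wanting to script the same workaround outside the tool, here's a minimal Python sketch of the save-then-clear approach. The file names and helper functions (`flush_results`, `merge_and_dedupe`, the `results_part` prefix) are my own illustration, not part of any particular program:

```python
import glob

def flush_results(results, part_index, prefix="results_part"):
    # Write the in-memory results to a numbered part file, then clear
    # the list so the memory can be reclaimed.
    path = f"{prefix}{part_index}.txt"
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(results))
    results.clear()
    return part_index + 1

def merge_and_dedupe(out_path="results_merged.txt", prefix="results_part"):
    # Merge all part files into one output file, dropping duplicate
    # lines while preserving first-seen order.
    seen = set()
    with open(out_path, "w", encoding="utf-8") as out:
        for path in sorted(glob.glob(f"{prefix}*.txt")):
            with open(path, encoding="utf-8") as f:
                for line in f:
                    url = line.strip()
                    if url and url not in seen:
                        seen.add(url)
                        out.write(url + "\n")
```

Call `flush_results` whenever the in-memory list gets large, and `merge_and_dedupe` once at the end; peak memory then stays roughly one chunk's worth plus the dedupe set.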