What Uses More CPU/RAM: Processing Raw Lists or Submitting/Verifying?
JudderMan
UK
Simple question, but one I'd like to hear the pros answer:
I'm moving some projects to a new server and proxies, as I want to localise them to the EU, but I wondered how hard SER runs when it's just processing lists (raw ones from Gscraper/Scrapebox/ScraperBandit) compared to posting to and verifying them. I've been using SER for a long time, but I'm not 100% sure of the answer.
Thanks in advance
Comments
So far I haven't seen SER use much RAM; it's CPU that it hits hard.
Scrapebox is the one that eats RAM, once the scraping finishes.
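If you'd rather measure it on the new server than guess, here's a minimal sketch using Python's psutil that logs SER's CPU and RAM. Run it once while SER is only processing a raw list, then again during a submit/verify run, and compare. The process name "GSA Search Engine Ranker.exe" is an assumption; check Task Manager for the exact name on your box.

```python
# Minimal resource monitor for a single process (assumes: pip install psutil).
import time
import psutil

TARGET = "GSA Search Engine Ranker.exe"  # assumed process name - verify in Task Manager

def find_process(name):
    """Return the first running process matching the given name, or None."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            return proc
    return None

proc = find_process(TARGET)
if proc is None:
    raise SystemExit(f"{TARGET} is not running")

proc.cpu_percent()  # prime the counter; the first reading is always 0.0
while True:
    time.sleep(5)  # sample every 5 seconds
    cpu = proc.cpu_percent()  # CPU % since the previous call
    ram_mb = proc.memory_info().rss / (1024 * 1024)  # resident memory in MB
    print(f"CPU: {cpu:5.1f}%  RAM: {ram_mb:7.1f} MB")
```

Leave it running through both phases and the difference should be obvious in the log.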
Ah, got it. Thanks