How to handle big files? How to filter them?
Hi guys, I have a small problem that xrumer / scrapebox / gscraper can't handle.
I have 2 domain lists:
List 1 = 20 million domains
List 2 = 10 million domains
I want to filter out of list 2 all domains that exist in list 1.
These lists are growing every day and I don't know how to handle even bigger lists in the future. Can someone help me?
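To be clear, the operation I need is basically a set difference between the two files. A rough Python sketch of it, assuming plain text files with one domain per line (the file names are just placeholders):

```python
# Rough sketch of the filtering I mean (not a finished tool).
# Assumes both lists are plain text, one domain per line.
# File names are placeholders.

def filter_lists(list1_path, list2_path, out_path):
    # Load the "remove these" domains from list 1 into a set for O(1) lookups.
    with open(list1_path, encoding="utf-8", errors="ignore") as f:
        blocklist = {line.strip().lower() for line in f if line.strip()}

    # Stream list 2 and keep only domains NOT present in list 1.
    with open(list2_path, encoding="utf-8", errors="ignore") as src, \
         open(out_path, "w", encoding="utf-8") as dst:
        for line in src:
            domain = line.strip().lower()
            if domain and domain not in blocklist:
                dst.write(domain + "\n")

if __name__ == "__main__":
    filter_lists("list1.txt", "list2.txt", "list2_filtered.txt")
```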
Comments
And this addon doesn't have that option. I need to filter domains, not remove dups.
And no, I can't combine them into one file, remove dups, and then split them back into two files.
I need something like that. Xrumer can handle it, but it takes a lot of time to load the files; I think there may be something faster.
However, there are some programs that do not use just RAM to store the temporary data, so even with little RAM you can still open files like these.
You have to understand the task that's being asked.
Delete from list B all domains that appear in list A.
That is not the same as "remove duplicate domains".
Yes, gscraper can open these files, but it doesn't have such an option.
Yes, this number of domains is not huge in general, but for this task it is.
They are also growing all the time.
Xrumer has such an option, but after 10 hours the task was only 3% complete, and Xrumer used only 600 MB of RAM, so stop telling me about RAM.
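One way to do this with almost no RAM at all is to sort both lists first (with any external sort) and then walk the two sorted files in parallel. A minimal Python sketch, assuming both files are already sorted the same way and hold one domain per line (file names are placeholders):

```python
# Constant-memory sketch: assumes both files are ALREADY sorted with the
# same collation, one domain per line. File names are placeholders.

def sorted_difference(sorted_list2, sorted_list1, out_path):
    """Write every domain in sorted_list2 that does not appear in sorted_list1."""
    with open(sorted_list2, encoding="utf-8") as f2, \
         open(sorted_list1, encoding="utf-8") as f1, \
         open(out_path, "w", encoding="utf-8") as out:
        block = f1.readline().strip()
        for line in f2:
            domain = line.strip()
            if not domain:
                continue
            # Advance the blocklist pointer until it catches up with `domain`.
            while block and block < domain:
                block = f1.readline().strip()
            if domain != block:
                out.write(domain + "\n")

if __name__ == "__main__":
    sorted_difference("list2.sorted.txt", "list1.sorted.txt", "list2_filtered.txt")
```

Since only one line from each file is held in memory at a time, the list sizes stop mattering; the cost moves to the one-time sort of each file.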