
10 million linkaroonies

I'm set up on a VPS with private proxies. I loaded/imported 10 million URLs, which took an hour. I click Check Websites, it takes a few hours to prepare the data and shows a loading bar, then crashes after about 10k URLs were checked/failed/filtered. I've retried 4 times today. I can't split it into 1-million-row files because editing a file that size requires Gigasheet, and that costs a lot of money. I also tried scraping from the Gigasheet file, but for some reason my scraper only gets 27 rows from each page of 100. So I wanted to check all of these in one go, then put the checked ones into a file, because Excel would accept one with 1 million rows.

Comments

  • Devender_Garg (India)
    What is the original file format you got?

    ScrapeBox has a built-in function to split text files, and it works well.
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE:---> asiavirtualsolutions
    nickman, try opening it with Notepad++.

  • Alisafeer460 (Hyderabad, Pakistan)
    You should split it into multiple files.
    Here is a free online tool:
    https://products.groupdocs.app/splitter/txt
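
    If you'd rather not upload 10 million URLs to an online tool, you can split the file locally on the VPS. Here is a minimal sketch in Python (the filename and chunk size are placeholders, adjust them to your file); it streams line by line, so memory use stays flat no matter how big the list is:

    ```python
    # Split a large URL list into chunks of at most CHUNK_SIZE lines each.
    # Output files are named <path>.part001, <path>.part002, and so on.
    CHUNK_SIZE = 1_000_000  # Excel's row limit is just over 1 million

    def split_file(path: str, chunk_size: int = CHUNK_SIZE) -> int:
        """Split the file at `path` into chunks; return the number of chunks written."""
        part = 0
        out = None
        with open(path, "r", encoding="utf-8", errors="ignore") as src:
            for i, line in enumerate(src):
                if i % chunk_size == 0:
                    # Close the previous chunk and start a new one.
                    if out is not None:
                        out.close()
                    part += 1
                    out = open(f"{path}.part{part:03d}", "w", encoding="utf-8")
                out.write(line)
        if out is not None:
            out.close()
        return part

    if __name__ == "__main__":
        # "urls.txt" is a placeholder for your exported URL list.
        print(split_file("urls.txt"))
    ```

    On a Linux VPS the same thing can be done with the standard coreutils one-liner `split -l 1000000 urls.txt urls_part_`, no script needed.
    
    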