Question about making my list myself
Hi,
Let's say I use lots of the footprints built into SER to scrape URLs with Scrapebox. Then I put all the URLs in a txt file and import them into SER. Will it recognize the engines by itself and post to them, or do I have to split my txt file by engine and import each file one after the other?
Thanks
Comments
Since GSA PI can handle up to 5,000 threads and is much more stable and faster than the Scrapebox link extractor addon, that gave me an idea: I could use it to extract sitemaps or internal URLs, then pull the external links from each page, process them, and keep repeating the loop until I end up with millions of URLs. Instead of using Scrapebox's deep crawl feature (which can get glitchy and slow, especially when running up to level 5 with only 500 threads), I could just use the Scrapebox Sitemap Extractor addon.
That way, for every site in the list that supports a sitemap, it would extract all internal URLs directly from the sitemap - much faster and more stable. Then, those internal URLs could be processed with GSA PI’s link extractor to pull out external links efficiently.
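For what it's worth, the loop described above (sitemap → internal URLs → external links, repeated) can be sketched in a few lines of Python. This is just an illustration of the logic, not anything from GSA PI or Scrapebox; the function names are my own, and actual fetching of the sitemap/page HTML is left out so only the parsing steps are shown:

```python
# Hypothetical sketch of the sitemap -> external-link loop.
# Fetching (e.g. via urllib or requests) is intentionally omitted;
# these functions only parse text you have already downloaded.
import re
from urllib.parse import urlparse

def sitemap_urls(xml_text):
    """Pull every <loc> entry (internal URL) out of a sitemap document."""
    return re.findall(r"<loc>\s*(.*?)\s*</loc>", xml_text)

def external_links(html_text, base_url):
    """Return href targets on a page that point off the base URL's domain."""
    base_host = urlparse(base_url).netloc
    links = re.findall(r'href=["\'](https?://[^"\']+)["\']', html_text)
    return [u for u in links if urlparse(u).netloc != base_host]

# Example with inline sample data instead of live fetches:
sitemap = (
    "<urlset>"
    "<url><loc>https://example.com/a</loc></url>"
    "<url><loc> https://example.com/b </loc></url>"
    "</urlset>"
)
page = (
    '<a href="https://example.com/about">internal</a>'
    '<a href="https://other.net/page">external</a>'
)
internal = sitemap_urls(sitemap)        # ['https://example.com/a', 'https://example.com/b']
found = external_links(page, "https://example.com/")  # ['https://other.net/page']
```

In the real workflow you'd fetch each internal URL, collect the external links, and feed those back in as the next round's seed list, repeating until the list is as big as you want.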