Where to Focus Scraping for Most Link Success
I'm using ScrapeBox to scrape URLs with the footprints that come with GSA, to take the load off GSA so it can focus only on posting. Most of what I want to use GSA for is tier 2, because I create my tier 1s on blog networks and other Web 2.0 platforms. So my goal with GSA is to send as much link spam as I can at those tier 1 properties.
Do you guys think it's necessary to set up a tier 3 with GSA? Since I'm trying to send as many links as possible at the tier 1s with tier 2, the tier 3 would probably end up being about the same number of links as the tier 2, so I don't know how useful that'd be.
Anyway, the main reason for this thread is to ask which CMSs I should focus my scraping on, again going for the highest number of links possible. I've gone through and singled out only the platforms that CB can handle, pulled the footprints for those platforms, and started to scrape and import, but it's a slow process with only 20 private proxies at 4 connections. It takes an entire day to scrape 15k keywords.
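For anyone following along, the usual way to turn footprints plus keywords into scrape queries is just the cross product of the two lists, which is also why keyword counts blow up scrape time so fast. A minimal sketch (the footprints and keywords shown are made-up examples, not from any specific platform file):

```python
# Build ScrapeBox-style search queries by pairing every footprint
# with every keyword. Example footprints/keywords are illustrative.
from itertools import product

def build_queries(footprints, keywords):
    """Return one 'footprint "keyword"' query per combination."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

footprints = ['"Powered by Dolphin"', 'inurl:"member/register"']
keywords = ["fitness", "weight loss"]

for q in build_queries(footprints, keywords):
    print(q)
```

Note the list grows multiplicatively: 100 footprints against 15k keywords is 1.5 million queries, which explains why a day of scraping only covers one platform group.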
Today I focused on social networks and only got around 25k non-duplicate domains out of 3 million URLs scraped. Then out of those 25k that I imported into GSA, I think a bit over 2k submitted successfully. I'd really like to get more bang for my time here, so I'm wondering where I should focus the majority of my scraping efforts.
Also, once I create my lists for each platform, is there an easy way to separate newly scraped links from the master lists, so I don't keep trying to import the same URLs again?
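One low-tech way to handle this outside of GSA (a sketch, not a built-in ScrapeBox or GSA feature; the file path is made up) is to keep one master file per platform and diff each new scrape against it before importing, appending only the survivors so the master stays current:

```python
# Keep only URLs not already in the per-platform master list,
# then append the new ones so the master list stays up to date.
# The file path is illustrative.

def filter_new_urls(master_path, scraped_urls):
    """Return scraped URLs not yet in master_path, and append them to it."""
    try:
        with open(master_path) as f:
            seen = {line.strip().lower() for line in f if line.strip()}
    except FileNotFoundError:
        seen = set()  # first run: no master list yet

    fresh = []
    for url in scraped_urls:
        key = url.strip().lower()
        if key and key not in seen:
            seen.add(key)          # also dedupes within this batch
            fresh.append(url.strip())

    with open(master_path, "a") as f:
        f.writelines(u + "\n" for u in fresh)
    return fresh

new = filter_new_urls("social_networks_master.txt",
                      ["http://example.com/profile/1",
                       "http://example.com/profile/1",
                       "http://example.com/profile/2"])
print(new)
```

You'd then import only the returned `fresh` list into GSA, instead of the whole scrape.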
Thanks for any help you guys might be able to provide.
Comments
I wouldn't bother with blog comments, the verified rate on those is too low.