Question for you: does scraping with non-English keywords only return non-English URLs, or can I also get a list of English URLs from non-English keywords?
@raycol What do you mean by non-English URLs? Scraping with foreign keywords gives you both completely foreign and English pages - some have partial references to other languages but are still essentially English, while others are entirely in a different language. You simply get a mixture of both, and the proportion varies depending on which language you use.
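If you only want the English pages out of a mixed harvest, one option is to post-filter the harvested list by detected page language. A minimal sketch, assuming Python with the requests and langdetect packages installed; the file names are placeholders, not anything Scrapebox produces:

import requests
from langdetect import detect

def looks_english(url, timeout=10):
    # Detection runs on the raw HTML, which is rough; stripping tags
    # first would improve accuracy.
    try:
        return detect(requests.get(url, timeout=timeout).text) == "en"
    except Exception:
        # Unreachable or undetectable pages are treated as non-English.
        return False

with open("harvested_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("english_urls.txt", "w") as out:
    out.write("\n".join(u for u in urls if looks_english(u)))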
All orders processed today. Thank you @Loopline for the amazing video! I'm sure it'll give others here who are new an excellent idea of what the list is about.
Thanks @FuryKyle, I hope it's of help to someone. The only problem I've run into with your keyword list is that it's so large I have to stop myself from merging in too many footprints. Having too many potential queries because of a great keyword list is a good problem to have, though!
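For anyone curious why a huge keyword list forces restraint with footprints: merging is a straight cross product, so query counts multiply fast. A minimal sketch of the merge step done outside Scrapebox, assuming Python; file names are placeholders:

from itertools import product

with open("footprints.txt") as f:
    footprints = [line.strip() for line in f if line.strip()]
with open("keywords.txt") as f:
    keywords = [line.strip() for line in f if line.strip()]

# Every footprint is paired with every keyword, so the query count is
# len(footprints) * len(keywords): 50 footprints against a 100,000-keyword
# list is already 5,000,000 queries.
queries = [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

with open("queries.txt", "w") as f:
    f.write("\n".join(queries))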
@loopline: Do you only run Scrapebox, or do you also use Gscraper? SB might be slow compared to Gscraper. Second, how many proxies do you use - semi-private, private, or port-scanned proxies?
@redfoxseo I only use SB. If you compare just Google in Gscraper to Google in SB, with the connections turned up etc., then Gscraper can scrape faster, for now. Scrapebox 2.0 screenshots show that it is faster than Gscraper, and it has well over 20 engines rather than just 1, so it could be exponentially faster.
Regardless, speed isn't what I am after. I can harvest millions of highly targeted URLs with private and shared proxies, and I also use services like Proxy Rack - they are slow, but they work well for advanced-operator queries. I don't have a time frame in mind when I start scraping; I am looking for targeted, quality results over speed.
SB is much more powerful taken as a whole, not just on the speed front, so I only use SB. I have maybe 150 or thereabouts semi-private and private proxies combined, from various providers. Truthfully, I don't even have all of them scraping 24/7.
Like I said, I put my focus on mastering footprint creation, and I am able to get what I want in a highly targeted way. So I do scrape constantly, but my goal is to return only what I want, with as few useless URLs mixed in as possible.
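To give a rough idea of what I mean by targeted footprints (these are generic illustrations, not my actual working footprints), Google's advanced operators let you pin the harvest to one platform type and one topic at a time:

site:.edu inurl:blog "post a comment" "your keyword"
"powered by wordpress" inurl:/blog/ "your keyword"
site:.gov intitle:forum "your keyword"

Each line restricts results by domain, URL pattern, or on-page text, which is why the harvest comes back mostly on-target instead of as a pile of junk URLs.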
@redfoxseo I completely agree with loopline. SB proves to be a much more stable and powerful scraper than Gscraper, and I find that most of my scrapes are very targeted and have a higher conversion rate when importing to GSA. My guide includes some tips on scraping, a step-by-step walkthrough, and recommendations on what proxies to use, so you're covered no matter what.
Bought yesterday. Understand how to use it... freaking awesome! I have Scrapebox so it makes it even easier.
QUESTION:
If I break these lists up into smaller files, 100 keywords per text file (a splitting sketch follows these questions), and then go use the EDU & GOV footprints..
Is the next step to go into GSA and just Import and Identify those sites that I scraped?
Do I set 100 results or 1000 results in the Search Engines & Proxies section?
No need to truncate the URLs to the root - just take one big list, then import and verify?
So those sites are now available in my current running project?
I assume I can do 1 small text file at a time, get the URLs and import. No need to create a huge list with multiple scrapes if I just want to attack this very slowly?
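For context, the splitting step itself is simple - here is a minimal Python sketch of it, assuming one master keyword file (the file names are just placeholders):

def split_keywords(path, chunk_size=100):
    with open(path) as f:
        keywords = [line.strip() for line in f if line.strip()]
    for i in range(0, len(keywords), chunk_size):
        # Write each 100-keyword chunk to its own numbered text file.
        name = f"keywords_{i // chunk_size + 1:03d}.txt"
        with open(name, "w") as out:
            out.write("\n".join(keywords[i:i + chunk_size]))

split_keywords("master_keywords.txt")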
Comments
I bought some time ago, so I'm not sure if I'm still eligible for updates.
Cheers
Thanks again.
You're welcome, glad you liked the Scrapebox channel.
-SuperSEO
List sent.
PM replied.
@Blandos
Order confirmed and processed. Happy scraping!
Order processed and list sent. Enjoy
No problem! Glad you like it.
@chopos
PM replied.