Diji1 has answered your question perfectly. Unfortunately, I believe it's against the forum rules to talk about other forums here, so I cannot do that. It's not too hard to find, though.
Thank you for the compliments. I'll keep it updated as often as I can. Happy scraping! I'm sure you'll find tons of untapped sites, especially with the other languages.
Yes, that is on my to-do list. However, not many platforms have a "foreign" version, so it may be quite limited. I'll make sure to get some out after I add a few more new languages (which are coming very soon).
I already paid but never received the keyword list. I've already PMed you with no answer, and even emailed you. Don't make me wait so long.
thomass
Hi there FuryKyle, I bought your list this morning and I'm still waiting to receive it.
FuryKyle
Thank you @Diji1 for helping out in my thread. I'll send you a bonus list in the next update.
I've sent the list out to all buyers now, each with their own unique identification code. Please check your junk mail if you have not received it.
DonAntonio
I think I'm going to take the plunge too
When scraping with gscraper, how many keywords and footprints do you use per session?
Should I scrape all the keywords with one group of footprints?
Yurium
edited October 2013
When scraping with gscraper, how many keywords and footprints do you use per session?
Should I scrape all the keywords with one group of footprints?
I have the same questions for scraping with Scrapebox.
gtsurfs
@furykyle - just purchased the lifetime package. Look forward to using it. Thanks!
Diji1
@FuryKyle: that would be great matey, looking forward to it!
Diji1
@Yurium: Using just the Articles footprints, SB crashed loading all the keywords (memory error).
What I did is split the articles footprints and (EN) keywords into 10 files, each the same number of entries.
Then I opened 10 instances of Scrapebox and loaded the keywords-1 list, added the footprints-1 list and scraped. Rinse and repeat for the other 9.
Whilst Scrapebox is a "Swiss Army Tool" for many things, it's crap with large lists and URL scraping. I'd recommend paying the money for GScraper - it does the same thing as above, only at speeds that have to be seen to be believed, and with much larger lists. It leaves Scrapebox in the dust as far as scraping URLs goes.
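The split-and-scrape prep Diji1 describes can be sketched in a few lines of Python. This assumes plain-text lists with one entry per line; the file names and the helper itself are illustrative, not anything from the thread:

```python
# Split a large keyword (or footprint) file into 10 roughly equal
# files named e.g. keywords-1.txt .. keywords-10.txt, one per
# Scrapebox instance. Names here are illustrative examples.
from pathlib import Path

def split_into_parts(src, prefix, parts=10):
    lines = Path(src).read_text(encoding="utf-8").splitlines()
    size = -(-len(lines) // parts)  # ceiling division: lines per part
    written = []
    for i in range(parts):
        chunk = lines[i * size:(i + 1) * size]
        if not chunk:
            break  # fewer lines than parts: stop early
        name = f"{prefix}-{i + 1}.txt"
        Path(name).write_text("\n".join(chunk) + "\n", encoding="utf-8")
        written.append(name)
    return written
```

Run it once for the keyword file and once for the footprint file, then load the matching numbered pair into each Scrapebox instance.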
OnkelMicha
OK, wondering what I keep doing wrong; my GScraper literally crashes every time I try to import your bigger keyword lists.
gtsurfs
I don't have Gscraper or Scrapebox. Will I be able to copy the keyword list into SER and just let it scrape targets?
AlexR Cape Town
@FuryKyle - Are the footprints the same as the ones in GSA SER or are there new ones?
FuryKyle
edited October 2013
I've just sent out the list to all buyers today.
@OnkelMicha Some of our lists have millions of keywords, so importing and scraping ALL of them in one go would cause certain programs to crash. Try splitting them up into lists of 10 and scraping them one by one, as Diji1 recommended. That would be a lot more efficient than scraping an entire list in one go.
@gtsurfs Yes, you can, but that would be a lot less efficient than using a scraper, as SER's main purpose is to create links, not scrape. I highly recommend getting a scraper to use alongside SER. If you know what you're doing, you can earn the money back easily. If you really are tight on budget, then I guess you could do that. Just load a list up when creating a project.
@AlexR My footprint list consists of the ones I've managed to collect over the years (my personal footprint list) as well as the ones in SER. There are a lot more footprints than the ones included in SER, such as those for edu/gov sites.
mrizalm
How do I split those keywords into smaller files? GSA recommends no more than 10 MB each.
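Nobody answers this in the thread, but one straightforward approach, assuming a plain-text keyword file (the 10 MB figure comes from the question above and isn't verified against GSA's documentation; file names are illustrative), is to cut on line boundaries whenever the next line would push a chunk past the byte limit:

```python
# Split a keyword file into chunks that each stay under max_bytes,
# never cutting a line in half. Defaults to a 10 MB limit.
from pathlib import Path

def split_by_size(src, prefix, max_bytes=10 * 1024 * 1024):
    part, size, idx, written = [], 0, 1, []

    def flush():
        nonlocal part, size, idx
        name = f"{prefix}-{idx}.txt"
        Path(name).write_text("".join(part), encoding="utf-8")
        written.append(name)
        part, size, idx = [], 0, idx + 1

    for line in Path(src).read_text(encoding="utf-8").splitlines(keepends=True):
        n = len(line.encode("utf-8"))  # size of this line in bytes
        if part and size + n > max_bytes:
            flush()  # current chunk would overflow: write it out
        part.append(line)
        size += n
    if part:
        flush()  # write the final partial chunk
    return written
```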
sweeppicker
@Cherub .... I imported some of the foreign-language KW lists into GScraper, and even after converting them to UTF-8 I am getting a lot of garbled characters. Any suggestions? Thx
joland
Still haven't received my list after 6 hours; is this normal? I know the message says within 24 hours, but I thought it was sent automatically.
For those who bought, how long did it take you to receive your list? Was it sent manually?
FuryKyle
@sweeppicker Why did you convert them to UTF-8? I've already encoded the non-ISO languages for scraping on Google, so you can load those up directly; they are equivalent versions. The reason you are seeing garbled characters is that your text editor does not have the proper language pack installed.
@joland I process all orders manually to prevent any leaking of the lists. Please be patient; as promised, it'll be sent out WITHIN 24 hours.
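For what it's worth, the kind of pre-encoding FuryKyle describes for non-ISO languages is most likely percent-encoded UTF-8, which is what Google query URLs carry anyway. That is an assumption on my part, and the Korean keyword below is my own example, not one from his list:

```python
# Percent-encode a non-ASCII keyword as UTF-8, the form Google
# accepts in query URLs. The keyword is an illustrative example.
from urllib.parse import quote, unquote

keyword = "검색 엔진"           # Korean for "search engine"
encoded = quote(keyword)        # percent-encode the UTF-8 bytes
print(encoded)                  # %EA%B2%80%EC%83%89%20%EC%97%94%EC%A7%84
print(unquote(encoded))         # decodes back to the original keyword
```

A percent-encoded list is plain ASCII, so it displays the same in any text editor regardless of which language packs are installed.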
mmtj
Do you provide an invoice for the purchase?
eagleflux
Already ordered. Hope you can process it as soon as possible, thank you.
I would like to see Korean KW list if possible.
An invoice? For what?
Just sent out the keyword lists to all buyers. Will add a Korean keywords list very soon.