
۞MOST COMPLETE GSA Keyword Scraping List۞ • 10+ Major Languages • CREATE Your Own GSA Site Lists◄


Comments

  • FuryKyle: Love the package. One request ... could you add some foreign-language footprints in the future if possible? Thanks a lot.
  • @Username
    Diji1 has perfectly answered your question :) Unfortunately I believe it's against the forum rules to talk about other forums here, so I cannot do that. It's not too hard to find, though.

    Thank you for the compliments :) Will keep it updated as often as I can. Happy scraping! I'm sure you'll get tons of untapped sites, especially with the other languages.

    Yes, that is on my to-do list. However, not many platforms have a "foreign" version, so coverage may be somewhat limited. I'll make sure to get some out after I add a few more new languages (which are coming very soon).


  • Purchased, please send :)
  • @arkant1
    Just sent it out. 
  • edited October 2013
    @FuryKyle I just purchased the lifetime package.

    I would like to see a Korean KW list if possible.
  • @greeny1232

    I'll process your order once I'm back at my workplace. Expect your list within 24 hours. Will work on a Korean KW list very soon.
  • From the 1 billion keyword list, how many are English only?
  • Purchased. Worth giving it a try, IMO.
  • Already paid, but I never received the keywords. I've already PMed you with no answer, and emailed you as well.
    Don't make me wait so long.
  • Hi there Furykyle,
    I bought your list this morning and am still waiting to receive it.
  • Thank you @Diji1 for helping out in my thread :) I'll send you a bonus list with the next update.
    I've sent the list out to all buyers now, each with their own unique identification code. Please check your junk mail if you have not received it.
  • I think I'm going to take the plunge too :)

    When scraping with GScraper, how many keywords and footprints do you use per session?
    Should I scrape all the keywords with one group of footprints?
  • edited October 2013
    When scraping with GScraper, how many keywords and footprints do you use per session?
    Should I scrape all the keywords with one group of footprints?
    I have the same questions for scraping with ScrapeBox.
  • @furykyle - just purchased the lifetime package. Look forward to using it. Thanks!
  • @FuryKyle: that would be great matey, looking forward to it!
  • @Yurium: Using just the Articles footprints, SB crashed loading all the keywords (memory error).

    What I did was split the Articles footprints and (EN) keywords into 10 files, each with the same number of entries (see the sketch at the end of this post).

    Then I opened 10 instances of ScrapeBox, loaded the keywords-1 list, added the footprints-1 list, and scraped. Rinse and repeat for the other nine.

    Whilst ScrapeBox is a "Swiss Army knife" for many things, it's crap with large lists and URL scraping. I'd recommend paying the money for GScraper: it does the same thing as above, only at speeds that have to be seen to be believed, and with much larger lists. It leaves ScrapeBox in the dust as far as scraping URLs goes.
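
    A minimal sketch of that split, assuming plain-text files with one keyword or footprint per line (the filenames are placeholders for whatever your lists are called):

    ```python
    # Split a large keyword or footprint file into N roughly equal parts so each
    # ScrapeBox/GScraper instance can load its own piece.
    from pathlib import Path

    def split_file(src, n_parts=10, encoding="utf-8"):
        lines = Path(src).read_text(encoding=encoding).splitlines()
        chunk = -(-len(lines) // n_parts)  # ceiling division so no line is dropped
        for i in range(n_parts):
            part = lines[i * chunk:(i + 1) * chunk]
            if part:
                Path(f"{src}.part{i + 1}.txt").write_text("\n".join(part), encoding=encoding)

    split_file("keywords-en.txt")          # -> keywords-en.txt.part1.txt ... .part10.txt
    split_file("footprints-articles.txt")  # same treatment for the footprint file
    ```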
  • OK, wondering WTF I keep doing wrong; my GScraper literally crashes every time I try to import your bigger keyword lists.
  • I don't have GScraper or ScrapeBox. Will I be able to copy the keyword list into SER and just let it scrape targets?
  • AlexR (Cape Town)
    @FuryKyle - Are the footprints the same as the ones in GSA SER or are there new ones?
  • edited October 2013
    I've just sent out the list to all buyers today.

    @OnkelMicha
    Some of our lists have millions of keywords, so importing and scraping ALL of them in one go will cause certain programs to crash. Try splitting them into 10 lists and scraping them one by one, as Diji1 recommended. That is far more efficient than scraping an entire list at once.

    @gtsurfs
    Yes, you can, but that would be much less efficient than using a dedicated scraper, since SER's main purpose is to create links, not to scrape. I highly recommend getting a scraper to use alongside SER; if you know what you're doing, you can earn the money back easily. If you really are tight on budget, then I guess you could do that. Just load a list up when creating a project.

    @AlexR
    My footprint list consists of the ones I've collected over the years (my personal footprint list) as well as the ones in SER. There are a lot more footprints than those included in SER, such as footprints for edu/gov sites.
  • How do I split those keywords into smaller files? GSA recommends no more than 10 MB per file.
  • Just bought, looking forward to it! Thx! :)
  • You can use this to split large files: http://sourceforge.net/projects/textwedge/
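
    If you'd rather script it than install another tool, a rough Python sketch that splits on a size budget instead of a fixed number of parts (the 10 MB cap is the GSA recommendation mentioned above; the filename is a placeholder):

    ```python
    # Split a keyword file into pieces of at most ~10 MB each, keeping whole lines intact.
    # Assumes UTF-8 text with one keyword per line.
    MAX_BYTES = 10 * 1024 * 1024

    def split_by_size(src, max_bytes=MAX_BYTES, encoding="utf-8"):
        part, size, buf = 1, 0, []
        with open(src, encoding=encoding) as f:
            for line in f:
                line_bytes = len(line.encode(encoding))
                if buf and size + line_bytes > max_bytes:
                    with open(f"{src}.{part:02d}.txt", "w", encoding=encoding) as out:
                        out.writelines(buf)
                    part, size, buf = part + 1, 0, []
                buf.append(line)
                size += line_bytes
        if buf:
            with open(f"{src}.{part:02d}.txt", "w", encoding=encoding) as out:
                out.writelines(buf)

    split_by_size("keywords.txt")  # -> keywords.txt.01.txt, keywords.txt.02.txt, ...
    ```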
  • @Cherub .... I imported some of the foreign-language KW lists into GScraper and even after converting them to UTF-8 I am getting a lot of garbled characters. Any suggestions? Thx
  • Still haven't received my list after 6 hours; is this normal? I know the message says within 24 hours, but I thought it was sent automatically.

    For those who bought, how long did it take you to receive your list? Was it sent manually?
  • @sweeppicker
    Why did you convert them to UTF-8? I've already encoded the non-ISO languages for scraping on Google, so you can load those files directly; they are equivalent versions :) The reason you are seeing garbled characters is that your text editor does not have the proper language pack installed (see the quick check at the end of this post).

    @joland
    I process all orders manually to prevent any leaking of the lists. Please be patient; as promised, it will be sent out WITHIN 24 hours :)
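
    As for the garbled characters: if you want to verify that a list is actually intact rather than just being rendered badly by your editor, here is a quick check, assuming the file is plain UTF-8 text (the filename is a placeholder):

    ```python
    # Decode the file as UTF-8 and print the first few keywords.
    # If this prints readable text, the file is fine and only the editor's display
    # (missing font or language pack) is the problem.
    with open("keywords-korean.txt", encoding="utf-8") as f:
        for _ in range(5):
            print(f.readline().rstrip())
    ```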
  • Do you provide an invoice for the purchase? 
  • Already ordered. Hope you can process it as soon as possible =)
    Thank you.
  • @mmtj
    An invoice? For what?

    Just sent out the keyword lists to all buyers. Will add a Korean keywords list very soon.