  • FuryKyle. Love the package. One request ... could you add some foreign language footprints if possible in the future? Thanks a lot.
  • @Username
    Diji1 has perfectly answered your question :) Unfortunately I believe it's against the forum rules to talk about other forums here, so I cannot do that. It's not too hard to find, though.

    Thank you for the compliments :) Will keep it updated as often as I can. Happy scraping! I'm sure you'll get tons of untapped sites, especially with the other languages.

    Yes, that is on my to-do list. However, not many platforms have a "foreign" version, so the selection may be somewhat limited. I'll make sure to get some out after I add a few more new languages (which are coming very soon).


  • Purchased, please send :)
  • @arkant1
    Just sent it out. 
  • edited October 2013
    FuryKyle I just purchased lifetime.

    I would like to see Korean KW list if possible.
  • @greeny1232

    I'll process your order once I'm back at my workplace. Expect your list within 24 hours. Will work on a Korean KW list very soon. 
  • From the 1 billion list, how many are English only?
  • purchased. Worth giving a try imo
  • Already paid, but never received the keywords. I've already PMed you with no answer, and even emailed you.
    Don't make me wait so long..
  • Hi there Furykyle,
    Bought your list this morning, still waiting to receive it?
  • Thank you @Diji1 for helping out at my thread :) I'll send you a bonus list on the next update. 
    I've sent the list out to all buyers now, each with their own unique identification code. Please check your junk mail if you have not received it.
  • I think I'm going to take the plunge too :)

    When scraping with GScraper, how many keywords and footprints do you use per session?
    Should I scrape all the keywords with one group of footprints?
  • edited October 2013
    When scraping with GScraper, how many keywords and footprints do you use per session?
    Should I scrape all the keywords with one group of footprints?
    I have the same questions for scraping with Scrapebox.
  • @furykyle - just purchased the lifetime package. Look forward to using it. Thanks!
  • @FuryKyle: that would be great matey, looking forward to it!
  • @Yurium: Just using the article footprints, SB crashed loading all the keywords (memory error).

    What I did was split the article footprints and (EN) keywords into 10 files, each with the same number of entries.

    Then I opened 10 instances of Scrapebox and loaded the keywords-1 list, added the footprints-1 list and scraped.  Rinse and repeat for the other 9.

    Whilst Scrapebox is a "Swiss Army Tool" for many things, it's crap with large lists and URL scraping. I'd recommend paying the money for GScraper - it does the same thing as above, only at speeds that have to be seen to be believed, and using much larger lists. It leaves Scrapebox in the dust as far as scraping URLs goes.
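    The split-and-run workflow described above could be sketched like this (a rough illustration only; file names such as `keywords-1.txt` are placeholders, not anything from the package):

    ```python
    import math

    def split_into_parts(src, parts=10, prefix="keywords"):
        """Split one large keyword file into `parts` files of roughly equal length."""
        with open(src, encoding="utf-8") as f:
            lines = [ln for ln in f if ln.strip()]
        chunk_size = math.ceil(len(lines) / parts)
        out_names = []
        for i in range(parts):
            chunk = lines[i * chunk_size:(i + 1) * chunk_size]
            if not chunk:  # source had fewer lines than requested parts
                break
            name = f"{prefix}-{i + 1}.txt"
            with open(name, "w", encoding="utf-8") as out:
                out.writelines(chunk)
            out_names.append(name)
        return out_names
    ```

    Each output file can then be loaded into its own Scrapebox instance along with the matching footprint chunk, as described above.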
  • OK, wondering what I keep doing wrong: my GScraper literally crashes every time I try to import your bigger keyword lists.
  • I don't have Gscraper or Scrapebox. Will I be able to copy the keyword list into SER and just let it scrape targets?
  • AlexR Cape Town
    @FuryKyle - Are the footprints the same as the ones in GSA SER or are there new ones?
  • edited October 2013
    I've just sent out the list to all buyers today.

    @OnkelMicha
    Some of our lists have millions of keywords, so importing and scraping ALL of them in one go would cause certain programs to crash. Try splitting them up into 10 smaller lists and scraping them one by one, as Diji1 recommended. That would be a lot more efficient than scraping an entire list in one go.

    @gtsurfs
    Yes, you can, but that would be a lot less efficient than using a scraper, as SER's main purpose is to create links, not scrape. I highly recommend getting a scraper to use along with SER. If you know what you're doing, you can earn the money back easily. If you really are tight on budget, then I guess you could do that. Just load a list up when creating a project.

    @AlexR
    My footprint list consists of the ones I've managed to collect over the years (my personal footprint list) as well as the ones in SER. There are a lot more footprints than the ones included in SER, such as those for edu/gov sites.
  • How do I split those keywords into smaller files? GSA recommends no more than 10MB.
  • Just bought, looking forward to it!  thx! :)
  • cherub SERnuke.com
    You can use this to split large files: http://sourceforge.net/projects/textwedge/
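    For the 10MB limit mentioned above, a size-based split (rather than a fixed number of parts) does the trick. This is just a sketch; the default threshold and output names are assumptions:

    ```python
    def split_by_size(src, max_bytes=10 * 1024 * 1024, prefix="part"):
        """Write `src` out as sequential files of at most ~max_bytes each,
        never breaking in the middle of a line."""
        names, out, written, idx = [], None, 0, 0
        with open(src, "rb") as f:
            for line in f:
                if out is None or written + len(line) > max_bytes:
                    if out:
                        out.close()
                    idx += 1
                    names.append(f"{prefix}-{idx}.txt")
                    out = open(names[-1], "wb")
                    written = 0
                out.write(line)
                written += len(line)
        if out:
            out.close()
        return names
    ```

    Working in binary mode copies each line byte-for-byte, which keeps multi-byte (non-English) keywords intact regardless of encoding.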
  • @Cherub .... I imported some of the foreign language KW lists into GScraper and even after converting them to UTF-8 I am getting a lot of garbled characters. Any suggestions? Thx
  • Still haven't received my list after 6 hours, is this normal? I know the message says within 24 hours, but I thought it was sent out automatically.  

    For those who bought, how long did it take you to receive your list?  Was it a manual sending?
  • @sweeppicker
    Why did you convert them to UTF-8? I've already encoded the non-ISO languages for scraping on Google, so you can just load those up directly; they are equivalent versions :) The reason you are seeing garbled characters is that your text editor does not have the proper language pack installed.

    @joland
    I process all orders manually to prevent any leaking of the lists. Please be patient, as promised it'll be sent out WITHIN 24 hours :)
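    On the UTF-8 / garbled-characters point: the "encoded for Google" versions presumably contain percent-encoded UTF-8, which is the form keywords take in a Google query URL. A small illustration (the Korean keyword is invented, and `looks_like_utf8` is a hypothetical helper for sanity-checking a downloaded file, not part of the package):

    ```python
    from urllib.parse import quote

    def looks_like_utf8(path):
        """Return True if the file decodes cleanly as UTF-8."""
        try:
            with open(path, encoding="utf-8") as f:
                f.read()
            return True
        except UnicodeDecodeError:
            return False

    # Percent-encoding a Korean keyword the way it would appear in a query URL:
    encoded = quote("키워드")
    print(encoded)  # the keyword's UTF-8 bytes, percent-encoded
    ```

    If a file passes the UTF-8 check but still looks garbled on screen, the bytes are fine and the display font or editor codec is the culprit, as explained above.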
  • Do you provide an invoice for the purchase? 
  • Already ordered.. hope you can process it as soon as possible =)
    thank you.
  • @mmtj
    An invoice? For what?

    Just sent out the keyword lists to all buyers. Will add a Korean keywords list very soon.
  • I didn't get the keyword list yet. This is my payment ID: 2ST28203YE687260V
  • gooner SERLists.com
    just ordered, looking forward to seeing the list. thanks
  • @FuryKyle - I thought I'd post this question here so that others who are using GSA SER as their scraper could benefit. I purchased the lifetime package and I'm looking at the keyword list folder that has 10 text files containing keywords in different languages. How many keywords are in each language file and how do you recommend splitting these text files into groups of 100K keywords?

    Sorry for the noob questions. Thanks for your help!
  • FuryKyle ordered yesterday, haven't received it yet.
  • londonseo London, UK
    I just ordered - Your transaction ID for this payment is: 1J063044YP810703J.
  • OK, so splitting it up makes sense for more than just the memory reason.

    One thing I'm not sure you answered: if I don't have the language packs installed on my computer, can it not handle them?

    I tried to load some of the Asian stuff and it freezes as well.


  • I've just processed all orders and sent out your lists. Please check your inbox.

    @Eagleflux
    Just sent it.

    @RayBan
    According to my records, the list was sent to the email address you provided in the form. Could you check your junk mail and let me know? I'll have it resent if you still can't find it.

    @gtsurfs
    You can load them up into the KW section of GSA to find out how many keywords there are for any list you choose. I personally use SB to split up lists, but you can use free programs such as the one already mentioned on one of the previous pages. Here's the URL - http://sourceforge.net/projects/textwedge/

    @OnkelMicha
    Most text editors can handle them by default, so there's no need to install or add anything extra. It won't freeze if you don't have the pack installed; you'll simply get a bunch of garbled text with squares everywhere. Use the encoded versions to scrape on Google for those non-ISO encoded languages. That's why they're there :)
  • Sent out all lists today!
  • Everything is sorted; chopped up all the lists and got them cataloged and running now. Thanks a bunch.
  • Just made a payment. Waiting for the list.
  • I just ordered - transaction ID: 5EM09492B9767874F
  • Ordered, just to see if there are some nice footprints to add to our collection. Also, it would be cool if you could send us an invoice for the purchase by email (for accounting purposes).
  • Sent you a pm.
  • Processed all orders.
  • I've just placed an order. Order number: 4601-4036-3978-5154, email: urenvivx@outlook.com. Happy to receive your KW lists. Thanks much.
  • Have just ordered. Let the scraping begin!
  • Just sent out the lists to you guys. Update coming soon.
  • Just placed order. Transaction ID: 5XX85643A01387027
  • Vijayaraj 
    Just processed your order.
  • Ordered.
    Please send ******olt@gmail.com
  • All orders processed.
  • Can someone post instructions on how to use the pack with Scrapebox? Thanks
  • edited October 2013
    andy1024

    It's pretty much plug and play. Merge a keyword language list with footprints and you're good to go.
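    "Merge" here just means pairing every footprint with every keyword to build the scraper's query list, presumably what the built-in merge option in Scrapebox/GScraper does. A minimal sketch (the example footprints and keywords are invented):

    ```python
    from itertools import product

    def merge_queries(footprints, keywords):
        """Combine each footprint with each keyword into one search query."""
        return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

    queries = merge_queries(['"powered by wordpress"', "inurl:wiki"],
                            ["seo tips", "link building"])
    for q in queries:
        print(q)
    ```

    With N footprints and M keywords this yields N x M queries, which is why even a modest footprint file blows up quickly against a large keyword list and benefits from the splitting discussed earlier in the thread.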
  • Need help: my Scrapebox crashes every time I try to import the English keywords and wiki footprints.
  • just purchased
  • For the keyword list, should I use single keywords to scrape more long-tail keywords and then load the footprints, or just use the keywords from the list only?
  • I tried to buy the list, but when I logged in to PayPal I got this message: "We are unable to validate your information. Please try again."

    Let me know how we can arrange the payment
  • @andy1024
    There's absolutely no need to scrape for more keywords. There's almost a billion in my list; trust me, it's way more than enough. Just merge the platform footprints with the language keyword lists and you're good to go.

    @DonAntonio
    I'm not sure why that happened, but I've dropped you a PM.
  • purchased
  • Just bought 