I didn't get the keyword list yet. This is my payment ID: 2ST28203YE687260V
gooner SERLists.com
just ordered, looking forward to seeing the list. thanks
gtsurfs
@FuryKyle - I thought I'd post this question here so that others who are using GSA SER as their scraper could benefit. I purchased the lifetime package and I'm looking at the keyword list folder that has 10 text files containing keywords in different languages. How many keywords are in each language file and how do you recommend splitting these text files into groups of 100K keywords?
Sorry for the noob questions. Thanks for your help!
RayBan
@FuryKyle Ordered yesterday, haven't received it yet.
londonseo London, UK
I just ordered - Your transaction ID for this payment is: 1J063044YP810703J.
OnkelMicha
Ok, so splitting it up makes sense for more than just the memory reason.
One thing I'm not sure you answered: if I don't have the language packs installed on my computer, can it not handle them?
I tried to load some of the Asian stuff and it freezes as well.
FuryKyle
I've just processed all orders and sent out your lists. Please check your inbox.
@RayBan According to my records, the list was sent to the email address you provided in the form. Could you check your junk mail and let me know? I'll have it resent if you still can't find it.
@gtsurfs You can load them up to the KW section of GSA to find out how many keywords are there for any list you choose. I personally use SB to split up lists, but you can use free programs out there such as the one already mentioned in one of the pages before this. Here's the URL - http://sourceforge.net/projects/textwedge/
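For anyone who'd rather script the splitting step than use SB or TextWedge, a minimal sketch in Python (the filename is a placeholder; adjust the chunk size to taste):

```python
CHUNK_SIZE = 100_000  # lines per output file, per the 100K groups mentioned above

def split_file(path, chunk_size=CHUNK_SIZE):
    """Split a keyword file into path.part001.txt, path.part002.txt, ...
    of at most chunk_size lines each. Returns the chunk filenames written."""
    parts, lines = [], []
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            lines.append(line)
            if len(lines) == chunk_size:
                parts.append(_write_chunk(path, len(parts) + 1, lines))
                lines = []
    if lines:  # flush the final, possibly short, chunk
        parts.append(_write_chunk(path, len(parts) + 1, lines))
    return parts

def _write_chunk(path, n, lines):
    out = f"{path}.part{n:03d}.txt"
    with open(out, "w", encoding="utf-8") as f:
        f.writelines(lines)
    return out
```

Streaming line by line keeps memory flat, so it handles the big language files without choking.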
@OnkelMicha Most text editors can handle them by default, so there's no need to install or add anything extra. It won't freeze if you don't have the pack installed; you simply get a bunch of garbled text with squares everywhere. Use the encoded versions to scrape on Google for those non-ISO-encoded languages. That's why they're there.
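Assuming the "encoded versions" are percent-encoded keywords (the thread doesn't spell this out), the same transformation is a one-liner in Python, and the result is plain ASCII that any scraper can put in a query URL:

```python
from urllib.parse import quote

def encoded_query(keyword):
    """Percent-encode a keyword so the search URL is plain ASCII,
    regardless of which language pack (if any) is installed."""
    return "https://www.google.com/search?q=" + quote(keyword)

# A Japanese keyword becomes an ASCII-safe query string:
encoded_query("日本語")
```

The squares you see in an editor are only a display problem; the underlying bytes are fine, which is why the encoded form still scrapes correctly.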
FuryKyle
Sent out all lists today!
OnkelMicha
Everything is sorted; chopped up all the lists and got it all cataloged and running now. Thanks a bunch.
Liolik
Just made a payment. Waiting for the list.
vifa
I just ordered - transaction ID: 5EM09492B9767874F
mmtj
Ordered, just to see if there are some nice footprints to add to our collection. Also, it would be cool if you could shoot us an invoice for the purchase to our mail (for accounting purposes)?
mmtj
Sent you a pm.
FuryKyle
Processed all orders.
urenvivx
I've just placed an order. Order number: 4601-4036-3978-5154, email: urenvivx@outlook.com. Happy to receive your keyword lists.
Thanks much!
mmark
Have just ordered. Let the scraping begin!
FuryKyle
Just sent out the lists to you guys. Update coming soon.
It's pretty much plug and play. Merge a keyword language list with footprints and you're good to go.
andy1024
Need help: my Scrapebox crashes every time I try to import the English keyword list and the wiki footprints.
eliquid
just purchased
andy1024
For the keyword list, should I use single keywords to scrape more long-tail keywords and then load the footprints, or just use the keywords from the list as they are?
DonAntonio
I tried to buy the list, but when I logged in to PayPal I got this message: "We are unable to validate your information. Please try again."
Let me know how we can arrange the payment
FuryKyle
@andy1024 There's absolutely no need to scrape for more footprints. There's almost a billion in my list, trust me, it's way more than enough. Just merge the platform footprints with the language ones and you're good to go.
@DonAntonio I'm not sure why that happened, but I've dropped you a PM.
@Eagleflux Just sent it.
Please send ******olt@gmail.com