  • Ordered your list - looking forward to receiving it :)
  • Ordered your list. Waiting to check it out. 
  • edited December 2013
    All orders processed. 2 new language keyword lists will be added today :)
  • Terrific news @furykyle! What languages?
  • Korean and Swedish. I'm afraid I can only send it out in 2 days as I've got something else planned.
  • Hello, I just purchased and am waiting for the list, thanks. My transaction ID: 0L529790UE130023Y
  • Interested. I hope we existing subscribers won't need to do anything and the update will be mailed to us automatically?
  • Need the Korean list ASAP :)
  • Sorry if I'm sending you messages everywhere, but it's been more than 48 hours and I haven't received my list. Can I have it, please?
  • Hey there! I ordered your list. Waiting to check it out.

    Thanks!
  • Yeah, already paid. My transaction ID: 9JD17620073619110
  • Processed all orders today. Update sent out as well. I've updated the keyword list with 2 new language lists - Korean and Swedish. I will try to expand the Korean list in the future as Korean keywords are pretty hard to scrape and compile. The current Swedish keyword list contains more than half a million entries which should last you quite a while. More language lists coming up next week.
  • Purchased the lifetime one :D

    I can't wait to see the list :)
  • Also, pay attention to the footprints; they are just great. Thanks for the list!
  • I still didn't get the list. How did you get yours, @dimcik?
  • Okay, processed all orders today. New lists coming out next week.
  • @kpexoro you should wait a while. I waited 48 hours, but now I have the list! A few hours don't change anything; what's important is receiving good keywords.

    @furykyle can I ask if you're planning to add Italian keywords as well? That would be just great, thanks.
  • Hi, please send me your list; paid yesterday... can't wait ;)
  • edited December 2013
    Processed all orders today. @dimcik Yeah sure, it'll be on my to-do list. Here's an example of my scraping stats using my list; this barely scratches the surface. [image]
  • @FuryKyle: Holy balls! Those stats are impressive, though so far I've only scraped with around 50k keywords, not 2.5 million. I should try that next time and see how many results I get. How many years does it take you to import and identify those 20 million (after dupe cleaning) into GSA?

    Last thing: is that keyword/footprint list the total number of footprints you have (2.4 mil), or do you have more in your list and this was just part of it?
  • @FuryKyle You did that just with 1 proxy? :O

    Impressed lol.
  • @johnmiller After dupe cleaning, this list will probably get me a few hundred thousand unique domains. With my current setup it should take a few days for it to be fully imported into GSA. And no, the list I used was only a small fraction of the total number of lists available. You can go on scraping indefinitely and get every single potential site out there for every platform. I scrape occasionally for fresh sites. Here's an updated stat of my scraping progress, using only one virtual machine. [image] Also, all orders processed today.
  • Just purchased your list minutes ago!
  • I bought the list 11 days ago and have yet to see it, which is pretty disappointing. I paid from the same email address as my PayPal, so there is no excuse for why I haven't received it.
  • @simon69r I sent your list on the 8th to the email address you provided. I'm sure if you search your junk mail you'll find it.
  • Don't forget me; I'm still waiting.
  • All orders processed.
  • I can assure you that I haven't received it; it is not in my junk folder, which I check daily, nor in my inbox.
  • Can you send it again, please, as I have not received it?
  • @simon69r I've just resent the mail. Let me know if you've received it this time.
  • Thanks mate, got it this time.
  • mate, payment sent!
  • Hello, it's been 24 hours; any update? I'm still waiting.
  • All orders processed today.
  • edited December 2013
    [image] Just a list I imported into SER today: an extra 600k domains for me to use. All unique domains, as I removed duplicates before feeding it in.
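The "removed duplicates before feeding it in" step above can be sketched in a few lines. This is a hypothetical illustration, not FuryKyle's actual tooling; the URLs are made-up samples:

```python
# Reduce a scraped URL list to unique domains before importing into SER.
# URLs here are illustrative samples only.
from urllib.parse import urlparse

def unique_domains(urls):
    """Return unique hosts from scraped URLs, preserving first-seen order."""
    seen = set()
    out = []
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]  # treat www.example.com and example.com as one site
        if host and host not in seen:
            seen.add(host)
            out.append(host)
    return out

scraped = [
    "http://example.com/page1",
    "https://www.example.com/page2",
    "http://other.org/post",
]
print(unique_domains(scraped))  # ['example.com', 'other.org']
```

Deduping at the domain level (rather than full URL) is what turns millions of harvested results into a few hundred thousand unique targets.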
  • FuryKyle

    Regarding your Dec 18th post: how are you using 1 proxy? There is no way you have been scraping that much with 1 proxy??
  • Can somebody explain to me what the benefits of this are?
  • edited December 2013
    "Какво е Pligg?"
    "Què és Pligg?"
    "Pligg 是什么?"
    "Pligg 是什麼?"
    "Co je to Pligg?"
    "Hvad er Pligg?"
    "Wat is Pligg?"
    "Mis on Pligg?"
    "Ce qui est Pligg ?"
    "Mikä on StumbleUpon?"
    "Was ist Pligg?"
    "Τι είναι Pligg;"
    "Ki sa ki Pligg?"
    "מה Pligg?"
    "Pligg ?????? ????"
    "Mi az a Pligg?"
    "Apa itu Pligg?"
    "Che cosa è Pligg?"
    "Pligg は何ですか?"
    "Pligg 무엇입니까?"
    "Kas ir Pligg?"
    "Kas yra Pligg?"
    "Hva er Pligg?"
    "á? ?ÓÊ¿"
    "Co to jest Pligg?"
    "O que é o Pligg?"
    "Ce este Pligg?"
    "Что такое Pligg?"
    "Co je Pligg?"
    "Kaj je Pligg?"
    "¿Qué es Pligg?"
    "Vad är Pligg?"
    "Pligg คืออะไร"
    "Pligg nedir?"
    "Що таке Pligg?"
    "Pligg là gì?"


    Still planning to do this for all platforms? We really, really need this... more important than other keywords.
  • edited December 2013
  • @eLeSlash I'll try to get that up ASAP. Major updates are coming very soon; I'm working on them as we speak. I see competitors stealing my lists and reselling them as part of their services, which is uncool.
  • Hi,
    please send me your list; paid yesterday

    ;)
  • All lists sent today. Happy new year people!
  • Waiting for the list. Already paid!
  • waiting as well - paid!
  • paid today.
  • Hello .. paid today ^__^
  • I paid for the updated version on 01.01. Today is 03.01.

    So far I haven't received anything, and no reply to the extra email I sent to FuryKyle.

    It's not very nice...

  • Just processed all orders today. Sorry about the slight delay; I was extremely busy during New Year, and I'm preparing a major update for the list as well as additional materials.
  • Just got my package. Thanks, FuryKyle.
  • I paid for lifetime... please send my list. Thanks.
  • Purchased yesterday, just got around to posting now.
  • So you take this list and put it where it says keywords when making a campaign... and because the list is so extensive, GSA will be scraping a gazillion sites... which will find you lots of new places to post to.

    Correct?

    And if this is true, why not just buy a list that has verified URLs and links?

    What are the benefits of buying a big list of keywords?

    (I'm just trying to figure out the details :) )
  • Processed all orders. @goldstenag1 Yes, that is the basic idea and the very least you can do with it. But if you know what you're doing, you can do much more than that. Buying these lists ensures that you don't have to continually buy updated lists, as a large proportion of sites die daily these days and new domains get put up. Constant scraping with the list ensures that your list always stays fresh, up to date, and maxed out.
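The scraping workflow described above boils down to pairing every platform footprint with every keyword to build search queries. A minimal sketch, with made-up sample footprints and keywords (not from the actual list):

```python
# Pair each platform footprint with each keyword to form search queries;
# the footprints and keywords below are illustrative samples only.
footprints = ['"Powered by Pligg"', '"What is Pligg?"']
keywords = ["fitness", "travel"]

queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
for q in queries:
    print(q)
# 2 footprints x 2 keywords -> 4 distinct queries
```

This is why a bigger keyword list keeps paying off: the query count is footprints times keywords, so every added keyword multiplies across every platform footprint.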
  • I am still lost. I mean, with Scrapebox, I just tried to import English keywords, which went fine. But when I tried to import the default footprints of SER's contextual engines, Scrapebox crashed. How come you don't experience crashes with so many keywords? What am I doing wrong? @gooner what do you do to prevent this? I'd like to hear from you as well, @FuryKyle.

    Thank you.
  • molchomolcho Germany
    edited January 2014
    How many keywords do you put into your Scrapebox, and how many footprints?
    If you put in 500 footprints and 10,000 keywords, you have 5,000,000 searches, which Scrapebox cannot handle.
    Even at no more than 10 results per search, that's 50 million results.
    Scrapebox has problems with anything over 1 million, and the same applies to the search results: no more than 1 million.

    And I think Fury's list is not bigger because of this restriction in Scrapebox.
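The numbers in the comment above are simple multiplication; a quick check (the 10-results-per-search figure is molcho's low-end estimate):

```python
# Verify the footprint x keyword combinatorics from the comment above.
footprints = 500
keywords = 10_000
results_per_search = 10  # low-end estimate from the comment

searches = footprints * keywords
print(searches)                       # 5000000 searches
print(searches * results_per_search)  # 50000000 results
```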

  • edited January 2014
    @pratik My SB can handle large numbers of keywords without crashing. What computer are you running it on? You can try splitting the lists with a program like GSplit and then scraping in batches. Or just don't import ALL of the default footprints; import on a per-engine basis.

    @molcho Do you know what you are talking about? SB handles 5m keywords no problem. And the limit SB can show may be 1 million, but the rest gets saved in your harvester file - your full scraping data is there.
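The batch-splitting workaround suggested above (GSplit does this at the file level) can be sketched in a few lines; the batch size here is arbitrary, and the keywords are placeholders:

```python
# Split a large keyword list into fixed-size batches so each scraping run
# stays under the tool's practical limits. Batch size is arbitrary here.
def split_batches(items, batch_size):
    """Chop a list into consecutive chunks of at most batch_size items."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

keywords = [f"keyword{i}" for i in range(10)]
batches = split_batches(keywords, 4)
print([len(b) for b in batches])  # [4, 4, 2]
```

Each batch can then be fed to a separate scraping session and the harvested results merged and deduped afterwards.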
  • @fury you are right; I just tested it out, and only 1 million showed in the display at the top. Sorry, my mistake.
  • I stopped using SB a long time ago, so I made a foolish comment lol. I forgot that the footprint-keyword combinations are what SB counts as the keyword count. And I was importing 300K keywords + 100+ footprints, so it was definitely going to crash haha. :D

    @FuryKyle Interesting. However, I'm not sure what that 1 million thing is. Could you explain? Does it only display a 1 mil URL list in the harvester area, with the rest saved to harvester sessions, which can be split and imported one by one to perform operations on? And yes, I do use GSplit; life is not possible without it lol.
  • Purchased lifetime version, looking forward to it.
  • All orders processed today.