
۞MOST COMPLETE GSA Keyword Scraping List۞ • 10+ Major Languages • CREATE Your Own GSA Site Lists◄


Comments

  • @simon69r I've just resent the mail. Let me know if you've received it this time.
  • Thanks mate, got it this time.
  • mate, payment sent!
  • Hello, it's been 24 hours now. Any update? I'm still waiting.
  • All orders processed today.
  • edited December 2013
    Just a list I imported into SER today: an extra 600k domains for me to use. All unique domains, as I removed the duplicates before feeding it in.
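
    For anyone who wants to do the same dedupe outside of SER, here is a minimal Python sketch (the file names are placeholders; it assumes one full URL per line and keeps the first URL seen for each domain):

        from urllib.parse import urlparse

        seen, unique = set(), []
        with open("scraped_urls.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                if "://" not in url:            # tolerate bare "example.com/..." lines
                    url = "http://" + url
                host = urlparse(url).netloc.lower()
                if host.startswith("www."):     # treat www.example.com == example.com
                    host = host[4:]
                if host and host not in seen:   # keep the first URL seen per domain
                    seen.add(host)
                    unique.append(url)

        with open("unique_domains.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(unique) + "\n")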
  • @FuryKyle, regarding your Dec 18th post: how are you using 1 proxy? There is no way you have been scraping that much with 1 proxy.
  • Can somebody explain to me what the benefits of this are?
  • edited December 2013
    "Êàêâî å Pligg?"
    "Què és Pligg?"
    "Pligg ¥Y?H"
    "Pligg ¥Y›õH"
    "Co je to Pligg?"
    "Hvad er Pligg?"
    "Wat is Pligg?"
    "Mis on Pligg?"
    "Ce qui est Pligg ?"
    "Mikä on StumbleUpon?"
    "Was ist Pligg?"
    "Ôé åßíáé Pligg;"
    "Ki sa ki Pligg?"
    "îä Pligg?"
    "Pligg ?????? ????"
    "Mi az a Pligg?"
    "Apa itu Pligg?"
    "Che cosa è Pligg?"
    "Che cosa è Pligg?"
    "Pligg ‚͉½‚Å‚·‚©H"
    "Pligg ¹«¾ùÀԴϱî?"
    "Kas ir Pligg?"
    "Kas yra Pligg?"
    "Hva er Pligg?"
    "á? ?ÓÊ¿"
    "Co to jest Pligg?"
    "O que é o Pligg?"
    "Ce este Pligg?"
    "×òî òàêîå Pligg?"
    "Co je Pligg?"
    "Kaj je Pligg?"
    "¿Qué es Pligg?"
    "Vad är Pligg?"
    "Pligg ¤×ÍÍÐäÃ"
    "Pligg nedir?"
    "Ùî òàêå Pligg?"
    "Ùî òàêå Pligg?"
    "Pligg là gì?"


    Are you still planning to do this for all platforms? We really need this; it's more important than the other keywords.
  • edited December 2013
    @eLeSlash I'll try to get that up ASAP. Major updates are coming very soon; I'm working on them as we speak. I also see competitors stealing my lists and reselling them as part of their services, which is uncool.
  • Hi,
    Please send me your list - I paid yesterday.

    ;)
  • All lists sent today. Happy New Year, people!
  • Waiting for the list. Already paid!
  • waiting as well - paid!
  • paid today.
  • Hello .. paid today ^__^
  • I paid for the updated version on 01.01. Today it is 03.01.

    So far I have received nothing, and no reply to the extra email I sent to FuryKyle.

    It's not very nice...

  • Just processed all orders today. Sorry about the slight delay; I was extremely busy over New Year, and I'm preparing a major update for the list as well as additional materials.
  • Just got my package. Thanks, FuryKyle.
  • I paid for the lifetime version... please send my list. Thanks.
  • Purchased yesterday, just got around to posting now.
  • So you take this list and put it where it says keywords when making a campaign... and because the list is so extensive, GSA will be scraping a gazillion sites, which will find you lots of new places to post to.

    Correct?

    And if this is true, why not just buy a list that already has verified URLs and links?

    What are the benefits of buying a big list of keywords?

    (I'm just trying to figure out the details :) )
  • Processed all orders. @goldstenag1 Yes, that is the basic idea and the very least you can do with it, but if you know what you're doing you can do much more. Buying this list means you don't have to keep buying updated site lists: a large proportion of sites die every day and new domains get put up, so constant scraping with the keywords keeps your own list fresh, up to date, and maxed out.
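
    Mechanically, all the scraper does with a list like this is pair every footprint with every keyword to build search queries. SER and Scrapebox do this internally with their own operators, but a rough Python sketch of the idea (footprints.txt, keywords.txt, and queries.txt are hypothetical file names) looks like this:

        from itertools import product

        def read_lines(path):
            # one entry per line, blank lines skipped
            with open(path, encoding="utf-8") as f:
                return [line.strip() for line in f if line.strip()]

        footprints = read_lines("footprints.txt")  # e.g. "Was ist Pligg?"
        keywords = read_lines("keywords.txt")      # e.g. weight loss

        # every footprint is paired with every keyword, so the query count
        # is len(footprints) * len(keywords) -- it explodes quickly
        with open("queries.txt", "w", encoding="utf-8") as out:
            for fp, kw in product(footprints, keywords):
                # footprint kept as-is (it may already carry exact-match quotes)
                out.write(f'{fp} "{kw}"\n')

        print(len(footprints) * len(keywords), "queries written")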
  • I am still lost. With Scrapebox, importing the English keywords worked fine, but when I tried to import the default footprints for SER's contextual engines, Scrapebox crashed. How come you don't experience crashes with so many keywords? What am I doing wrong? @gooner, what do you do to prevent this? I'd like to hear from you as well, @FuryKyle.

    Thank you.
  • molcho (Germany)
    edited January 2014
    How many keywords do you put into Scrapebox, and how many footprints?
    If you put in 500 footprints and 10,000 keywords, you get 5,000,000 searches, which Scrapebox cannot handle.
    And even at only 10 results per search, that is 50 million results.
    Scrapebox has problems with anything over 1 million, and the same applies to the search results: no more than 1 million.

    And I think Fury's list is no bigger because of this restriction in Scrapebox.
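
    In numbers (the 1,000,000 cap is the claim above):

        footprints = 500
        keywords = 10_000
        searches = footprints * keywords  # 5,000,000 queries
        urls = searches * 10              # 50,000,000 results at ~10 hits per query
        batches = searches // 1_000_000   # 5 runs if each run is capped at 1M queries
        print(searches, urls, batches)    # 5000000 50000000 5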

  • edited January 2014
    @pratik My SB can handle large amounts of keywords without crashing. What computer are you running it on? You can try splitting the lists with a program like GSplit and then scraping in batches, or just don't import ALL of the default footprints; import them on a per-engine basis.

    @molcho Do you know what you are talking about? SB handles 5 million keywords no problem. The limit on what SB can show may be 1 million, but the rest gets saved in your harvester file; your full scraping data is there.
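
    If you would rather not install a separate tool, a short Python sketch does the same kind of splitting as GSplit (keywords.txt and the chunk size are placeholders; tune the size to what your machine handles):

        CHUNK = 50_000  # keywords per batch -- an assumption, adjust freely

        with open("keywords.txt", encoding="utf-8") as f:
            keywords = [line.strip() for line in f if line.strip()]

        # writes keywords_part001.txt, keywords_part002.txt, ...
        for i in range(0, len(keywords), CHUNK):
            name = f"keywords_part{i // CHUNK + 1:03d}.txt"
            with open(name, "w", encoding="utf-8") as out:
                out.write("\n".join(keywords[i:i + CHUNK]) + "\n")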
  • @FuryKyle You are right. I tested it out just now, and only the 1 million showed in the display at the top. Sorry, my mistake.
  • I stopped using SB a long time ago, so I made a foolish comment, lol. I forgot that SB counts every footprint + keyword combination toward the keyword count, and I was importing 300K keywords with 100+ footprints, so it was definitely going to crash. :D

    @FuryKyle Interesting. I'm still not sure how that 1 million limit works, though. Could you explain? Does it only display a 1 million URL list in the harvester area, with the rest saved to harvester sessions that can be split and imported one by one to perform operations on them? And yes, I do use GSplit; life is not possible without it, lol.
  • Purchased lifetime version, looking forward to it.
  • All orders processed today.