@eLeSlash
I'll try to get that up ASAP. Major updates coming very soon - working on it as we speak. I see competitors stealing my lists and reselling them as part of their services, which is uncool.
Just processed all orders today. Sorry about the slight delay; I was extremely busy over New Year, and I'm preparing a major update for the list as well as additional materials.
So you take this list and put it where it says keywords when making a campaign... and because the list is so extensive, GSA will be scraping a gazillion sites... which will find you lots of new places to post to.
Correct?
And if this is true, why not just buy a list that has verified URLs and links?
Processed all orders.
@goldstenag1
Yes, that is the basic idea and the very least you can do with it. But if you know what you're doing, you can do much more than that. Buying this list means you don't have to keep buying updated lists: a large proportion of sites die daily these days and new domains get put up, so constant scraping with the list keeps your own list fresh, up to date, and maxed out.
I am still lost. With Scrapebox I imported English keywords just fine, but when I tried to import the default footprints for SER's contextual engines, Scrapebox crashed. How come you don't experience crashes with so many keywords? What am I doing wrong? @gooner, what do you do to prevent this? Would like to hear from you as well, @FuryKyle.
How many keywords are you putting into Scrapebox, and how many footprints? With 500 footprints and 10,000 keywords you get 500 × 10,000 = 5,000,000 searches, which Scrapebox cannot handle. Even at only 10 results per search, that is 50,000,000 results. Scrapebox runs into problems above 1 million keywords, and the same goes for search results: no more than 1 million.
And I think Fury's list is no bigger because of this restriction in Scrapebox.
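To put numbers on the claim above, here is a quick back-of-envelope sketch in Python; the 10-results-per-query figure is just the assumption from that post, not a Scrapebox guarantee:

```python
# Rough query-count math for a Scrapebox run (illustrative only).
footprints = 500
keywords = 10_000
results_per_query = 10  # assumption from the post above

queries = footprints * keywords           # one search per footprint-keyword pair
results = queries * results_per_query     # raw results before de-duplication

print(f"{queries:,} queries -> up to {results:,} results")
# 5,000,000 queries -> up to 50,000,000 results
```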
@pratik
My SB can handle large amounts of keywords without crashing. What computer are you using it on? You can try splitting the lists with a program like GSplit and then scraping in batches. Or just don't import ALL of the default footprints; import on a per-engine basis.
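The batching idea doesn't strictly need a separate tool, either; a minimal sketch of the same thing in Python (the file name and batch size are placeholders to adjust):

```python
# Split a big keyword file into smaller batches so Scrapebox
# only ever sees a manageable chunk at a time.
from pathlib import Path

SOURCE = Path("keywords.txt")   # placeholder: your full keyword list
BATCH_SIZE = 50_000             # keywords per chunk; tune to taste

lines = SOURCE.read_text(encoding="utf-8").splitlines()
for i in range(0, len(lines), BATCH_SIZE):
    chunk = lines[i:i + BATCH_SIZE]
    out = SOURCE.with_name(f"keywords_part{i // BATCH_SIZE + 1}.txt")
    out.write_text("\n".join(chunk), encoding="utf-8")
    print(f"wrote {out} ({len(chunk)} keywords)")
```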
@molcho
Do you know what you are talking about? SB handles 5M keywords no problem. And while the limit SB can display may be 1 million, the rest gets saved in your harvester file; your full scraping data is there.
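If you want to check that claim yourself, count what is actually on disk rather than what the UI shows. A small sketch, assuming your harvester session was exported to a plain text file with one URL per line (the path is a placeholder):

```python
# Count URLs in an exported harvester file to verify that data
# beyond the on-screen display limit was still saved.
from pathlib import Path

harvest = Path("harvester_session.txt")  # placeholder path
with harvest.open(encoding="utf-8", errors="ignore") as f:
    total = sum(1 for line in f if line.strip())

print(f"{total:,} URLs saved on disk")
```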
I stopped using SB a long time ago, so lol, I made a foolish comment. I forgot that what SB counts as the keyword count is the footprint-keyword combinations. And I was importing 300K keywords plus 100+ footprints, so it was definitely going to crash, haha.
@FuryKyle Interesting. I'm not sure what that 1 million limit is, though. Could you explain? Is it that it only displays a 1 million URL list in the harvester area, and the rest are saved to harvester sessions that can be split and imported one by one to perform operations on them? And yes, I do use GSplit; life is not possible without it, lol.
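For anyone still wondering why 300K keywords plus 100+ footprints blows up: what SB queues is one query per footprint-keyword pair, i.e. the full cross product. A minimal sketch of that (the sample footprints and keywords below are made up):

```python
# Scrapebox effectively queues one search per footprint-keyword pair,
# so the real "keyword count" is the cross product of the two lists.
from itertools import product

footprints = ['"Was ist Pligg?"', '"powered by Pligg"']   # sample strings
keywords = ["weight loss", "dog training", "seo tools"]    # sample strings

queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
print(len(queries))   # 2 * 3 = 6 here; at 100 * 300,000 that's 30,000,000
print(queries[0])     # "Was ist Pligg?" weight loss
```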
Re the Dec 18th post: how are you using 1 proxy? There is no way you've been scraping that much with 1 proxy??
"Què és Pligg?"
"Pligg ¥Y?H"
"Pligg ¥Y›õH"
"Co je to Pligg?"
"Hvad er Pligg?"
"Wat is Pligg?"
"Mis on Pligg?"
"Ce qui est Pligg ?"
"Mikä on StumbleUpon?"
"Was ist Pligg?"
"Ôé åßíáé Pligg;"
"Ki sa ki Pligg?"
"îä Pligg?"
"Pligg ?????? ????"
"Mi az a Pligg?"
"Apa itu Pligg?"
"Che cosa è Pligg?"
"Che cosa è Pligg?"
"Pligg ‚͉½‚Å‚·‚©H"
"Pligg ¹«¾ùÀԴϱî?"
"Kas ir Pligg?"
"Kas yra Pligg?"
"Hva er Pligg?"
"á? ?ÓÊ¿"
"Co to jest Pligg?"
"O que é o Pligg?"
"Ce este Pligg?"
"×òî òàêîå Pligg?"
"Co je Pligg?"
"Kaj je Pligg?"
"¿Qué es Pligg?"
"Vad är Pligg?"
"Pligg ¤×ÍÍÐäÃ"
"Pligg nedir?"
"Ùî òàêå Pligg?"
"Ùî òàêå Pligg?"
"Pligg là gì?"
Still planning to do this for all platforms? We really, really need this... more important than other keywords.
I paid for the updated version on 01.01. Today is 03.01.
So far I haven't received anything, and no reply to the extra email I sent to FuryKyle.
It's not very nice...