I haven't been able to crack 10 LPM using GSA only for searching/submitting (i.e. global lists off).
I've been experimenting with Gscraper to get lists to import into GSA to build up my global list base. What speed is everyone scraping at in GScraper? I'm using public proxies I've scraped, or its built-in ones, and getting about 1,400 links per minute.
I then delete duplicate domains and import the list into GSA.
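The domain-dedupe step before import can be sketched in a few lines of Python. This is a minimal sketch, not anything GSA or Gscraper provides; the sample URLs are made up for illustration:

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first URL seen per domain, preserving order."""
    seen = set()
    kept = []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

# Sample scraped list (hypothetical URLs)
scraped = [
    "http://example.com/blog/post-1",
    "http://example.com/blog/post-2",   # same domain, dropped
    "http://forum.example.org/thread/9",
]
print(dedupe_by_domain(scraped))
```

Write the result back out to a text file and import that into GSA as the target list.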
Also, Zeusy: I'm getting about 80 LPM with your modified engines. Cheers, thanks for uploading (with global lists turned on, though).
Ron identified that I named one of the files wrong; this is the corrected file. Ning was categorised as a social bookmark instead of a social network... sorry for this error. Replacement
@zeusy thanks for making the changes; I've combined them with my own.
Can I ask everyone: I know @ron can get high LPM without global lists, but what LPM do people get running Zeusy's engines with 12 projects (4 tier 1, 8 kitchen sink):
a) with global lists
b) without global lists
for me
a) I get 150 LPM for a day. After that, I suspect the global lists run out of targets and you keep getting "already parsed" messages, so LPM drops to 50, and by the third day it's down to 20 LPM. If I clear the target URL history on my kitchen sink tiers, it shoots back up to 150 LPM.
b) I get about 5-10 LPM.
I'm running 20 private proxies, a 3-second timeout between search engine queries, and a 120-second HTML timeout.
@Zeusy did you change the "search term" only, or did you also change the "page must have" and "url must have" options?
I found there is a lot of room for improvement in the "page must have" and "url must have" options. GSA SER is finding good URLs thanks to the search terms, but then failing to identify the right engine because of wrong "page must have" and "url must have" options. For instance, I found some Pligg sites where I could post with other tools, but SER was giving me "no engine matches".
@sven I guess this is difficult to fix because SER supports all these platforms and it's script-based. Maybe add an option to tell SER to submit to a list without searching for engine matches, i.e. telling it the whole list is one specific engine, something like "import target URLs for a specific engine": it wouldn't look for a matching engine, just try to submit (I'll add this to the Feature Requests).
grab here
I'll await feedback before I do any more.
Will give these a try.
Added new ones too, thanks again Zeusy.
Thanks
Zeusy
P.S. The general blog engines took hours to do... but it fricken flies now :)
I don't use the GSA harvester
I use GSA only for submission
This dramatically increases LPM
So I'm using GSA's footprints to harvest results with my ~20k keyword list
I've already generated 42m queries ^^
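Generating those queries is just a cross product of footprints and keywords: every footprint paired with every keyword, so roughly 2,000 footprints times 20k keywords lands in the 42m range. A minimal sketch, with made-up sample footprints and keywords (GSA's real footprint list is exported from the tool itself):

```python
from itertools import product

# Hypothetical samples; replace with the exported GSA footprints
# and your own keyword list.
footprints = ['"Powered by Pligg"', '"Powered by BlogEngine.NET"']
keywords = ["weight loss", "credit repair", "dog training"]

# Cross every footprint with every keyword:
# len(footprints) * len(keywords) queries total.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
print(len(queries))  # 2 footprints x 3 keywords = 6 queries
```

Feed the generated query list into the external scraper (Gscraper here), then import the harvested URLs into GSA for submission only.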
By the way, you have 174 new footprints.
Unfortunately, the second-to-last download link is not working:
http://anonym.to/?http://www.2shared.com/file/OvHC_byF/ser_zeus_files.html
"The file link that you requested is not valid. Please contact link publisher or try to make a search."
Grab the replacement file in the post above.