@LeeG I noticed you wrote you got 5,000 verified Wikkas. That's 5,000 spread over many projects, right? For a single project I have 102 verified after scraping for a whole night and not allowing posting twice on the same domain.
First I want to say this is a great thread. Just from yesterday to today I have already doubled my LPM.
There is one thing I read here about the Blacklist setting that I'm not really sure about.
Am I supposed to have it "Checked" so SER checks whether the domain is on that list and skips the submission if it finds it there, or "Unchecked" so it doesn't check the blacklist at all?
Comments
I am not using any hard method of finding the low-yield footprints
I do things for ease and speed
Extract the footprints, import them into Gscraper Pro, press a button to see how many results each returns, and then remove the low-yield ones
Now, is that above and beyond technical? I think not
It's a set-and-forget method once done
Rather than extracting footprints, scraping keywords and pulling 100k results
Then adding them to a lower tier and repeating constantly with each engine you're targeting
I'm not a fan of using scraped lists
I can get the same or better submission results day in, day out just by letting SER run under its own steam
And there's no need for a middleman list scraper and another PC running to pull the lists
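For anyone who wants to script that first step, here's a minimal sketch of the idea in Python. It assumes footprints sit on "search term=" lines inside SER-style engine .ini files (the exact key can vary by engine file and version), and the counts file and 1,000-result cutoff are made-up placeholders standing in for whatever your Gscraper test run reports:

```python
import glob
import re

# Assumption: footprints live on "search term=" lines; adjust the key
# if your engine files use a different one.
FOOTPRINT_KEY = re.compile(r'^\s*search term\s*=\s*(.+)$', re.IGNORECASE)

def extract_footprints(engine_dir):
    footprints = set()
    for path in glob.glob(f"{engine_dir}/*.ini"):
        with open(path, encoding="utf-8", errors="ignore") as fh:
            for line in fh:
                m = FOOTPRINT_KEY.match(line)
                if m:
                    # entries are often pipe-separated alternatives
                    parts = m.group(1).split("|")
                    footprints.update(p.strip().strip('"') for p in parts if p.strip())
    return footprints

def keep_high_yield(counts_file, min_results=1000):
    # counts_file: one "footprint<TAB>result_count" per line, e.g. exported
    # after a Gscraper test run (hypothetical format, for illustration only).
    keep = []
    with open(counts_file, encoding="utf-8") as fh:
        for line in fh:
            footprint, _, count = line.rstrip("\n").rpartition("\t")
            if footprint and count.isdigit() and int(count) >= min_results:
                keep.append(footprint)
    return keep

if __name__ == "__main__":
    fps = extract_footprints("Engines")  # hypothetical folder name
    print(len(fps), "footprints extracted")
    for fp in keep_high_yield("gscraper_counts.txt"):
        print(fp)
```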
@LeeG you said that about me???
I was just finishing reading a new book...the Best of Albert Einstein, lol.
As I always say, there is one simple equation to using SER properly:
time + effort = high LpM and verified
And this relatively easy method of boosting submissions is only time-consuming, nothing more
Proof the above works:
Over 5k of yesterday's verified were WikkaWiki
One day editing engine files with the footprints
Four consecutive days with double my normal verified
Not rocket science to do. Even Ron's not been able to overcomplicate this yet
Give him time on that one
That's a very large number of verified links in one day.
Maybe people will pay attention to what you say.
On lower tiers I allow posting on the same domain twice
If you're posting to blogs, how do you know you're not going to miss out on a PR6 page?
I'm now using blogs as a secondary link type. Got fed up with seeing the damn things in my verified
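The trade-off being described here, in a minimal sketch: deduping a blog-comment list by domain silently drops every inner page after the first, including the potential PR6 one, while deduping by full URL keeps them. The URLs below are made up for illustration:

```python
from urllib.parse import urlsplit

urls = [
    "http://example-blog.com/post-about-cats/",
    "http://example-blog.com/the-pr6-post/",   # lost if you dedupe by domain
    "http://another-blog.net/hello-world/",
]

def dedupe_by_url(urls):
    # safe for blog comments: keeps every distinct page
    return list(dict.fromkeys(urls))

def dedupe_by_domain(urls):
    # fine for contextual targets where one account per site is enough
    seen, out = set(), []
    for u in urls:
        host = urlsplit(u).hostname or ""
        if host not in seen:
            seen.add(host)
            out.append(u)
    return out

print(len(dedupe_by_url(urls)))     # 3
print(len(dedupe_by_domain(urls)))  # 2: the second example-blog page is gone
```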
Do you just scrape generic keywords (it must be 100k+ per day???)?
As always, thanks for your time, effort and generosity in sharing so much.
@LeeG, my question is:
What is your bandwidth / connection to the web?
I also strictly use GSA to do all the work and get excellent results, MUCH better than with ANY other tool I have used before (UD/AMR and other online services).
But my limit seems to be the number of threads I can run:
in bad times I have 8-12 threads,
in the best times 22-44 threads.
Living / working in Cambodia, and depending on the time of day, day of the week and the ISP I use (I need at least 4 different mobile 3G providers to assure an almost 100% uptime), my bandwidth varies between 250-500 kbps upload and 1-4 Mbps download MAX.
My max submissions per day are in the range of 800, and 10-15% of those are verified.
I am almost sure your results are achieved with much more bandwidth.
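As a rough illustration of why bandwidth caps the thread count: assuming (purely for illustration) an average of 250 KB fetched per target and about 10 seconds per request, the sustainable thread counts land close to the ranges reported above:

```python
# Back-of-envelope: how many concurrent fetches a downlink can sustain.
AVG_PAGE_KB = 250          # assumed average page weight per target
AVG_FETCH_SECONDS = 10     # assumed time per request on slow 3G

def sustainable_threads(down_mbps):
    kb_per_second = down_mbps * 1000 / 8          # Mbps -> KB/s (decimal)
    per_thread = AVG_PAGE_KB / AVG_FETCH_SECONDS  # KB/s one thread consumes
    return int(kb_per_second / per_thread)

for mbps in (1, 4, 100):
    print(f"{mbps:>3} Mbps down -> ~{sustainable_threads(mbps)} threads")
# ~5 threads at 1 Mbps, ~20 at 4 Mbps, ~500 at 100 Mbps,
# roughly matching the 8-44 thread range reported above.
```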
No idea on monthly bandwidth, in all honesty. I did monitor it when I first moved to a VPS, and the totals reset at the beginning of each billing cycle
And since I don't get any warnings about maxing out my bandwidth from the company I get my VPS from, it's one less thing to be concerned or worried about
From memory it was a couple of terabytes a month, or a couple more for good luck
The internet connection on my VPS is 100 Mbit
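A quick sanity check on the "couple of terabytes" figure: a 100 Mbit line running flat out for a month moves roughly 32 TB, so a few TB of real use is well under 10% average utilisation, nowhere near a host's limits:

```python
SECONDS_PER_MONTH = 30 * 24 * 3600                 # ~2.59 million seconds
line_mbps = 100
max_tb = line_mbps / 8 * SECONDS_PER_MONTH / 1e6   # MB/s * s -> TB (decimal)
print(f"theoretical max: ~{max_tb:.0f} TB/month")            # ~32 TB
print(f"2 TB used = ~{2 / max_tb:.0%} average utilisation")  # ~6%
```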
I've been using GSA seriously for about a month now. I just moved from Berman to Powerup hosting.
Here's how my numbers look at the moment:
Now, I've been reading this forum (this thread especially) and playing around with GSA for the past 12 or 13 hours straight, and here's where I'm at:
I'm running all of this from a VPS like such:
- 4 CPU's @ 3,6 Ghz
- 120Gb Hard disk
- 4G RAM
- unmetered bandwidth
- 100 Mbps connection
On top of that I have 30 semi-dedicated proxies from Buy Proxies, and an Indexification license.
Here's what I've done already:
- dropped the HTML timeout to 130
- chose 6 English Google SEs
- turned off SpamVilla and run only GSA Captcha Breaker
- set verification to 1440
I'm running 14 projects. All have a Tier 1 as well.
Now, for every project I have a small set of project-related keywords, plus that 99k+ list I kept tweaking and honing for probably something like 8 hours to get rid of the bad keywords (see the sketch after this post for one way to automate that pruning).
I'm beginning to run out of ideas here... I've been removing the duplicate URLs and domains every day.
Still, I only get these 60-70 LpM numbers for short periods of time, and then I'm right back in the 10-20 range.
You guys have any ideas where to go from here?
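One idea for the keyword-list step mentioned above: automate the pruning instead of hand-editing for hours. This is just a sketch; the file names and filter rules (minimum length, no URL fragments) are placeholder assumptions to adapt:

```python
import re

def clean_keywords(path_in, path_out, min_len=3):
    # Normalise, then drop empties, very short terms, URL junk and dupes.
    seen, kept = set(), []
    with open(path_in, encoding="utf-8", errors="ignore") as fh:
        for raw in fh:
            kw = re.sub(r"\s+", " ", raw).strip().lower()
            if len(kw) < min_len or "://" in kw or kw in seen:
                continue
            seen.add(kw)
            kept.append(kw)
    with open(path_out, "w", encoding="utf-8") as fh:
        fh.write("\n".join(kept))
    return len(kept)

# hypothetical file names for illustration
print(clean_keywords("keywords_99k.txt", "keywords_clean.txt"), "keywords kept")
```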
Okay, well, a couple of things have happened since I posted my question.
I actually read all of this thread again (who wants to sleep at night, anyway?)
Most important updates:
- got myself 30 PRIVATE proxies and tweaked the proxy timeout and thread settings as Buy Proxies suggested => no more burning proxies!
- got myself another VPS to run Gscraper on, and I've imported some lists from there as well
It seems like I'm settling at about 50-60 LpM right now, although I DID visit the 900+ LpM zone for a while over there.
@Brandon, I'm running 300 threads.
One thing I noticed helping me was importing my submitted list into the projects. This makes sense as well, since I can build a bigger verified list that way.
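One way to script that feed, as a sketch: merge SER's per-engine submitted sitelist files into a single de-duplicated URL dump and import that into the projects. The folder layout and file naming here are assumptions; check where your own Options > Advanced site lists actually point:

```python
import glob

def merge_sitelists(folder, out_path):
    # Assumed layout: one text file of URLs per engine inside the
    # "submitted" sitelist folder (verify against your own SER settings).
    seen, merged = set(), []
    for path in sorted(glob.glob(f"{folder}/*.txt")):
        with open(path, encoding="utf-8", errors="ignore") as fh:
            for line in fh:
                url = line.strip()
                if url and url not in seen:
                    seen.add(url)
                    merged.append(url)
    with open(out_path, "w", encoding="utf-8") as fh:
        fh.write("\n".join(merged))
    return len(merged)

# hypothetical path for illustration
n = merge_sitelists(r"C:\GSA\site_lists\submitted", "submitted_dump.txt")
print(n, "unique targets to import")
```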
Okay, I necro'd this thread. Get over it.
But it is amazing to see what you ancients used to do.
On contextual dofollow article engines alone I get about 10 LpM on a good VPS, which honestly sucks.
Now I have to go back and tinker with the settings.
Au revoir!
I suggest you guys stop focusing on your LPM and start focusing on the results you want to get.
LPM depends on many things: engines, platforms, the number of e-mail addresses, the email providers, your GSA settings, your proxies, your project settings, your captcha-breaking solution, your engines.ini file, the domains you scrape, how you import lists into projects, how you submit... the list is long. For example:
when blasting a fresh scrape for blog comments you won't get much LPM,
but when blasting blog comments from a verified list with "allow to post on same site again" enabled, your LPM will blow up...
I'd rather keep 20 LPM than 2,000 and actually get my website ranked.