Finally, some nice results.
I was always a bit sad and jealous seeing users report such high LPMs. I tried almost everything, working through the options one by one and all at once, but it was in vain; all my efforts to increase LPM mostly came to nothing. There were times in between when I got an LPM of around 45, but that too stalled after some SER updates, for reasons unknown.
I got desperate to the point that, 6-7 days back, I started working like a mad scientist. Day and night I spent my time tweaking SER (with only around 13 projects active, 7 of them running for 20 minutes each via the scheduler), trying various options, some alone and some in combination. By hook or by crook, I really wanted to achieve a good LPM.
So as a result of these past days, I am finally able to push a constant 55-60 LPM (as of today's observation).
So work hard until you achieve your desired rate. It's tricky, I agree, but you have to figure it out. What works for me might not work for you. For example, a friend of mine just runs a scratch SER with simple options and no global site lists, and he keeps 90-100 LPM. So it all really depends on... I'm really not sure what, but it's different from version to version and system to system. So just work and tweak until you achieve the desired output.
New versions might change my performance too, I'm not sure, but I won't give up now. I'll keep this thread updated if needed.
edit: Also, just in case anyone wonders, I now set the JUNK tiers (i.e. T1A, T2A, T3A, etc.) to never verify, and instead verify all of them at once over the weekend so I can increase the number of submissions.
Thanks.
Comments
the lowest % verified I usually get in a batch of submissions is around 20%
an ideal verified run is around 80%
the only criterion is a common-sense one
submit to sites that CAN BE verified
sites with moving pages show up as submitted but never as verified for obvious reasons
another criterion for a HIGH % of verified is to avoid getting deleted soon after submission:
to achieve more or less permanent links:
make sure you have human-readable, valuable content that a site owner finds worth keeping
if you have ever used SB to extract external links
then you know how many sites delete ALL = profile pages as well as submitted content
often nearly half of all Google-found potential target URLs no longer exist
hence
1.
to improve your efficiency and thus your LpM
avoid abusing SER for attempts to submit to dead pages
if another webmaster submitted days or weeks ago and HIS submissions were all deleted
chances are near 100% that YOUR submission may succeed but will be deleted within hours or days as well
2.
to increase SER efficiency further
make use of the newly created clean-up process to remove all dead URLs as well as re-test for correctly matching engines
maybe up to half of all global site list targets (for some engines even more) may be dead, even on a young project only a few months old
3.
when importing NEW target URLs from SB
FIRST make a LIVE test with the proper SB addon
this ensures only active URLs are imported into your global list (a rough sketch of such a check follows below)
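For illustration only, here is a minimal Python sketch of what a liveness filter does before importing targets. This is not the SB addon itself, and the file names, the timeout value and the use of the requests library are my own assumptions:

```python
# Minimal sketch: drop dead/unreachable URLs before importing into the global site list.
# Assumes a plain-text file with one URL per line; "requests" is a third-party library.
import requests

def filter_live_urls(in_path, out_path, timeout=10):
    with open(in_path) as f:
        urls = [line.strip() for line in f if line.strip()]
    live = []
    for url in urls:
        try:
            # HEAD is cheap; some hosts reject it, so fall back to GET
            r = requests.head(url, timeout=timeout, allow_redirects=True)
            if r.status_code >= 400:
                r = requests.get(url, timeout=timeout, allow_redirects=True)
            if r.status_code < 400:
                live.append(url)
        except requests.RequestException:
            pass  # dead or unreachable -> drop it
    with open(out_path, "w") as f:
        f.write("\n".join(live))
    return len(live), len(urls)

# Example: keep only live targets before importing them
# kept, total = filter_live_urls("scraped_targets.txt", "live_targets.txt")
```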
4.
when comparing LpM with others HERE
one key factor is almost always MISSING in your comparisons =
YOUR CPU
and
YOUR bandwidth
I work from home
for the first 2 months I had an LpM of around 1, rarely up to 1.5, but almost never 2 or higher
I was running on a quarter Mb/s until the day before yesterday
Atom 1.86 GHz
2GB RAM
now my NEW ISP bandwidth data before starting SER and SB as of yesterday:
:::.. Internet Speed Test Result Details ..:::
Download Connection Speed:: 1197 Kbps or 1.2 Mbps
Upload Connection Speed:: 456 Kbps or 0.5 Mbps
Tested At:: http://TestMy.net Version 13
Test Time:: 2013-11-03 23:10:15 Local Time
however MOST of the time 33 threads use up to 99% CPU almost permanently,
hence even if I had 10Mbps, there would be NO way to further improve performance
if you run too many threads
chances are your LpM will be lower than if you go with a slow and steady flow of data with the least possible packet loss or timeouts
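To check whether you are CPU-bound rather than bandwidth-bound, a quick monitor like this sketch can help. psutil is a third-party library and the thresholds are just assumptions to illustrate the idea, not anything built into SER:

```python
# Rough check whether the CPU, not the bandwidth, is the bottleneck while SER runs.
import time
import psutil

def sample_cpu(minutes=5, interval=2):
    samples = []
    end = time.time() + minutes * 60
    while time.time() < end:
        samples.append(psutil.cpu_percent(interval=interval))
    avg = sum(samples) / len(samples)
    pinned = sum(1 for s in samples if s >= 98) / len(samples)
    print(f"avg CPU {avg:.1f}%, pinned at >=98% for {pinned:.0%} of samples")
    if pinned > 0.9:
        print("CPU-bound: lower the thread count; more bandwidth will not raise LpM")

sample_cpu()
```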
5.
HTML time out is another crucial factor
the longer it is, the fewer submissions you may fail / lose
but the LpM may get much lower
GOOD quality sites are on quality hosts
make a choice: quality over quantity
my time out is 75 seconds - mostly 33 threads, rarely 44
without any verification / re-verification going on
that results in an LpM of 40+ (highest peak once yesterday was 100+ for a few hundred submissions)
and a verification of some 80% in good times
when extensive verification AND re-verification starts, then LpM may go down to around 12-15
remember:
NO re-verification is only safe if NO links are built to those sites
if you have a T2 linking to T1
that T1 needs to have at least the UN-LINKED pages re-verified to avoid building 404s
a SELECTIVE re-verification, as requested as a feature here:
https://forum.gsa-online.de/discussion/6588/re-verification-of-urls-single-largest-resource-waste-criteria-in-entire-ser-on-slow-www-connections#latest
could substantially improve overall performance by limiting re-verification resources to the really NEEDED next few link targets
hence NO re-verification is safe on your highest (last) T
because NO links are built yet to that T
all others need re-verification as near to the link-building moment as possible to avoid building links to deleted pages
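As a minimal sketch of that rule (the tier names and structure here are just an example, not SER internals): only a tier that nothing links to yet can safely skip re-verification.

```python
# Each tier maps to the tier that builds links TO it (None = nothing links to it yet).
tiers = {
    "T1": "T2",   # T2 links to T1 -> re-verify T1 before building T2 links
    "T2": "T3",   # T3 links to T2 -> re-verify T2 before building T3 links
    "T3": None,   # highest (last) tier, no links built to it yet
}

for tier, linking_tier in tiers.items():
    if linking_tier is None:
        print(f"{tier}: no links built to it yet -> skipping re-verification is safe")
    else:
        print(f"{tier}: {linking_tier} links to it -> re-verify as close to link-building as possible")
```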
one of the main factors to improve my own overall performance was to have
- NO junk engines, NO trackbacks and almost NO forum / comment stuff
- switch OFF any features that suck CPU or time from SER
- do as much as possible OUTSIDE SER to free SER for actual submission
just quality targets on all levels supplied with quality content on all levels
the highest LpM with the highest verified still seems to result from article / content submission to
blogs
web20
social networks
Wiki
these a.m. are YOUR self-created "backlink networks", built using SER by supplying quality human-readable, useful / helpful content, WELL spun and human-edited
many or all of the a.m. are instant approval, since most require the creation of an account first
once done your LpM may go UP
better to have an LpM of 20-30 with 80% verified
than to have a LpM of 100 with 10% verified
!!!
do the math =
A. 80% of 20-30 is 16-24 verified links
B. 10% of 100 is 10 verified links
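The same comparison as a tiny worked sketch, in effective verified links per minute (the figures are simply the two scenarios above, using the midpoint of 20-30):

```python
# Effective verified links per minute for the two scenarios above.
scenarios = {
    "A: 20-30 LpM at 80% verified": (25, 0.80),   # midpoint of 20-30 LpM
    "B: 100 LpM at 10% verified":   (100, 0.10),
}

for name, (lpm, verified_rate) in scenarios.items():
    print(f"{name} -> ~{lpm * verified_rate:.0f} verified links per minute")
# A -> ~20 verified per minute, B -> ~10 verified per minute
```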
check YOUR CPU even if / especially if on a VPS
on a VPS you have a fraction of ONE CPU and a fraction of a typical machine RAM
at home you have one full CPU plus all RAM
yet as you may see from my own a.m. data,
my ONE machine (small laptop) is fully loaded with 33-44 threads at CPU of almost permanently 99%
SER is VERY resource hungry and I seriously doubt that a VPS can give 100-several hundred LpM with equally high verified links that survive the next few weeks or months
even a VPS with a quad-core 64-bit CPU = divided by the number of VPSes per machine = MAY result in less actual CPU power than a middle-class machine at home on a 2-10Mb/s ISP connection
final recommendation
find YOUR ideal values
test different settings from excessively LOW values to clearly TOO HIGH
YOU can know YOUR limits by EXCEEDING your limits !!
run each test for maybe a similar number of hours, or maybe 1000 submissions
compare results
make sure you have a comparable mix of all types of engines in each test run
working at home,
start with
a HTML time out of 60 seconds
up to max 120 seconds
test with 22 threads up to 55 threads during submissions (for verification / re-verification I switch to 66 threads, since verifications involve less data transfer in BOTH directions)
monitor CPU and reduce threads until your CPU goes at least every now and then clearly below 99%
(now for example I run 33 threads at a permanent 99% CPU, BUT every several seconds the CPU load drops to the low 90s or even 80s)
meaning the CPU is NOT overloaded and thus NOT near a freeze up or crash
then, if there is still ample power, you may slowly increase
however always watch the relationship of submitted vs verified
too fast = your LpM may increase, but percentage-wise your verified decreases = reduced overall efficiency!
with above testing
you find YOUR ideal values
at home as well as on VPS
slower is sometimes faster
just as in real life
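A minimal sketch of how such test runs could be logged and compared. The run data below is made up purely for illustration, not measured values:

```python
# Compare test runs by verified links per minute, not by LpM alone.
# Each run: (name, threads, html_timeout_s, minutes, submitted, verified) - example values only.
runs = [
    ("run1", 22, 60, 120, 2800, 2100),
    ("run2", 33, 75, 120, 4800, 3800),
    ("run3", 55, 120, 120, 7200, 2900),
]

for name, threads, timeout, minutes, submitted, verified in runs:
    lpm = submitted / minutes
    verified_pct = verified / submitted * 100
    verified_per_min = verified / minutes
    print(f"{name}: {threads} threads, {timeout}s timeout -> "
          f"{lpm:.0f} LpM, {verified_pct:.0f}% verified, {verified_per_min:.1f} verified/min")
```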
and finally
if you simply scrape for new URLs, by any means/methods/tools (SER or SB or any other)
keep in mind that a blog or wiki you found in NO way means you can post articles there
if however you find new targets using creative SB methods (mentioned in other threads), then you get targets that have already approved posts = your chance of new auto-approve targets is near 100% = saving time and resources and improving verified vs submitted
when comparing / publishing data = always put it in relation to the technology you have available, to allow comparison by others
My tier 1's (contextual) tend to have up to 1000 submitted but not verified yet at any one time, for tier 2's (contextual) about 5000 and for tier 3's (junk) up to 10,000 submitted but not verified.
I don't really worry too much about those numbers because my daily totals are always around 60+ % verified - that's ok for me.
I've never tested any other setup so I don't know how it changes the results. Sorry mate.
300 threads, 100 private proxies (shared between 3 vps), 120 html timeout, 120 custom wait time. LPM varies from around 30 on the vps with least projects to 100+ on the vps with most projects.
Submissions total per is around 100,000 to 250,000.
I haven't tweaked any engines or removed any. I really don't have the time or motivation to get into that.
I replaced all emails once after 2 months of running SER. They were still working ok as far as I know, but I was trying to solve another problem at the time so I tested new emails.
What I meant before was that on any day I can look at SER and see those numbers of submitted but not verified. After some days they will go away but are always replaced by new links of course, so those numbers are pretty much constant.
Submitted stats never go to 0 for me, because every day SER submits new links.
I don't do anything special with SER. I think I have big numbers because I have 100s of projects, only that.
I would need about 110,000 links per day to get the full amount for each of the projects at that level.