I see people saying their LPM is over 100... how are you achieving this? Mine has been stuck under 1 for a couple of days now. Some of my active projects in there are new with barely any links, so I know they haven't run out of keywords. Right now my LPM is 0.70, with 730 submissions for the day and 33 verifications.
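As a rough sanity check on those numbers, assuming LpM is simply submitted links divided by minutes of runtime (which is how the counter is usually read), the figures above suggest the run has not been going for a full day yet; the runtime below is inferred, not stated in the post:

```python
# Rough sanity check, assuming LpM = submitted links / minutes of runtime.
# The runtime below is inferred from the quoted numbers, not stated in the post.
submissions = 730
lpm = 0.70
minutes = submissions / lpm
print(f"{minutes:.0f} minutes, i.e. about {minutes / 60:.1f} hours of running so far")
# -> roughly 1043 minutes, about 17.4 hours
```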
Just curious, I posted something on another thread and am reposting here. I've narrowed a lot of my issues down to dead proxies from BuyProxies for certain Googles. Is everyone on this thread experiencing issues using BuyProxies semi-dedicated?
I've been having major problems with my BuyProxies proxies in the last two weeks, and I think that's been the source of my problems. There is no easy way to tell if a proxy has been banned except searching my logs and seeing 000/000 constantly appearing.
I was running at 60 LpM this morning, and this afternoon it went to 4 LpM. Lots of proxy bans. I'm running 40 shared proxies from BuyProxies. Just curious: for everyone who has had low LPM issues recently, were they all using BuyProxies?
I was doing some tests with the scheduler. I have some high-PR projects with PR2/PR3 filters, and I use proxies for everything. Using the scheduler, my high-PR projects were all active at the same time and 3 of my proxies got banned from Google, but now they are alive again.
1) Proxies that go dead. (Hence it would be nice for SER to check proxies automatically every so often and disable bad ones. On the next check they may have recovered and can be re-enabled. So... if 25% of your proxies are temporarily banned, LPM will decrease by 25% for that period. Why not have it auto-remove these and add them back when they're no longer banned? A rough outside-of-SER sketch of such a checker follows this list.)
2) Keywords and SE results overlap. Having 100k keywords is great, but if they just generate the same SE results you're wasting time parsing and getting "already parsed". I.e. how different are the results for "blue widget" and "Blue widget 2013"? There is still no workaround or tool for us to remove keywords that overlap too much. (@LeeG - how are you getting around this?)
3) It's running out of targets. It would be nice to have an option for it to also use "related searches" to expand the number of results generated from a set of keywords.
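On point 1, here is a minimal sketch of that kind of external proxy checker. It assumes a proxies.txt file with one ip:port per line; the test URL, the "about" marker word, the output file names, and the 15-minute interval are all illustrative assumptions, and SER's own proxy options may already cover part of this:

```python
# Minimal external proxy checker sketch (not SER's internal logic).
# Assumptions: proxies.txt holds one "ip:port" per line, the test URL and
# marker word below mirror the kind of check discussed in this thread,
# and the 15-minute interval is arbitrary.
import time
import requests

TEST_URL = "https://www.google.com/"
MARKER = "about"            # word expected somewhere in a normal response
CHECK_EVERY = 15 * 60       # seconds between passes

def load_proxies(path="proxies.txt"):
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

def is_alive(proxy):
    """True if the proxy fetches the test page and the response looks normal."""
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(TEST_URL, proxies=proxies, timeout=15)
        return r.status_code == 200 and MARKER in r.text.lower()
    except requests.RequestException:
        return False

def check_loop():
    while True:
        alive, banned = [], []
        for proxy in load_proxies():
            (alive if is_alive(proxy) else banned).append(proxy)
        # Feed proxies_alive.txt into your tools; banned proxies are kept
        # aside and get another chance on the next pass.
        with open("proxies_alive.txt", "w") as f:
            f.write("\n".join(alive) + "\n")
        with open("proxies_banned.txt", "w") as f:
            f.write("\n".join(banned) + "\n")
        print(f"alive: {len(alive)}  banned: {len(banned)}")
        time.sleep(CHECK_EVERY)

if __name__ == "__main__":
    check_loop()
```

The idea is simply to keep only the currently-alive list in use while banned proxies get retested on the next pass, which matches the auto-disable/re-enable behaviour requested above.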
You guys keep pointing out proxies, but how can it be proxies if the problem goes away as soon as I clear the URL history? If the proxies were dead they would still be dead after that, or am I missing something here?
About keywords or running out of targets: if I am using the global lists and I have unchecked "avoid posting to same domain", shouldn't the application endlessly blast through the list repeatedly, or does it make only one pass and then go hunting with the keywords provided?
Clearing your URL history obviously increases your submission rate because you won't get "already parsed" messages in the beginning, and SER can use your global lists with URLs it has already identified or submitted to earlier. SER doesn't need to spend time searching for new target URLs because of this.
After a while SER is done with (part of) the lists and tries to search for new target URLs. But when your proxies are banned from searching, you won't get new target URLs, which results in a low submission rate.
If you just want to use your lists, then disable SEs for a while and see how far you can go with that. Just make sure to delete duplicated URLs before doing this to speed things up.
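SER has its own option for removing duplicate URLs from the site lists, so the snippet below is only an outside-of-SER illustration of that step. It assumes the lists have been exported as plain-text files with one URL per line, and the "sitelists" folder name is hypothetical:

```python
# Outside-of-SER illustration of "delete duplicate URLs". SER has this built
# into its own tools, so this sketch just assumes exported plain-text files
# with one URL per line in a folder; the folder name is hypothetical.
import os

def dedupe_file(path):
    with open(path, encoding="utf-8", errors="ignore") as f:
        urls = [line.strip() for line in f if line.strip()]
    seen, unique = set(), []
    for url in urls:
        if url not in seen:
            seen.add(url)
            unique.append(url)
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(unique) + "\n")
    return len(urls) - len(unique)

def dedupe_folder(folder="sitelists"):
    removed = 0
    for name in os.listdir(folder):
        if name.endswith(".txt"):
            removed += dedupe_file(os.path.join(folder, name))
    print(f"removed {removed} duplicate URLs")

if __name__ == "__main__":
    dedupe_folder()
```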
Yes, I've been saying that in every post I make in this thread (but as much as I love all the info in this thread, it is basically drifting from the OP), and again, I tried with 2k and 100k keywords. Even with keyword overlap, I'm pretty sure 100k should yield more results than 2k, or at least I should notice the slowdown occurring much later.
Either way, no one seems to know the answer to my question about what happens when the global list is fully posted and I have unchecked "avoid posting to same domain". Any ideas on that one?
Keep in mind one more thing: the slowdown is the threads going down (sub-30), but the CPU is always at 99% in the interface. If the threads and the CPU faded in a similar fashion it would seem a lot more normal to me.
Going to try and run without SEs then, but the funny thing is that when the app is grinding to a halt I check the proxies with Google (google.com, looking for the word "about") and they come out OK. That is what really puzzles me at the moment.
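One possible explanation for that puzzle is that a proxy can still fetch the Google homepage while being blocked for search queries. Below is a sketch of checking the search endpoint itself instead; the query string, the /sorry redirect check, and the "unusual traffic" marker are my assumptions about how a temporary search ban usually shows up, not anything SER does internally:

```python
# Sketch: test whether a proxy is blocked for *searching*, not just for
# loading the Google homepage. The query, the /sorry check, and the
# "unusual traffic" marker are assumptions, not SER's own behaviour.
import requests

def search_ok(proxy, query="blue widgets"):
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        r = requests.get(
            "https://www.google.com/search",
            params={"q": query},
            proxies=proxies,
            timeout=15,
            allow_redirects=False,
        )
    except requests.RequestException:
        return False
    # A search-banned proxy often gets bounced to a /sorry/ captcha page
    # even though a plain homepage request still returns 200.
    location = r.headers.get("Location", "")
    if r.status_code in (301, 302) and "/sorry" in location:
        return False
    return r.status_code == 200 and "unusual traffic" not in r.text.lower()

print(search_ok("127.0.0.1:8080"))  # replace with one of your proxies
```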
I reread your OP again and you are saying that your proxies are OK. Are they OK for posting and for searching, or do you observe many "000/000" messages in your log when SER is searching?
Also, you've never posted a log, which might help to figure out your issue, if I'm not wrong.
Yes, I am scraping with SB, so I'm bound to have lots of similar keywords. I'm looking into something like the Google wheel that would put out related but not similar results (like http://www.instagrok.com/ for example). I just analyzed an old log and I have roughly 15 "000/000" messages for every minute. Is that excessive, or an indicator that the proxies are not good for search? What URL and words are you guys using to check if a proxy is banned for search (assuming everyone is using Google to search, that is)?
Yes, I hadn't uploaded one before, but I have now uploaded the zipped log (1h) here.
A 000/000 message is an indicator that you are not scraping new URLs to post; maybe you are using a lot of repeated keywords.
Something that has worked for me: go to the Google Keyword Tool, search for your niche keyword, take the 800 keywords GKT gives you, and try with that.
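To tie this back to the keyword-overlap point raised earlier in the thread, here is a small sketch that collapses keywords which only differ by case, word order, punctuation, or a year, since those tend to pull back the same search results. The normalisation rules are my own assumptions, not a feature of SER or the Google Keyword Tool:

```python
# Sketch of collapsing keywords that only differ by case, word order,
# punctuation, or a year, since they tend to return the same search results.
# The normalisation rules are assumptions, not a SER or GKT feature.
import re

def normalise(keyword):
    kw = keyword.lower()
    kw = re.sub(r"\b(19|20)\d{2}\b", "", kw)   # drop years like 2013
    kw = re.sub(r"[^a-z0-9\s]", " ", kw)       # drop punctuation
    return " ".join(sorted(kw.split()))        # ignore word order

def collapse(keywords):
    seen, kept = set(), []
    for kw in keywords:
        key = normalise(kw)
        if key and key not in seen:
            seen.add(key)
            kept.append(kw)
    return kept

print(collapse(["blue widget", "Blue widget 2013", "widget blue", "red widget"]))
# -> ['blue widget', 'red widget']
```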
Comments
I went through my engines again last night.
Most of the engines I'm using were two short, and one needed pulling because it's not supported by CB.
Not bad for not using the nuclear-science zombie-killing degree method (The Walking Dead is back on TV, hence my fascination with zombies).
I compared submission speeds in 5.17 to 5.03.
5.03 rocks; I was running at over 200 LpM earlier with that version.
5.17 is lucky to hit 120 LpM.
I have to check that out against my current list.
What LeeG is talking about is essentially using a second method to identify which engines you could remove from your projects in SER.
He is simply looking at what CS was not supporting, and pulling out those engines from his projects. Because if they are not supported, you won't get links from there.
I have found the same with poor LpM
Look at the difference in speed I noted between two versions
I'm lucky, I back up exes from time to time, so I always have a fallback.
CPU usage is not consistent with the meter.
In the older version, I kill my VPS CPU.
In the new version, if you monitor CPU with the resource meter, it's up and down.
Use verified divided by identified. I should have mentioned to skip submitted as that's what ends up getting verified, and then that number reduces when SER either finds the link or can't. Either way, that column doesn't help you out.
Or, if you use CB, just use the engines CB does captchas for and you won't be far off.
Less technical skills with girly type word processing stuff needed
Two calculations
girly pansy methods + time = less tv and beer drinking time
time + effort + brain + spending time working on bits that matter = high lpm and verified
Plus the next SER release will alter all the calculations anyway, with the search method Ozz has got added.
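To make the "verified divided by identified" check above concrete, here is a tiny worked example; the engine names, counts, and the 5% cut-off are made-up figures purely to show the arithmetic, so substitute your own numbers from SER's stats:

```python
# Worked example of "verified divided by identified". The engine names,
# counts, and the 5% cut-off are made-up figures to show the arithmetic
# only; pull the real numbers from your own SER stats.
stats = {
    "Article Engine A":  {"identified": 12000, "verified": 900},
    "Social Bookmark B": {"identified": 8000,  "verified": 40},
    "Wiki Engine C":     {"identified": 3000,  "verified": 450},
}

CUTOFF = 0.05  # keep engines that verify at least 5% of identified targets

for engine, s in sorted(stats.items(),
                        key=lambda kv: kv[1]["verified"] / kv[1]["identified"],
                        reverse=True):
    ratio = s["verified"] / s["identified"]
    verdict = "keep" if ratio >= CUTOFF else "consider removing"
    print(f"{engine:20s} {ratio:6.1%}  {verdict}")
```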
When I added some new campaigns and didn't put any sort of filters on them, GSA started flying and my LPM went way up.
@sonic81 - 9.9 times out of 10 the issues really aren't the proxies. It's usually the settings somewhere in SER that is causing a drag.
@rodol provided an excellent example of how proxies get blamed for the problem when in actuality the OBL filter is the root cause of the aggravation.
You say clearing the url history speeds it up?
Sounds like you need to add a lot more keywords
What you're doing is deleting any history of where it's already been.
So it visits the same places again and again after clearing that
What are you using to scrape keywords?
If it's Scrapebox, you will scrape a lot of the same type of words.
So your target finding will be extremely limited, even with 100k words
It'll be the same results, time and time again.
Ty for the help.
I have been getting a lot of crashes myself recently
About every 8 hrs
My LpM has been running low, at about 140 to 160, which is very low by my standards. I push 200+ LpM most days.
My crashes normally happen when it's pulling site lists and verifying links.
This is what I did last night, four stage attack
Stage one
Delete all global site lists. I only use submitted and verified
Stage two
Recreate global site lists via tools on the advanced menu
Stage three
Delete duplicate urls from those lists
Stage four
Delete target url cache
Touch wood, it's been running all night with a healthy LpM and submitted rate.
So you're only using global site lists?
No search results from the search engines?