Where's My Bottleneck?
Here is a top-level view of my projects:
http://prntscr.com/3i6hxh
Submission Tab
Even when I set threads to 2000, it only uses 45% of the CPU at most.
I saw in a few videos that an HTML timeout of 140 is decent.
When I first started, I used private proxies for everything. I'm not sure if I should go back to that or keep it like I have it; it seems to make sense to only use them for submission. I have 10 semi-private proxies. Do people use private proxies for everything, or is it always a mix?
I kept getting a lot of "download failed" or "IP seems to be blocked on blahblah" errors, meaning a bad public proxy. Where do I find good public proxies? I tried searching for sites to add, but after it added 80K, I only had about 500 that were good.
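(Side note on thinning out scraped proxies: below is a rough sketch of pre-filtering an ip:port list outside of SER. The file names, test URL, timeout, and thread count are all placeholders, and SER can also test proxies itself; this is just one way to avoid importing 80K mostly-dead entries.)

```python
# Rough pre-filter for a scraped ip:port proxy list before it goes anywhere
# near SER. public_proxies.txt, the test URL, and the timeout are placeholders.
import concurrent.futures
import requests

TEST_URL = "http://www.bing.com"
TIMEOUT = 10  # seconds

def is_alive(proxy):
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    try:
        return requests.get(TEST_URL, proxies=proxies, timeout=TIMEOUT).status_code == 200
    except requests.RequestException:
        return False

with open("public_proxies.txt") as f:
    candidates = [line.strip() for line in f if line.strip()]

with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
    results = list(pool.map(is_alive, candidates))

alive = [p for p, ok in zip(candidates, results) if ok]
with open("working_proxies.txt", "w") as f:
    f.write("\n".join(alive))

print(f"{len(alive)} of {len(candidates)} proxies responded")
```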
http://prntscr.com/3i6j9o
The Data Tab
http://prntscr.com/3i6ic9
The Options Tab
I tried enabling and disabling "Send verified link to indexer services" to no avail. Only English search engines are enabled (Google, Yahoo, MSN) to reduce duplicate search results.
http://prntscr.com/3i6ih2
Options Screenshot 2
Filters are PR 0 and above, then OBL 100.
http://prntscr.com/3i6it6
For email verification I just selected:
- Create New
- Then 25 Mailcatch.com addresses
I don't have an AA list enabled; it's just searching based on keywords. I didn't think my search filters were that strict, but maybe they are? It's hard to believe that an AA list would improve my LPM that much. I wanted niche-relevant backlinks, not to just throw links up anywhere.
I'm open to any thoughts, ideas, or suggestions.
Comments
Also, Mailcatch and those types of services are OK for T2, but you are going to come up against huge spam-filter hurdles for T1. Look around on here or other forums for bulk Yahoo/Gmail accounts. I buy them from Kelvin on here for $6/1000 accounts.
You really think it's the proxies, huh? Do you just use your private proxies for everything, or do you still mix in the free ones too? I'm not really sure if I have it set up right; I figured only using them for submission made the most sense.
I can't really justify paying $100 a month for semi-private proxies just for posting backlinks when I'm not really making any money with GSA. I could probably swing $20 a month though, just to see if it does in fact increase my LPM. What LPM on average are you getting with PR0, OBL 100, and 100 proxies?
I've got several licenses of SER though, so you could probably make do with a lot fewer. I would try with as many as you can afford; you can always cancel after one month.
Good thinking, Olve1954. I was wondering why I was getting 4 LPM and then it dropped down to 0 for a while.
Restoring hope to the hopeless my friend.
Do you have any idea of what speeds would be like with verification turned on?
SER will always show what verifieds it's made so far to the right of each project, won't it? I didn't know there was a way to turn off verification completely. I presumed we were just talking about re-verification here.
I normally only verify projects that have other projects pointing to them, so I usually set each one to verify once a day, but the verification never happens *only* once per day.
It seems that once it starts verifying the project, it is constantly dipping in and out of verification, even though it's only set to verify once every 1440 mins.
Well, this sucks... I just turned off verification on all projects and saw no noticeable change in LPM. Actually, it doesn't suck; it's good in that I can keep it turned on in that case.
Found an 870K keyword list and used %spintax% in the keywords field to randomly choose a file (see the sketch after this list).
Turned verification off
Turned off Web2.0
Turned off indexing
Checked "try to have an anchor text"
Changed filters to PR 0 and OBL 100
I kept the bad words filters though.
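For the keyword/spintax step, here's a rough sketch of what that can look like. It assumes a single keywords.txt with one keyword per line, that the keywords field accepts {a|b|c} style spin syntax, and that sampling a few thousand keywords is enough; the file names are placeholders.

```python
# Rough sketch: build a {kw1|kw2|...} spin string from a flat keyword file.
# Assumes keywords.txt has one keyword per line and that the keywords field
# resolves this style of spin syntax; sampling keeps the string manageable.
import random

with open("keywords.txt", encoding="utf-8") as f:
    keywords = [line.strip() for line in f if line.strip()]

subset = random.sample(keywords, min(5000, len(keywords)))
spintax = "{" + "|".join(subset) + "}"

with open("keywords_spintax.txt", "w", encoding="utf-8") as f:
    f.write(spintax)

print(f"Wrote a spin string with {len(subset)} of {len(keywords)} keywords")
```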
I got a charity handout of 30 Yahoo emails, which was a huge help!
My question is: they were all in .txt format, and I added them from file and chose "identify platform and sort in". Will this add the URLs to my verified list? How does GSA know which list to add them to?
To verify, I guess I just set the project to "Active (V)", but then where do I get the links to add to SEO Indexer?
I ran a campaign and my LPM went from 4 to 10. The only other variable is the proxies. I need to buy more.
What else am I missing?
Looks like I'm back in the land of the mortals.
Have you filtered for only high performance platforms?
@silverdot - I would just import the target URLs into a project (like 30 projects that are all clones) with all platforms turned on, and definitely leave verification turned on for automatic 1440.
This scraping and processing you are doing is best done with dedis: one for scraping and one for processing the results through SER projects to create the verified file. You need about 10 emails per project.
It sounds like you picked up some old lists. Never import them into verified. Huge mistake. Most targets are probably dead. The only way to find the gems is to process them through projects as I described above. I would create those 30 cloned projects with fake URLs to post to. In other words, I wouldn't even waste my time actually trying to post them to my real projects. What comes out as verified from all that processing should be used for your real projects.
In other words, you are mixing a very inefficient process with your real projects. Processing massive lists to find a relatively few legitimate targets is a painful and slow process. It really should be separated from running your own real projects. Yes, it is expensive to do it the right way, but if you want to take that route, then a separate server with a second copy of SER + CB is the right way to do it.
You basically have 3 options: 1) Have SER scrape for targets and directly feed your real projects (which is what most people do, and how I did it for a year and a half - and did just fine); or 2) set up a separate scraping server to find targets and feed a separate processing server to run those targets through SER - and then take the product and use on your 'real projects' server; or 3) buy lists.
I feel that you are mixing approaches, and that is holding you back from speed and success.
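One small addition that ron doesn't mention above: before running a big old or scraped list through those clone projects, stripping blank lines and exact duplicate URLs cuts down the wasted work. A rough sketch, assuming the list is a plain .txt of URLs (file names are placeholders):

```python
# Rough pre-clean for a big scraped/old list before it goes into the clone
# "processing" projects: drop blank lines and exact duplicate URLs.
# File names are placeholders; adjust to wherever your list lives.
seen = set()
kept = []
total = 0

with open("raw_list.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        total += 1
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            kept.append(url)

with open("raw_list_deduped.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept))

print(f"Kept {len(kept)} unique URLs out of {total} lines")
```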
http://prntscr.com/3jdi36
I changed my proxy settings to have everything on private except search engines; I kept those on public with a 60-sec timeout. With an AA list I don't think this setting matters because I'm giving it the target URLs.
Also, I've noticed that when running SER like this, after a few hours it tends to slow down for no apparent reason. Do either of you get that?
I've just been stopping and restarting, which seems to fix it.
It'll run for a couple of hours hovering at its setting of 1250 threads and post tens of thousands of submissions, but then it'll slow right down to 200-500 threads, and the number of subs slows too, sometimes to just a few hundred.
Still has lots of targets to post to and as soon as I stop and restart it speeds back up, but it just seems to get stuck after a few hours.
I'm guessing it doesn't happen to you or the others and it's something unique to my setup.
@davbel - I've got into a routine now where I have verification disabled on all projects, then switch them to Active (V) later in the day for a couple of hours to clear it all out, and then flip them back to Active.
I also clear cache and reimport the targets once per day. That helps it to run fast and with plenty of targets.
It appears I'm not the only one with the slowdown - there's another thread on the forums with users getting the same slowdown issue while it's posting and then remedying it in the same way with a stop/start.
They've asked @sven for a feature for SER to restart automatically every 2 hours.
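Until or unless @sven adds that, an external restarter is one possible stopgap. A rough sketch only: the exe path is an assumption, so point it at wherever SER is actually installed, and killing SER from outside can lose unsaved state, so use with care.

```python
# Crude external restarter: launch SER, let it run for two hours, kill it,
# and relaunch. The exe path is an assumption; point it at your install.
# Note: killing SER externally can lose unsaved state, so treat this as a
# stopgap until there is a built-in scheduled restart.
import subprocess
import time

SER_EXE = r"C:\Program Files (x86)\GSA Search Engine Ranker\search_engine_ranker.exe"
RUN_FOR = 2 * 60 * 60  # seconds between restarts

while True:
    proc = subprocess.Popen([SER_EXE])
    time.sleep(RUN_FOR)
    proc.kill()        # hard stop; SER is relaunched right after
    proc.wait()
    time.sleep(10)     # give Windows a moment before relaunching
```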
@ron how many projects are you running when submitting? Are you using the scheduler?
And you are importing the list into every project, aren't you?
Normally I've been running about 60-ish projects, no scheduler, at 1250 threads, and SER has been at about 30-50% CPU and 1.3-1.7 GB of memory.
However, when I import the site list into all the projects, SER runs out of memory within a few minutes and struggles to run more than 10-15 projects without hitting 100% CPU.
This is how I had been doing it:
- Selecting all projects
- Right-clicking
- Import Target URLs -> From Site List
- Selecting the correct site list
- Then clicking "Yes" on the "Automatically choose URLs from engines" blah blah pop-up
But based on what you say @ron, what I should be doing is: ... Correct?
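On the memory side, one idea (not something ron suggested, just a thought) is to export the site list to a flat .txt, split it into chunks, and import one chunk per batch of projects rather than the whole list into all 60 at once. A rough sketch with placeholder file names and chunk size:

```python
# Split a big exported site list into smaller chunk files so each batch of
# projects only gets part of it. File name and chunk size are placeholders.
CHUNK_SIZE = 50_000  # URLs per chunk

with open("sitelist_export.txt", encoding="utf-8", errors="ignore") as f:
    urls = [line.strip() for line in f if line.strip()]

for i in range(0, len(urls), CHUNK_SIZE):
    chunk = urls[i:i + CHUNK_SIZE]
    with open(f"sitelist_chunk_{i // CHUNK_SIZE + 1}.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(chunk))

print(f"Wrote {len(urls)} URLs into {(len(urls) + CHUNK_SIZE - 1) // CHUNK_SIZE} chunk files")
```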