200LPM Consistently! My Settings Included
JudderMan
UK
(Edit - 5pm it's at 280LPM and 200k links for now, running Active V for an hour)
Over the past two days I've been getting 180-280LPM, and this is with SER doing its own scraping!!! I have loaded my own scraped lists previously, but they gave <0.01% verified results, so this is all down to SER really.
Settings
OK, I have a very quick server (32GB RAM, octa-core, 1Gbps uplink) and know it's overkill, but I reckon you could achieve the same results on an SSD VPS or normal VPS from SolidSEO. I will be buying one of these soon for another copy of SER, but only for posting. I do use my server for other things too, so it's worth it for me, and it's not much more than the fastest SSD VPS anyway. Thanks to @gooner for helping with settings.
- HTML timeout 180 seconds
- Time between searches 40 seconds (I used to run at 5-12 seconds previously...not good)
- 40 semi-dedicated proxies (I need to buy more, I know. I did cancel my private proxies as they weren't good enough; ProxyHub are crap these days)
- Never verify (I run Active V once per day for a few hours, as I can then run 1000 threads and 150 projects at the same time)
- 700 Threads
- CB and EVE (in that order)
- 450 projects (usually 3-7 tiers/projects per URL)
- Some spam projects to boost all verified links of all T1A+ links
- Some high PR projects to boost all verified links of all T1 links Global Only (run some high PR projects for a few weeks to build up a base)
- Bad word list 450 words (search for them online)
- Do not check "continuously try and post"
- Check "post to same URL with x amount of time between posts"
- Try and put anchor in comments/descriptions
- 100 OBL limit (not sure this is a good idea, as I'm more inclined to just get more links)
- Use global lists but only submitted and verified.
- 20/20 Scheduler - so you don't NEED lots of projects, as I only run 20 at a time. Any more and I find that some of the active projects show N/A.
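For reference, the throughput-related numbers above can be restated in one place (a hypothetical summary structure I've made up for this post, not SER's actual config file format; the field names are mine):

```python
# Hypothetical recap of the settings listed above -- not SER's real
# config format, just a structured summary for quick reference.
settings = {
    "html_timeout_sec": 180,
    "search_delay_sec": 40,
    "semi_dedicated_proxies": 40,
    "threads": 700,
    "scheduler": (20, 20),  # the "20/20" scheduler: 20 projects run at a time
}

# One derived sanity check: how many threads end up sharing each proxy.
threads_per_proxy = settings["threads"] / settings["semi_dedicated_proxies"]
print(f"{threads_per_proxy:.1f} threads per proxy")  # prints "17.5 threads per proxy"
```

At 17.5 threads per proxy the proxies are working hard, which fits the poster's own note that more proxies are on the shopping list.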
In a week or so, I'll compare the rankings of my sites and post back. Hopefully more LPM means higher rankings (in my case; not everyone's). I'm sure once I learn how to scrape better I'll see stupid LPM, but I wanted to nail the right settings with SER doing everything first. That way I can run one copy of SER doing everything and one for posting lists from GScraper/Scrapebox.
Comments
The LPM is good because you're running the same URLs through 450 projects, twice (submitted and verified).
Just my thoughts, not trying to be negative as I always appreciate some tips that make me think.
But it's possible to get higher than 200 LPM with SER not scraping at all, using only imported lists and global site lists.
@judderman is somewhere in the middle of that.
"- Check post to same URL with x amount of time between posts"
Is this the option you mean?
What amount of time are you giving, in minutes?
Looking forward to your results with global lists unchecked.
Good job @JudderMan. And even more impressive since you only joined a few months ago.
I'm sure some of that LPM came from footprints. I did this exercise a year ago and it makes a difference, but it's quite a bit of work. And then Sven changes the script in an engine to make it work better, and then you need to update the modified engine. So it never stops, lol.
What I found was that when I deleted duplicate domains - drumroll - 95% were duplicates. So be prepared to stain your shorts, haha. Then you stare at that information and you begin to change the game plan.
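SER has its own dedupe tools for this, but the same clean-up is easy to sketch outside SER as well (an illustrative Python sketch, not anything SER actually runs; it keeps the first URL seen for each domain):

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Keep only the first URL seen for each domain."""
    seen, unique = set(), []
    for url in urls:
        domain = urlparse(url).netloc.lower()
        if domain and domain not in seen:
            seen.add(domain)
            unique.append(url)
    return unique

# Example: three URLs, but only two distinct domains survive.
urls = [
    "http://blog.example.com/post1",
    "http://blog.example.com/post2",   # duplicate domain, dropped
    "http://forum.example.org/thread/9",
]
print(dedupe_by_domain(urls))
```

Run this over a big scraped list and, as described above, don't be surprised if the output is a small fraction of the input.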
There is nothing absurd about 95% duplicates. You start with 1 link in T1, which becomes 10 links in T2 and 100 links in T3. In theory you should have less than 1% unique domains by that math. The only reason it made it to 5% is that T3 could never manufacture 100 links for every link in T2. Hell, I'm lucky if my T3 is even twice my T2, lol.
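The back-of-envelope math above can be written out as a toy calculation (illustrative only; real tiers are never this regular):

```python
# Toy version of the tier math: 1 T1 link, 10 T2 links per T1 link,
# 10 T3 links per T2 link (so 100 T3 links per T1 link).
t1 = 1
t2 = t1 * 10
t3 = t2 * 10

total_links = t1 + t2 + t3  # 111 links in the whole pyramid

# If the lower tiers keep landing on domains already used, roughly
# only the T1 targets contribute unique domains:
unique_share = t1 / total_links

print(f"{total_links} links, ~{unique_share:.1%} unique domains")
# prints "111 links, ~0.9% unique domains"
```

So under 1% unique by that math, and the 5% seen in practice just reflects T3 falling well short of its theoretical 10x multiplier.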
Do these footprints help too if you're importing lists, rather than scraping from Google?
How do you set up your tiered campaigns?
Best regards,
@sagarpatil no mate, just use 5-10. I only use UK ones for my UK sites and US ones for everything else. You'll get too many duplicates if you use too many.