Getting Closer - Tuning SER and a few things I do...
Thanks be to @Sven for a really cool piece of kit. You're relentless... attaboy!
SER has so many options it's taken me a few weeks to wrap my head around it and have it humming.
This thread is the goods in regards to settings at different tiers - https://forum.gsa-online.de/discussion/comment/17992
53 private proxies
CPU's
For SER
1 x Dell, 2 x Xeon E5345 2.33GHz, 8GB RAM, 4 x 72GB 15k SAS drives. $299 off eBay
For Hrefer, SENuke, SER and Indexer
3 x custom i7-2700K, 16GB RAM, 1 x 120GB SSD
Internet
Comcast Business Class
Fiber: 30 Mbps up / 10 Mbps down
Cable: 50 Mbps up / 100 Mbps down
Additional tools
3 x SER and CB licenses
Content Spiffer - used daily
http://www.uniqueblend.net/ - spinnets for T1
WAC for T2 and Garbage
Hrefer - smokes them all when it comes to volume scraping without crashes
Xrumer - Garbage king!
Several points:
Firstly - correct me, work with me, debate me. Let's find the best practices for each technique.
SER, along with the Fightback Network, is ninja chop. I'm seeing some serious results with this software.
I run SER totally lean as per the settings in the thread above. No PR, No scrapes, No secondary scrapes, No pings, just bare bones.
Learn your Macros and use them. They are great for customizing blog comments that will stick.
Nothing better than being able to have your templates 90% ready and just pointing Macros to the right data folder.
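To make the macro point concrete, here's a minimal sketch of a blog-comment template using SER's spin syntax and its %spinfile-...% macro (which pulls a random line from a data file); the folder path and file names below are placeholders, not anything from the original post:

```
{Really useful|Great|Solid} post on %spinfile-C:\SER-data\niche-topics.dat% - %spinfile-C:\SER-data\comment-bodies.dat%
```

Swapping the data folder per project is what keeps the template itself 90% reusable.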
Take the time to rewrite all the footprints, blog comments, etc. Stop... do it right now, or I'll keep snagging your links.
Break out T1 and T2 indexed links and put them into their own campaign.. then hit those really hard with fresh Garbage.
Don't worry about Garbage links indexing or reverifying, you're not going to use them again. Hell, verifying Garbage is just a resource waste.
ABS, Always Be Scraping - I know @ron lets SER do its thing, I don't. Hrefer > Import Into Garbage Level.
Build your own keyword scrape list. I haven't used an English list in over a year. Top 2,000 singular keywords in 10 languages plus lots of number variations (3-, 4- and 5-digit permutations).
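The number-variation idea can be sketched in plain Python (nothing SER-specific; the seed term and counts below are made up for illustration):

```python
import random

def numeric_variations(term, digits=(3, 4, 5), per_length=3, seed=42):
    """Return the keyword plus 'keyword NNN' variations with 3-, 4- and
    5-digit numbers appended, per the scrape-list idea above."""
    rng = random.Random(seed)  # fixed seed so the list is reproducible
    out = [term]
    for n in digits:
        for _ in range(per_length):
            out.append(f"{term} {rng.randrange(10 ** (n - 1), 10 ** n)}")
    return out

# Feed every seed term (e.g. your top-2,000 per language) through this
# and concatenate the results into one scrape list.
keywords = numeric_variations("forum")
```

Run each of your base terms through it and you get the singular keyword plus its numbered permutations in one pass.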
If you do scrape with SER - 3 to 5 Google engines, 1 Bing, 1 Ask. IMHO.
I don't ping with SER. Use a service or do it with software. I use SEO Indexer on Garbage over a second internet connection.
I pull RSS feeds daily, upload them to my server, and let SENuke run them through the feed services.
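A minimal sketch of building such a feed before uploading it (standard-library Python; the feed title, site link, and URLs are placeholders, and this covers only the XML generation, not the upload or the SENuke side):

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from email.utils import format_datetime

def build_rss(title, site_link, urls):
    """Wrap a list of verified URLs in a minimal RSS 2.0 feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = site_link
    ET.SubElement(channel, "description").text = "Daily verified links"
    now = format_datetime(datetime.now(timezone.utc))  # RFC 2822 date RSS expects
    for url in urls:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = url
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "pubDate").text = now
    return ET.tostring(rss, encoding="unicode")

feed_xml = build_rss("Daily links", "http://example.com",
                     ["http://example.com/a", "http://example.com/b"])
```

Dump today's verified URLs into the list, write the string to a .xml file, and push that to your server for the feed services to pick up.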
Analyze your verified list and deselect low-success platforms.
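The deselect-low-success-platforms step amounts to a simple ratio check; here's a hedged sketch where the submitted/verified counts are hypothetical numbers you'd read off SER's stats yourself, not anything pulled automatically:

```python
def low_success_platforms(stats, threshold=0.05):
    """Return platform names whose verified/submitted ratio falls below
    the threshold - candidates to deselect in your projects.

    stats maps platform name -> (submitted, verified); the entries in the
    example call are illustrative only.
    """
    return sorted(
        name
        for name, (submitted, verified) in stats.items()
        if submitted and verified / submitted < threshold
    )

culls = low_success_platforms({
    "Article-Foo": (1000, 220),  # 22% sticks - keep
    "Blog-Bar":    (1500, 30),   # 2% - deselect
    "Wiki-Baz":    (800, 0),     # nothing verified - deselect
})
```

Anything the function returns is a platform burning threads for links that rarely stick.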
Stop looking at your SERP positions, stop staring as the logs roll by, diversify your link strategy with other techniques, and build MORE money sites.
Regards
/swede
Comments
@swede, I wanted to compliment you on a great post! You're no noob, that's for sure.
And I like how you think...
I can't get it much higher than 20 LPM. Here's my configuration:
I use 20 shared proxies.
250 threads
130 html timeout
Custom wait time 5 seconds
Unchecked save identified sites
Unchecked save failed sites
5 search engines to use (all google)
Unchecked all the filter urls (pr, outbound links etc)
Unchecked use global list
Unchecked analyze competitors
Unchecked GSA indexer
Any suggestions?
Firstly, is your first tier mainly from the Fightback Network, or do you use SER for tier 1 as well?
Secondly, are you saying that you rewrote all your footprints for each engine? Or something else, as I see you do your scraping with Xrumer.
And finally, are your spinnets accepted by FBN without rewrites?
Thanks for your info; I have reread this post a few times now and have gotten a lot out of it.
Yeah, I should add that to my list. Thanks @swede, nice thread.
Thanks!