
Getting Closer - Tuning SER and a few things I do...

edited March 2013 in Other / Mixed
Thanks be to @Sven for a really cool piece of kit. You're relentless... attaboy!

Thanks to @ron, @Ozz, @LeeG, @thisisalex + more for your patience and mind dumps.  Killer!

SER has so many options it's taken me a few weeks to wrap my head around it and have it humming.

This thread is the goods in regards to settings at different tiers - https://forum.gsa-online.de/discussion/comment/17992

53 private proxies

CPUs
For SER
1 x Dell server, 2 x Xeon E5345 @ 2.33GHz, 8GB RAM, 4 x 72GB 15K SAS drives. $299 off eBay :)

For Hrefer, SENuke, SER and Indexer
3 x custom i7-2700K, 16GB RAM, 1 x 120GB SSD

Internet
Comcast Business Class
Fiber: 30 Mbps up / 10 Mbps down
Cable: 50 Mbps up / 100 Mbps down

Additional tools
3 x SER and CB licenses
Content Spiffer - used daily
http://www.uniqueblend.net/ - spinnets for T1
WAC for T2 and Garbage
Hrefer - smokes them all when it comes to volume scraping without crashes
Xrumer - Garbage king! 



Several points:
Firstly: correct me, work with me, debate me. Let's find the best practices for each technique.
SER, along with the Fightback Network, is ninja chop. I'm seeing some serious results with this software.

I run SER totally lean, as per the settings in the thread above. No PR checks, no scrapes, no secondary scrapes, no pings, just bare bones.

Learn your macros and use them. They are great for customizing blog comments that will stick.
Nothing better than having your templates 90% ready and just pointing the macros to the right data folder.
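
To make that concrete, here's a rough Python sketch of what a spinfile-style macro does under the hood: resolve spintax and pull random lines from a per-niche data folder. The paths, %token% names and template are hypothetical; this isn't SER's actual macro engine, just the idea.

```python
import random
import re

def spin(text):
    """Resolve spintax like {a|b|{c|d}} by picking one option per group."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(text)
        if not match:
            return text
        choice = random.choice(match.group(1).split("|"))
        text = text[:match.start()] + choice + text[match.end():]

def spinfile(path):
    """Mimic a spinfile-style macro: pull one random line from a data file."""
    with open(path, encoding="utf-8") as f:
        lines = [line.strip() for line in f if line.strip()]
    return random.choice(lines)

# Template is ~90% ready; the tokens point at one niche's data folder.
template = "{Great|Solid|Useful} post about %topic%. {Thanks|Cheers} from %name%!"
comment = spin(template)
comment = comment.replace("%topic%", spinfile("data/niche1/topics.txt"))
comment = comment.replace("%name%", spinfile("data/niche1/names.txt"))
print(comment)
```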

Take the time to rewrite all the footprints, blog comments, etc. Stop... do it right now, or I'll keep snagging your links. ;)

Break out your indexed T1 and T2 links and put them into their own campaign, then hit those really hard with fresh Garbage.
Don't worry about Garbage links indexing or reverifying; you're not going to use them again. Hell, verifying Garbage is just a waste of resources.

ABS, Always Be Scraping. I know @ron lets SER do its thing; I don't. Hrefer > Import Into Garbage Level.
Build your own keyword scrape list. I haven't used an English list in over a year. Top 2000 singular keywords in 10 languages, plus lots of number variations (3-, 4- and 5-digit permutations).
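
For illustration, a minimal Python sketch of how a list like that could be assembled. The word-list paths and language codes are hypothetical, and the numeric variations are sampled rather than fully enumerated:

```python
import itertools

def build_scrape_keywords(word_files, digits=(3, 4, 5), per_digit=45):
    """Combine top singular keywords from several languages with a sample
    of 3-, 4- and 5-digit numeric variations."""
    keywords = []
    for path in word_files:
        with open(path, encoding="utf-8") as f:
            words = [w.strip() for w in f if w.strip()]
        keywords.extend(words)
        for word, n in itertools.product(words, digits):
            step = max(1, (10 ** n - 10 ** (n - 1)) // per_digit)
            keywords.extend(f"{word} {i}" for i in range(10 ** (n - 1), 10 ** n, step))
    return keywords

# e.g. top-2000 singular keywords per language, one file each (hypothetical paths)
files = [f"keywords/top2000_{lang}.txt"
         for lang in ("en", "de", "fr", "es", "it", "pt", "nl", "ru", "pl", "sv")]
with open("scrape_keywords.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(build_scrape_keywords(files)))
```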

If you do scrape with SER: 3 to 5 Google engines, 1 Bing, 1 Ask. IMHO.

I don't ping with SER. Use a service or do it with software. I run SEO Indexer on Garbage over a second internet connection.
I pull an RSS feed daily, upload it to my server, and let SENuke run it through the feed services.
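
Roughly what that daily job looks like, sketched in Python with just the standard library. The export path, feed URL and FTP credentials are placeholders; it's the shape of the workflow, not the exact script.

```python
import datetime
import ftplib
import io
from xml.etree import ElementTree as ET

def build_rss(urls):
    """Wrap a day's verified URLs in a minimal RSS 2.0 feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = "Daily links"
    ET.SubElement(channel, "link").text = "http://example.com/feed.xml"  # placeholder
    ET.SubElement(channel, "description").text = "Verified links from today"
    stamp = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    for url in urls:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = url
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "pubDate").text = stamp
    return ET.tostring(rss, encoding="utf-8", xml_declaration=True)

# Read today's verified URLs (hypothetical export) and push the feed via FTP.
with open("verified_today.txt", encoding="utf-8") as f:
    urls = [u.strip() for u in f if u.strip()]
with ftplib.FTP("example.com", "user", "password") as ftp:  # placeholder creds
    ftp.storbinary("STOR feed.xml", io.BytesIO(build_rss(urls)))
```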

Analyze your verified list and deselect low-success platforms.
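
One way to eyeball that, assuming you've exported submitted and verified URLs as tab-separated "platform<TAB>url" files (a hypothetical format), is a quick tally like this:

```python
from collections import Counter

def platform_counts(path):
    """Tally per-platform totals from a 'platform<TAB>url' export."""
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            platform = line.split("\t", 1)[0].strip()
            if platform:
                counts[platform] += 1
    return counts

submitted = platform_counts("submitted.txt")   # hypothetical export
verified = platform_counts("verified.txt")     # hypothetical export

# Flag platforms with under a 5% verified rate as deselect candidates.
for platform, subs in sorted(submitted.items()):
    good = verified.get(platform, 0)
    rate = good / subs
    flag = "  <- deselect?" if rate < 0.05 else ""
    print(f"{platform:30s} {good:6d}/{subs:<6d} {rate:6.1%}{flag}")
```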

Stop looking at your SERP positions, stop staring as the logs roll by, diversify your link strategy with other techniques, and build MORE money sites.


Regards

/swede

Comments

  • Great post @swede ... how freshly unwrapped are the sites you are seeing results on and how long before you are impressed after booting all of this strategy into play?
  • @nicerice - All my $ domains are 1+ years old before they see action. Web 2.0 $ sites like Weebly, Webs, etc. get smashed immediately.

    Testing with SER, I saw page 1 results in 5 to 6 weeks on several of my older money sites that needed some fresh medium keywords.

    BTW, I do quite a bit of tier 1 juice via 301 redirects to $ sites. It obfuscates link profiles.
  • Interesting... thanks for detailing that, @swede, your share is appreciated.
  • AlexR (Cape Town)
    Great post. Will reread it later. 
  • Try GScraper... it's very fast.
  • AlexR (Cape Town)
    @swede can you comment on this if you have any experience:

    Would be curious about your setup outside of SER as well. 
  • ron (SERLists.com)
    edited March 2013

    @swede, I wanted to compliment you on a great post! You're no noob, that's for sure. :)

    And I like how you think...

  • Do you think it's possible to reach 100 LPM if you use SER to scrape the URLs?

    I can't get it much higher than 20 LPM. Here's my configuration:

    20 shared proxies
    250 threads
    130s HTML timeout
    5-second custom wait time
    Unchecked: save identified sites
    Unchecked: save failed sites
    5 search engines in use (all Google)
    Unchecked: all URL filters (PR, outbound links, etc.)
    Unchecked: use global list
    Unchecked: analyze competitors
    Unchecked: GSA Indexer

    Any suggestions?
  • ron (SERLists.com)
    Things have changed in more recent versions; I think we were all going a lot faster before. Your settings look correct. Experiment with more search engines. I really think products with "scrape" in their name are geared to be much faster, if speed is what you're after.
  • @swede, just a couple of questions if that's OK.
    Firstly, is your first tier mainly from the Fightback Network, or do you use SER for tier 1 as well?
    Secondly, are you saying that you rewrote all your footprints for each engine? Or something else, as I see you do your scraping with Xrumer.
    And finally, are your spinnets accepted by FBN without rewrites?
    Thanks for your info, I have reread this post a few times now and have gotten a lot out of it.

  • "Stop looking at your SERP positions, stop staring as the logs roll by"
    Yeah, I should write that on my list. Thanks @swede, nice thread.
  • Bookmarked!
  • Awesome post, Swede. BTW, what macros do you use to customize your blog comments? Do you use macros to customize your articles as well?
  • Yeah, the blog comment macros (I assume you do the same for forum/guestbook as well?) were my question too. I'm pretty much solid on the rest.
  • So much to learn :)

    Thanks!