I never do a multi-tier campaign in one go. It's a waste of time and resources. The best way is to steadily create your Tier 1 links, then find your VIP links. I use the following criteria for my VIP links:
- Must be older than 30 days [so I am sure it's not going to get deleted]
- Must be dofollow [I never build lower tiers for nofollow links]
- Must have my 'exact anchor text'
- Must be indexed [I never build multi-tier links to non-indexed links]
Once I have enough VIP links, I sort them by domain PR. Then I start a separate T2 campaign for only the T1 links that pass the above criteria and have a high-PR root domain.
This strategy is working amazingly for me, and it doesn't take too many resources. I promote 25-30 sites through my SER and have never gone above 30-40 LpM, because I don't need that many links :-)
PS - whenever I see my ranking dropping for a particular keyword, the first thing I do is power up my VIP links. This always pushes the keyword back up. Then I try to increase the number of my VIP links.
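The filter-then-sort step above can be sketched as a short script. This is a minimal illustration only; the record fields (`age_days`, `dofollow`, `anchor`, `indexed`, `domain_pr`) are made-up names, not an actual SER export format:

```python
# Hypothetical parsed records for verified T1 links.
links = [
    {"url": "http://example.com/a", "age_days": 45, "dofollow": True,
     "anchor": "best blue widgets", "indexed": True, "domain_pr": 5},
    {"url": "http://example.org/b", "age_days": 10, "dofollow": True,
     "anchor": "best blue widgets", "indexed": True, "domain_pr": 6},
    {"url": "http://example.net/c", "age_days": 90, "dofollow": False,
     "anchor": "click here", "indexed": False, "domain_pr": 3},
]

TARGET_ANCHOR = "best blue widgets"

def is_vip(link):
    """Apply the four VIP criteria from the post."""
    return (link["age_days"] > 30               # older than 30 days
            and link["dofollow"]                # dofollow only
            and link["anchor"] == TARGET_ANCHOR # exact anchor text
            and link["indexed"])                # indexed only

# Keep the VIPs, sorted by domain PR (highest first) for the T2 campaign.
vip_links = sorted(filter(is_vip, links),
                   key=lambda l: l["domain_pr"], reverse=True)

for link in vip_links:
    print(link["url"], link["domain_pr"])
```

Only the first record passes all four checks here; the second is too young and the third is nofollow and not indexed.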
There is an email address creation tool being developed by a third party that will make boosting lower-level links easier.
Stop Forum Spam will be on overload when it gets released.
At the moment, if you're not using a catch-all, you have to create multiple Hotmail-type accounts and then add a forward to the account you're using to check registrations.
With the one that's being developed, you add the email address you want emails forwarded to, add proxies, and enter the number of accounts needed, which is about 3x the proxies used per 24 hrs.
Once done, add the created accounts to SER in spintax.
So you will be able to add more forum profiles etc. and cover your footprint more.
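Adding the accounts "in spintax" just means wrapping the list in `{a|b|c}` form so SER picks one at random per registration. A minimal sketch with made-up addresses:

```python
# Made-up account list - substitute the accounts the tool creates.
accounts = [
    "acct1@example.com",
    "acct2@example.com",
    "acct3@example.com",
]

# Spintax: one string, alternatives separated by "|" inside braces.
spintax = "{" + "|".join(accounts) + "}"
print(spintax)  # {acct1@example.com|acct2@example.com|acct3@example.com}
```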
mm, I am having a hard time finding new social networks + articles to post. I don't know why I can't find a lot of these. How many verified social networks + articles do you have, @ron or @LeeG?
How many projects are needed to judge if my settings are running well? I'm trying to figure out if I am using GSA properly, because my LpM is only at 0.51 even after tweaking according to this thread (I only have 3 projects going though... too early to judge)?
@skyf: how many threads are you using? If you are using <50, then try using your private proxies for posting only, uncheck the option to modify the "query time" and "use proxies for PR check", and see if that works out for you.
@skyf, several people have found that two main tricks can give a big boost in LpM:
Only using about five Googles: 4 random + Google International.
Plus only using engines that your captcha breaker can crack.
With CB that's easy to do, just by comparing the engines listed in the SER settings against the engines listed by type in CB.
I personally don't do a lot of the junk that's listed, i.e. image comments, guestbooks, video links, referrer and pingback. Plus I don't do the web 2.0s, since I own other software to create those types of links.
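Lining the two lists up is just an intersection: tick only the engines that appear in both tools. A generic sketch with illustrative engine names (not actual SER/CB export data):

```python
# Engine names as ticked in SER's project options (illustrative subset).
ser_engines = {"Article Beach", "BuddyPress", "Drupal - Blog",
               "General Blogs", "phpBB", "SMF", "Wordpress Article"}

# Engines the captcha breaker reports it can solve (illustrative subset).
cb_engines = {"BuddyPress", "Drupal - Blog", "phpBB", "SMF", "vBulletin"}

# Keep only engines both tools support; anything else burns threads on
# captchas that will never get solved.
usable = sorted(ser_engines & cb_engines)
print(usable)  # ['BuddyPress', 'Drupal - Blog', 'SMF', 'phpBB']
```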
Verifications are set to 3000 minutes; that's enough for these projects.
So after a problematic Sunday, today it's back to full 200 LpM posting. Strange... it must be VPS speed/shared usage, proxy speed, or global internet usage after all.
I don't do much link feeding unless I see a drop in certain link types, personally.
My VPS is running flat out with SER and CB.
I have found you pull more results just running it with a good search engine selection and plenty of keywords.
@micb - It all depends on the competition. If I am steadily increasing rank with just a T1 & T2, I leave it there to see if I get to #1.
If I get caught in a plateau and it stays there for a while, then that is when I usually start up a T3.
Remember, you always have the ability to create multiple T2s. You are not just limited to a single T1, T2, T3, T4 chain.
I'm not even going to tempt fate by looking for the next couple of hours.
Last time I needed to check a setting when I was on a personal-best roll, I crashed SER.
If you're running short of a type of link, run GScraper or Scrapebox to find more target sites.
Another trick to boost article submissions is to copy and paste the engine file in the Engines folder.
Then add that as well.
So you're running double the searches for any engine types you're low on.
You can also try adding more footprints to the copied one, so you're pulling a lot more results.
There is a thread on BHW with a lot of extra footprints:
http://www.blackhatworld.com/blackhat-seo/black-hat-seo/491042-get-huge-search-engine-optimization-footprints-collections.html
If you edit the files, you will see each footprint is separated by "|" (without the quotes).
Easy enough to do.
Just remember to back up the edited engines; they get overwritten with each update.
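The copy-and-extend step can be scripted. This is a sketch only: SER engine files are plain text, but the exact key that carries the footprints varies per engine, so the `search term=` prefix here is an assumption to verify against a real engine file before using:

```python
def add_footprints(engine_text, extra):
    """Append extra pipe-separated footprints to each 'search term=' line.

    NOTE: 'search term=' is an assumed key name for SER engine files -
    check the engine you are copying before relying on it.
    """
    out = []
    for line in engine_text.splitlines():
        if line.startswith("search term="):
            # Footprints sit on one line, separated by "|".
            line = line.rstrip() + "|" + "|".join(extra)
        out.append(line)
    return "\n".join(out)

# Toy engine contents; in practice read the original .ini, then write the
# result out under a new name (e.g. "Article Beach Copy.ini") so SER runs
# both copies and doubles the searches for that engine type.
sample = 'search term="powered by example"|inurl:example\nother=1'
print(add_footprints(sample, ['"submit article"', "inurl:write-for-us"]))
```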
Just hit my own all-time recorded personal best with SER.
We'll see if I can bump those results even more once I start adding a lot more fresh email accounts.
Oi, less of that. It's 27 T3s I run.
One guy I have been helping is doing 200k a day on three projects with no tiers.
lol, I have 3 projects and I do 0.51 LpM...
I need to put up some more projects to see what happens.
thanks @ron @LeeG
Thanks for the tips guys
Details:
- ~25 projects in GSA SER (12 new projects and 13 three-week-old projects)
- 3 projects run at the same time
- Switch to next project after 20 minutes
- 240 threads; HTML timeout 70 seconds
- SER skips ReCaptchas
- 5-10 Googles used per project
That's the downside: you're doing about 1 in 40 verified.
I do between 1 in 10 and 1 in 20.
Just think if you were using a paid captcha service with that lot.
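The "1 in N" figures are just submitted-to-verified ratios. A quick sketch with made-up counts (not numbers from the thread):

```python
def submit_to_verified_ratio(submitted, verified):
    """Average number of submissions needed per verified link."""
    return submitted / verified

# Made-up example counts.
print(submit_to_verified_ratio(40000, 1000))  # 40.0 -> "1 in 40"
print(submit_to_verified_ratio(15000, 1000))  # 15.0 -> between 1 in 10 and 1 in 20
```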
ewandy, what are you doing for proxies?
No proxies. I'm sure I was reported a lot of times to my (dedi) hosting, but it seems like they don't really care (good for me).
I've been with them for many months.
No doubt there will be questions asking which captcha software is used
And there is only one that you can disable proxy types with
Looks like Captcha Breaker strikes again for helping increase submissions
Yes, in SER I use the dedi IP only for scraping (I used proxies for scraping before, but those were really slowing me down).
I scrape separately with Scrapebox too and feed those lists into SER.
@LeeG
Yes, it's Captcha Breaker (the solve rate is at least 15%).
The max I was able to get with Captcha Sniper was 150 LpM.