@m1xf1 yep. Do Options-->Advanced-->Tools-->Export site lists-->Verified and it'll create a nice handy *.sl file for you to import the same way you exported. It merges with any existing lists you have too.
Thanks to this discussion I changed my proxies from public to private and removed a lot of engines with a small percentage of verified links, and now I have 50% verified!
Wow guys, went from like 1-2 lpm for the past month or so to over 100 lpm almost overnight!
Spent about an hour or so implementing all the changes on this thread, so many thanks to all the GSA gurus. Just a few changes made all the difference, here are some of the things I did:
- Reduced search engines to Google only, 5 selected
- Deselected all the engines that had less than 10 verified links built, and applied it to all my campaigns
- Am using 10 semi-private proxies from buyproxies
- In project options, I have 'use keyword to find sites' checked, and also checked 'use global site list' with only the 'submitted' option selected.
I'm running GSA on Berman hosting (the middle package), along with CB.
I think culling the engines that didn't perform is what took me from 50 LpM to over 100 LpM, and the culling wasn't that extensive either.
But did I understand you correctly that you have 'always use keyword to find sites' checked? That is not recommended, as it will lower your LpM in the long run; the SEs won't find as many sites as they do when you leave that option unchecked.
@Ozz Thanks, I will give that a go. But yes, at the moment I have that option checked. How do the SEs find sites to post on then, if not by using the keywords?
I don't fully understand how GSA works, just ploughing my way through it.
Did something change in the last couple of updates? My LpM has been in the range of 30-40 for the last few weeks; for the last 24 hours it's been 1-2. I've tried disabling proxies - same thing. My verified count has naturally dried up as well.
Nothing has been changed in the 20 projects I'm running.
@weeza, I don't want to bore you with technical details, but in general it works like this:
- if an engine like "General Blog" searches for new targets, it uses the footprints + your keywords from time to time regardless of that "Always Use Keyword" option. This works for those platforms because there are so many different, niche-related websites for the SEs to find. To make it clear: the script adds your keyword to the footprint automatically, but not all the time
- if you search for target URLs for "Directories" and do this with a footprint + keywords, you won't get those results at all
As you can see, you won't find any article directory with a "footprint + keyword" search term. Sure, if the keyword is just a single general word like "dog", or a broad two-word keyword like "dog training", you will find more targets, but how many of your keywords are specific multi-part keywords and how many are just single words or broader terms?
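To make that a bit more concrete, here is a minimal sketch of how a scraping query might be assembled from a footprint plus a keyword. The footprints and keywords below are made up for illustration and are not taken from SER's actual engine files:

```python
# Hypothetical footprints and keywords, for illustration only --
# these are NOT SER's real engine definitions.
footprints = {
    "General Blogs": '"powered by wordpress" "leave a comment"',
    "Directories":   '"powered by php link directory"',
}

def build_query(engine, keyword=None):
    """Combine an engine footprint with an optional keyword,
    the way a scraper narrows results to niche-related pages."""
    query = footprints[engine]
    if keyword:
        query += f' "{keyword}"'
    return query

# Blog platforms are everywhere, so footprint + keyword still returns plenty of targets:
print(build_query("General Blogs", "dog training"))
# Directory pages rarely contain a long-tail phrase, so the same trick finds almost nothing:
print(build_query("Directories", "best dog training collar for stubborn dogs"))
```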
@noct THANK YOU for this huge list! It helps me A LOT! But somehow some engines perform better with my setup than they did for you, like PHP-Nuke or phpBB.
Is it normal for GSA to be running at 2-5 threads with only 1-2 projects running? I've tried creating a new project, as well as deleting the history on an existing one. Site lists are turned on, with thousands of verified links. Projects have been run with and without proxies. Google searches are working fine as well - no banned proxies.
I've attached a screenshot showing GSA running. Max threads is set to 50, but only 2-5 are running. When verifying, all 50 threads are used. If I turn on enough projects, all 50 threads will be used too.
Brandon - something is definitely wrong. It has created 2 links in 4 hours; it just hangs there doing nothing. Is there a place where I can download 5.08 or 5.09? It ran much better for me with those versions.
I know you own very few sites, as you told me before. How do you get away from Google's penalties while building these huge amounts of links every single day?!
I know you build links to all of your sites' internal URLs, but is that safe enough? I'm promoting only one 2-year-old site with about 6000 URLs, and I was thinking of trying your strategy, but I want to be as safe as possible - that's why I'm asking. It would be really nice of you to teach us how you hide from Google's eyes, after teaching us how to build huge amounts of links! :)
@mamadou - I do 20X on Tier2 and no limits on T3+. The key is to not go crazy on Tier1. I may submit 5/day on a new site, and a few months later do 20/day. Those are very safe levels.
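Spelled out with the figures from that post (only the 5/day, 20/day and 20x numbers come from the post; treating the multiplier as a simple daily cap is my assumption):

```python
# Sketch of the "go easy on Tier 1, 20x on Tier 2, no cap on Tier 3+" idea.
# Only the 5/day, 20/day and 20x figures come from the post above;
# treating them as a simple daily cap is an assumption.
def tier2_cap(tier1_per_day, multiplier=20):
    """Daily Tier 2 submissions for a given Tier 1 pace."""
    return tier1_per_day * multiplier

print(tier2_cap(5))    # brand-new site:      5 T1/day -> 100 T2/day
print(tier2_cap(20))   # a few months later: 20 T1/day -> 400 T2/day
# Tier 3 and beyond: no limit, per the post.
```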
I have so many URLs on my site, and I'm targeting totally different keywords with each URL. I tried the 20/100-a-day strategy and that yields very poor results. Right now I'm doing around 100/200 links a day for each URL on my site! I'm getting good results, but not perfect.
I was asking you guys because I want to raise it to something like 1000/day for each URL on my site!
It is averaged out from 12am, so it will be more difficult to keep up that speed all day, especially the closer you get to midnight - and especially if you only verify once a day and do it in the latter part of the day, as you may spend an hour verifying, which will skew your overall results for the day.
Guess you also need to take into account the times of day when you may have slower bandwidth, such as after school when all the kids come home and start surfing the internet / playing Xbox Live, etc.
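For anyone wondering why the number sags like that: LpM is just links divided by minutes elapsed since midnight, so an hour spent verifying (no new links, clock still running) drags the whole day's average down. A toy example with made-up numbers:

```python
# Toy numbers, for illustration only.
links_built = 120_000              # links submitted so far today
minutes_since_midnight = 16 * 60   # it's 4pm

print(f"LpM before verifying: {links_built / minutes_since_midnight:.1f}")        # ~125.0

# Spend an hour verifying: no new links, but the clock keeps running.
minutes_since_midnight += 60
print(f"LpM after a 1h verify pass: {links_built / minutes_since_midnight:.1f}")  # ~117.6
```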
I'm also using VirtualBox, and this will help for sure.
rodol shared the 5.08 exe on here yesterday
https://forum.gsa-online.de/discussion/comment/15219#Comment_15219
The trick is tiered link building.
I know that in the past it was said on the net that T1 should be about 100 links per day.
So if you apply that rule: T1 100 links, T2 1,000, T3 10,000, T4 100,000.
Also take into consideration that you're looking at a low confirmed level - 1 in 10 or worse.
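A quick back-of-the-envelope version of that 10x-per-tier rule combined with the roughly 1-in-10 confirmed rate mentioned above; all of the numbers are just illustrative assumptions:

```python
# Illustrative assumptions: 10x more links on each tier, ~10% of submissions confirmed.
T1_VERIFIED_PER_DAY = 100
TIER_MULTIPLIER = 10
CONFIRMED_RATE = 0.10   # "1 in 10 or worse"

for tier in range(1, 5):
    verified = T1_VERIFIED_PER_DAY * TIER_MULTIPLIER ** (tier - 1)
    submitted = int(verified / CONFIRMED_RATE)
    print(f"T{tier}: ~{verified:,} confirmed/day -> ~{submitted:,} submissions needed")
```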
http://www.matthewwoodward.co.uk/tutorials/the-ultimate-guide-to-tiered-link-building-part-4/
Also read these:
https://forum.gsa-online.de/discussion/1888/get-better-and-learn-more-with-seo/p1
http://seosunite.com/f2/tiered-link-building-tiered-linking-tutorial-815/
5/day is VERY safe.. what engines do you use there?
Mine briefly jumped to 413 LpM at midnight here.
Looks like clearing the target URL cache cured a problem I had.
Back to running a steady 230-240 LpM by early evening.
I'll see if that cured my problem with 5.16.
Failing that, I might invest $8.50 in some software to monitor SER crashes:
http://sertools.com/ser-crash-catcher/