I've been using RankerX for a while now and I'm happy with the results. However, I'd like to take it a step further, and I've heard that GSA is good for tier 2/3 links on top of the RankerX tier 1 links. I have a couple of questions I'm hoping someone can help me clear up:

1. Most of the campaigns in RankerX are already tier 1/2/3. Should I import all 3 tiers of links to GSA or only certain tiers?
2. I currently use XEvil with RankerX. Can I also use XEvil with GSA, or do I need GSA Captcha Breaker as well? I don't know if there's any point in buying GSA Captcha Breaker if I've already got XEvil.
3. Do I need to buy an SER list if I'm not doing tier 1 links? If I do need an SER list, can anyone recommend a cheap but decent company?
4. After I've set everything up and imported the RankerX links to GSA, what happens when I have new links in RankerX? Do I add them to existing campaigns in GSA or create a new campaign for the new links? Also, is there an option to duplicate campaigns in GSA?
5. I currently have 10 proxies from buyproxies.org. Will this be enough to run both RankerX and GSA?
Please let me know if you have any other advice that can help.
Thanks
I recently purchased GSA and RankerX. Can you share your experience using RankerX? How do you generate tier 1 articles?
That's what I run on my last tier. Personally, I think blog comments and guestbooks are an absolute must, as these are pages already indexed in Google, so anything they point at will get crawled by Google. Sometimes I run indexing campaigns with just blog comments and guestbooks, as these are the most effective for indexing links - but they tend to use a lot of CPU.
URL shorteners and indexers are questionable, as these days they don't get indexed as easily as they used to. They do still create another path to your T1 links for Google to crawl, so that's why I still use them. You just don't know what impact a particular URL will have as a tier 2 on the T1 link.
Exploits you can probably deselect. I still use them as they're 100% do-follow with keyword anchors. But again, indexing rates on them will be non-existent.
Articles, forums, social networks and wikis are my core engines, which I normally use on T1, but I do put them in as T2, as 90% of my site list here is nofollow links on platforms like wiki, gnu, and dwqa - I have thousands of these sites. Both Google and Bing use nofollow links as a hint for crawling, so T2 is the best place to put these types of links to get your T1 links crawled.
I don't have many options selected. Scheduled/additional posting can be a good one to use, as it will make more links on sites that support it - mainly article sites like wiki and WordPress. So with 5 accounts and 5 posts, the supported sites will make 25 links each time instead of 1.
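To put numbers on that, here's a trivial Python sketch of the scheduled/additional posting maths (the 5/5 figures are just the example above, not defaults from SER):

# Rough links-per-site estimate for scheduled/additional posting,
# using the example numbers above (5 accounts, 5 posts each).
accounts = 5
posts_per_account = 5

links_per_supported_site = accounts * posts_per_account
print(links_per_supported_site)  # 25 links per supported site instead of 1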
@sickseo - I noticed you haven't got Web 2.0 ticked; do you not use this for tier 2? I also noticed you haven't got GSA SEO Indexer ticked - do you not use GSA SEO Indexer or GSA URL Redirect Pro? I do own both of these, but they seem heavy on the CPU, and I'm wondering if they're needed for only tier 2 links and whether it's even worth using them.
The Web 2.0 engines don't make any links, so I don't use them.
SEO Indexer and URL Redirect Pro I do use, but as standalone tools to link blast new domains or individual URLs. Combined they make about 25k links.
Connecting them to every GSA SER install I have just creates an endless queue of backlinks to process - basically, the 2 tools can't keep up with the volume of links coming from GSA SER. Besides, the same link sources are in my GSA SER site list and are already being used as T2.
@sickseo - Do you connect both SEO Indexer & URL Redirect Pro to each other?
I've tested that. There are 2 issues with doing that: you're just creating tiers of redirects, neither of which are likely to get indexed, and the sheer volume of links that one tool makes creates an endless queue of links to process in the 2nd tool. You can reduce link numbers by applying the PR filter, though.
To get these types of links indexed, you need to be pointing other types of links at them - blog comments, guestbooks, wikis, articles. These types of links are good for crawling/indexing - so you'd need to point them at the redirects for them to have any chance of being crawled/indexed.
Currently, I use these 2 tools as domain boosting tools only. Many of the sites have decent DA/DR, so pointing them at the homepage of money sites and then running a 2nd tier on them using those link sources mentioned above is probably the best use.
Indexing them these days is tough. Google recently targeted redirects in one of their spam updates, so things have changed.
@sickseo - for the combined 25k links would you set SER to post 100 times per account and make 250 accounts?
I just set up a T2 campaign in GSA and let it run for several weeks. My setup will auto-reset campaigns every 24 hours, so it's completely hands-free. No need to use scheduled posting and reposting on multiple accounts.
But sure, you can do it that way as well. As long as you build T2s, that's all that matters. The only benefit of using the reposting settings is that the project will make more links before it needs to be reset; otherwise it will reach the end of your site list and stop making new links.
Thanks for your help and contribution to this forum.
Is there an option in SER to auto-reset campaigns?
It's not built into GSA SER, so I had to make my own solution for it.
First you need to install a piece of software called Cyber-D's Autodelete onto the same machine where you have GSA SER.
Then you add the above filters, which will auto-delete different files from your GSA install. This is the file path to the GSA projects folder: C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects
These files correspond to target URLs, the target URL cache, submitted links, verified links and accounts. It replicates the steps of deleting the target URL cache, history and accounts, as well as deleting verified and submitted links. In essence, it's doing a full reset of link building data.
So if you don't want to delete any of these, it's probably best not to use it. Or perhaps set up your own filters based on what you want to be reset.
The .success filter will remove verified links, so that's the one to be careful with: if you are running T1 campaigns, you may not want your T1 links deleted.
To control this, I rename my projects with "Power Up" and the above filter uses that as a mask (*Power Up.success*) so that only projects labelled "power up" will have verified links deleted each time the software runs.
Then I use Windows Task Scheduler to schedule when this program runs. Currently I've got it set to run every 24 hours, but it can be scheduled for any number of days or even hours.
If you want the settings file for the above setup, I can share it. Then you can use the import function in the Autodelete software to get the same filters as above.
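For anyone who'd rather script the reset directly, here's a rough Python sketch of what those Autodelete filters do. It is not the actual Autodelete config: only the .success extension (verified links) and the *Power Up.success* mask are confirmed above, so the other extensions are placeholders you'd need to check against your own projects folder before running anything.

# Rough equivalent of the Cyber-D's Autodelete filters described above:
# wipe link-building data from the GSA SER projects folder so campaigns
# effectively reset on the next run.
import glob
import os

PROJECTS_DIR = r"C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects"

# Hypothetical extension list for target URLs, target URL cache,
# submitted links and accounts - verify these against your own install.
RESET_EXTENSIONS = [".targets", ".new_targets", ".submitted", ".accounts"]

# Verified links (.success) are only deleted for projects whose name
# contains "Power Up", mirroring the *Power Up.success* mask above.
SUCCESS_MASK = "*Power Up.success*"

for ext in RESET_EXTENSIONS:
    for path in glob.glob(os.path.join(PROJECTS_DIR, "*" + ext)):
        os.remove(path)

for path in glob.glob(os.path.join(PROJECTS_DIR, SUCCESS_MASK)):
    os.remove(path)

Scheduled daily via Windows Task Scheduler, this would replicate the hands-free 24-hour reset described above.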
Thanks a lot for this!
I noticed that using the multiple URLs option does not give me ~25k links each.
I have to do it individually.
Did you have this issue?
Getting just under 25k URLs combined with the multiple URLs option.
A single URL gives under 25k links combined.
So, if I had 3 URLs and used the multiple URL option, it should give under 75k links, but this is not the case.
@sven - Is there a reason for this?
That's using the single URL option. Slight difference in numbers, but it could be related to connection issues, proxies, or sites not working during submission.
URL Redirect Pro is giving an extra 1300 links when using the single URL option.
Yes, so with the multiple URLs option, one should get X times the number of links.
You do get X times the number of links with multiple URLs, but the link numbers shown in the software are for the current run only. When it starts processing a new URL, those URL stats are reset.

Are you using the "save to file" and "create reports" options on the tools? I'm getting separate files of links for each URL from each tool, and the link numbers in each file do correlate to the numbers above.
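If you want to double-check those per-URL numbers, a quick sketch along these lines will tally the link counts from the exported report files. The folder and file pattern here are made up - point it at wherever your reports actually land:

# Tally verified link counts from the per-URL report files written out
# by SEO Indexer / URL Redirect Pro with "save to file" enabled.
import glob
import os

REPORTS_DIR = r"C:\reports"  # hypothetical location - adjust to yours

total = 0
for path in sorted(glob.glob(os.path.join(REPORTS_DIR, "*.txt"))):
    with open(path, encoding="utf-8", errors="ignore") as f:
        # One link per line; skip blank lines.
        count = sum(1 for line in f if line.strip())
    print(f"{os.path.basename(path)}: {count} links")
    total += count

print(f"Total across all URLs: {total}")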