RankerX and GSA
Hello,
I've been using RankerX for a while now and I'm happy with the results. However, I'd like to take it a step further, and I've heard that GSA is good for tier 2/3 links on top of the RankerX tier 1 links. I have a couple of questions I'm hoping someone can help me clear up:
1. Most of my campaigns in RankerX are already tiered 1/2/3. Should I import the links from all 3 tiers into GSA, or only certain tiers?
2. I currently use XEvil with RankerX. Can I also use XEvil with GSA, or do I need GSA Captcha Breaker as well? I don't know if there's any point in buying GSA Captcha Breaker if I've already got XEvil.
3. Do I need to buy an SER list if I'm not doing tier 1 links? If I do, can anyone recommend a cheap but decent company?
4. After I've set everything up and imported the RankerX links into GSA, what happens when I have new links in RankerX? Do I add them to existing campaigns in GSA or create a new campaign for the new links? Also, is there an option to duplicate campaigns in GSA?
5. I currently have 10 proxies from buyproxies.org. Will this be enough to run both RankerX and GSA?
Please let me know if you have any other advice that can help.
Thanks
Comments
1. Do you think it would be best to copy all 3 tiers of links from RankerX into GSA for maximum results, then?
2. I also use a third-party captcha provider on top of XEvil, but I'm hoping to keep those costs down. Do you think it would be pointless buying GSA Captcha Breaker, or would it help a little?
3. Do you have any recommendations for a cheap but decent SER list provider?
4. What I mean is: if I import more links from RankerX into GSA in the future, do I add them to an existing campaign or a new one? My thinking is that if I add new links to an existing campaign, it won't build backlinks to the URLs already submitted for the older links, or will it?
5. I think I'll give it a go with the 10 and see what happens; I can always buy more if needed. I'm assuming it will just run a bit slower and I'll need to buy more proxies to speed things up?
2. If you have XEvil already, there won't be much left for Captcha Breaker to solve, so there's no point getting it.
3. serverifiedlists isn't bad. I'm testing it right now and it seems to have a wide range of working targets. Personally, I prefer scraping my own targets; you'll get better results than with over-spammed site lists. I don't normally buy lists, so there's nothing I'd recommend.
4. Best to set up new campaigns for new RankerX campaigns. You will find it hard to keep up with RankerX: each multi-tier template makes thousands of links, which means you'll need hundreds of thousands of links to power them up. Best to keep GSA campaigns separate so that each RankerX link gets proper attention. If you overload a GSA campaign with too many RankerX links, many links won't get any links built to them.
5. I use datacenter proxies. They can handle many more threads than your normal IPv4 proxies. Any decent proxy provider should be able to handle 100 threads per proxy, so 10 dedicated proxies means you should be able to push 1,000 simultaneous threads through them before performance deteriorates.
I've been testing GSA Captcha Breaker on top of XEvil with RankerX for the past 2 days, and so far Captcha Breaker hasn't been able to solve any of the captchas that XEvil has sent over; they've been forwarded to 0captcha instead. I'm going to retest with GSA once it's up and running, but I've got a feeling it will probably be the same. I'm thinking about using the DeepSeek option instead of 0captcha as a backup. Does anyone have any experience with this? Is it cheaper than 0captcha?
I've just bought a list from GSA Verify Link List - Shen e-Services (recommended by GSA). I've been speaking to him on WhatsApp and he even set everything up for me.
I just need to figure out the email account options and then I think I'm all done with the setup. Any advice on the email side of things? I'm thinking about creating an email account for each domain, for example seo@test1.com for test1.com backlinks and seo@test2.com for test2.com backlinks. What do you think? Will 1 email account be OK, or will I need a couple per domain? Can I use the same email for all campaigns, including new ones?
The 0captcha service isn't much different to XEvil's solving. I tested a campaign with just XEvil and then the same campaign with just 0captcha, and the overall link numbers were pretty much identical. I actually wonder if the 0captcha service is powered by XEvil; many captcha services do just that.
For emails I use catchall emails: 1 catchall email per GSA install. You may get better results with 1 catchall email per project, but I've not noticed a difference in performance or link numbers between the 2 options. If you use the spin file format for the catchall, you'll get hundreds of thousands of unique emails generated from just 1 catchall email. The software will do it automatically.
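To illustrate why a single catchall goes so far, here is a minimal Python sketch of the principle only (this is not GSA SER's actual spin syntax, and mail.example.com is a made-up domain): a catchall mailbox accepts any local part, so a tool can invent unlimited unique addresses that all land in the same inbox.

    import random
    import string

    # Hypothetical catchall domain - a catchall mailbox delivers mail sent to
    # ANY local part, so no individual accounts need to be created.
    CATCHALL_DOMAIN = "mail.example.com"

    def random_address():
        # 10 random characters gives 36**10 possible unique addresses
        local = "".join(random.choices(string.ascii_lowercase + string.digits, k=10))
        return local + "@" + CATCHALL_DOMAIN

    for _ in range(5):
        print(random_address())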
I'm thinking about setting up a catchall myself in cPanel on my main domain, but I'm worried about the domain getting blacklisted or other negative effects.
I'll have to test 0captcha again, although the cost of running it could be quite high as I've got multiple RankerX instances running 24/7, so I prefer the fixed-cost option with just XEvil.
https://forum.gsa-online.de/discussion/33628/catche-catchall-email-creator-for-any-tool-automatic-gsa-ser-project-email-updater-email-hosting
To give you an idea of my situation, I'm currently using RankerX to build backlinks to my 6 websites, and I've now bought GSA to build backlinks on top of those backlinks. I also have separate (cPanel) hosting just for my email accounts. I have around 100 campaigns running in RankerX, so there will also be another 100 running in GSA once everything is set up.
I'm glad to see contributions like this in SEO, and funnily enough I have a very similar system.
Years ago I developed a little script that lets me create thousands of emails in a few seconds; I then just import them into cPanel and GSA SER.
I usually use 1 domain per campaign, but if a campaign is going to involve building a lot of backlinks, I'll expand to 3 or 5 domains to avoid any spam issues.
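This isn't the poster's actual script, but a minimal Python sketch of the same idea, assuming a CSV of address, password and quota is what the cPanel import expects (the column layout and the example.com domain are assumptions; check your panel's import format first):

    import csv
    import random
    import string

    DOMAIN = "example.com"   # hypothetical domain
    COUNT = 2000             # number of mailboxes to generate

    def random_string(chars, length):
        return "".join(random.choices(chars, k=length))

    with open("emails.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for _ in range(COUNT):
            address = random_string(string.ascii_lowercase, 10) + "@" + DOMAIN
            password = random_string(string.ascii_letters + string.digits, 14)
            writer.writerow([address, password, 250])  # 250 = assumed quota in MB

The same file can then be imported into cPanel, and the generated logins loaded into whichever tools need them.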
The reason I have 5 is that the cPanel hosting I have is for 5 domains; it just turned out that way. There are unlimited subdomains, so unlimited catchall emails. I use them in RankerX and GSA, and also XRumer now and again. It's probably quite an expensive setup, as DirectAdmin would be way cheaper, and there are likely other alternatives that are cheaper still. But it's always worked, and the only issue I've ever had was not having enough storage. I suppose I should really change the domains to fresh ones, but I've never got round to it.
I think the reason the domains aren't on any blacklists is the blacklist filter I use in GSA.
The RankerX emails can get quite large: sometimes 200-300 MB per account, depending on how many campaigns and which templates I'm running. 16 installs with 5 catchall emails in each makes 80 catchalls just for RankerX.
The GSA setup uses even more emails, but storage per account is normally very low. I suspect GSA does a better job of emptying the inbox than RankerX does.
I've had to upgrade the storage twice now as it wasn't enough.
However, recent results have also been meager, and a lot of the posts end up deleted. Rather than having links deleted shortly after Google indexes them, it's better to write posts by hand, even in small quantities, and keep them live for a long time without deletion.
Everyone says RankerX is good, but if you look at how many of its backlinks actually get deleted, is it really a good program?
For web 2.0 platforms with heavy filtering, the correct approach in the first place is to post by hand. If you rely solely on automation, you will end up with numerous deleted backlinks.
That's what I run on my last tier. Personally, I think blog comments and guestbooks are an absolute must, as these are pages already indexed in Google, so anything they point at will get crawled by Google. Sometimes I run indexing campaigns with just blog comments and guestbooks, as these are the most effective for indexing links, but they tend to use a lot of CPU.
URL shorteners and indexers are questionable, as these days they don't get indexed as easily as they used to. They do still create another path to your T1 links for Google to crawl, which is why I still use them. You just don't know what impact a particular URL will have as a tier 2 on the T1 link.
Exploits you can probably deselect. I still use them as they're 100% do-follow with keyword anchors, but again, indexing rates on them will be non-existent.
Articles, forums, social networks and wikis are my core engines, which I normally use on T1, but I also put them in as T2, since 90% of my site list here is no-follow links on platforms like wiki, gnu and dwqa, and I have thousands of these sites. Both Google and Bing use no-follow links as a hint for crawling, so T2 is the best place to put these types of links to get your T1 links crawled.
I don't have many options selected. Scheduled/additional posting can be a good one to use, as it will make more links on sites that support it, mainly article platforms like wiki and WordPress. So with 5 accounts and 5 posts, each supported site will make 25 links each time instead of 1.
I do, however, use MR to post directly to money sites. It moves rankings; that's why I use it.
I pay zero attention to the data in SEMrush. It's incomplete poop data that people still pay for. I'm not a fan of any of these 3rd-party data services.
@sickseo I'm also wondering what sort of options you're using at the top of the data tab?
P.S. Really appreciate your help.
SEO Indexer and URL Redirect Pro I do use, but as standalone tools to link-blast new domains or individual URLs. Combined they make about 25k links.
Connecting them to every GSA SER install I have just creates an endless queue of backlinks to process; basically, the 2 tools can't keep up with the volume of links coming from GSA SER. Besides, the same link sources are in my GSA SER site list and are already being used as T2.
I'll use a large number of keywords mixed with domain and generic anchors.
Could I ask what kind of tool you use to scrape your lists? Is it GSA SER, Scrapebox, or something else? I tried doing it with Scrapebox, but GSA SER didn't select any of the targets for posting. I'm having trouble getting it working in Scrapebox. Thank you for your response.
Muriel
I recently purchased GSA and RankerX. Can you share your experience using RankerX? How do you generate tier 1 articles?
@sickseo - Do you connect both SEO Indexer & URL Redirect Pro to each other?
To get these types of links indexed, you need to point other types of links at them: blog comments, guestbooks, wikis, articles. Those types of links are good for crawling/indexing, so you'd need to point them at the redirects for the redirects to have any chance of being crawled/indexed.
Currently, I use these 2 tools as domain-boosting tools only. Many of the sites have decent DA/DR, so pointing them at the homepage of money sites and then running a 2nd tier on them using the link sources mentioned above is probably the best use.
Indexing them these days is tough. Google recently targeted redirects in one of their spam updates, so things have changed.
@sickseo - for the combined 25k links would you set SER to post 100 times per account and make 250 accounts?
But sure, you can do it that way as well. As long as you build T2s, that's all that matters. The only benefit of using the reposting settings is that the project will make more links before it needs to be reset; otherwise it will reach the end of your site list and stop making new links.
Thanks for your help and contribution to this forum.
Is there an option in SER to auto-reset campaigns?
First you need to install a piece of software called Cyber-D's Autodelete on the same machine where you have GSA SER.
Then you add the above filters, which will auto-delete different files from your GSA install. This is the file path to the GSA projects folder: C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects
These files correspond to target URLs, the target URL cache, submitted links, verified links and accounts. It replicates the steps of deleting the target URL cache, history and accounts, as well as deleting verified and submitted links. In essence, it's doing a full reset of the link-building data.
So if you don't want to delete any of these, it's probably best not to use it, or perhaps set up your own filters based on what you want reset.
The .success filter will remove verified links, so that's the one to be careful of: if you are running T1 campaigns, you may not want your T1 links deleted.
To control this, I rename my projects with "Power Up", and the above filter uses that as a mask (*Power Up.success*) so that only projects labelled "Power Up" will have their verified links deleted each time the software runs.
Then I use Windows Task Scheduler to control when the program runs. Currently I've got it set to run every 24 hours, but it can be scheduled for any number of days or even hours.
If you want the settings file for the above setup, I can share it. You can then use the import function in the Autodelete software to get the same filters as above.
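For anyone who wants to see the mechanics, here is a rough Python sketch of the same mask-based reset under this thread's assumptions: the projects path is the one quoted above, and only .success is confirmed here as the verified-links file, so the other extensions are illustrative and should be checked against your own install before deleting anything. A script like this can be scheduled with Windows Task Scheduler exactly as described.

    import glob
    import os

    # Projects folder quoted in the post above - adjust to your own install.
    PROJECTS = r"C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects"

    # Illustrative masks: .success (verified links) is confirmed in this thread,
    # the other extensions are assumptions - verify before running for real.
    MASKS = ["*Power Up.success*", "*Power Up.submitted*", "*Power Up.accounts*"]

    for mask in MASKS:
        for path in glob.glob(os.path.join(PROJECTS, mask)):
            print("deleting", path)
            os.remove(path)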
Thanks a lot for this!
I noticed that using the multiple-URLs option does not give me 25k links for each URL.
I have to do it individually.
Did you have this issue?
I'm getting just under 25k URLs combined with the multiple-URLs option.
A single URL gives just under 25k links combined.
So if I had 3 URLs and used the multiple-URLs option, it should give just under 75k links, but this is not the case.
@sven - Is there a reason for this?
That's using the single-URL option. There's a slight difference in numbers, but that could be down to connection issues, proxies, or sites not working during submission.
URL Redirect Pro is giving an extra 1,300 links when using the single-URL option.
Yes, so with the multiple-URLs option, one should get X times the number of links.
Are you using the "save to file" and "create reports" options in the tools? I'm getting a separate file of links for each URL from each tool, and the link numbers in each file do correlate with the numbers above.