  • highsave USA
    edited October 2024
    Hello,

    I've been using RankerX for a while now and I'm happy with the results. However, I'd like to take it a step further, and I've heard that GSA is good for building tier 2/3 links on top of the RankerX tier 1 links. I have a couple of questions I'm hoping someone can help me clear up:
    1. Most of the campaigns in RankerX already build tiers 1/2/3. Should I import links from all three tiers into GSA, or only certain tiers?
    2. I currently use XEvil with RankerX. Can I also use XEvil with GSA, or do I need GSA Captcha Breaker as well? I'm not sure there's any point in buying GSA Captcha Breaker if I've already got XEvil.
    3. Do I need to buy an SER list if I'm not doing tier 1 links? If I do need an SER list, can anyone recommend a cheap but decent company?
    4. After I've set everything up and imported the RankerX links into GSA, what happens when I have new links in RankerX? Do I add them to existing campaigns in GSA or create a new campaign for the new links? Also, is there an option to duplicate campaigns in GSA?
    5. I currently have 10 proxies from buyproxies.org. Will this be enough to run both RankerX and GSA?

    Please let me know if you have any other advice that can help.

    Thanks

    I recently purchased GSA and RankerX. Can you share your experience using RankerX? How do you generate tier 1 articles?
  • londonseo London, UK
    sickseo said:
    The Web 2.0 engine doesn't make any links, so I don't use it.

    SEO Indexer and URL Redirect Pro I do use, but as standalone tools to link-blast new domains or individual URLs. Combined they make about 25k links.

    Connecting them to every GSA SER install I have just creates an endless queue of backlinks to process. Basically, the two tools can't keep up with the volume of links coming from GSA SER. Besides, the same link sources are in my GSA SER site list and are already being used as T2.

    @sickseo - Do you connect both SEO Indexer & URL Redirect Pro to each other?
  • sickseo London, UK
    I've tested that. There are two issues with doing that: you're just creating tiers of redirects, neither of which is likely to get indexed, and the sheer volume of links that one tool makes creates an endless queue of links to process in the second tool. You can reduce link numbers by applying the PR filter, though.

    To get these types of links indexed, you need to point other types of links at them - blog comments, guestbooks, wikis, articles. Those link types are good for crawling/indexing, so you'd need to point them at the redirects for the redirects to have any chance of being crawled/indexed.

    Currently, I use these two tools as domain-boosting tools only. Many of the sites have decent DA/DR, so pointing them at the homepage of money sites and then running a second tier on them using the link sources mentioned above is probably the best use.

    Indexing them these days is tough. Google recently targeted redirects in one of their spam updates, so things have changed.
    Thanked by londonseo
  • londonseo London, UK
    sickseo said:
    I've tested that. There are two issues with doing that: you're just creating tiers of redirects, neither of which is likely to get indexed, and the sheer volume of links that one tool makes creates an endless queue of links to process in the second tool. You can reduce link numbers by applying the PR filter, though.

    To get these types of links indexed, you need to point other types of links at them - blog comments, guestbooks, wikis, articles. Those link types are good for crawling/indexing, so you'd need to point them at the redirects for the redirects to have any chance of being crawled/indexed.

    Currently, I use these two tools as domain-boosting tools only. Many of the sites have decent DA/DR, so pointing them at the homepage of money sites and then running a second tier on them using the link sources mentioned above is probably the best use.

    Indexing them these days is tough. Google recently targeted redirects in one of their spam updates, so things have changed.

    sickseo said:
    The Web 2.0 engine doesn't make any links, so I don't use it.

    SEO Indexer and URL Redirect Pro I do use, but as standalone tools to link-blast new domains or individual URLs. Combined they make about 25k links.

    Connecting them to every GSA SER install I have just creates an endless queue of backlinks to process. Basically, the two tools can't keep up with the volume of links coming from GSA SER. Besides, the same link sources are in my GSA SER site list and are already being used as T2.

    @sickseo - for the combined 25k links, would you set SER to post 100 times per account and make 250 accounts?
  • sickseo London, UK
    I just set up a T2 campaign in GSA and let it run for several weeks. My setup auto-resets campaigns every 24 hours, so it's completely hands-free. There's no need to use scheduled posting and reposting on multiple accounts.

    But sure, you can do it that way as well. As long as you build T2s, that's all that matters. The only benefit of the reposting settings is that the project will make more links before it needs to be reset; otherwise it will reach the end of your site list and stop making new links.
  • londonseo London, UK
    sickseo said:
    I just set up a T2 campaign in GSA and let it run for several weeks. My setup auto-resets campaigns every 24 hours, so it's completely hands-free. There's no need to use scheduled posting and reposting on multiple accounts.

    But sure, you can do it that way as well. As long as you build T2s, that's all that matters. The only benefit of the reposting settings is that the project will make more links before it needs to be reset; otherwise it will reach the end of your site list and stop making new links.

    Thanks for your help and contribution to this forum. B)


    Is there an option in SER to auto-reset campaigns?


  • sickseo London, UK
    It's not built into GSA SER, so I had to make my own solution for it.

    First you need to install a piece of software called Cyber-D's Autodelete on the same machine where you have GSA SER.

    [screenshot: Autodelete filter setup]

    Then you add the filters shown above, which auto-delete specific files from your GSA install. This is the file path to the GSA projects folder: C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects

    Those files correspond to target URLs, the target URL cache, submitted links, verified links, and accounts. The setup replicates the manual steps of deleting the target URL cache, history and accounts, as well as deleting verified and submitted links. In essence, it performs a full reset of the link-building data.

    So if you don't want any of these deleted, it's probably best not to use it, or set up your own filters based on what you want reset.

    The .success filter removes verified links, so that's the one to be careful with: if you're running T1 campaigns, you may not want your T1 links deleted.

    To control this, I rename my projects with "Power Up" and the filter uses that as a mask (*Power Up.success*), so only projects labelled "Power Up" have their verified links deleted each time the software runs.

    Then I use Windows Task Scheduler to control when the program runs. Currently I've got it set to run every 24 hours, but it can be scheduled for any number of days or even hours.

    [screenshot: Task Scheduler trigger settings]

    If you want the settings file for the above setup, I can share it. You can then use the import function in the Autodelete software to get the same filters.
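
    For anyone who'd rather script this than install a separate tool, here's a minimal Python sketch of the same reset idea. Only the .success extension, the *Power Up.success* mask and the projects path come from the post above; the commented-out masks are placeholders to copy from your own Autodelete filter list, and you'd want SER stopped or idle when it runs.

        # reset_gsa_projects.py - sketch replicating the Autodelete reset in Python.
        # Only .success, the *Power Up.success* mask and PROJECTS_DIR come from the
        # post above; the commented masks are placeholders for your own filters.
        import glob
        import os

        PROJECTS_DIR = r"C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects"

        MASKS = [
            "*Power Up.success*",          # verified links, limited to "Power Up" projects
            # "*.placeholder_cache_ext",   # placeholder: target URL cache files
            # "*.placeholder_history_ext", # placeholder: submission history files
        ]

        def reset_projects(dry_run: bool = True) -> None:
            # Dry-run by default so you can review what would be deleted.
            for mask in MASKS:
                for path in glob.glob(os.path.join(PROJECTS_DIR, mask)):
                    if dry_run:
                        print("Would delete:", path)
                    else:
                        os.remove(path)
                        print("Deleted:", path)

        if __name__ == "__main__":
            reset_projects(dry_run=True)  # flip to False once the listed files look right

    To mirror the 24-hour schedule, register it with Windows Task Scheduler, e.g. schtasks /Create /SC DAILY /TN "GSA Reset" /TR "python C:\scripts\reset_gsa_projects.py" /ST 03:00 (the script path there is just an example).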
  • londonseo London, UK
    sickseo said:
    It's not built into GSA SER, so I had to make my own solution for it.

    First you need to install a piece of software called Cyber-D's Autodelete on the same machine where you have GSA SER.

    [screenshot: Autodelete filter setup]

    Then you add the filters shown above, which auto-delete specific files from your GSA install. This is the file path to the GSA projects folder: C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects

    Those files correspond to target URLs, the target URL cache, submitted links, verified links, and accounts. The setup replicates the manual steps of deleting the target URL cache, history and accounts, as well as deleting verified and submitted links. In essence, it performs a full reset of the link-building data.

    So if you don't want any of these deleted, it's probably best not to use it, or set up your own filters based on what you want reset.

    The .success filter removes verified links, so that's the one to be careful with: if you're running T1 campaigns, you may not want your T1 links deleted.

    To control this, I rename my projects with "Power Up" and the filter uses that as a mask (*Power Up.success*), so only projects labelled "Power Up" have their verified links deleted each time the software runs.

    Then I use Windows Task Scheduler to control when the program runs. Currently I've got it set to run every 24 hours, but it can be scheduled for any number of days or even hours.

    [screenshot: Task Scheduler trigger settings]

    If you want the settings file for the above setup, I can share it. You can then use the import function in the Autodelete software to get the same filters.

    Thanks a lot for this! B)
  • londonseo London, UK
    edited October 2024
    sickseo said:
    The Web 2.0 engine doesn't make any links, so I don't use it.

    SEO Indexer and URL Redirect Pro I do use, but as standalone tools to link-blast new domains or individual URLs. Combined they make about 25k links.

    Connecting them to every GSA SER install I have just creates an endless queue of backlinks to process. Basically, the two tools can't keep up with the volume of links coming from GSA SER. Besides, the same link sources are in my GSA SER site list and are already being used as T2.

    I noticed that using the multiple-URLs option does not give me ~25k links for each URL.

    I have to do it individually.

    Did you have this issue?


  • sickseo London, UK

    I'm getting just under 25k URLs combined with the multiple-URLs option.
    Thanked by londonseo
  • londonseo London, UK
    edited October 2024
    sickseo said:

    I'm getting just under 25k URLs combined with the multiple-URLs option.

    A single URL gives just under 25k links combined.

    So if I had 3 URLs and used the multiple-URL option, it should give just under 75k links, but this is not the case.

    @sven - Is there a reason for this?


  • sickseo London, UK

    That's using the single-URL option. There's a slight difference in numbers, but that could be related to connection issues, proxies, or sites not working during submission.

    URL Redirect Pro is giving an extra 1,300 links when using the single-URL option.
  • londonseo London, UK
    sickseo said:

    That's using the single-URL option. There's a slight difference in numbers, but that could be related to connection issues, proxies, or sites not working during submission.

    URL Redirect Pro is giving an extra 1,300 links when using the single-URL option.

    Yes, so with the multiple-URL option, one should get X times the number of links.
  • sickseo London, UK
    edited October 2024
    You do get X times the number of links with multiple URLs, but the link numbers shown in the software are for the current run only. When it starts processing a new URL, those stats are reset.

    Are you using the "save to file" and "create reports" options on the tools? I'm getting separate files of links for each URL from each tool. The link numbers in each file do correlate with the above.
    Thanked by londonseo
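
    If you want to sanity-check those totals, a quick script can tally the per-URL report files. This is a sketch assuming each tool saved one plain-text report per URL with one link per line; the report path is a placeholder.

        # count_links.py - tally links across the per-URL report files.
        # Assumes one plain-text report per URL with one link per line;
        # REPORT_GLOB is a placeholder path to wherever the reports were saved.
        import glob

        REPORT_GLOB = r"C:\reports\*.txt"

        total = 0
        for path in glob.glob(REPORT_GLOB):
            with open(path, encoding="utf-8", errors="ignore") as fh:
                count = sum(1 for line in fh if line.strip())
            print(path, ":", count, "links")
            total += count

        print("Total across all reports:", total)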
  • londonseo London, UK
    OK, I was not using the "save to file" and "create reports" options.

    Let me try that.

    Thanks again B)
  • Shen Sri Lanka
    Thanks sickseo, that really helped clear things up :)

    I've been testing GSA Captcha Breaker on top of XEvil with RankerX for the past two days, and so far Captcha Breaker hasn't been able to solve any of the captchas that XEvil has sent over; they've been forwarded to 0captcha instead. I'm going to retest with GSA once it's up and running, but I've got a feeling it will probably be the same. I'm thinking about using the DeepSeek option instead of 0captcha as a backup. Does anyone have experience with this? Is it cheaper than 0captcha?

    I've just bought a list from GSA Verify Link List - Shen e-Services (recommended by GSA). I've been speaking to him on WhatsApp and he even set everything up for me :)

    I just need to figure out the email account options and then I think I'm all done with the setup. Any advice on the email side of things? I'm thinking about creating an email account for each domain, for example seo@test1.com for test1.com backlinks and seo@test2.com for test2.com backlinks. What do you think? Will one email account be OK, or will I need a couple per domain? Can I use the same email for all campaigns, including new ones?
    Thank you for your feedback.
  • codyjason New York

    1. Which Tier Links to Import from RankerX to GSA?

    Short Answer: Only import Tier 2 and Tier 3 links into GSA, not Tier 1.

    • Tier 1 links from RankerX are your most valuable (Web 2.0s, Wiki, etc.), often pointing directly to your money site.

    • GSA is great for volume-based spammy links (blog comments, profiles, trackbacks, etc.), which can dilute your Tier 1 quality if not used carefully.

    • Ideal setup:

      • GSA targets Tier 2 and 3 RankerX links, strengthening them without risking penalties to your money site (a filtering sketch follows below).
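
    If RankerX hands you a combined link export rather than per-tier files, a small script can keep only the tiers you want before importing into GSA. This is a minimal sketch assuming a CSV export with "url" and "tier" columns; the file names and column names are assumptions, so match them to your actual RankerX export format.

        # filter_tiers.py - keep only tier 2/3 links from a RankerX export before
        # importing them into GSA as the URLs to boost. The file names and the
        # "url"/"tier" column names are assumptions - adjust to your export.
        import csv

        KEEP_TIERS = {"2", "3"}

        with open("rankerx_export.csv", newline="", encoding="utf-8") as src, \
                open("tier23_urls.txt", "w", encoding="utf-8") as dst:
            for row in csv.DictReader(src):
                if row.get("tier", "").strip() in KEEP_TIERS:
                    dst.write(row["url"].strip() + "\n")

    The resulting tier23_urls.txt can then be pasted into the GSA project as the URLs to build links to.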


    2. Can You Use XEvil with GSA or Do You Need GSA Captcha Breaker?

    Short Answer: Yes, you can use XEvil with GSA, and no, you don’t need GSA Captcha Breaker if XEvil is already working well.

    • GSA natively supports XEvil via CapMonster/XEvil API integration.

    • GSA Captcha Breaker is good, but XEvil is far superior in solving complex captchas (ReCaptcha v2/3).

    • Just configure XEvil to accept GSA connections, and you're good (a quick connectivity check follows below).
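
    As a quick way to confirm XEvil is reachable before pointing GSA at it, you can hit its AntiGate-compatible HTTP endpoint. The 127.0.0.1:80 address and the dummy key below are assumptions - XEvil's listening port is configurable in its settings, and it generally accepts any non-empty key.

        # xevil_check.py - ping XEvil's AntiGate-compatible API before wiring GSA to it.
        # The 127.0.0.1:80 address and the dummy key are assumptions; check the
        # port configured in your XEvil settings.
        import urllib.request

        XEVIL_URL = "http://127.0.0.1:80/res.php?key=anykey&action=getbalance"

        try:
            with urllib.request.urlopen(XEVIL_URL, timeout=5) as resp:
                print("XEvil responded:", resp.read().decode(errors="ignore"))
        except OSError as exc:
            print("XEvil not reachable - check the port in its settings:", exc)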


    3. Do You Need to Buy an SER List If You're Only Doing Tier 2/3?

    Short Answer: Yes, a list is highly recommended, even for Tier 2/3 campaigns.

    • Public scraping can be time-consuming and ineffective nowadays.

    • A decent SER list ensures high success rates and verified link submissions.

    • Good, affordable list providers:

      • Asia Virtual Solutions

      • Loopline

      • SerVerifiedLists

      • SWE lists (budget-friendly)

    • Look for lists that include contextual platforms (Articles, Wikis, etc.), even for Tier 2, for better indexing and power (a de-duplication sketch follows below).
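
    One useful first step with any purchased list is de-duplicating it by domain before importing. SER has its own remove-duplicates tools as well; this is just the same idea as a standalone sketch, with placeholder file names.

        # dedupe_list.py - de-duplicate a purchased site list by domain before
        # importing it into SER. INPUT_FILE/OUTPUT_FILE are placeholders;
        # the input is assumed to hold one URL per line.
        from urllib.parse import urlparse

        INPUT_FILE = "verified_list.txt"
        OUTPUT_FILE = "deduped_list.txt"

        seen: set[str] = set()
        kept: list[str] = []

        with open(INPUT_FILE, encoding="utf-8", errors="ignore") as fh:
            for line in fh:
                url = line.strip()
                if not url:
                    continue
                domain = urlparse(url).netloc.lower()
                if domain and domain not in seen:
                    seen.add(domain)
                    kept.append(url)

        with open(OUTPUT_FILE, "w", encoding="utf-8") as fh:
            fh.write("\n".join(kept))

        print(f"Kept {len(kept)} URLs across {len(seen)} unique domains")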


    4. Handling New Links from RankerX in GSA (Updates + Duplication)?

    Short Answer: You have two options: add new links to existing projects or clone/duplicate and create new ones.

    • Adding to existing campaigns:

      • Open your existing GSA project, go to “Options” → “Where to Submit” and paste the new URLs into the “URLs to promote” section.

    • Duplicating campaigns:

      • Yes, GSA allows you to duplicate campaigns easily. Right-click → Duplicate → Exact Copy or with modifications.

    • Best Practice:

      • If new links are from the same campaign or niche, add to the existing.

      • For different projects/niches, clone or create new projects for better organization.


    5. Are 10 Proxies from BuyProxies.org Enough for RankerX and GSA?

    Short Answer: 10 proxies is borderline minimal for running both tools, but it can work if you throttle threads and stagger usage.

    • RankerX is usually light on proxies (especially if you’re running just 1-2 campaigns).

    • GSA is proxy-hungry, especially for scraping and submitting.

    • Recommendation:

      • Run fewer threads (20-50 max) in GSA.

      • Make sure your proxies are private, not shared.

      • Use a proxy manager to split them between tools, or buy an extra 10 if budget allows (a liveness-check sketch follows below).
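
    Before splitting the 10 proxies between the two tools, it's worth checking they're all alive. GSA has a built-in proxy tester too; this is just a standalone sketch, and the proxy entries are placeholders to fill in from buyproxies.org.

        # proxy_check.py - quick liveness check for the proxies before splitting
        # them between RankerX and GSA. The PROXIES entries are placeholders;
        # fill in ip:port (and user:pass if your provider uses authentication).
        import urllib.request

        PROXIES = [
            "http://user:pass@1.2.3.4:8080",  # placeholder
            "http://user:pass@5.6.7.8:8080",  # placeholder
        ]
        TEST_URL = "http://www.example.com"

        for proxy in PROXIES:
            opener = urllib.request.build_opener(
                urllib.request.ProxyHandler({"http": proxy, "https": proxy})
            )
            try:
                opener.open(TEST_URL, timeout=10)
                print("OK  ", proxy)
            except OSError as exc:
                print("DEAD", proxy, "-", exc)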


    ✅ Bonus Tips:

    • Indexing: Consider using indexing tools or services (like 1Indexer) to ensure your GSA links actually get crawled and boost your tiers.

    • Clean Up: Regularly clean GSA's target URLs and verified lists to maintain performance.

    • Backups: Back up your GSA projects weekly to avoid data loss (see the sketch after this list).

    • Logs: Check logs often to ensure you're not getting captchas blocked or proxy bans.
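
    For the weekly backup tip, a filesystem-level sketch like the one below can be scheduled with Task Scheduler alongside the reset script from earlier in the thread. The backup destination is a placeholder, and the projects path is the one mentioned above; SER also has its own project backup feature if you prefer to stay in the GUI.

        # backup_gsa.py - zip the SER projects folder for a weekly backup.
        # BACKUP_DIR is a placeholder destination; PROJECTS_DIR is the folder
        # mentioned earlier in the thread. Schedule weekly via Task Scheduler.
        import datetime
        import os
        import shutil

        PROJECTS_DIR = r"C:\Users\Administrator\AppData\Roaming\GSA Search Engine Ranker\projects"
        BACKUP_DIR = r"C:\backups"  # placeholder

        stamp = datetime.date.today().isoformat()
        archive = shutil.make_archive(
            os.path.join(BACKUP_DIR, f"gsa_projects_{stamp}"), "zip", PROJECTS_DIR
        )
        print("Backup written to", archive)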
