
SERnuke Custom GSA SER Engines

cherub SERnuke.com
edited May 29 in Buy / Sell / Trade


Introducing SERnuke

Custom Private Engines for GSA Search Engine Ranker

We can now offer several packages of engines that enable SER to submit to a variety of new platforms. Working via an API key, SER will be able to download engines and their updates automatically.
Expand your range of link building targets with engines covering several link types. Each licence is a one-off cost, and includes lifetime updates.
Our engines are aimed at experienced SER users. If you're not already having success building links via the software, it's probably best to concentrate on your working knowledge of the existing engines and practices before looking at ours.

How it works

Register for an account on our site
Find a package of engines that you are interested in
Purchasing the package will give you access to an API key
Add this API key to the APIs section of your copy of SER
Start using your new engines!
Purchasing additional licences will allow you to use the engines on multiple copies of SER at the same time
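
If you want to sanity-check connectivity before pasting the key into SER, a rough sketch is below. The endpoint URL here is purely hypothetical, for illustration only; the real API details come with your SERnuke dashboard and documentation.

```python
# Hypothetical connectivity check for a SERnuke API key.
# ASSUMPTION: the endpoint below is made up for illustration --
# consult the SERnuke documentation for the real URL and parameters.
import requests

API_KEY = "your-api-key-here"
ENDPOINT = "https://sernuke.com/api/engines"  # hypothetical placeholder

resp = requests.get(ENDPOINT, params={"key": API_KEY}, timeout=30)
print(resp.status_code)  # a 200 would suggest the key is accepted
```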

What you will need

A copy of GSA SER
A decent set of proxies
A captcha solving service or app
Email accounts
A good working knowledge of SER usage
The ability to scrape your own lists
Our engine packs are limited to 100 sales each. This is to try and ensure the domains they can submit to don't get oversaturated with links. We've all seen what can happen to new platforms being added to SER by default.

Register Here For A Free Account!


Comments

  • Allan seoservices.com.br
    Could anyone who bought it tell us about their experience with it?

    Are there many different sites per engine?

    I saw the $77 price tag; I'd like to hear the opinions of those who have already used it.
  • cherub SERnuke.com
    Hi @Allan, would you like me to send you some samples from both current packages?
  • s4nt0s Houston, Texas
    @Allan - I was one of the early testers for SERnuke. I set up a basic project, only used one SERnuke engine, and got around 63 unique domains, with 58 of those being Dofollow links. I even got a nice .edu link in the mix. I ran a PR check using DomDetailer's Moz Domain Authority on the subdomains, and you can see the stats below:



    That was without additional scraping in Scrapebox. I just did a basic run to test things out, and I was very happy with the results. 
  • cherub SERnuke.com
    I should probably point out that @s4nt0s tested out a smaller engine that was only used for testing purposes and is not available for purchase. The two current packages for sale offer a vastly broader spectrum of targets.
    Thanked by 1 s4nt0s
  • Allan seoservices.com.br
    cherub said:
    Hi @Allan, would you like me to send you some samples from both current packages?
    Please send them to me, I would be very grateful.
    Thanked by 1 cherub
  • Allan seoservices.com.br
    s4nt0s said:
    @Allan - I was one of the early testers for SERnuke. I set up a basic project, only used one SERnuke engine, and got around 63 unique domains […]
    Looks interesting, what engine did you use?

    I saw that it has two options.

    Are they contextual links?
  • cherub SERnuke.com
    Allan said:
    cherub said:
    Hi @Allan, would you like me to send you some samples from both current packages?
    Please send them to me, I would be very grateful.
    Samples sent to your PM :)
  • Allan seoservices.com.br
    How many unique verified targets will I get with each engine?
  • cherub SERnuke.com
    Allan said:
    How many unique verified targets will I get with each engine?
    That is not something I can answer with any degree of accuracy, as it will depend on the scraping efforts of the end user, functionality of their captcha solvers, proxies, emails etc - just like standard SER engines. However, I do not put a package up for sale unless I can get at least 400+ verified links with only minimal scraping using the footprints supplied, with no additional keywords used.
  • sickseo London, UK
    I bought the wowonder engine and have been scraping and testing for less than a day. So far I have 461 verified links and growing. I'm interested to see how many sites I can find after a month of scraping.



    First impressions - really good engine, and it looks like there will be a lot of targets. The included footprints work really well with Scrapebox. I haven't done a full run - just tested each footprint with a small group of keywords to see if the scraped links worked (see the query-merge sketch after this post).

    Mixed do follow and no follow; so far about 25% are do follow, but numbers will depend on what you can scrape.

    The highest DA so far is DA37, but the vast majority of the links I've scraped are DA1 - DA15. But these values will go up as people start building tiered links to the sites.

    @cherub Glad you got this service off the ground. Looking forward to the next set of engines.
    Thanked by 2 cherub, londonseo
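
The footprint-testing approach sickseo describes above is easy to script. A minimal sketch of the query-merge step, assuming footprints.txt holds the supplied footprints and keywords.txt your keyword set (both filenames are placeholders); the output file can be pasted straight into Scrapebox or any other scraper:

```python
# Merge engine footprints with keywords into search queries,
# one footprint + quoted keyword per line (Scrapebox-style merge).
from itertools import product

with open("footprints.txt", encoding="utf-8") as f:
    footprints = [line.strip() for line in f if line.strip()]
with open("keywords.txt", encoding="utf-8") as f:
    keywords = [line.strip() for line in f if line.strip()]

with open("queries.txt", "w", encoding="utf-8") as out:
    for fp, kw in product(footprints, keywords):
        out.write(f'{fp} "{kw}"\n')
```

Starting with each footprint and a small keyword group, as described above, keeps the query count manageable before committing to a full run.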
  • Allan seoservices.com.br
    sickseo said:
    I bought the wowonder engine and have been scraping and testing for less than a day. So far I have 461 verified links and growing. […]
    You encouraged me to buy, thanks for the review!



    Thanked by 1 sickseo
  • Allan seoservices.com.br
    @cherub, I made the payment, when can you release the engines? I'm looking forward to using them hehehe
    Thanked by 1 cherub
  • sickseo London, UK

    I bought the other Git-Alikes engine too. Also seems like a very good one to go for. Mixed do follow/no follow links. Combination of contextual links and profile links. Similar site metrics to the wowonder engine. Found a few DA35 and DA37 sites, but most of what I've scraped so far is DA1-DA15.

    All engines combined, I've got 1700+ new sites to drop links on.
    Thanked by 2 cherub, londonseo
  • cherub SERnuke.com
    Allan said:
    @cherub, I made the payment, when can you release the engines? I'm looking forward to using them hehehe
    API keys should be available in your dashboard within 15 minutes of order payment. It looks like you got it up and running?
  • sickseo London, UK
    Just to update the link numbers from these 2 engines:


    The Gitea engine stands out, and there seem to be a ton of them out there. It makes both profile and contextual links. The profile link is do follow but the post URL is no follow. Total unique domains now stands at 3156 sites!
    Thanked by 1 cherub
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE: ---> asiavirtualsolutions
    I used mostly the Decidem engine as part of the beta group, and the results were satisfactory.
    I only added the Git-Alikes and WoWonder engines 2-3 days ago and still need to scrape target URLs. In the meantime, I have expanded the footprints for these engines in Footprint Studio.

    The support from @cherub has been good. When I had questions about the engines, he helped me quickly.
    Only time will tell how well the engines are maintained, but @cherub has been doing a great job so far.

    The CPU load from the SERNUKE engines is really good, unlike the other 2 paid web 2.0 services (which I would rather not name), which spawn many Chromedriver browser processes that, at times, crash GSA SER. With SERNUKE I have no issues at all with the load from these engines.

    Below are my results so far. Remember that the new engines were only added about two days ago.
    I observed that Xevil is showing a lot of 'Bad site key | incorrect parameters' errors for the Gitlab and newer engines, so I wondered what ReCaptcha solving setup you guys have been using. Maybe @Sven can see if the ReCaptcha handling can be fine-tuned.


    Now, all that remains for me to do is scrape targets for the new engines, and then we will see the performance after running a full month.
    I am also including the SERNUKE target URLs in the Asia Virtual Solutions GSA Site list so SERNUKE users can use the target URLs I am scraping.

    Looking forward to the new engines to be released by @cherub
    Thanked by 1 cherub
  • edited June 24
    Xevil is also showing a lot of 'Bad site key' errors for me. I think it may be that the websites don't support IPv6 proxies; I use IPv6 proxies with Xevil.
  • cherubcherub SERnuke.com
    'Bad Sitekey' error messages in Xevil are unfortunately not something that can be remedied via a SER engine file. I recommend you always have a backup solver service when using Xevil.

    If you find sites that give 'Bad Sitekey' messages, and on checking manually the ReCaptcha is working and allowing registration (many CMS sites are misconfigured, or just don't get round to setting up ReCaptcha properly), it's best to send them to GSA in order to possibly improve SER's ReCaptcha detection/extraction functions.
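
For that manual check, one quick option is to pull the sitekey straight out of the page HTML. A minimal sketch, assuming the standard ReCaptcha v2 data-sitekey embed; the URL is a placeholder for whatever page produced the error:

```python
# Fetch a registration page that produced 'Bad Sitekey' errors and
# extract any ReCaptcha site keys embedded in the HTML, so you can
# check whether the widget is configured at all.
import re
import requests

url = "https://example.com/register"  # placeholder: the page SER failed on
html = requests.get(url, timeout=30).text

# Standard ReCaptcha v2 embeds carry the key as data-sitekey="..."
keys = re.findall(r'data-sitekey="([^"]+)"', html)
print(keys or "no ReCaptcha sitekey found in the HTML")
```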
  • Allan seoservices.com.br
    Xevil is also showing a lot of 'Bad site key' errors for me. I think it may be that the websites don't support IPv6 proxies; I use IPv6 proxies with Xevil.
    This is a problem for Xevil, which will have to be solved by them, unfortunately.
  • Allan seoservices.com.br
    I started scraping today; after only 1 day of scraping I got these targets.

    So far everything is working very well. Another detail is the indexing: so far I've managed to index 80% of the backlinks, and I still haven't started the tier 2 and 3 link pyramid, which is very good.



    Thanked by 1 cherub
  • cherub SERnuke.com
    A new engine has been added to the WoWonder and Co package.

    For existing API owners, SER should download the new engine during its periodic update checks; alternatively you can force a download of the engines via Options > Advanced > APIs > the corresponding Update button. You may also wish to redownload the documentation PDF from your dashboard section, to access footprints and other target URL sources.
    Thanked by 1 sickseo
  • Is there a coupon code? I want to buy.
  • cherub SERnuke.com
    Is there a coupon code? I want to buy.
    Sorry, we do not have any active coupon codes currently.
  • cherub said:
    Is there a coupon code? I want to buy.
    Sorry, we do not have any active coupon codes currently.

    Ok, can I pay with PayPal?
  • cherub SERnuke.com
    cherub said:
    Is there a coupon code? I want to buy.
    Sorry, we do not have any active coupon codes currently.

    Ok, can I pay with PayPal?
    We accept debit/credit card payment via Stripe, or a variety of cryptocurrencies, but not PayPal, sorry.
  • cherub SERnuke.com
    A new engine has been added to the Git-Alikes package. Additionally, some updates have been made to the WoWonder engine to improve the posting of HTML in articles.

    For existing API owners, SER should download the new engine during its periodic update checks; alternatively you can force a download of the engines via Options > Advanced > APIs > the corresponding Update button. You may also wish to redownload the documentation PDF from your dashboard section, to access footprints and other target URL sources.
    Thanked by 1 sickseo
  • cherub SERnuke.com

    Package #3 has been released!


    Named the Employment Package, it's based on job search/portal platforms and consists of 4 engines, each posting contextual links in articles.

    Check it out here!

    Thanked by 1 sickseo
  • sickseo London, UK
    New package looks very good. Lots of new do follow link sources. Got it a few days ago and have found a lot of new working sites already.

    Lots of good footprints are included in the Employment package, and I've only used 3 footprints from each engine so far to scrape with, so there are plenty more sites to be found in the latest package.



    That's the site numbers I've got for all 3 sets of packages. Many thousands of new sites. Over 10K lol!

    Many thanks to @cherub. It's really brought the software back to life with tons of new targets. These should last a long time as well, as access to these sites is pretty exclusive.

    I'm just using them as T1 to boost my referring domains on projects. Personally I wouldn't use them on T2 as the sites would get spammed too quickly and reduce their shelf life.

    Thanked by 2 cherub, londonseo
  • iamzahidali United States
    sickseo said:
    New package looks very good. Lots of new do follow link sources. Got it a few days ago and have found a lot of new working sites already. […]

    I think employment backlinks can be indexed almost instantly with the Google Indexing API, as it was supposed to be used on those real-time-update websites.
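
For context, Google's Indexing API is officially restricted to pages carrying JobPosting or BroadcastEvent structured data, which is why employment-platform URLs are a plausible fit. A minimal sketch of a URL_UPDATED notification, assuming a Google Cloud service account key file (service-account.json) whose account has owner access to the property in Search Console; the target URL is a placeholder:

```python
# Send a URL_UPDATED notification to the Google Indexing API.
# ASSUMPTIONS: service-account.json is a valid service account key,
# and that account is verified as an owner of the target property.
from google.oauth2 import service_account
from google.auth.transport.requests import AuthorizedSession

SCOPES = ["https://www.googleapis.com/auth/indexing"]
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES)
session = AuthorizedSession(creds)  # a requests.Session with OAuth2 auth

resp = session.post(ENDPOINT, json={
    "url": "https://example-jobs-site.com/your-backlink-page",  # placeholder
    "type": "URL_UPDATED",
})
print(resp.status_code, resp.json())
```

Whether Google actually indexes pages submitted this way without JobPosting markup is not guaranteed, so treat it as an experiment.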
  • sickseo London, UK
    If you are scraping Google to find these sites, then every single link will be indexable. You just need to point some links at them to get them indexed.
    Thanked by 1 Deeeeeeee