
Most “Where to Submit” engines seem inactive today

Hello everyone,

Sorry if some of this might not be 100% accurate — I’m not an expert. I just tried to collect information here and there on the web because I was struggling to understand why some of the “Where to Submit” engines in GSA SER were not producing any links anymore.
I hope it’s okay to share what I noticed, and maybe it can help others too.

From what I understand, many of the old engines (especially forum scripts and CMS platforms) simply don’t work in 2025. They either don’t exist anymore, or the websites using them have switched to modern protections like Cloudflare, JavaScript checks or login systems that GSA SER can’t bypass.

For example, engines like phpBB, SMF, vBulletin, XenForo, IPBoard, Joomla Blog, Drupal Blog, Wordpress Article, BuddyPress, and many article directory scripts — they just don’t allow automated posting anymore. Most of them require JS, or captcha steps that SER can’t handle, or have closed registrations completely.

The same seems to happen with many old directory scripts, gallery scripts, video CMS, social network clones, Q&A scripts, guestbooks, classified ads scripts, and so on. They used to work years ago, but today almost all installations have either been abandoned, shut down, or protected behind systems that block bots.

Because of this, even if the engines are still listed inside SER, in real life almost none of them generate backlinks anymore. I’m not saying this to criticize — just trying to understand why my projects weren’t producing anything, even with good proxies, good emails and the correct setup.

Personally, I think SER still has a lot of potential, but maybe the submission engines need to be updated with newer platforms that still allow automated posting today.
Only a few engines still work consistently (things like MediaWiki, DokuWiki, TikiWiki, simple blog comments, public bookmarks, rentry.co, etc.). Everything else looks “alive” in the interface, but in practice doesn’t produce results.

So this is just my humble opinion, as a normal user:
maybe it’s time to add new engines to match the modern web, because most old engines simply don’t work anymore.

Again, sorry if something I wrote is not perfectly correct — I’m still learning.
Thank you for reading.


Comments

  • edited November 26
The tendency here is toward a change in approach rather than toward improving the quality of solving. What will Sven say about this?
  • Konstantin (Ukraine)
    edited November 27
    I have to say this thread reflects the exact reality of 2024–2025, and it’s not just a temporary slowdown or a proxy issue — it’s a structural problem across almost all legacy platforms SER was originally built around.
    Most “Where to Submit” engines are not just performing poorly…
    They are objectively dead in the modern web:
     - phpBB / SMF / MyBB / vBulletin / PunBB → registrations closed, captchas upgraded, anti-bot layers everywhere
     - Drupal / Joomla / older CMS engines → forms now require JS, dynamic tokens, invisible captchas
     - Guestbooks, directories, article platforms → either wiped out, turned into private sites, or filled with anti-spam walls
     - Many domains from old engines are simply abandoned, expired, or redirect to generic hosts
    The result is simple:
    The engine list looks huge — but the number of actually alive and postable engines in 2025 is extremely small.
    SER still reports “submitted” counts, but the verified output shows the real picture:
    a tiny number of engines produce almost everything that still works today.
    In practice, only a few categories remain somewhat reliable:
    - some MediaWiki instances
    - a small fraction of open blog comment systems
    - a handful of public paste/notes platforms
    - very rare CMS installations without JS challenges
    Everything else is practically non-functional from an automated posting perspective.
    This is not a complaint — it’s simply the current web environment.
    Anti-automation systems, Cloudflare, tokenized forms, login barriers, and real-time JS checks have eliminated almost all of the “classic” automated posting targets.
    ---
    And here is the important part:
    SER remains an extremely powerful tool — but the engine list no longer reflects real-world survivability.
    Right now the UI suggests hundreds of engines.
    In reality, maybe 5–10% are capable of producing links in 2025.
    From a product perspective, this means:
    - many engines need a fresh audit
    - some should probably be retired
    - others need modernized scripts or new approaches
    - the ecosystem SER was built on has changed dramatically
    I’m not criticizing the software — it has served the community for over a decade —
    but it might be time to rethink the engine layer and bring it in line with today’s web.
    If that happens, SER could stay relevant for many more years.
  • Konstantin (Ukraine)
    edited November 27
    One more point worth mentioning — and I think many long-term SER users will agree:
    It might be extremely valuable to run a modern-day “engine audit” and identify which engines still produce verifiable results in 2025.
    Not to remove anything right away, but simply to get a realistic picture of what’s alive and what’s not. The market has changed drastically, and this information alone would already improve the efficiency of SER campaigns.
    With such transparency, users could focus on engines that actually work instead of spending resources on legacy platforms that no longer exist in practice.
    Just an idea — but I believe it could make SER significantly stronger going forward.
    Thanked by: Hunar
  • edited November 26
    Still relevant.
  • I just want to clarify something because it looks like my post may have been misunderstood, and I really didn’t want to create any controversy.

    I did not mean to criticize GSA SER at all. SER is a solid piece of software, and I fully respect the work behind it. What I described is not a problem caused by SER — it’s simply how the modern web has changed over the years. Many auto-postable platforms that existed 10–15 years ago are now protected, offline, or no longer allow automated submissions.
    This affects every backlink software, not only SER.

    I only wrote my message because I’m a user of SER and I like the software, so naturally I’m hoping to see it grow and adapt. My intention was not to complain, but to share what I noticed and maybe help improve things if possible.

    If my wording sounded wrong or too direct, that wasn’t intentional.
    I’m just a normal user trying to understand what still works today and what could be updated.

    Thanks for your understanding.

  • Thanks for clarifying — and I fully agree with you.

    None of this was meant as criticism toward SER or its development.
    Like you said, this is simply how the modern web evolved, and every tool in this space is facing the same reality.

    The reason this discussion is valuable is because SER still has great potential, especially for the types of engines that remain functional today.
    Understanding which platforms still work — and which ones no longer do — could help all of us use SER more efficiently.

    No controversy here at all — just users trying to understand the current landscape and share feedback that might help improve the experience for everyone.
  • I fully agree with @verdemuschio and @Konstantin.

    The reality is that the web has changed, and the software needs to adapt to 2025 defenses (Cloudflare, JS checks, etc.). Currently, we are burning through good proxies and emails trying to post to scripts that effectively no longer exist.

    A "Spring Cleaning" of the engine list or a dedicated update for modern platforms would make SER significantly more efficient. Quality over quantity is the only way forward now. Hopefully, the dev team can take this feedback on board!



  • It's time to completely revise GSA SER and do a global upgrade.
  • Sven (www.GSA-Online.de)
    Well, coding engines takes some time, especially if someone only gives me a bunch of URLs and that's it. Often they are not even from the same platform.
    I'm OK with adding new engines if:
    1. The platform consists of more than just a few URLs in the wild.
    2. Some of the work is done for me, such as footprints, an account to test things, maybe instructions on what to click to get a link...

    I agree that many engines are no longer working, as there are hardly any sites that use them, but I also cannot test this without a bit of help from you guys.

  • Hi @Sven,

    I remember that the list of search engines I provided caused some problems for some people, so I'm stepping aside. But if anyone can provide Sven with a good list, come on, guys. Sven tells us he's available, so let's help him with the URLs.
  • Of course, we are ready to help and will do it with pleasure.
    Thanked by: verdemuschio
  • Konstantin (Ukraine)
    edited November 29

    Thanks everyone for the discussion — it really helped clarify what’s going on today with SER.
    I’ll add my perspective from a technical point of view, without any pressure — just ideas that might be useful.

    From my own tests, the problem is not SER as software.
    The problem is that the web has fundamentally changed:

    • many classic CMS engines from 2010–2016 don’t exist anymore

    • a lot of registration forms have been removed or locked

    • Cloudflare / JS challenges became default on many platforms

    • anti-bot fields and dynamic tokens became standard

    • some engines still work, but success rates are naturally much lower

    No tool can post where a form no longer exists — that’s simply how the modern web evolved.

    So instead of asking for major changes, here are practical and realistic suggestions that could help SER stay effective without rewriting the software.


    1) A small, community-driven “engine status review”

    Not to overload Sven — the opposite.
    Users can collect:

    • example URLs

    • footprints

    • form behavior

    • tokens

    • hidden fields

    • notes about anti-bot elements

    Sven would only need to review and integrate what is already prepared.

    This makes the process manageable.
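
    To illustrate the kind of prepared data meant here, below is a minimal Python sketch (assuming the requests and beautifulsoup4 packages; the URL is a placeholder, not a real target) that dumps a page's forms and hidden fields so they can be attached to an engine report:

    # Minimal sketch: collect form details from a target page for an engine
    # report. Placeholder URL; requires `pip install requests beautifulsoup4`.
    import requests
    from bs4 import BeautifulSoup

    def inspect_forms(url: str) -> None:
        resp = requests.get(url, timeout=15)
        print(f"{url} -> HTTP {resp.status_code}, server: {resp.headers.get('Server')}")
        soup = BeautifulSoup(resp.text, "html.parser")
        for form in soup.find_all("form"):
            print("form action:", form.get("action"), "method:", form.get("method"))
            for field in form.find_all("input"):
                # Hidden inputs often carry anti-bot tokens or timestamps.
                print(f"  input name={field.get('name')} type={field.get('type', 'text')}")

    inspect_forms("https://example.com/register.php")  # placeholder target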


    2) Label engines, instead of removing them

    Just something simple like:

    • Working (2025)

    • Low success

    • Likely inactive

    This helps users avoid wasting proxies and emails.
    Even this tiny change would have a big impact.


    3) Small improvements for modern anti-bot logic

    Nothing big — not JS rendering.
    But things like:

    • keeping session cookies

    • retrying with the same session

    • passing basic timestamp hidden fields

    • reading simple dynamic tokens

    Even these minimal additions would noticeably increase success rates on some platforms.
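
    To make these points concrete, here is a small hypothetical Python sketch of that exact pattern: one session that keeps cookies, hidden fields (timestamps, simple tokens) echoed back into the POST, and a retry on the same session. It is only an illustration of the idea, not SER's actual implementation:

    # Sketch only: session reuse + echoing hidden fields back to the server.
    # URL and visible field names are placeholders.
    import requests
    from bs4 import BeautifulSoup

    def submit_with_session(form_url: str, payload: dict, retries: int = 2):
        session = requests.Session()  # cookies persist across all requests
        for attempt in range(retries + 1):
            page = session.get(form_url, timeout=15)
            soup = BeautifulSoup(page.text, "html.parser")
            data = dict(payload)
            # Copy hidden fields back: simple dynamic tokens / timestamp fields.
            for hidden in soup.find_all("input", type="hidden"):
                if hidden.get("name"):
                    data[hidden["name"]] = hidden.get("value", "")
            resp = session.post(form_url, data=data, timeout=15)
            if resp.ok:
                return resp  # success: stop retrying
        return None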


    4) Allow community-contributed engines

    Maybe once per month:

    • one or two user-created engines

    • Sven reviews them

    • adds them if they meet quality standards

    This keeps SER evolving without increasing Sven’s workload.


    5) Simple engine diagnostics

    Something like:

    Form not found  
    Registration disabled  
    Unexpected challenge  
    Engine likely inactive on this target
    

    This would help users understand what is happening, without guessing.
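
    As a toy example of how such statuses could be derived, here is a hypothetical Python sketch; the marker strings are illustrative guesses, not SER's real detection logic:

    # Toy diagnostic: map a fetched page to one of the statuses above.
    import requests

    def diagnose(url: str) -> str:
        try:
            resp = requests.get(url, timeout=15)
        except requests.RequestException:
            return "Engine likely inactive on this target"
        text = resp.text.lower()
        if resp.status_code in (403, 503) and "cloudflare" in text:
            return "Unexpected challenge"
        if "registration" in text and ("disabled" in text or "closed" in text):
            return "Registration disabled"
        if "<form" not in text:
            return "Form not found"
        return "No obvious blocker detected"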


    Final thoughts

    SER still has strong potential in 2025 —
    but the engine list simply needs a bit of cleanup and modernization,
    and the community is ready to help with the data collection part.

    No pressure, no demands —
    just ideas that many users might find useful moving forward.

    Thanks again for keeping SER alive and for being open to this conversation.

    Thanked by: Hunar
  • Konstantin said:
    Thanks everyone for the discussion — it really helped clarify what’s going on today with SER.
    I’ll add my perspective from a technical point of view, without any pressure — just ideas that might be useful.

    From my own tests, the problem is not SER as software.
    The problem is that the web has fundamentally changed.

    In fact, the thread was opened with exactly this premise, precisely to avoid sparking controversy over SER. The software was never blamed, and in my opinion, with the right settings, it works more than well. The problem was attributed to the web, which has completely changed, and to the difficulty of finding good proxies. That is what the discussion was opened about.
  • verdemuschio (Italy)
    edited November 30
    I'd like to point out that, despite my efforts, I haven't received any backlinks for weeks. I've started a new project, but I'm not getting any results. For search engines I'm using international ones. The engines "Where to submit" I've currently set up based on the tiers I believe work best are as follows:

    Tier 1 (Main project):
    Press Release Script
    Rentry.co
    Wordpress XMLRPC
    DokuWiki
    MediaWiki
    TikiWiki

    Tier 2 (tier):
    Press Release Script
    Rentry.co
    Wordpress XMLRPC
    General Blogs
    BlogEngine
    Bravenet Comment
    PHP Fusion Comment
    Directory97 PRO
    IndexScript
    NL Directory
    Freeglobes
    php Link Directory
    PHP Weby
    Wordpress Directory
    Easy Guestbook
    Guestbook
    OpenBook Guestbook
    Web Wiz Guestbook
    Piwigo
    Pixelpost
    ZenPhoto
    Pligg
    Plikli
    Drigg
    PHPDug
    Public Bookmark
    Scuttle
    Trackback
    Trackback-Format2
    DokuWiki
    MediaWiki
    TikiWiki

    Tier 3 (tier tier) and Tier 4 (tier tier tier):
    Press Release Script
    Rentry.co
    General Blogs
    Bravenet Comment
    PHP Fusion Comment
    Directory97 PRO
    IndexScript
    NL Directory
    Freeglobes
    php Link Directory
    PHP Weby
    Wordpress Directory
    Easy Guestbook
    Guestbook
    OpenBook Guestbook
    Web Wiz Guestbook
    Piwigo
    Pixelpost
    ZenPhoto
    Pingback
    Pligg
    Plikli
    Drigg
    PHPDug
    Public Bookmark
    Scuttle
    Trackback
    Trackback-Format2
    0.gp
    0a.sk
    0cn.de
    0rz.tw
    0x6f776f2e7663.net
    101.gs
    135.it
    1aas.com
    1ab.in
    1o2.ir
    1s.pt
    1url.cz
    2.gp
    2.ly
    222.at
    2cm.es
    2tu.us
    3.ly
    301.link
    3le.ru
    4.gp
    4.ly
    42.pl
    5du.pl
    6.gp
    6.ly
    6b.cz
    7.ly
    7ee.ir
    8.ly
    8em.pl
    9.ly
    92url.com
    9en.us
    afit.edu
    AllEasySEO.com
    alturl.com
    anasayfa.info
    atikon.at
    ay.gy
    barhosting.com
    bea.sh
    beanybux.me
    beget.tech
    bin.wf
    bit.ly
    bool.icu
    bpl.kr
    BrokenScript
    btty.in
    byte.my
    cfg.me
    chilp.it
    cia.sh
    cisn.xyz
    cleanuri.com
    clickstat.xyz
    CodeCanopy URL Shortener
    come.ac
    cortas.as
    cuts.top
    cutt.us
    d--b.info
    da.gd
    dropmylinks.com
    dssurl.com
    dstats.net
    due.im
    e49.us
    e7.al
    easyurl.cc
    el32.com
    elpais.com
    ennt.net
    etinyurl.com
    fadurl.com
    filoops.info
    firsturl.de
    fly2.cf
    Free URL shortener
    g.asia
    g9.yt
    General URL Shortener
    GET API URL Shortener
    gg.gg
    gmy.su
    go.ht.gs
    goto.now
    gourl.gr
    hasurl.com
    herotel.com
    hideuri.com
    hirz.ir
    holyballs.com
    hts.io
    ic9.in
    icks.ro
    icux.xyz
    in.mt
    ina.am
    inbox.lv
    inshort.link
    inx.inbox.lt
    is.gd
    ito.mx
    ity.im
    ivyti.es
    ix.sk
    j9.click
    jiotools.com
    jplopsoft
    kmk.party
    krati.co
    ku.ag
    kub.sh
    kuc.cz
    kurzelinks.de
    kurzurl.net
    l8.nu
    lc.cx
    lil.so
    lilURL
    link-cut.com
    linkbun.ch
    linkcuts.org
    linky.nu
    ln4.ru
    lnk.bz
    lnkm.ee
    lnkz.at
    lopurl.com
    lyhyt.eu
    lynx.re
    m17.ca
    make-t.in
    merky.de
    merq.org
    miffy.me
    minik.link
    minilink.pro
    minilinks.dev
    miniplease.com
    miniurl.top
    mub.me
    muz.so
    myminiurl.net
    nah.uy
    nene.cz
    nowlinks.net
    nutshellurl.com
    ogy.de
    ohi.im
    opnlink.com
    ouo.io
    ourl.in
    owo.gay
    owo.vc
    p.asia
    pastein.ru
    peekurl.com
    Phishyurl (adult)
    Phishyurl (crypto)
    Phishyurl (dating)
    Phishyurl (financial)
    Phishyurl (gambling)
    Phishyurl (phishing)
    Phishyurl (shopping)
    Phishyurl (tech)
    PHPurl
    Phurl
    plu.sh
    Polr
    Premium URL Shortener
    psee.io
    ptiturl.com
    pxl.fm
    qi.lv
    qr.net
    qrurl.cc
    reducelnk.com
    rlu.ru
    s.coop
    S6L ShortURL
    sdu.sk
    seomafia.net
    shorl.com
    short-link.me
    short-url.uk
    shorter.me
    shortlink.uk
    shorturl.at
    shorturl.la
    shorturl.ma
    shorturl.re
    shorturl.ru
    shr.name
    shrink.im
    shrt.in
    shrturi.com
    shrunken.com
    sim.link
    singkat.link
    sini.la
    skrat.it
    skrun.ch
    smi.link
    smlr.org
    snipurl.com
    soo.gd
    surl.me
    surl.site
    t1p.de
    tals.top
    tau.lu
    teil.cc
    thinfi.com
    thrixxx.me
    tighturl.com
    tinie.link
    tinilink.com
    tinlie.com
    tiny.cc
    tiny.ee
    tiny.pl
    tinylink.in
    tinylink.info
    tinylink.onl
    tinyplease.com
    tinyurl.com
    tinyurl.nu
    tinyurlshortener.com
    to.ly
    tu15.com
    tw.gs
    u.to
    ubyt.es
    ukaz.to
    umeedwar.com
    ur.link
    uri.im
    URL Redirect
    url.hys.cz
    url.ie
    url2a.site
    url3.ru
    urlc.net
    urlcrop.com
    urlgo.in
    urli.info
    urls.day
    urlshortener.biz
    urly.it
    ux.nu
    ux9.de
    v.gd
    viewsiterank.com
    visitmy.link
    vurl.com
    w2.am
    wall.sh
    wc.tv.br
    wck.me
    wdurl.ru
    webthemez.com
    wee.so
    xzx.kr
    ykm.de
    ylink.bid
    YOURLS
    yourstudent.com
    yy7.com
    zap.buzz
    znouze.eu
    zzb.bz
    DokuWiki
    MediaWiki
    TikiWiki
  • Deeeeeeee (the Americas)
    edited December 1
    I see how the Internet has changed over time...
    A lot has changed. Is it evolution? Devolution? Neither, but rather just cyclic change?
    Social media killed the small sites. No doubt about that.

    Now people SAY they're turning back to smaller sites, that they distrust the big socials...
    But is it really a trend?
    I do hope this becomes a significant trend, and not merely talk on social media.
    A lot of huffing and puffing, but ultimately not leaving or doing anything. Complaining about the medium they're using to complain. :| haha Classic!
    Remember when people were all "on AOL", and before that Q-Link? Then AOL had an Internet gateway. People moved away from AOL and provider-based services like forums and chat to websites offering the equivalent services. So will we ever be there again? I hope so!


    So what I was wondering is: what's the story for each engine? For the engines that never seem to produce any links, is it that there are no targets left and the sites are all gone, or that most still exist but have upped their game in one way or another to prevent "spamming", as forum users noted above in this thread? (I have firmly believed since the beginning that auto-posting is not necessarily equivalent to spam, because it is not.)
    Are so many types of engines gone because the platforms are TOTALLY dead? Is it just that they stopped updating and people moved along to new projects? Just curious about the reality of it all... there are so many engines in SER. I really didn't even clean up my list, because it seems once in a while I get a random link from an engine here and there. But maybe even that has stopped? I am surely wasting resources one way or the other. Other users have stated which engines are really still getting results; those are experienced users we probably should trust...


    I remember SEO tools that would locate platforms (comment, guestbook, etc.) and then let the user log in in real time through a browser window that opens. I actually found such tools incredibly useful and produced some good, stable links that way. Of course, it required me working in real time with the tool, but I was OK with that. I could also have just collected the links.

    Could we get SER to do that? I might be okay using such a tool, even now, if I can get links onto more difficult-to-auto-post sites.

    Is there an engine command to pop open a browser for the user, rather than for the engine to work with? I'm ashamed to say I never learned the engine construction language. It seems that would be a super-easy engine to write, if there is such a command: just scrape, identify, open a window. Maybe a different GSA tool? I could see the use for a lightweight free-standing tool even today, but really, it seems SER is best equipped to do this.

    Thinking about this, I know the idea behind GSA SER is total automation. Obviously. Some users on here have that down in ways that would probably amaze us.

    The easy targets, the SEO sites, the auto-approves, are still out there for some engines.

    GSA friends, make some of your own SEO sites on old platforms SER can post to that aren't abandoned projects (or popular ones today), and add your share to the number of "spam" SEO sites out there, with whatever posting criteria you choose. No way we can't make some topical, cool sites this way. Not sure which hosting companies are good for this. G00ggl3 shouldn't care: it's not paid links or link farms, nothing against the rules, just places to gain exposure. Just some BETTER auto-post platforms open to the general SEO auto-post public, but with restrictions for each niche. If ten of us each build ten, we'll have 100 new sites to auto-post to. Just another idea to help this stay alive... We could all just focus on ONE engine at a time. I am not sure about this idea... again, just thinking...

    Getting links on better sites takes more work.

    But the sites that are actually used by real people are not easy to auto-post to, while the "SEO" sites, the auto-post platforms that accept pretty much anyone's articles, are not really places anyone visits except to post SEO articles.

    GSA forum users above have already noted in detail the many new hurdles that the worthwhile sites still out there have put up. Not that (some of) this couldn't be dealt with. There are already some cool third-party engines that require a headless browser... but there are also limitations...
    Truthfully, I feel bad that we even expect @Sven to do the engine updating. We need to leave Sven time and space to write new software and update the GSA product line! I feel like this should be more of a group effort. Sorry for such a long response, everyone...
  • edited December 2
    I'd honestly love to see some engines updated, or even new engines that work. Honestly, the only reason I'm still having success with SER right now is the awesome engines that Cherub is doing. I've tried the other one, Serlists or whatever, but between all the crashes, the Chrome memory problems, and how much it slowed down the actual posting, it just wasn't worth it for me any longer. I'd say the only links being built for me right now for T1 are Cherub's engines, then a lot of wikis and some random WordPress and other engines here and there.

    Don't get me wrong, SER is still the best tool out there. I think I have bought around 40 copies of it and have been around the forums since it first came out. I think new or updated engines that work would improve a lot of SEO for those of us who use it and reinvigorate interest from new people buying it. Hell, Sven could even charge a monthly fee for updating engines / adding new ones, and I would pay it.

    I'd absolutely love to update the engines myself, and trust me, I've tried. I've tried countless ways: hiring people, using ChatGPT/Antigravity/Claude to update engines, spending the time to test and run them. It just took away from my other work, though, so I can't devote a lot of time to it. So if anyone is able to update or add new engines and charges for it, I'm all for it.
  • Hunar said:
    Hell,  Sven could even charge a monthly fee for updating engines/Adding new ones and i honestly would pay it.
    Would you pay for it? Well, then pay for it for me too!

    I don't have all that much money, and I don't see why I should pay for additional engines when I've already paid for software that currently isn't giving me backlinks. This isn't a criticism of SER, but that was a terrible idea.
    You tried. I chose SER because, compared to the others, it was comprehensive and cheaper software; it seemed like a good solution. Plus, I don't like subscriptions; I like one-time solutions. Again, if you're willing to pay more, pay for me too. We're trying to solve a problem, and you're offering paid solutions? I don't think so. I've already paid.
    Thanked by: Hunar
  • edited December 2
    I agree with Hunar and think Sven should charge a monthly fee if we expect him to constantly add engines and keep them updated.

    The reason GSA SER is so cheap and a one-time fee is BECAUSE Sven does not have time to keep the engines updated and constantly add more, and I understand that. That's time-consuming work, and it's the main reason other software like RankerX, SEO Neo, etc. mainly charges a monthly fee.

    To expect him to do all that on top of all the work he already does, for nothing, is insane. He even gives you a way to add more and fix them yourself, but no one has the time or wants to take the time to do it themselves. Why? Because it's time consuming. So yeah, I think he should charge a small monthly fee if you expect him to do all that work; sounds fair to me.
    Thanked by: Hunar
  • Hunar said:
    Hell,  Sven could even charge a monthly fee for updating engines/Adding new ones and i honestly would pay it.
    Would you pay for it? Well, then pay for it for me too!

    I don't have all that much money, and I don't see why I should pay for additional engines when I've already paid for software that currently isn't giving me backlinks. This isn't a criticism of SER, but that was a terrible idea.
    You tried. I chose SER because, compared to the others, it was comprehensive and cheaper software; it seemed like a good solution. Plus, I don't like subscriptions; I like one-time solutions. Again, if you're willing to pay more, pay for me too. We're trying to solve a problem, and you're offering paid solutions? I don't think so. I've already paid.

    I completely agree
  • edited December 2
    It was just an idea. I personally hate subscriptions as well; they really drive me crazy. But like Tank said, look at every other tool currently out there, and not even just link-building tools: they all have a subscription fee, and they all come with updates and new features. Sven already does this for free, which is absolutely amazing for how little SER costs.

    I see how much Cherub and others have to update engines because of the constant changes going on, so I'm going to assume it's a lot of work. I don't see how anyone would be willing to do that for free, or at least stay motivated to keep on top of it for free.

    It would be great, and I do love your idea about the community getting together, solving some of these issues, and updating engines. I just don't see it happening, but hey, I would love nothing more than to be wrong about it.
    Thanked by: Tank
  • Konstantin (Ukraine)
    edited December 3

    Since we already have several users confirming the same observations, maybe the most practical next step is to collect some real data instead of opinions.
    Not to overload Sven — but to give him something he can actually work with.

    I propose a very simple community-driven engine check.
    Nothing big — just a lightweight status sheet.

    Here’s the structure:

    Engine name:
    Test URL:
    Date tested:
    Result: (submitted? verified?)
    Notes: (captcha? JS? hidden fields? Cloudflare response?)
    

    Even 2–3 tests per user would give us a clear picture very quickly.

    This is NOT about criticizing SER —
    it’s simply about understanding which engines still have real targets in 2025 and which ones are gone or changed.

    If anyone has:

    • a working example URL,

    • or a form that still submits,

    • or even a platform that partially works,

    please share one or two examples using the format above.

    Once we have even a small set of fresh data,
    we can summarize it here and this will give Sven a realistic starting point
    without expecting him to test hundreds of engines himself.
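
    If it helps, part of the sheet could even be filled in automatically. A minimal Python sketch (the CSV columns mirror the format above; the reachability check is a simplistic placeholder, and the submitted/verified result still has to come from SER's own logs):

    # Sketch: append one status-sheet row per tested URL to a shared CSV.
    import csv
    import datetime
    import requests

    def record_check(engine: str, url: str, notes: str,
                     path: str = "engine_status.csv") -> None:
        try:
            status = requests.get(url, timeout=15).status_code
            result = f"reachable (HTTP {status})"
        except requests.RequestException as exc:
            result = f"unreachable ({type(exc).__name__})"
        with open(path, "a", newline="") as fh:
            csv.writer(fh).writerow(
                [engine, url, datetime.date.today().isoformat(), result, notes])

    record_check("MediaWiki", "https://example.org/wiki", "no captcha seen")  # placeholders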


  • Konstantin said:
    Here’s the structure:

    Engine name:
    Test URL:
    Date tested:
    Result: (submitted? verified?)
    Notes: (captcha? JS? hidden fields? Cloudflare response?)

    Even 2–3 tests per user would give us a clear picture very quickly.
    I'm afraid this approach will give false results. Testing a target URL manually is totally different from what an automatic link-building tool does.

    Various forums (and other platforms) use the StopForumSpam (SFS) service. If you try to create an account on 2-3 sites manually right now, it will certainly work. But if you do it in bulk through SER, it will fail because of the anti-spam plugin triggers. Or the proxies used. Or the captcha solver. Or the email address. Running some phpBB forums myself, I can see a huge number of denied registrations showing these typical patterns when automatic link-building tools are used.

    Another approach could be to run a target URL list
    - through SER
    - through another tool
    and compare the results, i.e. accounts registered. This way we could create a solid dataset, filter by platform and look into the relevant GSA engines.

    I am optimistic that small changes in the popular engines (wait, timeout, strings) will already provide visible results.
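
    For the comparison step itself, diffing the two verified-URL exports would already be enough to start. A tiny sketch, assuming plain text files with one URL per line (the file names are placeholders):

    # Sketch: compare verified-URL exports from SER and another tool.
    def load(path: str) -> set:
        with open(path) as fh:
            return {line.strip() for line in fh if line.strip()}

    ser = load("ser_verified.txt")
    other = load("other_tool_verified.txt")
    print("only SER:", len(ser - other))
    print("only other tool:", len(other - ser))
    print("both:", len(ser & other))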



  • Konstantin (Ukraine)
    edited December 3

    You made a very valid point — manual testing and automated posting behave differently, especially on platforms that use SFS, anti-bot plugins or rate-limit patterns.
    So instead of abandoning the idea, I think we can refine it to make the data more realistic.

    What we really need is a 2-layer dataset:

    1) Manual form check

    Just to confirm that the engine script still exists and the platform is not completely dead.
    (Otherwise SER has no chance at all.)

    2) Automated SER check

    Using the same target list — so we can compare:

    • form reachable

    • form submits manually

    • SER submit result

    • verification result

    • logs: captcha, timeout, filter, SFS response

    This way we can separate:

    • platforms that are dead
      from

    • platforms that are alive but require engine adjustments
      from

    • platforms that block automated footprints

    Once we have this difference mapped, even small engine updates (as you said — waits, tokens, patterns, timeouts) can significantly improve success rates.

    This combined approach gives Sven actionable data without requiring him to test hundreds of URLs manually.
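
    To pin down the fields, one row of that 2-layer dataset could look like this (a hypothetical record layout, nothing more):

    # Hypothetical record for the 2-layer dataset described above.
    from dataclasses import dataclass

    @dataclass
    class EngineCheck:
        engine: str
        target_url: str
        form_reachable: bool    # layer 1: manual check
        manual_submit_ok: bool  # layer 1: manual check
        ser_submitted: bool     # layer 2: automated SER run
        ser_verified: bool      # layer 2: automated SER run
        log_notes: str          # captcha / timeout / filter / SFS response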

  • verdemuschio (Italy)
    edited December 8

    I tested several "Where to Submit" engine combinations for a Tier 1 project and noticed something that might help other users or perhaps even improve future versions of SER.

    Initially, my Tier 1 worked perfectly when using a very limited list of simple engines such as:

    General Blogs
    BlogEngine
    Bravenet Comments
    PHP Fusion Comments
    Easy Guestbook
    OpenBook Guestbook
    Web Wiz Guestbook
    Public Bookmark / Scuttle
    Trackback / Trackback-Format2
    DokuWiki / MediaWiki / TikiWiki
    Rentry.co
    Press Release Scripts

    With this list, Tier 1 Active immediately turned blue, started searching for targets, and started submitting content. Everything worked exactly as expected.

    Then it turned green again. After a while, it turned blue again, and then green again, intermittently.

    Later, I expanded the list by adding four more engines:

    Wordpress XMLRPC
    WordPress Article
    Drupal – Blog
    Joomla – Blog

    After adding these four engines, the project stopped being blue, and SER was no longer performing active searches. It seemed like SER had "nothing to do," even though everything else (proxies, keywords, emails, filters) remained unchanged.

    After some testing, I figured out what was happening:

    Simple engines generate new targets from keyword searches, even when the global site lists are empty.

    But CMS-based engines like WordPress/Joomla/Drupal do NOT generate new targets from keyword searches. They rely on existing site lists or URLs already discovered by other engines.

    So, when I added these four engines, SER tried to work with them first, but since there were no pre-existing target URLs for these CMS platforms (and no site lists were populated yet), the project immediately ran out of matching targets and stopped being blue.

    When I removed those four engines, the project started working again immediately.

    In short:

    Simple engines = SER can find new targets from keywords → the project remains active.

    CMS engines (WordPress/Joomla/Drupal) = rely on existing site lists → the project stops when no matching targets exist.

    This isn't a bug, but it might help users understand why a perfectly good Tier 1 suddenly stops searching when some "heavier" engines are added.

    Maybe SER could one day handle this situation more gracefully, but for now at least the behavior makes sense once you know what’s going on.
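
    For readers unfamiliar with why the "simple" engines keep finding targets: they pair engine footprints with the project's keywords to build search queries. A rough illustration (the footprints and keywords below are made-up examples, not SER's actual lists):

    # Illustration of footprint-based target discovery.
    footprints = ['"Powered by MediaWiki"', '"Powered by DokuWiki"']
    keywords = ["gardening", "home brewing"]

    # Each footprint/keyword pair becomes one search-engine query.
    queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]
    for q in queries:
        print(q)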

    Thanked by: Konstantin
  • Based on what has been discussed in this thread, I decided to test the Where to submit engines once again to better understand the situation. Even though SER was running correctly and the connection was active for weeks, no backlinks were being created. At that point, I wanted to understand whether the problem was caused by my configuration or whether many of the engines I was using were simply no longer working.

    After testing again from scratch, the only engines that produced positive results were Social Bookmark and URL Shortener engines. This helped me confirm that SER itself was working correctly, and that the issue was related to engine compatibility and filtering, not to the software or the connection.

    Below I describe exactly what I did, step by step.

    1. Initial problem

    The project was running normally, but no verified backlinks were being created.

    SER had been active for weeks without errors.

    The counters were not increasing and no Tier was unlocking.

    2. First analysis

    The project was set to Tier 1 – VISIBLE LINK ONLY.

    Several engines were selected under Where to submit.

    Despite activity, no real submissions were happening.

    This suggested that SER was running, but had no valid targets to post to.

    3. Testing a simple target (Rentry)

    I created a very simple text page.

    The page contained:

    A title

    A short description

    One visible URL (naked link)

    No formatting, no anchors, no keywords forced.

    This page was used as a clean and stable destination for testing.

    4. URL verification

    I imported the URL into the project.

    I used Show URLs → Verify.

    Verification completed successfully.

    This confirmed that SER could detect and verify the target correctly.
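
    Conceptually, verification just means re-fetching the page and checking that the link is actually present. A stand-alone sketch of that idea (placeholder URLs, not SER's code):

    # Stand-alone illustration of what "verify" means.
    import requests

    def link_is_live(page_url: str, backlink: str) -> bool:
        resp = requests.get(page_url, timeout=15)
        return resp.ok and backlink in resp.text

    print(link_is_live("https://rentry.co/example", "https://my-site.example"))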

    5. Anchor text clarification

    I initially filtered engines by removing those that do not publish anchor text.

    This turned out to be a mistake.

    Important clarification:

    Social Bookmark and URL Shortener engines do not require anchor text.

    They publish naked URLs only.

    Filtering these engines based on anchor text removed all working engines from the project.

    6. Press Release Script test

    I created a simple press release script.

    The content included:

    A title

    A short descriptive text

    One visible URL

    No anchor enforcement was applied.

    This was used to test platforms that accept basic textual content with visible links.

    7. Correcting the Where to submit engines

    I reset the engine selection.

    I selected only:

    Social Bookmark

    URL Shortener

    I did not remove engines based on anchor text capability.

    This was the key correction.

    8. Tier duplication

    I duplicated the project structure:

    Tier 1 – VISIBLE LINK ONLY

    Tier 2 – VISIBLE LINK ONLY

    Tier 3 – VISIBLE LINK ONLY

    Each tier links to the previous one.

    No content changes were made between tiers.

    9. URL shorteners behavior

    Multiple short URLs were used.

    Each short URL points to the same final destination.

    The links were:

    Successfully submitted

    Verified

    Shown as naked URLs (expected behavior)

    No anchor text was required or forced.
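
    Checking a short URL is just as simple conceptually: follow the redirects and confirm the final destination. A quick sketch with placeholder URLs:

    # Sketch: confirm a short URL resolves to the intended destination.
    import requests

    def shortener_ok(short_url: str, expected: str) -> bool:
        resp = requests.get(short_url, timeout=15, allow_redirects=True)
        return resp.url.rstrip("/") == expected.rstrip("/")

    print(shortener_ok("https://is.gd/abc123", "https://my-site.example"))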

    10. Final result

    Submissions started to appear.

    Verified links were created.

    The Tier system started working correctly.

    The "no targets to post to" situation was resolved.

    11. Final conclusion

    The problem was not caused by SER itself.

    Once the correct engines were selected, SER started working as expected.

    Final Note

    The problem is not SER, but rather the Web itself, which has changed. It might be helpful to adapt to the new standards.