Most “Where to Submit” engines seem inactive today
verdemuschio
Italy
Hello everyone,
Sorry if some of this isn’t 100% accurate; I’m not an expert. I just collected information from around the web because I was struggling to understand why some of the “Where to Submit” engines in GSA SER were no longer producing any links.
I hope it’s okay to share what I noticed, and maybe it can help others too.
From what I understand, many of the old engines (especially forum scripts and CMS platforms) simply don’t work in 2025. They either don’t exist anymore, or the websites using them have switched to modern protections like Cloudflare, JavaScript checks or login systems that GSA SER can’t bypass.
For example, engines like phpBB, SMF, vBulletin, XenForo, IPBoard, Joomla Blog, Drupal Blog, Wordpress Article, BuddyPress, and many article directory scripts — they just don’t allow automated posting anymore. Most of them require JS, or captcha steps that SER can’t handle, or have closed registrations completely.
The same seems to happen with many old directory scripts, gallery scripts, video CMS, social network clones, Q&A scripts, guestbooks, classified ads scripts, and so on. They used to work years ago, but today almost all installations have either been abandoned, shut down, or protected behind systems that block bots.
Because of this, even if the engines are still listed inside SER, in real life almost none of them generate backlinks anymore. I’m not saying this to criticize — just trying to understand why my projects weren’t producing anything, even with good proxies, good emails and the correct setup.
Personally, I think SER still has a lot of potential, but maybe the submission engines need to be updated with newer platforms that still allow automated posting today.
Only a few engines still work consistently (things like MediaWiki, DokuWiki, TikiWiki, simple blog comments, public bookmarks, rentry.co, etc.). Everything else looks “alive” in the interface, but in practice doesn’t produce results.
So this is just my humble opinion, as a normal user:
maybe it’s time to add new engines to match the modern web, because most old engines simply don’t work anymore.
Again, sorry if something I wrote is not perfectly correct — I’m still learning.
Thank you for reading.
Comments
I just want to clarify something because it looks like my post may have been misunderstood, and I really didn’t want to create any controversy.
I did not mean to criticize GSA SER at all. SER is a solid piece of software, and I fully respect the work behind it. What I described is not a problem caused by SER — it’s simply how the modern web has changed over the years. Many auto-postable platforms that existed 10–15 years ago are now protected, offline, or no longer allow automated submissions.
This affects every backlink software, not only SER.
I only wrote my message because I’m a user of SER and I like the software, so naturally I’m hoping to see it grow and adapt. My intention was not to complain, but to share what I noticed and maybe help improve things if possible.
If my wording sounded wrong or too direct, that wasn’t intentional.
I’m just a normal user trying to understand what still works today and what could be updated.
Thanks for your understanding.
I fully agree with @verdemuschio and @Konstantin.
The reality is that the web has changed, and the software needs to adapt to 2025 defenses (Cloudflare, JS checks, etc.). Currently, we are burning through good proxies and emails trying to post to scripts that effectively no longer exist.
A "Spring Cleaning" of the engine list or a dedicated update for modern platforms would make SER significantly more efficient. Quality over quantity is the only way forward now. Hopefully, the dev team can take this feedback on board!
Thanks everyone for the discussion — it really helped clarify what’s going on today with SER.
I’ll add my perspective from a technical point of view, without any pressure — just ideas that might be useful.
From my own tests, the problem is not SER as software.
The problem is that the web has fundamentally changed:
many classic CMS engines from 2010–2016 don’t exist anymore
a lot of registration forms have been removed or locked
Cloudflare / JS challenges became default on many platforms
anti-bot fields and dynamic tokens became standard
some engines still work, but success rates are naturally much lower
No tool can post where a form no longer exists — that’s simply how the modern web evolved.
So instead of asking for major changes, here are practical and realistic suggestions that could help SER stay effective without rewriting the software.
1) A small, community-driven “engine status review”
Not to overload Sven — the opposite.
Users can collect:
example URLs
footprints
form behavior
tokens
hidden fields
notes about anti-bot elements
Sven would only need to review and integrate what is already prepared.
This makes the process manageable.
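To make this concrete: the contributor-side pre-check could be as small as the Python sketch below. Everything in it is hypothetical (file names, columns, footprint strings), and a real check would also need to handle redirects and rate limits, but it shows the idea:

import csv
import requests

# Expected input, one row per candidate target, e.g.:
#   engine,url,footprint
#   phpBB,https://example.com/forum/index.php,Powered by phpBB
def check_targets(infile="targets.csv", outfile="engine_status.csv"):
    with open(infile, newline="") as src, open(outfile, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(["engine", "url", "http_status", "footprint_found", "note"])
        for row in csv.DictReader(src):
            try:
                resp = requests.get(row["url"], timeout=15,
                                    headers={"User-Agent": "Mozilla/5.0"})
                found = row["footprint"].lower() in resp.text.lower()
                # A cf-ray response header usually means Cloudflare is in front
                note = "cloudflare" if "cf-ray" in resp.headers else ""
                writer.writerow([row["engine"], row["url"],
                                 resp.status_code, found, note])
            except requests.RequestException as exc:
                writer.writerow([row["engine"], row["url"], "error", False, exc])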
2) Label engines, instead of removing them
Just something simple like:
Working (2025)
Low success
Likely inactive
This helps users avoid wasting proxies and emails.
Even this tiny change would have a big impact.
3) Small improvements for modern anti-bot logic
Nothing big — not JS rendering.
But things like:
keeping session cookies
retrying with the same session
passing basic timestamp hidden fields
reading simple dynamic tokens
Even these minimal additions would noticeably increase success rates on some platforms.
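As a rough illustration of what "keep the session and copy the hidden fields" means in practice, a sketch like this (Python; the URL, field names and the naive regex are all placeholders) already covers a lot:

import re
import requests

def submit_with_session(form_url, visible_fields):
    session = requests.Session()  # session cookies persist across requests
    page = session.get(form_url, timeout=15).text
    # Naive pattern for hidden inputs (dynamic tokens, timestamps, honeypots);
    # real pages vary in attribute order, so a proper HTML parser is safer
    hidden = dict(re.findall(
        r'<input[^>]*type="hidden"[^>]*name="([^"]+)"[^>]*value="([^"]*)"',
        page))
    data = {**hidden, **visible_fields}
    resp = session.post(form_url, data=data, timeout=15)
    if not resp.ok:  # retry once with the SAME session, as suggested above
        resp = session.post(form_url, data=data, timeout=15)
    return resp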
4) Allow community-contributed engines
Maybe once per month:
one or two user-created engines
Sven reviews them
adds them if they meet quality standards
This keeps SER evolving without increasing Sven’s workload.
5) Simple engine diagnostics
Something like a per-engine summary in the log or UI: targets found, submissions attempted, verified links, and the most common failure reason (captcha, timeout, filter, form not found).
This would help users understand what is happening, without guessing.
Final thoughts
SER still has strong potential in 2025 —
but the engine list simply needs a bit of cleanup and modernization,
and the community is ready to help with the data collection part.
No pressure, no demands —
just ideas that many users might find useful moving forward.
Thanks again for keeping SER alive and for being open to this conversation.
Don't get me wrong, SER is still the best tool out there. I think I have bought around 40 copies of it and have been around the forums since it first came out. New and updated engines that actually work would improve a lot of SEO for those of us who use it, and reinvigorate interest from new buyers. Hell, Sven could even charge a monthly fee for updating engines and adding new ones, and I would pay it.
I'd absolutely love to update the engines myself, and trust me, I've tried, countless ways: hiring people, using ChatGPT/Antigravity/Claude to update engines, spending the time to test and run them. It just took away from my other work, so I can't devote much time to it. So if anyone is able to update or add new engines and charges for it, I'm all for it.
I don't have all that much money, and I don't see why I should pay for additional engines when I've already paid for software that currently isn't giving me backlinks. This isn't a criticism of SER, but that's a terrible idea.
I completely agree
I see how much work cherub and others put into updating engines because of the constant changes, so I'm going to assume it's a lot of work. I don't see how anyone would be willing to do that for free, or at least stay motivated to keep on top of it for free.
It would be great, and I do love the idea of the community getting together to solve some of these issues and update engines. I just don't see it happening, but hey, I would love nothing more than to be wrong about it.
Since we already have several users confirming the same observations, maybe the most practical next step is to collect some real data instead of opinions.
Not to overload Sven — but to give him something he can actually work with.
I propose a very simple community-driven engine check.
Nothing big — just a lightweight status sheet.
Here’s the structure:
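One row per engine test, with fields along these lines (the exact wording is only a suggestion, pieced together from the checks discussed below):

Engine: name of the SER engine tested
Example URL: one live target
Form reachable: yes / no
Manual submit works: yes / no
SER submit result: success / captcha / timeout / filter (SFS) / form not found
Verified link: yes / no
Notes: tokens, hidden fields, anti-bot behavior, anything unusual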
Even 2–3 tests per user would give us a clear picture very quickly.
This is NOT about criticizing SER —
it’s simply about understanding which engines still have real targets in 2025 and which ones are gone or changed.
If anyone has:
a working example URL,
or a form that still submits,
or even a platform that partially works,
please share one or two examples using the format above.
Once we have even a small set of fresh data,
we can summarize it here and this will give Sven a realistic starting point
without expecting him to test hundreds of engines himself.
Various forums (and other platforms) are using the StopForumSpam (SFS) platform. If you try to create an account on 2-3 sites manually right now, it will certainly work. But if you do it in bulk through SER, it will fail because the anti-spam plugin triggers. Or the proxies used. Or the captcha solver. Or the email address. Running some phpBB forums myself, I can see a huge number of denied registrations with these typical patterns when automatic link building tools are used.
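As a side note for anyone building such a dataset: emails and proxy IPs can be pre-checked against the public StopForumSpam API before blaming the engine. A minimal Python sketch (endpoint and the "appears" flag as documented by SFS; double-check the current API terms before using it in bulk):

import requests

def sfs_flagged(email=None, ip=None):
    # Returns True if StopForumSpam has the email or IP on record
    params = {"json": "1"}
    if email:
        params["email"] = email
    if ip:
        params["ip"] = ip
    data = requests.get("https://api.stopforumspam.org/api",
                        params=params, timeout=10).json()
    # Each checked field comes back as {"appears": 0 or 1, "frequency": ...}
    return any(field.get("appears")
               for field in data.values() if isinstance(field, dict))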
Another approach could be to run a target URL list
- through SER
- through another tool
and compare the results, i.e. accounts registered. This way we could create a solid dataset, filter by platform and look into the relevant GSA engines.
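The comparison itself could be as simple as diffing the two exports, assuming each tool can dump the URLs where an account was actually registered (file names here are hypothetical):

def compare_runs(ser_file="ser_registered.txt", other_file="other_registered.txt"):
    with open(ser_file) as f:
        ser = {line.strip() for line in f if line.strip()}
    with open(other_file) as f:
        other = {line.strip() for line in f if line.strip()}
    print("registered by both:", len(ser & other))
    print("only SER:", len(ser - other))
    # Targets only the other tool handled are the interesting ones:
    # the platform is alive, so the SER engine is probably fixable
    print("only the other tool:", len(other - ser))
    return sorted(other - ser)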
I am optimistic that small changes in the popular engines (wait, timeout, strings) will already provide visible results.
You made a very valid point — manual testing and automated posting behave differently, especially on platforms that use SFS, anti-bot plugins or rate-limit patterns.
So instead of abandoning the idea, I think we can refine it to make the data more realistic.
What we really need is a 2-layer dataset:
1) Manual form check
Just to confirm that the engine script still exists and the platform is not completely dead.
(Otherwise SER has no chance at all.)
2) Automated SER check
Using the same target list — so we can compare:
form reachable
form submits manually
SER submit result
verification result
logs: captcha, timeout, filter, SFS response
This way we can separate:
platforms that are dead,
platforms that are alive but require engine adjustments,
and platforms that block automated footprints.
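In code terms the bucketing is almost trivial once both layers exist; a sketch with assumed field names matching the checklist above:

def classify(record):
    # record = one row of the combined dataset; all keys are assumptions
    if not record["form_reachable"]:
        return "dead"
    if record["ser_submit_ok"] and record["verified"]:
        return "working"
    if record["manual_submit_ok"]:
        if record.get("failure") in ("captcha", "filter", "SFS"):
            return "blocks automated footprints"
        return "alive, engine needs adjustment"
    return "unclear, retest"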
Once we have this difference mapped, even small engine updates (as you said — waits, tokens, patterns, timeouts) can significantly improve success rates.
This combined approach gives Sven actionable data without requiring him to test hundreds of URLs manually.
I tested several "Where to Submit" engine combinations for a Tier 1 project and noticed something that might help other users or perhaps even improve future versions of SER.
Initially, my Tier 1 worked perfectly when using a very limited list of simple engines such as:
BlogEngine
Bravenet Comments
PHP Fusion Comments
Easy Guestbook
OpenBook Guestbook
Web Wiz Guestbook
Public Bookmark / Scuttle
Trackback / Trackback-Format2
DokuWiki / MediaWiki / TikiWiki
Rentry.co
Press Release Scripts
With this list, Tier 1 Active immediately turned blue, started searching for targets, and started submitting content. Everything worked exactly as expected.
Then it turned green. After a while it went back to blue, then green again, alternating intermittently.
Later, I expanded the list by adding four more engines:
WordPress Article
Drupal – Blog
Joomla – Blog
After adding these four engines, the project stopped being blue, and SER was no longer performing active searches. It seemed like SER had "nothing to do," even though everything else (proxies, keywords, emails, filters) remained unchanged.
After some testing, I figured out what was happening:
Simple engines generate new targets from keyword searches, even when the global site lists are empty.
But CMS-based engines like WordPress/Joomla/Drupal do NOT generate new targets from keyword searches. They rely on existing site lists or URLs already discovered by other engines.
So, when I added these four engines, SER tried to work with them first, but since there were no pre-existing target URLs for these CMS platforms (and no site lists were populated yet), the project immediately ran out of matching targets and stopped being blue.
When I removed those four engines, the project started working again immediately.
In short:
Simple engines = SER can find new targets from keywords → the project remains active.
CMS engines (WordPress/Joomla/Drupal) = rely on existing site lists or already-discovered URLs → with empty lists, the project has nothing to work on and stops.
This isn't a bug, but it might help users understand why a perfectly good Tier 1 suddenly stops searching when some "heavier" engines are added.
Maybe SER could one day handle this situation more gracefully, but for now at least the behavior makes sense once you know what’s going on.