*EDIT* - SEREngines V2 will be closing new sign ups soon and will most likely not be updating V2 any further. The developer is now focused on building V3.
I can't stress high-quality proxies and human-solved captchas enough, guys. I've been using 50 semi-dedicated proxies, 2captcha, and yahoo/mail.ru emails, and I'm seeing excellent success rates.
Can someone recommend a service for yahoo/mail.ru emails? I usually use the SEOSpartans catchall service, which is excellent but it looks like I'll need an additional provider.
Also can someone say what the 7 engines currently are so I can get a flavour for it?
Ampedpages
Edublogs
Onesmablog
Rediff
Webgarden Blog
Wikidot
Wordpress
Could I ask a question about the number of threads the new SERengines uses? It seems V2 uses real browsers (presumably Chromium), so if I set the number of threads to 7, will there be 7 browsers running at the same time, or just one for all engines?
Alden: Each engine/thread is a separate isolated custom browser. So yes, setting it to 7 threads will launch 7 threads, each with its own isolated custom browser. They are closed at the end of the job.
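The per-thread isolation described here can be sketched in Python: each worker gets its own throwaway profile directory, and everything is torn down when the job ends. Note this is only an illustration of the pattern; `IsolatedBrowser` is a stand-in class, not a SERengines internal, and a real implementation would drive an actual browser process.

```python
import shutil
import tempfile
from concurrent.futures import ThreadPoolExecutor

class IsolatedBrowser:
    """Stand-in for a real browser process; each instance owns a
    private profile directory so sessions never share state."""
    def __init__(self):
        self.profile_dir = tempfile.mkdtemp(prefix="ser_profile_")

    def submit(self, engine):
        # A real implementation would drive the browser here.
        return f"submitted to {engine} via {self.profile_dir}"

    def close(self):
        # Remove the profile so nothing leaks between jobs.
        shutil.rmtree(self.profile_dir, ignore_errors=True)

def run_job(engine):
    browser = IsolatedBrowser()       # one isolated browser per thread
    try:
        return browser.submit(engine)
    finally:
        browser.close()               # closed at the end of the job

engines = ["Ampedpages", "Edublogs", "Onesmablog", "Rediff",
           "Webgarden Blog", "Wikidot", "Wordpress"]

# 7 threads -> up to 7 isolated browsers alive at the same time
with ThreadPoolExecutor(max_workers=7) as pool:
    results = list(pool.map(run_job, engines))
```

Each thread creates, uses, and destroys its own browser, which is why memory scales with the thread count.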
@micha - I'm not sure if that's 100% decided yet, but I believe it will be 2 instances per license.
@jpvr90 - I believe they should all be dofollow, but I'll double check and get back to you. The main focus will be dofollow engines unless users are requesting a certain engine that isn't.
s4nt0s: I ran SERengines 2.0 a few times, but I only got 2 successful submissions: ampedpages.com and onesmablog.com. These seem to be Money Robot's sites (bought domains). Also, the browsers seem to consume quite a lot of memory: 7 instances were created and they consumed nearly 700 MB. I think you should disable image loading somehow to save resources.
@nereupo So you can't make any WordPress blog? @S4nt0s Is there any money-back option if I cancel the service within 3 days? I want to test this and make sure I'll get links from more platforms than Money Robot's sites.
@nereupo - There will be many more engines added; over the last week or two they have been focusing on stabilizing the headless browser (used for submission). There are still some issues being worked out. 700 MB seems pretty high for 7 instances, so that might be something to look into; it hasn't gotten that high on my server.
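On the image-loading suggestion: in Chromium-based headless browsers this is usually done with a Blink flag or a content-settings preference. The helper below just assembles those standard Chromium values; how SERengines configures its browser internally is not public, so treat this as a general sketch.

```python
def image_blocking_config():
    """Chromium command-line flags and profile preferences that stop
    images from downloading, cutting memory and bandwidth use."""
    flags = [
        "--headless",
        "--blink-settings=imagesEnabled=false",  # skip image decoding
    ]
    prefs = {
        # 2 = block; the "managed" variant cannot be overridden per-site
        "profile.managed_default_content_settings.images": 2,
    }
    return flags, prefs

flags, prefs = image_blocking_config()
```

With 7 instances at ~100 MB each, skipping image decode is one of the cheaper ways to bring that footprint down.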
If you mouse over where you enter the API key in SER, you'll see a support token. If you provide that, he'll be able to see what was causing the submission errors. Please PM me the # and he'll check it out.
@micha - You might want to hold off for a few days while we get some things sorted out.
Keep in mind this isn't an official release, this is just testing so we can see how it works and adjust things as needed.
@joaquinrios - Not sure what you mean? It does work. I've been making hundreds of web 2.0's every day just testing things out.
Here's one I did earlier:
There is a crash that still occurs randomly while it's running, but it doesn't stop the projects from building links. The crash (a pop-up error box) is what's being worked on right now, while new engines are still released here and there so people can keep trying it out.
However, the crash needs to be solved before any kind of official release happens.
For the last 7-8 days it hasn't been working well. We work with links in Spanish. When I make a list using GSA's footprints and Spanish keywords, GSA SER leaves no links and just says "no engine matches". With the lists from before, it was still leaving comments two weeks ago; now, when I open a new project, the program leaves no links, even after I delete the cache and reset everything to zero. And I'm not the only one with this problem. This week we have been investigating and observed that it only works with lists we scrape in Scrapebox using GSA footprints and English keywords; with Spanish keywords it leaves no links at all, which it did before. It doesn't work with the verified lists I have, or with footprints that used to work. That is, I have my own footprints that a week ago produced Scrapebox lists with Spanish keywords which GSA SER handled well, but now it leaves no links. I have a list of 400 thousand URLs where a week ago it left a lot of links; two days ago I started a new campaign and it left no links, only "no engine matches". I hope this gets fixed soon, since it's no longer working the way it did before.
Running well for me... One thing GSA doesn't do is show the verified post. So when it creates, say, a WordPress blog, the only verified link we get is the main URL, not the actual post URL. So when tiering, all I'm hitting with T2 is the main URL.
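One possible workaround for only having the main blog URL verified: most of these blog platforms expose an RSS feed, so the individual post URLs can often be recovered from the feed and given to Tier 2 directly. A minimal sketch, assuming a WordPress-style blog that serves RSS at `/feed/` (the helper names are mine, not GSA's):

```python
import urllib.request
import xml.etree.ElementTree as ET

def post_urls_from_rss(rss_xml):
    """Pull individual post links out of an RSS 2.0 feed document."""
    root = ET.fromstring(rss_xml)
    return [item.findtext("link")
            for item in root.iter("item")
            if item.findtext("link")]

def tier2_targets(blog_url):
    """Fetch a WordPress-style feed and return post URLs for tiering."""
    with urllib.request.urlopen(blog_url.rstrip("/") + "/feed/") as resp:
        return post_urls_from_rss(resp.read())
```

The resulting post URLs could then be imported into a T2 project as target URLs instead of the bare blog root.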
@bourbonkola - It would be better to post that in the GSA Search Engine Ranker section of the forum. This thread is specifically about SERengines and testing. I believe you're just using GSA Search Engine Ranker without SERengines.
I'm uploading two sample projects so people can see the settings. There is nothing special about how I setup the projects. You can dl them here: https://ufile.io/ba39
It's super important to use good proxies, captchas, and emails. We've seen people trying to use Captcha Breaker with SERengines, and if you do that, it's not going to work well.
I'm using 2captcha for captcha solving, mail.ru for emails (don't use catchalls), and my proxies are stormproxies/blazing proxies. I just happen to have both, so I threw them in there. Make sure your proxies are fast and anonymous.
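Checking that proxies are fast enough can be automated with a simple timing filter. This sketch takes the actual request function as an injected `probe` callable (so the filtering logic stays independent of any particular HTTP library); the threshold value is illustrative, not a recommended setting.

```python
import time

def fast_proxies(proxies, probe, max_seconds=2.0):
    """Keep only proxies whose probe round-trip stays under the limit.

    `probe` is any callable that makes a request through the given
    proxy and raises on failure; it is injected by the caller."""
    good = []
    for proxy in proxies:
        start = time.monotonic()
        try:
            probe(proxy)
        except Exception:
            continue                      # dead or blocked proxy
        if time.monotonic() - start <= max_seconds:
            good.append(proxy)
    return good
```

Running this periodically against your proxy list keeps slow or dead proxies from dragging down submission success rates.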
Regarding scheduled posting options, this is being reviewed to be more accurate.
It's good, very good. The only engine with a low success rate is WordPress. Has anyone run it successfully? I used Yandex & mail.ru emails. Can't wait for the new web 2.0 list.
@pelangi - The biggest issue I see with WordPress is creating and submitting too fast. This isn't a SERengines thing, but the protection mechanisms WordPress has in place. Try to space the article posts out to realistic times, and don't create too many accounts with a small number of proxies; otherwise the IP addresses used will overlap and be easy for WordPress to detect.
There is no recommended setting for this at the moment, so just don't be greedy and hammer the site too hard with a small number of IPs, or you could run into issues.
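In practice this advice boils down to two knobs: jittered delays between posts and a cap on accounts per IP. A rough sketch; the specific numbers here are illustrative guesses, not settings from the developers.

```python
import random

def posting_delay(base_minutes=30, jitter_minutes=20):
    """Human-looking wait between posts: a base plus random jitter,
    so posting times don't fall on a detectable fixed interval."""
    return base_minutes + random.uniform(0, jitter_minutes)

def max_accounts(proxy_count, accounts_per_ip=3):
    """Cap total accounts so the same IP isn't reused too often."""
    return proxy_count * accounts_per_ip

delay = posting_delay()
cap = max_accounts(50)   # e.g. 50 semi-dedicated proxies
```

Raising `accounts_per_ip` increases link volume but also the overlap the post above warns about; lowering it is the safer direction.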
I use BuyProxies and they work like a charm. I have tried Google hard-banned proxies, but it's as if Google shares its ban list with some web 2.0s and they ban them too, so I just pay extra for the better proxies these days.
The latest version of SERengines was pushed yesterday, so make sure you completely stop, then start, your projects to update. So far no crashes on our end with this version, so please let us know if you see a crash after updating. Hopefully it's gone.
Comments
I'll be back with a report. Thanks, Mr. @s4nt0s.
Can I use the same key on each of my instances, or do I need to buy multiple copies of SERengines?
I set my project up like this:
https://gyazo.com/293fd48f570e20c6f360a3d93eba7e5a
I was just looking at some web 2.0s: WP had 13 posts to it and the Rediff blog had 16 posts. Based on my settings, I assume 3 posts should be the max. Not a big deal on my end, just sharing what I see.
http://www.blazingseollc.com/account-store/