
Number of Accounts Created Per Domain and Number of Submitted Articles

prniches Lagos
edited July 2019 in Need Help
I am having issues with how SER handles the number of accounts it creates per domain and the number of articles it submits per account.

Below are my settings for posting:


However, for one particular DotNetNuke (Social Network) engine, it has created well over 148 different accounts, and it just keeps posting to the same few domains several times a day.

Also, when I check "Show remaining Target URLs" I see a long list of URLs it intends to post to. Looking at the numbers, as a webmaster I believe it would be easy to detect these as spam, and Google will definitely see it the same way considering the quantity.

At a maximum of 7 posts per account and, say, a total of 17 accounts, that should be a maximum of roughly 120 submissions per domain (7 x 17 = 119).

What I don't understand in this scenario is the function of the "per URL" option. Does that imply 120 articles per URL? I need clarification on how exactly these settings work.

Instead of searching for and posting to newer domains, it keeps posting to virtually the same domains all through the day.

Thanks.

Comments

  • Sven www.GSA-Online.de
    You are using the 'per URL' option, and when it is a tier project or a project with many URLs, you need to multiply the numbers by it.
  • Sven, thanks for the reply.

    It is a Tier 1 project, and so I want its submissions to be a little quieter.

    My question specifically is this:

    Is it going to multiply this way:

    [Number of Accounts x Number of Posts x Number of URLs] per domain

    or

    If I turn off the "per URL" option, will it be like this:
    [Number of Accounts x Number of Posts]

    The other question would be: does it stop posting to that domain once it has reached the number in question, or does it resume once it sees a new URL added to the global Verified site list?

    Thanks.

    PS: I thought you would be on your vacation!
  • Sven www.GSA-Online.de
    Yes, it is calculated this way, as you already thought.
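
    To make that multiplication concrete, here is a minimal sketch of the cap as confirmed above. The function name and the figure of 5 project URLs are illustrative assumptions, not SER's internals:

    ```python
    # Illustrative sketch of the per-domain cap arithmetic confirmed above.
    # The example of 5 project URLs is made up, not from the thread.

    def max_posts_per_domain(accounts, posts_per_account, project_urls=1, per_url=False):
        """Upper bound on posts SER aims for on a single domain."""
        cap = accounts * posts_per_account
        if per_url:
            cap *= project_urls  # each project URL gets its own allowance
        return cap

    print(max_posts_per_domain(17, 7))                                # -> 119 (~120)
    print(max_posts_per_domain(17, 7, project_urls=5, per_url=True))  # -> 595
    ```

    So with "per URL" enabled on a project carrying several URLs, the per-domain allowance grows several-fold, which would match the volume observed in the question.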
  • andrzejek Poland
    edited July 2019
    Before you do any posting like that, please check your verified URLs after a few days. This particular engine has a lot of user IDs in the ~1,000,000 range. Most of them are dead after a few days - keep that in mind when posting to other engines... also, a lot of profiles are disallowed by robots.txt, which is also a waste of time.
  • andrzejek said:
    Before you do any posting like that, please check your verified URLs after a few days. This particular engine has a lot of user IDs in the ~1,000,000 range. Most of them are dead after a few days - keep that in mind when posting to other engines... also, a lot of profiles are disallowed by robots.txt, which is also a waste of time.
    Hi andrzejek,

    Thanks for the answer. Then how does one avoid such engines? They just waste resources and keep SER from posting to other engines on time because of the time spent on them.

    I think there should be a way to restrict or tell SER not to post to specific domains at different tier levels. If it did this at Tier 3 or above, I wouldn't bother much, but at Tier 1 I think that is a big problem. It's not going to my Money Site, but hitting my Buffer like that is just not good with Google.

    Thanks.
  • Well, I always say: hit blindly with a hammer and you will just destroy, but be precise and you build a house. You have to know what you are doing. SER is not magic; you won't build high-quality backlinks if you just hit start with default settings and data fields. There are a lot of macros and settings - use them wisely. Over the past 8 years almost all sites have been spammed to death. I mean, if all users just hit start and expect results without knowing what they are doing, they spam all the nice targets and destroy websites. But well... if someone does not secure his house, he takes the responsibility. We should educate people, but usually they don't listen or don't care, so we do the harm and then start to think... so in the end the spam is "good", I mean, leaving a link somewhere is not a big deal, right?

    Test, fail, test, fail, or steal from the competition and rank how they rank. Just forget the idea that you will hit start on a random list with random settings and rank... but well, I've seen it happen. At the end there is the Google algorithm. SEO guys get smarter - why? Because they want to? No, they have to, because the algo is improving. SEO guys don't know shit (like me); we just test and watch the results, that's all we do. But then there are some experts (less than 1%?) who know what works. They have budget, tricks, and contacts; they make money because others don't do what they do. That's why the SEO niche is so silent and no one talks loudly about what works and how to do it - because if they did, they would lose money. But money is not important; what is important is the feeling of security.

    There are tools popping up to do X and Y, so you end up with 10 different tools, wasting time and money, not knowing what you are doing - it's just another get-rich scheme. SER and GSA are different (and not the only ones on the market). It is a great tool with a vision and a unique idea: an automated, precise hammer. But it needs someone who can control it...
    Thanked by: prniches
  • Well said, andrzejek.

    That is basically the reality of using any tool, especially one as delicate and powerful as GSA SER. It has the power both to make you and to break you before you know it.

    Special attention has to be paid to understanding the intricacies of using this monster of a tool. That's probably why I'm being so careful - I don't want my site to get hurt, at least not for now!

    Thanks for the hints and best of luck pal.
  • If you want some hardcore testing, try setting up a parasite page!
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE: ---> asiavirtualsolutions
    edited July 2019

    I will explain what your problem is.

    In the settings you have set it to create between 7 and 13 verifications per URL per day (10 with a +/- 3).
    GSA SER will start submitting and submitting, and only after a little while will it check how many have been verified. If it has not yet reached the set number of verified links per URL (roughly 7 at the low end), it will continue to submit. So it could have submitted 1000 links for a URL with only 5 verified, and it will keep submitting until the set number of verified links per URL is reached.
    Let's say GSA is submitting at 300 URLs per minute - that is 18,000 submissions per hour - but it is not checking the verifications in real time, so it can overrun the set verifications per URL.
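
    A rough way to picture that overrun: between verification checks SER keeps submitting at full speed, so the verified count can land far past the target before the next check notices. This is a toy model with made-up rates (300 submissions per minute, a 1% verification rate), not SER's actual scheduler:

    ```python
    # Toy model of verification overrun (not SER's real scheduler).
    # Assumptions, all hypothetical: a fixed submission rate, a fixed
    # chance each submission verifies, and the verified count only
    # being checked every `check_interval_min` minutes.

    def verified_overrun(target_verified, subs_per_min, verify_rate, check_interval_min):
        verified = 0.0
        while verified < target_verified:
            # everything submitted since the last check lands before
            # SER looks at the count again
            verified += subs_per_min * check_interval_min * verify_rate
        return verified - target_verified

    print(verified_overrun(7, 300, 0.01, check_interval_min=60))  # hourly checks -> 173.0 over
    print(verified_overrun(7, 300, 0.01, check_interval_min=10))  # 10-min checks -> 23.0 over
    ```

    Under these assumptions, checking every 10 minutes instead of hourly shrinks the overshoot roughly in proportion, which is the point of the custom verification time suggested below.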


    What you can do is lower your threads a lot, then force GSA to do the verifications sooner. By doing the verifications more often, you will prevent GSA SER from overrunning the set amount of verified links per URL.
    You can set a custom verification time, for example to do verifications every 10 minutes.


    You will have to play with the verification settings to see what works best.

    This might help with the verification overrun, but doing the verifications every few minutes will slow down submissions a lot. So forget about a high LPM when doing this.

    Another option would be to simply change it to something like: pause when reaching 200 submissions per day per URL. Doing it this way you might get fewer verifications, but at least the overrun won't be by much.
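
    For comparison, a minimal sketch of that submission cap under the same made-up rates; the cap bounds the day's total directly, so it does not depend on when verifications are checked:

    ```python
    # Same toy numbers, but with the hard "pause after N submissions
    # per URL per day" cap suggested above. The cap limits submissions
    # directly, independent of verification timing.

    def submitted_with_daily_cap(subs_per_min, minutes_running, daily_cap):
        return min(subs_per_min * minutes_running, daily_cap)

    # Even at 300 submissions a minute, the day's total stays pinned at 200:
    print(submitted_with_daily_cap(300, minutes_running=60, daily_cap=200))  # -> 200
    ```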

    Best is to experiment: create a few projects and set different options for each - then see which gives the best results.

    Hope that helps

    Thanked by: prniches
  • Thanks a lot Mike for the wonderful explanation and recommendations.

    I think these are very good options to explore and see how SER responds. Thanks again!
    Thanked by: royalmice
  • royalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE: ---> asiavirtualsolutions
    @prniches

    Another solution, and probably the best in terms of not burning your main site URL and not having to worry about overrunning the set verifications per day, would be to diversify the URLs that you use.

    Do NOT use only URLs from one domain - that is super bad. You should be diversifying your URLs, meaning use as many URLs as possible from different domains and platforms.

    For example: money site URLs, the money site's FB page URL, Twitter page URL, YT channel, YT playlist, YT video, LinkedIn company page, and any other high-authority links from web 2.0 or social sites that link to your main site. The more diversity the better...

    Thanked by: andrzejek, prniches
  • The URLs were not pointing to my Money Site - I'd be dead if that were the case.

    The URLs were my Buffer backlinks, but I was worried about the huge volume being posted to just one domain - though not my own.