Now I see what you meant. After turning off that option I see lots of "already parsed" messages. I know I should have cleared my cache first, but who gives a funk, right?
The thing is, I am getting about 7 VPM atm without clearing the cache, while going through the same sites that failed from the start. If I am right, I'll be getting over 10 LPM after I parse myself through hell into the light.
Unticking that option will make you miss out on a lot of potential links, unless you clear the target history as well. Think about the recent failure rate on links it SHOULD have been submitting to: if you untick that option or don't clear the target URL history, it will just skip over that potential target every time it comes across it.
And actually, that's not what I meant. "Already parsed" just means it has already tried to post to that link/site.
Here, I'll try to explain it better.
When you first start a new project and import fresh targets, it has to go through the whole process of identifying the platform, then registering (usually more captchas are involved in registering than in submitting, which slows things down), then email verification at intervals, and finally submission (usually also at intervals).
Now, whether it received an email verification or not, SER automatically tries to log in to these sites and post at certain intervals (usually after your link verification and email verification finish up), because not every site needs you to verify your email before posting. SER can usually guess this for certain platforms. On a lot of URLs you will just see it registering until it tries to verify the email, and on others it will register and then immediately try to log in and post.
Okay, so like I said, SER does the logging in and posting to most sites only after it has tried to verify the email, at INTERVALS.
Now, when you restart SER or a project, those URLs (the ones that have already been verified by email, or have failed and are ready to post) get pushed to the top of the queue. It finishes whatever is already queued up to post to, and THEN it moves on to the rest of the links that still need the whole "identify, register, verify email, submit" process.
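Roughly, the flow reads like this (a minimal Python sketch of the order of operations described above; the stage names, the Target class and the queue handling are my own illustration, not SER's actual internals):

```python
# Illustrative model of the per-target workflow described above.
# Stage names and queue handling are assumptions for the sketch, not SER's code.
from collections import deque
from dataclasses import dataclass

STAGES = ["identify", "register", "verify_email", "submit"]

@dataclass
class Target:
    url: str
    stage: int = 0               # index into STAGES
    ready_to_post: bool = False  # verified (or done waiting) and ready to submit

def rebuild_queue(targets):
    """On a restart, targets that are already ready to post jump to the front;
    fresh targets that still need the full process go to the back."""
    ready = [t for t in targets if t.ready_to_post]
    fresh = [t for t in targets if not t.ready_to_post]
    return deque(ready + fresh)

def step(target):
    """Advance a target one stage per pass (registration and email
    verification happen at intervals, so each pass does one stage)."""
    stage = STAGES[target.stage]
    print(f"{target.url}: {stage}")
    if stage == "verify_email":
        target.ready_to_post = True
    target.stage += 1
    return target.stage < len(STAGES)   # False once submitted

if __name__ == "__main__":
    targets = [Target("http://site-a.example"),
               Target("http://site-b.example", stage=3, ready_to_post=True)]
    queue = rebuild_queue(targets)      # site-b (ready to post) gets handled first
    while queue:
        t = queue.popleft()
        if step(t):
            queue.append(t)             # not done yet, re-queue for the next interval
```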
If you still don't get it, then I guess wait for someone else to explain it, lol. I've been up for 2 days and I'm tired as f--ck.
I know that lots of potential targets are lost that way, but getting more than 1 VPM is better imo. But if SER fails on some websites over and over again, isn't that a bigger waste of threads? As I understand it, it retries these steps over and over again until the link is verified, so if it keeps failing on certain websites, it still wastes a lot of "oxygen".
Oh, never mind. I thought your table-flip picture meant that I still didn't get it. Am I getting it now? Because it doesn't seem confusing when a guy takes his time to write an essay about it. Thanks for the better explanation.
Try posting to a site once (or twice) and then move on with that thread.
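If you wanted to sketch that kind of attempt cap yourself, it is basically a counter per target (hypothetical Python, not an actual SER option or setting):

```python
# Hypothetical per-target attempt cap, to show the idea of not burning
# threads on sites that keep failing. Not an SER feature, just the concept.
from collections import defaultdict

MAX_ATTEMPTS = 2
attempts = defaultdict(int)

def should_try(url):
    """Return True while the URL still has attempts left."""
    if attempts[url] >= MAX_ATTEMPTS:
        return False          # give up, free the thread for other targets
    attempts[url] += 1
    return True

# A failing site only gets two tries before being skipped for good.
for _ in range(5):
    print(should_try("http://flaky-site.example"))
# True, True, False, False, False
```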
I think I am onto something here.
I have now set up my Frankenstein experiment: threads at 3k and the CPU limit at 85%.
It rarely gets laggy and posts more links until it drops dead (1 VPM), plus I get that valuable option to post to dead-end websites, which never works though.
And I'm thinking: why would the options in that guide be so misleading?
I have now been getting 12+ VPM for over 30 minutes and it keeps increasing.
I've set the HTML timeout to 60 seconds and the maximum website size to download to 7 MB, since sites over 10 MB are filled with spammy spam or irrelevant crappy pictures anyway and take ages to load. I am getting virtually no timeouts.
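To show what those two settings amount to, here is a rough sketch of a fetch with a 60-second timeout and a 7 MB download cap (plain Python with the requests library, purely illustrative; this is not how SER implements it, and example.com is just a placeholder):

```python
# Illustrative only: a 60-second timeout plus a 7 MB download cap,
# mirroring the settings mentioned above. Not SER's implementation.
import requests

TIMEOUT_SECONDS = 60
MAX_BYTES = 7 * 1024 * 1024   # 7 MB

def fetch(url):
    """Return page bytes, or None if the site is too slow or too big."""
    try:
        with requests.get(url, timeout=TIMEOUT_SECONDS, stream=True) as resp:
            body = b""
            for chunk in resp.iter_content(chunk_size=64 * 1024):
                body += chunk
                if len(body) > MAX_BYTES:
                    return None   # bloated page, skip it instead of waiting forever
            return body
    except requests.RequestException:
        return None               # timed out or unreachable

page = fetch("http://example.com")
print("skipped" if page is None else f"{len(page)} bytes")
```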
I need a scientific explanation for this. OK, the failing threads are still there, so what's changed? I think some threads get to a stage where they just stop responding, or something like that.
I am not a tech guy, so I will just keep testing stuff until I get my 20+ VPM.
I think that once I tweak the engines I will be able to get over 20 VPM.
Edit: turns out it takes a s**tload of RAM to do it that way, so your PC crashes after a while. Gotta find another way.
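The RAM hit is no surprise if you do the back-of-envelope math; assuming something like 1 MB of stack reserved per thread (a typical default on Windows, and only an assumption here), 3,000 threads reserve gigabytes before they do any real work:

```python
# Rough back-of-envelope for memory reserved just by thread stacks.
# The 1 MB per-thread figure is an assumed typical default, not a measurement.
threads = 3000
stack_mb_per_thread = 1                       # assumed stack reservation per thread
print(f"~{threads * stack_mb_per_thread / 1024:.1f} GB reserved for stacks alone")
# ~2.9 GB before page buffers, parsing, proxies, captcha traffic, etc.
```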
I have tried unticking those two options and I am getting about 2 VPM steadily now, though it keeps dropping.
OK, so even if I manage to delete the wrong engines and increase my VPM by 100% or 200%, I will still be getting shite.
Even if my GSA never got any errors (I get about 60-70% errors atm), I would still be better off buying wham bam's...
I mean, if I am getting 4-5 VPM, that's about $90 worth of spammy links a month.
I have to pay for proxies + SCM + a verified monthly list + VPS + Indexification = $115.
Plus count in the bit of work you put into tweaking projects.
I need about 260-270K spam links (bulk costs less, you know) to make it work. That's about 5-6 VPM if I run 24/7 for 30 days; take server crashes into account and you need more like 5-7 VPM.
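As a quick sanity check on those numbers (only the VPM and link-count figures come from the post above; the rest is plain arithmetic):

```python
# Sanity check of the VPM math above.
minutes_per_month = 60 * 24 * 30           # 43,200 minutes in a 30-day month

links_at_6_vpm = 6 * minutes_per_month      # ~259,200 -> roughly the "260-270K" target
needed_vpm = 265_000 / minutes_per_month    # ~6.1 VPM for the middle of that range

print(f"{links_at_6_vpm:,} links at 6 VPM running 24/7")
print(f"~{needed_vpm:.1f} VPM needed for 265K links/month")
```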
If my math is right, I should just be ordering cheap blasts of links instead of wasting my time reading all these guides.
I've been doing this for the past 4 days now, 14-15 hours straight. I have "successfully" increased my VPM count. And no, I won't take a break from IM, because s**t always happens when I take breaks.
Should I focus on creating parasites and buying social signals + blasts? Or try to escape this matrix and divide by zero?
Maybe I should be a programmer.
Oh wait, I am already studying web development.
God damnit.
Any suggestions?