donaldbeck:
I would recommend using CB + Spamvilla; that's what I used to make the list.
Also, I don't know why, but rerunning the list multiple times will increase the success rate. Try rerunning the ones that failed.
That's why it's not really ideal to use a manual captcha solving service.
Edit: If you want to get as many unique domains as possible, I'd use the list that includes identified and successful URLs along with the verified ones, and set it to be used as a global site list. Then just make sure you set SER not to post to duplicate domains, and let it try to post multiple times. Just my advice.
Trying to load x amount of URLs and then posting one time isn't ideal.
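To make the unique-domains idea concrete, here's a rough Python sketch of merging the identified, successful and verified files while keeping one URL per domain. The file names are made up for illustration, and SER's own option to avoid posting to duplicate domains does the equivalent for you at posting time:

    from urllib.parse import urlparse

    sources = ["identified.txt", "successful.txt", "verified.txt"]  # hypothetical file names
    seen = set()
    unique_urls = []

    for path in sources:
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                # handle bare domains as well as full URLs
                parsed = urlparse(url if "://" in url else "http://" + url)
                domain = parsed.netloc.lower()
                if domain.startswith("www."):  # crude normalization
                    domain = domain[4:]
                if domain and domain not in seen:
                    seen.add(domain)
                    unique_urls.append(url)

    with open("merged_unique_domains.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(unique_urls) + "\n")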
Why do you think so?
donaldbeck:
It's just based on my experience using SER.
For whatever reason, SER does not repost super well to article sites; I don't know why. Doing what I said above (using a global site list with all the URLs and letting it try to repost a lot) helps improve the numbers. There might be a better way to do it, or something I'm missing, but that's what I do.
The way I use it is to set up a dummy project just for this list and choose the option "continually post to same domain even if failed before", so it just keeps trying to post to each link over and over.
My real projects use the "Global list - verified" to post from.
So live URLs go from the dummy project into the verified list, and the real projects pull them from there.
Works pretty well for me.
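For anyone following along, the global site lists SER reads from and writes to are just per-engine text files in four folders, something like this (the sitelist_ naming is what a typical install uses; the engine names here are only examples):

    identified/
    submitted/
    verified/
        sitelist_Article-Wordpress Article.txt
        sitelist_Article-Drupal Blog.txt
        sitelist_Social Network-PHPFox.txt
        ...
    failed/

So the dummy project fills the verified/ folder as links get verified, and the real projects read their targets straight out of those files.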
donaldbeck:
I also use "continually post to same domain even if failed before", and would recommend it.
Spamvilla is hard to use with its constant downtime; I'm not even subscribed there anymore, but I agree that rerunning the lists increases the numbers. I doubt CB + Spamvilla will yield a higher captcha success rate than, for instance, Expert Decoders; I'm just curious because my verified count is quite a bit lower than what's in the list. I guess we're in the same boat, with SER getting up to 50% fewer verifications out of the same list run on the same day.
I'm running the list with CB + Spamvilla now. It's on the scheduler with other projects, so the pace is very slow. But so far:
Submitted: 140, Verified: 730
That's not too bad for just the start of the first run. I'm happy with it anyway; I'll see how it goes over the next day or so.
donaldbeck:
I wasn't trying to suggest that CB + SV would yield a better result captcha for captcha compared to a manual service, only that the combination makes reposting much more financially viable.
Yeah, I don't know why reposting isn't perfect on some platforms. The only thing I can suggest is to take a bruteforce approach to reposting and use as many different URLs from the same domain as possible.
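A rough sketch of that bruteforce idea, assuming you have a raw dump of article URLs (the file names here are hypothetical): group everything by domain and keep every distinct URL, so SER gets several chances per site instead of one:

    from collections import defaultdict
    from urllib.parse import urlparse

    by_domain = defaultdict(list)
    with open("all_article_urls.txt", encoding="utf-8", errors="ignore") as f:
        for line in f:
            url = line.strip()
            if url:
                # handle bare domains as well as full URLs
                parsed = urlparse(url if "://" in url else "http://" + url)
                by_domain[parsed.netloc.lower()].append(url)

    with open("repost_targets.txt", "w", encoding="utf-8") as out:
        # domains with the most alternative URLs first
        for domain, urls in sorted(by_domain.items(), key=lambda kv: -len(kv[1])):
            for url in dict.fromkeys(urls):  # drop exact-duplicate URLs, keep order
                out.write(url + "\n")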
Is the list in a format that allows importing only certain engine types, i.e. articles/blogs/forum comments/social networks only?
Or do we have to import the complete list even if we only use a few very specific engines?
Please clarify.
donaldbeck:
You have to import everything.
But the global site list lets you specify any folder you want when importing the file. So you could temporarily choose a different location, import the list, and then delete the files you don't want.
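As a sketch of that cleanup step, assuming you imported into a temporary folder (the path and the list of engines to keep are made up; the sitelist_ file naming is what a typical SER install uses):

    import glob, os

    folder = r"C:\temp_sitelist"  # hypothetical temporary site-list folder
    keep = ("sitelist_Article", "sitelist_Forum",
            "sitelist_Social Bookmark", "sitelist_Social Network")

    for path in glob.glob(os.path.join(folder, "sitelist_*.txt")):
        if not os.path.basename(path).startswith(keep):
            os.remove(path)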
All my CPU and bandwidth is 100% used for SER submissions 24/7, which means importing the unused 80% of URLs is absolutely impossible.
The list is only of use to me in SER format, with engines separated into files.
If you can offer it in that format, I'll get the list ASAP, meaning today. Later is too late, as my Spamvilla subscription is cancelled and expires in about 10 days, and my entire SER project will be concluded in the very near future.
donaldbeck:
It comes as a .sl file. The engines are in separate files once you import the .sl file.
Hi, just to confirm again: are all the entries in http://gsa.0accounts.com/domains.txt unique domains? And did you use the following settings to post?
- avoid posting on same site again (checked)
- maximum accounts per site: how much did you set? 1?
- maximum posts per account: how much did you set? 1?
- verified links must have exact URL
- no other filters like skip sites with x outbound links, skip sites with PR below x, etc.
- also no filters for domain-name bad words, or where bad words appear on the site
- and as per the email, which email did you use?
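If anyone wants to check the unique-domains claim themselves, it only takes a few lines of Python once that domains.txt is saved locally:

    with open("domains.txt", encoding="utf-8") as f:
        domains = [line.strip().lower() for line in f if line.strip()]

    print(len(domains), "entries,", len(set(domains)), "unique")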
As clearly said: if import is not possible, then no deal. Either single files sorted by engine, original as in SER, or nothing at all.
donaldbeck:
@eLeSlash - The numbers in that text file will be off a little bit; SER identifies things incorrectly in some cases. The total number of domains in the sales post is correct though.
The first 4 settings you mentioned don't have anything to do with making a list, and the way I'll use them will vary. I didn't use any filters.
@hans51 - If you pause your projects, it only takes a few seconds to import the .sl file, and then you have the list sorted by engine. You can import into "failed", for example, to keep it separate from your other lists. Anyway, it's up to you of course.
@donaldbeck I assume you test your lists before selling them, so you need to create a project with certain settings and filters; "avoid posting on same site again" (checked) is one of the most important options for all projects. I bought the list anyway to update my databases, and it's a great addition to my lists: around 20% of the targets I didn't have (speaking mostly about tier 1 links: articles, directories, forums, social bookmarks, social networks). Your scraping skills (footprints) are crazy. Thanks for all your time testing & harvesting.
Excuse me, I don't understand what those files are. All_ad_split? What are these files? What platforms are inside? Are they verified links? Identified? I noticed there are no trackbacks, blog comments, guestbooks or image comments in this file; what does it hold?
donaldbeck:
There's a text file called 'how to install and use the list' that's included. It explains what everything is. It's account data though.
The first minute after I purchased the lists I watched the videos. I had a bug where the page didn't load with sound for some reason, so I watched the videos but couldn't hear anything; I thought they were videos without sound, lol. Thanks.