Getting nowhere with this?
It seems as soon as I resolve 1 issue with my campaign another one crops up!
Here is what I have done today:
1) I have paid for an SSD VPS from Solid SEO to run GSA SER (hoping this was the key to getting better results)
2) Using a keyword list of 15,000 keywords and a long list of footprints, I scraped 1 million URLs with Gscraper. I then removed duplicate URLs and domains (see the sketch after this list) and was left with just under 200k URLs.
3) Imported these URLs as target URLs and set GSA SER to only use target URLs.
4) 30 semi-dedicated proxies.
5) 8 working Hotmail email addresses with POP3 enabled and spam filters disabled.
6) All platforms selected and all fields complete.
7) Set to avoid creating nofollow links.
8) 300 threads.
9) Using Captcha Breaker with almost 80% success.
10) Allowing posting to the same website more than once.
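For anyone who wants to reproduce the de-dupe step in item 2 outside Gscraper, here is a minimal Python sketch; the file names are just placeholders and the domain check is a plain hostname comparison, nothing Gscraper- or SER-specific:

```python
# Minimal sketch of the de-dupe step in item 2: drop duplicate URLs, then keep
# only one URL per domain. File names are placeholders; the domain check is a
# plain hostname comparison, not anything Gscraper- or SER-specific.
from urllib.parse import urlparse

seen_urls, seen_domains, kept = set(), set(), []

with open("scraped_urls.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url or url in seen_urls:
            continue                          # skip blanks and duplicate URLs
        seen_urls.add(url)
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain in seen_domains:
            continue                          # keep only the first URL per domain
        seen_domains.add(domain)
        kept.append(url)

with open("targets_deduped.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept))

print(f"{len(seen_urls)} unique URLs scraped, {len(kept)} unique domains kept")
```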
My threads are struggling to get anywhere near 100! I have a 50/50 dofollow to nofollow ratio... and my VpM is just 3!!!
Can someone take a look and tell me what I am doing wrong?
I want to be creating 50-100 VpM consistently. In about a 10-minute period on my VPS I have 500 submitted links but only 25 verified.
Please help!
Thanks
Comments
The people getting insane LpM are running many projects, not just one. I am running just one as well and my LpM is very low (but after a lot of research and conversations, it's normal for my particular setup). The more filters you have, the lower your LpM is going to be. Probably not your situation though.
From my initial reading of your post, I don't see any issue. It's working, you are getting submissions, your Captcha Breaker is working, your VPS isn't overloaded.
Verifications are another story. As I understand it, some you might see instantly, others might take 5+ days. There isn't a straight answer. I see submissions for sites... then 3 days later I get 8 verified links! Or maybe none. Or maybe 3. Kinda random.
If you want more LpM and more verified links, you might add in other platforms to submit to and a few more email accounts. I am using 10 at the moment for each project and it's working pretty good.
I am submitting to all.
I now have 15 email accounts running.
Would it be worth creating projects which are exact duplicates of this project, with the same target URLs, to get more LpM/VpM? Or I guess then I could just end up with loads of duplicate links?
I just want to create as many links as possible for churn & burn!
This is my own opinion but 1000 links from Site A or 1 link from Site A... in the end, I'd think any SE is going to count it as one vote. Otherwise, some of the world's worst Tweets would rule the world with all the spam (I'm not much of a social signals believer, it's also easily gamed).
Anyway... to your POINT!
I wouldn't create a duplicate project using the same URLs.
It really sounds like you want the high LpM you see around here but using essentially 1 project posting to 1 site? Verified counts are so unpredictable that I haven't seen a solid answer on that one. Even I get them, but my filters are really tight.
If you want higher LpM, I think you need many many more projects, posting to different URLs and each project using 10 emails that are unique to that project.
If 1 project can post 2 links per minute.... now make that 100 projects and you have 200 LpM... you get my point. 1 project will never get 1000 LpM no matter how many places you want it to post.
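As a rough illustration of that split (file names and the project count here are placeholders, not anything from this thread), something like this would carve one deduped target list into per-project chunks so every project posts to different URLs:

```python
# Rough sketch of the "many projects" idea: split one deduped target list into
# round-robin chunks so each SER project gets its own, non-overlapping URLs.
# File names and the project count are placeholders.
N_PROJECTS = 10

with open("targets_deduped.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for i in range(N_PROJECTS):
    chunk = urls[i::N_PROJECTS]               # roughly equal, non-overlapping slices
    with open(f"project_{i + 1:02d}_targets.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(chunk))
    print(f"project {i + 1}: {len(chunk)} target URLs")
```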
I don't think you are doing anything wrong per se and I don't think your VPS is the issue. It's the understanding of LpM and # of projects.
You could try creating another project with the same values BUT use another page of the site you want to rank?
When you scrape and use footprints, that's just a list that whatever program "thinks" matches the platform. GSA then has to import and identify. Even if it's the right platform, it may not be able to post using the script if the site is custom.
No guarantees.
You can either:
1. Learn to scrape for specific platforms
2. Let GSA slowly scrape for you and identify and use those sites
People go back and forth on those 2 items but if you aren't in a hurry, #2 works pretty well over time depending on the filters, search engines and keywords you are using.
When you scrape, if you don't pick specific platforms or if you use as many footprints as possible with a list of keywords, that is about as easy as it gets and probably the least useful in the end.
I try and:
1. Pick a platform that matches something I am using in the project
2. Remove duplicate URLs
3. Import as Target URLs for the project (GSA will auto identify)
I assume from your post, you are trying to use GSA like a machine gun and not a sniper rifle.
I've seen lots of people showing off .. 650,000 identified sites! But keep in mind your frame of reference. If you find that many sites that GSA can identify, that doesn't mean it will post to them, it doesn't mean they aren't moderated or protected by spam software and it doesn't mean it will meet your filters if you have any. So the people who post such big numbers, it doesn't mean anything per se.
GSA is a learning curve for sure. You just have to tweak it.. and eventually decide if you are going to scrape for targets, buy a list, let GSA scrape or do all 3. Which can work! I had GSA looking for targets and in the meantime, I went looking and imported the target URLs. Works pretty well.
How long have you been doing SEO?
@monie This morning my verified links are quite high, about 75% of my submitted.
Update
So I left my campaign running since yesterday afternoon on my new SSD VPS. I woke up this morning to log on and check, and I was firstly a bit disappointed because my projects are set to run at 300 threads (which they were yesterday) but now they are struggling to get to 50 again??!!
Anyway I looked along the stats at the bottom and saw this..
I was pretty pleased; after all this struggle to get a better VpM I saw a good verified-to-submitted ratio, and a MUCH better VpM.
I have actually created 96,000 verified links in the last 24 hours (56,728 in the last 8 hours).
What I have done is this... Instead of just one project, I have created 4 duplicate projects.
Project 1 - using my Gscraper list as target URLs (all filters like low PR, OBL etc. disabled)
Projects 2, 3, 4 - I uploaded the 15,000 keywords I used in Gscraper directly into GSA SER so it could look for its own URLs to post to (again, all filters disabled).
I then checked the chart to see the backlinks created and was left disappointed again!
Wow, 56,000 links but only 323 unique domains??!!
From that many links I would expect at least 8,000-10,000 unique domains.
I have already unchecked "continue to post to same domain/URL even after failed" and I have unchecked "allow posting to same URL", so why do I only have a few unique domains?? In fact, at one point there were about 10 of the same URLs one after the other in my verified links feed!
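For anyone wanting to check a verified-links export the same way, here is a minimal sketch; it assumes a plain text export with one URL per line, and "verified_links.txt" is just a placeholder name:

```python
# Minimal sketch: count unique domains in a verified-links export and list the
# domains that soak up most of the links. "verified_links.txt" is a placeholder
# for a plain text export with one URL per line.
from collections import Counter
from urllib.parse import urlparse

with open("verified_links.txt", encoding="utf-8", errors="ignore") as f:
    domains = []
    for line in f:
        url = line.strip()
        if not url:
            continue
        host = urlparse(url).netloc.lower()
        domains.append(host[4:] if host.startswith("www.") else host)

counts = Counter(domains)
print(f"{len(domains)} verified links across {len(counts)} unique domains")
for domain, n in counts.most_common(10):
    print(f"{n:6d}  {domain}")
```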
Thanks for your help everyone!
I would forget about churn & burn if I were you because I don't think you really know what you are doing.
Just learn how SER works... it can be dangerous in the wrong hands.
@deNiro72 thanks, I will check this. Although I have definitely got those settings unchecked, so I cannot understand why it would keep posting to the same URL either!