GSA SER - Project Setup (CONCEPT)
Hello guys,
It's been a while since I last used GSA SER.
That's why I've rewatched almost all (at least 95%) of the videos in this thread.
I've also read the unofficial FAQ and many other threads.
So, before I start using GSA SER again, I'll put a concept plan on here.
It would be great if you could point me in the right direction / give me some advice.
First, before I get to building links, these are the services/programs I am able to (and will) use:
EXTERNAL PROGRAMS/SERVICES
Additional programs:
- Captcha Breaker (1st)
- SEO Indexer
- Wicked Article Creator (for spinning/spun content)
Additional Services:
- 2captcha (2nd, for the Web 2.0s)
- SERengines for the Web 2.0s
- Private Proxies
- Mail Accounts
+/- VPS (I have a PC here with 32GB RAM and other nice specs, so I'll try GSA
SER without a VPS first).
Regarding the private proxies & the mail accounts:
- I saw Millionproxy has good rates; does anyone have experience with them?
- Regarding the mail accounts, I saw mail.ru used to be a good one; has that
changed lately?
LINKBUILDING
I’ll try to keep this as easy and as short as possible.
The campaign will have four tiers. One tier will be done 100% manually.
The other three tiers will be done by GSA SER.
So:
MANUALLY: Tier 1 -> articles to my website, minimum 1,000 words & unique.
--
GSA SER: Tier 1 -> Article, Social Networks, Web 2.0 and Wiki.
Link ratio with tier above: 3
GSA SER: Tier 2 -> Article, Social Networks, Web 2.0, Wiki
Link ratio with tier above: 3
GSA SER: Tier 3 -> Blog Comment, Guestbooks, Trackbacks, Social Networks
Link ratio with tier above: 6 (see the sketch below for how these ratios multiply out)
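To make those ratios concrete: each tier builds that many links for every link in the tier above it, so the counts multiply down the pyramid. A quick sketch (the manual article count of 10 is just an assumed example, not part of the plan):

```python
# Tier ratios from the plan: links built per link in the tier above.
manual_articles = 10  # assumed example value, not from the plan

ratios = [
    ("Tier 1 (SER)", 3),
    ("Tier 2 (SER)", 3),
    ("Tier 3 (SER)", 6),
]

links_above = manual_articles
for tier, ratio in ratios:
    links_above *= ratio
    print(f"{tier}: {links_above} links")

# Tier 1 (SER): 30 links
# Tier 2 (SER): 90 links
# Tier 3 (SER): 540 links
```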
SETTINGS
Private proxies:
Same private proxies for all tiers: does that work? (Otherwise I'll split them up.)
Ratio: 1 thread per private proxy.
Mail accounts:
Around 15-20 mail accounts per tier.
I will add more information regarding the settings, anchor text ratios, etc. later on.
It would be great to know if the foundation of the campaign is solid.
Comments
Also, I would drop articles and wikis from your T3; they ain't going to index without support.
SEO Indexer just takes system resources and does little else.
I would drop Social Bookmarks from T3 too. There just aren't enough targets SER can post to.
You can test this by making a burner project for Social Bookmarks, letting it run, and checking your verified links.
Do you maybe have an affiliate link to SEOSpartians?
Since you gave the tip to use it, I'd think it's only fair to use your link :-)
I have edited the post and removed what you mentioned above.
You stated:
"SEO Indexer just takes system resouces and does little else."
Since I am only running a campaign for a single website, I'd probably have resources to spare.
Would you still advise using it, or should I just skip it?
-> I am working on a second part of the concept plan, covering the general settings of GSA SER itself.
https://gyazo.com/c136583213544c31c2a2add213114113
I have not filled in the keywords/anchor text for now; I will use Keyword Map Pro to get a list of good keywords to scrape with.
Submit/verify
Tier 1: 25 URLs
Tier 2: 75 URLs
Tier 3: 450 URLs
per 1440 minutes / 24 hrs
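Those daily targets follow the tier ratios from the concept plan; a trivial sanity check:

```python
# Daily verified-URL targets per tier (per 24 h) from the plan.
daily = {"Tier 1": 25, "Tier 2": 75, "Tier 3": 450}

assert daily["Tier 2"] == 3 * daily["Tier 1"]  # ratio 3 with the tier above
assert daily["Tier 3"] == 6 * daily["Tier 2"]  # ratio 6 with the tier above
```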
Screenshot of the options tab:
part 1; https://gyazo.com/3173afc0dae2d1776a7ffd7864c3c744
I was thinking about MSN, Google and Yahoo.
part 2; https://gyazo.com/a4784301b476f0528ca4575d20f5dea2
I think that's all set, excluding the actual post content, but I'm sure I can cover that.
What do you mean by the anchor text ratios not adding up to 100%?
Do they have to total exactly 100%? (At the moment they don't.)
Regarding the verified list: it's been a while, so I'll just start from zero again. For the first week, probably longer, I'll be building the verified list. Once that's done, I'll move on to a test campaign.
I have unticked and ticked the fields you stated.
Though, for tier 1 and 2, should I untick everything stating "contextual"?
(comment contextual, comment contextual/anchor text, directory contextual, etc.)
I think SER might have something built in that pops up if the anchor text ratios aren't 100%, or something; I can't check because I'm on my phone.
Analyse competitor backlinks is basically link extraction, if I remember right. Very resource heavy, but it can grow a non-contextual list exponentially if used right.
Regarding your T1 and T2, do you use Web 2.0s?
I didn't see you stating you do in any of your replies lately.
If not, are wikis and articles the only platforms you use for T1 and T2?
"If you mean project options with all the
different anchor text variations, then sum that up and it has to be
below or exactly 100%. If there is a gab to 100% then the field on main
anchor text is used for the rest."
Said @Sven
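In other words, the individual anchor text percentages should sum to at most 100%, and any gap is filled by the main anchor text. A quick sketch of that rule (the categories and percentages below are made-up examples, not GSA SER's exact field names):

```python
# Sketch of the anchor-text ratio rule @Sven describes above.
# Category names and percentages are made-up examples.
anchor_ratios = {
    "partial match": 30,
    "secondary": 20,
    "branding": 25,
    "generic": 15,
}

used = sum(anchor_ratios.values())
if used > 100:
    raise ValueError(f"anchor ratios sum to {used}%, must be <= 100%")

# The remainder falls back to the main anchor text field.
print(f"main anchor text gets the remaining {100 - used}%")  # 10%
```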
Social networks too on T1/2. I use a few different methods: one uses nothing but links built in SER, one uses SER and RX (though RX could be swapped for SERE), and one uses a new concept I came up with that I'm keeping to myself.
So T1 and T2 look like wikis, articles, and social networks for you.
No Web 2.0s.
I'll do a test run with a burner campaign later today to get a small link list.
I do have Scrapebox and Platform Identifier.
I saw a video tutorial on how to add GSA SER footprints to Scrapebox.
Would that be a good approach, or should I just let GSA SER run?
(Should I use SERengines in a burner campaign to build a link list?)
I'm also considering buying 50 to 250 private proxies (or any number in between) from proxymillion for scraping.
Then I'll use around 10-20 other private proxies for submission.
-- Sorry for the many questions.
I wasn't able to find this information in any of the videos.
I am considering using RankerX to build a small number of high-quality links for tier 1.
After that I'll use SER to build tier 1-2-3 structures with more links. I will go first with the…
@Shaun I see the difference between buying a list and scraping one myself.
Most link lists are priced around $50-80.
So either I buy 1-2 link lists, or I buy 250 private proxies ($79) for a month.
I'd think I can scrape more potential URLs with that many proxies, combining keywords with the GSA footprints, than by buying 2 lists.
Do you share this opinion?
($29)
To be honest, I do not have any experience with Asian Virtual Solutions, though from private messages I haven't heard anything promising.
I am considering buying 50 private proxies right now, and then combining footprints with keywords for Scrapebox.
I'll order RankerX as well, for tier 1 (awesome that it has a wizard for the PDFs then :-)).
As far as my experience goes, I've used buyproxies.org ones.
I am considering moving to proxymillion for the scraping part (way cheaper).
Regarding footprints -> I'd bet that was a misunderstanding.
I will combine a list of keywords with the footprints, so Scrapebox will find one kind of website at a time (wiki/article, etc.), as in the sketch below.
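A minimal sketch of what I mean (the footprints and keywords are just made-up examples; the real ones would come from GSA SER's footprint files and my keyword list):

```python
from itertools import product

# Made-up example footprint and keyword lists.
footprints = ['"powered by mediawiki"', 'inurl:wiki']
keywords = ["gardening", "landscaping"]

# One search query per footprint+keyword pair, ready to paste into Scrapebox.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
print("\n".join(queries))
# "powered by mediawiki" gardening
# "powered by mediawiki" landscaping
# inurl:wiki gardening
# inurl:wiki landscaping
```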
Tomorrow my first run will start, including RankerX. I'll keep you updated.
Yes or no? I use platforms that can create both; does anyone have any input?
Though, the RankerX part will be done, etc.
More in 24 hrs :-D
Big mistake: RankerX was adding articles to profiles. I am looking into how I can solve this.
(Though I added it as an "article", I assumed RankerX would know how to handle them.)
Going to do a second run later today.