@sweeppicker @gooner I've always wanted to ask about scraping strategy with tools like GScraper/Scrapebox. For example, I follow Ron's diagram for tiered link building. Would it be better to scrape for a day each for contextual and junk platforms separately, or to import all footprints together and scrape them in one run?
I've started doing them separately, but my biggest problem (I use GScraper) is avoiding duplicate targets, so I enable the option to remove already-used keywords. When switching between the two, I obviously can't start with the remaining keywords, as that would reset the list. So what's the best way to scrape for both platform types - together or separately?
Cheers.
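In case it helps anyone with the keyword-reset problem above: one workaround is to track used keywords yourself, outside the scraper, in a file shared by both the contextual and the junk runs. A minimal sketch in Python - the file names are just placeholders, and this is not a GScraper feature:

```python
# Keep one master record of already-scraped keywords, shared between the
# contextual run and the junk run, so neither repeats the other's work.
from pathlib import Path

USED_FILE = Path("used_keywords.txt")  # placeholder path

def load_used():
    return set(USED_FILE.read_text(encoding="utf-8").splitlines()) if USED_FILE.exists() else set()

def next_batch(master_file, batch_size=1000):
    """Return the next batch of unused keywords and mark them as used."""
    used = load_used()
    fresh = [kw for kw in Path(master_file).read_text(encoding="utf-8").splitlines()
             if kw.strip() and kw not in used]
    batch = fresh[:batch_size]
    if batch:
        with USED_FILE.open("a", encoding="utf-8") as f:
            f.write("\n".join(batch) + "\n")
    return batch

# Export one batch for the contextual scrape, later another for the junk
# scrape - neither batch contains keywords the other already consumed.
contextual_kws = next_batch("master_keywords.txt")
```

Then you can import each batch into GScraper as a fresh keyword list without worrying about its internal dedupe resetting.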
Can I jump in and ask how you search using multiple footprints? This is next on my list of things to do, but I want to ask in advance. I'm going to test something with footprints and will report back here and share if it works.
@sweeppicker: how do you create your own footprints? I'd like to know because I've seen that mentioned here quite a few times now, but all I can use are the footprints that come out of the box with GSA.
@Pratik I only target contextual and blogging platforms in GScraper. You can also expand your KW list from within GScraper and find even more targets. It's quite simple. Hope that helps.
@sweeppicker Cool. I assume by expanding you mean the inbuilt KW list? If so, I already know about it, but I don't use it as I have other sources.
Oh, and one more thing I wanted to ask: how do you define contextual in SER? I mean, what do you usually check under "type of backlinks to create" for contextual links?
And just out of curiosity, if you scrape with GScraper (using their proxies?), what's your average URL scraping speed?
Thanks.
I don't use the inbuilt KW list. If you go to the tab marked "Expand" you can import your own KWs and expand the heck out of the list. You need to use broad-match KWs though. In GSA, use the "Article" and "Social Network" platforms.
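For anyone unsure how footprints and keywords combine at scrape time, the usual approach is just the cross product: every footprint paired with every keyword becomes one search query. A rough sketch - the footprints shown are common illustrative examples, not GSA's exact out-of-the-box list:

```python
from itertools import product

# Illustrative platform footprints - not GSA's exact built-in list.
footprints = ['"powered by wordpress"', '"powered by buddypress"']
keywords = ["dog training", "puppy obedience"]  # your broad-match KWs

# One scrape query per (footprint, keyword) pair; this is the list
# you would feed to GScraper or Scrapebox.
queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]

with open("queries.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(queries))
```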
@sweeppicker That Expand option looks interesting, but I can't quite get the hang of it. Could you explain in a bit more detail what exactly it's intended to do? I apologize for asking questions here and there, haha - pardon me if it's wasting your time.
I usually set the domain-as-anchor ratio to around 60%, generic anchors to 30%, and 10% for all the main keywords I want to rank for. But this also depends on the number of keywords you want to rank for. For example, if I want to rank for 5 keywords, the 10% works out to 2% for each, which is good.
Hi @Pratik, how many links would that be in total for each tier? What I'm getting at is that 2% KW density is much safer if there are only 50 links... versus 50,000 links. 2% KW density with 50k links means there are 1,000 identical anchors. I don't know if that is cutting it too fine (my numbers are similar).
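To make the arithmetic above concrete, here's a quick sketch of how many identical anchors each slice of a 60/30/10 split produces at different link volumes (pure arithmetic, nothing tool-specific assumed):

```python
def anchor_counts(total_links, n_keywords, split=(0.60, 0.30, 0.10)):
    """Return (domain, generic, per-keyword) anchor counts for a link volume."""
    domain, generic, kw_share = (total_links * p for p in split)
    per_keyword = kw_share / n_keywords  # e.g. 10% spread over 5 KWs = 2% each
    return round(domain), round(generic), round(per_keyword)

# 2% of 50 links is a single identical anchor; 2% of 50,000 is 1,000 of them.
for total in (50, 50_000):
    print(total, anchor_counts(total, n_keywords=5))
# 50     -> (30, 15, 1)
# 50000  -> (30000, 15000, 1000)
```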
@spunko2010 Well, I think you'd be okay, but I don't work at Google, haha, so I can't give an exact prescription. I really do think it might not be a problem at all.
You know what's really cool?
If you have an EMD, or your main KW in the title of your page/homepage, then many CMSs and many forums/sites convert plain links and fetch the title of that page. Google has to take that into account, so it would think twice before penalizing you - at least that's what my logic says. So buying more branded domains, as opposed to EMDs, also makes sense. That's what my strategy has been, and will be, in the coming months. For example, instead of dogtraining.com I'd go with dogtrainingguide.com and use "Dog Training Guide" as the main anchor text, so it also covers "dog training".
Although I feel a bit silly teaching a high earner like you, haha - by no means am I a professional or anything. Learning something new each day!
Question for anyone. The issue that's been bouncing around in my head is this: say I'm attempting to rank for 500 keywords, each tied to a URL on a large site, and I keep my Tier One percentages at 60% generic, 30% domain, and 10% keyword. In my lower tiers, I may be building a link to a link whose anchor is "click here", "http://domain.com/path/path/dest.html", or "keyword". When I build that link, my options in GSA are to use the same anchor text, or any of the other usual options. To avoid leaving a massive footprint, I can never use the same-anchor-text option, otherwise I'll have links to links with the same weird generic anchor text (e.g. "click the up coming internet site"), so I can't easily build a theme from tier to tier using keywords. Am I missing something, or overthinking this?
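One common way around the identical-generic-anchor problem is to draw each lower-tier anchor from a weighted pool instead of fixing a single text. A sketch of the idea - the pool contents and weights are illustrative assumptions, not a tested recipe:

```python
import random

# Weighted anchor pool: mostly varied generics, a minority of theme
# keywords, and some naked URLs. Weights are illustrative assumptions.
anchor_pool = [
    (["click here", "read more", "this site", "more info"], 0.6),  # generic
    (["dog training", "puppy obedience"],                   0.2),  # keyword
    (["http://domain.com/path/path/dest.html"],             0.2),  # naked URL
]

def pick_anchor():
    groups, weights = zip(*anchor_pool)
    group = random.choices(groups, weights=weights)[0]
    return random.choice(group)

# Every lower-tier link gets its own draw, so no single odd generic
# phrase repeats across the whole tier.
anchors = [pick_anchor() for _ in range(10)]
```

This keeps the theme flowing down the tiers (the keyword group) without the same-anchor-text footprint.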
What I personally do, and will keep doing, is skip an EMD even if one is available. Day by day, Google is cracking down on EMDs, and sooner or later they seem to get hit with a penalty, mild or strong. Instead, I want to build a brand and go for branded domains that also contain my main keyword. A PMD, in simple words.
PMD also means Phrase Match Domain. In the past it was a second, less effective option compared to an EMD, but nowadays it's becoming more and more effective :-)
@Pratik - I normally use PMDs and they work well, but if I got the chance, I would snap up an EMD in a second. They work great, but you just have to be a bit careful how you use them.
@Pratik, as far as I can see, all that they are saying is that they are going to be looking at low quality EMDs, so if you build a decent site (which for the most part you need to anyway, in order for it to convert and to help it rank), and you don't over optimise it, then you shouldn't have anything to worry about.
What everyone, including the author of the article seems to miss, is that (provided you don't over optimise the site) you can build lots of keyword rich anchor text links to EMDs without ever risking tripping any filters (I'll let you figure out how :P )
That said, this is google that we're talking about, so who knows? lol
@donchino I agree, but I have started seeing the effects. One of the niche sites I'm currently building has no EMD or even PMD, and the rankings are coming along well (good content on the site, btw). However, some EMDs or PMDs have tanked in the last month or so.
I know these reports aren't enough evidence, but we also can't ignore that EMDs aren't needed at all to rank - you can still achieve those results by building a brand.
@2take2 - You're right, and it works very, very well. In local business markets I always try to go for EMDs, and I've never had anything but good results.
Does anyone have a specified link velocity for their kitchen sinks? Since these are the more spammy platforms, I'm just keeping any sort of velocity off and letting them hammer my T1, T2, and T3.
I start at 10X and then after about 1-2 months I raise it to 20X. I want to see terms in the top 50 before flipping it to a higher multiple as I want to take that momentum and up the stakes. Most of the time it works well.
Honestly the advice hasn't really changed. It's just a basis for setting up a solid tier structure. It separates contextuals from junk - which is maybe the most important point. I don't always do 3 tiers now, mainly to conserve resources on an already packed VPS.
Remember this isn't an SEO plan. This is just my view on how to tier. All the things that are SEO - many of which have nothing to do with all of this - well, that is a separate deal.
Is this still working in 2014? I read about some changes to the PR filters in this thread. If I go with no PR filters on everything except wikis and blog comments, should I be fine then?
I use the Incredible Indexer, and a couple people on my team use the Instant Link Indexer. I think all the vendors that offer indexing services here are really good guys, and all have good services. I would take a look at all of them (there are at least three), and make whatever decision is best for you.
@sickaluph - It's a huge difference. Making links means nothing if they are not indexed - as they are invisible as far as ranking goes.
Only a few months ago, you couldn't get links indexed without paying a lot of money for very few indexed links. Now the guys here on the forum offer really good indexing services for cheap, and they can handle a lot of links.
So do your homework and get on board. It is very important to get your links indexed.
@justice - Sorry I missed your question somehow. It's not so much the amount of indexing but rather how many links you are making per day and how old the website is.
When I run projects, everything goes into indexing - I use several vendors. But everything goes in that day. I don't drip or anything like that.
I control how many links I make - that is the key. And I assume that most will be indexed. So I plan accordingly. Therefore it's not about controlling the back-end (the indexing); I control the front-end, and everything else takes care of itself.
@steelbone - I used Lindexed for years but dropped them when the boys fired up the indexing services here. After all, Lindexed is a crawling service. I personally would save the $25 per month on that.
He uses multiple services. You can plug many different APIs into the Main Options for multiple indexers. It's more a matter of choice than necessity, but it isn't a bad strategy - especially if you have more than one VPS or server going.
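Mechanically, submitting to multiple indexers usually just means POSTing the same batch of URLs to several endpoints. A generic sketch - the endpoint URLs and parameter names below are hypothetical placeholders, not any vendor's real API (check each service's documentation for those):

```python
import requests  # third-party: pip install requests

# Hypothetical endpoints and keys - substitute each vendor's real details.
INDEXERS = [
    {"url": "https://indexer-one.example/api/submit", "key": "API_KEY_1"},
    {"url": "https://indexer-two.example/api/submit", "key": "API_KEY_2"},
]

def submit_everywhere(links):
    """Send the same day's links to every configured indexing service."""
    for svc in INDEXERS:
        resp = requests.post(
            svc["url"],
            data={"apikey": svc["key"], "urls": "\n".join(links)},
            timeout=30,
        )
        resp.raise_for_status()  # fail loudly if a service rejects the batch

submit_everywhere(["http://example.com/new-backlink-page"])
```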
@justice - These days with G it is very hit or miss (and a lot more miss lately). I always believe in escalation and acceleration of links. You can't really do that if you start big. Plus I am a pretty patient person.
New domains require common sense. New domains don't get a zillion links per day right out of the gate - that's totally unnatural unless the site goes viral, which is essentially the underpinning of churn and burn. So I would start at 5-10 links per day in week 1 and gradually escalate from there. I honestly believe that is your best chance at making it stick.
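As a rough illustration of that escalation, here's a sketch of a weekly ramp starting in the 5-10/day range. The 1.5x weekly growth factor is an assumption for the example, not a tested formula:

```python
# Gradual link-velocity ramp for a new domain: start small, escalate weekly.
def weekly_caps(start_per_day=8, growth=1.5, weeks=8):
    """Daily link cap for each week, growing by `growth` per week."""
    caps, cap = [], start_per_day
    for _ in range(weeks):
        caps.append(round(cap))
        cap *= growth
    return caps

print(weekly_caps())  # [8, 12, 18, 27, 40, 61, 91, 137]
```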
I've been using this tiered strategy for a while without any indexing and have actually had some pretty decent results... so if I start using an indexer, my results should be even better?
After looking through some indexers I am not sure which one to go with... is GSA SEO Indexer good enough to get the job done?
@sickaluph - I started with that, but I always had Lindexed going in the background (both are closer to being crawling services in effectiveness to be honest). GSA Indexer is lightning fast, but one issue is that it will suck all of the oxygen out of your internet connection (if you are doing big volumes like me). So if you are using it on the same VPS or server as SER+CB, you are going to cripple the speed of SER.
Having said all that, if I was on a budget and just starting out, and basically had no money, I would get GSA Indexer in a New York second. No hesitation.
However, the internet has gotten a lot bigger, and it is getting very difficult to index backlinks. Much, much more difficult. And you need those backlinks indexed to rank. So I would honestly evaluate all of the index providers and settle in with somebody you are comfortable with.
I try not to recommend any one in particular, as it is my position that competition and choices are very important to us as a community. Look at how many problems people have with VPSs and dedis. So my feeling is: the more choices, the better. Let the market sort out the good from the bad. All the indexing guys have happy customers.
That had to be one of the most interesting videos I ever watched in my entire life. I know there is a message in there for me, lol. Seriously, that was very deep. I may not be able to sleep now. I was captivated with every sentence. Wow. Thanks! (and bookmarked)
@ron, you're welcome. It also has implications for your list sales - e.g. if you ever thought of expanding the number of lists available at the same time (like adding a green list, for example), or if you ever thought of recommending a certain list to people to lower the number of choices they have to make and, as a result, increase margin (e.g. making the red list more prominent than the blue list, or, having introduced a third list, making one of the three the "most popular"). Ok, sorry, got sidetracked - I'll just stop here before things get out of hand, lol...
How do you set up the side links in a project? I'm a newbie - I understand using tier 1 links in tier 2 and tier 2 links in tier 3, but I don't understand the diagram with the targets off to the side. I know this is academic to you, but can you help me out? How would I set this up?
The only difference between the tier 2 links pointing to tier 1 and the side links pointing to tier 1 is that tier 2 is contextual, while the side links of tier 1 are just additional links of other kinds.
@vlalston - The purpose of the diagram was to clearly show you a few things:
1) The contextual tiers are stacked underneath one another and intended to be kept separate from junk
2) People always make the mistake of jumbling contextuals and junk in the same tier, and then building junk to junk underneath that.
There is never a point in wasting precious resources building junk to junk. Build contextual to contextual, build junk to contextual - but don't build junk to junk. It's a waste of link building.
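If it helps to see that rule as data, here's a tiny sketch encoding the diagram's principle - junk side-tiers point at contextual tiers, never at other junk (tier names are made up for illustration):

```python
# Each tier records what it builds links to. "kind" is "contextual" or
# "junk"; the names are illustrative, not from the diagram itself.
tiers = {
    "T1":      {"kind": "contextual", "links_to": "money_site"},
    "T2":      {"kind": "contextual", "links_to": "T1"},
    "T3":      {"kind": "contextual", "links_to": "T2"},
    "T1_junk": {"kind": "junk",       "links_to": "T1"},
    "T2_junk": {"kind": "junk",       "links_to": "T2"},
    "T3_junk": {"kind": "junk",       "links_to": "T3"},
}

def check_no_junk_to_junk(tiers):
    for name, tier in tiers.items():
        target = tiers.get(tier["links_to"])
        # Junk may support contextual tiers; junk-to-junk is wasted effort.
        if tier["kind"] == "junk" and target and target["kind"] == "junk":
            raise ValueError(f"{name} builds junk to junk")

check_no_junk_to_junk(tiers)  # passes for the structure above
```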
Lol, looks like you opened a can of worms here
I use Lindexed, the Incredible Indexer, and Express Indexer...
Help big G find your shit...
btw, I'd throw this at you if you haven't seen it already: http://www.ted.com/talks/barry_schwartz_on_the_paradox_of_choice
@Ferryman, You are very welcome.