Hey guys, is there any good way to clean your global site lists? Or is the only way to go through all the files, merge them into one big file, and then import it again? Any other good tricks?
1) The first thing is to remove duplicate URLs and duplicate domains - I do this weekly.
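The dedupe step can also be done offline on an exported list. A minimal Python sketch, assuming the site list has been dumped to a plain text file with one URL per line (the example URLs are made up):

```python
from urllib.parse import urlparse

def dedupe_urls(urls):
    """Keep the first occurrence of each exact URL."""
    seen, out = set(), []
    for url in urls:
        u = url.strip()
        if u and u not in seen:
            seen.add(u)
            out.append(u)
    return out

def dedupe_domains(urls):
    """Keep only the first URL seen for each host (domain-level dedupe)."""
    seen, out = set(), []
    for url in urls:
        u = url.strip()
        host = urlparse(u).netloc.lower()
        if host and host not in seen:
            seen.add(host)
            out.append(u)
    return out

urls = [
    "http://example.com/page1",
    "http://example.com/page1",   # exact duplicate URL
    "http://example.com/page2",   # same domain, different page
    "http://other.org/post",
]
print(dedupe_urls(urls))     # 3 URLs remain
print(dedupe_domains(urls))  # 2 URLs remain, one per host
```

Deduping URLs first and domains second gives the same end result as SER's two built-in options run back to back.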
P.S. Don't use the cleanup function. It will eliminate a lot of the dead links, but it will also lose some good links. So...
2) A more advanced method is to do what I just said (remove all duplicates), but then move (cut and paste) what remains into a new empty folder. Therefore your verified file will now be *empty*.
Next, assign an unused site list slot in Main Options (like Submitted or Failed) to this new folder you created with your cleaned-up verified links. For the sake of argument, let's say you decide to use the Failed slot. Do not check the box for it; only Verified should be checked. Remember: checking a box means that SER will *write* to that folder. The point is that you want a truly clean Verified folder, started from scratch.
Next, you can either:
A) Change all projects to use the Failed folder. To clarify, when you check sitelists in project files, you are instructing SER which folder you want it to *read* as that folder has the targets.
or you can...
B) Directly import the Failed folder into each of your projects. And of course, when it runs, it writes the good links to Verified.
Potential Issues: Nothing earth-shattering with either method, as they both work well, but your current projects have old emails, which will cause SER to say "I already made a link for this URL with this email on this website." So you need new emails. And if a website only allows one of your links per project URL, then you need an escape hatch, right? So I would create 5 new bullshit projects that are all clones, call them test projects, make up a URL that doesn't exist, stick in new emails, new spins for articles, etc., and let those run. That way you catch everything.
Sounds complicated, but it is a very easy process. You should always have several bogus projects just to test things with new b.s. URLs and new emails anyway. These are throwaway projects where, when you are done, you can change the URLs, delete account history, delete the URL cache, delete submitted, delete verifieds, and start with fresh zeros in those columns.
In fact, you can run new lists through them (ahem, lol), and capture a ton of links really fast in your verified.
Will the "remove duplicate domains" process leave only one of them after the dedupe? It's not clear to me, and I don't want to risk my verified list (: because I have a lot of subdomains on wordpress.com (for example, subdomain1.wordpress.com and subdomain2.wordpress.com).
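For what it's worth, the answer depends on whether the dedupe keys on the full hostname or on the registered domain. A quick sketch of both interpretations (I can't confirm which one SER uses, so test on a copy of your list rather than relying on this):

```python
from urllib.parse import urlparse

urls = [
    "http://subdomain1.wordpress.com/post-a",
    "http://subdomain2.wordpress.com/post-b",
]

# Interpretation 1: dedupe on the full hostname -> subdomains stay distinct.
hosts = {urlparse(u).netloc for u in urls}
print(len(hosts))  # 2 -> both subdomains survive

# Interpretation 2: dedupe on the registered domain (naive last-two-labels
# heuristic; production code should consult a public-suffix list).
registered = {".".join(urlparse(u).netloc.split(".")[-2:]) for u in urls}
print(len(registered))  # 1 -> only one wordpress.com entry survives
```

If the tool behaves like interpretation 2, a wordpress.com-heavy verified list would lose most of its subdomains, which is exactly the risk being asked about.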
Ron: do you have any plans to sell a contextual-only list? Everyone here knows it's the most important factor for ranking, so all those trackbacks, comments, etc. are not what we really need (:
I would be more than happy to pay a monthly subscription for a good contextual dofollow list. I have only about 2,000 contextual target sites, scraped with GScraper and Hrefer, but processing those scraped lists is very time-consuming ): The verified success rate is 2-3% at most, so most of the time my GSA SER is submitting to sites where there is no chance of getting a verified link. I think it could be a good idea to sell a contextual-only list separately from the two you already sell. Could you offer a service like this, now or in the future?
I know those two lists contain many contextual targets, but I mean a much bigger list focused only on contextual targets. Is it possible you will offer something like this?
I think a lot of people here would be interested in paying you for this (:
@thomas73 - We discussed that some time back, but when you get into the business of scraping and processing, you soon realize that contextuals are the most difficult to hunt down. Since people need full lists anyway to run SER, it didn't make sense economically or investment-wise to do that. So it will always be a full list.
Ron: I can imagine that, of course (:. But I'm just wondering: do all the people around here scrape contextuals themselves for their own use only (((:, or do they depend on tools like RankWyz or FCS Networker for tier 1 and then just boost those with lists consisting only of comments, bookmarks, etc.? (:
I've read a lot of posts here about how important it is to build a quality first tier based 100% on high-quality contextual sites, and I agree with that 100% (:, but no one here is selling contextual-only lists, which is why I asked. If you have a difficult keyword, 100 RankWyz target sites could be too few; after creating, let's say, 1-2k backlinks from 100 target sites, the footprints will probably kill any project. So I'm wondering how to diversify the tier-1 target sites more, and it seems there is no easy solution for top-competition projects, except maybe buying high-PR backlinks from quality sites (:
Ok, Ron, I fully understand your business point of view. It's ok, it was just a question. (:
I think it's time to rent an additional dedi for scraping and processing contextual targets for my own use only (((:. There is probably some truth in this: not many people want to share their contextual lists, because they are the most valuable part of the whole link-building process. I can imagine that if someone sold something like that, the value of those target sites would drop dramatically after some period of time, so it's not in the interest of people who make big money ranking their clients' websites to share or sell those lists (:
I am a huge fan of web2.0's on the first tier. And I am a fan of rankwyz as well. But you don't need those services to help yourself.
Leaving aside the business side of it, I like contextual links because, in my opinion, they provide better linkjuice for ranking. I use SER to drip that first tier anyway. If you are making a lot of contextuals on T1, you are going to get burned. So if you are following that plan (going easy on the T1), what you are really talking about is having a large amount of contextuals for the T2.
I have found that I can still rank with the number of contextuals that are on the list. I'm getting all different ones for the T1, and then I post again and again on the T2 with the same contextuals - and I have not had any problems. Plus I'm hammering the T1 contextuals with junk as well. So in my honest opinion, I think you have it stuck in your head that you need a massive amount of contextuals to win - and I am not seeing that to be the case in real life.
You're probably right, but I have several projects with really hard (global) phrases, and it seems it will be hard to rank them with SER (: I'm trying, but it's going very slowly...
For less competitive phrases, hammering T1 with T2 contextuals works very well for me. I ranked for over 15 keyphrases within 2 weeks, but I hammered the T1 backlinks like hell (: Of course everything goes to an indexer service, and the positions are still increasing even though I stopped building new backlinks some time ago. It was a test project, and I thought Google would penalize me fast, but it hasn't happened yet (:
How do you do it in your tier 1?
Let's say you have 3 keywords (the client pays for top-10 rankings), but of course you have to build a lot of additional anchors to provide diversity. Do you create a separate project for those 3 main (money) keyphrases and then another separate T1 project for the LSI/URL/brand anchors, etc.? Or do you mix them all together in one T1 project and then hammer them all with T2 contextuals?
The question is about:
1. RankWyz - I put only money keywords there (but I've observed that after a couple of posts many of the blogs get suspended, so I'm wondering whether it might be a better idea to post only once, forget about those blogs, and then just hammer them with T2 only).
2. GSA SER - here I split into two projects: one for money keywords, where I create only PR1+ contextuals, and a second T1 project for LSI/generic/brand/URL anchors, etc., also 100% contextual.
RankWyz and FCS are good tools, but I'm trying to figure out how to get fewer suspensions of my blogs, because it is very annoying when some of them get deleted after a couple of posts (as very often happens with wordpress.com, for example), so the time spent building T2 for them was wasted. Do you have any special strategy for building backlinks with those 2 tools?
@Thomas73: If it were that easy to find contextuals, people wouldn't need to buy lists. We spend a large percentage of our time scraping for contextuals. It ain't that easy. They are hard to come by, and getting harder every day.
The SERLists team all use exactly the same lists we are selling. I would love nothing better than to drop 50,000 contextuals into a project. And then I woke up.
@thomas73, I find that these things tend to help with not getting your web2s deleted:
-Don't name them so that they sound too spammy if you can help it (so something like "www.CasinoViagraPorn.wordpress.com" is probably out of the question haha)
-Make the first post not have any links
-Use good content if you can, or if it makes sense to do so.
-Make the second and possibly third post either not have any links, or just link out to non-competing authority sites.
-Don't post too often, 2 or 3 times a week seems good.
-Don't hammer them with tons of backlinks too fast.
The above seems to work for me, but I still lose a fair few anyway. I guess it's just the nature of the beast, and possibly the fact that auto-posting tools can trip their spam filters no matter how careful you are.
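Those rules boil down to a simple gating schedule. A hypothetical sketch (the post-index threshold and weekly cap mirror the list above, not any tool's actual settings):

```python
from dataclasses import dataclass

@dataclass
class PostPlan:
    index: int            # 1-based position in the blog's post history
    posts_this_week: int  # posts already published this week

def may_contain_money_links(plan: PostPlan) -> bool:
    """Keep the first two or three posts link-free (or authority links only)."""
    return plan.index > 3

def may_post_now(plan: PostPlan, weekly_cap: int = 3) -> bool:
    """Stay at 2-3 posts per week at most."""
    return plan.posts_this_week < weekly_cap

first = PostPlan(index=1, posts_this_week=0)
fifth = PostPlan(index=5, posts_this_week=3)
print(may_contain_money_links(first))  # False: early posts stay link-free
print(may_post_now(fifth))             # False: weekly cap already reached
```

The exact thresholds are judgment calls; the point is that the same checks run before every scheduled post, whatever tool does the posting.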
Satans_Apprentice: I already know something about that, since I'm doing it myself with a very low success rate ): but I think it's worth it. Without contextuals we get penalized fast, so there is no choice: we must buy or scrape them every single day. How many contextuals does the list you are selling consist of?
2Take2: Thanks for the advice. Of course I'm doing the first tier that way, using SeoContentMachine to provide good-quality tier-1 content, and as far as I can see it produces readable content in most cases, so it sticks. But every day I see some losses, as probably everyone here is experiencing, so I'm wondering whether it makes any sense to invest time into RankWyz to build strong niche blogs if, after some period of time, they can get deleted along with all their articles and backlinks. As I see it, the risk is high even though I post clean articles without links from time to time, keep them readable, don't post too often, etc. So the question is whether it would be better to use SEREngines and GSA lists of contextuals, drop 1-2 articles at most, then forget the account, switch to another, and create new ones. If we create many articles on a blog, there is a bigger chance someone will read them and figure out they were created for one reason (:. Of course, there is still the option of writing everything manually to avoid or lower that risk, but if you have 50-100 clients and hundreds of keywords to rank, that's impossible at the reasonable fees they are able to pay for my manual or outsourced work (writing articles).
I see what you're saying mate, and yes I guess there's always a risk that they'll get deleted and all of your lower tiers will be in vain.
You could always get around this (or at least lower the risk/loss) by linking to multiple web2s from each of the lower tier properties that you build with GSA SER, or whatever ;-)
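Linking each lower-tier property to several web2s, as suggested, is essentially a round-robin assignment. A sketch with made-up URLs (the property names and counts are illustrative only):

```python
from itertools import cycle

web2s = [
    "http://blog-a.wordpress.com",
    "http://blog-b.tumblr.com",
    "http://blog-c.blogspot.com",
]
tier2_properties = [f"http://tier2-site-{i}.example/post" for i in range(7)]

def spread_links(properties, targets, links_per_property=2):
    """Give each lower-tier property several web2 targets, rotating through
    the pool so the loss of any one web2 orphans only part of the tier."""
    pool = cycle(targets)
    return {p: [next(pool) for _ in range(links_per_property)]
            for p in properties}

assignments = spread_links(tier2_properties, web2s)
print(assignments[tier2_properties[0]])  # first property links to 2 web2s
```

With this spread, one deleted web2 costs you some of each property's links rather than an entire branch of the lower tier.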
I also like to leave the web2s to 'mature' a bit before I start tiering them up, as once they get over, say, 10-15 posts they're far less likely to get deleted, at least in my experience.
That said, with RankWyz or FCS it's a numbers game: if 10 get deleted, just build 100 more. At least, that's the way I look at it.
Also, like you say, manually doing them and then using tools to post to them is probably a waste of time, although I always like to keep some totally manual web2s as they never seem to get deleted and IMO are well worth having.
On another note, if you've got 50-100 clients you need to rank, then maybe you should look into building your own PBN to use in conjunction with the web2s, if you aren't already?