@Hunar indeed, gone are the days when you could rank in a short period of time and actually make it stick. Those sites hit SEO back in its infancy, around the year 2000, when some of the big brands were using tactics that are now well recognised, and I believe that's how they have authority to this day.
SEO should be a slow-paced race; people should be prepared to build their own authority instead of piggy-backing on, or paying for, link networks.
Seems like I'm a little bit late to the party, and the thread's moved on a bit.
@gooner, glad to hear that you're getting on well with the scraping, just don't forget to hit the 'stop' button every once in a while to check out all the other cool things that SB can do!
@jamesmurren - you were having some problems with Scrapebox? What you posted earlier should work fine, so it must be something else, probably your settings. If you're still having problems, then post some screenshots and I'll take a look.
With regards to the number of keywords to use, it really depends on how powerful your setup is and how you like to scrape - just add as many as you can without it crashing (be prepared to wait a few minutes for it to merge them, though).
As @gooner says, you can use fewer keywords if you like and it will work fine, but generally I like to just leave the program running for as long as I can, then extract the lists of 1 million URLs from the 'harvester sessions' folder at my leisure.
Maybe have a play around with it, and see what works for you?
LOL @gooner, honestly, I've got no idea - sometimes SB just does what, well, SB does :-/
Changing the subject slightly....
With regards to the debate about sharing ideas on an open forum, in principle I'm all for it, but I think that there ARE certain things that perhaps shouldn't be shared in public.
I think it's safe to say that Google knows about all the ways to game their algo - it is, after all, their algo - and it wouldn't be any stretch of the imagination to think that they've got an army of poachers-turned-gamekeepers whose job it is to spend all day long looking for loopholes.
I can't see a problem with giving people tips on how to improve their strategies, or sharing ideas on general things that are working or might work, but I do agree that giving away a strategy that's making you millions probably wouldn't be the best idea for obvious reasons.
I mean, you could even post an exact blueprint of how to rank a site on here and it probably wouldn't be a problem as 90% of people would either be too lazy to carry it out, or disregard it altogether, and another 5% would probably fuck it up. Google would probably turn a blind eye to the remaining 5% who did use it, as it wouldn't be worth their time or the resources to do anything about it.
IMO the problem comes when people try to monetise these methods, and by bringing them to the masses essentially force Google's hand. Take SAPE, for example - it was never a problem until people started selling it as a service on popular forums like BHW.
Which brings me on neatly to my next point...
@Tim89 I'm not trying to be funny, but aren't you (and others) doing exactly the same thing that you are complaining about in this thread by selling a method of indexing to the general public that certain "in the know" SEOs have been using for a while now to get an edge over their competition?
I use a similar service to yours myself, but now that this method is being used to index millions upon millions of URLs a day to essentially game google's algo, do you think that it will be long before they have no choice but to think of a way to put a stop to it?
@2take2 I honestly don't think other services are doing what I do.
To be totally honest, what can google do to find out the backend of my indexing service?
The method won't be patched because the method is not being revealed, so no, I don't think anything will be changed.
I don't think other indexers are using my method - for example, Incredible Indexer has needed to obtain two servers (or shared servers) for what they do!
Whereas with my service, I'm managing around 400 active subscribers constantly making submissions through the API without any hiccups whatsoever. Furthermore, I believe Express Indexer has more dashboard features too, which also take up resources.
I've had no problems with my server whatsoever, which could mean one of three things: either they have much, much more custom, or their backend is flawed, or they simply use a different method.
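From the subscriber side, "making submissions through the API" generally just means batching URLs and POSTing each batch. A minimal client-side sketch - the endpoint, key, and payload shape are entirely hypothetical, since the thread deliberately does not reveal any real backend:

```python
# Client-side sketch of batched URL submission to an indexing API.
# API_ENDPOINT, API_KEY, and the JSON payload shape are hypothetical
# placeholders, not any real service's interface.
import json
import urllib.request

API_ENDPOINT = "https://indexer.example.com/api/submit"  # hypothetical
API_KEY = "YOUR-API-KEY"  # placeholder

def batch(urls, size=100):
    """Split a URL list into fixed-size batches."""
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def submit_batch(urls):
    """POST one batch of URLs as JSON to the (hypothetical) endpoint."""
    payload = json.dumps({"key": API_KEY, "urls": urls}).encode("utf-8")
    req = urllib.request.Request(
        API_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Batching keeps request counts manageable on both ends, which is presumably part of how a single server can absorb hundreds of subscribers' submissions.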
Lol, thanks @2Take2 for finally putting eloquently what I'd been holding my tongue on for a while regarding the sudden rash of indexing "services". I really find it cheeky, to be honest, that they would want to capitalise on such an easy technique, and also dumb, because they are drawing attention to a good trick (which they could have just used quietly, indefinitely) just to make some quick cash, and will probably trash it for everyone else in the process.
I don't understand why so many people think all of our indexing services use the same method!
If this were the case, please explain to me why my service actually beats others on index rates? Surely if we all used the same method, then our indexing rates would be the same?
If you did have a method to make millions the best bet would be to milk it for all it's worth and then when it doesn't work any more sell it as a WSO to scrape the last few pennies out of it... Another trade secret revealed! lol
Not that I'm paranoid or anything, but it did occur to me that the best way for Google to discredit bulk link building would be to set up an 'instant' link indexing service and submit everything to their disavow database.
Perhaps that's what's in the secret sauce?
(* I still use indexing services, just being facetious)
I was thinking a good general rule of thumb regarding the sharing of information is that sharing settings for GSA is harmless - how many threads, how to get a good LpM, etc. I don't see any harm in spreading that around. But maybe the implementation of specific SEO tactics (after all, Ron is always saying THIS IS NOT AN SEO FORUM :P) should be kept more under your hat, or moved to PMs if you wanted to share them. Obviously people can do what they want, but I was just thinking in a 'best practices' kind of deal.
Any idea why that may be? If you feel scared to say in case the big G finds out... it's OK, I understand!!!