I've been wondering about the same question as @Pratik. People use the term "project" very loosely. They'll refer to a site as a project, a whole affiliate-marketing or AdSense site as a project, one keyword as a project, one tier as a project, one SER task as a project, etc. Very confusing.
Sometimes we get answers, but we perceive them differently from what was meant.
I mean, if T1, T2, T3, T1A, T2A and T3A are counted as 6 projects, then 100 projects are not that much: only 16 sites or so, or, if it's an authority site, 16 keywords. Unless, of course, you dump multiple money-site URLs or keywords into one SER task.
@Pratik, I am referring to 120 projects, not groups.
@AlexR, I'm also experimenting on a few sites with no articles and social networks as well. Just try a few different things on new sites - don't do everything the same.
@ron Cool, but just to confirm, since @audioguy pointed out the kind of confusion these terms cause: you meant 120 projects by combining all tiers, right? I.e. 120 tiers consisting of T1/T2/T3 and T1A/T2A/T3A. lol, sorry, just clearing it up.
Don't forget that SER also uses a number of different footprints to find targets, and those are coded in the engine file.
And when it does use the keyword, I use the spinfolder token, as I have keywords bundled into different files.
As Sven mentioned, sometimes keywords really don't make sense (inurl:register/bla "keyword") because it's not very likely that the KW is in the URL.
When ticking "always use keywords", does SER really always use one, even when it doesn't make sense? That would produce a lot of nonsense search queries.
Actually I'm looking for an even better way to get the MAX targets out of SER, and this includes editing each footprint like this:
- checking whether a keyword could make sense and, if so, adding it to the engine file (e.g. Searchterm="powered by wordpress" %spinfolder-c:/keywords%). That way you can be sure SER only uses a keyword when it makes sense, and that it uses a KW every time.
I'm still not sure I've got everything right; would that make sense, or is it better to just tick the "always use" option and forget about it?
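A minimal sketch (in Python, with made-up footprint and keyword data) of the idea above: only append a keyword to footprints that are flagged as keyword-friendly, so nonsense queries like inurl:register "keyword" never get built. The flag and the sample footprints are assumptions for illustration; the real footprints live in SER's engine files.

```python
# Hypothetical footprint set: (search footprint, keyword-friendly?)
FOOTPRINTS = [
    ('"powered by wordpress"', True),    # content pages: a keyword helps
    ('inurl:"register.php"', False),     # register pages: a keyword is noise
]
KEYWORDS = ["fishing", "guitar lessons"]  # stand-in for a keyword file

def build_queries(footprints, keywords):
    """Append a keyword only to footprints marked keyword-friendly."""
    queries = []
    for footprint, wants_keyword in footprints:
        if wants_keyword:
            queries.extend(f'{footprint} "{kw}"' for kw in keywords)
        else:
            queries.append(footprint)
    return queries

print(build_queries(FOOTPRINTS, KEYWORDS))
```

This mirrors the suggestion of keeping per-footprint keyword behaviour in the engine file instead of a global "always use keywords" switch.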
@startrip - this really needs to be addressed at some point. I'm trying to come up with a solution; maybe a two-part one would work. First, SER would use a footprint from the engine that is friendly with a keyword attached, e.g. "Powered by XYZ" + KW. So each platform would have one set of footprints that works with a keyword and one set that doesn't, and if you enable "always use keywords", it uses the set that takes keywords. With all the SER users out there, there needs to be a way for everyone to get targets based on their own keywords, or the standard targets will lose their value, and SER will lose its value, within about a year.
Adding keywords doesn't make sense for footprints that target register pages, because register pages aren't keyword-rich and almost all contain the same content.
For example, for ArticleScript:
inurl:"login.php" "Login to access your author control panel"
Basically, what sawa73 and rodol said is right, but I'll give you something to think about.
You can expand the range of results you get for "laser-targeted" footprints by adding "two-letter" keywords, for example "aa"-"zz". Furthermore, you can add wildcards (*) for Google SEs, so you can use "a*a"-"z*z". Combine this with numbers 0-9 and spintax combinations like {*|{a|b|z}|{*aa|*ab|*zz}} and you'll get a good variety of different results from the Google SE (as * is an operator that won't work on every SE).
As a side effect you get multilanguage keywords for searching, since all "short character" combinations occur in every language that uses Latin characters, and you can regulate how often you search with no keyword at all ("{*| }") versus with some keyword. In my example, every type of keyword is used with equal weight: {33%|33%|33%}.
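The combination sets described above are easy to generate programmatically. A small sketch (Python, my own illustration, not anything SER ships) that builds the "aa"-"zz" keywords, their Google wildcard variants, and a short spintax string mixing both:

```python
from itertools import product
from string import ascii_lowercase

# All two-letter combinations "aa".."zz" (26 * 26 = 676 keywords).
two_letter = ["".join(pair) for pair in product(ascii_lowercase, repeat=2)]

# Wildcard variants "a*a".."z*z" (the * operator is Google-specific).
wildcarded = [f"{a}*{b}" for a, b in product(ascii_lowercase, repeat=2)]

# A spintax string mixing a bare wildcard, plain combos, and wildcarded
# combos, roughly the {33%|33%|33%} split described above (first 3 of
# each set shown to keep the string short).
spintax = "{*|{" + "|".join(two_letter[:3]) + "}|{" + "|".join(wildcarded[:3]) + "}}"

print(len(two_letter), len(wildcarded))
print(spintax)
```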
@Ozz, I did that with Scrapebox a while ago to test it.
I mixed a footprint set (I think Drupal) with numbers 00-99, a-z, Monday-Sunday, different names (marc, michael...), months, country names, site:.de, site:.com, site:.net, etc.
I ended up with a giant list, but 81% (!!!) were duplicates. I thought I'd found something really great, but it obviously doesn't improve things.
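Measuring that duplicate rate is straightforward. A sketch (Python, with a tiny made-up URL sample) of one way to count how many scraped URLs collapse onto the same page once query strings and fragments are ignored:

```python
from urllib.parse import urlsplit

def dedup_rate(urls):
    """Fraction of scraped URLs that are duplicates once normalised to
    host + path (query strings and fragments often differ only
    cosmetically on the same target page)."""
    seen = set()
    for url in urls:
        parts = urlsplit(url)
        seen.add((parts.netloc.lower(), parts.path))
    return 1 - len(seen) / len(urls)

# Tiny made-up sample: 4 URLs, 3 of which collapse to the same page.
sample = [
    "http://example.com/node/1",
    "http://example.com/node/1?page=2",
    "http://example.com/node/1#comment",
    "http://other.org/register",
]
print(dedup_rate(sample))  # 0.5 here; the run described above saw 81%
```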
We must remember there is a limited number of targets out there. The internet is big, but we are also dealing with big numbers. In 2012 there were 650 million webpages; let's say it's 1 billion now. When you let SB run on 100 proxies for a week, you can easily fetch 50 million - that's 5% of the internet, so to speak. And with our footprints we reduce the number of scrapable webpages drastically.
Long story short: there is no way to scrape perfectly, but since the number of targets is also extremely limited, we don't need to, and should focus on other things (making money, for example).
Making money is the MOTIVE that makes us wonder: what are the best footprints to scrape the best URLs, to build the best links, to rank our money site and MAKE MONEY!
Yes, but I think there is more to it than messing around too much with scraping. I mean, once you have 100k+ verifieds in your global list, you can use it again to rank almost any page... sometimes size doesn't matter.
You have to use %random-a-z%|%random-0-9%. Apart from that it should work, and you can combine it any way you like and control how often the macros are used.
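For anyone unsure how these alternative groups resolve: a rough sketch of spintax expansion, picking one random alternative per {a|b|c} group, innermost group first. This is my own stand-in for illustration, not SER's actual parser, and the %random-a-z% / %random-0-9% macros themselves are SER-specific, so I leave them as literal text here.

```python
import random

def expand_spintax(text, rng=random):
    """Expand {a|b|c} spintax groups one at a time, innermost first,
    picking a random alternative for each group."""
    while "{" in text:
        # Innermost group: the last "{" before the first "}".
        close = text.index("}")
        open_ = text.rindex("{", 0, close)
        choice = rng.choice(text[open_ + 1:close].split("|"))
        text = text[:open_] + choice + text[close + 1:]
    return text

rng = random.Random(42)  # seeded so repeat runs pick the same branch
print(expand_spintax("{%random-a-z%|%random-0-9%|{aa|a*a}}", rng))
```

Each call yields one of the alternatives; over many queries the variety is what spreads the search results around.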
@sawa73 - that's exactly what I meant but didn't explain nicely: one set of footprints to find decent places and another to find the register page or whatever.
@Ozz - you just place that in the engine footprint between the | |, right? How do you get around the number of duplicates you find that way?
I just try to get the widest diversity of results and fragment the results with:
- keywords -> different combinations of two letters, syllables, most-used words, wildcards, plus a combination of all of them when it makes sense (= when results are expected)
- SE mods -> different Google operators for languages, countries and so on
@Ozz - wouldn't you get the same results from just using a larger keyword list and always enabling "use keywords"? Then you wouldn't need to update the footprints on every SER update. It sounds like it would have the same effect as adding all these extra combinations to the footprints.
Well, "always use keywords" obviously needs to be activated, and I never said you need to change the footprints in the engine files for this. Large keyword lists have the disadvantage that they mostly work for one language only (like English), but when you are using SEs for other languages as well, those "character combinations" obviously work better, as they occur in every language that uses Latin characters.
Furthermore, you can still use your keyword lists and combine both together: {50% keyword list|50% keyword combinations}. What you do is limited only by your imagination; just test it by hand in your browser and do whatever works best for you.
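The {50% keyword list|50% keyword combinations} mix above can be simulated with a weighted coin flip. A sketch (Python; the keyword and combo lists are placeholders, and the 50/50 split is just the example ratio from the post):

```python
import random

def pick_keyword(keyword_list, combos, rng=random, p_list=0.5):
    """Mimic {50% keyword list|50% keyword combinations}: with
    probability p_list draw from the real keyword list, otherwise
    from the short letter combinations."""
    pool = keyword_list if rng.random() < p_list else combos
    return rng.choice(pool)

keywords = ["fishing", "guitar lessons"]  # stand-in keyword file
combos = ["aa", "ab", "a*a"]              # stand-in letter combinations

rng = random.Random(1)
picks = [pick_keyword(keywords, combos, rng) for _ in range(6)]
print(picks)
```

Adjusting p_list is the equivalent of changing the percentage weights in the spintax.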
It's always a tradeoff between ease and number of results. I might try using different keywords rather than making those edits in the engine files and having to redo them on every SER update...
This thread was truly inspiring, but I just want to clarify some details. @insane, @ron - when you buy domains for spamming/testing purposes, do you use a) private registration, b) fake names, or c) your own name?
Private registration isn't free, but nobody wants to be associated with spam projects. If the domains are intended for just a few months to a year, it could be reasonable to use fake user data when buying them.
For example, for ArticleScript:
inurl:"login.php" "Login to access your author control panel"
This footprint targets register pages, which are all similar (e.g. http://www.indiangoarticles.com/login.php), so adding keywords won't help in this case.
But if you use a footprint that targets article pages:
"About the Author" "Submit Articles" "Most Viewed - All Categories"
you can append keywords to find more targets.
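Appending keywords to a content-page footprint is just string concatenation per keyword. A quick sketch (Python; the keyword list is a placeholder for whatever keyword file you use):

```python
# Content-page footprint from the post above, plus hypothetical keywords.
footprint = '"About the Author" "Submit Articles" "Most Viewed - All Categories"'
keywords = ["fishing", "guitar lessons", "dog training"]

# One search query per keyword: each keyword slices out a different
# subset of the pages matching the footprint.
queries = [f'{footprint} "{kw}"' for kw in keywords]
for q in queries:
    print(q)
```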
Would something like this work: {#random[a-z,0-9]#random[a-z,0-9]*|#random[a-z,0-9]#random[a-z,0-9]|#random[a-z,0-9]*#random[a-z,0-9]}?