@sonic81 I compared most of them and only saw modifications to the search terms. My reply was somewhere between asking for confirmation and asking for opinions about this; I don't know whether Zeusy didn't think about optimizing this or there are reasons not to do it (maybe it's not worth the time).
"I found that there is a lot of room for improvement in the "page must have" and "url must have" options, and GSA SER is finding good urls because of the search terms, but then not finding the right engine because of wrong "page must have" and "url must have" options. For instance, I found some Pligg sites where I could post with other tools but SER was giving me a "no engine matches" "
I know this is important and I'm glad you mentioned it.
To confirm I just updated the footprints, keywords and to add keyword to the search. Yesterday, I have another look at the likes of Article Beach engine and to be honest the page must have = Needs also updated on many engines. Using google and a new footprint I found over 1million pages, but when I told the engine the search term and tested it didnt identify many of the article sites, which confirmed they need updating.
lrichard2112 you can use both, be you may get a lot of already parsed, I would go with one or the other.
@eLeSlash - What @Zeusy did specifically helps users of SER. Every person on this forum should get Gscraper, obtain top notch footprints, and stick those in the engine footprints of SER.
This is one example where people should not rely on one another. The people who do this will very much benefit in getting more links and higher LPM.
Comments
@kaene, I find your comment very interesting:
"I found that there is a lot of room for improvement in the "page must have" and "url must have" options, and GSA SER is finding good urls because of the search terms, but then not finding the right engine because of wrong "page must have" and "url must have" options. For instance, I found some Pligg sites where I could post with other tools but SER was giving me a "no engine matches" "
I know this is important and I'm glad you mentioned it.
Is it OK to check both copies of the engine .ini files (SER's defaults and Zeusy's)?
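To make that failure mode concrete: conceptually, SER identifies an engine by checking the downloaded page and its URL against the "page must have" and "url must have" strings, and if no engine's strings match you get "no engine matches" even though the search term found a perfectly good target. A rough sketch of that idea in Python (an illustration only, not SER's actual code; the engine definitions below are made-up placeholders):

# Illustrative engine identification: each engine lists strings the page
# source and the URL must contain. If nothing matches, the URL is discarded
# as "no engine matches" even though the search term found it.
ENGINES = {
    # placeholder definitions, not real SER engine files
    "Pligg": {
        "page must have": ["Powered by Pligg", "pligg"],
        "url must have": ["/story.php", "/submit"],
    },
    "Article Beach": {
        "page must have": ["Powered by Article Beach"],
        "url must have": [],
    },
}

def identify_engine(url, html):
    # Return the first engine whose strings match, else None ("no engine matches").
    # "Any of" matching is an assumption here; SER's real rules may differ.
    for name, rules in ENGINES.items():
        page_ok = any(s.lower() in html.lower() for s in rules["page must have"])
        url_ok = (not rules["url must have"]
                  or any(s.lower() in url.lower() for s in rules["url must have"]))
        if page_ok and url_ok:
            return name
    return None

print(identify_engine("http://example.com/story.php?id=1",
                      "<html>Powered by Pligg</html>"))  # -> Pligg

If the placeholder strings were wrong for the platform, the function would return None, which is exactly the "no engine matches" situation described in the quote.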
kaene
To confirm: I only updated the footprints and keywords, and added the keyword to the search. Yesterday I had another look at the likes of the Article Beach engine, and to be honest the "page must have=" entries also need updating on many engines. Using Google and a new footprint I found over 1 million pages, but when I gave the engine that search term and tested it, it didn't identify many of the article sites, which confirmed they need updating.
@lrichard2112 you can use both, but you may get a lot of "already parsed" results; I would go with one or the other.
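For anyone who wants to run the same kind of check on the Article Beach test described above, here is a rough sketch of the idea: take a batch of URLs found with a new footprint, fetch them, and count how many the current "page must have" strings would actually identify. A low hit rate means the engine definition needs updating. (This is a simplified illustration, not the actual procedure used; the urls.txt file and the placeholder string are assumptions.)

# Rough check: how many pages found by a new footprint would the current
# "page must have" strings actually identify?
import urllib.request

PAGE_MUST_HAVE = ["Powered by Article Beach"]  # placeholder strings

def would_identify(html):
    return any(s.lower() in html.lower() for s in PAGE_MUST_HAVE)

hits = total = 0
with open("urls.txt") as f:          # one scraped URL per line (placeholder file)
    for url in (line.strip() for line in f if line.strip()):
        total += 1
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read(200_000).decode("utf-8", errors="ignore")
        except Exception:
            continue                 # unreachable pages are simply skipped
        if would_identify(html):
            hits += 1

print(f"{hits}/{total} pages matched the current 'page must have' strings")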
In the Search section we have syntax options for:
site:
inurl:
link:
intitle:
I would recommend adding another one: intext:
In the engine files it would also be nice to have separate "search term=" sections, for example:
search term site:=
search term inurl:=
This would allow footprints to be focused per search engine without wasting resources. For example, if Google is my search engine it can handle the inurl: operator, while other search engines can't, which is a waste.
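To illustrate the proposal (purely hypothetical, since SER has no per-operator search term sections today; the operator lists per search engine below are assumptions, not verified capabilities):

# Sketch of the idea: keep footprints grouped by the operator they need, and
# only send an operator-specific footprint to search engines assumed to
# support that operator, instead of wasting queries.
SUPPORTED_OPERATORS = {          # assumptions for illustration only
    "google": {"site:", "inurl:", "intitle:", "intext:", "link:"},
    "some-other-engine": {"site:", "intitle:"},
}

FOOTPRINTS = {                   # placeholder footprints grouped by operator
    "": ['"Powered by Pligg"'],                 # plain footprints, no operator
    "inurl:": ["inurl:story.php?title="],
    "intext:": ['intext:"submit a new story"'],
}

def queries_for(engine):
    # Return only the footprint variants this engine can actually use.
    supported = SUPPORTED_OPERATORS.get(engine, set())
    out = []
    for operator, terms in FOOTPRINTS.items():
        if operator == "" or operator in supported:
            out.extend(terms)
    return out

print(queries_for("google"))
print(queries_for("some-other-engine"))

The point is simply that operator-specific footprints would only be sent to engines that can use them, so queries are not wasted.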
I use only your edited engine.ini files and I can't get any verified links from a 24-hour run.
http://prntscr.com/yt433
http://prntscr.com/yu0dz
Dunno why this is happening
I know @Ozz would love for me to set him up here (and he even has a special banner for this that he hand made), but:
PUBLIC PROXIES SUCK
Translation: Don't use them.
@eLeSlash - What @Zeusy did specifically helps users of SER. Every person on this forum should get GScraper, obtain top-notch footprints, and stick those in the engine footprints of SER.
This is one example where people should not rely on one another. The people who do this will benefit greatly, getting more links and a higher LPM.
I've got an idea of what to do with @Zeusy's files... but I don't get the point of using GScraper here.
Am I missing something?
The bottom line is that if you have better footprints, you will:
Not for finding footprints, for testing footprints.