@sonic81 I compared most of them and only saw modifications to the search terms. My reply was somewhere between asking for confirmation and asking for opinions about this; I don't know if Zeusy didn't think about optimizing this or if there are reasons not to do it (maybe it's not worth the time).
"I found that there is a lot of room for improvement in the "page must have" and "url must have" options, and GSA SER is finding good urls because of the search terms, but then not finding the right engine because of wrong "page must have" and "url must have" options. For instance, I found some Pligg sites where I could post with other tools but SER was giving me a "no engine matches" "
I know this is important and I'm glad you mentioned it.
lrichard2112
Noob question :
Is it OK to check both duplicate engine .ini files (the SER default and Zeusy's)?
To confirm: I just updated the footprints and keywords, and added the keyword to the search. Yesterday I had another look at the likes of the Article Beach engine, and to be honest the "page must have=" needs updating on many engines too. Using Google and a new footprint I found over 1 million pages, but when I gave the engine the search term and tested it, it didn't identify many of the article sites, which confirmed they need updating.
lrichard2112 you can use both, but you may get a lot of "already parsed" results; I would go with one or the other.
eLeSlash
edited April 2013
Shouldn't the option "Put keyword in quotes when used in search queries" add + "keyword" to the footprint?
Zeusy
edited April 2013
I'm thinking it might be greatly beneficial if the engines had more than just one search string section. In the Search section we have syntax options for:
site: inurl: link: intitle:
I would recommend adding intext: as well.
In the engine files it would be nice to have separate "search term=" sections. Example:
search term site:=
search term inurl:=
This would allow focusing footprints per search engine without wasting resources. For example, if Google is my search engine, it can handle inurl:, while other search engines can't, which is a waste.
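To make the proposal concrete, a hypothetical engine .ini fragment might look like the sketch below. The per-syntax keys ("search term inurl:=" etc.) are invented for illustration, not real SER syntax, and the footprints are made-up examples:

```ini
; Sketch only - the per-syntax keys below are a proposed extension,
; not something SER currently supports.
[search]
search term="powered by articlebeach" "submit article"
search term inurl:=inurl:articlebeach "submit article"
search term site:=site:*.articlebeach.* "submit article"
; Search engines that support inurl:/site: would use the specific keys;
; the rest would fall back to the plain "search term=" line.
```

The point is that SER could then pick the most specific query each search engine can actually parse, instead of sending inurl: queries to engines that ignore them.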
eLeSlash
Is there any way you could add the mini platform icons to the left of our backlinks (in the "Last verified URL" field)?
I know @Ozz would love for me to set him up here (and he even has a special banner for this that he hand made), but:
PUBLIC PROXIES SUCK
Translation: Don't use them.
ron SERLists.com
@svobada - The answer is no. SB cannot help for this type of thing.
eLeSlash
@Ron so what Zeusy did with the engines, does it only work when used with Gscraper? Or can it be used solely with GSA's internal scraping as well?
ron SERLists.com
@eLeSlash - What @Zeusy did specifically helps users of SER. Every person on this forum should get Gscraper, obtain top notch footprints, and stick those in the engine footprints of SER.
This is one example where people should not rely on one another. The people who do this will very much benefit in getting more links and higher LPM.
unikbit
@zeus, did you also get their proxy service, or do you just use semi/private proxies?
Musk
I feel like I shouldn't be going for MORE results as much as going for footprints that yield a high V:S ratio.
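The V:S (verified-to-submitted) idea above can be made concrete with a small sketch. The footprints and counts below are made-up examples; in practice you would pull per-footprint submitted/verified totals from your own test runs:

```python
# Sketch: ranking footprints by verified-to-submitted (V:S) ratio.
# The counts are invented example data, not real measurements.
footprint_stats = {
    '"powered by pligg"': {"submitted": 900, "verified": 45},
    '"submit a new story"': {"submitted": 400, "verified": 60},
    'inurl:register.php "pligg"': {"submitted": 120, "verified": 30},
}

def vs_ratio(stats):
    """Verified links per submission; 0 if nothing was submitted."""
    return stats["verified"] / stats["submitted"] if stats["submitted"] else 0.0

# Highest-yield footprints first
ranked = sorted(footprint_stats,
                key=lambda fp: vs_ratio(footprint_stats[fp]),
                reverse=True)
for fp in ranked:
    print(f"{fp}: {vs_ratio(footprint_stats[fp]):.2f}")
```

A footprint that scrapes a million pages but converts almost none of them will rank below a narrower footprint that converts well, which is the point being made.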
eLeSlash
@Ron I know that, but I still don't know the answers to my questions.
Zeusy
unikbit I use private proxies only, for everything.
KayKay
@Zeusy the engines are working great, got a lot of submissions with your fixed engines, thanks.
kinglouie
Again... what do you guys use GScraper for?
I've got an idea of what to do with @Zeusy's files, but I don't get the point of using GScraper here.
Am I missing something?
ron SERLists.com
edited April 2013
The bottom line is if you have better footprints, you will:
Find more targets
Submit more links
Get more verified links because you submitted more links
kinglouie
So GScraper is used for finding more footprints?
ron SERLists.com
Not for finding footprints, for testing footprints.
kinglouie
You lost me here... LMAO
Hunar
Which part is confusing you? The footprints in general, or testing the footprints?
kinglouie
I know what footprints are, but I don't know how to test them? Or how to find more of them!?
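For what "testing" means here, one common reading of the advice above: scrape each candidate footprint with a tool like Gscraper, export the URL list, then measure how many of those URLs SER actually identifies as a known engine. A rough sketch, where the example URLs and the 50% threshold are invented for illustration:

```python
# Rough sketch of footprint testing: compare how many URLs a footprint
# scraped vs. how many were later identified as a matching engine.
# The data and threshold below are hypothetical examples.

def yield_rate(scraped_urls, identified_urls):
    """Fraction of scraped URLs that were identified as a known engine."""
    scraped = set(scraped_urls)
    if not scraped:
        return 0.0
    return len(scraped & set(identified_urls)) / len(scraped)

# Stand-ins for a Gscraper export and SER's identified-targets list
scraped = ["http://a.com/submit", "http://b.com/register", "http://c.com/"]
identified = ["http://a.com/submit", "http://b.com/register"]

rate = yield_rate(scraped, identified)
keep_footprint = rate >= 0.5  # keep footprints above a threshold you choose
```

Footprints with a high yield go into SER's engine files; low-yield ones get discarded, which is the "testing, not finding" distinction Ron draws above.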
I use only your edited engine.ini files and I can't get any verified links from a 24-hour run.
http://prntscr.com/yt433
http://prntscr.com/yu0dz
Dunno why this is happening.