That's awesome. But then how do you know which footprint to put in each .ini file? That must take a lot of time. There are hundreds of files in the engines folder, but when you export them with Santos's tool they get lumped into one big file.
A few days spent doing the boring stuff and months sitting back wondering what I can tweak next
There are no quick ways to get the results I do on a daily basis. A few shortcuts have been added recently, like being able to copy engine selections from one project to the next, but you need to spend time with SER to make it run sweet and fast.
Dude, that's crazy. All I can say. No way I'm going in there and editing all of those files. I just scrolled down... and there are a lot of damn .ini files.
But @king818, that's the kind of stuff you do if you want to make a ton of submissions. All you do is stick a few footprints in GScraper with the same keywords, and see what yields the most results.
Then you look at the results and change that engine. Keep the edited engine file in a separate folder which you can use to copy over the original.
And if you keep using 5.22, you make the change once and you're done, as long as you don't update. Or you just copy over after an update.
This is the kind of busy work that pays dividends every day going forward.
The free version does it. Get 200 free proxies from BHW, test them in SER's proxy tester, then load them into GScraper free. Export footprints from SER or use some found on BHW... let it run for a day. Done.
Blog comments sure do. I love them on lower tiers. I seem to be able to get a ton more blog comments than other kinds.
Paid (Pro). $38, one-time fee. You get free use of their proxies and a 7-day trial when you pay. So if you are in the mood and have the spare time, you could take advantage of that offer and really help yourself out.
I would just make sure you get organized ahead of time, grab the footprints, know which engines you want to target, yadayada, so when you pull the trigger you come out swinging.
If I just go into Google and type "Powered by 4images" "Agregar comentario" or whatever the footprint is, and check how many results it has, is this doing the same thing GScraper does, albeit slower?
@king818 - I would familiarize yourself with the Search box in the upper right-hand corner. Most of your questions have been asked and answered numerous times.
I have yet to find a thread on this forum which elaborates on editing the .ini files. That's pretty advanced SER stuff. Additionally, searches for "ini" and "footprint" yield no applicable results.
I have the footprints I like and don't like. I just don't know how to edit these files appropriately
Thanks guys, I think I'm good. This stuff may be too complicated for me anyhow; that's why I'm trippin'. I think I'll stick with my 80% SER optimization. You guys are experts on another level.
@king818 - Just to give you an example. Open up the blogtronix.ini file in Notepad++. About 10 or 12 lines down you see this (notice it is in a modified spin syntax using "|"):
search term="Sign up with your email address. There are already * registered members."|"Powered by Blogtronix"|"Attached Image: " "Powered by Blogtronix"|"Powered by Sharetronix"|"External Profiles" "Last online" "About Me"|"users can communicate using quick status updates of 160 characters or less." "This free flowing dialogue lets you send messages, pictures and video to anyone"|"It`s also easy to find and connect with other people for private threads and to keep track of their updates."
Each one of those phrases between the pipes is being used to find targets. Now say you run some scrapes in gscraper or SB.
You set up each scrape with only one of those phrases, and get all of them running.
Now let's say you find that the 3rd phrase gets you 90% of the total results. Now don't you think it would be a tremendous improvement in efficiency to remove all the other phrases except the good one?
It's that easy. Time consuming? Yes. Worthwhile? A big Yes.
Now let's say that those footprint lists I mentioned earlier had 10 other footprints for this engine that are completely different and unique.
Test those as well. It's just a scrape to find out the answer. Which ones are worthwhile, and which ones are crap.
Please don't turn it into rocket science. You are short selling yourself. These are the kind of things you do if you want to win at this game.
It is mindless crap that will get you more links and make you more money.
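If you want to pull those footprints out of the engine files without copying them by hand, a minimal Python sketch along these lines will do it. The Engines folder path is only an example (adjust it to your install), and it assumes the footprints sit on "search term=" lines like the one above:

    # Sketch: pull the pipe-separated footprints out of each engine .ini
    # so they can be tested one at a time in GScraper or Scrapebox.
    # ENGINES_DIR is an example path only -- point it at your own SER Engines folder.
    import os

    ENGINES_DIR = r"C:\Users\you\AppData\Roaming\GSA Search Engine Ranker\Engines"

    def footprints_from_ini(path):
        """Return the individual footprints found on 'search term=' lines."""
        found = []
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                line = line.strip()
                if line.lower().startswith("search term="):
                    value = line.split("=", 1)[1]
                    # each phrase between the pipes is one footprint
                    found.extend(p.strip() for p in value.split("|") if p.strip())
        return found

    for name in sorted(os.listdir(ENGINES_DIR)):
        if name.lower().endswith(".ini"):
            for fp in footprints_from_ini(os.path.join(ENGINES_DIR, name)):
                print(f"{name}\t{fp}")  # one footprint per line, tagged with its engine

Scrape each footprint on its own, see which ones actually return results, then trim that engine's "search term=" line down to the winners.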
@Ron & @LeeG - right on! It looks like we have come full circle again to my earliest concern about SER: the huge overlap in SE results and the continual parsing of them again and again with a slightly modified keyword.
Do you guys know of a tool that does the following:
1) Takes first keyword as master keyword/baseline.
2) Takes a bunch of keywords and checks in an SE what percentage of unique results compared to master they generate.
3) Has option to delete anything below X %?
I have been looking for something like this for 6 months!
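For what it's worth, if you already export the scraped URLs per keyword (one results file per keyword from GScraper or SB), a rough Python sketch of steps 1-3 could look like this. The file names and the threshold value here are made up for illustration:

    # Sketch: compare each keyword's scraped results against a master keyword
    # and flag keywords that add too few unique domains. File names are hypothetical.
    from urllib.parse import urlparse

    def domains(path):
        """Unique domains found in a file of scraped URLs, one URL per line."""
        out = set()
        with open(path, encoding="utf-8", errors="ignore") as f:
            for line in f:
                host = urlparse(line.strip()).netloc.lower()
                if host:
                    out.add(host)
        return out

    MASTER = "keyword_master.txt"                          # 1) baseline keyword's results
    CANDIDATES = ["keyword_blue.txt", "keyword_red.txt"]   # 2) keywords to check
    THRESHOLD = 10.0                                       # 3) drop anything under this % unique

    master_domains = domains(MASTER)
    for path in CANDIDATES:
        d = domains(path)
        unique = d - master_domains
        pct = 100.0 * len(unique) / len(d) if d else 0.0
        verdict = "keep" if pct >= THRESHOLD else "drop"
        print(f"{path}: {pct:.1f}% unique vs master -> {verdict}")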
Ozz, don't give them any more methods. They will be desperately confused and trying to overcomplicate things by adding g-force calculus and pi x infinity type calculations to work out how many times a sparrow flaps its wings, and then adding that to the final method of getting the best footprints to use.
Not saying Ron does that, but I might be implying it.
A little trick with GScraper free to get your 100k results maxed is to remove duplicate domains at scraping.
I'm testing the free version at the moment to see how good it is.
It's more basic than Scrapebox. Less confusing for us old wise ones.
I'm also running a proxy scraper 24/7 now to feed it fresh proxies on each run.
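If you'd rather strip duplicate domains after the fact on an exported list (instead of ticking the option at scrape time in GScraper), a few lines of Python will do it; the file names here are just placeholders:

    # Sketch: keep only the first URL seen for each domain in a scraped list.
    # Input and output file names are placeholders.
    from urllib.parse import urlparse

    seen = set()
    with open("scraped_urls.txt", encoding="utf-8", errors="ignore") as src, \
         open("scraped_urls_dedup.txt", "w", encoding="utf-8") as dst:
        for line in src:
            url = line.strip()
            host = urlparse(url).netloc.lower()
            if url and host and host not in seen:
                seen.add(host)
                dst.write(url + "\n")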
So everyone take note.
On the pro version, there is a tool that shows the results for footprints.
You can see from that which terms will throw up a lot of results.
That's why I make SER dance the way I do.
I spend the time doing it right:
Edit each ini file, one at a time.
The free version does not have the footprint checker on it.
You can import footprints, but it does not have the bit in the screenshot above.
That's the joy of the paid version: you can see how many results each footprint can pull.
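If you're stuck on the free version, a crude stand-in for that footprint checker is to scrape each footprint into its own file and just count what came back. The folder layout here is an assumption:

    # Sketch: poor man's footprint checker for the free version.
    # Assumes each footprint was scraped into its own file inside ./footprint_scrapes/
    import os

    FOLDER = "footprint_scrapes"  # hypothetical folder of per-footprint exports

    counts = []
    for name in os.listdir(FOLDER):
        with open(os.path.join(FOLDER, name), encoding="utf-8", errors="ignore") as f:
            counts.append((sum(1 for _ in f), name))

    # biggest yields first -- these are the footprints worth keeping in the .ini
    for total, name in sorted(counts, reverse=True):
        print(f"{total:>8}  {name}")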
On the private internet marketing forum I moderate, the guys have called me the Dr Frankenstein of SER with the tricks I pull out of the bag.
Each time, I'm getting closer to the half million submissions in a day, without list feeding.
Some tricks can be easy; others just take time to implement and test.
This one's easy with the right tool for the job, just time consuming.
Again, you're missing the point, thisisalex.
Why run a secondary link scraper and use more resources?
The whole idea of SER is to be able to scrape results and post links with no need for extra methods of adding link targets.
A simple set and forget method
Setting up scrapebox or gscraper to scrape links is missing the point of what ser does best
All the time you're using tools like that to import lists, you're not spreading the links very widely.
The more keywords you use, the more results you pull.
The better the footprints, the more places you find to place links.
You might be lacking the will to do things properly and gain good results, but don't think everyone else has your same lazy attitude
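To put that in concrete terms: roughly speaking, SER builds its search queries by pairing engine footprints with your project keywords, so the number of distinct queries grows multiplicatively. A simplified sketch (the keywords here are made up, and real query building has more to it):

    # Simplified sketch: query count grows with footprints x keywords.
    from itertools import product

    footprints = ['"Powered by Blogtronix"', '"Powered by 4images" "Agregar comentario"']
    keywords = ["guitar lessons", "dog training", "weight loss"]  # made-up keywords

    queries = [f"{fp} {kw}" for fp, kw in product(footprints, keywords)]
    print(len(queries), "possible queries")  # 2 footprints x 3 keywords = 6
    for q in queries:
        print(q)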
https://forum.gsa-online.de/discussion/2325/can-i-speed-up-the-importindentify-function/p1
"
Ozz, dont give them any more methods. They will be desperately
confused and trying to over complicate things by adding g force calculus
and pi x infinity type calculations to work out how many times a
sparrow flaps its wings and then adding that to the final method of
getting the best footprints to use
Not saying ron does that, but I might be implying it
A little trick with gscraper free to get your 100k results maxed is to remove duplicate domains at scraping
Im testing the free version at the moment to see how good it is
Its more basic than scrapebox. Less confusing for us old wise ones
Im also running a proxy scraper 24/7 now to feed it fresh proxies on each run.
If any asks, Proxy Multiply. 7 day free trial
""
How do some of you lot possess the ability to breathe at times?
Come on, scrape a list with the GScraper free version, using Santos's free tool to extract the footprints.
Leave that running overnight with its 100,000 URL limit and bang, the following day add them to a T4.
Easily sorted into the two main site list areas.
At worst you have a load of junk links on a lower tier
No doubt given time you will find a method to complicate a simple task by adding your quantum physics girly tool methods
"and still. I am helpful, and have positive attitude. you are insulting again.