1. What important factor do you look for? 2. What metric do you consider for T1 links from the scraped list? 3. Do you use the same list for T2 and T3? 4. How do you get new footprints to scrape fresh domains?
1. I do not understand this question. 2. I do not consider any metrics. They are absolutely useless and mean nothing. 3. Yes. 4. A footprint is a word or phrase that you can commonly find across all or most websites using a particular CMS. So you have to look at all the websites that have been identified as that particular CMS and find the footprints. You can use the footprint studio in SER, or just use your own eyeballs and brain manually. I prefer manual work for this.
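Not from the original post, just a rough illustration of the manual idea above: collect a handful of pages you already know run the same CMS and count which short phrases appear on most of them - those are your footprint candidates. The URLs and the 4-word phrase length here are placeholders, not anything the poster recommended.

```python
# Hypothetical sketch: find candidate CMS footprints by counting phrases
# shared across pages known to run the same CMS. URLs are placeholders.
from collections import Counter
from urllib.request import urlopen
import re

KNOWN_CMS_PAGES = [
    "https://example1.com",
    "https://example2.com",
    "https://example3.com",
]

def phrases(html, n=4):
    """Return the set of n-word phrases found in the page's text."""
    text = re.sub(r"<[^>]+>", " ", html)           # crude tag strip
    words = re.findall(r"[A-Za-z][A-Za-z'\-]*", text.lower())
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

counts = Counter()
for url in KNOWN_CMS_PAGES:
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    except OSError:
        continue                                   # skip unreachable sites
    counts.update(phrases(html))                   # each phrase counted once per site

# Phrases present on most of the sample are footprint candidates,
# e.g. "powered by <cms name>" style strings.
threshold = 0.8 * len(KNOWN_CMS_PAGES)
for phrase, seen in counts.most_common(20):
    if seen >= threshold:
        print(seen, phrase)
```

This is basically what SER's footprint studio or a manual pass does for you; the script just narrows the candidates you have to eyeball.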
Bro, if you want to get the most out of this tool - buy lists! GSA SER is a really amazing tool, but without targets/sites to post to it is useless. Scraping is a pain in the ass without good proxies, bandwidth, etc. So from my experience it really needs good targets to see results. I bought a SER list and I'm pretty happy :)
If you're running SER on a home computer, I agree with this.