One other related question: when you are scraping for new URLs, if a URL/domain falls under more than one platform (i.e. indexer, blog comment, or trackback), will it register under all three, or just under the easiest or first one?
RSS and Web 2.0 are both fixed domains in the left-hand panel, so does that mean no other sites can be added to these platforms, and that searching should be turned off for them?
1. Not necessarily. Indexer URLs should be posted on pages that are already indexed, so Google can visit those pages and pick up the URL you posted.
2. If I recall correctly, there is an option where you can choose between them in GSA Platform Identifier.
3. You have to manually add sites there.
"1. Not necessarily. Indexer urls should be placed to places that are
already indexed so google can visit those places and pick up the url you
posted."
Do you mean the URLs you are linking to from the indexer sites should already be indexed? (Please clarify.)
"2. If I recall it correctly then there is an option where you can choose between them in GSA Platform identifier"
I haven't got GSA Platform Identifier, but I noticed that if you have Indexer selected, it finds loads of indexer sites with not a lot else, so I was curious.
The original question was based on what I thought was conflicting information; I don't know if I misunderstood what was said in the posts where I found it.
"How do you manually add RSS & Web 2.0 sites?"
You pay for the SEREngines Web 2.0 addition to add more Web 2.0 sites. (We are waiting for v3.0, so be patient.) You can also script your own engines: https://docu.gsa-online.de/search_engine_ranker/script_manual
Otherwise, the original Web 2.0 engines in SER are dead, I think. (I haven't tried them for a long time.)
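If it helps to visualize, a SER engine is just an .ini-style script that tells SER how to recognize a site and how to fill in its forms. The sketch below only illustrates that general shape; the key names are hypothetical placeholders, not verbatim SER syntax, and the real directives are documented in the script manual linked above:

```ini
; Hypothetical sketch of a SER engine script.
; The key names below are illustrative placeholders only -
; see the script manual for the real directive names.

[setup]
enabled=1
default checked=0

; How SER recognizes that a scraped URL belongs to this platform
url must have1=/recipes/
page must have1=Submit your recipe

; One step of the posting process: fill in and submit the site's form
[step1]
step type=POST
form name=recipe_form
form field title=%blog_title%
form field body=%article%
submit success=Thank you for your submission
```

The basic idea is always the same: footprints that identify the platform, then a sequence of steps (register, log in, submit, verify) describing the forms to post through.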
"How do you know which sites to add?"
For example, you have a recipe site that you want to promote on other sites where recipe submission is allowed. You go to those sites and script an engine for SER for those particular sites, then use SER to post to them. In other words, you find the sites you need and use SER as a tool to post there.
OK, thank you, I think you explained it very well. I thought you could just import them from the sites you have scraped, and GSA SER would recognize them and list them as Web 2.0 sites.
"In other words you find the sites you need and use ser as a tool to post there"
Does it work if you post to these sites after adding them to GSA SER with the engine code you wrote for them?
Yeah, the naming convention is a bit confusing, especially because there are no clear definitions of what Web 2.0 sites actually are. Generally, Web 2.0 sites are websites that exist solely for users to create pages on them, like wordpress.com or Tumblr. You can make a Web 2.0 site at any time by creating a WordPress, Joomla, or other CMS site and allowing users to register and create content on it. In that case, users could post to that site with SER, as SER already has engine scripts for posting to WordPress- and Joomla-based sites.
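SER itself posts through the site's normal registration and submission forms via its engine scripts, but just to make the idea concrete: once your CMS site allows registration, any registered user (or script) can create content on it. A minimal sketch using the standard WordPress REST API, with a placeholder site URL and application-password credentials (this is an illustration of the concept, not how SER does it):

```python
import requests

SITE = "https://your-web20-site.example"   # placeholder: your own CMS site
AUTH = ("demo_user", "app-password-here")  # placeholder application password

# Create a post as a registered user via the built-in WordPress REST API.
resp = requests.post(
    f"{SITE}/wp-json/wp/v2/posts",
    auth=AUTH,
    json={
        "title": "My Recipe",
        "content": "Post body containing the link you want picked up...",
        "status": "publish",
    },
    timeout=30,
)
resp.raise_for_status()
print("Created:", resp.json()["link"])
```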
When we scrape for targets, we scrape for platforms that SER already knows, so we can post there easily. Those sites don't exist solely to let users create whatever content they like; they are generally niche-specific sites that also allow user-generated content. Think of a recipe site where you can also submit recipes.
Useful read:
https://forum.gsa-online.de/discussion/219/tutorial-web-2-0-how-to-code-a-web-2-0-engine-noob-friendly/p1