0.31
- new: ability to specify different keywords for content checking than for scraping
- new: basic support for Italian language
- new: export template "Wordpress XML"
Well, see it as an addition... titles are also taken from other sources. In some cases the tool tries to use catchy sentences where the keyword appears as titles, or takes short questions containing the keyword.
It will scrape data and/or use previously scraped data (depending on settings), and then determine how many articles that data would generate if it were all used... That's the MAX number here.
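As a rough illustration of how such a MAX could be derived (the tool's actual algorithm is not documented here; the function name and the sentences-per-article figure are my own assumptions):

```python
def max_articles(scraped_sentences, sentences_per_article=25):
    """Estimate how many articles a pool of scraped sentences could
    yield if every sentence were used once (hypothetical sketch,
    not the tool's real logic)."""
    usable = [s for s in scraped_sentences if s.strip()]
    return len(usable) // sentences_per_article

# e.g. 4300 scraped sentences at an assumed 25 sentences per article
print(max_articles(["sentence"] * 4300))  # -> 172
```

The point is simply that MAX is a function of how much scraped data is available, so it changes as sources are added or re-scraped.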
0.35
- new: SpinDB Editor via Tools button (not fully working yet)
- new: improved reading flow when mixing sentences
- new: ability to use local files/folders as sources

0.34
- new: added Swedish as a supported language (no spin syntax yet... feel free to provide one)
- fix: some problems with Italian content scraping fixed (Unicode mismatch)
- new: improved some scrapers
- fix: authority link scraping with %domain% was broken
- fix: the number of articles was not changeable for some algorithms

0.33
- new: the "Create Multiple Projects from Keyword" function will accept Backlink-URLs as input in the format "http://www.xyz.com#kw1,kw2,..."
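The 0.33 input format ("http://www.xyz.com#kw1,kw2,...") is just a URL with the keywords packed into the fragment, so it can be pulled apart with a single split; a minimal sketch (function name is my own, not part of the tool):

```python
def parse_backlink_input(line):
    """Split 'http://www.xyz.com#kw1,kw2' into (url, [keywords])."""
    url, _, fragment = line.partition("#")
    keywords = [kw.strip() for kw in fragment.split(",") if kw.strip()]
    return url, keywords

url, kws = parse_backlink_input("http://www.xyz.com#seo,content tools")
print(url, kws)  # http://www.xyz.com ['seo', 'content tools']
```

A line with no "#" simply yields the URL and an empty keyword list.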
@fng: Edit Project -> Sources -> Custom Sources -> click ADD. Though I have to change the header from "URL" to "Location", as it can hold a folder/file as well now.
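For reference, treating a "Location" as either a single file or a folder of text files can be sketched like this (my own illustration of the idea, not the tool's code):

```python
import os

def collect_text_files(location):
    """Return the .txt files behind a source 'Location', which may be
    a single file or a folder (illustrative sketch)."""
    if os.path.isfile(location):
        return [location]
    if os.path.isdir(location):
        return [os.path.join(location, name)
                for name in sorted(os.listdir(location))
                if name.lower().endswith(".txt")]
    return []
```

Either way, the source ends up as a list of local text files whose sentences feed the generator, the same as scraped content would.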
Guys: do you get enough titles when you use MAX in the "Number of titles" box? I only get around four different ones for very broad keywords. Do you have any settings that work well for this? (For RX purposes.)
I know you can upload several articles to RX; that should serve as a workaround for now.
Comments
- new: improved API spinner usage (follows the rules on waiting time between queries)
For example when I set the output number to MAX will it use the article source in multiple articles?
I thought it would be in Sources -> Custom. Where is it located?
After selecting "content from file", nothing happens. Content from folder does work.

- "Use previously collected data" = checked
- Custom -> Add -> Content from folder
- Selected "save and scrape" (save alone does not do anything)
- I get the above error

The folder has one txt file with 4300+ sentences. Am I doing something wrong?