@fng please redownload...the demo-expired issue should be fixed now until the 15th of March at least.
fng
was able to download now
Sven www.GSA-Online.de
0.31
- new: ability to specify different keywords for content checking than for scraping
- new: basic support for Italian language
- new: export template "Wordpress XML"
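The changelog names a "Wordpress XML" export template without showing it. The exact template GSA uses isn't given here; as a point of reference, WordPress's own WXR import format generally has this shape (a minimal sketch based on the public WXR 1.2 schema; treat the specifics as assumptions, not the tool's actual output):

```xml
<rss version="2.0"
     xmlns:content="http://purl.org/rss/1.0/modules/content/"
     xmlns:wp="http://wordpress.org/export/1.2/">
  <channel>
    <title>Generated Articles</title>
    <item>
      <title>Example Article Title</title>
      <content:encoded><![CDATA[<p>Article body goes here.</p>]]></content:encoded>
      <wp:post_type>post</wp:post_type>
      <wp:status>publish</wp:status>
    </item>
  </channel>
</rss>
```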
Sven www.GSA-Online.de
0.32
- new: internal spinner for Italian language
- new: improved API spinner usage (follows the rules for waiting time between queries)
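The 0.32 entry mentions honouring a required waiting time between spinner API queries. A minimal sketch of that kind of throttling (the class name and structure are illustrative, not the tool's actual code):

```python
import time

class RateLimitedClient:
    """Enforce a minimum delay between successive API queries."""

    def __init__(self, min_interval_seconds):
        self.min_interval = min_interval_seconds
        self.last_call = 0.0

    def call(self, func, *args, **kwargs):
        # Sleep just long enough to honour the spacing rule,
        # then record when this query finished.
        elapsed = time.monotonic() - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        try:
            return func(*args, **kwargs)
        finally:
            self.last_call = time.monotonic()
```

Any spinner API wrapper could then be routed through `call()`, so back-to-back requests are automatically spaced out.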
fng
can't change # of articles for "Mix paragraphs (from all articles)"
Sven www.GSA-Online.de
fixed in next update.
echanney usa
How about an auto-posting module in a future update, where content can be posted to a site on a schedule?
Sven www.GSA-Online.de
it's on my list
Kaine thebestindexer.com
edited March 2017
Import a list of URLs to scrape.
Maybe also add a condition (for locating the article position).
Sven www.GSA-Online.de
you can do that in project options->sources->custom
fng
How about adding options under sources -> "my own text file" and "my own text file folder", and then using previously collected data?
Sven www.GSA-Online.de
You mean it should extract directly from a folder or a given text file? Sounds OK...will add that.
fng
@sven - maybe it's not a good idea after all to extract data from text files, because how would you create titles?
Sven www.GSA-Online.de
Well, see it as an addition...titles would be taken from other sources. In some cases the tool also tries to use catchy sentences where the keyword appears as titles, or takes short questions containing the keyword.
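Sven's description of title selection (prefer short questions containing the keyword, otherwise a short sentence where it appears) could be sketched like this; a rough illustration of the stated heuristic, not the tool's actual implementation:

```python
def pick_title(sentences, keyword, max_words=12):
    """Pick a title from candidate sentences.

    Preference order (per the behaviour described above):
    1. a short question containing the keyword,
    2. any short sentence containing the keyword.
    Returns None when nothing qualifies.
    """
    kw = keyword.lower()
    candidates = [s.strip() for s in sentences
                  if kw in s.lower() and len(s.split()) <= max_words]
    questions = [s for s in candidates if s.endswith("?")]
    if questions:
        return questions[0]
    if candidates:
        return candidates[0]
    return None
```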
710fla ★ #1 GSA SER VERIFIED LIST serpgrow.com
Does it use each article source only once when using the option to mix paragraphs?
For example when I set the output number to MAX will it use the article source in multiple articles?
Sven www.GSA-Online.de
It will scrape data and/or use previously scraped data (depending on settings), and then determine how many articles that data would generate if it were all used....That's the MAX number here.
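The MAX figure Sven describes can be sketched as a simple upper bound. This assumes each scraped paragraph is used at most once, which is an assumption on my part; the tool's exact rule may differ:

```python
def max_articles(total_paragraphs, paragraphs_per_article):
    """Upper bound on the number of articles the scraped data can
    generate when each paragraph is used at most once."""
    if paragraphs_per_article <= 0:
        raise ValueError("paragraphs_per_article must be positive")
    return total_paragraphs // paragraphs_per_article
```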
Sven www.GSA-Online.de
0.35
- new: SpinDB Editor via Tools button (not fully working though)
- new: improved read flow when mixing sentences
- new: ability to use local files/folders as sources

0.34
- new: added Swedish as a supported language (no spin syntax...feel free to provide it)
- fix: some problems with Italian content scraping fixed (unicode mismatch)
- new: improved some scrapers
- fix: authority link scraping with %domain% was broken
- fix: number of articles was not changeable for some algorithms

0.33
- new: the function "Create Multiple Projects from Keyword" will accept Backlink-URLs as input in the format "http://www.xyz.com#kw1,kw2,..."
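The 0.33 entry documents the input format "http://www.xyz.com#kw1,kw2,..." for Backlink-URLs. A small sketch of how such an entry splits into a URL and its keyword list (an illustration of the documented format, not the tool's own parser):

```python
from urllib.parse import urlsplit

def parse_backlink_entry(entry):
    """Split 'http://site.com#kw1,kw2' into (url, [keywords])."""
    fragment = urlsplit(entry).fragment
    keywords = [k.strip() for k in fragment.split(",") if k.strip()]
    url = entry.split("#", 1)[0]
    return url, keywords
```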
fng
Can't find new feature in 0.35 -> Use local files/folders as sources.
Thought it would be in sources->custom
Where is it located?
Sven www.GSA-Online.de
edited March 2017
@fng edit project->sources->custom sources->click ADD. Though I have to change the header from "URL" to "Location", as it can hold a folder/file as well now.
fng
custom->add->content from file does not work
After selecting "content from file" nothing happens.
Content from folder does work
Sven www.GSA-Online.de
sorry, will be fixed...for now just add a folder and later edit it to a file.
fng
Get error "sorry, not enough content"
Used previously collected data = checked
custom->add->content from folder
selected save and scrape (save alone does not do anything)
get above error
Folder has one txt file that has 4300+ sentences
Am I doing something wrong?
Sven www.GSA-Online.de
Can you give a bit more detail on this? Full log or a screenshot of the log.
fng
sent email - subj=not enough content error
Sven www.GSA-Online.de
thanks, was able to fix it in the next version
antonearn Earth
edited March 2017
Guys: Do you get enough titles when you use MAX in the "Number of titles" box? I only get ~4 different ones for very broad keywords. Do you have any settings that work well for this? (For RX purposes)
I know you can upload several articles to RX, and that should be a workaround solution as for now.
Sven www.GSA-Online.de
@antonearn how many articles are generated? Only 4 maybe? Because that's the number it will add here.
antonearn Earth
Uploading several articles works fine as well. Question closed.