My Review: First, I would like to thank ROYALMICE for his patience; he answered all my questions patiently. Very helpful guy.
About the data pack: it saved me a hell of a lot of time. I am new to GSA and don't have time to read up on how it works and how to configure it. By spending just a few bucks I got everything ready within a day of ordering, and it took exactly 2 minutes to set it up and run it. As I am not familiar with GSA, I asked my friend to look at my VPS, and he was very impressed with the service. Royalmice did a very nice job and used all the features required for a successful campaign. I rate his service 10/10.
If you are new to GSA and haven't got enough time to read and practice, then this is the service for you.
I received my two 7-tier data packs. Everything looks good, although I made some small tweaks for my captcha service and added a 15-minute wait for email, plus the use of proxies. As with previous data packs I have ordered, everything seems to be set up correctly. I left the URL limits in place and let them run overnight. So far so good - thanks royalmice
I installed and started my first Tier 7 project this afternoon and everything looks good. The question I have is: if I uncheck the "pause the project after 50/100 submissions in a day" option for each tier and let Indexification drip-feed links to the indexer over a 7-day period, will that have any bad effect?
I also looked at the Tier 1 verified link URLs list, and many of them have a "?" next to the URL. Aren't those supposed to show a high PR value instead of "?", or am I missing something?
SeenuSamx - Thank you very much for taking the time to leave feedback on your experience with the 7 Tier data packs. I am glad your experience was positive and that it managed to save you time.
Seenu, thanks for the 10/10 vote of confidence, I appreciate it.
Glennf - "Download failed" normally means either a connection problem, that the destination doesn't exist anymore, or a timeout. I can't say for sure what your specific problem is, but I suggest you start by checking your proxies and timeout settings. Please also search the forum to see if there is already a thread with the same errors.
I will be updating the database templates every Sunday to allow for changes in platforms, like new platforms being added or removed; that will ensure the data packs always have the most up-to-date engines.
I have many newbies ordering data packs who come back to ask how to set some basic GSA SER settings like proxies, timeouts, and other advanced settings under Options. Some people also don't know how to set the most basic GSA Captcha Breaker and GSA SEO Indexer settings. These settings cannot be set by the data packs; each user has to set them on their own side. To help you, I will make a short video to show you what my settings are, so you at least have an idea of what the proxy, captcha breaker, indexer, and timeout settings should look like. A link to the video will be posted with each data pack order.
I hope to do this over the weekend, or sooner if time allows.
I've ordered several times before; amazing turnaround times, thanks.
Just a quick question: now that GSA has scheduled article posting, will you be able to set up projects that create Web 2.0 blogs which first make a post with no link, then make a second post xx days later?
Thank you very much for the feedback, glad you are happy with the turnaround times.
To be honest, I have been so busy setting up the new 7 tier data pack option, and thereafter fulfilling orders, that I did not even know about the new schedule option that was introduced. I just took a look at it, and it seems like a great idea and something I will definitely add to the data packs, after I have studied what the best default settings would be. If there is a thread here in the forum where this is being discussed, please drop me the link.
I normally update all my data pack templates on Sundays to include new engines that were launched, so this scheduling could be added in this Sunday's updates if I have the info on what the best options are.
Glennf, no, it won't make sense at all, because for the top tier, which points to the money site, we are looking for quality links, not high-volume, low-quality links.
But if that is what you want, just change it. Like I said before, you are free to change any settings to whatever you want.
Also, you have "Skip also unknown PR" set. I looked at my list of Tier 1 URLs, and many of them are marked "?", which I think means unknown PR. Do you think those ones marked "?" are higher PR?
I have not skipped anything. All the settings are there because they are what I feel is best. If you want to change any of the settings, you can do so without asking about it. The data packs are a solid base from which you can further customize to your liking.
I ordered one GSA SER data pack with 7 tiers, and I am very pleased with the service, because it was delivered fast and the work looks professional to me. I will clearly order more.
zee007 -- That is not a problem to do at all, but it would be so expensive that no one would want to buy it.
To produce a data pack I need 90 paragraphs of content, and then that many titles as well. In my tiered data packs each tier has 20 different article variations, and each of the articles is in spin syntax format, so if there are 7 tiers, you are looking at 7 x 20 = 140 articles to be written.
So yes, I agree it would be awesome to have all original content, but so would the cost be.
I am looking at offering an add-on where I use original UAW-style articles for the top tiers of my data packs, but I am still testing that option. To get an idea of what kind of articles I am talking about, have a look at the premium article write and spin service on my site: http://asiavirtualsolutions.com/product/article-write-and-spin/
brainiac - Thank you very much for sharing your GSA SER stats and your experience with the tiered data packs. I am glad it is working out positively for you; keep up the good work. You should really try out the 7 tier data packs, as those now include delayed article posting - I will make a post about it later to explain the delayed posting. Once again, thanks for the support and for taking the time to leave your feedback.
Glennf - No, I do not have a specific date, but I can assure you it won't be this week or next; however, I will make an announcement when it is ready. I am not going to start a back-and-forth speculation about what Penguin is and what it is not, but I do have 3 words for you to go and research to enlighten you a bit. Those 3 words are: Partial Match Keywords.
passmaker - Thank you very much for the endorsement and the nice words about the GSA SER data packs. Really glad you are happy with the service and the product.
tsaimllc - If you read the thread from the beginning, you will find several comments on ranking increases. Myself, I focus on ranking for long-tail keywords and doing deep linking of my sites -- this is definitely working for me.
I am running GSA SER 24/7 for my own projects, mostly travel niches. Whenever I run a GSA campaign for a domain of mine, I see an increase in bookings on the site I am promoting within hours of starting GSA SER.
You should keep adjusting your GSA campaigns and adapt them to use the latest features in GSA.
Many people have been asking for it, so I am pleased to announce ---> the option to use highly spun original articles for the top tiers of the GSA SER data packs will be available from next weekend, the 10th of November 2013, onwards. To get an idea of the highly spun articles I will be using, you can see them on my site here: http://asiavirtualsolutions.com/product/article-write-and-spin/
This is EXCELLENT, royalmice. I've been watching this thread for this. May I ask how much extra the Tier 7 data packs with highly spun top-tier content will be?
Do you have a sample spun article variation generated by the pack that we can read, to get us further excited about what we will see on Nov. 10th?
Also, how long can a Tier 7 data pack run once started (a week, a month, 2 months)?
Glennf - I have not set the price yet, but if you follow the link I gave to the premium article write and spin service, you can get an idea of what the added charge will be.
The length of time a GSA project can run is not really determined by the content; instead, it is determined by the targets it has, or can find, to post to. So even if you have 10GB of spun content, if there are no targets to post to, then the project cannot run.
So your next question would be how to make sure you don't run out of targets to post to:
You start by expanding your global site list: import target links and get them sorted into platforms. You can either scrape random sources to import using the built-in scraper, or use external scrapers like GScraper or Scrapebox. If you are too lazy to scrape or find a list, then I do have a 2.6 million link list which you can import into your GSA global site list - details on the list are available here: http://asiavirtualsolutions.com/product/backlink-lists/ (if you have questions about the list, then contact me via my site).
To import into the global site list, go here: OPTIONS \ ADVANCED \ TOOLS \ IMPORT URLS (second one from the top).
Even better than the global site list is to scrape targets specific to your projects and then import these directly into your project. These are niche-specific target URLs, and you will use specific footprints with keywords related to your project. To import these, right-click on any of your projects and select "Import Target URLs".
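The "footprints with keywords" step above just means pairing each platform footprint with each niche keyword to produce one search query per pair, which you then feed to your scraper. A minimal sketch of that pairing (the footprints and keywords here are made-up examples, not from any real pack):

```python
# Hypothetical example: build scraper search queries by pairing platform
# footprints with niche keywords. These sample values are illustrative only.
footprints = [
    '"powered by wordpress"',
    'inurl:guestbook',
]
keywords = ["phuket hotels", "bali villas"]

# One query per footprint/keyword combination.
queries = [f"{fp} {kw}" for fp in footprints for kw in keywords]

for q in queries:
    print(q)
```

With 2 footprints and 2 keywords this yields 4 queries; a real footprint list crossed with a decent keyword list quickly produces thousands of niche-specific searches.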
Thanks royalmice, that explains a lot. Another question: when I buy one of your Tier 7 packs with unique content in the upper tiers, can I use it for building links to 2 different URLs in the same EXACT niche?
Awesome service. Michael really understands your niche and creates a great GSA project. I got my project running in 5 minutes! You cannot go wrong - highly recommended!
I've been using these data packs, and can say they have taught me a lot about how the projects should be set up. However, as much as I like to learn, I prefer letting the professionals help me out every now and then. I have two more data packs waiting to be completed and will continue to use the service.
PS: This whole forum is awesome for learning stuff about GSA.
If you have any questions with regard to the data packs, please feel free to use the live help on my site: http://asiavirtualsolutions.com/product/gsa-search-engine-ranker-data-pack/
Thanks and have a great day
deancow - Thanks for letting me know.
I look forward to being of service again.
Have a great day
That would really be great. I too am looking for that, since the last Penguin update demands that content be unique.
I'm curious - I haven't heard anyone's actual RESULTS from using these yet. How are people's RANKINGS doing after using these?
I will also be embedding co-citation linking directly into the data packs after next week. You might be asking what the heck co-citation is, LOL ===> there is a good post about it over here: http://www.searchenginejournal.com/co-citation-and-co-occurrence-the-next-big-thing-in-seo/
Many new features are coming to the GSA SER Data Packs in NOVEMBER, so keep your eyes on the thread, or make sure to visit my website, Asia Virtual Solutions, at http://asiavirtualsolutions.com/product/gsa-search-engine-ranker-data-pack/
If you want to save money and scrape yourself -- then here is an extensive list of footprints you can use: https://www.dropbox.com/s/arwabbo375tmdws/Footprints - all combined.txt
So, to answer your question: the more targets you have (global site list + target URLs), the longer you can run the project.