A few questions
1st: Is there a global blacklist available of big websites like Facebook, Instagram, YouTube, etc., which many people use as the "website" on their social profiles but which have no contact form?
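For concreteness, such a blacklist boils down to matching each URL's host against a set of known domains. A minimal sketch in Python (the domain list and helper are hypothetical, not something the program is confirmed to ship with):

from urllib.parse import urlparse

# Hypothetical blacklist entries; the program is not confirmed to ship such a list.
BLACKLIST = {"facebook.com", "instagram.com", "youtube.com"}

def is_blacklisted(url):
    # True if the URL's host is a blacklisted domain or one of its subdomains.
    host = (urlparse(url).hostname or "").lower()
    return any(host == d or host.endswith("." + d) for d in BLACKLIST)

print(is_blacklisted("https://www.facebook.com/somepage"))  # True
print(is_blacklisted("http://website.com"))                 # False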
2nd: When importing URLs, is there a feature to supply additional information to the text message via variables?
Lots of external scrapers are able to grab additional info along with the URL.
For example, I mean it like this: the import file is a CSV, in this case with the additional fields "first_name", "sales_amount", and "address".
The additional fields should be user-definable in the CSV, not hardcoded.
url(mandatory),first_name,sales_amount,address
http://website.com,John Doe,200,Fakestreet 23 Birmingham
http://website2.com,Andrew Homes,432,Fakestreet 32 London
Then it should be possible to write the text message like this:
Hello %first_name%, it came to my attention that your branch in %address% had %sales_amount% sales this week. Let me show you how to improve this number tenfold...
Those messages would be far less spammy than generalized ones.
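For illustration, a minimal Python sketch of the substitution being requested (the file name "targets.csv" is an assumption, and this is not the program's actual implementation):

import csv

# "targets.csv" is an assumed file name; layout as in the example above.
TEMPLATE = ("Hello %first_name%, it came to my attention that your branch in "
            "%address% had %sales_amount% sales this week.")

with open("targets.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):          # CSV header names become macro names
        message = TEMPLATE
        for field, value in row.items():
            message = message.replace("%" + field + "%", value)
        print(row["url"], "->", message)

Any column present in the header would then be usable as a %column% macro, without hardcoding field names.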
3rd: When I import 10 million websites, does that work well with the program, or should I split it up into several smaller projects?
Comments
2. Right now there is no such thing, but I can see a use for it, so I will add it.
3. Personally I would split it up, though 10 million should work... just try the demo version.
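If splitting is the way to go, here is a rough Python sketch of chunking a huge import file (the output naming and the 500,000-row chunk size are assumptions, not program behavior):

import csv

def split_csv(path, rows_per_chunk=500_000):
    # Writes path.part1.csv, path.part2.csv, ... each with the original header.
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        header = next(reader)
        part, chunk = 1, []
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                write_chunk(path, part, header, chunk)
                part, chunk = part + 1, []
        if chunk:
            write_chunk(path, part, header, chunk)

def write_chunk(path, part, header, rows):
    with open("%s.part%d.csv" % (path, part), "w", newline="", encoding="utf-8") as out:
        w = csv.writer(out)
        w.writerow(header)
        w.writerows(rows)

Each part can then be imported as its own smaller project.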
Simply import a csv file and use the header value as macro. Sample:
csv content:
url,first_name,sales_amount,address
http://website.com,John Doe,200,Fakestreet 23 Birmingham
message:
Hello %first_name%, your branch in %address% had %sales_amount% sales this week.
in "submission content-->add-->import" and select "all files" for file type and upload/import the .csv that way?
"Lots of external scrapers are able to grab additional info along with the url."
Mind sharing what are some of the better external scrapers that do this?
Thanks.