1st: is there a global blacklist available of big websites like Facebook, Instagram, YouTube, etc. that many people use as the "website" on their social profiles but that have no contact form on them?
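In case no such list ships with the tool, here is a minimal sketch of the kind of filtering I have in mind; the domain set and the helper name are my own assumptions, not anything from the program:

```python
# Sketch: skip URLs whose domain is on a "big sites with no contact form" list.
# The domains below are only examples; a real blacklist would be much longer.
from urllib.parse import urlparse

BLACKLIST = {
    "facebook.com", "instagram.com", "youtube.com",
    "twitter.com", "linkedin.com", "tiktok.com",
}

def is_blacklisted(url: str) -> bool:
    host = (urlparse(url).hostname or "").lower().removeprefix("www.")
    # match the domain itself and any of its subdomains
    return any(host == d or host.endswith("." + d) for d in BLACKLIST)

# is_blacklisted("https://www.facebook.com/somepage")  -> True
# is_blacklisted("http://website2.com")                -> False
```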
2nd: when importing URLs, is there a feature to supply additional information to the text message via variables?
Lots of external scrapers can grab additional info along with the URL.
Here is an example of what I mean.
The import file in this case is a CSV with the additional fields first_name, sales, and address.
These additional fields should be user-definable in the CSV, not hardcoded.
…Doe,200,Fakestreet 23 Burmingham
http://website2.com,Andrew Homes,432,Fakestreet 32 London
Then it should be possible to write the text message like:
Hello %first_name%, it came to my attention that your branch in %address% had %sales% sales this week. Let me show you how to improve this number tenfold...
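To make the request concrete, here is a rough sketch (my own code, not the program's) of how such placeholder substitution could work, assuming column names like those in the CSV example:

```python
# Sketch: fill %variable% placeholders in a message template from CSV rows.
# Column names come from the CSV header, so they stay user-definable.
import csv, io, re

template = ("Hello %first_name%, it came to my attention that your branch "
            "in %address% had %sales% sales this week.")

csv_data = """url,first_name,sales,address
http://website2.com,Andrew Homes,432,Fakestreet 32 London
"""

def render(template: str, row: dict) -> str:
    # replace every %name% with the matching CSV column; leave unknowns intact
    return re.sub(r"%(\w+)%", lambda m: row.get(m.group(1), m.group(0)), template)

for row in csv.DictReader(io.StringIO(csv_data)):
    print(render(template, row))
    # -> Hello Andrew Homes, it came to my attention that your branch
    #    in Fakestreet 32 London had 432 sales this week.
```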
Messages like these would come across as far less spammy than generic ones.
3rd: if I plan to import 10 million websites, will the program handle that well, or should I split it up into several smaller projects?