To guard against link rot of news websites when they're cited as references for statements in Wikidata: is there any bot that systematically watches for new URLs added with the reference URL property (P854), submits them to the Internet Archive, and then adds the result back into the original statement as archive URL (P1065) plus archive date (P2960)?
If not, would that be feasible, and a good idea? -Liam
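In case it helps the feasibility discussion, here is a minimal sketch of the core of such a bot, assuming the Wayback Machine's "Save Page Now" endpoint (https://web.archive.org/save/) and its standard snapshot URL layout (web.archive.org/web/&lt;timestamp&gt;/&lt;url&gt;). The function and variable names are my own invention, and the actual write-back to Wikidata (e.g. via the Wikibase API) is left out:

```python
# Sketch: capture a reference URL in the Wayback Machine and derive
# the values for archive URL (P1065) and archive date (P2960).
# Assumes the Save Page Now endpoint and snapshot URL layout; the
# helper names are hypothetical, not an existing bot's API.
from datetime import datetime
import urllib.request

WAYBACK_SAVE = "https://web.archive.org/save/"

def snapshot_url(original_url: str, timestamp: str) -> str:
    """Build the archive URL (P1065 value) from a Wayback
    timestamp in YYYYMMDDhhmmss form."""
    return f"https://web.archive.org/web/{timestamp}/{original_url}"

def archive_date(timestamp: str) -> str:
    """Convert a Wayback timestamp to the ISO date used for P2960."""
    return datetime.strptime(timestamp, "%Y%m%d%H%M%S").date().isoformat()

def save_to_wayback(url: str) -> tuple[str, str]:
    """Ask the Wayback Machine to capture `url`; the final redirect
    target embeds the snapshot timestamp. (Network call, untested.)"""
    with urllib.request.urlopen(WAYBACK_SAVE + url) as resp:
        final = resp.geturl()  # e.g. .../web/20240101120000/http://...
    ts = final.split("/web/")[1].split("/")[0]
    return snapshot_url(url, ts), archive_date(ts)

if __name__ == "__main__":
    # Pure helpers demonstrated without hitting the network:
    print(snapshot_url("http://example.com/news", "20240101120000"))
    print(archive_date("20240101120000"))
```

The remaining work would be finding statements that have a P854 reference but no archive qualifiers (a SPARQL query could do that) and writing the two values back with a bot account.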
_______________________________________________
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata