Re: [Sks-devel] "SKS is effectively running as end-of-life software at this point"?

2019-02-07 Thread robots.txt fan
On Thursday, February 7, 2019 12:37 AM, Andrew Gallagher wrote: > Because you can reject a key, but then what happens is it just keeps trying > to come back. Pretty soon there are so many rejected keys floating around > that the network stops reconciling. Also, what happens if I reject certain

Re: [Sks-devel] "SKS is effectively running as end-of-life software at this point"?

2019-02-06 Thread robots.txt fan
On Wednesday, February 6, 2019 8:28 PM, Robert J. Hansen wrote: > It's a lack of community consensus on what a redesign should look like. That can be changed. I do not know anything about the source code of this project, so forgive my naivety. Is it possible to develop a keyserver that uses th

Re: [Sks-devel] Implications of GDPR

2018-04-29 Thread robots.txt fan
Moritz Wirth wrote: > Given the fact that it is not possible to delete data from a keyserver Of course this is possible. You can delete a key by using the "sks drop " command. Now, if I understand it correctly, the key will immediately be re-added because of gossiping keyservers. However, it would
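For context, a removal along these lines might look like the sketch below. Only the "sks drop" command itself comes from the message above; the init script path, the need to stop the daemon before touching the database, and the key ID shown are assumptions and placeholders.

  # stop the SKS daemon first (assumed to be necessary before modifying the database)
  /etc/init.d/sks stop
  # drop the key from the local database; the key ID here is a placeholder
  sks drop 0xDEADBEEF12345678
  # restart so the server resumes serving and gossiping
  /etc/init.d/sks start

As the message notes, a dropped key will normally come straight back through reconciliation with peers unless it is also removed on the gossiping keyservers.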

Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-25 Thread robots.txt fan
It is not about spam, but about being found. Thank you very much for adding the file! Only 5 or 6 of the servers I found are left. >> Whilst I don't believe it will make any difference whatsoever to your >> spam levels, it may reduce some load on my keyservers from genuine >> indexing so I've add
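A quick way to see whether a given server already publishes the file is to fetch it directly; the hostname below is a placeholder, and whether robots.txt is served on the HKP port (11371) or on a separate web front end depends on the individual server's setup.

  curl -s http://keyserver.example.org:11371/robots.txt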

Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-23 Thread robots.txt fan
But now it is working, thank you very much! > Thank you for heads up, given that robots.txt wasn't previously tracked > but created directly on server there ended up a conflict on update for > the file...

Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-23 Thread robots.txt fan
metalgamer: Thank you very much! ToBeFree: It would sure serve the absurdity indeed. Please don't do it. Kristian: Thank you very much for adding the file to the repository! As I explained, the concern is not bad actors here, but actors that do respect the standard (e.g. Google). May

Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-22 Thread robots.txt fan
Hello again, m, thank you very much for installing the file on your server! Paul, thank you for your kind words. Robert, this is not a lost cause, but instead a fixable problem. Condolences are not required, but a solution is. This solution can only come from admins like m. I have now come up

Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-20 Thread robots.txt fan
admin to "block" crawlers from these pages, because this fails as long as at least one admin doesn't. Have a nice day anyway. On Tue, Jun 20, 2017, 10:36 robots.txt fan wrote: Dear Sirs and Madams, I would like to thank all of you for doing this. You are a necessary pillar of PGP and it

[Sks-devel] Request: Install an efficient robots.txt file

2017-06-20 Thread robots.txt fan
Dear Sirs and Madams, I would like to thank all of you for doing this. You are a necessary pillar of PGP and it is awesome that you are there to provide the infrastructure to host everyone's keys. Without attempting to diminish the previous sentence, I have a request to make to some of you. Mo
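The kind of file being requested is presumably along these lines: a minimal sketch, assuming the aim is simply to keep standard-compliant crawlers such as Google away from the key lookup and index pages that SKS serves under /pks/.

  User-agent: *
  Disallow: /pks/

This only affects crawlers that honour the robots exclusion standard; it does nothing against bad actors, which matches the concern stated elsewhere in the thread.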