Re: [Sks-devel] "SKS is effectively running as end-of-life software at this point"?

2019-02-07 Thread robots.txt fan
On Thursday, February 7, 2019 12:37 AM, Andrew Gallagher wrote:
> Because you can reject a key, but then what happens is it just keeps trying 
> to come back. Pretty soon there are so many rejected keys floating around 
> that the network stops reconciling. Also, what happens if I reject certain 
> keys and you don’t, but your only connection to the rest of the network is 
> through me? Once nodes start implementing different policies you can go 
> split-brain surprisingly easily.

I shouldn't have written "reject". If you already have this key in your 
blacklist, just tell the other keyserver that you already have it, but do not 
store it. Store only the hash.

Of course it might still be possible to encode information into the hashes, as
Tobias wrote, but generating exactly the right hash is extremely expensive (if
not impossible) from the attacker's perspective, so I do not think it is
feasible for them at all. Storing the hashes of kryptonite keys should be okay.
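
To make the idea concrete, here is a rough sketch (illustrative Python; the
data structures and names are mine, not SKS internals) of storing only the
hashes of blacklisted keys so that reconciliation treats them as already
present:

# Illustrative sketch only: keep the hashes of blacklisted keys so that
# reconciliation believes we already hold them, without storing the key data.

blacklist_hashes = set()   # hashes of keys we refuse to store
stored_hashes = set()      # hashes of keys we actually store
key_store = {}             # hash -> key material

def hashes_for_recon():
    # Recon only compares sets of hashes, so advertising the blacklisted
    # hashes means peers never try to send us those keys again.
    return stored_hashes | blacklist_hashes

def receive_key(key_hash, key_material):
    if key_hash in blacklist_hashes:
        return False   # pretend we already have it; drop the actual data
    stored_hashes.add(key_hash)
    key_store[key_hash] = key_material
    return True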

> It’s not a simple matter of just coding it up.

Of course not, and I wouldn't dare claim that. I agree with Martin: I too am
glad to see that there is a will to invest time in developing a new server. The
Synchronising Key Servers should not vanish from the earth.



Re: [Sks-devel] "SKS is effectively running as end-of-life software at this point"?

2019-02-06 Thread robots.txt fan
On Wednesday, February 6, 2019 8:28 PM, Robert J. Hansen wrote:
> It's a lack of community consensus on what a redesign should look like.

That can be changed. I do not know anything about the source code of this 
project, so forgive my naivety.

Is it possible to develop a keyserver that uses the same interface as the
current one? Meaning that GnuPG clients don't need to change and current
keyservers can recon with the new keyservers (since they will not all be
upgraded simultaneously)?

Of course, it should do that while also being able to refuse large keys. A
first step might be limiting UIDs to a certain size, but then one could just
generate lots of UIDs; and if you also limit the number of UIDs per key, they
could generate lots of keys.
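
To sketch what such a first-step filter could look like (illustrative Python;
the limits are invented example values, not a proposal, and nothing here is
actual SKS code):

# Illustrative submission filter; every limit is a made-up example value.

MAX_KEY_BYTES = 64 * 1024   # cap on the serialized key material
MAX_UIDS = 20               # cap on the number of user IDs per key
MAX_UID_BYTES = 1024        # cap on the length of a single user ID

def accept_key(key_bytes, user_ids):
    # Return True only if the submitted key stays within all of the limits.
    if len(key_bytes) > MAX_KEY_BYTES:
        return False
    if len(user_ids) > MAX_UIDS:
        return False
    if any(len(uid.encode("utf-8")) > MAX_UID_BYTES for uid in user_ids):
        return False
    return True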

Furthermore, it would be nice if content could be deleted. I'm thinking about
GDPR requests from people who do not want their data online anymore, or about
illegal content encoded into UIDs or User Attribute Packets. Perhaps this could
be implemented through a blacklist of fingerprints that synchronises.

Are there any more problems that need to be fixed? Seriously, everyone, please
write down the problems you have with SKS.

To answer my first question: I guess it is possible to implement a keyserver
with the same interface for GPG users that can still recon with older servers.
The older servers might try to send it keys that are on the blacklist or are
too large, but the new server can of course reject those keys.



Re: [Sks-devel] Implications of GDPR

2018-04-29 Thread robots.txt fan
Moritz Wirth wrote:
> Given the fact that it is not possible to delete data from a keyserver

Of course this is possible. You can delete a key by using the "sks drop"
command. Now, if I understand it correctly, the key will immediately be
re-added because of gossiping keyservers. However, it would not be impossible
to extend SKS to have a keyserver reject keys from a blacklist that each server
admin would maintain, or possibly gossip. (If this does not exist already.)

I imagine this would be a useful instrument for more use cases than this one.
A server admin based in Germany, for instance, would get in trouble if someone
submitted a key with the user ID "The owner of this server denies the
Holocaust", a statement that is illegal in Germany. The server admin could get
out of trouble by adding the hash of that key to the blacklist.

I know I am suggesting censorship but it's not like SKS was ever meant to be a 
secure or reliable channel.



Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-25 Thread robots.txt fan
It is not about spam, but about being found. Thank you very much for adding the 
file!
Only 5 or 6 of the servers I found are left.

>> Whilst I don't believe it will make any difference whatsoever to your
>> spam levels, it may reduce some load on my keyservers from genuine
>> indexing so I've added a robots.txt file at the root (covering both port
>> 11371 and 80).


Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-23 Thread robots.txt fan
But now it is working, thank you very much!

> Thank you for the heads up; given that robots.txt wasn't previously tracked
> but created directly on the server, there ended up being a conflict on update
> for the file...


Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-23 Thread robots.txt fan
metalgamer: Thank you very much!

ToBeFree: It would sure serve the absurdity indeed. Please don't do it.

Kristian: Thank you very much for adding the file to the repository! As I
explained, the concern here is not bad actors, but actors that do respect the
standard (e.g. Google). May I ask how the git repository and the live site are
related? While I see the robots.txt file in the git repository, it is not
served at https://sks-keyservers.net/robots.txt.

Best regards
RTF


Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-22 Thread robots.txt fan
Hello again,

m, thank you very much for installing the file to your server!

Paul, thank you for your kind words.

Robert, this is not a lost cause but a fixable problem. Condolences are not
required; a solution is. This solution can only come from admins like m.

I have now come up with a larger list of servers that either do not yet have an
efficient robots.txt file or have none at all. Kristian, you have responded to
this thread; I believe you manage the first server on the list. Is there a
reason why only /status is blocked and not /pks?

https://sks-keyservers.net (blocks /status, but not /pks)
https://keyserver.mattrude.com (blocks /pks, but not /search)
http://pgp.net.nz (works fine on port 11371, but not on port 80)
http://keyserver.nausch.org:11371 (completely missing)
http://pgp.circl.lu (completely missing)
http://keyserver.cns.vt.edu (completely missing)
https://gpg.mozilla.org (completely missing)
https://keyserver.metalgamer.eu (completely missing)
https://keys.fedoraproject.org (completely missing)
http://pgpkeys.eu:11371 (completely missing)
http://keyserver.rayservers.com:11371 (seems to be down now, was up a few days 
ago)

Best regards
RTF


Re: [Sks-devel] Request: Install an efficient robots.txt file

2017-06-20 Thread robots.txt fan
Hi,

How can you assume that it was me who uploaded a key with my name on it?

Please, I am trying to be optimistic here. This is a problem where I rely on
the server admins, yes. Unfixable for me, but easily fixable by the respective
admin for his or her server. Is it unreasonable to assume that the admins are
benevolent? I do not think so. Otherwise, we may be talking about PBP, not PGP.

RTF

Hi,

If you don't want your name to appear on Google, don't upload it to a service 
that permanently spreads it to hundreds of public websites. Especially don't 
rely on every server admin to "block" crawlers from these pages, because this 
fails as long as at least one admin doesn't.

Have a nice day anyway.



[Sks-devel] Request: Install an efficient robots.txt file

2017-06-20 Thread robots.txt fan
Dear Sirs and Madams,

I would like to thank all of you for doing this. You are a necessary pillar of
PGP, and it is awesome that you are there to provide the infrastructure to host
everyone's keys.

Without attempting to diminish the previous sentence, I have a request to make 
to some of you.

Most of the SKS servers serve an efficient robots.txt that prevents everyone's
un-deletable name and email address from showing up on search engines. However,
there are some exceptions. I like to keep a low profile, but when searching for
my name, for example on Google, a significant number of results come from SKS
pages, or to be more specific, from these:

keyserver.nausch.org
pgp.net.nz
pgp.circl.lu
keyserver.rayservers.com
sks-keyservers.net
keyserver.mattrude.com (special case: blocks /pks, but not /search, a 
non-standard (?) directory)

I would like to ask the owners of these pages to take the time to install an 
efficient robots.txt file, for example something like this:

User-agent: *
Disallow: /pks/

To all others, I would like to ask you to take the time to check if your server 
serves an efficient robots.txt file, and if it does not, to please install one.
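
If it helps, here is a small hypothetical check (the hostname is a
placeholder) that fetches robots.txt on both the HKP port and port 80 and looks
for a Disallow rule covering /pks:

# Hypothetical helper for checking whether a keyserver serves a robots.txt
# that disallows crawling of /pks, on both the HKP port and port 80.
import urllib.request
import urllib.error

def blocks_pks(base_url):
    # Return True if base_url/robots.txt contains a Disallow rule for /pks.
    try:
        with urllib.request.urlopen(base_url + "/robots.txt", timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, OSError):
        return False   # missing or unreachable counts as not blocking
    return any(line.strip().lower().startswith("disallow:") and "/pks" in line
               for line in body.splitlines())

if __name__ == "__main__":
    host = "keyserver.example.org"   # placeholder hostname
    for base in ("http://" + host + ":11371", "http://" + host):
        print(base, "blocks /pks" if blocks_pks(base) else "does NOT block /pks")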

If there is any doubt that a robots.txt file is a good idea, I can elaborate on 
that.

Thank you for your time.

RTF