GUI for GNOME?

2007-08-08 Thread HF

hey list,

is there any GUI for Tor (and Privoxy?) available for GNOME?
something like Vidalia under OS X?

Hannes

Re: GUI for GNOME?

2007-08-08 Thread Ringo Kamens
I think you can just compile TorK and run it in GNOME even though it's
based on KDE. Also, Blossom provides a nice web interface.
Comrade Ringo Kamens

On 8/8/07, HF [EMAIL PROTECTED] wrote:
 hey list,

 is there any GUI for Tor (and Privoxy?) available for GNOME?
 something like Vidalia under OS X?

 Hannes


Re: GUI for GNOME?

2007-08-08 Thread Ater Atrocitas
-BEGIN PGP SIGNED MESSAGE-
Hash: RIPEMD160

Ringo Kamens wrote:
 I think you can just compile TorK and run it in GNOME even though it's
 based on KDE. Also, Blossom provides a nice web interface.
 Comrade Ringo Kamens
 

Then again, you can also just get Vidalia installed under GNOME. In a
few of the larger distros it should just be in Portage, the
repository, or whatever your distribution calls its package database.

If it's not present there, go here: http://vidalia-project.net/download.php

- --
Ater Atrocitas
[EMAIL PROTECTED]
http://blasfemie.com

GnuPG key: http://blasfemie.com/ateratrocitas(0x720F6C40).asc
-BEGIN PGP SIGNATURE-
Version: GnuPG v2.0.5 (GNU/Linux)

iD8DBQFGugHGVhIlSnIPbEARA4r4AKCelr4xNqRemq4zC0eZ52Nr0HO+gQCfUvlr
cgUZufVr7ZjO3bbU3TWSvUE=
=aLjP
-END PGP SIGNATURE-


Re: GUI for GNOME?

2007-08-08 Thread Ater Atrocitas
-BEGIN PGP SIGNED MESSAGE-
Hash: RIPEMD160

Ringo Kamens wrote:
 Last time I checked, it wasn't in the Ubuntu repositories, and it's a
 bitch to compile and run because of all of its dependencies. Also, I
 think it uses a deprecated version of the Qt toolkit, but perhaps
 things have changed since I last tried to use it.
 Comrade Ringo Kamens
 

Even though it isn't the most recent posting, it does not seem all too
hard in this howto:
http://ubuntuforums.org/showthread.php?t=287349
(Ubuntu Edgy Eft, Qt, Tor, Vidalia)
Installing on the previous Ubuntu version, 6.06 LTS, does not seem that
simple: http://trac.vidalia-project.net/ticket/198

I don't know whether the Tor 0.1.2.16 release and its control-port
management have changed the installation on Ubuntu, though; here on Gentoo
I'll first have to get a new Vidalia ebuild, as the previous Vidalia
does not work with the newest Tor.

- --
Ater Atrocitas
[EMAIL PROTECTED]
http://blasfemie.com

GnuPG key: http://blasfemie.com/ateratrocitas(0x720F6C40).asc
-BEGIN PGP SIGNATURE-
Version: GnuPG v2.0.5 (GNU/Linux)

iD8DBQFGugbwVhIlSnIPbEARAxuUAKCWXPDP2FlCRzrPuLsa79jnixevJgCgimzK
2D4anrwPYo9dlfLuYWRWWqg=
=eR8A
-END PGP SIGNATURE-


Re: Proposal of a new hidden wiki

2007-08-08 Thread Eduardo Costa Lisboa
I think that a redundant system would be the best thing. The main
hidden wiki could contain a script that backs the site up every X hours,
and anyone could download the backup and host it anywhere.
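
As a rough illustration, such a periodic dump could look like this minimal
Python sketch (the paths and the interval are my own assumptions, not
something specified in this thread):

    # Archive the wiki data every few hours into a publicly served
    # directory, so anyone can fetch the latest dump and re-host it.
    import tarfile
    import time

    WIKI_DATA = "/var/www/hiddenwiki"   # assumed wiki data directory
    DUMP_DIR = "/var/www/dumps"         # assumed download area for mirrors
    INTERVAL_HOURS = 6                  # the "every X hours"

    while True:
        name = time.strftime("hiddenwiki-%Y%m%d%H%M.tar.gz")
        with tarfile.open(DUMP_DIR + "/" + name, "w:gz") as tar:
            tar.add(WIKI_DATA, arcname="hiddenwiki")
        time.sleep(INTERVAL_HOURS * 3600)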


Or maybe a more sophisticated high-availability system could run
in the background to host the same hostname from different sources.
Thus, the master onion servers would point to the last one which
requested the .onion hostname. For this, the hidden wiki mirrors
would need to share the same private key, which implies full trust
between the mirror operators.
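
A minimal sketch of that shared-key setup, assuming Tor's standard
HiddenServiceDir layout (the paths below are invented for illustration;
distributing the key is exactly the full-trust step just mentioned):

    # Install the shared hidden-service key on this mirror so that all
    # mirrors answer for the same .onion name.
    import os
    import shutil

    SHARED_KEY = "/root/shared_private_key"   # delivered out of band
    MIRROR_DIR = "/var/lib/tor/hidden_wiki"   # this mirror's HiddenServiceDir

    if not os.path.isdir(MIRROR_DIR):
        os.makedirs(MIRROR_DIR)
    shutil.copy(SHARED_KEY, os.path.join(MIRROR_DIR, "private_key"))
    os.chmod(os.path.join(MIRROR_DIR, "private_key"), 0o600)

    # Each mirror's torrc would then contain something like:
    #   HiddenServiceDir /var/lib/tor/hidden_wiki
    #   HiddenServicePort 80 127.0.0.1:80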


-- 
Eduardo Costa Lisboa


Re: GUI for GNOME?

2007-08-08 Thread Matt Edman
On Wed, Aug 08, 2007 at 01:53:30PM -0400, Ringo Kamens wrote:
 Last time I checked, it wasn't in the Ubuntu repositories, and it's a
 bitch to compile and run because of all of its dependencies. Also, I
 think it uses a deprecated version of the Qt toolkit, but perhaps
 things have changed since I last tried to use it.
 Comrade Ringo Kamens

You're right, Vidalia is not in the Ubuntu repositories. Our ebuild for Gentoo
and RPMs for Red Hat have been contributed by some nice people who wanted to
help out. It seems nobody has been motivated enough to make a package for
Ubuntu yet, though.

Instructions for compiling Vidalia from source on Ubuntu can be found on our
wiki (there is even a section especially for Ubuntu):

http://trac.vidalia-project.net/wiki/InstallSource#LinuxBSDUnix

As for using a deprecated version of Qt, that's news to me. Vidalia has always
required at least Qt 4.1, but also works with Qt 4.2 and 4.3. If you've found
that Vidalia doesn't build for you with one of those versions of Qt, I'd be
happy to know which one and fix it.

--Matt


Re: Proposal of a new hidden wiki

2007-08-08 Thread Ringo Kamens
I like the distributed private key idea. Each wiki copy would have a
separate email address so that if one server got compromised, the
operators could be informed and change the private key. My question
is: what would determine which server got chosen?
Comrade Ringo Kamens

On 8/8/07, Eduardo Costa Lisboa [EMAIL PROTECTED] wrote:
 I think that a redundant system would be the best thing. The main
 hidden wiki could contain a script that backs the site up every X hours,
 and anyone could download the backup and host it anywhere.


 Or maybe a more sophisticated high-availability system could run
 in the background to host the same hostname from different sources.
 Thus, the master onion servers would point to the last one which
 requested the .onion hostname. For this, the hidden wiki mirrors
 would need to share the same private key, which implies full trust
 between the mirror operators.


 --
 Eduardo Costa Lisboa



Re: GUI for GNOME?

2007-08-08 Thread Ringo Kamens
Has anybody tried installing the RPMs on Ubuntu using Alien?
Comrade Ringo Kamens

On 8/8/07, Matt Edman [EMAIL PROTECTED] wrote:
 On Wed, Aug 08, 2007 at 01:53:30PM -0400, Ringo Kamens wrote:
  Last time I checked, it wasn't in the Ubuntu repositories, and it's a
  bitch to compile and run because of all of its dependencies. Also, I
  think it uses a deprecated version of the Qt toolkit, but perhaps
  things have changed since I last tried to use it.
  Comrade Ringo Kamens

 You're right, Vidalia is not in the Ubuntu repositories. Our ebuild for Gentoo
 and RPMs for Red Hat have been contributed by some nice people who wanted to
 help out. It seems nobody has been motivated enough to make a package for
 Ubuntu yet, though.

 Instructions for compiling Vidalia from source on Ubuntu can be found on our
 wiki (there is even a section especially for Ubuntu):

 http://trac.vidalia-project.net/wiki/InstallSource#LinuxBSDUnix

 As for using a deprecated version of Qt, that's news to me. Vidalia has always
 required at least Qt 4.1, but also works with Qt 4.2 and 4.3. If you've found
 that Vidalia doesn't build for you with one of those versions of Qt, I'd be
 happy to know which one and fix it.

 --Matt



Re: Proposal of a new hidden wiki

2007-08-08 Thread Eduardo Costa Lisboa
On 8/8/07, Ringo Kamens [EMAIL PROTECTED] wrote:
 I like the distributed private key idea. Each wiki copy would have a
 separate email address so that if one server got compromised, the
 operators could be informed and change the private key. My question
 is: what would determine which server got chosen?
 Comrade Ringo Kamens


I think that if two or more hidden services used the same private key,
and thus the same .onion hostname, the master servers would always point
to the most recently updated one. It's like a dynamic IP hostname service,
like no-ip.org. Maybe one of us could deploy a test like this:

- person A hosts an HTML site saying: "this is host A"
- person B hosts an HTML site saying: "this is host B"

but persons A and B use the same private key. Then, just to
equalize host usage, they could schedule Tor restarts every 2 hours. So in
even hours host A would respond, and in odd hours host B would respond,
all automatically.
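
A sketch of how each host could automate that schedule, assuming Tor is
managed through a System V init script (the script path and the polling
interval are assumptions, not details from this thread):

    # Keep the hidden service up only in "our" hours: host A runs this
    # with HOST_INDEX = 0, host B with HOST_INDEX = 1.
    import subprocess
    import time

    HOST_INDEX = 0  # 0 on host A, 1 on host B

    while True:
        hour = time.localtime().tm_hour
        action = "start" if hour % 2 == HOST_INDEX else "stop"
        subprocess.call(["/etc/init.d/tor", action])  # assumed init script
        time.sleep(300)  # re-check every five minutes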

If it works, then both hosts should synchronize their content, and for
this I would suggest some scriptable rsync or unison approach.
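
For the one-way case, a scriptable rsync push could be as simple as the
following sketch (the peer address and paths are invented examples; a
two-way setup would be a job for unison instead):

    # Push local wiki content to the peer once an hour.
    import subprocess
    import time

    LOCAL = "/var/www/hiddenwiki/"
    REMOTE = "mirror-b.example.org:/var/www/hiddenwiki/"  # hypothetical peer

    while True:
        # -a preserves permissions and times, -z compresses over the wire
        subprocess.call(["rsync", "-az", "--delete", LOCAL, REMOTE])
        time.sleep(3600)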

Was I clear? Sorry for my poor English.


-- 
Eduardo Costa Lisboa


Re: Proposal of a new hidden wiki

2007-08-08 Thread Ringo Kamens
I'm interested in testing this out with somebody. Until then, can any
devs/Tor hackers enlighten us as to what would determine which host
gets picked? Would it be whoever is the fewest hops away? If so, one
host would get the most traffic if it was consistently closest to fast
servers.
Comrade Ringo Kamens


On 8/8/07, Eduardo Costa Lisboa [EMAIL PROTECTED] wrote:
 On 8/8/07, Ringo Kamens [EMAIL PROTECTED] wrote:
  I like the distributed private key idea. Each wiki copy would have a
  separate email address so that if one server got compromised, the
  operators could be informed and change the private key. My question
  is: what would determine which server got chosen?
  Comrade Ringo Kamens


 I think that if two or more hidden services used the same private key,
 and thus the same .onion hostname, the master servers would always point
 to the most recently updated one. It's like a dynamic IP hostname service,
 like no-ip.org. Maybe one of us could deploy a test like this:

 - person A hosts an HTML site saying: "this is host A"
 - person B hosts an HTML site saying: "this is host B"

 but persons A and B use the same private key. Then, just to
 equalize host usage, they could schedule Tor restarts every 2 hours. So in
 even hours host A would respond, and in odd hours host B would respond,
 all automatically.

 If it works, then both hosts should synchronize their content, and for
 this I would suggest some scriptable rsync or unison approach.

 Was I clear? Sorry for my poor English.


 --
 Eduardo Costa Lisboa



Re: orconfig.h for windows

2007-08-08 Thread Ed Jensen
Are there any instructions on the complete build process (from source) for
Windows?

Ed


 Thanks Nick,

But eventdns.c doesn't have an #ifdef HAVE_UNISTD_H in it.
Ed


 On Tue, Aug 07, 2007 at 06:38:31PM -0700, Ed Jensen wrote:
 Thanks Roger, I get it now.
 
 One more question if you don't mind - where do you get your unistd.h
 from? (included from eventdns.c) My windows box appears not to have
 it.

unistd.h is only included when HAVE_UNISTD_H is set; if your platform
doesn't have it, the configure script should not define HAVE_UNISTD_H.


Re: Proposal of a new hidden wiki

2007-08-08 Thread Karsten Loesing
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

Hi,

 I like the distributed private key idea.

Yes, that's really a nice idea. And it might even work.

 My question
 is: what would determine which server got chosen?
 
 I think that if two or more hidden services used the same private key,
 and thus the same .onion hostname, the master servers would always point
 to the most recently updated one.

Correct. A hidden service uploads a current descriptor (containing
contact information) if a) there is some significant change in contact
information or b) an hour passes.
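
In code, my reading of that rule is roughly the following (a paraphrase
for illustration, not Tor's actual source):

    def should_republish(last_upload_time, old_contact_info, new_contact_info, now):
        # a) contact information changed significantly, or b) an hour passed
        return new_contact_info != old_contact_info or now - last_upload_time >= 3600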

 Then, just to
 equalize host usage, they could schedule Tor restarts every 2 hours. So in
 even hours host A would respond, and in odd hours host B would respond,
 all automatically.

That's a bad idea, because it does not really improve availability if a
hidden service is restarted every two hours.

The two services should rather be run in parallel all the time. Then,
after some maths, one would (probably -- I am no mathematician) find that
both services have their own descriptors published half the time, and
thus receive half of the client accesses. (Note that the one-hour
intervals break as soon as the list of introduction points changes --
that means that starting the nodes with a certain timing does not
significantly improve this solution.)

However, I am quite sure that the developers did not have this variant
of content replication in mind when they designed the hidden services.
That means that it might break. But why not try it? :)

- --Karsten
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.6 (GNU/Linux)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

iD8DBQFGuhxn0M+WPffBEmURAubmAJ9Or3XmcxgmnGxXJgDHGSXHPvaK5gCbB90/
qeNETEE1FYc9bNxUeJi8niU=
=8nZG
-END PGP SIGNATURE-


Re: Proposal of a new hidden wiki

2007-08-08 Thread Ringo Kamens
Perhaps instead of just making it redundant, they should shut off at
random times for random lengths (like 10 hours or less). From the way
I understand the attacks remaining against Tor, this would make it
much more complicated to do a timing or deductive attack against the
hidden service, even for a global adversary. This would actually work to
the disadvantage of a one-machine hidden service, but it would be a
really good idea for a redundant one. Another question is how we
keep the wikis in sync without connecting to the other ones every time
there is a change (which would make timing attacks much easier).
Perhaps a Linux machine that has a network-RAID volume, with the other
servers (over Tor) acting as sections of the RAID volume?
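
The random-downtime part could be as simple as this sketch running on each
mirror (the init-script path and the time ranges are just assumptions to
make the idea concrete):

    # Stay up for a random stretch, then go dark for a random stretch
    # of up to 10 hours, forever.
    import random
    import subprocess
    import time

    while True:
        time.sleep(random.uniform(1, 24) * 3600)       # up for 1-24 hours
        subprocess.call(["/etc/init.d/tor", "stop"])   # assumed init script
        time.sleep(random.uniform(0, 10) * 3600)       # down for up to 10 hours
        subprocess.call(["/etc/init.d/tor", "start"])
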
Comrade Ringo Kamens

On 8/8/07, Karsten Loesing [EMAIL PROTECTED] wrote:
 -BEGIN PGP SIGNED MESSAGE-
 Hash: SHA1

 Hi,

  I like the distributed private key idea.

 Yes, that's really a nice idea. And it might even work.

  My question
  is: what would determine which server got chosen?
 
  I think that if two or more hidden services used the same private key,
  and thus the same .onion hostname, the master servers would always point
  to the most recently updated one.

 Correct. A hidden service uploads a current descriptor (containing
 contact information) if a) there is some significant change in contact
 information or b) an hour passes.

  Then, just to
  equalize host usage, they could schedule Tor restarts every 2 hours. So in
  even hours host A would respond, and in odd hours host B would respond,
  all automatically.

 That's a bad idea, because it does not really improve availability if a
 hidden service is restarted every two hours.

 The two services should rather be run in parallel all the time. Then,
 after some maths, one would (probably -- I am no mathematician) find that
 both services have their own descriptors published half the time, and
 thus receive half of the client accesses. (Note that the one-hour
 intervals break as soon as the list of introduction points changes --
 that means that starting the nodes with a certain timing does not
 significantly improve this solution.)

 However, I am quite sure that the developers did not have this variant
 of content replication in mind when they designed the hidden services.
 That means that it might break. But why not try it? :)

 - --Karsten
 -BEGIN PGP SIGNATURE-
 Version: GnuPG v1.4.6 (GNU/Linux)
 Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org

 iD8DBQFGuhxn0M+WPffBEmURAubmAJ9Or3XmcxgmnGxXJgDHGSXHPvaK5gCbB90/
 qeNETEE1FYc9bNxUeJi8niU=
 =8nZG
 -END PGP SIGNATURE-



Re: Proposal of a new hidden wiki

2007-08-08 Thread Robert Hogan
On Wednesday 08 August 2007 19:32:39 Ringo Kamens wrote:
 I'm interested in testing this out with somebody. Until then, can any
 devs/Tor hackers enlighten us as to what would determine which host
 gets picked? Would it be whoever is the fewest hops away? If so, one
 host would get the most traffic if it was consistently closest to fast
 servers.
 Comrade Ringo Kamens


The spec says:

  Upon receiving a descriptor, the directory server checks the signature,
   and discards the descriptor if the signature does not match the enclosed
   public key.  Next, the directory server checks the timestamp.  If the
   timestamp is more than 24 hours in the past or more than 1 hour in the
   future, or the directory server already has a newer descriptor with the
   same public key, the server discards the descriptor.  Otherwise, the
   server discards any older descriptors with the same public key and
   version format, and associates the new descriptor with the public key.
   The directory server remembers this descriptor for at least 24 hours
   after its timestamp.  At least every 18 hours, Bob's OP uploads a
   fresh descriptor.

So if a number of servers shared the same hidden-service key, they would just
overwrite each other's descriptors with each upload. They would never
co-exist; instead, the most recent poster would get the traffic.

It seems like it should work as long as the servers agreed to update at
different times. I'm not sure how secure such a service would be, though.


-- 

Browse Anonymously Anywhere - http://anonymityanywhere.com
TorK- KDE Anonymity Manager - http://tork.sf.net
KlamAV  - KDE Anti-Virus- http://www.klamav.net





Re: Proposal of a new hidden wiki

2007-08-08 Thread Ringo Kamens
Well, I think that just through system backups, maintenance, restarts,
etc., the descriptor upload times would be fairly random anyway,
especially if a random-turn-off function were implemented.
Comrade Ringo Kamens

On 8/8/07, Robert Hogan [EMAIL PROTECTED] wrote:
 On Wednesday 08 August 2007 19:32:39 Ringo Kamens wrote:
  I'm interested in testing this out with somebody. Until then, can any
  devs/Tor hackers enlighten us as to what would determine which host
  gets picked? Would it be whoever is the fewest hops away? If so, one
  host would get the most traffic if it was consistently closest to fast
  servers.
  Comrade Ringo Kamens
 

 The spec says:

    Upon receiving a descriptor, the directory server checks the signature,
    and discards the descriptor if the signature does not match the enclosed
    public key.  Next, the directory server checks the timestamp.  If the
    timestamp is more than 24 hours in the past or more than 1 hour in the
    future, or the directory server already has a newer descriptor with the
    same public key, the server discards the descriptor.  Otherwise, the
    server discards any older descriptors with the same public key and
    version format, and associates the new descriptor with the public key.
    The directory server remembers this descriptor for at least 24 hours
    after its timestamp.  At least every 18 hours, Bob's OP uploads a
    fresh descriptor.

 So if a number of servers shared the same hidden-service key, they would just
 overwrite each other's descriptors with each upload. They would never
 co-exist; instead, the most recent poster would get the traffic.

 It seems like it should work as long as the servers agreed to update at
 different times. I'm not sure how secure such a service would be, though.


 --

 Browse Anonymously Anywhere - http://anonymityanywhere.com
 TorK- KDE Anonymity Manager - http://tork.sf.net
 KlamAV  - KDE Anti-Virus- http://www.klamav.net





Re: Proposal of a new hidden wiki

2007-08-08 Thread [EMAIL PROTECTED]
Hello Karsten, Ringo and Eduardo,

Feel free to experiment, it's fun... But:
In my opinion it's MUCH easier to:
-have one well known hidden wiki
-have one or more well known backups of the hidden wiki, with the edit
function disabled

If the primary server is down, people can just go to a backup.
If the primary server goes permanently down, a backup can become the new
primary server.

If you want to create unbreakable hidden websites, Tor isn't the right
network today, but it might become so in the future. It might be better to use
a network with a distributed cache system like Freenet, or to
help develop Tor by creating a distributed cache system for Tor. But
experimenting is fun, so I don't want to discourage you from trying. You
might discover a bug or security vulnerability by doing something that
isn't supposed to be done.

Every network has its differences. Freenet is ultra slow, is known for
hosting child porn, doesn't allow exits to the internet, and isn't
actively developed. Tor has the world's most hated and feared army as a
sponsor/initiator of the project, but on the other hand it's fast and
does its job. Freenet has a slow but working system for creating almost
unbreakable websites; Tor has a simpler, faster, but more vulnerable
system for hiding websites.

With Tor it's very easy to detect whether a Tor server and a hidden
service go down at the same time, revealing on what server the hidden
service runs.
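
As a toy illustration of that intersection check (the relay names and the
observations are invented):

    def narrow(candidates, service_up, relay_up):
        # keep only relays whose up/down status matched the service's
        return [r for r in candidates if relay_up[r] == service_up]

    candidates = ["relay1", "relay2", "relay3"]
    candidates = narrow(candidates, True,
                        {"relay1": True, "relay2": True, "relay3": False})
    candidates = narrow(candidates, False,
                        {"relay1": True, "relay2": False, "relay3": False})
    print(candidates)  # only 'relay2' matches the service's pattern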


And by the way, why is there a need to have a hidden wiki, when it's
going to get detected soon enough who runs it? Why not a public wiki
with .onion links? Why not add .onion links to the public wiki that
already exists?
Blessed are thee who stay hidden inside Tor, for the hiddenness from the
evil internet bestoeth them! ;-)


Re: Proposal of a new hidden wiki

2007-08-08 Thread Ringo Kamens
I appreciate the concern, but I think that while Freenet is a viable
option, and certainly there should be a backup on it, Tor users need a
central link cache (so they can use the Tor hidden network). I think
that Tor is the right network for unbreakable hidden websites,
especially if we use redundant services (through RAID-over-network?).
The reason we can't do this on the real internet is that it would get
censored. Really quickly. Many countries have laws banning such
activities or linking to certain sites, like cryptography sites, which
is why .onion links must be linked from a hidden wiki.
Comrade Ringo Kamens

On 8/8/07, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:
 Hello Karsten, Ringo and Eduardo,

 Feel free to experiment, it's fun... But:
 In my opinion it's MUCH easier to:
 -have one well known hidden wiki
 -have one or more well known backups of the hidden wiki, with the edit
 function disabled

 If the primary server is down, people can just go to a backup.
 If the primary server goes permanently down, a backup can become the new
 primary server.

 If you want to create unbreakable hidden websites, Tor isn't the right
 network today, but it might become so in the future. It might be better to use
 a network with a distributed cache system like Freenet, or to
 help develop Tor by creating a distributed cache system for Tor. But
 experimenting is fun, so I don't want to discourage you from trying. You
 might discover a bug or security vulnerability by doing something that
 isn't supposed to be done.

 Every network has its differences. Freenet is ultra slow, is known for
 hosting child porn, doesn't allow exits to the internet, and isn't
 actively developed. Tor has the world's most hated and feared army as a
 sponsor/initiator of the project, but on the other hand it's fast and
 does its job. Freenet has a slow but working system for creating almost
 unbreakable websites; Tor has a simpler, faster, but more vulnerable
 system for hiding websites.

 With Tor it's very easy to detect whether a Tor server and a hidden
 service go down at the same time, revealing on what server the hidden
 service runs.


 And by the way, why is there a need to have a hidden wiki, when it's
 going to get detected soon enough who runs it? Why not a public wiki
 with .onion links? Why not add .onion links to the public wiki that
 already exists?
 Blessed are thee who stay hidden inside Tor, for the hiddenness from the
 evil internet bestoeth them! ;-)



Re: orconfig.h for windows

2007-08-08 Thread Nick Mathewson
On Wed, Aug 08, 2007 at 12:35:45PM -0700, Ed Jensen wrote:
 Are there any instructions on the complete build process (from source) for 
 windows?
 
 Ed

Check out Roger's earlier message on this thread.  The build process
we use is documented at:

   https://tor.eff.org/svn/trunk/doc/tor-win32-mingw-creation.txt


yrs,
-- 
Nick Mathewson




Re: Proposal of a new hidden wiki

2007-08-08 Thread Ringo Kamens
It's not an issue of a "great wall" attack where a person can't
access a public wiki with .onion links; it's an issue of whether that
wiki could even exist. You'd have to be crazy to host that on a public
machine.
Comrade Ringo Kamens

On 8/8/07, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:

  If you use Tor you can access a web site whether or not it's a hidden
 service. If you can access Tor, censorship is already defeated.

  But, please experiment, and as I said in an earlier post: I'm willing to
 run a backup of a hidden wiki, should there be a need. See my other post.

  /Viking

 Ringo Kamens wrote:
 I appreciate the concern, but I think that while Freenet is a viable
 option, and certainly there should be a backup on it, Tor users need a
 central link cache (so they can use the Tor hidden network). I think
 that Tor is the right network for unbreakable hidden websites,
 especially if we use redundant services (through RAID-over-network?).
 The reason we can't do this on the real internet is that it would get
 censored. Really quickly. Many countries have laws banning such
 activities or linking to certain sites, like cryptography sites, which
 is why .onion links must be linked from a hidden wiki.
 Comrade Ringo Kamens

 On 8/8/07, [EMAIL PROTECTED] [EMAIL PROTECTED] wrote:


  Hello Karsten, Ringo and Eduardo,

 Feel free to experiment, it's fun... But:
 In my opinion it's MUCH easier to:
 -have one well known hidden wiki
 -have one or more well known backups of the hidden wiki, with the edit
 function disabled

 If the primary server is down, people can just go to a backup.
 If the primary server goes permanently down, a backup can become the new
 primary server.

 If you want to create unbreakable hidden websites, Tor isn't the right
 network today, but it might become so in the future. It might be better to use
 a network with a distributed cache system like Freenet, or to
 help develop Tor by creating a distributed cache system for Tor. But
 experimenting is fun, so I don't want to discourage you from trying. You
 might discover a bug or security vulnerability by doing something that
 isn't supposed to be done.

 Every network has its differences. Freenet is ultra slow, is known for
 hosting child porn, doesn't allow exits to the internet, and isn't
 actively developed. Tor has the world's most hated and feared army as a
 sponsor/initiator of the project, but on the other hand it's fast and
 does its job. Freenet has a slow but working system for creating almost
 unbreakable websites; Tor has a simpler, faster, but more vulnerable
 system for hiding websites.

 With Tor it's very easy to detect whether a Tor server and a hidden
 service go down at the same time, revealing on what server the hidden
 service runs.


 And by the way, why is there a need to have a hidden wiki, when it's
 going to get detected soon enough who runs it? Why not a public wiki
 with .onion links? Why not add .onion links to the public wiki that
 already exists?
 Blessed are thee who stay hidden inside Tor, for the hiddenness from the
 evil internet bestoeth them! ;-)