Hi Ronny,

I'm currently writing a very large program in Python that uses scp. The
program is threaded, and I'm finding that I have to limit the number of
active threads to about 40 to get the best performance. Although the
program is threaded, every instance of scp/ssh is a separate process.
The program could easily fork 400 instances of scp concurrently if I
let it.
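
Something along these lines is one way to do the throttling (a rough
sketch only; the counting semaphore, hosts.txt, and the file names are
just placeholders for illustration, not lifted from my program):

    import subprocess
    import threading

    MAX_ACTIVE = 40   # the limit that gave me the best throughput; tune per system
    slots = threading.BoundedSemaphore(MAX_ACTIVE)

    def copy_to(host, src, dst):
        slots.acquire()          # block until one of the MAX_ACTIVE slots is free
        try:
            # each scp is still its own child process
            subprocess.call(["scp", src, "%s:%s" % (host, dst)])
        finally:
            slots.release()

    threads = []
    for host in open("hosts.txt").read().split():   # hosts.txt is a placeholder
        t = threading.Thread(target=copy_to, args=(host, "file.tgz", "/tmp/"))
        t.start()
        threads.append(t)

    for t in threads:
        t.join()

Each thread blocks on the semaphore, so at most 40 scp children exist
at any moment even though all 400 threads have been started.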

The upshot of this is that you have to keep a close watch on the system
resources your script uses. Different OSes switch contexts differently,
so a script can work fine on one OS and not on another. I need mine to
run on DEC, Sun, AIX, and Linux, so I have to program for the least
common denominator.

As for your pty problem: your system probably doesn't have enough pty
devices defined, but I don't think that is your real problem anyway. If
you are forking the ssh connections concurrently, then 400 at once is
simply too many. If they are run serially, then you are not closing the
used ptys correctly.
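
For the serial case, the important thing is just to wait for each ssh
child to exit before starting the next one, so the kernel reaps the
process and frees its pty. A rough sketch (the host names, the uptime
command, and the -t flag are only illustrative):

    import subprocess

    hosts = ["host%03d" % i for i in range(400)]   # placeholder host names

    for host in hosts:
        # Run one ssh at a time and wait for it to exit.  Once the child is
        # reaped, the pty it allocated (forced here with -t) is released
        # and can be reused by the next connection.
        rc = subprocess.call(["ssh", "-t", host, "uptime"])
        if rc != 0:
            print("ssh to %s exited with %d" % (host, rc))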

Carl

On 13-Apr-00 Ronny Zellhann wrote:
> Having a problem with ssh
> 
> Making several ssh connections to several machines (about 400) in a
> script, I get a lot of hanging connections in TIME_WAIT state. After a
> while I get an error because I don't have enough ptys.
> Is there any solution for that? I don't want to put my script in sleep
> mode to wait for the connections to close.
> 
> Thanks in advance
> Ronny

------------------------------------------------------------------------
E-Mail: Carl J. Nobile <[EMAIL PROTECTED]>
Date: 13-Apr-00                             Phone: 315-453-2912 Ex. 5336
Time: 13:13:45                                Fax: 315-453-3052

Software Engineering Group -- AppliedTheory Corp.
224 Harrison Street, 6th Floor, Syracuse, NY  13202
------------------------------------------------------------------------
