If I had to write the program all over again I would not use asyncore and
instead use threading, but the only thing worse than using asyncore is
writing this program all over again.  On the bright side, I managed to
remove all the blocking calls.  For example:

self.settimeout(0)                      # used to be 1
asyncore.loop(.0001, use_poll=True)     # .0001 is somehow more optimal than 0 <- wizard magic

If you ever use asyncore, make sure anything referring to time is 0.  If
you want your sockets to time out in less than the default ~30 seconds, you
will have to write your own function to kill them.
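A minimal sketch of that pattern: fully non-blocking sockets plus a hand-rolled timeout reaper.  asyncore itself was removed in Python 3.12, so this sketch uses the `select` loop it wrapped; the names (`start_connect`, `reap`, `REAPER_TIMEOUT`) are made up for illustration, not the actual code.

```python
import select
import socket
import time

REAPER_TIMEOUT = 5.0   # give up on a socket after this many seconds (illustrative)

def start_connect(host, port=80):
    """Begin a fully non-blocking connect and return the socket."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(0)            # 0, never 1 -- any blocking call stalls the loop
    try:
        s.connect((host, port))
    except OSError:
        pass                   # an in-progress non-blocking connect is expected
    return s

def reap(pending, timeout=REAPER_TIMEOUT, now=None):
    """Close and drop any socket that has waited longer than `timeout`.

    `pending` maps socket -> time the connect started.  asyncore will not do
    this for you; without it a dead host holds its socket for the OS default
    (~30 seconds or worse).
    """
    now = time.monotonic() if now is None else now
    for sock, started in list(pending.items()):
        if now - started > timeout:
            sock.close()
            del pending[sock]
    return pending

def poll_once(pending):
    """One pass of the loop, mirroring asyncore.loop(.0001, use_poll=True)."""
    if not pending:
        return []
    _, writable, _ = select.select([], list(pending), [], 0.0001)
    return writable
```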

Considering there is going to be a hard limit of ~64,000 unprivileged
ports, I think my main concern will be CPU and memory consumption.  It's
possible I will hit those limits before I hit any limit imposed by Python.
Nonetheless, I got my request rate to ~250/sec average, and I can scan a
/16 in about 4 minutes.  My limit now is a 4,000-socket hard limit and the
fact that my box crashes if I actually use all of them.

This is all part of some research that a friend and I are putting
together.  We submitted a CFP for BSides Boston and will release the code
when I have finished abusing it to the greatest extent possible.  In the
meantime, if you see an HTTP GET request to your IPs and the packet is
full of hilarious yo mama jokes, it's me.  I am pretty excited about the
stuff I'm finding.

-a


On Thu, Jan 10, 2013 at 12:45 PM, Karl Schuttler <[email protected]> wrote:

> On Thu, Jan 10, 2013 at 10:57 AM, allison nixon <[email protected]> wrote:
> > Yeah, that's why I said I accidentally wrote it in Python.  I just want to
> > know if I can salvage this since I already wrote it.
>
> I think you're misinterpreting David's comment - he is saying that
> threading is what you want. By launching each request in a new thread,
> you can spawn a whole bunch of requests off and not have to spend
> execution time waiting for the response to come back ("babysitting").
>
> I wrote an HTTPS scanner to send HTTP POST to a list of IPs and try
> default usernames and passwords, using threads. Using threads I was
> able to do >500 requests in parallel, doing 11k hosts in about 5
> minutes.
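The thread-per-request approach described above can be sketched roughly like this; `probe` is a hypothetical stand-in for whatever request function you use (urllib, raw sockets, etc.), and the semaphore caps concurrency at the ~500 parallel requests mentioned:

```python
import queue
import threading

def probe(host, results):
    # Stand-in for one HTTP request (e.g. a POST trying default credentials).
    results.put((host, "checked"))

def scan(hosts, max_parallel=500):
    """Launch one thread per host, capped at max_parallel concurrent requests."""
    results = queue.Queue()
    gate = threading.Semaphore(max_parallel)   # throttle concurrency

    def worker(host):
        with gate:                             # wait here if the cap is hit
            probe(host, results)

    threads = [threading.Thread(target=worker, args=(h,), daemon=True)
               for h in hosts]
    for t in threads:
        t.start()
    for t in threads:
        t.join()                               # no busy-wait "babysitting"
    return [results.get() for _ in range(results.qsize())]
```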
>
> Certainly try to find what is consuming your resources.
>
> Karl
>
> >
> > I also just realized something that I might have been doing wrong that is
> > hogging most of the resources... if I can get 200 requests per second with
> > Python I'll let you guys know...
> >
> >
> >
> > On Thu, Jan 10, 2013 at 10:37 AM, Martín <[email protected]> wrote:
> >>
> >> Hi!
> >>
> >> Be advised that because of the GIL [1] in the CPython interpreter,
> >> you may need to switch to an alternative interpreter implementation when
> >> the performance of highly concurrent applications is a must.
> >>
> >> In other words, Python may not be the best choice when trying to go to
> >> such performance limits.
> >>
> >> [1] http://en.wikipedia.org/wiki/Global_Interpreter_Lock
> >>
> >>
> >> On Thu, Jan 10, 2013 at 4:22 PM, Scott Kragen <[email protected]> wrote:
> >>>
> >>> Allison,
> >>>
> >>> Have you looked into threading for Python?  I have used this library in
> >>> several of my projects because writing a thread pool from scratch started
> >>> to give me a headache.
> >>>
> >>> http://code.activestate.com/recipes/577105-synchronization-decorator-for-class-methods/
> >>>
> >>> The advantage of a thread pool is that it can queue based on the maximum
> >>> number of threads you define.
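For what it's worth, the standard library now ships exactly that queueing behaviour: `concurrent.futures.ThreadPoolExecutor` holds submissions beyond `max_workers` in an internal queue.  A sketch, where `check` is a hypothetical per-host scan function:

```python
from concurrent.futures import ThreadPoolExecutor

def check(host):
    # Stand-in for one scan request against `host`.
    return (host, "done")

def scan(hosts, max_workers=200):
    # Submissions beyond max_workers simply wait in the pool's queue.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(check, hosts))   # map preserves input order
```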
> >>>
> >>> Scott
> >>>
> >>> On Thu, Jan 10, 2013 at 10:06 AM, allison nixon <[email protected]> wrote:
> >>>>
> >>>> Say I'm writing a broadscanner for a pet project and I accidentally
> >>>> wrote the entire thing in Python.
> >>>>
> >>>> Using asyncore and sockets, the best I can get is 3 HTTP requests per
> >>>> second, maybe 10 per second at the very max.  For a scanner this is, of
> >>>> course, very lame.
> >>>>
> >>>> My goal is, of course, to make as many HTTP requests as a desktop
> >>>> computer will handle, so perhaps 200 per second, with whatever number
> >>>> of sockets waiting in the background for a response (which I check
> >>>> periodically and parse when I get one).
> >>>>
> >>>> Is there any way I can achieve 200 requests per second without
> >>>> learning another programming language?
> >>>>
> >>>> _______________________________________________
> >>>> Pauldotcom mailing list
> >>>> [email protected]
> >>>> http://mail.pauldotcom.com/cgi-bin/mailman/listinfo/pauldotcom
> >>>> Main Web Site: http://pauldotcom.com
> >>>
> >>>
> >>>
> >>>
> >>> --
> >>> ----------------------------------------------------------------------------------------------------------------------
> >>> There is only one metric in security that can be truly measured and
> >>> that is failure!
> >>>
> >>> --- Jack Daniels
> >>> ----------------------------------------------------------------------------------------------------------------------
> >>
> >>
> >>
> >
> >
> >
> >
> > --
> > _________________________________
> > Note to self: Pillage BEFORE burning.
> >
>



-- 
_________________________________
Note to self: Pillage BEFORE burning.
