Yes, 300 users should not be a problem as a single JMeter client
should be able to handle that.
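For reference, the "every 100 samples" behaviour mentioned below matches
JMeter's Batch sample sender. A sketch of the relevant jmeter.properties
entries on each server follows - the values shown are the shipped defaults
rather than a tuned recommendation, and availability of the Batch mode
depends on your JMeter version:

```properties
# jmeter.properties on each remote server (sketch, default values)
mode=Batch                 # queue samples and send them back in batches
num_sample_threshold=100   # send once this many samples have accumulated
time_threshold=60000       # ...or after this many milliseconds, whichever first
```

Raising num_sample_threshold trades client-side result latency for fewer,
larger transfers over the network.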

On 06/06/2008, Michael McDonnell <[EMAIL PROTECTED]> wrote:
> That makes sense. I'll give it a go. (We're pretty sure there's no
>  bottleneck passing things; they do it every 100 samples, and this is
>  over a 100 MB/s network. I'm only trying to run 300 users, so they
>  should be able to perform well over a 10 MB/s link.)
>
>
>  On Fri, Jun 6, 2008 at 9:27 AM, sebb <[EMAIL PROTECTED]> wrote:
>
>  > In client-server mode, only the test plan is sent from the client to
>  > the server(s).
>  >
>  > Any additional files - e.g. CSV input files - need to be present on
>  > the server host in the location specified by the test plan.
>  >
>  > Sample data is returned to the client, and processed/stored by the client.
>  > This can become a bottleneck at the client - both for JMeter itself,
>  > and for the network connection - under high loads.
>  >
>  > Data files are best randomised before use.
>  > Likewise, if you want to run with different data on different hosts,
>  > then create different data files for each host (but you can use the
>  > same name).
>  >
>  > On 06/06/2008, Michael McDonnell <[EMAIL PROTECTED]> wrote:
>  > > How did you randomize the data from the CSVs? (if I may ask)
>  > >
>  > >  Also, I'm dealing with a lot of optimistic locking issues, which
>  > >  would only occur if each CSV is doing the EXACT same thing at the
>  > >  exact same time (which is entirely likely).
>  > >
>  > >
>  > >  On Thu, Jun 5, 2008 at 9:54 PM, Ryan Dooley <[EMAIL PROTECTED]>
>  > wrote:
>  > >
>  > >  > I had a similar experience the first time.  Turns out that the
>  > >  > data I wanted to test with (HTTP POSTs) had to be put on each
>  > >  > remote.  I also had a process to randomize the data when it was
>  > >  > transferred to the remotes.  I finally got the load up high
>  > >  > enough across 10 machines like yours.
>  > >  >
>  > >  > The test harness I had was pretty simple: post these things to
>  > >  > this URL.
>  > >  >
>  > >  > On Thu, Jun 5, 2008 at 5:19 PM, Michael McDonnell
>  > >  > <[EMAIL PROTECTED]> wrote:
>  > >  >
>  > >  > > We're running a distributed test (roughly 7 remote
>  > >  > > workstations) on a pretty hefty box (8 cores, 32 GB of RAM,
>  > >  > > etc.).
>  > >  > >
>  > >  > > However, something seems to be going wrong... perhaps it's
>  > >  > > because I'm crossing Linux and Windows platforms to try to do
>  > >  > > the testing?
>  > >  > >
>  > >  > > We're load testing a web application, so primarily the only
>  > >  > > work we're doing is HTTP requests (there are a few "Java
>  > >  > > requests" that are actually an app I created to make web
>  > >  > > service calls, but we'll get to that later).
>  > >  > >
>  > >  > > However, when we view the transactions in the database, the
>  > >  > > numbers are extremely low (frighteningly low).
>  > >  > >
>  > >  > > Then we run the test from a single workstation (same test, 300
>  > >  > > users doing work) and our results come back fantastically!
>  > >  > >
>  > >  > > Now granted, I guess the big question is this: when the test
>  > >  > > plan uses a CSV in distributed mode, does each slave use the
>  > >  > > same CSV in the same order? Or is there some sort of "break
>  > >  > > up" so that no two slaves are using the same line in the CSV?
>  > >  > >
>  > >  > > I'm sorry for what may be dumb questions... but we're coming
>  > >  > > down to a tight deadline, and the distributed testing is not
>  > >  > > giving us good results, whereas the local testing is.
>  > >  > >
>  > >  > > Thanks for all your help in advance.
>  > >  > >
>  > >  > > Michael
>  > >  > >
>  > >  >
>  > >
>  >
>
>
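The advice above about randomising the data files and creating a different
file for each host can be sketched with a small script. The file names
(users.csv, users-1.csv, ...) and the round-robin split are illustrative
assumptions, not anything JMeter itself requires:

```python
import csv
import random

def split_csv(src, host_count, seed=None):
    """Shuffle the rows of src and split them into one file per host,
    so no two JMeter servers replay the same lines in the same order."""
    random.seed(seed)
    with open(src, newline="") as f:
        rows = list(csv.reader(f))
    random.shuffle(rows)
    # Deal the shuffled rows round-robin so each host gets a distinct slice.
    slices = [rows[i::host_count] for i in range(host_count)]
    for n, chunk in enumerate(slices, start=1):
        # Hypothetical naming scheme: users-1.csv, users-2.csv, ...
        with open("users-%d.csv" % n, "w", newline="") as out:
            csv.writer(out).writerows(chunk)
    return slices
```

Since the CSV path in the test plan is resolved on each server, every host
can keep its own slice under the same local file name, as noted above.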

---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
