In my example, I would grab whatever records were hashed into the 'group' -- while it's not perfect, since there are 'overflow' groups, I was just trying to think of a way to break a file into pieces that would otherwise process much like a BASIC select - just grab the 'group' and go.... I can see it's probably not possible, but the topic got me thinking about 'what if'... (And we're UniData - so I have to apply that filter to most everything I read on the list anyway <G>)
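UniVerse doesn't expose "start a BASIC SELECT at group N" to the programmer, but the effect the thread is circling -- deterministically splitting one file's record IDs across N phantoms so each worker "just grabs its group and goes" -- can be sketched outside BASIC. A minimal Python illustration (the hash function and worker count here are made up for the example; they are not UniVerse's actual group hashing):

```python
# Sketch: deterministic partitioning of record IDs across N workers,
# mimicking "assign 2000 groups to each of 5 phantoms".
# The additive hash below is illustrative, not UniVerse's group hash.

N_WORKERS = 5

def worker_for(record_id: str, n_workers: int = N_WORKERS) -> int:
    # Stand-in for the file's hashing algorithm: sum of character codes.
    h = sum(ord(c) for c in record_id)
    return h % n_workers

# Each phantom k would then process only the IDs where worker_for(id) == k.
ids = ["CUST100", "CUST101", "ORDER9", "INV42"]
partitions = {k: [i for i in ids if worker_for(i) == k]
              for k in range(N_WORKERS)}
```

Because the partition function is deterministic, every record lands in exactly one worker's share with no coordination between the phantoms.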
-----Original Message-----
From: u2-users-boun...@listserver.u2ug.org [mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of David Taylor
Sent: Monday, October 01, 2012 6:10 PM
To: U2 Users List
Subject: Re: [U2] [u2] Parallel processing in Universe

Or, let's suppose you wanted to process repetitive segments of one very large record using the same logic in a separate phantom process for each segment. How large a record can be read and processed in Universe?

Dave

> So how would a user 'chop up' a file for parallel processing?
> Ideally, if there was a Mod 10001 file (or whatever) it would seem like
> it would be 'ideal' to assign 2000 groups to each of 5 phantoms -- but I don't
> know how to 'start a BASIC select at Group 2001 or 4001' ...
>
> -----Original Message-----
> From: u2-users-boun...@listserver.u2ug.org
> [mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of George
> Gallen
> Sent: Monday, October 01, 2012 3:29 PM
> To: U2 Users List
> Subject: Re: [U2] [u2] Parallel processing in Universe
>
> OPENSEQ "/tmp/pipetest" TO F.PIPE ELSE STOP "NO PIPE"
> LOOP
>    READSEQ LINE FROM F.PIPE ELSE CONTINUE
>    PRINT LINE
> REPEAT
> STOP
> END
>
> Although, I'm not sure if you might need to sleep a little between the
> READSEQ's ELSE and CONTINUE -- it might suck up CPU time when nothing
> is writing to the file.
>
> Then you could set up a printer in UV that did a "cat - >> /tmp/pipetest"
>
> Now your phantom just needs to print to that printer.
>
> George
>
> -----Original Message-----
> From: u2-users-boun...@listserver.u2ug.org
> [mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of George
> Gallen
> Sent: Monday, October 01, 2012 4:16 PM
> To: U2 Users List
> Subject: Re: [U2] [u2] Parallel processing in Universe
>
> The only thing about a pipe is that once it's closed, I believe it has
> to be re-opened by both ends again.
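George's worry about the `READSEQ ... ELSE CONTINUE` loop spinning the CPU applies to any tail-style reader; the usual fix is exactly the sleep he suggests (in UniVerse BASIC, a `SLEEP` or `NAP` in the ELSE clause). A hedged Python sketch of the same pattern against an append-only file -- this is an illustration of the polling idea, not UniVerse code; `max_idle_polls` is invented here purely so the sketch can terminate:

```python
import time

def follow(path, poll_seconds=0.2, max_idle_polls=None):
    """Yield lines appended to `path`, sleeping when nothing new arrives
    (the fix for a busy READSEQ ... ELSE CONTINUE loop).
    `max_idle_polls` exists only so this sketch can stop by itself."""
    idle = 0
    with open(path, "r") as f:
        while True:
            line = f.readline()
            if line:
                idle = 0
                yield line.rstrip("\n")
            else:
                idle += 1
                if max_idle_polls is not None and idle >= max_idle_polls:
                    return
                time.sleep(poll_seconds)  # avoid spinning when no writer is active
```

Writers then just `>>`-append lines to the file (or print to the "cat - >>" printer George describes) and the follower picks them up on its next poll.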
So if point A opens one end and point B opens the other end, once either end closes, it closes for both sides, and both sides would have to reopen it again to use it.

To eliminate this, you could have one end open a file, and have the other sides do a ">>" append to that file - just make sure you include some kind of data header so the reading side knows which process just wrote the data.

> -----Original Message-----
> From: u2-users-boun...@listserver.u2ug.org
> [mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of u2ug
> Sent: Monday, October 01, 2012 4:11 PM
> To: U2 Users List
> Subject: Re: [U2] [u2] Parallel processing in Universe
>
> pipes
>
> -----Original Message-----
> From: u2-users-boun...@listserver.u2ug.org
> [mailto:u2-users-boun...@listserver.u2ug.org] On Behalf Of Wjhonson
> Sent: Monday, October 01, 2012 4:05 PM
> To: u2-users@listserver.u2ug.org
> Subject: [U2] [u2] Parallel processing in Universe
>
> What's the largest dataset in the Universe user world, in terms of
> number of records?
>
> I'm wondering if we have any potential for utilities that map-reduce.
> I suppose you would spawn phantoms, but how do they communicate back to
> the master node?
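The "shared append file with a data header" idea above can be sketched concretely: each writer prefixes its lines with an ID, and the reader demultiplexes by that header. This is a Python illustration of the pattern, not UniVerse code, and the `"PID|payload"` header format is invented for the example; real concurrent appends would also need line-sized writes or locking, which this sketch skips:

```python
# Sketch of the "shared append file with a data header" idea:
# each writer tags its lines so the reader can tell which
# phantom wrote what. The "id|payload" format is invented here.

def write_record(path: str, writer_id: str, payload: str) -> None:
    with open(path, "a") as f:          # ">>"-style append
        f.write(f"{writer_id}|{payload}\n")

def read_records(path: str) -> dict:
    """Group payloads by the writer ID in each line's header."""
    by_writer: dict = {}
    with open(path) as f:
        for line in f:
            writer_id, _, payload = line.rstrip("\n").partition("|")
            by_writer.setdefault(writer_id, []).append(payload)
    return by_writer
```

Unlike a pipe, the file survives any single writer closing it, so phantoms can come and go while the master keeps reading.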
_______________________________________________
U2-Users mailing list
U2-Users@listserver.u2ug.org
http://listserver.u2ug.org/mailman/listinfo/u2-users