On 6 Mar 2007, at 11:29, Richard Miller wrote:

Here is what we've seen from our experience. I can't explain why this is happening, but repeated tests show that it does.

1) Using the blocking form (put url xx into url yy), we found that a foreground file transfer that might take 30 seconds would be affected by a pendingMessage file transfer scheduled to occur sometime during the time frame of that foreground transfer.


I see a potential problem here, but it depends on what you're doing in the file transfer pendingMessage handler. When using blocking calls (get url, put ... into url), only one transfer can be processed at a time by libUrl. If you attempt to start another transfer before the previous one completes, you will get an error in the result. Depending on how your file transfer handler handles errors (can it distinguish between an error in a currently uploading transfer and one that has been called before the previous one completes?), it could get messy.

I don't think using the pendingMessages queue to schedule blocking transfers is the best approach. Typically, you would wait until one transfer completes, and then start the next one.

If you know the files to be transferred when you start the procedure, I would do something like this (error checking points only indicated):

on fileTransfer pFileList

   repeat for each line tFile in pFileList
      put <whatever> into tUrl ## your url setting routine here
      put url ("binfile:" & tFile) into tData
      ## check the result here in case of local file error
      repeat 5 times ## or however many times you want to try
         put false into tFailed
         put tData into url tUrl
         if the result is empty then
            exit repeat
         else
            put true into tFailed
         end if
      end repeat
      if tFailed then
         ## perhaps record the file name
         ## and show a list of failed uploads later
      end if
   end repeat

end fileTransfer
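
For completeness, a hypothetical call site might look like the following; the button handler and the field name "fileList" are just for illustration, not something from your stack:

on mouseUp
   ## hand the handler a return-delimited list of local paths,
   ## here taken from a hypothetical field with one path per line
   fileTransfer field "fileList"
end mouseUp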

If you don't know the files in advance, then I'd add them to a queue (perhaps a custom property, a global, or a script-local variable). Then have the transfer routine check the queue when it finishes a transfer.

local sFileQ

on startTransferRoutine ## called only once, perhaps at startup
   send "transferFile" to me in 0 milliseconds
end startTransferRoutine

on qFile pFile ## adds file to queue
    if pFile <> empty then
        put line 1 of pFile & cr after sFileQ
    end if
end qFile

on transferFile
   if the number of lines of sFileQ > 0 then
      ## pull the next non-empty entry off the queue
      repeat until tFile <> empty or sFileQ is empty
         put line 1 of sFileQ into tFile
         delete line 1 of sFileQ
      end repeat
      if tFile <> empty then
         put <whatever> into tUrl ## your url setting routine here
         put url ("binfile:" & tFile) into tData
         ## check the result here in case of local file error
         repeat 5 times ## or however many times you want to try
            put false into tFailed
            put tData into url tUrl
            if the result is empty then
               exit repeat
            else
               put true into tFailed
            end if
         end repeat
         if tFailed then
            ## perhaps record the file name
            ## and show a list of failed uploads later
         end if
      end if
   end if
   send "transferFile" to me in 1000 milliseconds
end transferFile
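
And again just as an illustration, you might wire the queue version up something like this (the preOpenStack and answer file bits are assumptions about how the files arrive, not part of the routine above):

on preOpenStack
   startTransferRoutine ## start the polling loop once
end preOpenStack

on mouseUp
   answer file "Choose a file to upload:"
   if it <> empty then qFile it ## add the chosen path to the queue
end mouseUp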


The server (under its default settings) rejects subsequent attempts unless we unload the url,

For non-blocking calls, you must unload the url, even after an error.
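
To expand on that a little, here's a rough sketch of the non-blocking pattern as I understand it; the handler names are mine, but libUrlFtpUploadFile, the urlStatus and unload url are the standard libUrl pieces (check the docs for the exact behaviour in your version):

on uploadNext pFilePath, pUrl
   ## non-blocking upload; "uploadDone" is my own callback name
   libUrlFtpUploadFile pFilePath, pUrl, "uploadDone"
end uploadNext

on uploadDone pUrl
   if the urlStatus of pUrl is not "uploaded" then
      ## the urlStatus will contain "error ..." or "timeout" here
   end if
   ## always unload, even after an error, or the next attempt
   ## to the same server may be rejected
   unload url pUrl
end uploadDone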


send a QUIT command, and wait some period of time before starting the next attempt.

That doesn't sound too good.
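
If you really did need to force the connection closed that way, I think libUrl's libUrlFtpCommand function is the place to do it, along these lines (treat this as a sketch rather than a recommendation):

function ftpQuit pHost, pUser, pPassword
   ## send a raw QUIT on the FTP control connection and
   ## return the server's reply
   return libUrlFtpCommand("QUIT", pHost, pUser, pPassword)
end ftpQuit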

Cheers
Dave