On Wed, Apr 14, 2010 at 5:40 AM, Chris Coggins <cacogg...@cox.net> wrote:
> Data supplied randomly throughout the day from one company division is
> processed on only one server in the company. One of the files the
> processing script generates is an XML file that is needed to supply
> current data to a separate group of users in another part of the company
> via a Flash application in the users' browsers. However, the end users
> never access this server; they only have permission to access another
> server elsewhere in the network. And the latest versions of both IE and
> Flash have made cross-server access to data very difficult. Therefore I
> need to get this XML file onto the server that they access so that their
> Flash application can use it, and I'm not able to just write the XML file
> to that server using the script.

Have you tried using something like ssh+rsync or scp (ssh copy) from
the source server (i.e., the process that generates the file could
"export" it)? If you set up public key authentication then the script
wouldn't need to worry about authentication (though you would need to
make sure that the private key was properly secured). Does the
destination server have an SSH daemon running?
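
For example, if scp is available, the end of the processing script could
push the file over itself. This is just a sketch with made-up host and
path names ("appserver" and "/var/www/data/"), assuming key-based auth is
already set up between the two machines:

  use strict;
  use warnings;

  # Hypothetical paths -- substitute the real generated file and the
  # document root on the destination server.
  my $xml  = '/path/to/xmlfile.xml';
  my $dest = 'someuser@appserver:/var/www/data/';

  # scp authenticates with the key pair, so no password prompt
  # interrupts the unattended run.
  system('scp', '-q', $xml, $dest) == 0
      or warn "scp to $dest failed: $?\n";

rsync over ssh could do the same thing, and would skip the transfer
entirely when the file hasn't changed.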

> Personalized in the sense that the XML file contains the absolute latest
> up-to-the-minute data, which is necessary for the users to finish their
> tasks.

How often is it updated? Every X seconds, minutes, ... hours? How
often will users hit the application? It would seem wasteful to fetch
it from the remote server every time if it isn't changing (I'm
unfamiliar with LWP::Simple, but it doesn't seem to mention caching in
the perldoc), especially if many users are hitting it often. At the
same time, if it's changing too often, you might end up having issues
trying to push it from the source server (among other things,
synchronization issues). It might be over my head, but I'm just trying
to throw some ideas out there. :)
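
That said, LWP::Simple does appear to have a mirror() function that sends
an If-Modified-Since header and only rewrites the local file when the
remote copy is newer, so if you stick with pulling from the destination
side it would at least avoid re-downloading unchanged data. A minimal
sketch, using the same placeholder URL and path as your script below:

  use strict;
  use warnings;
  use LWP::Simple qw(mirror);

  my $url  = 'http://domain.com/dir1/xmlfile.xml';
  my $file = '../filename.xml';

  # mirror() does a conditional GET and returns the HTTP status code;
  # 304 means the local copy was already current and was left untouched.
  my $rc = mirror($url, $file);
  if ($rc == 304) {
      # nothing new since the last fetch
  }
  elsif ($rc >= 400) {
      warn "mirror of $url failed with HTTP status $rc\n";
  }

That still doesn't help with many users triggering fetches at once,
though.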

> I've managed to do what I needed to do by forcing the end user to launch a
> local script that copies the files across the network every time they need
> to access the page that uses the data. The script goes like this:
>
> #!/usr/bin/perl
> use CGI;
> use LWP::Simple qw(getstore);
>
>  my $url   = "http://domain.com/dir1/xmlfile.xml";
>  my $url1  = "http://domain.com/dir1/jsfile.js";
>  my $file  = "../filename.xml";
>  my $file1 = "../filename.js";
>
>  getstore($url, $file);
>  getstore($url1, $file1);
>
> then prints the actual location page to the browser.

Do you mean they go to a copy page and then go on to the processing page
(i.e., the Flash application), or does the copy page become the Flash
application page? Assuming you can't push the file from the source
server, could you use an asynchronous request from Flash to the copy
script on the destination server instead of having the users jump
around? Then, when that request returns successfully, the application
could request the XML data.
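
For instance (just a sketch, using the same placeholder URLs and relative
paths as your script, and assuming Flash can reach this CGI on the
destination server), the copy script could return a tiny status body that
the Flash side waits for before loading the XML:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use CGI;
  use LWP::Simple qw(getstore);

  my $q  = CGI->new;
  my $rc = getstore('http://domain.com/dir1/xmlfile.xml', '../filename.xml');
  getstore('http://domain.com/dir1/jsfile.js', '../filename.js');

  # Flash requests this script first (URLLoader or similar), checks the
  # body, and only then requests filename.xml from this same server.
  print $q->header('text/plain');
  print( ($rc >= 200 && $rc < 300) ? "OK\n" : "FAIL $rc\n" );

That keeps everything on the server the users already have access to, so
the browser never has to cross servers itself.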

I'm just trying to get an idea of the whole picture. :)

-- 
Brandon McCaig <bamcc...@gmail.com>
V zrna gur orfg jvgu jung V fnl. Vg qbrfa'g nyjnlf fbhaq gung jnl.
Castopulence Software <http://www.castopulence.org/> <bamcc...@castopulence.org>
