Thomas Bätzler wrote:
Chris Coggins <cacogg...@cox.net> asked:
I need to copy a small config file from one server to another. I'm
generating the file on one server with a pretty complex script, and I need
to put that file on another server for immediate user access via HTML.
I've tried just writing the file straight to the other server over the
network using absolute paths, but it doesn't work. I'm thinking I can
accomplish this instead through a Perl-generated HTML page, by having the
script copy the file from its source server and put it in place locally
for use within the web page.

What would the code look like to do this?

Let's see if I got this right: you have one server where some file is
generated, and another one from which said file is to be downloaded.

Data supplied randomly throughout the day by one company division is processed on a single server. One of the files the processing script generates is an XML file that supplies current data to a separate group of users in another part of the company via a Flash application in the user's browser. However, those end users never access the processing server; they only have permission to access another server elsewhere on the network. On top of that, the latest versions of both IE and Flash have made cross-server access to data very difficult. So I need to get this XML file onto the server the users do access, so that their Flash application can use it, and I'm not able to simply write the XML file to that server from the script.
Is this file personalized for each user, i.e. must arguments be passed from 
front- to backend?

Personalized in the sense that the XML file contains the absolute latest, up-to-the-minute data, which the users need in order to finish their tasks.


If the generation of the file is "complex", can it be produced on demand or do 
we have to implement some kind of waiting mechanism that keeps the user's browser 
connected to the frontend while the file is generated?

I've managed to do what I needed by having the end user launch a local script that copies the files across the network every time they need to access the page that uses the data. The script goes like this:

#!/usr/bin/perl
use strict;
use warnings;
use CGI;
use LWP::Simple qw(getstore);

# URLs of the freshly generated files on the processing server
my $url  = "http://domain.com/dir1/xmlfile.xml";
my $url1 = "http://domain.com/dir1/jsfile.js";

# local paths on the web server the users are allowed to reach
my $file  = "../filename.xml";
my $file1 = "../filename.js";

# getstore() fetches each URL and writes it to the local file
getstore($url, $file);
getstore($url1, $file1);

The script then prints the actual destination page to the browser.
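
That final print/redirect step isn't shown above, so here is a minimal sketch of how it might be rounded out, with an optional check of getstore()'s return status (is_success() also comes from LWP::Simple); the redirect target URL is just a placeholder, not taken from the post:

#!/usr/bin/perl
# Sketch only: one way to finish the "copy the files, then show the page" step.
# The redirect target below is a placeholder, not a URL from the original post.
use strict;
use warnings;
use CGI;
use LWP::Simple qw(getstore is_success);

my $q = CGI->new;

# fetch the freshly generated files from the processing server
my $rc  = getstore("http://domain.com/dir1/xmlfile.xml", "../filename.xml");
my $rc1 = getstore("http://domain.com/dir1/jsfile.js",   "../filename.js");

if (is_success($rc) && is_success($rc1)) {
    # hand the browser off to the page that uses the data (placeholder URL)
    print $q->redirect("http://domain.com/dir2/data_page.html");
}
else {
    # tell the user the copy failed instead of serving stale data
    print $q->header("text/html");
    print "<html><body><p>Could not refresh the data files.</p></body></html>\n";
}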
Cheers,
Thomas

