Why not use something like VMware Converter to make a full clone of
the server, then run all testing within a test lab that has no
access to the outside world?
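For example, once the clone is running, the lab host could be locked down so a scan physically cannot leave the lab. A minimal sketch using iptables (the subnet is an assumed placeholder for your lab network, and this needs root):

```shell
# Assumed lab lockdown: default-deny outbound, then allow only loopback
# and the lab subnet (10.0.0.0/24 is a placeholder for your lab range).
iptables -P OUTPUT DROP                      # drop all outbound by default
iptables -A OUTPUT -o lo -j ACCEPT           # keep loopback working
iptables -A OUTPUT -d 10.0.0.0/24 -j ACCEPT  # allow traffic to the lab only
```

With that in place, even a misconfigured scanner pointed at the prod address simply gets no route out.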

But as noted above, depending on what the end-user needs the tests
carried out for, you may still have to run them against the live site.


BaconZombie

LOAD "*",8,1

On 31 October 2012 16:41, Barry Von Ahsen <[email protected]> wrote:
> wget has a convert option:
>
> -k
>        --convert-links
>            After the download is complete, convert the links in the document 
> to make them suitable for local
>            viewing.
>
>
> it mentions absolute links, but not fully-qualified links, so YMMV
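A mirroring run along those lines might look like the following (the URL is a placeholder; note that `--convert-links` only rewrites links to pages that were actually downloaded, so links to unfetched pages can still point at the original host, hence the YMMV):

```shell
# Sketch: mirror a site for offline viewing; www.example.com is a placeholder.
# --page-requisites pulls CSS/images, --convert-links rewrites downloaded links
# for local viewing, --no-parent keeps the crawl from wandering upward.
wget --mirror --page-requisites --convert-links --adjust-extension \
     --no-parent http://www.example.com/
```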
>
>
> -barry
>
>
>
> On Oct 31, 2012, at 11:28 AM, Arch Angel wrote:
>
>> Have you considered grepping through the files for the hard-coded
>> portion of the code that is the same across the board, and changing it to
>> the location of the dev box?  It might take a nix box an hour or so but should
>> get nearly all of them.  The only true way to be sure is to search all the code
>> line by line.  If it is that important that you don't hit the prod box I
>> would spend the time and effort to do so.
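A sketch of that grep-and-rewrite pass (the hostnames and sample page are placeholders, and `sed -i` edits files in place, so run it on a copy of the site):

```shell
# Set up a sample page with a hard-coded prod link (placeholder names).
mkdir -p devcopy
printf '<a href="http://www.example.com/login">login</a>\n' > devcopy/index.html

# Find every file containing the prod hostname and rewrite it to the dev host.
grep -rl 'www\.example\.com' devcopy | xargs sed -i 's/www\.example\.com/dev.example.com/g'

# Spot-check that the prod hostname is gone.
grep 'dev.example.com' devcopy/index.html
```

On BSD/macOS sed, `-i` requires an explicit (possibly empty) suffix argument, e.g. `sed -i ''`.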
>>
>> Also a year or so back Paul and Larry talked about a way to clone a site
>> and pull it down to a local machine and run all the pages locally, but I
>> can't recall exactly what they did.  Sorry, hopefully Bugbear, Paul, or
>> Larry can recall that one to help you.  My old age and all, I don't recall
>> as well as I used to.  What were we discussing anyway :-)
>>
>> - Robert
>> (arch3angel)
>>
>> On Wed, Oct 31, 2012 at 12:15 PM, Patrick Laverty <[email protected]
>>> wrote:
>>
>>> Ok, newbie here...
>>>
>>> I was asked to scan a web site that we were told is vulnerable. So I'm
>>> copying the site over to my Dev server and each time I manually click
>>> on links, I see it sends my request to production. I went through the
>>> .htaccess file and changed everything to point to my Dev server. It
>>> still goes to prod. I dig in a little further and sure enough, most of
>>> the links in the hundreds of pages are hardcoded to the prod site.
>>>
>>> What's the safest way to get around this? Set the /etc/hosts file on
>>> my scanning machine to point to my Dev server? I want to make 100%
>>> sure that my scan never hits the production server.
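The hosts-file approach could look like the following (hostname and dev IP are placeholders); one caveat is that it only protects tools that resolve names through the system resolver, so a scanner configured with the prod IP directly would still bypass it:

```shell
# Point the prod hostname at the dev server on the scanning machine
# (192.0.2.10 and www.example.com are assumed placeholders).
echo '192.0.2.10  www.example.com' | sudo tee -a /etc/hosts

# Verify the override resolves to the dev box before scanning.
getent hosts www.example.com
```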
>>>
>>> Suggestions?
>>>
>>> Thank you.
>>> _______________________________________________
>>> Pauldotcom mailing list
>>> [email protected]
>>> http://mail.pauldotcom.com/cgi-bin/mailman/listinfo/pauldotcom
>>> Main Web Site: http://pauldotcom.com
>>>
>
>



