Re: Finding porn links in hacked web pages

2008-01-28 Thread Omer Zak
1. When your Web site contains a blog (which may not be a problem for
Geoff's friend's Web site), the local copy upload method is not
feasible unless it is designed to skip the blog part (see the sketch
after point 2).

2. The local copy upload method does not alert you when vandalism has
actually occurred.
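
A minimal sketch of such a skip, assuming rsync is what does the upload
and with a made-up blog path:

  rsync --archive --delete --exclude 'blog/' \
      /home/me/site/ user@webserver:/var/www/site/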

On Mon, 2008-01-28 at 09:48 +0200, Shahar Dag wrote:
 Hi
 
 I would prefer to maintain a local copy of the web site and, once a day
 (using cron), upload it to the web server
 (or, even better, maintain an SVN server that holds the local copy of the site)
 
 Shahar
 - Original Message - 
 From: Omer Zak [EMAIL PROTECTED]
 To: linux-il linux-il@cs.huji.ac.il
 Sent: Monday, January 28, 2008 9:15 AM
 Subject: Re: Finding porn links in hacked web pages
 
 
  The method which I use is to:
  1. Perform periodic backup of the entire Web site, including SQL dumps
  of any databases driving it.
  2. Download the backup files to a PC.
  3. Unpack them (the files into a subdirectory and the SQL dumps into a
  new DB instance, respectively).
  4. Run 'diff' between the unpacked files and the previous backup.
 
  For regular files, use 'diff'.  For comparing two MySQL DBs, I use a
  Python script that I wrote.
--- Omer
 
  On Mon, 2008-01-28 at 09:03 +0200, Geoffrey S. Mendelson wrote:
  Yesterday my wife went to a perfectly normal web page and after
  a few seconds a porn page replaced it.
 
  I looked at the HTML page source and found that at the bottom of the
  page were hundreds of links, which did not belong there. I called
  the publisher of the page, and he determined that his server had been
  hacked and the links added.
 
  He is not technically inclined at all, and does not have the ability
  to check his pages without going to each one in a browser and looking
  at the page source. He has thousands of pages and runs the site as
  a Jewish news site, with no income.
 
  I was thinking that I could write a program that scans each of his
  web pages using wget or lynx to download them, but don't want to
  start writing code if it has already been done.
 
  Any suggestions?
-- 
MS-Windows is the Pal-Kal of the PC world.
My own blog is at http://www.zak.co.il/tddpirate/

My opinions, as expressed in this E-mail message, are mine alone.
They do not represent the official policy of any organization with which
I may be affiliated in any way.
WARNING TO SPAMMERS:  at http://www.zak.co.il/spamwarning.html





Re: Finding porn links in hacked web pages

2008-01-28 Thread Jonathan Ben Avraham

Hi Geoffrey,
I think the problem here is a business and ethical issue and not a 
technical issue. The technical reality is as Omer states. That is, it 
takes time and technical ability (in other words, money) to keep web sites 
safe. Your friend needs to understand this and either find the money 
required to maintain his site properly or shut it down. He might 
also consider merging his service into a site that has the resources to 
look after itself and its users.

Regards,

 - yba


On Mon, 28 Jan 2008, Omer Zak wrote:


Date: Mon, 28 Jan 2008 09:15:57 +0200
From: Omer Zak [EMAIL PROTECTED]
To: linux-il linux-il@cs.huji.ac.il
Subject: Re: Finding porn links in hacked web pages

The method which I use is to:
1. Perform periodic backup of the entire Web site, including SQL dumps
of any databases driving it.
2. Download the backup files to a PC.
3. Unpack them (the files into a subdirectory and the SQL dumps into a
new DB instance, respectively).
4. Run 'diff' between the unpacked files and the previous backup.

For regular files, use 'diff'.  For comparing two MySQL DBs, I use a
Python script that I wrote.
  --- Omer

On Mon, 2008-01-28 at 09:03 +0200, Geoffrey S. Mendelson wrote:

Yesterday my wife went to a perfectly normal web page and after
a few seconds a porn page replaced it.

I looked at the HTML page source and found that at the bottom of the
page were hundreds of links, which did not belong there. I called
the publisher of the page, and he determined that his server had been
hacked and the links added.

He is not technically inclined at all, and does not have the ability
to check his pages without going to each one in a browser and looking
at the page source. He has thousands of pages and runs the site as
a Jewish news site, with no income.

I was thinking that I could write a program that scans each of his
web pages using wget or lynx to download them, but don't want to
start writing code if it has already been done.

Any suggestions?





--
 EE 77 7F 30 4A 64 2E C5  83 5F E7 49 A6 82 29 BA~. .~   Tk Open Systems
=}ooO--U--Ooo{=
 - [EMAIL PROTECTED] - tel: +972.2.679.5364, http://www.tkos.co.il -




Re: Finding porn links in hacked web pages

2008-01-28 Thread Nadav Har'El
On Mon, Jan 28, 2008, Geoffrey S. Mendelson wrote about "Finding porn links in
hacked web pages":
 He is not technically inclined at all, and does not have the ability
 to check his pages without going to each one in a browser and looking
 at the page source. He has thousands of pages and runs the site as
 a Jewish news site, with no income.
 
 I was thinking that I could write a program that scans each of his
 web pages using wget or lynx to download them, but don't want to 
 start writing code if it has already been done.

If this guy is the only one changing his content, what I would do is run
a trivial script on a remote machine: every day (or whatever), fetch the
entire content of the site (with wget), compare (with cmp) the new content
to the previous content, and finally email or SMS this guy the number of
modified files. If he knows that he modified one page, and got a mail saying
one page changed, he's safe. If he changed nothing and got a message that
100 pages changed, he knows he has a big problem.
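
Something along these lines would do (an untested sketch; the site, paths,
and address are made up, and the baseline copy has to be seeded by hand
before the first run):

  #!/bin/sh
  # Daily check: mirror the site, count files that differ from
  # yesterday's mirror, and mail the owner the count.
  SITE=http://www.example.com/
  OLD=/var/checks/site.prev   # seeded by hand before the first run
  NEW=/var/checks/site.new
  OWNER=owner@example.com

  rm -rf "$NEW"
  wget -q -r -nH -P "$NEW" "$SITE"

  # diff -rq prints one line per added, removed, or modified file.
  CHANGED=$(diff -rq "$OLD" "$NEW" | wc -l)
  echo "$CHANGED files changed on $SITE" | mail -s "site check" "$OWNER"

  # Today's mirror becomes tomorrow's baseline.
  rm -rf "$OLD"
  mv "$NEW" "$OLD"
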
I don't think that scanning for porn links will work; how will you know
that these are porn links? And what happens the next time his site is
cracked, when the cracker doesn't add porn links but does something else?

During the dot-com boom, I remember an Israeli startup whose business was
exactly this: detecting that a site has been defaced, using remote servers
that constantly download pages from the site and notice when something
has changed. Unfortunately, I can't recall the company's name now.


-- 
Nadav Har'El|  Monday, Jan 28 2008, 21 Shevat 5768
[EMAIL PROTECTED] |-
Phone +972-523-790466, ICQ 13349191 |A messy desk is a sign of a messy mind.
http://nadav.harel.org.il   |An empty desk is a sign of an empty mind.




Re: Finding porn links in hacked web pages

2008-01-28 Thread Tom Rosenfeld
Hi Geoff,
All of these comparison suggestions are fine, but they miss the point. If
the site is hacked, the hacker can come back every day, or every hour, and
reinstall his links. You can be sure he already has an automated process.

You need to find the source of the break-in and then plug it. After that, a
comparison script will be useful to alert you to new problems.

-tom


On Jan 28, 2008 10:32 AM, Nadav Har'El [EMAIL PROTECTED] wrote:

 On Mon, Jan 28, 2008, Geoffrey S. Mendelson wrote about "Finding porn
 links in hacked web pages":
  He is not technically inclined at all, and does not have the ability
  to check his pages without going to each one in a browser and looking
  at the page source. He has thousands of pages and runs the site as
  a Jewish news site, with no income.
 
  I was thinking that I could write a program that scans each of his
  web pages using wget or lynx to download them, but don't want to
  start writing code if it has already been done.

 If this guy is the only one changing his content, what I would do is run
 a trivial script on a remote machine: every day (or whatever), fetch the
 entire content of the site (with wget), compare (with cmp) the new content
 to the previous content, and finally email or SMS this guy the number of
 modified files. If he knows that he modified one page, and got a mail saying
 one page changed, he's safe. If he changed nothing and got a message that
 100 pages changed, he knows he has a big problem.
 I don't think that scanning for porn links will work; how will you know
 that these are porn links? And what happens the next time his site is
 cracked, when the cracker doesn't add porn links but does something else?

 During the dot-com boom, I remember an Israeli startup whose business was
 exactly this: detecting that a site has been defaced, using remote servers
 that constantly download pages from the site and notice when something
 has changed. Unfortunately, I can't recall the company's name now.


 --
 Nadav Har'El|  Monday, Jan 28 2008, 21 Shevat 5768
 [EMAIL PROTECTED] |-
 Phone +972-523-790466, ICQ 13349191 |A messy desk is a sign of a messy mind.
 http://nadav.harel.org.il   |An empty desk is a sign of an empty mind.





-- 
-tom
054-244-8025


Re: Finding porn links in hacked web pages

2008-01-28 Thread Nadav Har'El
On Mon, Jan 28, 2008, Tom Rosenfeld wrote about "Re: Finding porn links in
hacked web pages":
 Hi Geoff,
 All of these comparison suggestions are fine, but they miss the point. If
 the site is hacked, the hacker can come back every day, or every hour, and
 reinstall his links. You can be sure he already has an automated process.
 
 You need to find the source of the break-in and then plug it. After that, a
 comparison script will be useful to alert you to new problems.

You're right that knowing that your site has been defaced is not a complete
defense: It doesn't prevent your site from getting cracked in the first
place, and it doesn't prevent your secret data from being stolen. It also
doesn't prevent the cracker from cracking your site again after you've
(thought that you) fixed it.

But what it does give you is some level of protection against fadichot
(the English word "embarrassments" isn't strong enough for that :-)): it
protects your site from serving thousands of its users embarrassing content
like porn links or a statement like THIS SITE HAS BEEN HACKED, or, worse,
serving trojaned software to people who download software from you. It gives
you the opportunity to recognize the situation as soon as possible, and
at least take the site offline to prevent further embarrassment.

Of course, the trivial technique I suggested will only work for rarely
edited static sites. On sites which are heavily edited by many people, and
on dynamic sites, it is much harder for any automatic software to
figure out which changes were legitimate and which were made by crackers.


-- 
Nadav Har'El|  Monday, Jan 28 2008, 21 Shevat 5768
[EMAIL PROTECTED] |-
Phone +972-523-790466, ICQ 13349191 |How long a minute depends on what side of
http://nadav.harel.org.il   |the bathroom door you're on.




Re: Finding porn links in hacked web pages

2008-01-28 Thread Shahar Dag

Hi

I would prefer to maintain a local copy of the web site and, once a day
(using cron), upload it to the web server

(or, even better, maintain an SVN server that holds the local copy of the site)
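
For example, the cron variant can be as small as this (a sketch; the paths
and host are made up, and it assumes rsync over ssh):

  #!/bin/sh
  # Nightly push of the authoritative local copy to the server.
  # --delete removes files a cracker added on the server side, and
  # --checksum overwrites files whose content was changed there.
  rsync --archive --delete --checksum \
      /home/me/site/ user@webserver:/var/www/site/

(With the SVN variant, a cron job on the server would instead run
'svn revert -R .' followed by 'svn update' in the working copy, throwing
away anything that does not match the repository.)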

Shahar
- Original Message - 
From: Omer Zak [EMAIL PROTECTED]

To: linux-il linux-il@cs.huji.ac.il
Sent: Monday, January 28, 2008 9:15 AM
Subject: Re: Finding porn links in hacked web pages



The method which I use is to:
1. Perform periodic backup of the entire Web site, including SQL dumps
of any databases driving it.
2. Download the backup files to a PC.
3. Unpack them (the files into a subdirectory and the SQL dumps into a
new DB instance, respectively).
4. Run 'diff' between the unpacked files and the previous backup.

For regular files, use 'diff'.  For comparing two MySQL DBs, I use a
Python script that I wrote.
  --- Omer

On Mon, 2008-01-28 at 09:03 +0200, Geoffrey S. Mendelson wrote:

Yesterday my wife went to a perfectly normal web page and after
a few seconds a porn page replaced it.

I looked at the HTML page source and found that at the bottom of the
page were hundreds of links, which did not belong there. I called
the publisher of the page, and he determined that his server had been
hacked and the links added.

He is not technically inclined at all, and does not have the ability
to check his pages without going to each one in a browser and looking
at the page source. He has thousands of pages and runs the site as
a Jewish news site, with no income.

I was thinking that I could write a program that scans each of his
web pages using wget or lynx to download them, but don't want to
start writing code if it has already been done.

Any suggestions?


--
MS-Windows is the Pal-Kal of the PC world.
My own blog is at http://www.zak.co.il/tddpirate/

My opinions, as expressed in this E-mail message, are mine alone.
They do not represent the official policy of any organization with which
I may be affiliated in any way.
WARNING TO SPAMMERS:  at http://www.zak.co.il/spamwarning.html









Finding porn links in hacked web pages

2008-01-27 Thread Geoffrey S. Mendelson
Yesterday my wife went to a perfectly normal web page and after
a few seconds a porn page replaced it. 

I looked at the HTML page source and found that at the bottom of the
page were hundreds of links, which did not belong there. I called
the publisher of the page, and he determined that his server had been
hacked and the links added. 

He is not technically inclined at all, and does not have the ability
to check his pages without going to each one in a browser and looking
at the page source. He has thousands of pages and runs the site as
a Jewish news site, with no income.

I was thinking that I could write a program that scans each of his
web pages using wget or lynx to download them, but don't want to 
start writing code if it has already been done.
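
Roughly, I have in mind something like this (an untested sketch; the URL is
an example):

  #!/bin/sh
  # Mirror the site, then list every absolute link target with a count,
  # so a batch of injected links would float to the top of the list.
  wget -q -r -nH -P /tmp/site http://www.example.com/
  grep -rhoE 'href="https?://[^"]*"' /tmp/site | sort | uniq -c | sort -rn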

Any suggestions? 

Thanks in advance, Geoff.

-- 
Geoffrey S. Mendelson, Jerusalem, Israel [EMAIL PROTECTED]  N3OWJ/4X1GM
IL Voice: (07)-7424-1667 U.S. Voice: 1-215-821-1838 
Visit my 'blog at http://geoffstechno.livejournal.com/




Re: Finding porn links in hacked web pages

2008-01-27 Thread Omer Zak
The method which I use is to:
1. Perform periodic backup of the entire Web site, including SQL dumps
of any databases driving it.
2. Download the backup files to a PC.
3. Unpack them (the files into a subdirectory and the SQL dumps into a
new DB instance, respectively).
4. Run 'diff' between the unpacked files and the previous backup.

For regular files, use 'diff'.  For comparing two MySQL DBs, I use a
Python script that I wrote.
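
As a rough sketch, steps 2-4 could look like this (host, paths, and DB names
are examples; the DB step here just diffs two mysqldump outputs instead of
using the script):

  #!/bin/sh
  # Step 2: fetch the latest backup from the server.
  scp user@webserver:/backups/site-latest.tar.gz /tmp/
  # Step 3: unpack it next to the tree from the previous backup.
  mkdir -p /tmp/site-new
  tar -xzf /tmp/site-latest.tar.gz -C /tmp/site-new
  # Step 4: compare the two trees.
  diff -r /tmp/site-old /tmp/site-new
  # DB comparison via dumps; -I '^--' ignores the comment lines,
  # which carry a timestamp that would always differ.
  mysqldump sitedb_old > /tmp/old.sql
  mysqldump sitedb_new > /tmp/new.sql
  diff -I '^--' /tmp/old.sql /tmp/new.sql
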
   --- Omer

On Mon, 2008-01-28 at 09:03 +0200, Geoffrey S. Mendelson wrote:
 Yesterday my wife went to a perfectly normal web page and after
 a few seconds a porn page replaced it. 
 
 I looked at the HTML page source and found that at the bottom of the
 page were hundreds of links, which did not belong there. I called
 the publisher of the page, and he determined that his server had been
 hacked and the links added. 
 
 He is not technically inclined at all, and does not have the ability
 to check his pages without going to each one in a browser and looking
 at the page source. He has thousands of pages and runs the site as
 a Jewish news site, with no income.
 
 I was thinking that I could write a program that scans each of his
 web pages using wget or lynx to download them, but don't want to 
 start writing code if it has already been done.
 
 Any suggestions? 

-- 
MS-Windows is the Pal-Kal of the PC world.
My own blog is at http://www.zak.co.il/tddpirate/

My opinions, as expressed in this E-mail message, are mine alone.
They do not represent the official policy of any organization with which
I may be affiliated in any way.
WARNING TO SPAMMERS:  at http://www.zak.co.il/spamwarning.html

