Re: OT: Website protection

2009-07-13 Thread schmero...@gmail.com

Thanks for the advice.

Rick Macdougall wrote:

Mikael Bak wrote:

schmero...@gmail.com wrote:

One of our client's websites gets hacked frequently - 1x per month -
usually with some kind of phishing scam.



We've also had some problems lately. After deep investigations we saw
that in 100% of the cases there were no break-ins at all. Not in the old
fashioned manner anyway. The ftp usernames and passwords were stolen
from the client's PC with keylogger or spyware. The hacker could then
log in to the ftp account and make changes to the website.



I've seen this myself on three different client machines (each hosting 
multiple sites). I have yet to discover what spyware was responsible as 
the owners of the different sites contacted the users in question 
themselves.


Regards,

Rick



Re: OT: Website protection

2009-07-12 Thread Rick Macdougall

Mikael Bak wrote:

schmero...@gmail.com wrote:

One of our client's websites gets hacked frequently - 1x per month -
usually with some kind of phishing scam.



We've also had some problems lately. After deep investigations we saw
that in 100% of the cases there were no break-ins at all. Not in the old
fashioned manner anyway. The ftp usernames and passwords were stolen
from the client's PC with keylogger or spyware. The hacker could then
log in to the ftp account and make changes to the website.



I've seen this myself on three different client machines (each hosting 
multiple sites). I have yet to discover what spyware was responsible as 
the owners of the different sites contacted the users in question 
themselves.


Regards,

Rick



Re: OT: Website protection

2009-07-12 Thread Mikael Bak
schmero...@gmail.com wrote:
> One of our client's websites gets hacked frequently - 1x per month -
> usually with some kind of phishing scam.
> 

We've also had some problems lately. After deep investigations we saw
that in 100% of the cases there were no break-ins at all. Not in the old
fashioned manner anyway. The ftp usernames and passwords were stolen
from the client's PC with keylogger or spyware. The hacker could then
log in to the ftp account and make changes to the website.

To prevent this: change ftp passwords often and check the client PCs
for viruses. After an incident like this, a security-aware company will
understand the risks of uploading website content from MS Windows
machines. If they can't live with that risk, they have the option to
switch - perhaps only for the machines used for ftp uploads.


Mikael


Re: OT: Website protection

2009-07-11 Thread Benny Pedersen

On Sat, July 11, 2009 14:06, schmero...@gmail.com wrote:

> Any ideas where to look for such a beast &/or a mailing list that deals
> with this type of issue?

Pages and URLs under directories that are writable by the webserver are
always a risk. Removing that write access, where it's possible to do so,
solves the problem.
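
A quick way to audit for this is to walk the web root and flag anything the
webserver account can write to. Here is a minimal Python sketch; the web
root path and the "www-data" account name are assumptions for illustration,
so adjust them to the actual setup:

import grp
import os
import pwd
import stat

WEB_ROOT = "/var/www"   # assumed web root
WEB_USER = "www-data"   # assumed webserver account (often "apache" or "nobody")

pw = pwd.getpwnam(WEB_USER)
uid = pw.pw_uid
gids = {pw.pw_gid} | {g.gr_gid for g in grp.getgrall() if WEB_USER in g.gr_mem}

for dirpath, _dirs, _files in os.walk(WEB_ROOT):
    st = os.stat(dirpath)
    # Writable if the webserver owns it with the owner write bit set,
    # shares a group that has group write, or the directory is world-writable.
    writable = (
        (st.st_uid == uid and st.st_mode & stat.S_IWUSR)
        or (st.st_gid in gids and st.st_mode & stat.S_IWGRP)
        or (st.st_mode & stat.S_IWOTH)
    )
    if writable:
        print("webserver-writable:", dirpath)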

Otherwise, make use of a kernel-based FUSE filesystem where you can deny
writes of ClamAV-scanned files when they match a known virus.
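
Short of a FUSE layer, a similar effect can be approximated at upload time
by handing each new file to a running clamd before accepting it. A minimal
sketch, assuming the third-party "clamd" Python module and a clamd daemon
listening on its default unix socket; the reject_if_infected() helper is
made up for illustration, and clamd itself needs read access to the file:

import clamd

cd = clamd.ClamdUnixSocket()  # assumes clamd is running on its default socket

def reject_if_infected(path):
    """Return True if clamd reports the file as a known virus."""
    # scan() returns a mapping like {path: ("FOUND", "Eicar-Test-Signature")}
    result = cd.scan(path)
    status, signature = result.get(path, ("ERROR", None))
    if status == "FOUND":
        print("rejecting %s: %s" % (path, signature))
        return True
    return False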

Some web apps have built-in ClamAV support, by the way - MediaWiki is a
good example.

My ClamAV uses the Google ClamAV signatures; around 1.9 million sigs here
currently.

-- 
xpoint



Re: OT: Website protection

2009-07-11 Thread SM

At 05:06 11-07-2009, schmero...@gmail.com wrote:
One of our client's websites gets hacked frequently - 1x per month - 
usually with some kind of phishing scam.


I understand their first line of defense is to make sure security is 
tight and systems are up to date, however, it seems to me that there 
must be some scanning utility that would check their site for 
unauthorized pages via a search for domain names.


If they are compromised regularly, they should go to the source of
the problem and fix it.  You could scan the file system to look for
unauthorized files, but you cannot do that for the webpages themselves.
And as the system is compromised, you cannot rely on a scan run from it.


Any ideas where to look for such a beast &/or a mailing list that 
deals with this type of issue?


Search for tripwire.
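
For a rough idea of what tripwire-style integrity checking does, here is a
minimal Python sketch that records SHA-256 hashes of everything under the
web root and reports anything new, changed or deleted on the next run. The
paths are placeholders, and a real setup should keep the baseline off the
(potentially compromised) host or on read-only media:

import hashlib
import json
import os
import sys

WEB_ROOT = "/var/www"                  # assumed web root
BASELINE = "/root/web-baseline.json"   # assumed location for the stored hashes

def snapshot(root):
    hashes = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as fh:
                hashes[path] = hashlib.sha256(fh.read()).hexdigest()
    return hashes

current = snapshot(WEB_ROOT)

if not os.path.exists(BASELINE):
    with open(BASELINE, "w") as fh:
        json.dump(current, fh)
    sys.exit("baseline written; rerun later to compare")

with open(BASELINE) as fh:
    baseline = json.load(fh)

for path in sorted(set(current) | set(baseline)):
    if path not in baseline:
        print("NEW:", path)
    elif path not in current:
        print("DELETED:", path)
    elif current[path] != baseline[path]:
        print("CHANGED:", path)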

Regards,
-sm 



Re: OT: Website protection

2009-07-11 Thread Cedric Knight
schmero...@gmail.com wrote:
>>> So, if our client was google, the utility would search all files on the
>>> site looking for domains. If it found microsoft.com within one of the
>>> pages, an email would be sent to the administrator, who could delete the
>>> page and look for other evidence of being hacked or add microsoft.com to
>>> the whitelist.
>>>
>>> Any ideas where to look for such a beast &/or a mailing list that deals
>>> with this type of issue?

Forgot to mention http://www.unmaskparasites.com/

CK


Re: OT: Website protection

2009-07-11 Thread Cedric Knight
schmero...@gmail.com wrote:
>> One of our client's websites gets hacked frequently - 1x per month -
>> usually with some kind of phishing scam.
>>
>> I understand their first line of defense is to make sure security is
>> tight and systems are up to date, however, it seems to me that there
>> must be some scanning utility that would check their site for
>> unauthorized pages via a search for domain names.
>>
>> So, if our client was google, the utility would search all files on the
>> site looking for domains. If it found microsoft.com within one of the
>> pages, an email would be sent to the administrator, who could delete the
>> page and look for other evidence of being hacked or add microsoft.com to
>> the whitelist.
>>
>> Any ideas where to look for such a beast &/or a mailing list that deals
>> with this type of issue?

Indeed Google "safe browsing" scans pages it indexes looking for IFRAME
exploits, Gumblar etc.

Phishing pages are harder to recognise than links to malware, so Google
largely has to rely on us reporting 'web forgeries'.  Since Google doesn't
automatically list suspicious pages, I assume that no such utility exists
yet.

(Also note that Gumblar and other malware use pretty tedious JavaScript
obfuscation techniques.  So you might want to wget the site or access it
through a browser, rather than just grep through it for suspicious strings.)
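
As a rough cut at the utility the original poster describes, you can fetch
the rendered pages and flag any external domains that are not on a
whitelist. A minimal Python sketch; the start URL and the whitelist are
placeholders, and a real version would crawl the whole site and mail the
report to the administrator:

import re
import urllib.request

START_URL = "http://www.example.com/"           # placeholder for the client site
WHITELIST = {"example.com", "www.example.com"}  # domains the site may legitimately reference

html = urllib.request.urlopen(START_URL).read().decode("utf-8", "replace")

# Very rough: pull host names out of absolute http(s) URLs in the page source.
domains = set(re.findall(r"https?://([A-Za-z0-9.-]+)", html))

for domain in sorted(domains - WHITELIST):
    print("unexpected domain referenced:", domain)

As noted above, fetching the final rendered output (or going through a real
browser) matters when the injected code is obfuscated JavaScript.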

I don't know who is working on phishing detection tools: maybe contact
APWG (antiphishing.org) or your local OWASP chapter.

Phishing scams are, in my experience, often uploaded through insecure
CMS components such as Joomla modules (you can see this when the URI
contains strings like 'mambots/content', as listed in the rules below).

That said, have you done the obvious and checked the FTP logs?  In many
cases the website owner or designer has a keylogger or other agent
stealing FTP credentials, which are then circulated to a botnet to deface
pages:  http://news.zdnet.com/2100-9595_22-306268.html  I've had to ask
people to run at least two up-to-date spyware scans on the Windows PC
they upload content from before the culprit was found.

Also make sure your correct abuse address is listed at abuse.net (and on
WHOIS if appropriate), so e.g. SpamCop reports about spamvertised sites
come to you without delay.

Terry Carmen wrote:
> If you're getting hacked once a month, I suspect the server contains a
> well-known vulnerability that needs to be located and repaired.
> 
> I'd recommend making all content changes on a *really* secure server, then
> replicating the entire web-root to the public web server with rsync, with the
> --delete option enabled.
> 
> Rsync will overwrite any of the "damaged" content with a fresh copy from the
> secure server and remove any "extras", making any unauthorized content changes
> vanish.

I like that suggestion - provided you're not expecting general visitors
to contribute content, you could rsync every 20 mins or so and by the
time the uri is spammed out the malicious content is gone.  The back-end
would be on a firewalled server that is not public-facing.  However, it
doesn't necessarily help if the FTP/SSH/CMS password is weak or
(particularly) has been compromised by malware on a desktop.

These strings in URIs/filenames have seemed to me to be associated with
phishing:

uri PHISH_CGI
/(\/cgi(?!\.ebay\.)|Login(?:Member)?\.do|mambo\/+components|mambots\/content\/|\/smilies|\/uploads|\/\?siteid=|\/aspnet_client|\/(?:includes|_mem_bin|components|classes)\/)/
describe PHISH_CGI  Common phishing destination
score PHISH_CGI 0.05

uri PHISH_CGI2
/\/(?:uploads|files|includes|components|js|mambots|smilies|images)\/.*(?:\.co\.uk|\.com\b|Log[a-z\.0-9-]+\.(?:php|htm))/i
describe PHISH_CGI2 Looks like exploit with "Logon" file
score PHISH_CGI2 0.2

I hope some of this helps.

CK



Re: OT: Website protection

2009-07-11 Thread Terry Carmen

> One of our client's websites gets hacked frequently - 1x per month -
> usually with some kind of phishing scam.
>
> I understand their first line of defense is to make sure security is
> tight and systems are up to date, however, it seems to me that there
> must be some scanning utility that would check their site for
> unauthorized pages via a search for domain names.
>
> So, if our client was google, the utility would search all files on the
> site looking for domains. If it found microsoft.com within one of the
> pages, an email would be sent to the administrator, who could delete the
> page and look for other evidence of being hacked or add microsoft.com to
> the whitelist.
>
> Any ideas where to look for such a beast &/or a mailing list that deals
> with this type of issue?

If you're getting hacked once a month, I suspect the server contains a
well-known vulnerability that needs to be located and repaired.

I'd recommend making all content changes on a *really* secure server, then
replicating the entire web-root to the public web server with rsync, with the
--delete option enabled.

Rsync will overwrite any of the "damaged" content with a fresh copy from the
secure server and remove any "extras", making any unauthorized content changes
vanish.
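
For example, a small wrapper around that rsync call which could be run from
cron on the secure server; the paths and host are placeholders, and
--delete is what removes anything on the public server that is not in the
master copy:

import subprocess

SOURCE = "/srv/master-webroot/"            # master copy on the secure server
DEST = "deploy@public-web:/var/www/html/"  # placeholder public web server target

subprocess.run(
    ["rsync", "-az", "--delete", SOURCE, DEST],
    check=True,  # raise if the sync fails so cron reports the error
)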

Terry








OT: Website protection

2009-07-11 Thread schmero...@gmail.com
One of our client's websites gets hacked frequently - 1x per month - 
usually with some kind of phishing scam.


I understand their first line of defense is to make sure security is 
tight and systems are up to date, however, it seems to me that there 
must be some scanning utility that would check their site for 
unauthorized pages via a search for domain names.


So, if our client was google, the utility would search all files on the 
site looking for domains. If it found microsoft.com within one of the 
pages, an email would be sent to the administrator, who could delete the 
page and look for other evidence of being hacked or add microsoft.com to 
the whitelist.


Any ideas where to look for such a beast &/or a mailing list that deals 
with this type of issue?