Re: [Full-disclosure] Google's robots.txt handling

2012-12-13 Thread Mario Vilas
That paragraph says pretty much the exact opposite of what you understood.

Also, could we please stop refuting points nobody even made in the first
place? OP never claimed this to be a vulnerability, nor ever said
robots.txt is a proper security mechanism to hide files in public web
directories.

All OP said was that the way robots.txt is indexed allows some Google dorks
to be made, and that it may be a good idea to avoid that. Clearly it's not the
discovery of the century, but it seems fairly reasonable to me... I don't
get what all this fuss is about.
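
For illustration, a hypothetical dork of the kind OP means could be as simple
as searching for indexed robots.txt files that blacklist interesting paths
(the path names below are made up):

  inurl:robots.txt intext:"Disallow: /backup"
  inurl:robots.txt intext:"Disallow: /admin"

If the robots.txt file itself gets indexed, every blacklist entry becomes a
searchable pointer to exactly the directories the site owner wanted to keep
quiet.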

On Wed, Dec 12, 2012 at 12:18 PM, Christoph Gruber l...@guru.at wrote:

 On 12.12.2012 at 00:23 Lehman, Jim jim.leh...@interactivedata.com
 wrote:

  It is possible to use whitelisting for robots.txt. Allow what you want
 Google to index and deny everything else. That way Google doesn't make you
 a Google dork target and someone browsing to your robots.txt file doesn't
 glean any sensitive files or folders. But this will not stop directory
 brute-forcing to discover your publicly exposed sensitive data, which probably
 should not be exposed to the web in the first place.

 Maybe I misunderstood something, but do you really think that sensitive data
 can be hidden in secret directories on publicly reachable web servers?
 --
 Christoph Gruber
 By not reading this email you don't agree you're not in any way affiliated
 with any government, police, ANTI- Piracy Group, RIAA, MPAA, or any other
 related group, and that means that you CANNOT read this email.
 By reading you are not agreeing to these terms and you are violating code
 431.322.12 of the Internet Privacy Act signed by Bill Clinton in 1995.
 (which doesn't exist)





-- 
“There's a reason we separate military and the police: one fights the enemy
of the state, the other serves and protects the people. When the military
becomes both, then the enemies of the state tend to become the people.”
___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/

Re: [Full-disclosure] Google's robots.txt handling

2012-12-13 Thread Lehman, Jim
Yes, I think you misunderstood, or more likely I poorly worded the post.
Whitelisting is better than blacklisting. Blacklisting something you don't want
Googlebot to index just makes it easier for someone to find something you don't
want indexed. If that content is sensitive, it probably should not be publicly
accessible in the first place. But people never put sensitive content on a web
server (weak attempt at humor, my apologies). I am beating a dead horse
here, but robots.txt is not a security control.
Most of the time robots.txt is great for reconnaissance and not any measure of
defense.

Whitelisting just helps in not exposing too much information; it is a speed
bump at best as far as security goes. I think this falls under the 'defense in
depth' heading.
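
To make that concrete, a minimal whitelist-style robots.txt might look like
the following (the allowed paths are only placeholders):

  User-agent: *
  Allow: /index.html
  Allow: /public/
  Disallow: /

Everything not explicitly allowed is denied, so the file no longer enumerates
the sensitive paths the way a blacklist does. Keep in mind that Allow is an
extension honored by Googlebot and most major crawlers rather than part of the
original robots exclusion standard, and none of this stops a crawler or human
that simply ignores robots.txt.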

-Original Message-
From: full-disclosure-boun...@lists.grok.org.uk 
[mailto:full-disclosure-boun...@lists.grok.org.uk] On Behalf Of Christoph Gruber
Sent: Wednesday, December 12, 2012 3:19 AM
To: full-disclosure@lists.grok.org.uk
Subject: Re: [Full-disclosure] Google's robots.txt handling

On 12.12.2012 at 00:23 Lehman, Jim jim.leh...@interactivedata.com wrote:

 It is possible to use whitelisting for robots.txt. Allow what you want
 Google to index and deny everything else. That way Google doesn't make you a
 Google dork target and someone browsing to your robots.txt file doesn't glean
 any sensitive files or folders. But this will not stop directory brute-forcing
 to discover your publicly exposed sensitive data, which probably should not be
 exposed to the web in the first place.

Maybe I misunderstood something, but do you really think that sensitive data
can be hidden in secret directories on publicly reachable web servers?
-- 
Christoph Gruber
By not reading this email you don't agree you're not in any way affiliated with 
any government, police, ANTI- Piracy Group, RIAA, MPAA, or any other related 
group, and that means that you CANNOT read this email.
By reading you are not agreeing to these terms and you are violating code 
431.322.12 of the Internet Privacy Act signed by Bill Clinton in 1995.
(which doesn't exist)




___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/


Re: [Full-disclosure] Nokia phone forcing traffic through proxy

2012-12-13 Thread Kim Henriksen
Opera has done this for quite some time now. They translate and compress
the website into their own markup language called OBML:
http://dev.opera.com/articles/view/opera-binary-markup-language/


On Fri, Dec 7, 2012 at 9:01 PM, Philip Whitehouse phi...@whiuk.com wrote:

 On 7 Dec 2012, at 19:03, Jeffrey Walton noloa...@gmail.com wrote:

  On Fri, Dec 7, 2012 at 11:55 AM, Gaurang Pandya gaub...@yahoo.com
 wrote:
  It has been noticed that internet browsing traffic, instead of directly
  hitting the requested server, is being redirected to proxy servers. It gets
  redirected to Nokia/Ovi proxy servers if the Nokia browser is used, and to
  Opera proxy servers if the Opera Mini browser is used.
 
  More detailed info at :
  http://gaurangkp.wordpress.com/2012/12/05/nokia-proxy/
  It sounds a lot like http://click-fraud-fun.blogspot.com/.
 
  We know proxies can cause a lot of trouble in practice. For example,
 
 http://blog.cryptographyengineering.com/2012/03/how-do-interception-proxies-fail.html
 .
 
  Proxies and data snatching are the reason to pin certificates when
  using VPN and SSL/TLS if a pre-existing relationship exists (for
  example, you know the host and its public key). Are you talking to a
  Nokia/Ovi proxy, an interception proxy (perhaps enabled by Trustwave),
  or the host expected during an SSL/TLS negotiation?
 
  We now have a much better body of knowledge. It's too bad most browsers
  don't offer the features for those who are security conscious. On
  Android, Google went so far as to offer pinning as opt-in for sites:
 
 http://groups.google.com/group/android-security-discuss/browse_thread/thread/f5898be7ee9abc48
 .
 
  Jeff

 BlackBerry does this, and Amazon Kindle Fire almost certainly does it, for
 caching purposes. I'm not sure whether that's why the Nokia phone is doing
 it though - you need a good infrastructure to support it.

 Regards,

 Philip Whitehouse
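
On Jeff's point about pinning (quoted above): a minimal sketch of what a
fingerprint-style pin check could look like in Python, assuming you already
know the server's certificate hash out of band (the host and fingerprint
below are placeholders, not real values):

import hashlib
import socket
import ssl

PINNED_HOST = "example.com"   # placeholder host
PINNED_SHA256 = "00" * 32     # placeholder: the known SHA-256 of the DER cert

def cert_fingerprint(host, port=443):
    # Connect with TLS and return the SHA-256 fingerprint of the peer cert.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()

if cert_fingerprint(PINNED_HOST) != PINNED_SHA256:
    raise SystemExit("fingerprint mismatch - possibly an interception proxy")

A mismatch means either an intercepting proxy in the path or a legitimate
certificate rotation, which is exactly the trade-off pinning forces you to
manage.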




-- 
Mvh.
Kim Henriksen
___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/

[Full-disclosure] 'portable-phpMyAdmin (WordPress Plugin)' Authentication Bypass (CVE-2012-5469)

2012-12-13 Thread Mark Stanislav
I. DESCRIPTION
---
portable-phpMyAdmin doesn't verify an existing WordPress session
(privileged or not) when the plugin file path is accessed directly. Because
of how this plugin works, a default installation will provide a full
phpMyAdmin console with the privilege level of the MySQL account configured
for WordPress.


II. TESTED VERSION
---
1.3.0


III. PoC EXPLOIT
---
Navigate to http://host/wp-content/plugins/portable-phpmyadmin/wp-pma-mod and
you will be presented with the full portable-phpMyAdmin web interface
without the requirement of a session or any credentials.
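
A quick way to confirm the behaviour is to request that path with no WordPress
cookies at all and check what comes back; a small Python sketch (the host is a
placeholder taken from the advisory, and the "phpMyAdmin" marker string is only
a guess at what the rendered page contains):

import urllib.request

# Placeholder target from the advisory; no WordPress session cookie is sent.
url = "http://host/wp-content/plugins/portable-phpmyadmin/wp-pma-mod"

with urllib.request.urlopen(url) as resp:
    status = resp.status
    body = resp.read().decode("utf-8", errors="replace")

print("HTTP status:", status)
# The marker string below is only a guess at what the console page contains.
print("Console appears to have rendered:", "phpMyAdmin" in body)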


IV. SOLUTION
---
Upgrade to version 1.3.1


V. REFERENCES
---
http://wordpress.org/extend/plugins/portable-phpmyadmin/
http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-5469


VI. TIMELINE
---
10/13/2012 - Initial developer disclosure
10/14/2012 - Response from developer with commitment to fix the
vulnerability
10/31/2012 - Follow-up with developer after no communication or patched
release
11/16/2012 - Second attempt to follow-up with developer regarding
progress/timetable
11/26/2012 - Contacted WordPress 'plugins team' about lack of progress on
patched release
11/27/2012 - WordPress 'plugins team' patches software and releases version
1.3.1
12/12/2012 - Public disclosure
___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/

[Full-disclosure] Hacking Competition PHDAYS CTF Quals 2012 Starts On December 15

2012-12-13 Thread PHD
Let us remind you that PHDays CTF Quals starts on the 15th of December and will
last for three days. 300 teams (http://quals.phdays.com/teams/leaders) from more
than 30 countries have already registered. You can still join!

The teams will try their hands at security assessment, vulnerability detection
and exploitation, as well as reverse engineering tasks. The conditions of
PHDays CTF Quals, unlike many other competitions of this kind, are brought as
close to real life as possible: the vulnerabilities are not fictional, but
actually occur in present-day information systems.

Registration for the Quals: until the 17th of December, 2012.
The Quals will be held from 10 a.m. on the 15th of December until 10 a.m.
on the 17th of December, 2012 (Moscow time).

The winners of the contest will be the teams that reach the highest score
first. Based on the PHDays CTF Quals results, the strongest teams will be
invited to participate in PHDays III CTF.

The main contest will take place on the 22nd and 23rd of May, 2013 in Moscow
during the third international information security forum, Positive Hack Days
(http://phdays.com/). A big money prize awaits the winners!

Details
You can learn more about PHDays CTF Quals and register by following the link
http://quals.phdays.com/.

___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/

[Full-disclosure] nullsec-net-crypter.pdf

2012-12-13 Thread Levent Kayan
Hello,

we just released a new paper discussing ideas for advanced runtime
encryption of .NET executables.

You can find the paper here: http://www.nullsecurity.net/papers.html

Enjoy reading it.


Cheers,
noptrix
--
Name: Levon 'noptrix' Kayan
E-Mail: nopt...@nullsecurity.net
GPG key: 0xDCA45D42
Key fingerprint: 250A 573C CA93 01B3 7A34  7860 4D48 E33A DCA4 5D42
Homepage: http://www.nullsecurity.net/

___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/


Re: [Full-disclosure] Google's robots.txt handling

2012-12-13 Thread Philip Whitehouse
I restate my email's second point.

Google is indexing robots.txt because (from all the examples I can see)
robots.txt doesn't contain a line to disallow indexing of robots.txt.

It is possible that some websites provide actual content in a file that
happens to be called robots.txt (e.g. a website concerned with AI development).

Could Google do better by removing the file? Sure. But as webmasters haven't
told them not to index it, even though they have listed other files not to
index, Google is doing exactly what it was asked.

Maybe the Robots Exclusion Standard should state that a valid robots.txt should
not be indexed.
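
For what it's worth, the line being described would presumably just be the
file disallowing itself, e.g.:

  User-agent: *
  Disallow: /robots.txt

Crawlers obviously still have to fetch robots.txt to read it, so whether such
a line should also keep the file out of the index is exactly the ambiguity the
standard leaves open.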

Incidentally, Bing shows the same behaviour - in fact the Google file is the
4th hit even without any of the filetype qualifiers.

Philip Whitehouse

On 13 Dec 2012, at 11:40, Mario Vilas mvi...@gmail.com wrote:

 That paragraph says pretty much the exact opposite of what you understood.
 
 Also, could we please stop refuting points nobody even made in the first 
 place? OP never claimed this to be a vulnerability, nor ever said robots.txt 
 is a proper security mechanism to hide files in public web directories.
 
 All OP said was that the way robots.txt is indexed allows some Google dorks to
 be made, and that it may be a good idea to avoid that. Clearly it's not the
 discovery of the century, but it seems fairly reasonable to me... I don't get
 what all this fuss is about.
 
 On Wed, Dec 12, 2012 at 12:18 PM, Christoph Gruber l...@guru.at wrote:
 On 12.12.2012 at 00:23 Lehman, Jim jim.leh...@interactivedata.com wrote:
 
  It is possible to use whitelisting for robots.txt. Allow what you want
  Google to index and deny everything else. That way Google doesn't make you
  a Google dork target and someone browsing to your robots.txt file doesn't
  glean any sensitive files or folders. But this will not stop directory
  brute-forcing to discover your publicly exposed sensitive data, which probably
  should not be exposed to the web in the first place.
 
 Maybe I misunderstood something, but do you really think that sensitive data
 can be hidden in secret directories on publicly reachable web servers?
 --
 Christoph Gruber
 By not reading this email you don't agree you're not in any way affiliated 
 with any government, police, ANTI- Piracy Group, RIAA, MPAA, or any other 
 related group, and that means that you CANNOT read this email.
 By reading you are not agreeing to these terms and you are violating code 
 431.322.12 of the Internet Privacy Act signed by Bill Clinton in 1995.
 (which doesn't exist)
 
 
 
 
 -- 
 “There's a reason we separate military and the police: one fights the enemy 
 of the state, the other serves and protects the people. When the military 
 becomes both, then the enemies of the state tend to become the people.”
 
___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/

Re: [Full-disclosure] Google's robots.txt handling

2012-12-13 Thread Jeffrey Walton
On Thu, Dec 13, 2012 at 7:52 AM, Philip Whitehouse phi...@whiuk.com wrote:
 I restate my email's second point.

 Google is indexing robots.txt because (from all the examples I can see)
 robots.txt doesn't contain a line to disallow indexing of robots.txt.

 It is possible that some websites provide actual content in a file that
 happens to be called robots.txt (e.g. a website concerned with AI
 development).

 Could Google do better by removing the file? Sure. But as webmasters haven't
 told them not to, even though they have provided other files not to index,
 Google is doing exactly what they were asked.

Webmasters don't have to in the US - the Computer Fraud and Abuse Act
(CFAA) means Google (et al.) must operate within the authority granted
by the webmasters. If the webmasters decide they don't want their site
crawled, then Google (et al.) has exceeded its authority and broken US
federal law. Just ask Weev.

This system needs a submission-based whitelist.

Jeff

___
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/