Re: Standing Up Against German Laws - Project HayNeedle
However, some of these issues can be mitigated without too much trouble. For example, one could have a dynamically growing dictionary of search words, seeded from random words found in the random results pages it grabs. At the very least, this would kill any attempt to filter it out of a data-mining system.

If the point of the system is primarily to create plausible deniability for the end user, that is, to allow them to say "HayNeedle hit the site, not me, so I am innocent", then I'd say it could be effective in that regard, barring some proviso in the law that allows them to prosecute someone who never actually visited a site of their own volition.

Beyond that, it's also effective in terms of turning up the noise-to-signal ratio and making this law that much less effective, while placing a greater burden on ISPs, who are then more likely to lobby against it ever more vigorously, all while remaining entirely 'white area' in terms of functionality.

I understand your post, but I don't think Mr. Ziegler was over-selling his product's effectiveness beyond what it is really capable of.

Take care,
Matt

johan beisser wrote:
> On Nov 10, 2007, at 9:28 AM, Paul Sebastian Ziegler wrote:
>
>> The mechanism is quite easy: It searches Google for random words and
>> picks random pages among the results, then spiders from there (well,
>> it is spidering except that it only follows one URL at a time within
>> a session, thus simulating a user).
>
> There are a few things wrong with this approach. Most of them were
> outlined by Bruce Schneier when he reviewed "TrackMeNot"[1] last year.
> The same issues with TrackMeNot apply to HayNeedle, including potential
> false positives and a list of word combinations that can be filtered
> out easily; the list goes on.
>
> [1] http://www.schneier.com/blog/archives/2006/08/trackmenot_1.html

-- 
/*
 * mdh - Solitox Networks (Lead Project Engineer)
 * Facts often matter little, in the face of fervently held perceptions
 */
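The mechanism described above (search for a random word, pick a random result, follow one link at a time, and grow the dictionary from words seen in fetched pages) can be sketched roughly as follows. This is a minimal illustration, not HayNeedle's actual code; the word list, function names, and link-extraction regex are all assumptions for the example, and the network-fetch step is left out.

```python
import random
import re

# Seed dictionary; the idea discussed above is that it grows over time
# from words harvested out of fetched pages, defeating static filters.
WORDLIST = ["kernel", "garden", "protocol", "novel", "harbor"]

# Naive absolute-link extractor (illustrative only).
HREF_RE = re.compile(r'href=[\'"]?(https?://[^\'" >]+)', re.IGNORECASE)

def extract_links(html):
    """Return all absolute http(s) links found in a page."""
    return HREF_RE.findall(html)

def grow_wordlist(html, wordlist):
    """Dynamically grow the dictionary from words seen in result pages,
    which makes the generated query stream harder to filter out."""
    text = re.sub(r"<[^>]+>", " ", html)          # strip tags
    for word in re.findall(r"[A-Za-z]{5,}", text):
        word = word.lower()
        if word not in wordlist:
            wordlist.append(word)
    return wordlist

def pick_next_url(html, rng=random):
    """Simulate a user: follow exactly one random link from the page,
    rather than crawling breadth-first like a classic spider."""
    links = extract_links(html)
    return rng.choice(links) if links else None
```

A driver loop would then repeatedly search for `random.choice(WORDLIST)`, fetch the chosen result, call `grow_wordlist` and `pick_next_url` on the response, and sleep a human-scale interval between requests.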
Re: SMF .htaccess bypass
So what you're saying is that .htaccess is working as expected. What does this have to do with SMF? Using .htaccess to protect the admin section is not at all standard in SMF, so I'm not really sure how or why this is relevant. Furthermore, SMF still has its own authentication mechanisms. This doesn't seem like a bug or an issue at all, just software working as intended, even if someone chooses to use it in an unusual and insecure manner.

- mdh

[EMAIL PROTECTED] wrote:
> # ./start
> #
> # Discovered by Seph1roth on June 2007 (was priv8)
> #
> # Vulnerable: Simple Machines Forum [ALL versions]
> #
> # Visit: http://www.blackroots.it - Best hacking site.
> #
> # Description: If SMF has index.php?action=admin in .htaccess, I can
> # bypass that by typing some other administration-panel variable in
> # the URL. Example:
> #
> #   index.php?action=admin           (.htaccess: access denied)
> #   index.php?action=membergroups    (accessible)
> #   index.php?action=news            (accessible)
> #   index.php?action=featuresettings (accessible)
> #   ...and others...
> #
> # I can bypass and enter the administration panel by typing the
> # accessible variables in the URL.
> #
> # Greets to all BlackRoots users
> #
> # Shoutz to all kiddies
> #
> # ./end

-- 
/*
 * mdh - Solitox Networks (Lead Project Engineer)
 * Facts often matter little, in the face of fervently held perceptions
 */
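For readers wondering what kind of rule the original poster had in mind: the advisory never shows the actual .htaccess, but a setup with the described behavior would look something like the hypothetical fragment below, which denies only requests whose query string names `action=admin` and therefore lets every other admin-panel action value through, exactly as the poster observed. This is an illustration of the misconfiguration, not a recommended configuration.

```apache
# Hypothetical .htaccess matching the advisory's description.
# Apache's <Files> directives cannot match query strings, so a rule
# like this needs mod_rewrite, and it only blocks the one action
# value it names:
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)action=admin($|&)
RewriteRule ^index\.php$ - [F]
# action=membergroups, action=news, etc. are never tested and pass
# straight through to SMF, which must then rely on its own auth.
```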
Re: Wiki Remote Authentication Bypass Vulnerability
This is the designed behavior of the application, not an "exploit" as you claim. In addition to that, the syntax used in your example URLs is specific to MediaWiki, and not common amongst all wiki applications, as you claim. Furthermore, this "exploit" does not work "100% of the time": even if this could be called an exploit, which it clearly is not, protecting a page and configuring levels of access are relatively common and simple tasks for wiki administrators. Your claim that this is an access validation error is simply wrong, and is akin to saying that being able to write to a file which a user intentionally set to mode 0777 is an error.

- Matt

[EMAIL PROTECTED] wrote:
> Wiki Remote Authentication Bypass Vulnerability
>
> The exploit works 100% of the time. It really is up to the admin to add
> security, like locking a page to prevent editing. There are two ways of
> having this exploit work. One is to simply add the code (example 1)
> after the page you want to test, or if that doesn't work, add code
> (example 2) and the exploit code after the new page's name! Anyone
> using any type of wiki project is vulnerable. Successfully exploiting
> this issue allows remote attackers to gain remote administrative access
> to the vulnerable site's pages. Attackers can use a browser to exploit
> this issue.
>
> Hackers Center Security Group (http://www.hackerscenter.com)
> Credit: Doz
> Class: Access Validation Error
> Remote: Yes
> Vendor: http://www.wiki.org/
> Version: N/A
>
> Exploit: ?action=edit
>
> Example 1: http://www.Site.com/wiki/Main_Page?action=edit
> Example 2: http://www.Site.com/wiki/Hacked?action=edit
>
> Proof of Concept: (Concealed)
>
> Security researcher? Join us: mail Zinho at zinho at hackerscenter.com

-- 
/*
 * Matt D. Harris <[EMAIL PROTECTED]>
 * Solitox Networks - Lead Project Engineer
 */
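To make the "simple task for wiki administrators" point concrete: `?action=edit` is MediaWiki's normal edit entry point, and access to it is governed by the wiki's permission settings. A LocalSettings.php fragment along these lines (a sketch of routine hardening, not a fix for any vulnerability, since there isn't one) disables the behavior the advisory describes:

```php
<?php
// In MediaWiki's LocalSettings.php: take editing away from anonymous
// visitors ('*' is the implicit everyone group) while leaving it for
// logged-in users. With this set, ?action=edit prompts for login
// instead of opening the editor.
$wgGroupPermissions['*']['edit']    = false;
$wgGroupPermissions['user']['edit'] = true;

// Individual pages can additionally be protected ("locked") through
// the wiki's own Protect action, as the poster himself concedes.
```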