RE: [SC-L] How do we improve s/w developer awareness?

2004-12-03 Thread owner-sc-l
[EMAIL PROTECTED]
From: Peter Amey [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
List-Id: Secure Coding Mailing List sc-l.securecoding.org

[snip]
 
 Remember that little incident in 2000 when the London 
 millennium bridge was
 closed immediately after opening due to excessive wobbling when people
 walked across it? I can't guarantee that my recollection is 
 accurate, but
 I'm sure they were trying to put this down to that software classic, a
 'Design feature'.

The Millennium Bridge wobble is indeed instructive.  Engineering is usually 
a conservative profession that places great emphasis on codifying and 
learning from past mistakes.  Much bridge design work uses 
well-established, trustworthy principles.  The Millennium Bridge designers 
deliberately pushed the boundaries to produce something novel and exciting. 
Never before had a suspension bridge had the suspension and decking in the 
same plane (i.e. the deck doesn't hang from the suspension; it's balanced 
on/between the suspension).  The result was strong enough but had 
unexpected dynamics, i.e. it wobbled!

I am confident that this experience is already in the textbooks, standard 
data tables and CAD tools.  The engineering body of knowledge has been 
added to and the problem should not recur.

This is where the software community can learn:

1.  We are appallingly bad at learning from previous mistakes (other than in 
perfecting our ability to repeat them!)
2.  We routinely push the boundaries of what we try to achieve by leaping 
instead of stepping.
3.  We routinely adopt novel and untried technology in preference to proven 
and trustworthy alternatives.  Indeed, mature technology often seems to be 
rejected precisely because it is not new, novel or exciting enough.

The Millennium Bridge made the news precisely because such engineering 
failures are rare; software engineering failures make the news because they 
are so common the papers would be empty if they weren't reported! 

[snip]

Peter






Re: [SC-L] How do we improve s/w developer awareness?

2004-12-02 Thread Brian Utterback
George Capehart wrote:
Yes, assuming management cares . . . and that's *my* broken record . . .
:)
If the tone of my comments was a bit harsh, it is most emphatically not
intended to be directed at your thoughts.  It is only because of my
intense frustration with the situation.  When Management wants
software systems to be secure, they will be.  Not perfectly so, but
within published limits.  Management will see to it that the
appropriate policies and processes are in place to assure it.
Management will see to it that delivering a product that passes the
certification process is more important than delivering a product by a
certain date.  Management will require that a security architecture
be in place before the design process starts, etc., etc., etc.  The
board will hold Management accountable for the risk that the use of
the system entails.  When that happens, Management will come to
realize that security is *their* problem, *not* InfoSec's problem.
/Until/ that happens, changing frameworks, development tools,
methodologies or whatever will not solve the problem.  The Problem
just isn't in IT.
As Dennis Miller says:  "But that's just my opinion.  I could be wrong."

Which brings up the next broken record. Management will not care until
it affects the bottom line not to care. As long as it costs money to
care (missed deadlines, more tools and training, etc.) and it doesn't
cost anything not to care, then they *shouldn't* care.  Either the
consumers have to care so that security problems cost sales, or
the liability laws need to change so that security problems result in
financial penalties. Consumers in general are too diverse a group to
really change what they want. It would be very difficult to educate the
entire consumer base to the collateral costs of poor security in
products and to set valid expectations about what is and is not
possible. The "all software has bugs" mantra is now very ingrained.
Changing liability laws on the other hand is a simple solution. This
will force managers to do the proper due diligence just to CYA.
Sure, there will be increased litigation costs, but a couple high
profile cases and the whole process of development will change.
And I don't buy the "programming is too hard, there will always be bugs"
argument. Maybe there will always be bugs, but I don't think we have
reached the point where we can really make a call about how hard it
really is. We call programmers engineers, but very, very few software
engineers deserve the title. Would you accept "it was too hard to
do a stress analysis" from the engineer designing a bridge?
--
blu
It is bafflingly paradoxical that the United States is by far the
world's leading scientific nation while simultaneously housing the most
scientifically illiterate populace outside the Third World.
- Richard Dawkins

Brian Utterback - OP/N1 Revenue Product Engineering, Sun Microsystems
Ph/VM: 877-259-7345, Em:brian.utterback-at-ess-you-enn-dot-kom


RE: [SC-L] How do we improve s/w developer awareness?

2004-12-02 Thread Michael S Hines
I've been trying to get IT Auditors and the Audit community in general to apply 
the same due diligence to operating systems (infrastructure or general controls) 
that they apply to applications systems testing.

I'm not aware of anyone in the IT Audit community doing OS audits - to verify 
that the systems work as advertised and do not fail where they should not.  I 
became quite aware of this a few years ago when I was in a group doing 
Penetration Testing of an OS and discovered many flaws.

Why don't auditors audit the OS?  I, frankly, don't know. 

But Auditors do have the ear of upper management and they could be the ones 
indicating the weaknesses in the infrastructure that put the organization at 
risk. 

We wouldn't put in a new payroll system without verifying that it works 
properly.  Yet we're more than willing to unpackage and plug in a desktop 
computer without the same due diligence.  Why?  It's beyond me.  

Perhaps if more people were asking the right questions of the right people ...?

Why we've come to accept the CTRL-ALT-DEL 'three finger salute' as SOP is 
beyond me.  

Of course the issues above aren't limited to one particular OS.  There are 
plenty of
problems to go around.
(see the work done at Univ of Wisconsin - the Fuzz Testing project 
http://www.cs.wisc.edu/~bart/fuzz/fuzz.html )

Mike Hines
---
Michael S Hines
[EMAIL PROTECTED] 




RE: [SC-L] How do we improve s/w developer awareness?

2004-12-02 Thread Shea, Brian A
FYI this is part of a notice that went out to financial institutions
recently. 

Complete Financial Institution Letter:
http://www.fdic.gov/news/news/financial/2004/fil12104.html 
 
Highlights: 

Management is responsible for ensuring that commercial off-the-shelf
(COTS) software packages and vendor-supplied in-house computer system
solutions comply with all applicable laws and regulations.
 
The guidance contained in this financial institution letter will assist
management in developing an effective computer software evaluation
program to accomplish this objective. 

An effective computer software evaluation program will mitigate many of
the risks - including failure to be regulatory compliant - that are
associated with software products throughout their life cycle. 

Management should use due diligence in assessing the quality and
functionality of COTS software packages and vendor-supplied in-house
computer system solutions.

Distribution:
FDIC-Supervised Banks (Commercial and Savings) 


-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]
On Behalf Of Greenarrow 1
Sent: Monday, November 29, 2004 6:08 PM
To: George Capehart
Subject: Re: [SC-L] How do we improve s/w developer awareness?

Words could not be spoken better.  This is my argument from the get-go.
I, too, am tired of seeing everyone blame it on the Dev department when
the orders from above are "I want this now and fast."  Maybe we can
focus and convince upper level management that security is as important
as the greed, money, bells, whistles.  But while I support Dev I still
do not understand how some companies' development departments can
include tight secure coding in a short time frame and others seem to
provide excuses or just do not care.

Customers are now looking at the security of programs.  Slowly,
consumers are finally looking at security flaws, mainly because of the
media coverage that computer software is attracting.  While it may only
be top contenders in the software field, customers are now questioning
other programs they have, which I support fully.  I can see this in the
rise of subscribers to certain Security Flaw Alerts, which has risen
over 71% within the last 3 months.

Just a word of warning: as consumers become more aware of security in
the software they purchase, companies that do not secure will start
showing a downslide in purchases.  It is happening to one major company
as we email each other on issues.

Regards,
George
Greenarrow1
InNetInvestigations-Forensics


- Original Message -
From: George Capehart [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, November 28, 2004 5:18 PM
Subject: Re: [SC-L] How do we improve s/w developer awareness?


 On Thursday 11 November 2004 10:26, Kenneth R. van Wyk allegedly
wrote:
  Greetings,
 
  In my business travels, I spend quite a bit of time talking with
  Software Developers as well as IT Security folks.  One significant
  difference that I've found is that the IT Security folks, by and
  large, tend to pay a lot of attention to software vulnerability and
  attack information while most of the Dev folks that I talk to are
  blissfully unaware of the likes of Full-Disclosure, Bugtraq, PHRACK,
  etc.  I haven't collected any real stats, but it seems to me to be
at
  least a 90/10% and 10/90% difference.  (Yes, I know that this is a
  gross generalization and there are no doubt significant exceptions,
  but...)
 
  I believe that this presents a significant hurdle to getting Dev
  folks to care about Software Security issues.  Books like Gary
  McGraw's Exploiting Software do a great job at explaining how
  software can be broken, which is a great first step, but it's only a
  first step.

 Apologies for the two-week latency in this reply.  I don't have as
much
 time for the lists as I used to.

 I have read the rest of this thread, and I didn't see any comments
that
 address a dimension that is, for me, the most salient.  I feel like a
 broken record because this topic crops up on one security-related list
 or another at least once a quarter and I end up saying the same thing
 every time.  I'm going to say it again, though, because I really
 believe that it is important . . . Dev folks will care about security
 when their managers care about security.  If time-to-market and bells
 and whistles are more important to management than security is,
 that's where dev folks will spend their time.  It is their job to do
 what their managers tell them to do.  When management decides that
it
 is more important to deliver a product that is based on a robust
 security architecture and which is built and tested with security in
 mind, it will be.  Until then, it won't.  At one time or another in my
 career, I have held just about every position in the software
 development food chain.  I have had the president of the company tell
 me:  "I don't care what it takes, you /*will*/ have this project done
 and delivered in four months!"  Well, we delivered a
 less-than

Re: [SC-L] How do we improve s/w developer awareness?

2004-12-02 Thread der Mouse
 Changing liability laws on the other hand is a simple solution.

But at what price?  It would kill off open source completely, as far as
I can see, in the jurisdiction(s) in question.  (How many open source
projects could afford to defend a liability suit even if they (a)
wanted to and (b) had a won case?)

Of course, if you don't mind that, go right ahead.  You don't say where
you are, but looking over your message I see reason to think it's the
USA, and I long ago wrote off the USA as a place to write code.  I
think it could be a very good thing for the USA to try such laws; it
would give us hard data about what their effect is, rather than the
speculation (however well-informed) that's all we have to go on now -
and it quite likely would have the pleasant side effect of pushing most
open source projects out into the free (or at least freer) world.

/~\ The ASCII   der Mouse
\ / Ribbon Campaign
 X  Against HTML    [EMAIL PROTECTED]
/ \ Email!  7D C8 61 52 5D E7 2D 39  4E F1 31 3E E8 B3 27 4B




Re: [SC-L] How do we improve s/w developer awareness? [Virus Checked]

2004-12-02 Thread graham . coles
I have to say I find your comparison between bridge engineers and software
engineers rather troubling.

In response to your question:

  'Would you accept it was too hard to do a stress analysis from the
engineer designing a bridge?'

I think, regrettably, we probably would do these days.

Remember that little incident in 2000 when the London millennium bridge was
closed immediately after opening due to excessive wobbling when people
walked across it? I can't guarantee that my recollection is accurate, but
I'm sure they were trying to put this down to that software classic, a
'Design feature'.

Seems that far from Software Engineers taking the bridge engineers
approach, we may be seeing the exact reverse happening. :-)

--
Graham Coles.






   

Re: [SC-L] How do we improve s/w developer awareness?

2004-11-29 Thread Greenarrow 1
Words could not be spoken better.  This is my argument from the get-go.  I,
too, am tired of seeing everyone blame it on the Dev department when the
orders from above are "I want this now and fast."  Maybe we can focus and
convince upper level management that security is as important as the greed,
money, bells, whistles.  But while I support Dev I still do not understand
how some companies' development departments can include tight secure coding
in a short time frame and others seem to provide excuses or just do not
care.

Customers are now looking at the security of programs.  Slowly, consumers
are finally looking at security flaws, mainly because of the media coverage
that computer software is attracting.  While it may only be top contenders
in the software field, customers are now questioning other programs they
have, which I support fully.  I can see this in the rise of subscribers to
certain Security Flaw Alerts, which has risen over 71% within the last 3
months.

Just a word of warning: as consumers become more aware of security in the
software they purchase, companies that do not secure will start showing a
downslide in purchases.  It is happening to one major company as we email
each other on issues.

Regards,
George
Greenarrow1
InNetInvestigations-Forensics


- Original Message -
From: George Capehart [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Sunday, November 28, 2004 5:18 PM
Subject: Re: [SC-L] How do we improve s/w developer awareness?


 On Thursday 11 November 2004 10:26, Kenneth R. van Wyk allegedly wrote:
  Greetings,
 
  In my business travels, I spend quite a bit of time talking with
  Software Developers as well as IT Security folks.  One significant
  difference that I've found is that the IT Security folks, by and
  large, tend to pay a lot of attention to software vulnerability and
  attack information while most of the Dev folks that I talk to are
  blissfully unaware of the likes of Full-Disclosure, Bugtraq, PHRACK,
  etc.  I haven't collected any real stats, but it seems to me to be at
  least a 90/10% and 10/90% difference.  (Yes, I know that this is a
  gross generalization and there are no doubt significant exceptions,
  but...)
 
  I believe that this presents a significant hurdle to getting Dev
  folks to care about Software Security issues.  Books like Gary
  McGraw's Exploiting Software do a great job at explaining how
  software can be broken, which is a great first step, but it's only a
  first step.

 Apologies for the two-week latency in this reply.  I don't have as much
 time for the lists as I used to.

 I have read the rest of this thread, and I didn't see any comments that
 address a dimension that is, for me, the most salient.  I feel like a
 broken record because this topic crops up on one security-related list
 or another at least once a quarter and I end up saying the same thing
 every time.  I'm going to say it again, though, because I really
 believe that it is important . . . Dev folks will care about security
 when their managers care about security.  If time-to-market and bells
 and whistles are more important to management than security is,
 that's where dev folks will spend their time.  It is their job to do
 what their managers tell them to do.  When management decides that it
 is more important to deliver a product that is based on a robust
 security architecture and which is built and tested with security in
 mind, it will be.  Until then, it won't.  At one time or another in my
 career, I have held just about every position in the software
 development food chain.  I have had the president of the company tell
 me:  "I don't care what it takes, you /*will*/ have this project done
 and delivered in four months!"  Well, we delivered a
 less-than-half-assed piece of software, but you can be sure that it was
 designed at the keyboard with absolutely *no* thought for security.
 That guy didn't know security from Adam's house cat and cared less.  It
 was not my job to deliver *secure* software.  It was my job to deliver
 /*what we'd promised the customer*/ in four months.  Security wasn't in
 the spec, so security wasn't in the product.

 It is not fair to beat up on the developers . . . or even the project
 managers.  This is a governance/risk management problem.  This is a
 C-/board-level problem.  It's not going to be solved until the people
 giving the orders give orders to do it right.  I know many developers
 and project managers who have a clue, but it doesn't matter if they are
 not allowed to exercise it.

 My 0.02$CURRENCY.

 Cheers,

 George Capehart
 --
 George W. Capehart

 Key fingerprint:  3145 104D 9579 26DA DBC7  CDD0 9AE1 8C9C DD70 34EA

 With sufficient thrust, pigs fly just fine.  -- RFC 1925



Re: [SC-L] How do we improve s/w developer awareness?

2004-11-28 Thread George Capehart
On Thursday 11 November 2004 10:26, Kenneth R. van Wyk allegedly wrote:
 Greetings,

 In my business travels, I spend quite a bit of time talking with
 Software Developers as well as IT Security folks.  One significant
 difference that I've found is that the IT Security folks, by and
 large, tend to pay a lot of attention to software vulnerability and
 attack information while most of the Dev folks that I talk to are
 blissfully unaware of the likes of Full-Disclosure, Bugtraq, PHRACK,
 etc.  I haven't collected any real stats, but it seems to me to be at
 least a 90/10% and 10/90% difference.  (Yes, I know that this is a
 gross generalization and there are no doubt significant exceptions,
 but...)

 I believe that this presents a significant hurdle to getting Dev
 folks to care about Software Security issues.  Books like Gary
 McGraw's Exploiting Software do a great job at explaining how
 software can be broken, which is a great first step, but it's only a
 first step.

Apologies for the two-week latency in this reply.  I don't have as much 
time for the lists as I used to.

I have read the rest of this thread, and I didn't see any comments that 
address a dimension that is, for me, the most salient.  I feel like a 
broken record because this topic crops up on one security-related list 
or another at least once a quarter and I end up saying the same thing 
every time.  I'm going to say it again, though, because I really 
believe that it is important . . . Dev folks will care about security 
when their managers care about security.  If time-to-market and bells 
and whistles are more important to management than security is, 
that's where dev folks will spend their time.  It is their job to do 
what their managers tell them to do.  When management decides that it 
is more important to deliver a product that is based on a robust 
security architecture and which is built and tested with security in 
mind, it will be.  Until then, it won't.  At one time or another in my 
career, I have held just about every position in the software 
development food chain.  I have had the president of the company tell 
me:  "I don't care what it takes, you /*will*/ have this project done 
and delivered in four months!"  Well, we delivered a 
less-than-half-assed piece of software, but you can be sure that it was 
designed at the keyboard with absolutely *no* thought for security.  
That guy didn't know security from Adam's house cat and cared less.  It 
was not my job to deliver *secure* software.  It was my job to deliver 
/*what we'd promised the customer*/ in four months.  Security wasn't in 
the spec, so security wasn't in the product.

It is not fair to beat up on the developers . . . or even the project 
managers.  This is a governance/risk management problem.  This is a 
C-/board-level problem.  It's not going to be solved until the people 
giving the orders give orders to do it right.  I know many developers 
and project managers who have a clue, but it doesn't matter if they are 
not allowed to exercise it.

My 0.02$CURRENCY.

Cheers,

George Capehart
-- 
George W. Capehart

Key fingerprint:  3145 104D 9579 26DA DBC7  CDD0 9AE1 8C9C DD70 34EA

With sufficient thrust, pigs fly just fine.  -- RFC 1925





Re: [SC-L] How do we improve s/w developer awareness?

2004-11-12 Thread M Taylor
On Thu, Nov 11, 2004 at 04:56:20PM -0500, ljknews wrote:
 At 2:48 PM -0500 11/11/04, Paco Hope wrote:
 
 On 11/11/04 11:46 AM, ljknews [EMAIL PROTECTED] wrote:
  As a software developer, I care about such issues, but the compilations
  you list are largely not applicable to the operating system and programming
  languages with which I work.
 
 
 I am still looking for a forum that omits those problems due to choice
 of C and related programming languages that use null terminated string.
 I know that is a bad idea, and I don't do it.
 
 I am still looking for a forum that omits problems propagated over IP
 and related protocols.  I don't do that either.
 
 I have yet to see a standard tool (as distinguished from social
 engineering technique) from elsewhere that fits VMS.
 

RISKS Digest http://www.risk.org/ (comp.risks) is about the closest;
although not security-focused, it does discuss system failures beyond 
buffer overflows and the TCP/IP protocol suite. It does not exclude familiar
risks (and documented failures) of buffer overflows, but extends into
numerous design-related failures which can have security implications
that transcend any given platform or language.

Of course VMS is not immune to security risks. I know, I created more
than one insecure piece of software for VMS (in-house stuff that is 
now retired).




Re: [SC-L] How do we improve s/w developer awareness?

2004-11-12 Thread Gunnar Peterson
Concur that security is more colorless than most of the other ilities. My point
is that the other domains which serve up the non-functional requirements are
colorless to some degree as well. So in terms of how the other ility domains
approach the quantification and elaboration of the goals that emerge from their
domains and getting them in the hands of architects and developers, there may
be some activities and artifacts in there that we can learn from.

-gp

Quoting Jeff Williams [EMAIL PROTECTED]:

 We certainly have a lot to learn from the other communities, but security is
 worse than the other *-ilities, because it is more difficult to see.
 Consumers can tell which operating system is easier to use, and which one is
 faster, but there is no way to know which is more secure today.

 Until consumers can tell the difference between a secure program and one
 that is not, they will not pay more for the secure one.  Which means that it
 is not going to make many managers' radar screen, and therefore developer
 awareness will never happen on a broad scale.

 In my opinion, the way out of this trap is to get more information to
 consumers about the security in software.  Information like how many lines
 of code, what languages, what libraries, process used, security testing
 done, mechanisms included, and other information can and should be
 disclosed.

 --Jeff

 - Original Message -
 From: Gunnar Peterson [EMAIL PROTECTED]
 To: Yousef Syed [EMAIL PROTECTED]
 Cc: Secure Coding Mailing List [EMAIL PROTECTED]
 Sent: Friday, November 12, 2004 6:58 AM
 Subject: Re: [SC-L] How do we improve s/w developer awareness?


   Making software secure should be a requirement of the development
   process. I've had the privilege to have worked on some very good
   projects where the managers emphasised security in the beginning of
   the project's life cycle since it was a requirement of the client.
 
  Making software secure absolutely should be part of the development
  lifecycle, and as early as possible, too. My overall point was that if
  you talk to the people who really care about usability (as
  distinguished from just features) you will hear very similar
  frustrations about their ability to get what they consider true
  usability requirements into the end product. So in terms of learning
  from other communities I think as opposed to beating our heads against
  the same wall it can be helpful to learn from another *-ility community
  to see what ways they have tried successfully/unsuccessfully to
  increase the quality in software from their viewpoint. My suggestion is
  that the problem is not just software security but run a little deeper
  to the main problem of software quality of which security is one of the
  factors (albeit an important one).
 
  So what are the common threads amongst usability and security? For
  examples it is interesting to note that both communities seem to value
  early involvement in the development lifecycle and striving for
  simplicity in design. Software security does not need more barriers,
  but to the extent that we can find allies with similar goals and issues
  from other communities (could be *-ility, business, compliance, legal
  btw) and collaborate with them to communicate the value of quality,
  then our chances for shipping better software are increased.
 
  -gp
 
  Societies have invested more than a trillion dollars in software and
  have grotesquely enriched minimally competent software producers whose
  marketing skills far exceed their programming skills. Despite this
  enormous long-run investment in software, economists were unable to
  detect overall gains in economic productivity from information
  technology until perhaps the mid-1990s or later; the economist Robert
  Solow once remarked that computers showed up everywhere except in
  productivity statistics.
 
  Quality may sometimes be the happy by-product of competition. The lack
  of competition for the PC operating system and key applications has
  reduced the quality and the possibilities for the user interface. There
  is no need on our interface for a visible OS, visible applications, or
  for turning the OS and browsers and e-mail programs into marketing
  experiences. None of this stuff appeared on the original graphical user
  interface designed by Xerox PARC. That interface consisted almost
  entirely of documents--which are, after all, what users care about.
  Vigorous competition might well have led to distinctly better PC
  interfaces--without computer administrative debris, without operating
  system imperialism, without unwanted marketing experiences--compared to
  what we have now on Windows and Mac.
 
  Today nearly all PC software competition is merely between the old
  release and the new release of the same damn product. It is hard to
  imagine a more perversely sluggish incentive system for quality.
  Indeed, under such a system, the optimal economic strategy for market

Re: [SC-L] How do we improve s/w developer awareness?

2004-11-12 Thread Dana Epp
I think we have to go one step further.
It's nice to know what the attack patterns are. A better thing to do is to know how to identify them 
during threat modeling, and then apply safeguards to mitigate the risk. I.e., we need a merge of 
thoughts from Exploiting Software and Building Secure Software into a 
single source, where attack and defense can be spoken about together.
We all like to spout that until you know the threats to which you are 
susceptible, you cannot build secure systems. The reality is, unless you 
know how to MITIGATE the threats, simply knowing they exist doesn't do much 
to protect the customer.
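To make the point concrete: knowing that the "Relative Path Traversal" pattern exists is one thing; shipping a mitigation is another. Below is a minimal sketch of such a safeguard in Python, using only the standard library. The function name `safe_open_path` and the directory layout are illustrative assumptions, not from any of the books mentioned above.

```python
import os

def safe_open_path(base_dir: str, user_path: str) -> str:
    """Resolve a user-supplied path and refuse anything outside base_dir.

    Mitigates the relative-path-traversal pattern by canonicalizing
    the joined path (resolving '..' and symlinks) before checking it.
    """
    base = os.path.realpath(base_dir)
    full = os.path.realpath(os.path.join(base_dir, user_path))
    # After canonicalization, the resolved path must still live under base.
    if os.path.commonpath([full, base]) != base:
        raise ValueError("path escapes base directory")
    return full
```

The key design point, in the spirit of the thread, is that the check happens after canonicalization; checking the raw string for "../" would be defeated by several of the encoding patterns in Gary's list.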
Gary McGraw wrote:
One of the reasons that Greg Hoglund and I wrote Exploiting Software was
to gain a basic understanding of what we call attack patterns.  The
idea is to abstract away from platform and language considerations (at
least some), and thus elevate the level of attack discussion.
We identify and discuss 48 attack patterns in Exploiting Software.  Each
of them has a handful of associated examples from real exploits.  I will
paste in the complete list below.  As you will see, we provided a start,
but there is plenty of work here remaining to be done.
Perhaps by talking about patterns of attack we can improve the signal to
noise ratio in the exploit discussion department.
gem
Gary McGraw, Ph.D.
CTO, Cigital
http://www.cigital.com
WE NEED PEOPLE!
Make the Client Invisible
Target Programs That Write to Privileged OS Resources 
Use a User-Supplied Configuration File to Run Commands That Elevate
Privilege 
Make Use of Configuration File Search Paths 
Direct Access to Executable Files 
Embedding Scripts within Scripts 
Leverage Executable Code in Nonexecutable Files 
Argument Injection 
Command Delimiters 
Multiple Parsers and Double Escapes 
User-Supplied Variable Passed to File System Calls 
Postfix NULL Terminator 
Postfix, Null Terminate, and Backslash 
Relative Path Traversal 
Client-Controlled Environment Variables 
User-Supplied Global Variables (DEBUG=1, PHP Globals, and So Forth) 
Session ID, Resource ID, and Blind Trust
Analog In-Band Switching Signals (aka Blue Boxing) 
Attack Pattern Fragment: Manipulating Terminal Devices 
Simple Script Injection 
Embedding Script in Nonscript Elements 
XSS in HTTP Headers 
HTTP Query Strings 
User-Controlled Filename 
Passing Local Filenames to Functions That Expect a URL 
Meta-characters in E-mail Header
File System Function Injection, Content Based
Client-side Injection, Buffer Overflow
Cause Web Server Misclassification
Alternate Encoding the Leading Ghost Characters
Using Slashes in Alternate Encoding
Using Escaped Slashes in Alternate Encoding 
Unicode Encoding 
UTF-8 Encoding 
URL Encoding 
Alternative IP Addresses 
Slashes and URL Encoding Combined 
Web Logs 
Overflow Binary Resource File 
Overflow Variables and Tags 
Overflow Symbolic Links 
MIME Conversion 
HTTP Cookies 
Filter Failure through Buffer Overflow 
Buffer Overflow with Environment Variables 
Buffer Overflow in an API Call 
Buffer Overflow in Local Command-Line Utilities 
Parameter Expansion 
String Format Overflow in syslog() 
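
Several of the patterns above (e.g. "Multiple Parsers and Double Escapes", "URL Encoding", "Using Slashes in Alternate Encoding") share one lesson: filters that inspect input before it is fully decoded can be bypassed. A small illustrative sketch, not taken from the book, of the difference:

```python
from urllib.parse import unquote

def naive_filter(path: str) -> bool:
    """Rejects obvious traversal, but checks BEFORE decoding -- bypassable."""
    return "../" not in path

def canonical_filter(path: str) -> bool:
    """Decode repeatedly until the string is stable (handles double
    escapes), and only THEN apply the traversal check."""
    prev = None
    while path != prev:
        prev, path = path, unquote(path)
    return "../" not in path

# "%2e%2e%2f" decodes to "../" and slips past the naive check.
payload = "%2e%2e%2fetc%2fpasswd"
```

The loop-until-stable decode is what defeats the "Double Escapes" variant: `%252e` decodes to `%2e` on the first pass and to `.` on the second.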





--
Regards,
Dana Epp
[Blog: http://silverstr.ufies.org/blog/]