metrics for security

  • Thread starter Jeffrey R. Jones [MSFT]

Jeffrey R. Jones [MSFT]

I would like to start a discussion to take security progress metrics beyond
the theoretical and into the practical: realistic and usable for measuring
progress and relative security. I've had a couple of brainstorm sessions and
given it some thought, but for the criteria to become useful, I think there
should be an active community discussion that asks hard questions and refines
the criteria into an accepted set. If we achieve that, we may even enable
some good independent apples-to-apples analysis.

I've given some thought to the criteria, and I think (a short sketch of how
such criteria might be recorded follows the list):
• The criteria should be measurable and distinguishable
• The methodology should be repeatable by objective 3rd parties
• The criteria should support real-world scenarios
• The criteria should be useful for business decisions
• The criteria should support measuring version-over-version improvements
• The criteria should support competitive comparisons
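To make "measurable and distinguishable" concrete, here is a minimal sketch
in Python of how a criterion might be recorded so an objective 3rd party can
reproduce the number. The field names and values are illustrative
assumptions, not a proposed standard:

    from dataclasses import dataclass
    from typing import Optional

    # Illustrative schema -- the field names are hypothetical placeholders.
    @dataclass
    class Criterion:
        category: str   # high-level area, e.g. "Security Quality"
        name: str       # e.g. "open listening ports (default install)"
        unit: str       # e.g. "count", "days", "yes/no"
        method: str     # steps a 3rd party repeats to reproduce the value
        value: Optional[float] = None

    # The same criterion measured on two versions supports
    # version-over-version (and competitive) comparisons.
    v1 = Criterion("Security Quality", "open listening ports", "count",
                   "enumerate listeners on a default install", value=12)
    v2 = Criterion("Security Quality", "open listening ports", "count",
                   "enumerate listeners on a default install", value=3)
    print(f"improvement: {v1.value - v2.value:.0f} fewer listening ports")

The point of the "method" field is repeatability: two independent testers
following the same steps should land on the same number.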

My strawman proposal for security criteria has the following high-level
areas: assurance, security quality, protective capabilities, manageability &
automation, security update tools, and security update policies. I have also
added one other category based upon previous feedback on the strawman. My
examples below in each category are not meant to be comprehensive, but just
a good starting point for the discussions.

Assurance is an area that experienced security professionals will recognize.
In the days of the NSA's Trusted Computer System Evaluation Criteria,
different assurance levels were tied to different feature sets. The Common
Criteria, in use today, introduced a more flexible model that separates
assurance levels from the feature set; EAL4, for example, stands for
Evaluation Assurance Level 4. So, in the assurance area, I can think of
several measurable sub-criteria: 3rd-party certifications, source integrity
protections, established change control processes, quality assurance
processes, and penetration testing.
Security Quality is the term I have begun using for things like
vulnerabilities that are found in code. In this category, I can think of
several sub-criteria: number of vulnerabilities fixed, number of public
vulnerabilities unfixed, average time to fix a publicly-reported
vulnerability, number of open listening ports, and number of listening
services or daemons.
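As a sketch of how a couple of these numbers could be computed, assuming a
hypothetical list of (disclosure date, fix date) pairs gathered from public
advisories (the dates below are invented):

    from datetime import date

    # Hypothetical advisory data: (publicly disclosed, fix shipped);
    # None means no fix has shipped yet.
    advisories = [
        (date(2004, 1, 5), date(2004, 1, 26)),
        (date(2004, 2, 10), date(2004, 3, 9)),
        (date(2004, 4, 1), None),
    ]

    fixed = [(d, f) for d, f in advisories if f is not None]
    unfixed = len(advisories) - len(fixed)
    avg_days = sum((f - d).days for d, f in fixed) / len(fixed)

    print(f"vulnerabilities fixed: {len(fixed)}, public unfixed: {unfixed}")
    print(f"average days to fix a publicly-reported vuln: {avg_days:.1f}")

The hard part, of course, is agreeing on the input data set, not the
arithmetic.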
Protective Capabilities are the capabilities that might protect the software
from attack even in the presence of a vulnerability. Sub-criteria include:
port firewall capabilities, antivirus/worm capabilities, strong
authentication, behavior monitoring, backup, and security policy enforcement
features.
Manageability & Automation is a category to capture software technologies
designed to make security-compromising mistakes less likely and to reduce the
expertise required to deploy and manage a product with low security risk.
Sub-criteria would include: security configuration wizards/tools, lockdown
tools, vulnerability assessment capabilities, security reporting
capabilities, and central security management capabilities.
Security Update Packages & Tools are a complementary part of measuring
security until either security quality or protective capabilities reach 100%
effectiveness. Sub-criteria should include: automatic update capabilities,
patch approval capability for enterprise updates, patch deployment tools,
uninstall capabilities, re-compile requirements, reboot requirements, size
of packages, and compatibility with 3rd-party deployment tools.
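One hedged illustration: if each of these sub-criteria is reduced to a yes/no
answer, a simple coverage ratio falls out. The checklist entries and the
equal weighting below are placeholder assumptions, not a proposed rating:

    # Hypothetical yes/no checklist for the update-tooling sub-criteria.
    checklist = {
        "automatic update capabilities": True,
        "enterprise patch approval": True,
        "uninstall capabilities": False,
        "no re-compile required": True,
        "no reboot required": False,
        "works with 3rd-party deployment tools": True,
    }

    # Equal weights for illustration; real criteria would need agreed weights.
    score = sum(checklist.values()) / len(checklist)
    print(f"update tooling coverage: {score:.0%}")  # 67% for this example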
Security Update Policies is an interesting one that I think deserves a lot of
discussion. Sub-criteria for this would include:
security lifecycle time period of products, patches all publicly disclosed
issues, issues security advisories, follows responsible disclosure, offers
mitigation guidance, patch test quality criteria, and patches all
versions/languages at the same time. If I am the responsible administrator
deploying and managing a dynamic web farm, for example, these criteria might
be strong considerations when selecting my software components.
Workload Required Security Features is one I have added based upon feedback
I received when I published an earlier strawman in the Microsoft Security
Newsletter. The work to flesh this out still needs to be done, but the idea
is as follows. What security features are necessary to lock down the
security for _any_ OS deployment? What security features would be necessary
for a secure deployment of a dynamic web server in a particular business
scenario? A database server? For a given scenario, we would work to
identify features like identification & authentication, # of biometrics
supported, remote encrypted connectivity, comprehensive security auditing,
etc., that would be useful decision-making criteria.
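To picture the idea, here is a small sketch in which each scenario maps to a
required feature set and a product either satisfies it or does not. Both the
scenario map and the product's feature list are invented for illustration:

    # Hypothetical scenario-to-required-features map; entries are
    # illustrative, not a worked-out list.
    REQUIRED = {
        "any OS deployment": {
            "identification & authentication",
            "comprehensive security auditing",
        },
        "dynamic web server": {
            "identification & authentication",
            "comprehensive security auditing",
            "remote encrypted connectivity",
        },
    }

    # A candidate product's claimed feature set (also invented).
    product_features = {
        "identification & authentication",
        "remote encrypted connectivity",
    }

    # Within a scenario, every product is judged against the same feature
    # set, which keeps the comparison apples-to-apples.
    for scenario, required in REQUIRED.items():
        missing = required - product_features
        verdict = "meets requirements" if not missing \
            else "missing: " + ", ".join(sorted(missing))
        print(f"{scenario}: {verdict}")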

If security metrics is an area you care about, then comment on these
high-level categories. Are they too narrow, too broad? If/once we get some
refinement on the high-level criteria, we can split off some individual
threads for each category and try to flesh out a list of criteria.

Jeff
 

Carey Frisch [MVP]

Windows XP Security Checklist
http://labmice.techtarget.com/articles/winxpsecuritychecklist.htm

How can I harden my computer or server to secure it from hackers?
http://securityadmin.info/faq4.asp#harden

--
Carey Frisch
Microsoft MVP
Windows XP - Shell/User

Be Smart! Protect your PC!
http://www.microsoft.com/security/protect/

 

Danny Sanders

http://isecom.securenetltd.com/osstmm.en.2.1.pdf



hth
DDS W 2k MVP MCSE

 

Jeffrey R. Jones [MSFT]

Danny & Carey,

These are some excellent documents, thanks for bringing them to my
attention.

However, these are pretty operationally focused - something I would use
after I selected my software.

Have you given any thought to security metrics for how you would select
which platform or products you would base an initial decision on?

Jeff


 
