
Re: [cncf-sig-security] [cncf-toc] Vulnerability scanning for CNCF projects

Emily Fox
 

Same! We would love a presentation! Shubhra, please add yourself to the agenda for an upcoming meeting in December.

The Security SIG meets every Wednesday at 10:00 am PT (US Pacific)

Meeting minutes and agenda: https://docs.google.com/document/d/170y5biX9k95hYRwprITprG6Mc9xD5glVn-4mB2Jmi2g/

- Emily Fox

@TheMoxieFox (personal handle)


On Thu, 19 Nov 2020, 12:26 Justin Cormack via lists.cncf.io <justin.cormack=docker.com@...> wrote:
I would be interested in that. 

Justin


On Thu, 19 Nov 2020 at 17:23, Shubhra Kar <skar@...> wrote:
If this group is interested, my team would love to present the capabilities, and equally the limitations, of the LFX security tool project. We are currently working on SBOM policy management, with support for scanning build systems and container images coming next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively relatively quickly:

1. The tool provides the capability to turn scanning of dev dependencies on or off. We need the group to decide whether we should do that, and for which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure manifests are created consistently.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration that scans for security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process for fixing vulnerabilities (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts. 

As well as vulnerability scanning this is showing license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Re: Vulnerability scanning for CNCF projects

Luke A Hinds
 

Add me as well.

I am one of the maintainers of Bandit (a Python AST-based security linter), which sees around 25k downloads a day, so I have a fair amount of experience with what does and does not work well in security linters. As others have mentioned, false positives always happen, so you need a developer UX that does not turn the linter into something that gets yelled at all the time.

On Thu, Nov 19, 2020 at 5:49 PM Dave Zolotusky via lists.cncf.io <dzolo=spotify.com@...> wrote:
Same, I'd be interested.

~Dave









Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

alexis richardson
 

+1


On Thu, Nov 19, 2020 at 10:22 AM Gadi Naor via lists.cncf.io <gadi=alcide.io@...> wrote:
This is a great initiative that also sends a message that security is part of the core functionality. 

A few suggestions:
  1. If we can ensure CNCF projects follow container image authoring best practices, such as building images from scratch or on distroless base images, it will eliminate a lot of the noise static scanners generate.
  2. For projects that are designed to run on k8s, the deployment assets (k8s manifests, Helm charts, kustomized resources) should be scanned for security best practices, with tools such as conftest, kyverno, commercial offerings, or a combination, to verify components do not run as privileged, do not use host namespaces, have network policies, etc.
  3. In cases where exceptions must be made, there should be a clear process and an audited policy/config for them - e.g. CVEs that cannot be fixed, components that need certain escalated privileges to function, etc.
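To make point 2 concrete, a manifest check of this kind can be sketched in a few lines. This is a minimal illustration only (the function and the expected manifest shape are hypothetical, not the API of conftest or kyverno, which express such rules as policies):

```python
# Minimal sketch of a workload policy check, in the spirit of the manifest
# scanning suggested above. Assumes a Kubernetes Deployment-style manifest
# already parsed (e.g. from YAML) into a plain dict.

def check_workload(manifest):
    """Return a list of policy violations for a parsed k8s workload manifest."""
    violations = []
    pod_spec = manifest.get("spec", {}).get("template", {}).get("spec", {})

    # Components should not share host namespaces.
    for ns in ("hostNetwork", "hostPID", "hostIPC"):
        if pod_spec.get(ns):
            violations.append(f"{ns} is enabled")

    # Containers should not run privileged.
    for container in pod_spec.get("containers", []):
        sec = container.get("securityContext") or {}
        if sec.get("privileged"):
            name = container.get("name", "?")
            violations.append(f"container {name} runs privileged")

    return violations
```

A CI job could run such checks over every manifest in the repo and fail the build on a non-empty result; real policy engines add exception handling and audit trails on top of this basic shape.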
Gadi

On Thu, Nov 19, 2020 at 11:59 AM Gareth Rushgrove <gareth@...> wrote:
On Wed, 18 Nov 2020 at 16:54, Emily Fox <themoxiefoxatwork@...> wrote:
>
> Liz,
>   Love this. As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices. In the last few assessments we've pushed more for this, as well as for responsible disclosure instructions and general security-mindedness for project sustainment. This fits in with those efforts. We currently have the assessment process undergoing some updates (on hold for KubeCon), which makes this a great time to potentially include this work. I personally would like to see license dependencies and dependency trees, to help push forward in the area of SBOMs.
>   I think we should be clear, however, on what our thresholds and terms are in this area. Offhand, I can think of the following potentials:
> * Listing of vulns in deliverable artifacts
> * Listing licensing dependencies
> * SBOM
> * vulnerability threshold and prioritizing resolution prior to artifact delivery
> * vulnerability threshold and prioritizing resolution post artifact delivery
>
> Definitely worth a conversation and follow-ups. Do you have anything in mind that is a must-have from the above, or anything I missed or misunderstood?
>

I'd be happy to join and help here.

HUGE DISCLAIMER. I work at Snyk, which is the service powering the
scans. I'm also a maintainer of Conftest as part of the Open Policy
Agent project and know a bunch of folks on here. I'm not trying to
sell you anything, other nice vendors exist, etc. I just happen to
have opinions and experience here.

> The current numbers for a lot of our projects look really quite bad

This is nearly always the case when projects or companies first look at
vulnerabilities. It's indicative of the problem domain more so than of
projects doing the wrong thing. Fixing starts with visibility.

>  reviewing such a massive amount of data for project owners might take way too much time

The main thing to do is break the problem down. Luckily there are a
few things you can do here.

* As you note, starting with non-test dependencies is a good idea
* Then start with the most severe and those which can be fixed, and
repeat. Standards like CVSS exist, as well as more involved
vendor-specific mechanisms. CVSS is mainly simple to read on the
surface (Low 0.1 - 3.9, Medium 4.0 - 6.9, High 7.0 - 8.9, Critical 9.0
- 10.0)
* Each time you clear a new threshold, put in checks in CI to help
enforce things in the future

For instance:

* Start with Critical (CVSS 9+), non-test issues that have a fix available
* Add a CI check to break the build for CVSS 9+, non-test, fixable issues
* Do the same for 8+ non-test
* Do the same for 9+ test
...

etc.
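The ratcheting gate described above can be sketched as a small filter that CI runs over scanner output. This is a hedged illustration, not any particular tool's API: the record fields (cvss, fix_available, is_test_dep) are hypothetical, and real scanner output (Snyk, Trivy, etc.) would need mapping into this shape:

```python
# Sketch of a ratcheting CI gate: fail the build only for issues at or above
# the current CVSS threshold that are fixable and (initially) not test-only.
# The issue record fields here are illustrative, not a real scanner's schema.

def blocking_issues(issues, threshold=9.0, include_test=False):
    """Return the issues that should break the build under the current gate."""
    return [
        i for i in issues
        if i["cvss"] >= threshold
        and i["fix_available"]
        and (include_test or not i["is_test_dep"])
    ]
```

Each time a level is cleared, CI tightens the gate (threshold=8.0, then include_test=True for 9.0+, and so on), so the bar only ever ratchets upward and new regressions are caught at the level already achieved.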

In this way what seems an impossibly large bit of work gets broken
down and you get value quickly. You can absolutely do this at your own
pace. I wouldn't advocate for CNCF to set deadlines, though guidelines
and reporting for graduated projects might be useful.

Separately, you likely want to have some level of triage for
vulnerabilities that don't have fixes available yet. The above
approach is somewhat mechanical; triage needs more context and
security experience. I'd at least recommend having maintainers triage
Critical severity issues in dependencies. Assuming that's rare, you
can extend this as far as you like and have time for (to High, or
Medium, or a specific CVSS threshold).

> false positives from things like dependencies only used in test

I wouldn't think of test vulnerabilities as false positives, just
potentially a different type of vulnerability. As one example,
compromised test dependencies have the potential to steal build
credentials, and suddenly someone is shipping a compromised version of
software to end users via your release toolchain.


I'm sure the above is obvious to some, but I thought it was worth
laying out. It should also be pretty tool agnostic.
As mentioned, happy to join conversations if folks are discussing.

Gareth


> ~Emily Fox
> @TheMoxieFox



--
Gareth Rushgrove
@garethr

garethr.dev
devopsweekly.com







--
Gadi Naor, CTO & Security Plumber
 

US.   2443 Fillmore St, San Francisco, CA, 94115
IL.    5 Miconis St, Tel Aviv, 6777214   
M. +972-52-6618811
Web.      www.alcide.io
GitHub. github.com/alcideio


Complete Kubernetes & Service Mesh Security. 
Bridging Security & DevOps.






Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

Gareth Rushgrove
 

On Wed, 18 Nov 2020 at 16:54, Emily Fox <themoxiefoxatwork@...> wrote:

Liz,
Love this. As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices. The last few assessments we've begun pushing more for this, as well as responsible disclosure instructions and general security mindedness for project sustainment. This fits in alignment with those efforts. We currently have the assessment process undergoing some updates (held currently for kubecon) and this make it a great time to potentially include this. I personally would like to see license dependencies and dependency trees to help push forward in the area of SBOM.
I think we should be clear however in what our thresholds and terms are in this area, offhand i can think of the following potentials:
* Listing of vulns in deliverable artifacts
* Listing licensing dependencies
* SBOM
* vulnerability threshold and prioritizing resolution in prior to artifact delivery
* vulnerability threshold and prioritizing resolution post artifact delivery

Definitely worth a conversation and follow-ups. Do you have anything in mind that are must haves off the above or anything I missed or misunderstood?
I'd be happy to join and help here.

HUGE DISCLAIMER. I work at Snyk, which is the service powering the
scans. I'm also a maintainer of Conftest as part of the Open Policy
Agent project and know a bunch of folks on here. I'm not trying to
sell you anything, other nice vendors exist, etc. I just happen to
have opinions and experience here.

The current numbers for a lot of our projects look really quite bad
This is nearly always the case when projects or company first look at
vulnerabilities. It's indicative of the problem domain more so than
projects doing the wrong thing. Fixing starts with visibility.

reviewing such a massive amount of data for project owners might take way too much time
The main thing to do is break the problem down. Luckily there are a
few things you can do here.

* As you note, starting with non-test dependencies is a good idea
* Then start with the most severe and those which can be fixed, and
repeat. Standards like CVSS exist, as well as more involved
vendor-specific mechanisms. CVSS is mainly simple to read on the
surface (Low 0.1 - 3.9, Medium 4.0 - 6.9, High 7.0 - 8.9, Critical 9.0
- 10.0)
* Each time you clear a new threshold, put in checks in CI to help
enforce things in the future

For instance:

* Start with Critical (CVSS 9+), non-test issues that have a fix available
* Add a CI check to break the build for CVSS 9+, non-test, fixable issues
* Do the same for 8+ non-test
* Do the same for 9+ test
...

etc.

In this way what seems an impossibly large bit of work gets broken
down and you get value quickly. You can absolutely do this at your own
pace. I wouldn't advocate for CNCF to set deadlines, though guidelines
and reporting for graduated projects might be useful.

Separately, you likely want to have some level of triage for
vulnerabilities that don't have fixes available yet. The above
approach is somewhat mechanical, triage needs more context and
security experience. I'd at least recommend having maintainers triage
Critical severity issues in dependencies. Assuming that's rare, you
can extend this as far as you like and have time to do (to High, or
Medium, or a specific CVSS threshold).

false positives from things like dependencies only used in test
I wouldn't think of test vulnerabilities as false positives, just
potential a different type of vulnerability. As one example,
compromised test vulnerabilities have the potential to steal build
credentials and suddenly someone is shipping a compromised version of
software to end users using your release toolchain.


I'm sure the above is obvious to some, but I thought it was worth
laying out. It should also be pretty tool agnostic.
As mentioned, happy to join conversations if folks are discussing.

Gareth



On Wed, Nov 18, 2020 at 11:41 AM Liz Rice <liz@...> wrote:

Hi TOC and SIG Security folks

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning this is showing license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


--
Gareth Rushgrove
@garethr

garethr.dev
devopsweekly.com


Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

Eli Nesterov <eli.nesterov@...>
 

Liz, this is great! Having vulnerability scanning is a good thing, but looking into the results there may be too many false positives (as you pointed out) and a lot of noise. In my experience, reviewing such a massive amount of data for project owners might take way too much time.
I actually like the idea of the security scorecard https://github.com/ossf/scorecard which covers lots of security best practices and provides actionable feedback, along with advice on how to improve using different tools.

--eli

On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts. 

As well as vulnerability scanning this is showing license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Re: FYI: Cloud Native Security Whitepaper 2020

Matt Jarvis
 

This is awesome ! Well done folks ...


On Wed, 18 Nov 2020 at 17:41, Justin Cormack via lists.cncf.io <justin.cormack=docker.com@...> wrote:
Thanks to everyone who worked so hard on this. Congratulations on shipping it, it will be very
helpful. 

Justin


On Wed, Nov 18, 2020 at 5:38 PM Chris Aniszczyk <caniszczyk@...> wrote:
The CNCF Security SIG did an excellent job putting together a white paper around cloud native security: https://github.com/cncf/sig-security/blob/master/security-whitepaper/cloud-native-security-whitepaper.md

It's great! Please check it out and feel free to provide your feedback on it!

--
Chris Aniszczyk (@cra)


[RESULT] Buildpacks moves to incubation

Amye Scavarda Perrin
 


Re: FYI: Cloud Native Security Whitepaper 2020

Justin Cormack
 

Thanks to everyone who worked so hard on this. Congratulations on shipping it, it will be very
helpful. 

Justin


On Wed, Nov 18, 2020 at 5:38 PM Chris Aniszczyk <caniszczyk@...> wrote:
The CNCF Security SIG did an excellent job putting together a white paper around cloud native security: https://github.com/cncf/sig-security/blob/master/security-whitepaper/cloud-native-security-whitepaper.md

It's great! Please check it out and feel free to provide your feedback on it!

--
Chris Aniszczyk (@cra)


FYI: Cloud Native Security Whitepaper 2020

Chris Aniszczyk
 

The CNCF Security SIG did an excellent job putting together a white paper around cloud native security: https://github.com/cncf/sig-security/blob/master/security-whitepaper/cloud-native-security-whitepaper.md

It's great! Please check it out and feel free to provide your feedback on it!

--
Chris Aniszczyk (@cra)


Re: FYI: New Training Course on Diversity in Open Source

Chris Aniszczyk
 

Let's put it as a discussion item for the next meeting and consider rolling it out in 2021


On Wed, Nov 18, 2020 at 9:47 AM Liz Rice <liz@...> wrote:
Thanks Chris. 

We could also require it for TOC members & SIG chairs too 


On Wed, Nov 18, 2020 at 2:40 PM Chris Aniszczyk <caniszczyk@...> wrote:
Thanks!

Liz I have added this as a requirement per the project proposal process: https://github.com/cncf/toc/pull/570

We can discuss at the next TOC meeting to vote/finalize the changes, but I think the best place to put the requirement is at the project proposal phase where we can easily check against the initial list of maintainers. In the future, we can try to do something fancy like an automated audit report based on what's in maintainers.cncf.io and if they have taken the course.

On Wed, Nov 18, 2020 at 8:30 AM Bartłomiej Płotka <bwplotka@...> wrote:
Hi,

Just completed it (takes ~20 min) and I can definitely recommend it to all who maintain open source projects! 🤗

It's actionable and insightful, +1 to make it mandatory.

BTW, direct training link: https://training.linuxfoundation.org/training/inclusive-open-source-community-orientation-lfc102/ (it's free)

Kind Regards,
Bartek Płotka (@bwplotka)

On Wed, 18 Nov 2020 at 10:07, Liz Rice <liz@...> wrote:
I’d like to see all project maintainers taking this at all maturity levels

Probably getting carried away here, but it would be nice if we could automate this, a bit like CLA bots: automatically flagging up anyone who’s listed in a Maintainers file if they haven’t taken the course


On Fri, 13 Nov 2020 at 15:31, Chris Aniszczyk <caniszczyk@...> wrote:
As a follow up from previous discussions on D&I training, we at The Linux Foundation in partnership with NCWIT are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!


--
Chris Aniszczyk (@cra)



--
Chris Aniszczyk (@cra)


--
Chris Aniszczyk (@cra)


Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

Chris Aniszczyk
 

" Should we have something in place for requiring projects to have a process to fix vulnerability issues (at least the serious ones)?"

We have a graduation requirement around CII badging, which requires a security disclosure process, so it's there but not formally codified; we could do that. I think the important thing is that projects also publish advisories in a standard way (like via the GitHub security API).

We should treat the LF tool suite as another option for projects to take advantage of; many projects are already using Snyk, FOSSA, WhiteSource, etc., as listed here: https://github.com/cncf/servicedesk#tools

You can kind of get an SBOM (depending how you define SBOM ;p) for some of our projects already: https://app.fossa.com/attribution/c189c5b9-fe2c-45f2-ba40-c34c36bab868

I think offering projects more choice is always better as the landscape changes often in tooling.

On Wed, Nov 18, 2020 at 10:54 AM Emily Fox <themoxiefoxatwork@...> wrote:
Liz,
  Love this.  As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices. In the last few assessments we've been pushing more for this, as well as for responsible disclosure instructions and general security-mindedness for project sustainment. This aligns with those efforts. The assessment process is currently undergoing some updates (on hold for KubeCon), which makes this a great time to potentially include it. I personally would like to see license dependencies and dependency trees to help push forward in the area of SBOM.
  I think we should be clear, however, on what our thresholds and terms are in this area; offhand I can think of the following potentials:
* Listing of vulns in deliverable artifacts
* Listing licensing dependencies
* SBOM
* Vulnerability threshold and prioritized resolution prior to artifact delivery
* Vulnerability threshold and prioritized resolution post artifact delivery

Definitely worth a conversation and follow-ups. Do you have anything in mind that is a must-have from the above, or anything I missed or misunderstood?

~Emily Fox


On Wed, Nov 18, 2020 at 11:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts. 

As well as vulnerability scanning this is showing license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz



--
Chris Aniszczyk (@cra)


Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

Emily Fox
 

Liz,
  Love this.  As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices. In the last few assessments we've been pushing more for this, as well as for responsible disclosure instructions and general security-mindedness for project sustainment. This aligns with those efforts. The assessment process is currently undergoing some updates (on hold for KubeCon), which makes this a great time to potentially include it. I personally would like to see license dependencies and dependency trees to help push forward in the area of SBOM.
  I think we should be clear, however, on what our thresholds and terms are in this area; offhand I can think of the following potentials:
* Listing of vulns in deliverable artifacts
* Listing licensing dependencies
* SBOM
* Vulnerability threshold and prioritized resolution prior to artifact delivery
* Vulnerability threshold and prioritized resolution post artifact delivery

Definitely worth a conversation and follow-ups. Do you have anything in mind that is a must-have from the above, or anything I missed or misunderstood?

~Emily Fox


On Wed, Nov 18, 2020 at 11:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts. 

As well as vulnerability scanning this is showing license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Vulnerability scanning for CNCF projects

Liz Rice
 

Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts. 

As well as vulnerability scanning this is showing license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Re: FYI: New Training Course on Diversity in Open Source

Liz Rice
 

Thanks Chris. 

We could also require it for TOC members & SIG chairs too 


On Wed, Nov 18, 2020 at 2:40 PM Chris Aniszczyk <caniszczyk@...> wrote:
Thanks!

Liz I have added this as a requirement per the project proposal process: https://github.com/cncf/toc/pull/570

We can discuss at the next TOC meeting to vote/finalize the changes, but I think the best place to put the requirement is at the project proposal phase where we can easily check against the initial list of maintainers. In the future, we can try to do something fancy like an automated audit report based on what's in maintainers.cncf.io and if they have taken the course.

On Wed, Nov 18, 2020 at 8:30 AM Bartłomiej Płotka <bwplotka@...> wrote:
Hi,

Just completed it (takes ~20 min) and I can definitely recommend it to all who maintain open source projects! 🤗

It's actionable and insightful, +1 to make it mandatory.

BTW, direct training link: https://training.linuxfoundation.org/training/inclusive-open-source-community-orientation-lfc102/ (it's free)

Kind Regards,
Bartek Płotka (@bwplotka)

On Wed, 18 Nov 2020 at 10:07, Liz Rice <liz@...> wrote:
I’d like to see all project maintainers taking this at all maturity levels

Probably getting carried away here, but it would be nice if we could automate this, a bit like CLA bots: automatically flagging up anyone who’s listed in a Maintainers file if they haven’t taken the course


On Fri, 13 Nov 2020 at 15:31, Chris Aniszczyk <caniszczyk@...> wrote:
As a follow up from previous discussions on D&I training, we at The Linux Foundation in partnership with NCWIT are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!


--
Chris Aniszczyk (@cra)



--
Chris Aniszczyk (@cra)


Re: FYI: New Training Course on Diversity in Open Source

Chris Aniszczyk
 

Thanks!

Liz I have added this as a requirement per the project proposal process: https://github.com/cncf/toc/pull/570

We can discuss at the next TOC meeting to vote/finalize the changes, but I think the best place to put the requirement is at the project proposal phase where we can easily check against the initial list of maintainers. In the future, we can try to do something fancy like an automated audit report based on what's in maintainers.cncf.io and if they have taken the course.

On Wed, Nov 18, 2020 at 8:30 AM Bartłomiej Płotka <bwplotka@...> wrote:
Hi,

Just completed it (takes ~20 min) and I can definitely recommend it to all who maintain open source projects! 🤗

It's actionable and insightful, +1 to make it mandatory.

BTW, direct training link: https://training.linuxfoundation.org/training/inclusive-open-source-community-orientation-lfc102/ (it's free)

Kind Regards,
Bartek Płotka (@bwplotka)

On Wed, 18 Nov 2020 at 10:07, Liz Rice <liz@...> wrote:
I’d like to see all project maintainers taking this at all maturity levels

Probably getting carried away here, but it would be nice if we could automate this, a bit like CLA bots: automatically flagging up anyone who’s listed in a Maintainers file if they haven’t taken the course


On Fri, 13 Nov 2020 at 15:31, Chris Aniszczyk <caniszczyk@...> wrote:
As a follow up from previous discussions on D&I training, we at The Linux Foundation in partnership with NCWIT are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!


--
Chris Aniszczyk (@cra)



--
Chris Aniszczyk (@cra)


Re: FYI: New Training Course on Diversity in Open Source

Bartłomiej Płotka
 

Hi,

Just completed it (takes ~20 min) and I can definitely recommend it to all who maintain open source projects! 🤗

It's actionable and insightful, +1 to make it mandatory.

BTW, direct training link: https://training.linuxfoundation.org/training/inclusive-open-source-community-orientation-lfc102/ (it's free)

Kind Regards,
Bartek Płotka (@bwplotka)

On Wed, 18 Nov 2020 at 10:07, Liz Rice <liz@...> wrote:
I’d like to see all project maintainers taking this at all maturity levels

Probably getting carried away here, but it would be nice if we could automate this, a bit like CLA bots: automatically flagging up anyone who’s listed in a Maintainers file if they haven’t taken the course
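The CLA-bot-style check described above could be sketched roughly like this. Everything here is hypothetical: the MAINTAINERS file layout and the list of course completions are invented formats, purely to show the shape of the cross-check.

```python
# Hypothetical sketch of the "CLA-bot style" check: flag anyone
# listed in a MAINTAINERS file who has no recorded course completion.
# Both input formats are invented for illustration.

def flag_missing(maintainers_text, completed_handles):
    """Return maintainer handles with no recorded course completion."""
    maintainers = [
        line.strip().lstrip("@").lower()
        for line in maintainers_text.splitlines()
        if line.strip() and not line.startswith("#")
    ]
    completed = {h.lstrip("@").lower() for h in completed_handles}
    return [m for m in maintainers if m not in completed]

# Example MAINTAINERS file (hypothetical format: one handle per line).
maintainers_file = """\
# MAINTAINERS
@alice
@bob
@carol
"""
```

A real bot would pull the MAINTAINERS file from each repo and the completion roster from the training platform, then comment on PRs or open an issue for anyone flagged.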


On Fri, 13 Nov 2020 at 15:31, Chris Aniszczyk <caniszczyk@...> wrote:
As a follow up from previous discussions on D&I training, we at The Linux Foundation in partnership with NCWIT are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!


--
Chris Aniszczyk (@cra)
