
Re: [EXTERNAL] [cncf-toc] [VOTE] Open Policy Agent from incubating to graduated

Davanum Srinivas
 

+1 Non-binding

On Tue, Dec 8, 2020 at 11:45 AM Brendan Burns via lists.cncf.io <bburns=microsoft.com@...> wrote:
+1, Binding



From: cncf-toc@... <cncf-toc@...> on behalf of Amye Scavarda Perrin via lists.cncf.io <ascavarda=linuxfoundation.org@...>
Sent: Wednesday, September 30, 2020 9:00 AM
To: CNCF TOC <cncf-toc@...>
Subject: [EXTERNAL] [cncf-toc] [VOTE] Open Policy Agent from incubating to graduated
 
The Open Policy Agent project has applied to move from incubating to graduated. (https://github.com/cncf/toc/pull/520)

The due diligence document can be found here: https://docs.google.com/document/d/19M5fTpe57rQIMNxawRl5wSWvJUapuzY-CkV4O5pvieU/edit
 
Brendan Burns has called for public comment: https://lists.cncf.io/g/cncf-toc/message/5281

Please vote (+1/0/-1) by replying to this thread.

Remember that only the TOC has binding votes, but we do appreciate non-binding votes from the community as a sign of support!

--
Amye Scavarda Perrin | Program Manager | amye@...



--
Davanum Srinivas :: https://twitter.com/dims


Re: [EXTERNAL] [cncf-toc] [VOTE] Open Policy Agent from incubating to graduated

Brendan Burns
 

+1, Binding



From: cncf-toc@... <cncf-toc@...> on behalf of Amye Scavarda Perrin via lists.cncf.io <ascavarda=linuxfoundation.org@...>
Sent: Wednesday, September 30, 2020 9:00 AM
To: CNCF TOC <cncf-toc@...>
Subject: [EXTERNAL] [cncf-toc] [VOTE] Open Policy Agent from incubating to graduated
 
The Open Policy Agent project has applied to move from incubating to graduated. (https://github.com/cncf/toc/pull/520)

The due diligence document can be found here: https://docs.google.com/document/d/19M5fTpe57rQIMNxawRl5wSWvJUapuzY-CkV4O5pvieU/edit
 
Brendan Burns has called for public comment: https://lists.cncf.io/g/cncf-toc/message/5281

Please vote (+1/0/-1) by replying to this thread.

Remember that only the TOC has binding votes, but we do appreciate non-binding votes from the community as a sign of support!

--
Amye Scavarda Perrin | Program Manager | amye@...


[RFC] Refining the way we communicate deprecations/wide-reaching changes to the project

Stephen Augustus
 

Forwarding here as well, if anyone is interested in leaving feedback.

-- Stephen

---------- Forwarded message ---------
From: Stephen Augustus <stephen.k8s@...>
Date: Wed, Dec 2, 2020, 22:56
Subject: [k8s-steering] [RFC] Refining the way we communicate deprecations/wide-reaching changes to the project
To: Kubernetes developer/contributor discussion <kubernetes-dev@...>
Cc: steering <steering@...>


Hey Kubernetes Community,

tl;dr -- words are hard sometimes and we should take some time and care to assess the way we wield them.

---

As we go through deprecations and infrastructure changes in the project, it might be a worthwhile exercise to assess and refine the way we communicate them.

I can think of a few recent examples that caused some panic and required additional lift from contributors to reframe, or to contort/extend support to accommodate.
We should consider what it means to turn down a service, piece of functionality, or kubernetes/kubernetes-adjacent system, and the type of impact it may have on consumers.

Without policing contributors, as maintainers of the project, we also have a responsibility to users to be careful and deliberate with our communications outside of the project, whether it be Twitter, Hacker News, etc., etc.

So how can we improve?

I think depending on the scope of a change, the following SIGs should be involved in crafting comms:
  • SIG Architecture
  • SIG Release
  • SIG Docs
with SIG ContribEx assisting with consistent delivery across our properties.

I'm curious to hear everyone's thoughts here.

-- Stephen

--
You received this message because you are subscribed to the Google Groups "steering" group.
To unsubscribe from this group and stop receiving emails from it, send an email to steering+unsubscribe@....
To view this discussion on the web visit https://groups.google.com/a/kubernetes.io/d/msgid/steering/CAOqU-DRtVQRC79v1xM5zVpQ11hWoyqdhgrhOamkVQ3%2B5kJw44A%40mail.gmail.com.


Agenda for 12/1

Amye Scavarda Perrin
 

Hi all, 
We'll be meeting tomorrow: 
  • Optional SIG Updates
  • New Training Course on Diversity in Open Source
  • Moved: Security Scanning for projects to December 15th, when Liz is able to make the meeting


Thanks! 

--
Amye Scavarda Perrin | Program Manager | amye@...


Re: [VOTE] Buildpacks to move to incubation

Isaac Mosquera
 

+1 NB



On Wed, Nov 18, 2020 at 12:49 PM, Archy K <ayrat.khayretdinov@...> wrote:
+1 NB

On Wed, Oct 7, 2020 at 5:21 PM Amye Scavarda Perrin <ascavarda@...> wrote:
Cloud Native Buildpacks has applied to move from sandbox to incubation. (https://github.com/cncf/toc/pull/338)

Justin Cormack is the TOC sponsor for this project; he has performed due diligence (https://docs.google.com/document/d/1tb3mK5cJmaQLO8xR__9NaH2GMrdn3WPjAZFBJYsXrxY/edit) and called for public comment. (https://lists.cncf.io/g/cncf-toc/message/5317)

Please vote (+1/0/-1) by replying to this thread.

Remember that only the TOC has binding votes, but we do appreciate non-binding votes from the community as a sign of support!

--
Amye Scavarda Perrin | Program Manager | amye@...




I S A A C  M O S Q U E R A
Chief Technology Officer
p: 703.795.5322


Apologies

Liz Rice
 

I'm sorry folks, I have a conflict coming up on Tuesday 1st, and I am going to have to skip the TOC meeting

Liz


[RESULT] etcd for graduation

Amye Scavarda Perrin
 

The etcd project has been approved for graduation. (https://lists.cncf.io/g/cncf-toc/message/5452)
+1 Binding 
9/10
Matt Klein: https://lists.cncf.io/g/cncf-toc/message/5453
Brendan Burns: https://lists.cncf.io/g/cncf-toc/message/5461
Saad Ali: https://lists.cncf.io/g/cncf-toc/message/5473
Sheng Liang: https://lists.cncf.io/g/cncf-toc/message/5474
Xiang Li: https://lists.cncf.io/g/cncf-toc/message/5477
Alena Prokharchyk: https://lists.cncf.io/g/cncf-toc/message/5478
Dave Zolotusky: https://lists.cncf.io/g/cncf-toc/message/5482  
Justin Cormack: https://lists.cncf.io/g/cncf-toc/message/5495
Liz Rice: https://lists.cncf.io/g/cncf-toc/message/5496    

+1 NB
Lee Calcote: https://lists.cncf.io/g/cncf-toc/message/5454
Bartłomiej Płotka: https://lists.cncf.io/g/cncf-toc/message/5455
John Hillegass: https://lists.cncf.io/g/cncf-toc/message/5456
Tim St. Clair: https://lists.cncf.io/g/cncf-toc/message/5457
Barak Stout: https://lists.cncf.io/g/cncf-toc/message/5458
Kevin Ryan: https://lists.cncf.io/g/cncf-toc/message/5459
Bhaarat Sharma: https://lists.cncf.io/g/cncf-toc/message/5460
Yin Ding: https://lists.cncf.io/g/cncf-toc/message/5462
Ken Owens: https://lists.cncf.io/g/cncf-toc/message/5463
Archy K: https://lists.cncf.io/g/cncf-toc/message/5464
Ken Sipe: https://lists.cncf.io/g/cncf-toc/message/5465
Ricardo Aravena: https://lists.cncf.io/g/cncf-toc/message/5466
Andrew Aitken: https://lists.cncf.io/g/cncf-toc/message/5467
Katie Gamanji: https://lists.cncf.io/g/cncf-toc/message/5468
Ido Samuelson: https://lists.cncf.io/g/cncf-toc/message/5469
Frederick Kautz: https://lists.cncf.io/g/cncf-toc/message/5470
Oleg Chornyi: https://lists.cncf.io/g/cncf-toc/message/5471
Keith Burdis: https://lists.cncf.io/g/cncf-toc/message/5472
alexis richardson: https://lists.cncf.io/g/cncf-toc/message/5475
Suresh Krishnan: https://lists.cncf.io/g/cncf-toc/message/5476
Kiran Mova: https://lists.cncf.io/g/cncf-toc/message/5479
Alois Reitbauer: https://lists.cncf.io/g/cncf-toc/message/5480
Romaric Philogène: https://lists.cncf.io/g/cncf-toc/message/5481
Bob Wise: https://lists.cncf.io/g/cncf-toc/message/5483
Stephen Augustus: https://lists.cncf.io/g/cncf-toc/message/5484
Golfen Guo: https://lists.cncf.io/g/cncf-toc/message/5485
Benjamin Texier: https://lists.cncf.io/g/cncf-toc/message/5486
Richard Hartmann: https://lists.cncf.io/g/cncf-toc/message/5487
Philippe Robin: https://lists.cncf.io/g/cncf-toc/message/5488
Xu Wang: https://lists.cncf.io/g/cncf-toc/message/5489
Alex Chircop: https://lists.cncf.io/g/cncf-toc/message/5490
Thomas Schuetz: https://lists.cncf.io/g/cncf-toc/message/5491
Isaac Mosquera: https://lists.cncf.io/g/cncf-toc/message/5492
Sunny Raskar: https://lists.cncf.io/g/cncf-toc/message/5493
Tzury Bar Yochay: https://lists.cncf.io/g/cncf-toc/message/5494
Robert Wilkins III: https://lists.cncf.io/g/cncf-toc/message/5497

--
Amye Scavarda Perrin | Program Manager | amye@...


Re: [cncf-sig-security] [cncf-toc] Vulnerability scanning for CNCF projects

Vinay Venkataraghavan <vvenkatara@...>
 

Hello everyone, 
Just catching up on the thread and a little late to the discussion. I'm in total agreement with the points already brought up that we should:
  • Have some policies and guidance around visibility into the vulnerabilities in container images.
  • Establish broad guard rails that call out certain gates that have to pass (within reason).
  • Stick to a process of visibility and continuous improvement to ensure that, over time, incubating and graduated projects are improving their security posture.
I love the dashboard of the LFX security tool/project. It would be great to see how we could either incorporate or expand it for other security use cases for projects going through graduation.
Thanks,
- Vinay 




On Thu, Nov 19, 2020 at 10:01 AM Emily Fox <themoxiefoxatwork@...> wrote:
Same!   We would love a presentation!  Shubhra, please add yourself to the agenda for an upcoming meeting in December.

The Security SIG meets every Wednesday at 10:00am PT (US Pacific)

Meeting minutes and agenda: https://docs.google.com/document/d/170y5biX9k95hYRwprITprG6Mc9xD5glVn-4mB2Jmi2g/

- Emily Fox

@TheMoxieFox (personal handle)

On Thu, 19 Nov 2020, 12:26 Justin Cormack via lists.cncf.io, <justin.cormack=docker.com@...> wrote:
I would be interested in that. 

Justin


On Thu, 19 Nov 2020 at 17:23, Shubhra Kar <skar@...> wrote:
If this group is interested, my team would love to present the capabilities and limitations alike of the LFX security tool project. We are working on items like SBOM policy management, and adding support for scanning build systems and container images next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively, relatively quickly:

1. The tool provides the capability to turn dev dependencies on/off; we need the group to identify whether we should do that, and which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure there is consistent manifest creation.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration for scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, this shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz



--
Technical Director, Office of the CTO
Prisma Cloud | Palo Alto Networks


FYI: CNCF TOC Panel at KubeCon NA 2020

Chris Aniszczyk
 

Hey all, we are doing our traditional TOC panel at KubeCon today: https://docs.google.com/presentation/d/1Rsj1rLwTPDcltGEQLHTDDefeSyn6O9ZB4htruSZFHr8/edit?usp=sharing

Feel free to join us at the tail end of KubeCon, and if you have any questions, please let us know here on the list or in the #toc channel in Slack.

The big thing to bring up on my end is that we will be having TOC elections in January 2021 for a handful of seats.

--
Chris Aniszczyk (@cra)


Re: FYI: New Training Course on Diversity in Open Source

Chris Aniszczyk
 

Thanks, this is a discussion point for the TOC, but I think the reality will be a rollout in 2021 at some level.

Also, thank you Arun for pushing me to get this done in time for KubeCon :)

Please continue taking the course and offering feedback; we plan on reserving budget every 1-2 years to do a refresh.

On Fri, Nov 20, 2020 at 1:45 AM Luke Hinds <lhinds@...> wrote:
Should we make it mandatory (which I think is a good idea), it would be useful to cross-reference Kubernetes, which took the same approach for all leaders (I did the course as part of product security). That way we won't be nagging people to do training they have already completed.

On Fri, Nov 20, 2020 at 1:55 AM Arun Gupta <arun.gupta@...> wrote:
Chris,

I just completed the course and it's extremely valuable. As already mentioned, and noted for the next TOC agenda, this should be a must for all leadership positions in CNCF.

It took me over an hour to listen to all the audio transcripts and take the quiz, but it was a great use of time. Here is the direct link: https://trainingportal.linuxfoundation.org/learn/course/inclusive-open-source-community-orientation-lfc102/

Thanks,
Arun

On Nov 13, 2020, at 7:30 AM, Chris Aniszczyk <caniszczyk@...> wrote:

As a follow-up from previous discussions on D&I training, we at The Linux Foundation, in partnership with NCWIT, are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement, or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!

--
Chris Aniszczyk (@cra)



--
Chris Aniszczyk (@cra)


Re: FYI: New Training Course on Diversity in Open Source

Luke A Hinds
 

Should we make it mandatory (which I think is a good idea), it would be useful to cross-reference Kubernetes, which took the same approach for all leaders (I did the course as part of product security). That way we won't be nagging people to do training they have already completed.

On Fri, Nov 20, 2020 at 1:55 AM Arun Gupta <arun.gupta@...> wrote:
Chris,

I just completed the course and it's extremely valuable. As already mentioned, and noted for the next TOC agenda, this should be a must for all leadership positions in CNCF.

It took me over an hour to listen to all the audio transcripts and take the quiz, but it was a great use of time. Here is the direct link: https://trainingportal.linuxfoundation.org/learn/course/inclusive-open-source-community-orientation-lfc102/

Thanks,
Arun

On Nov 13, 2020, at 7:30 AM, Chris Aniszczyk <caniszczyk@...> wrote:

As a follow-up from previous discussions on D&I training, we at The Linux Foundation, in partnership with NCWIT, are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement, or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!

--
Chris Aniszczyk (@cra)


Re: FYI: New Training Course on Diversity in Open Source

Arun Gupta
 

Chris,

I just completed the course and it's extremely valuable. As already mentioned, and noted for the next TOC agenda, this should be a must for all leadership positions in CNCF.

It took me over an hour to listen to all the audio transcripts and take the quiz, but it was a great use of time. Here is the direct link: https://trainingportal.linuxfoundation.org/learn/course/inclusive-open-source-community-orientation-lfc102/

Thanks,
Arun

On Nov 13, 2020, at 7:30 AM, Chris Aniszczyk <caniszczyk@...> wrote:

As a follow-up from previous discussions on D&I training, we at The Linux Foundation, in partnership with NCWIT, are launching a new course on building inclusive open source communities that CNCF helped fund: https://training.linuxfoundation.org/announcements/linux-foundation-and-ncwit-release-free-training-course-on-diversity-in-open-source/

We should consider making this a graduation requirement, or even part of project acceptance; food for thought as we ramp up for KubeCon + CloudNativeCon next week!

--
Chris Aniszczyk (@cra)


Re: [cncf-sig-security] [cncf-toc] Vulnerability scanning for CNCF projects

Emily Fox
 

Same!   We would love a presentation!  Shubhra, please add yourself to the agenda for an upcoming meeting in December.

The Security SIG meets every Wednesday at 10:00am PT (US Pacific)

Meeting minutes and agenda: https://docs.google.com/document/d/170y5biX9k95hYRwprITprG6Mc9xD5glVn-4mB2Jmi2g/

- Emily Fox

@TheMoxieFox (personal handle)


On Thu, 19 Nov 2020, 12:26 Justin Cormack via lists.cncf.io, <justin.cormack=docker.com@...> wrote:
I would be interested in that. 

Justin


On Thu, 19 Nov 2020 at 17:23, Shubhra Kar <skar@...> wrote:
If this group is interested, my team would love to present the capabilities and limitations alike of the LFX security tool project. We are working on items like SBOM policy management, and adding support for scanning build systems and container images next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively, relatively quickly:

1. The tool provides the capability to turn dev dependencies on/off; we need the group to identify whether we should do that, and which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure there is consistent manifest creation.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration for scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, this shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Re: Vulnerability scanning for CNCF projects

Luke A Hinds
 

Add me as well.

I am one of the maintainers of Bandit (a Python AST-based security linter), which sees around 25k downloads a day, so I have a fair amount of experience with what works and what does not work well with security linters. As others have mentioned, false positives always happen, so you need a developer UX that does not turn the linter into something that gets yelled at all the time.

On Thu, Nov 19, 2020 at 5:49 PM Dave Zolotusky via lists.cncf.io <dzolo=spotify.com@...> wrote:
Same, I'd be interested.

~Dave

On Thu, Nov 19, 2020 at 6:26 PM Justin Cormack via lists.cncf.io <justin.cormack=docker.com@...> wrote:
I would be interested in that. 

Justin


On Thu, 19 Nov 2020 at 17:23, Shubhra Kar <skar@...> wrote:
If this group is interested, my team would love to present the capabilities and limitations alike of the LFX security tool project. We are working on items like SBOM policy management, and adding support for scanning build systems and container images next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively, relatively quickly:

1. The tool provides the capability to turn dev dependencies on/off; we need the group to identify whether we should do that, and which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure there is consistent manifest creation.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration for scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, this shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz



--
~Dave


Re: Vulnerability scanning for CNCF projects

Dave Zolotusky
 

Same, I'd be interested.

~Dave

On Thu, Nov 19, 2020 at 6:26 PM Justin Cormack via lists.cncf.io <justin.cormack=docker.com@...> wrote:
I would be interested in that. 

Justin


On Thu, 19 Nov 2020 at 17:23, Shubhra Kar <skar@...> wrote:
If this group is interested, my team would love to present the capabilities and limitations alike of the LFX security tool project. We are working on items like SBOM policy management, and adding support for scanning build systems and container images next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively, relatively quickly:

1. The tool provides the capability to turn dev dependencies on/off; we need the group to identify whether we should do that, and which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure there is consistent manifest creation.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration for scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, this shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz



--
~Dave


Re: Vulnerability scanning for CNCF projects

Justin Cormack
 

I would be interested in that. 

Justin


On Thu, 19 Nov 2020 at 17:23, Shubhra Kar <skar@...> wrote:
If this group is interested, my team would love to present the capabilities and limitations alike of the LFX security tool project. We are working on items like SBOM policy management, and adding support for scanning build systems and container images next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively, relatively quickly:

1. The tool provides the capability to turn dev dependencies on/off; we need the group to identify whether we should do that, and which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure there is consistent manifest creation.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration for scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, this shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Re: Vulnerability scanning for CNCF projects

Shubhra Kar
 

If this group is interested, my team would love to present the capabilities and limitations alike of the LFX security tool project. We are working on items like SBOM policy management, and adding support for scanning build systems and container images next. Secrets management and static analysis are longer-term roadmap items.

Top challenges we need to solve collectively, relatively quickly:

1. The tool provides the capability to turn dev dependencies on/off; we need the group to identify whether we should do that, and which dev dependencies in particular. Project maintainers are probably best equipped to determine this list.
2. A project is usually spread over multiple org and repo combinations. Some repos don't have a manifest file, which LFX needs in order to scan. A best practice would be to ensure there is consistent manifest creation.


Kind Regards,

Shubhra Kar
CTO and GM of Products and IT
tweet: @shubhrakar



On Wed, Nov 18, 2020 at 8:41 AM Liz Rice <liz@...> wrote:
Hi TOC and SIG Security folks 

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration for scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerability issues (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be to do with scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, this shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at graduation or incubation levels).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

alexis richardson
 

+1


On Thu, Nov 19, 2020 at 10:22 AM Gadi Naor via lists.cncf.io <gadi=alcide.io@...> wrote:
This is a great initiative that also sends a message that security is part of the core functionality.

A few suggestions:
  1. If we can ensure CNCF projects follow container image authoring best practices, such as building images from scratch or from distroless base images, it will eliminate a lot of the noise static scanners generate.
  2. For projects that are designed to run on k8s, the deployment assets (k8s manifests, Helm charts, kustomized resources) should be scanned for security best practices with tools such as conftest, Kyverno, commercial offerings, or a combination, to verify that components do not run as privileged, do not run in host namespaces, have network policies, etc.
  3. In cases where exceptions must be made, there should be a clear process and an audited policy/config for them - e.g. CVEs that cannot get fixed, components that need certain escalated privileges to function, etc.
Gadi

On Thu, Nov 19, 2020 at 11:59 AM Gareth Rushgrove <gareth@...> wrote:
On Wed, 18 Nov 2020 at 16:54, Emily Fox <themoxiefoxatwork@...> wrote:
>
> Liz,
>   Love this.  As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices.  In the last few assessments we've been pushing more for this, as well as for responsible disclosure instructions and general security-mindedness for project sustainment.  This aligns with those efforts.  We currently have the assessment process undergoing some updates (on hold for KubeCon), which makes this a great time to potentially include this.  I personally would like to see license dependencies and dependency trees to help push forward in the area of SBOM.
>   I think we should be clear, however, on what our thresholds and terms are in this area; offhand I can think of the following potentials:
> * Listing of vulns in deliverable artifacts
> * Listing of licensing dependencies
> * SBOM
> * Vulnerability threshold and prioritizing resolution prior to artifact delivery
> * Vulnerability threshold and prioritizing resolution post artifact delivery
>
> Definitely worth a conversation and follow-ups.  Are there any must-haves for you among the above, or anything I missed or misunderstood?
>

I'd be happy to join and help here.

HUGE DISCLAIMER. I work at Snyk, which is the service powering the
scans. I'm also a maintainer of Conftest as part of the Open Policy
Agent project and know a bunch of folks on here. I'm not trying to
sell you anything, other nice vendors exist, etc. I just happen to
have opinions and experience here.

> The current numbers for a lot of our projects look really quite bad

This is nearly always the case when projects or companies first look at
vulnerabilities. It's indicative of the problem domain more so than of
projects doing the wrong thing. Fixing starts with visibility.

>  reviewing such a massive amount of data for project owners might take way too much time

The main thing to do is break the problem down. Luckily there are a
few things you can do here.

* As you note, starting with non-test dependencies is a good idea
* Then start with the most severe issues and those which can be fixed, and
repeat. Standards like CVSS exist, as well as more involved
vendor-specific mechanisms. CVSS is at least simple to read on the
surface (Low 0.1-3.9, Medium 4.0-6.9, High 7.0-8.9, Critical 9.0-10.0)
* Each time you clear a new threshold, put checks in CI to help
enforce things in the future

For instance:

* Start with Critical (CVSS 9+), non-test issues that have a fix available
* Add a CI check to break the build for CVSS 9+, non-test, fixable issues
* Do the same for 8+ non-test
* Do the same for 9+ test
...

etc.

In this way, what seems like an impossibly large piece of work gets broken
down and you get value quickly. You can absolutely do this at your own
pace. I wouldn't advocate for the CNCF to set deadlines, though guidelines
and reporting for graduated projects might be useful.

Separately, you likely want to have some level of triage for
vulnerabilities that don't have fixes available yet. The above
approach is somewhat mechanical; triage needs more context and
security experience. I'd at least recommend having maintainers triage
Critical severity issues in dependencies. Assuming that's rare, you
can extend this as far as you like and have time for (to High, or
Medium, or a specific CVSS threshold).

> false positives from things like dependencies only used in test

I wouldn't think of test vulnerabilities as false positives, just
potentially a different type of vulnerability. As one example,
compromised test dependencies have the potential to steal build
credentials, and suddenly someone is shipping a compromised version of
software to end users via your release toolchain.


I'm sure the above is obvious to some, but I thought it was worth
laying out. It should also be pretty tool agnostic.
As mentioned, happy to join conversations if folks are discussing.

Gareth


> ~Emily Fox
> @TheMoxieFox
>
>
> On Wed, Nov 18, 2020 at 11:41 AM Liz Rice <liz@...> wrote:
>>
>> Hi TOC and SIG Security folks
>>
>> On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.
>>
>> We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerabilities (at least the serious ones)?
>>
>> This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be due to scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.
>>
>> As well as vulnerability scanning, the tooling shows license dependencies, which could be very useful.
>>
>> For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at the graduation or incubation level).
>>
>> Copying Shubhra in case he would like to comment further.
>>
>> Enjoy KubeCon!
>> Liz
>
>



--
Gareth Rushgrove
@garethr

garethr.dev
devopsweekly.com







--
Gadi Naor, CTO & Security Plumber
 

US.   2443 Fillmore St, San Francisco, CA, 94115
IL.    5 Miconis St, Tel Aviv, 6777214   
M. +972-52-6618811
Web.      www.alcide.io
GitHub. github.com/alcideio

Follow us on LinkedIn | Follow us on Twitter

Complete Kubernetes & Service Mesh Security. 
Bridging Security & DevOps.



Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

Gadi Naor
 

This is a great initiative that also sends a message that security is part of the core functionality. 

A few suggestions:
  1. If we can ensure CNCF projects follow container image authoring best practices, such as building images from scratch or on distroless base images, it will eliminate a lot of the noise static scanners generate.
  2. For projects that are designed to run on k8s, the deployment assets (k8s manifests, Helm charts, kustomized resources) should be scanned for manifest security best practices with tools such as conftest, Kyverno, commercial offerings, or a combination, to verify that components do not run as privileged, do not run in host namespaces, have network policies, etc.
  3. In cases where exceptions must be made, there should be a clear process and an audited policy/config for that - e.g. CVEs that cannot be fixed, components that need certain escalated privileges to function, etc.
Gadi
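A minimal sketch of the kind of manifest checks I mean in (2), in plain Python rather than a policy language such as Rego or a Kyverno policy - the field names follow the Kubernetes pod spec, but the particular check list is illustrative, not exhaustive:

```python
# Illustrative deployment-manifest checks in the spirit of conftest/Kyverno
# policies. Input is a parsed pod spec (a dict, e.g. from a YAML loader).

def check_pod_spec(spec):
    """Return a list of policy violations for a parsed pod spec."""
    violations = []
    # Host namespace checks: pods should not share node namespaces.
    if spec.get("hostNetwork") or spec.get("hostPID") or spec.get("hostIPC"):
        violations.append("pod runs in a host namespace")
    for container in spec.get("containers", []):
        sc = container.get("securityContext", {})
        # Privileged containers get near-root access to the node.
        if sc.get("privileged"):
            violations.append(f"container {container['name']!r} runs privileged")
        # allowPrivilegeEscalation defaults to true, so require it explicitly off.
        if sc.get("allowPrivilegeEscalation", True):
            violations.append(
                f"container {container['name']!r} allows privilege escalation")
    return violations

pod = {
    "hostNetwork": True,
    "containers": [
        {"name": "app", "securityContext": {"privileged": True,
                                            "allowPrivilegeEscalation": False}},
    ],
}
print(check_pod_spec(pod))
```

In practice you would run a check like this (or the equivalent conftest/Kyverno policy) in CI against every chart and manifest a project ships.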


On Thu, Nov 19, 2020 at 11:59 AM Gareth Rushgrove <gareth@...> wrote:
On Wed, 18 Nov 2020 at 16:54, Emily Fox <themoxiefoxatwork@...> wrote:
>
> Liz,
>   Love this.  As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices.  In the last few assessments we've been pushing more for this, as well as for responsible disclosure instructions and general security-mindedness for project sustainment.  This aligns with those efforts.  We currently have the assessment process undergoing some updates (on hold for KubeCon), which makes this a great time to potentially include this.  I personally would like to see license dependencies and dependency trees to help push forward in the area of SBOM.
>   I think we should be clear, however, on what our thresholds and terms are in this area; offhand I can think of the following potentials:
> * Listing of vulns in deliverable artifacts
> * Listing of licensing dependencies
> * SBOM
> * Vulnerability threshold and prioritizing resolution prior to artifact delivery
> * Vulnerability threshold and prioritizing resolution post artifact delivery
>
> Definitely worth a conversation and follow-ups.  Are there any must-haves for you among the above, or anything I missed or misunderstood?
>

I'd be happy to join and help here.

HUGE DISCLAIMER. I work at Snyk, which is the service powering the
scans. I'm also a maintainer of Conftest as part of the Open Policy
Agent project and know a bunch of folks on here. I'm not trying to
sell you anything, other nice vendors exist, etc. I just happen to
have opinions and experience here.

> The current numbers for a lot of our projects look really quite bad

This is nearly always the case when projects or companies first look at
vulnerabilities. It's indicative of the problem domain more so than of
projects doing the wrong thing. Fixing starts with visibility.

>  reviewing such a massive amount of data for project owners might take way too much time

The main thing to do is break the problem down. Luckily there are a
few things you can do here.

* As you note, starting with non-test dependencies is a good idea
* Then start with the most severe issues and those which can be fixed, and
repeat. Standards like CVSS exist, as well as more involved
vendor-specific mechanisms. CVSS is at least simple to read on the
surface (Low 0.1-3.9, Medium 4.0-6.9, High 7.0-8.9, Critical 9.0-10.0)
* Each time you clear a new threshold, put checks in CI to help
enforce things in the future

For instance:

* Start with Critical (CVSS 9+), non-test issues that have a fix available
* Add a CI check to break the build for CVSS 9+, non-test, fixable issues
* Do the same for 8+ non-test
* Do the same for 9+ test
...

etc.

In this way, what seems like an impossibly large piece of work gets broken
down and you get value quickly. You can absolutely do this at your own
pace. I wouldn't advocate for the CNCF to set deadlines, though guidelines
and reporting for graduated projects might be useful.

Separately, you likely want to have some level of triage for
vulnerabilities that don't have fixes available yet. The above
approach is somewhat mechanical; triage needs more context and
security experience. I'd at least recommend having maintainers triage
Critical severity issues in dependencies. Assuming that's rare, you
can extend this as far as you like and have time for (to High, or
Medium, or a specific CVSS threshold).

> false positives from things like dependencies only used in test

I wouldn't think of test vulnerabilities as false positives, just
potentially a different type of vulnerability. As one example,
compromised test dependencies have the potential to steal build
credentials, and suddenly someone is shipping a compromised version of
software to end users via your release toolchain.


I'm sure the above is obvious to some, but I thought it was worth
laying out. It should also be pretty tool agnostic.
As mentioned, happy to join conversations if folks are discussing.

Gareth


> ~Emily Fox
> @TheMoxieFox
>
>
> On Wed, Nov 18, 2020 at 11:41 AM Liz Rice <liz@...> wrote:
>>
>> Hi TOC and SIG Security folks
>>
>> On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.
>>
>> We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerabilities (at least the serious ones)?
>>
>> This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be due to scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.
>>
>> As well as vulnerability scanning, the tooling shows license dependencies, which could be very useful.
>>
>> For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at the graduation or incubation level).
>>
>> Copying Shubhra in case he would like to comment further.
>>
>> Enjoy KubeCon!
>> Liz
>
>



--
Gareth Rushgrove
@garethr

garethr.dev
devopsweekly.com







--
Gadi Naor, CTO & Security Plumber
 

US.   2443 Fillmore St, San Francisco, CA, 94115
IL.    5 Miconis St, Tel Aviv, 6777214   
M. +972-52-6618811
Web.      www.alcide.io
GitHub. github.com/alcideio

Follow us on LinkedIn | Follow us on Twitter

Complete Kubernetes & Service Mesh Security. 
Bridging Security & DevOps.



Re: [cncf-sig-security] Vulnerability scanning for CNCF projects

Gareth Rushgrove
 

On Wed, 18 Nov 2020 at 16:54, Emily Fox <themoxiefoxatwork@gmail.com> wrote:

Liz,
Love this. As part of the assessments SIG-Security performs, we've begun highlighting the importance of secure development practices. In the last few assessments we've been pushing more for this, as well as for responsible disclosure instructions and general security-mindedness for project sustainment. This aligns with those efforts. We currently have the assessment process undergoing some updates (on hold for KubeCon), which makes this a great time to potentially include this. I personally would like to see license dependencies and dependency trees to help push forward in the area of SBOM.
I think we should be clear, however, on what our thresholds and terms are in this area; offhand I can think of the following potentials:
* Listing of vulns in deliverable artifacts
* Listing of licensing dependencies
* SBOM
* Vulnerability threshold and prioritizing resolution prior to artifact delivery
* Vulnerability threshold and prioritizing resolution post artifact delivery

Definitely worth a conversation and follow-ups. Are there any must-haves for you among the above, or anything I missed or misunderstood?
I'd be happy to join and help here.

HUGE DISCLAIMER. I work at Snyk, which is the service powering the
scans. I'm also a maintainer of Conftest as part of the Open Policy
Agent project and know a bunch of folks on here. I'm not trying to
sell you anything, other nice vendors exist, etc. I just happen to
have opinions and experience here.

> The current numbers for a lot of our projects look really quite bad

This is nearly always the case when projects or companies first look at
vulnerabilities. It's indicative of the problem domain more so than of
projects doing the wrong thing. Fixing starts with visibility.

> reviewing such a massive amount of data for project owners might take way too much time

The main thing to do is break the problem down. Luckily there are a
few things you can do here.

* As you note, starting with non-test dependencies is a good idea
* Then start with the most severe issues and those which can be fixed, and
repeat. Standards like CVSS exist, as well as more involved
vendor-specific mechanisms. CVSS is at least simple to read on the
surface (Low 0.1-3.9, Medium 4.0-6.9, High 7.0-8.9, Critical 9.0-10.0)
* Each time you clear a new threshold, put checks in CI to help
enforce things in the future

For instance:

* Start with Critical (CVSS 9+), non-test issues that have a fix available
* Add a CI check to break the build for CVSS 9+, non-test, fixable issues
* Do the same for 8+ non-test
* Do the same for 9+ test
...

etc.

In this way, what seems like an impossibly large piece of work gets broken
down and you get value quickly. You can absolutely do this at your own
pace. I wouldn't advocate for the CNCF to set deadlines, though guidelines
and reporting for graduated projects might be useful.
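The ratcheting gate above can be sketched in a few lines. The findings format here is made up for illustration (a list of dicts with `cvss`, `scope`, and `fixable` fields), not any particular scanner's output - each real scanner has its own JSON that you would map into this shape:

```python
# Sketch of a ratcheting CI gate for vulnerability findings.
# Start strict-but-narrow, then lower the threshold and widen the
# scope over time as each level is cleared.

def gate(findings, threshold=9.0, include_test=False, require_fix=True):
    """Return the findings that should break the build at this ratchet level."""
    failing = []
    for f in findings:
        if f["cvss"] < threshold:
            continue  # below the current severity bar
        if f["scope"] == "test" and not include_test:
            continue  # test-only dependencies come later in the ratchet
        if require_fix and not f["fixable"]:
            continue  # unfixable issues go to triage, not the build gate
        failing.append(f)
    return failing

findings = [
    {"id": "CVE-A", "cvss": 9.8, "scope": "runtime", "fixable": True},
    {"id": "CVE-B", "cvss": 9.8, "scope": "test", "fixable": True},
    {"id": "CVE-C", "cvss": 7.5, "scope": "runtime", "fixable": True},
]

if __name__ == "__main__":
    # At the first ratchet level (Critical, non-test, fixable),
    # only CVE-A blocks the build.
    print([f["id"] for f in gate(findings)])
```

Each time the build goes green at one level, tighten a parameter (`threshold=7.0`, then `include_test=True`, and so on) and repeat.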

Separately, you likely want to have some level of triage for
vulnerabilities that don't have fixes available yet. The above
approach is somewhat mechanical; triage needs more context and
security experience. I'd at least recommend having maintainers triage
Critical severity issues in dependencies. Assuming that's rare, you
can extend this as far as you like and have time for (to High, or
Medium, or a specific CVSS threshold).

> false positives from things like dependencies only used in test

I wouldn't think of test vulnerabilities as false positives, just
potentially a different type of vulnerability. As one example,
compromised test dependencies have the potential to steal build
credentials, and suddenly someone is shipping a compromised version of
software to end users via your release toolchain.


I'm sure the above is obvious to some, but I thought it was worth
laying out. It should also be pretty tool agnostic.
As mentioned, happy to join conversations if folks are discussing.

Gareth


~Emily Fox
@TheMoxieFox


On Wed, Nov 18, 2020 at 11:41 AM Liz Rice <liz@lizrice.com> wrote:

Hi TOC and SIG Security folks

On Friday I got a nice preview from Shubhra Kar and his team at the LF about some tools they are building to provide insights and stats for LF (and therefore CNCF) projects. One that's of particular interest is an integration of scanning security issues.

We require graduated projects to have security reviews, and SIG Security are offering additional assessments, but we don't really have any standards around whether project artifacts ship with vulnerabilities. Should we have something in place requiring projects to have a process to fix vulnerabilities (at least the serious ones)?

This tooling is off to a great start. The current numbers for a lot of our projects look really quite bad, but this may be due to scanning all the repos related to a project's org. I'd imagine there are also some false positives from things like dependencies only used in test that don't affect the security of the executables that end users run - we may want to look at just reporting vulnerabilities from a project's deployable artifacts.

As well as vulnerability scanning, the tooling shows license dependencies, which could be very useful.

For discussion: how we want to use this kind of info, and whether we want to formalize requirements on projects (e.g. at the graduation or incubation level).

Copying Shubhra in case he would like to comment further.

Enjoy KubeCon!
Liz


--
Gareth Rushgrove
@garethr

garethr.dev
devopsweekly.com
