
Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Sarah Conway <sconway@...>
 

Note: Chris A. also shared that there is going to be a post-conference transparency report from CNCF that compiles data we are able to collect and track through our own systems, observations, and reporting.



On Tue, Dec 18, 2018 at 10:50 AM Matt Farina <matt@...> wrote:
At KubeCon/CloudNativeCon there was a session on improving the conference for the future. This session was born out of conversations on this list so I wanted to circle back with some of the material from that session to further the conversation and see about getting some of them implemented.

Before I share the improvements people suggested, I wanted to touch on a possible technical problem. Both Paris Pittman and I found examples of proposed sessions that appeared to disappear. For example, in Paris's case it was sessions she proposed that were neither accepted nor rejected, along with a session that was double accepted.

Dan and Chris, can someone look into the technical process and make sure there isn't a place where some sessions could be inadvertently dropped or otherwise messed up?

I also want to thank the track chairs. It is an often thankless job. In the session there were call-outs to things people liked and I, for one, appreciated hearing those. Nothing I intend to write is meant as criticism. Rather, it's to share suggestions many people had, looking to continuously improve a changing conference.

Some things people liked:
  • Keynotes with a rounded room
  • Daycare for kids
  • Many women giving keynotes
Some problems that could use more solution suggestions:
  • Room changes and Sched updates after sessions had begun; some speakers were late because of this
  • Uniformly, sessions at the TCC were under-attended. If we use the venue again we should rethink the layout
  • Finding room locations in Sched can be difficult
  • Good talks were from end users. How can we get more of these?
  • Some reviewers were a bit overwhelmed (e.g., someone reviewed ~120 submissions)
  • The SurveyMonkey review application isn't great and reviewers would like something better
Here are some of the suggestions from the session:
  • Announce sponsored keynotes as sponsored
  • Speaker training to help speakers improve their sessions (especially maintainers who get intros/deep dives)
  • Use feedback from previous conferences to inform session selection at future conferences
  • Collect videos of previous speaking engagements with submissions
  • Keep things on the same topic in the same room (tracks have a room)
  • Have a dry run of the talks before the day of
  • Match seasoned speakers with newer speakers to help work on sessions
  • Track chairs; this came up several times from different people
  • Capture data on why sessions were popular (is it speakers, topic, something else?)
  • Tracks with diversity (e.g., one of the tracks had 4 of 6 talks on the same project). Some conferences limit the number of project talks in a single track. Couple this with shared room/day and YouTube playlist experiences
  • Example from an education conference: a 90-minute talk is 80 minutes of speaking. The final 10 minutes are for attendees to write reviews. This captures feedback in the moment from many people rather than negative feedback after the fact
  • Example from academic medical conferences: you have to turn in feedback to continue accreditation
  • Local conferences (like DevOps Days, WordCamp, DrupalCamp, etc.), which Chris Aniszczyk said was in the works
  • Posted ingredient lists for the meals for people with special diets
  • Having room on badges for things like GitHub handles because that's how many of us know each other
  • Live captioning, at least for the keynotes
Note: I likely missed some. If I did and you remember something, please share to fill in the gaps.

There was no way for me to capture the entirety of the session in an email this short. If you're interested in more detail please watch the video.

How can we move forward on some of these suggestions?

Also, please feel free to forward this on or loop others in as needed.

-- 
Matt Farina
mattfarina.com





--
Sarah Conway
Vice President of Communications 
The Linux Foundation
(978) 578-5300  Cell
Skype:  sarah.k.conway


Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Matt Farina
 

Let's give attendees (primarily end users) what they're looking for (in several dimensions). To do that, let's understand better what they want by gathering some hard data. Then explicitly feed that back into the design, talk selection, etc. This is in contrast to designing it around what sponsors want, what chairs want, what the talk selectors want, etc., in the absence of the above data.

Q

I really like Quinton's idea. It forces us to pay attention to, and try to meet, the needs of end users. If we don't do that, something else could come along and steal our thunder.


Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Arun Gupta
 

+1 for track chairs

I've been part of several program committees over the last several years, and this one factor really helps build a great agenda. The chairs still oversee the agenda, but they deal with it like a microservices-based application as opposed to one big monolith.

Arun

On Tue, Dec 18, 2018 at 3:23 PM alexis richardson <alexis@...> wrote:
+1 for track chairs!  

I'd like to see CNCF appoint a permanent Kubecon liaison who can help the conference chairs achieve continuity.  

WDY(A)T?









Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Dee Kumar <dkumar@...>
 

Nanci Lancaster (https://www.cncf.io/people/staff/) is nearly a full-time staff member supporting the co-chairs and program committee in their work. She is also backed up by Jillian on the LF content team. I am also happy to volunteer my time to support the co-chairs. 

On Tue, Dec 18, 2018 at 3:59 PM Alexis Richardson <alexis@...> wrote:
As the chairs are volunteering in spare time, I suggest that more support is necessary.



On Tue, 18 Dec 2018, 23:47 Dee Kumar, <dkumar@...> wrote:
Hi Alexis, 

Please note that the KubeCon + CloudNativeCon co-chairs have staggered one-year appointments to improve continuity. As described in https://www.cncf.io/blog/2018/11/16/kubecon-barcelona-2019-call-for-proposals-cfp-is-open/, Bryan Liles is taking over for Liz Rice as co-chair and Janet Kuo continues. On the track chair idea (+1), we are discussing it and will provide some updates early in the new year.

Regards,
Dee






--
Dee Kumar
Vice President, Marketing
Cloud Native Computing Foundation
@deesprinter
408 242 3535





Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Quinton Hoole
 

Thanks Matt

Nice write-up.  For me, the main high-level take-away from the session was this:

Let's give attendees (primarily end users) what they're looking for (in several dimensions). To do that, let's understand better what they want by gathering some hard data. Then explicitly feed that back into the design, talk selection, etc. This is in contrast to designing it around what sponsors want, what chairs want, what the talk selectors want, etc., in the absence of the above data.

Q


From: <cncf-toc@...> on behalf of Matt Farina <matt@...>
Date: Tuesday, December 18, 2018 at 07:50
To: CNCF TOC <cncf-toc@...>, Janet Kuo <chiachenk@...>, "bryanliles@..." <bryanliles@...>, Liz Rice <liz@...>, Dan Kohn <dan@...>, Chris Aniszczyk <caniszczyk@...>
Subject: [cncf-toc] Improvement Feedback on KubeCon/CloudNativeCon NA 2018





Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Sonya Koptyev <sonya@...>
 

+1 on track chairs; especially aligned to the new category structures, they would work really well.

Thanks,
Sonya

Sonya Koptyev | Director of Evangelism | m: +1 425 505 0100



On Dec 18, 2018, 3:59 PM -0800, alexis richardson <alexis@...>, wrote:
As the chairs are volunteering in spare time, I suggest that more support is necessary.





Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

alexis richardson
 

As the chairs are volunteering in their spare time, I suggest that more support is necessary.





Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Ruben Orduz <ruben@...>
 

I'm sorry I could not stay in Seattle for this meeting. Were alternative review approaches discussed? Single-blind or double-blind review? Changes in the proposal format? Length of fields, etc.?



How can we move forward on some of these suggestions?

Also, please feel free to forward this on or loop others in as needed.

-- 
Matt Farina
mattfarina.com





--
Dee Kumar
Vice President, Marketing
Cloud Native Computing Foundation
@deesprinter
408 242 3535


Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Dee Kumar <dkumar@...>
 

Hi Alexis, 

Please note that the KubeCon + CloudNativeCon co-chairs have staggered one-year appointments to improve continuity. As described in https://www.cncf.io/blog/2018/11/16/kubecon-barcelona-2019-call-for-proposals-cfp-is-open/, Bryan Liles is taking over for Liz Rice as co-chair and Janet Kuo continues. On the track chair idea (+1), we are discussing it and will provide some updates early in the new year. 

Regards,
Dee


On Tue, Dec 18, 2018 at 3:23 PM alexis richardson <alexis@...> wrote:
+1 for track chairs!  

I'd like to see CNCF appoint a permanent Kubecon liaison who can help the conference chairs achieve continuity.  

WDY(A)T?



On Tue, Dec 18, 2018 at 3:50 PM Matt Farina <matt@...> wrote:
At KubeCon/CloudNativeCon there was a session on improving the conference for the future. This session was born out of conversations on this list so I wanted to circle back with some of the material from that session to further the conversation and see about getting some of them implemented.

Before I share the improvements people suggested, I wanted to touch on a possible technical problem. Both Paris Pittman and I found examples of proposed sessions that appeared to disappear. For example, in Paris's case it was sessions she proposed that were neither accepted nor rejected, along with a session that was accepted twice.

Dan and Chris, can someone look into the technical process and make sure there isn't a place where some sessions could be inadvertently dropped or otherwise messed up?

I also want to thank the track chairs. It is an often thankless job. In the session there were call-outs to things people liked and I, for one, appreciated hearing those. Nothing I intend to write is meant as criticism. Rather, it's to share suggestions many people had for continuously improving a changing conference.

Some things people liked:
  • Keynotes with a rounded room
  • Daycare for kids
  • Many women giving keynotes
Some problems that could use more solution suggestions:
  • Room changes and sched updating after sessions had begun. Some speakers were late because of this
  • Uniformly, sessions at the TCC were under-attended. If we use the venue again we should re-think layout
  • Finding room locations in sched can be difficult
  • Good talks were from end users. How can we get more of these?
  • Some reviewers were a bit overwhelmed (e.g., someone reviewed ~120 submissions)
  • The SurveyMonkey review application isn't great and reviewers would like something better
Here are some of the suggestions from the session:
  • Announce sponsored keynotes as sponsored
  • Speaker training to help speakers improve their sessions (especially maintainers who get intros/deep dives)
  • Use feedback from previous conferences to inform session selection at future conferences
  • Collect videos of previous talks when submitting proposals
  • Keep things on the same topic in the same room (tracks have a room)
  • Have a dry run of the talks before the day of
  • Match seasoned speakers with newer speakers to help work on sessions
  • Track chairs - this came up several times by different people
  • Capture data on why sessions were popular (is it the speakers, the topic, something else?)
  • Tracks with diversity (e.g., one of the tracks had 4 of 6 talks on the same project). Some conferences limit the number of project talks in a single track. Couple this with shared room/day and YouTube playlist experiences
  • Example from an education conference: a 90-minute talk runs 80 minutes, with the final 10 minutes spent on reviews. This captures feedback in the moment from many attendees rather than negative feedback after the fact
  • Example from academic medical conferences: you have to turn in feedback to continue accreditation
  • Local conferences (like DevOps Days, WordCamp, DrupalCamp, etc.), which Chris Aniszczyk said are in the works
  • Posted ingredients lists for the meals for people with special diets
  • Having room on badges for things like GitHub handles because that's how many of us know each other
  • Live captioning, at least for the keynotes
Note, I likely missed some. If I did and you remember something please share to fill in the gaps.

There was no way for me to capture the entirety of the session in an email this short. If you're interested in more detail please watch the video.

How can we move forward on some of these suggestions?

Also, please feel free to forward this on or loop others in as needed.

-- 
Matt Farina
mattfarina.com





--
Dee Kumar
Vice President, Marketing
Cloud Native Computing Foundation
@deesprinter
408 242 3535


Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018

alexis richardson
 

+1 for track chairs!  

I'd like to see CNCF appoint a permanent Kubecon liaison who can help the conference chairs achieve continuity.  

WDY(A)T?



On Tue, Dec 18, 2018 at 3:50 PM Matt Farina <matt@...> wrote:
At KubeCon/CloudNativeCon there was a session on improving the conference for the future. This session was born out of conversations on this list so I wanted to circle back with some of the material from that session to further the conversation and see about getting some of them implemented.

Before I share the improvements people suggested, I wanted to touch on a possible technical problem. Both Paris Pittman and I found examples of proposed sessions that appeared to disappear. For example, in Paris's case it was sessions she proposed that were neither accepted nor rejected, along with a session that was accepted twice.

Dan and Chris, can someone look into the technical process and make sure there isn't a place where some sessions could be inadvertently dropped or otherwise messed up?

I also want to thank the track chairs. It is an often thankless job. In the session there were call-outs to things people liked and I, for one, appreciated hearing those. Nothing I intend to write is meant as criticism. Rather, it's to share suggestions many people had for continuously improving a changing conference.

Some things people liked:
  • Keynotes with a rounded room
  • Daycare for kids
  • Many women giving keynotes
Some problems that could use more solution suggestions:
  • Room changes and sched updating after sessions had begun. Some speakers were late because of this
  • Uniformly, sessions at the TCC were under-attended. If we use the venue again we should re-think layout
  • Finding room locations in sched can be difficult
  • Good talks were from end users. How can we get more of these?
  • Some reviewers were a bit overwhelmed (e.g., someone reviewed ~120 submissions)
  • The SurveyMonkey review application isn't great and reviewers would like something better
Here are some of the suggestions from the session:
  • Announce sponsored keynotes as sponsored
  • Speaker training to help speakers improve their sessions (especially maintainers who get intros/deep dives)
  • Use feedback from previous conferences to inform session selection at future conferences
  • Collect videos of previous talks when submitting proposals
  • Keep things on the same topic in the same room (tracks have a room)
  • Have a dry run of the talks before the day of
  • Match seasoned speakers with newer speakers to help work on sessions
  • Track chairs - this came up several times by different people
  • Capture data on why sessions were popular (is it the speakers, the topic, something else?)
  • Tracks with diversity (e.g., one of the tracks had 4 of 6 talks on the same project). Some conferences limit the number of project talks in a single track. Couple this with shared room/day and YouTube playlist experiences
  • Example from an education conference: a 90-minute talk runs 80 minutes, with the final 10 minutes spent on reviews. This captures feedback in the moment from many attendees rather than negative feedback after the fact
  • Example from academic medical conferences: you have to turn in feedback to continue accreditation
  • Local conferences (like DevOps Days, WordCamp, DrupalCamp, etc.), which Chris Aniszczyk said are in the works
  • Posted ingredients lists for the meals for people with special diets
  • Having room on badges for things like GitHub handles because that's how many of us know each other
  • Live captioning, at least for the keynotes
Note, I likely missed some. If I did and you remember something please share to fill in the gaps.

There was no way for me to capture the entirety of the session in an email this short. If you're interested in more detail please watch the video.

How can we move forward on some of these suggestions?

Also, please feel free to forward this on or loop others in as needed.

-- 
Matt Farina
mattfarina.com




Improvement Feedback on KubeCon/CloudNativeCon NA 2018

Matt Farina
 

At KubeCon/CloudNativeCon there was a session on improving the conference for the future. This session was born out of conversations on this list so I wanted to circle back with some of the material from that session to further the conversation and see about getting some of them implemented.

Before I share the improvements people suggested, I wanted to touch on a possible technical problem. Both Paris Pittman and I found examples of proposed sessions that appeared to disappear. For example, in Paris's case it was sessions she proposed that were neither accepted nor rejected, along with a session that was accepted twice.

Dan and Chris, can someone look into the technical process and make sure there isn't a place where some sessions could be inadvertently dropped or otherwise messed up?

I also want to thank the track chairs. It is an often thankless job. In the session there were call-outs to things people liked and I, for one, appreciated hearing those. Nothing I intend to write is meant as criticism. Rather, it's to share suggestions many people had for continuously improving a changing conference.

Some things people liked:
  • Keynotes with a rounded room
  • Daycare for kids
  • Many women giving keynotes
Some problems that could use more solution suggestions:
  • Room changes and sched updating after sessions had begun. Some speakers were late because of this
  • Uniformly, sessions at the TCC were under-attended. If we use the venue again we should re-think layout
  • Finding room locations in sched can be difficult
  • Good talks were from end users. How can we get more of these?
  • Some reviewers were a bit overwhelmed (e.g., someone reviewed ~120 submissions)
  • The SurveyMonkey review application isn't great and reviewers would like something better
Here are some of the suggestions from the session:
  • Announce sponsored keynotes as sponsored
  • Speaker training to help speakers improve their sessions (especially maintainers who get intros/deep dives)
  • Use feedback from previous conferences to inform session selection at future conferences
  • Collect videos of previous talks when submitting proposals
  • Keep things on the same topic in the same room (tracks have a room)
  • Have a dry run of the talks before the day of
  • Match seasoned speakers with newer speakers to help work on sessions
  • Track chairs - this came up several times by different people
  • Capture data on why sessions were popular (is it the speakers, the topic, something else?)
  • Tracks with diversity (e.g., one of the tracks had 4 of 6 talks on the same project). Some conferences limit the number of project talks in a single track. Couple this with shared room/day and YouTube playlist experiences
  • Example from an education conference: a 90-minute talk runs 80 minutes, with the final 10 minutes spent on reviews. This captures feedback in the moment from many attendees rather than negative feedback after the fact
  • Example from academic medical conferences: you have to turn in feedback to continue accreditation
  • Local conferences (like DevOps Days, WordCamp, DrupalCamp, etc.), which Chris Aniszczyk said are in the works
  • Posted ingredients lists for the meals for people with special diets
  • Having room on badges for things like GitHub handles because that's how many of us know each other
  • Live captioning, at least for the keynotes
Note, I likely missed some. If I did and you remember something please share to fill in the gaps.

There was no way for me to capture the entirety of the session in an email this short. If you're interested in more detail please watch the video.

How can we move forward on some of these suggestions?

Also, please feel free to forward this on or loop others in as needed.

-- 
Matt Farina
mattfarina.com




[RESULT] etcd project proposal (incubation)

Chris Aniszczyk
 

Richard Hartmann: https://lists.cncf.io/g/cncf-toc/message/2279
Nils De Moor: https://lists.cncf.io/g/cncf-toc/message/2280
Nick Chase: https://lists.cncf.io/g/cncf-toc/message/2281
Srinivasan Jagannadhan: https://lists.cncf.io/g/cncf-toc/message/2282
Iftach Schonbaum: https://lists.cncf.io/g/cncf-toc/message/2283
Manik Taneja: https://lists.cncf.io/g/cncf-toc/message/2291
Haifeng Liu: https://lists.cncf.io/g/cncf-toc/message/2292
徐翔轩:  https://lists.cncf.io/g/cncf-toc/message/2293

Thanks all for voting, we look forward to cultivating the etcd community!

--
Chris Aniszczyk (@cra) | +1-512-961-6719


Re: RFC: Project Health Dashboard

Chris Aniszczyk
 

* Are you tracking components/individual repos as well?

Yes, each project has its own site; all repositories are tracked and grouped into repository groups. For the "All CNCF" project, each repository group is a single project.

* Are you tracking people (most commits, etc) as well?

Yes, there are specific dashboards for this (see the per-project dashboards).

* Will you expand the UI filters, ordering, etc?

On this particular dashboard, this is the hardest to do. It is not a Grafana dashboard at all; the HTML is programmatically generated. Grafana only has a very basic tabular dashboard.

* There's too much information in default state. How about hiding the 3 month, 3 month, trend, 6 month lines on first view?

We can hide/show anything. We'll add a UI element that shows/hides rows when clicked, but it would have to be implemented as embedded JavaScript code.

* Is the underlying data persisted somewhere?

Yes, it is, as for all other dashboards. It is stored in a Postgres database.
* If yes, is that accessible?

Only for a small team of people who have access to an SSH bastion host, but anyone can run their own version of devstats.

* Is it stored in Prometheus? :)

Nobody ever asked for this :) It was stored in InfluxDB once (a long time ago), but we replaced it with Postgres.

* Time to last commit is relative, time to last release is absolute, this is confusing

There is both the time of the last commit and the days since the last commit; there is only the last release date (no days since the last release). We're still working on highlighting the right metrics for project health.

* Are you tracking sandbox projects as well?

Yes, the header row contains all CNCF projects grouped by Graduated, Incubating, and Sandbox.
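To illustrate the relative vs. absolute distinction raised about the dashboard, here is a minimal hypothetical sketch (not actual devstats code) of a "days since last commit" health metric. A relative metric grows every day without activity, while "last release" on the dashboard is shown as a raw date; mixing the two styles in one table is what readers found confusing.

```python
from datetime import date

def days_since_last_commit(last_commit: date, today: date) -> int:
    # Relative metric: increases by one for each day without a new commit.
    return (today - last_commit).days

# Hypothetical example values for illustration only:
# a project whose last commit landed on 2018-12-10, viewed on 2018-12-18.
print(days_since_last_commit(date(2018, 12, 10), date(2018, 12, 18)))  # → 8
```

A dashboard could render either form; the point of the feedback is that whichever is chosen, it should be consistent across columns.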

On Tue, Dec 4, 2018 at 11:55 AM Richard Hartmann <richih@...> wrote:
Looks like a good collection and basis for more analysis; questions in
no particular order:

* Are you tracking components/individual repos as well?
* Are you tracking people (most commits, etc) as well?
* Will you expand the UI filters, ordering, etc?
  * There's too much information in default state. How about hiding
the 3 month, 3 month, trend, 6 month lines on first view?
* Is the underlying data persisted somewhere?
  * If yes, is that accessible?
  * Is it stored in Prometheus? :)
* Time to last commit is relative, time to last release is absolute,
this is confusing
* Are you tracking sandbox projects as well?


--
Chris Aniszczyk (@cra) | +1-512-961-6719


Re: RFC: Project Health Dashboard

Richard Hartmann
 

Looks like a good collection and basis for more analysis; questions in
no particular order:

* Are you tracking components/individual repos as well?
* Are you tracking people (most commits, etc) as well?
* Will you expand the UI filters, ordering, etc?
* There's too much information in default state. How about hiding
the 3 month, 3 month, trend, 6 month lines on first view?
* Is the underlying data persisted somewhere?
* If yes, is that accessible?
* Is it stored in Prometheus? :)
* Time to last commit is relative, time to last release is absolute,
this is confusing
* Are you tracking sandbox projects as well?


FYI: Cancelling the 12/18/2018 TOC meeting

Chris Aniszczyk
 

The TOC has decided to cancel the 12/18 meeting due to the December holidays. 

We hope everyone has a great holiday and New Year!

--
Chris Aniszczyk (@cra) | +1-512-961-6719


RFC: Project Health Dashboard

Chris Aniszczyk
 

Hey all, we have been experimenting with surfacing project health metrics via DevStats for CNCF projects: https://all.devstats.cncf.io/d/53/projects-health?orgId=1

If there are metrics that are missing that you would like added, let us know.

NOTE: this is still a work in progress!

--
Chris Aniszczyk (@cra) | +1-512-961-6719


Re: ​Kubernetes' first major security hole discovered | ZDNet

Chris Aniszczyk
 

The Kubernetes Steering Committee delegated it to a committee to select a vendor:

On Wed, Dec 5, 2018 at 12:34 AM Quinton Hoole <quinton@...> wrote:
Another reminder how important this stuff is...

https://www.zdnet.com/article/kubernetes-first-major-security-hole-discovered/

What is the status of our independent security audit of Kubernetes?

Do we have any idea whether that might have caught this ahead of time?

Q



--
Chris Aniszczyk (@cra) | +1-512-961-6719


​Kubernetes' first major security hole discovered | ZDNet

Quinton Hoole <quinton@...>
 

Another reminder how important this stuff is...

https://www.zdnet.com/article/kubernetes-first-major-security-hole-discovered/

What is the status of our independent security audit of Kubernetes?

Do we have any idea whether that might have caught this ahead of time?

Q


TOC Agenda 12/4/2018

Chris Aniszczyk
 

Hey all, the agenda for the TOC meeting is below:


We will focus on the categories proposal (https://docs.google.com/document/d/1mt1LH1QJgwA91A6x-DEdjg4ZOXrOnxxc1d2xhq4Hq3I/edit#) and hear from the CoreDNS and Fluentd projects regarding graduation.

Thanks!

--
Chris Aniszczyk (@cra) | +1-512-961-6719


Re: Important - next TOC call agenda and Categories / SIGs

alexis richardson
 

Addendum:

Thanks to Justin Cormack for adding some narrative around an example
SIG for Security. Please see the doc & comment!

On Thu, Nov 29, 2018 at 12:05 PM Alexis Richardson <alexis@...> wrote:

Hi all

REMINDER to please look at and comment on this doc on SIGs

https://docs.google.com/document/d/1mt1LH1QJgwA91A6x-DEdjg4ZOXrOnxxc1d2xhq4Hq3I/edit#

I know we are all ridic busy with Kubecon looming and more. But, if
we can all get this doc into a near final state this year then we can
kick off the SIGs, start to scale, and hopefully unblock stuff. Your
help is invaluable.

If you have any other agenda items for next week, please let us know
and have them added to the living agenda & minutes doc.

a
