Re: Improvement Feedback on KubeCon/CloudNativeCon NA 2018


Yuan Chen <yuan.chen@...>
 

I have similar questions about the review process as Ruben. Are we planning to make any changes? 

The review and selection process is such a key factor in the success of the conference. We had a long discussion about it online a while ago, with a lot of good comments and suggestions. I think someone collected the feedback somewhere. The co-chairs and CNCF staff should probably review it and make a proposal for changes. I hope we can improve the review and selection process next year.

Thanks,

Yuan Chen
Principal Architect, Infrastructure
JD Silicon Valley R&D Center



From: Ruben Orduz
Sent: Tuesday, December 18, 3:57 PM
Subject: Re: [cncf-toc] Improvement Feedback on KubeCon/CloudNativeCon NA 2018
To: Dee Kumar
Cc: alexis richardson, Matt Farina, CNCF TOC, Janet Kuo, bryanliles@..., Liz Rice, Dan Kohn, Chris Aniszczyk


I'm sorry I could not stay in Seattle for this meeting. Were alternative review approaches discussed? Single-blind vs. double-blind review? Changes to the proposal format? Length of fields, etc.?

On Tue, Dec 18, 2018 at 6:47 PM Dee Kumar <dkumar@...> wrote:
Hi Alexis, 

Please note that the KubeCon + CloudNativeCon co-chairs have staggered one-year appointments to improve continuity. As described in https://www.cncf.io/blog/2018/11/16/kubecon-barcelona-2019-call-for-proposals-cfp-is-open/, Bryan Liles is taking over for Liz Rice as co-chair and Janet Kuo continues. On the track chair idea (+1), we are discussing it and will provide some updates early in the new year.

Regards,
Dee

On Tue, Dec 18, 2018 at 3:23 PM alexis richardson <alexis@...> wrote:
+1 for track chairs!  

I'd like to see CNCF appoint a permanent Kubecon liaison who can help the conference chairs achieve continuity.  

WDY(A)T?



On Tue, Dec 18, 2018 at 3:50 PM Matt Farina <matt@...> wrote:
At KubeCon/CloudNativeCon there was a session on improving the conference for the future. This session was born out of conversations on this list, so I wanted to circle back with some of the material from that session to further the conversation and see about getting some of the suggestions implemented.

Before I share the improvements people suggested, I wanted to touch on a possible technical problem. Both Paris Pittman and I found examples of proposed sessions that appeared to disappear. For example, in Paris's case, sessions she proposed were neither accepted nor rejected, and one session was double accepted.

Dan and Chris, can someone look into the technical process and make sure there isn't a place where some sessions could be inadvertently dropped or otherwise messed up?

I also want to thank the track chairs. It is an often thankless job. In the session there were call-outs to things people liked, and I, for one, appreciated hearing those. Nothing I intend to write is meant as criticism. Rather, it's to share suggestions many people had for continuously improving a changing conference.

Some things people liked:
- Keynotes with a rounded room
- Daycare for kids
- Many women giving keynotes
Some problems that could use more solution suggestions:
- Room changes and sched updating after sessions had begun; some speakers were late because of this
- Uniformly, sessions at the TCC were under-attended; if we use the venue again we should re-think the layout
- Finding room locations in sched can be difficult
- Good talks were from end users; how can we get more of these?
- Some reviewers were a bit overwhelmed (e.g., someone reviewed ~120 submissions)
- The SurveyMonkey review application isn't great, and reviewers would like something better
Here are some of the suggestions from the session:
- Announce sponsored keynotes as sponsored
- Speaker training to help speakers improve their sessions (especially maintainers who give intros/deep dives)
- Use feedback from previous conferences to inform session selection at future conferences
- Collect videos of previous speaking when submitting proposals
- Keep things on the same topic in the same room (tracks have a room)
- Have a dry run of the talks before the day of
- Match seasoned speakers with newer speakers to help work on sessions
- Track chairs - this came up several times from different people
- Capture data on why sessions were popular (is it speakers, topic, something else?)
- Tracks with diversity (e.g., one of the tracks had 4 of 6 talks on the same project). Some conferences limit the number of project talks in a single track. Couple this with shared room/day and YouTube playlist experiences
- Example from an education conference: a 90-minute talk is 80 minutes, with the final 10 minutes for attendees to submit reviews. This captures feedback in the moment from many, rather than negative feedback after the fact
- Example from academic medical conferences: you have to turn in feedback to continue accreditation
- Local conferences (like DevOps Days, WordCamp, DrupalCamp, etc.), which Chris Aniszczyk said are in the works
- Posted ingredient lists for the meals for people with special diets
- Room on badges for things like GitHub handles, because that's how many of us know each other
- Live captioning, at least for the keynotes
Note, I likely missed some. If I did and you remember something please share to fill in the gaps.

There was no way for me to capture the entirety of the session in an email this short. If you're interested in more detail please watch the video.

How can we move forward on some of these suggestions?

Also, please feel free to forward this on or loop others in as needed.

-- 
Matt Farina



--
Dee Kumar
Vice President, Marketing
Cloud Native Computing Foundation
@deesprinter
408 242 3535
