We like to be as Agile as possible, with what some might perceive as an extreme emphasis on the Agile value of “Customer collaboration over contract negotiation”. Our customers and partners asked me to write this post in order to share how we express this value in our development process. I was under the impression that what we do is pretty standard, but I've learned that many of our customers find it novel and suggested I share our approach more broadly. It is timely, because if you read it you’ll find out how you can get a sneak preview of all of the “cool new stuff” under development. This changes over time - when I originally wrote this post we were focused on major enhancements to the Weave platform. In this update of early 2019 we're focused on improving the Idea Engine. And over time, our roadmap and focus will continue to evolve to meet customer needs. But the core collaboration processes described herein will remain stable.
I’ll start with co-development and then review our dev/preview/production operations platforms. I'll finish with the impact it has on us and our customers; among other things, this process is way more fun than you might realize, like running downhill. My hope is that you’ll find ideas that will embolden you to collaborate more closely with your customers in the development of your own products and services.
Co-Development

We use the term “co-development” to capture our process of working with customers to develop our platforms. On the qualitative side, the term encompasses design jams, collaborative roadmapping and prioritization, Empathy Interviews and shared Sprint reviews. On the quantitative side, it includes product use data and yes, at times, surveys. Design jams are so special that they deserve their own post, so let’s review the other items.
Simply put, we produce lots of Prune the Product Tree and Buy a Feature forums with our customers. Conteneo makes the bulk of our revenue from our enterprise customers and our partners, so we do weight the feedback from enterprise customers and our partners the highest. We also pay very special attention to feedback from our Certified Collaboration Instructors because these people are often pushing the platform forward in novel ways.
Other segments that have a strong influence on our development roadmap include cities, schools and concerned citizens who are working with the Every Voice Engaged Foundation to implement Participatory Budgeting programs and other forms of civic engagement.
The benefits of this are incalculable: We stay aligned to customer needs, we identify new and often novel uses and features and we prune (remove) ideas and features that just don't make sense. Our customers enjoy this process too, as we’ve watched friendships develop from the shared goal of pursuing new ways of working.
Empathy Interviews

We use Empathy Interviews to thoroughly explore ideas or requirements with customers. For example, we recently had a request from a customer to add both dot voting and liking to our visual collaboration platform. We wanted to make sure we understood their needs, so we held an Empathy Interview. These typically last an hour, and in that time we obtained the clarity we needed to confidently add this item to the backlog.
Shared Sprint Reviews
We tend to develop by focusing on one of our platforms for a “chunk” of work – typically an epic. The development of Weave is a perfect case study: We started with our first generation platform (the Conteneo Collaboration Cloud) and then created a vision for its successor - Weave. Starting from scratch with entirely new User Story Maps and a commitment to upgrade to the latest in single-page responsive design awesomeness, we ended up with a release plan (a "Program Increment," or PI, in SAFe terms) spanning more than eight two-week Sprints.
That’s a long time in a world dominated by companies who brag about releasing tiny enhancements multiple times a day, but it made sense for us: Weave represented significant improvements in the presentation and management of data, the flow of work and the underlying data model. Accordingly, we’d actually make it worse for our customers if we forced them to operate with two different mental models at the same time or forced them to live through multiple data migrations and enhancements.
But of course, waiting until Weave is finished is not a recipe for customer collaboration: Customers and partners need to see the system as it is being developed so that they can provide critically needed feedback that helps keep our team on track.
We solve this problem by inviting our customers to our Sprint reviews. We keep these Sprint reviews light, with an open invitation: Customers can “dip in the stream” of development when it fits their schedule. We’ve found the following agenda for Sprint reviews with customers works well:
Overview: We start with an overview of the current epic or major development theme, including any adjustments to the plan from the last Sprint. For example, in the development of Weave we had to adjust part of our release plan when a key developer became quite ill.
Review Goals: We then review the goals of the development from the last Sprint – “Here is what we were planning to develop – here is how far we got – here is what we’re going to show”. This allows us to convey key insights from the Sprint.
Demo: We then give the demo. We find it helpful to gently refer to the “job” that this feature or capability is doing for our customer or the new superpower it grants our customers. Because we’re making some pretty cool changes to our platform we find that customers have loads of questions, mostly of the form “OK… this looks cool. Now, to confirm, this feature used to work this way in the old platform, but it will work this way in the new platform?” or “This is really powerful - Is this a new capability that didn’t exist in the old platform?” Take the time to capture and collect these questions and your answers as these help improve your marketing and sales messages.
Discuss Next Sprint: We conclude by making sure customers know how to access these new features if they want to try them (see next section) and our plans for the next Sprint. We naturally use this portion of the Sprint review to explore in greater detail any key questions our team might have on the Product Backlog Items planned for the Sprint. And we can use this to adjust priorities and re-rank items.
Customers often ask us if we record our Sprint reviews so that they can watch them. We've found that this is not really effective or compelling - a lot of overhead for little return. It is better to just attend the next Sprint review if you miss one.
Product Use Data
SaaS companies have a huge advantage over traditional on-premises software: We can easily instrument our solutions to capture a wide variety of usage metrics and use these data to improve our offerings. I'll be the first to admit that in the early development of our offerings we didn't take full advantage of this. More recently, though, we've instrumented more thoroughly, and our solutions are better off for it. These data are often included in our Sprint plans and product reviews.
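As a minimal sketch of what this kind of instrumentation can look like (the event names, fields and functions here are invented for illustration, not our actual schema), the idea is simply to record structured usage events and aggregate them for review:

```python
from collections import Counter
from datetime import datetime, timezone

# In-memory event store for illustration; a real SaaS deployment
# would write to an analytics pipeline or database instead.
EVENTS = []

def record_event(user_id, action, **details):
    """Capture one usage event with a UTC timestamp."""
    EVENTS.append({
        "user_id": user_id,
        "action": action,
        "at": datetime.now(timezone.utc).isoformat(),
        **details,
    })

def usage_by_action(events):
    """Aggregate events into per-action counts, e.g. for a Sprint review."""
    return Counter(e["action"] for e in events)

# Hypothetical events a collaboration platform might capture.
record_event("u1", "forum_created", kind="Buy a Feature")
record_event("u2", "forum_created", kind="Prune the Product Tree")
record_event("u1", "idea_added")

print(usage_by_action(EVENTS))
```

The per-action counts (and slices of them, by customer segment or feature) are exactly the kind of data that can feed Sprint plans and product reviews.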
I've gone on record - multiple times - that Surveys Suck. And most of the time, they do, because people create really terrible surveys and/or use them really badly. That said, sometimes individual opinions - whether you prefer broccoli, say, or your personal preference for a particular item - really do matter. In these circumstances, surveys (as an instrument) don't suck, and we use them too.
I’d like to now turn to the mechanics of how we implement the qualitative side to this work.
Dev - Preview - Production

Our customers can't make every Sprint meeting. And sometimes they just want to try things out on their own and explore what's coming more thoroughly than a demo might allow. Accordingly, we provide them with early access to the bits. Our approach manages early access to code through our dev and preview platforms, balancing access to major new functionality with the need to provide that functionality in a safe and familiar context. Let's explore how we do this in greater detail.
Structure

We run three platforms on the internet. And because security has to be an ongoing and relentless commitment of any vendor selling Enterprise software, all are secured. The platforms are:
http://weave-preview.conteneo.co ("preview" - aka, "staging")
http://weave.conteneo.co ("production" or "prod")
The purpose, data and structure of each platform is as follows:
| Platform | Purpose | Data & Notes |
| --- | --- | --- |
| dev | Earliest access to new functionality | dev is loaded with sample / test data. Brave explorers can try out this platform, but they are forewarned that not everything may work quite as they expect. Data on dev is not backed up and is periodically erased. |
| preview | Access to new functionality in the context of familiar data. Not designed to be scalable. | preview receives a periodic copy of the production database, typically when we move substantial functionality from dev to preview. This provides explorers with access to new capabilities in a comfortable context. Data on preview is not backed up and is periodically erased. Developers can also use preview to test migration scripts and the business team can use it to prepare documentation and marketing materials. |
| prod | Full production system designed and built on a highly scalable grid. | prod receives a periodic update of code from preview. Sometimes this is extremely frequent, as when we’re iterating on features or systems. Sometimes this is less frequent, as when we’re making substantial changes. Data on prod is always backed up. |
In plain English, we use dev to test out code and preview to give you early access to code with your data. And because preview has customer data, we make sure it is covered by ALL of the same security protocols and protections as production.
Our customers know that this means:
To use dev you have to create a new account on dev.
To use preview, use your normal production account.
DO NOT EVER USE dev or preview for "real" work. Really.
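To make the data-flow policy between the three platforms concrete, here's a toy model (purely illustrative; our real platforms are full database systems, and every name and function below is invented): preview is periodically refreshed from a copy of prod, while dev and preview are periodically erased and never backed up.

```python
import copy

# Toy stand-ins for the three platforms' data stores.
prod = {"forums": ["Q1 roadmap", "PB workshop"], "backed_up": True}
preview = {}
dev = {"forums": ["sample / test data"], "backed_up": False}

def refresh_preview(prod_db):
    """preview receives a periodic copy of the production database,
    so explorers see new code against familiar data."""
    snapshot = copy.deepcopy(prod_db)   # never mutate prod itself
    snapshot["backed_up"] = False       # preview data is not backed up
    return snapshot

def wipe(env_db):
    """dev and preview are periodically erased."""
    env_db.clear()
    env_db["forums"] = []
    env_db["backed_up"] = False

preview = refresh_preview(prod)
wipe(dev)
```

Because the preview snapshot contains real customer data, the real system applies all of production's security protocols to it - the copy step, not the erase step, is the one that demands care.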
Dealing with the Old in the Context of the New
When you're making substantial changes to the structure of an existing system, you have to determine how to manage upgrades, data migrations, access to old functionality and the explicit removal of the no-longer-needed or the obsolete (the pruning of your tree, to use one of my favorite metaphors ;-).
Our approach is to make certain that "old application code" remains available until the new code is capable of fully subsuming it. This allows us to develop and continually test things like data and schema migrations and APIs, virtually eliminating the problems companies often experience in deploying production systems.
This is a very cool and very powerful strategy created by the dev team - we go back and forth between the new platform under development and the old platform to help ensure we're improving every part of the new functionality. And it helps us identify the most important areas to highlight when communicating functionality with customers.
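As a hedged sketch of what continually testing a schema migration can look like (the record shapes and field names are invented for illustration), the pattern is to run the migration over copies of old-format data - on preview, say - and assert the new platform's invariants before anything touches prod:

```python
def migrate_v1_to_v2(record):
    """Hypothetical schema migration: v1 stored a single 'name' field;
    v2 splits it into 'first_name' and 'last_name'."""
    first, _, last = record["name"].partition(" ")
    migrated = {k: v for k, v in record.items() if k != "name"}
    migrated.update({"first_name": first, "last_name": last, "schema": 2})
    return migrated

def check_migration(old_records):
    """Run the migration over a copy of old data and verify the
    new platform's invariants hold for every migrated record."""
    new_records = [migrate_v1_to_v2(r) for r in old_records]
    assert all(r["schema"] == 2 for r in new_records)
    assert all("name" not in r for r in new_records)
    return new_records

# Old-format sample data, as might live on the preview platform.
old = [{"id": 1, "name": "Ada Lovelace"}, {"id": 2, "name": "Alan Turing"}]
new = check_migration(old)
```

Running checks like this every time code moves from dev to preview is what lets a team go back and forth between old and new platforms with confidence.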
Is The Glass House Fun?
We love developing in a glass house. Our customers can see what we’re working on and we get the benefit of frequent collaboration. Our dev team gets the benefit of safe development and the focus on getting code into internet-facing production systems helps shake out boatloads of problems that can arise when teams defer such things as data migrations, deployment processes and security testing.
I love that we have a means of working on new user story maps, new ways to capture and express functionality, and opportunities to remove old code in a way that lets us see how it affects old systems.
What Do Customers Think?
Our customers and partners love the glass house just as much, and perhaps even more. We've heard that few enterprise software vendors have this degree of commitment to collaboration. Indeed, some customers have asked me about the dangers of letting competitors see what's happening in our development. That might be a worry for some companies, but we don't worry about competitors: The need for improved collaboration is so strong that we welcome additional companies. Alas, developing enterprise software is not easy, and we've watched many so-called competitors fail.
Here is what a few customers and partners think about our approach.
I attended one of the sprint reviews. Here's what I liked. Customers from different companies attended so I was able to see what other people were interested in. Your team showed working software so I could see the software in action. You encouraged feedback and it was easy to give feedback about the product.
- Steve Neiderhauser, IBM Watson Health
I like that you’re, to use a ’90’s expression, eating your own dog food by seeking input directly from customers, partners and other collaborative ne'er-do-wells. The design jams have been positive in the sense that it’s open inquiry with a view to seeking understanding around requirements objectively. Then creating the stories for a series of sprints that reflect real user needs.
After we did the design jam last year, there was a great deal of progress in a very short space of time. As a contributor I felt engaged in the meeting, and that my contributions were followed up on through the release cycle and communication of progress.
- Mike Northcott, Spearfish Innovation
I have found the process to be beneficial to all because having open discussions and demos allows prompt feedback from all of us and, hopefully, saves work to developers. I have also noticed that response time is in accordance of the importance of the request instead of FIFO and that's great. For example, 36 hr. request-to-production time is awesome (that's faster than the amazing 3-day sinkhole repair in Fukuoka last week). I think it works well because of the emphasis on value delivery and immediate prioritization.
- Masa Maeda, Valuinnova
So, there you have it: how we develop in the Glass House. What does your house look like?