Our knowledge is sourced from a community of experts around the world. For example, all of our scripts for F5 devices were written by F5 experts. These experts have been using F5 devices for years; they are certified on the devices they write scripts for, and some of them are even F5 DevCentral MVPs.
When code is contributed by engineers from around the world, it is vital to ensure the code is of high quality. It is equally important to ensure the code is developed according to global best practices and reflects the results you’d expect from a superb team. Just as the Linux kernel open source project, which businesses everywhere depend on today, has a strict development process and quality controls, so does Indeni’s knowledge repository.
The overarching Indeni Knowledge creation process follows these steps:
- Script idea or request (“collaborate”) – a community member suggests a script to add to the repository, or a current user of Indeni asks for one. This idea or request is logged as a ticket (what we call an “IKP ticket”).
- Ticket assigned – one of the pre-vetted contributors (see “Contributor validation” below) is assigned the ticket.
- Script developed and tested (the “code”, “build” and “test” phases) – the contributor writes the script and follows Indeni’s guidelines for code review, testing and code contribution. Please see “Quality controls” below.
- Script release (“release” and “deploy”) – the script is included in the upcoming Knowledge update package and is released to customers around the world. It is deployed with a click of a button as part of the package.
- Ongoing script utilization analysis and feedback collection (“automate” and “analyze”) – the script’s behavior is continuously monitored through Indeni Insight. We track how many issues the script helps uncover, how well it is performing and which types of devices it is being used with. We also solicit user feedback to ensure the script does what a user would expect it to.
- Feedback recorded and actioned (“collaborate”) – the feedback (both machine and human) is recorded as new IKP tickets, which are then assigned to community contributors and the process starts anew.
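The lifecycle above can be sketched as a simple state machine. This is purely illustrative: the state names and transition rules below are invented for the example and do not reflect Indeni’s actual internal tooling.

```python
from enum import Enum, auto

class TicketState(Enum):
    """Hypothetical states mirroring the IKP ticket lifecycle above."""
    COLLABORATE = auto()   # idea/request logged as an IKP ticket
    ASSIGNED = auto()      # ticket assigned to a vetted contributor
    BUILD_TEST = auto()    # script developed, reviewed and tested
    RELEASED = auto()      # included in a Knowledge update package
    ANALYZE = auto()       # utilization tracked via Indeni Insight

# Legal transitions; analysis feeds back into new COLLABORATE tickets.
TRANSITIONS = {
    TicketState.COLLABORATE: {TicketState.ASSIGNED},
    TicketState.ASSIGNED: {TicketState.BUILD_TEST},
    TicketState.BUILD_TEST: {TicketState.RELEASED},
    TicketState.RELEASED: {TicketState.ANALYZE},
    TicketState.ANALYZE: {TicketState.COLLABORATE},
}

def advance(state: TicketState, nxt: TicketState) -> TicketState:
    """Move a ticket forward, rejecting transitions the process forbids."""
    if nxt not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition: {state.name} -> {nxt.name}")
    return nxt
```

The point of the closed loop is that a ticket never skips a phase: for example, a script cannot reach the released state without first passing through build and test.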
New content is typically released every two weeks, and we are working to make the release cycles shorter and shorter. In some cases, a script can be generated and delivered to a customer within four days of the request.
Our pre-distribution quality controls include:
- Contributor validation – each contributor is personally validated. We inspect their resume, talk to them and ensure they are truly experts in their field. We also verify their legitimacy to weed out bad actors. During the validation process, we provide them with challenges to ensure they can handle the work.
- Curation – we specifically select which scripts make it into the repository and which don’t. There is not a single script that gets included without sign-off by an Indeni employee.
- Double human verification – each script goes through a code review by a designated community member. Those who are allowed to conduct code reviews are very experienced and generally pedantic, catching even small errors.
- Manual testing in a lab – every script is manually tested against one or more devices which it supports, to ensure it works properly and does not impact the device negatively. There are two labs these are tested in, one in Tel Aviv and one in San Francisco, as well as mini-labs maintained by certain community members.
- Automated verification – an automated system, built on the Jenkins Continuous Integration / Continuous Delivery tool, ensures that each script has automated tests and conforms to our standards.
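As an illustration only, one check such an automated gate might run is verifying that every script ships with a matching test file. The directory layout, file extension and naming convention below are invented for this sketch and are not Indeni’s actual standard.

```python
from pathlib import Path

def scripts_missing_tests(repo: Path) -> list:
    """Return the names of scripts that lack a matching test file.

    Assumes a hypothetical layout, scripts/<name>.ind paired with
    tests/<name>_test.ind -- purely illustrative.
    """
    missing = []
    for script in sorted((repo / "scripts").glob("*.ind")):
        test_file = repo / "tests" / (script.stem + "_test.ind")
        if not test_file.exists():
            missing.append(script.name)
    return missing

# In a CI job, a non-empty result would fail the build, e.g.:
# if scripts_missing_tests(Path(".")): sys.exit(1)
```

A check like this is cheap to run on every contribution, so no script slips into the repository untested simply because a reviewer forgot to look.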
- Beta phase – certain scripts, or groups of scripts, go through a beta phase, in which customers and users may volunteer to test the scripts with their devices before they become production scripts.
A script must go through all of the quality controls in order to be released to users.