The implications for security tool selection
When we released Chalk™, someone asked on Twitter why we chose to use Nim as the programming language. The simple answer was that it was the best tool for the job. We didn’t choose an industry-standard language like C, Rust, or Go, and we have never heard anyone describe Nim as best-of-breed.
What I have observed is that when people choose tools, they typically either choose so-called best-of-breed tools or get a tool as part of a suite, something I refer to as ‘a tools meal deal’. People tend to be risk-averse and also choose tools that are perceived as industry standard.
I think the term best-of-breed is misleading, and instead people should always choose the best tool for the job.
The nuance of so-called best-of-breed tools today lies in the commonly accepted definition: the tool that is perceived or marketed as having the highest efficacy in detecting or preventing security issues. For a company, however, the best tool for the job may actually be the one that is easiest to deploy, drives the widest developer adoption, has the best success rate in getting tasks done, or wins on any number of other criteria, while not necessarily being the best at finding issues.
All of these things are measurable through controlled tests, including user studies, so it is not subjective. You put tools in front of users, ask them to complete specific tasks (find and fix issue A in a test suite, for example), and measure the results. By measuring task success rates and time to completion across a large enough sample, you can tell which tool is more effective at the thing you care about. That may be accuracy, but nine times out of ten, in my opinion, it isn’t when you actually think it through. Adoption may be key, because when the tide rises across an organization you will end up with a better overall security result. 1% adoption times 100% findings will always be less than 50% adoption times 50% findings, and you miss 100% of the shots on goal you don’t take.
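To make that arithmetic concrete, here is a minimal sketch. The `expected_coverage` helper and the rates plugged into it are hypothetical numbers for illustration, not measurements of any real tool:

```python
# Minimal sketch of the adoption-times-findings arithmetic above.
# The tool profiles and rates below are hypothetical, purely illustrative.

def expected_coverage(adoption_rate: float, detection_rate: float) -> float:
    """Fraction of an organization's issues a tool actually surfaces:
    the share of teams using it times the share of issues it detects."""
    return adoption_rate * detection_rate

# "Best-of-breed" scanner: perfect detection, but almost nobody uses it.
best_of_breed = expected_coverage(adoption_rate=0.01, detection_rate=1.00)

# "Good enough" scanner: half the detection, but half the org adopts it.
good_enough = expected_coverage(adoption_rate=0.50, detection_rate=0.50)

print(f"best-of-breed covers {best_of_breed:.0%} of issues")  # 1%
print(f"good-enough covers {good_enough:.0%} of issues")      # 25%
```

Under these assumed numbers the widely adopted tool surfaces 25 times more of the organization’s issues than the higher-efficacy one, which is the whole point about the rising tide.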
I also think the term industry standard would only be applicable if all tech environments, company structures, budgets, and cultures were identical clones. They are not, so it's bullshit.
Industry standard is a term that is mired in murky marketing and folklore. Sure, 9 out of 10 Fortune 100 companies may be using a tool, but big companies use one of everything, so that’s lying with statistics. Sure, your peers may be using it, but it's a huge industry; your peers are likely your peers for a reason, and they are not the populace of the industry. Peer standard, maybe, but industry standard is not true. Industry standard is a marketing phrase, repeated enough times that people believe it. If it were true, there would be one dominant company in each market segment and a bunch of contenders. In appsec there are a number of big players with significant market share, so by definition there is no industry standard.
I think we need to treat best-of-breed for what it is, and we need to shift our mental model to always selecting the best tool for the job. I also think we need to stop giving any credence to the phrase ‘industry standard’.