In this world, there is no such thing as perfect security.
Every app or service you use — even the websites you visit — has security bugs. Companies go through repeated rounds of testing, code reviews and audits — sometimes even bringing in third parties. Bugs still get missed — that happens — but when they are uncovered, companies can get hacked.
That’s where a bug bounty comes into play. A bug bounty is an open-door policy for anyone who finds a bug or a security flaw; bounties are critical for channeling those vulnerabilities back to your development team so they can be fixed before bad actors exploit them.
Bug bounties are an extension of your internal testing process: they incentivize hackers to report bugs and issues and get paid for their work, rather than dropping details of an unpatched vulnerability (a so-called “zero-day”) out of the blue for anyone else to take advantage of.
Bug bounties are a win-win, but paying hackers for bugs is only one part of the process. As is usually the case where security meets startup culture, getting the right system in place early is best.
Why you need a vulnerability disclosure program
A bug bounty is just a small part of the overall bug-hunting and remediation process.
You need a vulnerability disclosure program (VDP) that sets the rules of engagement for anyone who submits bugs in exchange for a payout. A VDP defines the scope of what can and can’t be tested; triages the incoming flow of reports to deal with critical bugs first; and manages bug bounty payouts. Crucially, these rules allow companies to exclude certain areas of their operations — such as domains or systems — while ensuring that security researchers are treated fairly and appropriately compensated.
Vulnerability disclosure programs can be a lot to manage in-house, which is why many companies outsource their programs to a VDP platform such as HackerOne, Bugcrowd or HackenProof, to name a few. These third parties also handle the bug bounty payouts, working with companies to determine the severity of a bug and paying out according to that scale.
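However you run your program, researchers need an obvious place to find it. One lightweight, standardized way to publish that entry point is a security.txt file (defined in RFC 9116), served from your site’s /.well-known/ path. A minimal sketch — the addresses and URLs below are placeholders, not real endpoints:

```txt
# Served at https://example.com/.well-known/security.txt (per RFC 9116)
# Contact and Expires are the required fields; the rest are optional.
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/security/vulnerability-disclosure-policy
Acknowledgments: https://example.com/security/hall-of-fame
Preferred-Languages: en
```

The Policy field is a natural place to link the scope, safe harbor language and payout terms discussed above, and Acknowledgments gives you a standing page for crediting researchers.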
Know the good and the bad
Over the years there have been plenty of horror stories from the security community. Hackers have been threatened with legal action, gagged from disclosing findings and, many feel, cheated out of their bounties for finding bugs that have allegedly already been found.
No system is perfect. But when vulnerability disclosure programs and bug bounties work together, they work well.
These are some of the dos and don’ts that startups need to know when planning their vulnerability disclosure programs:
- Recognize your hackers. Hackers and researchers work hard to find bugs, often altruistically. More often than not, bug finders are happy just to see the issue fixed. But recognition is important. Whether it’s a thank-you, company swag like a T-shirt or other merchandise, or payment through a bug bounty, it’s important to recognize researchers for their work. Startup budgets are already stretched thin, and paying out for bugs can quickly get costly, but companies should — if they can — at least allocate a set budget for payouts on higher-severity bugs.
- Safe harbor is important but not a panacea. Hackers face legal threats all the time. Some companies, like password manager maker Keeper and drone maker DJI, historically took an aggressive approach to good-faith security research. On the flip side, many companies are increasingly embracing good-faith research and putting “safe harbor” clauses into their vulnerability disclosure programs to affirm their commitment to shielding researchers from legal action. Safe harbor provisions are promises that companies will not file criminal or civil charges under U.S. hacking laws, like the Computer Fraud and Abuse Act, so long as the hacker or researcher sticks to the rules of the program. It’s not a panacea — the government can still bring charges — but it’s a start.
- Don’t threaten security researchers. It should go without saying that actively threatening security researchers with legal action — whether warranted or otherwise — sets an incredibly bad example. It shows heavy-handedness and weakens the program for other would-be bug finders. If a hacker thinks they will be sued, there’s less chance they will want to report a bug, which puts your vulnerability disclosure program at risk. As a knock-on effect, your company’s overall security will suffer.
- Avoid non-disclosure agreements. And finally, vulnerability disclosure programs are not just about disclosing a bug to the company; they are about coordinated disclosure to the public. That’s in part what these bug bounty platforms help with. Once a bug is fixed, it’s published, so that not only can the hacker or researcher take credit for their work, but their findings can also help others learn, understand and find other, similar bugs going forward. Security is not proprietary; it’s an ongoing collective learning effort. Non-disclosure agreements prevent such disclosures and harm security research.