Should Startups Write Tests?

Should startups write tests? Coming from Testery, a company created by testers for testers, our answers might surprise you.

Startups face unique challenges that more established organizations don't worry as much about. In a startup, the goal is often to move as quickly as possible to find something that people will pay for. You may ship a set of features, promote it for a few months, rip it all out, ship another set of features, replace the entire architecture, etc.

Because of all this change, many are tempted to forgo automated testing altogether. "We're changing too fast, it'd be a waste of time." This line of thinking misses an incredibly important point. A well-executed approach to automated testing will help you go faster.

Modern applications typically have a solid API layer that abstracts the database and business logic away from the UI layer. This has made UI tests, while still important, less important than they used to be. Conversely, it has meant that putting solid API tests in place is probably more important than before.

It's also important to point out that startups come in many different flavors. Are you bootstrapped? VC backed? Pre-revenue? Post-revenue? Working in a highly regulated environment? How big is the team? The answers to each of these questions can influence the right approach for you. Reach out to Testery if you'd like more personalized recommendations for your situation, but in general we recommend the following.

Automated Testing When You're Pre-Revenue / Pre-Funding

If you are pre-revenue / pre-funding, the focus is simply on getting something out there as quickly as possible to prove you have enough traction to warrant further investment. At this point, things like chasing down code coverage and putting in place E2E tests are probably a distraction. That said, there are some tests you should definitely write at this stage.

Write simple integration tests for every API endpoint. These tests should focus simply on whether you get the right return code from the API and whether the API returns the right fields in the right format. They should run in CI/CD every time code is pushed. And yes, CI/CD is worth setting up from the very beginning. The very first commit for Testery was an API integration test for API authentication.
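To make that concrete, here is a minimal sketch of what one of these tests might look like using pytest and requests. The base URL, the /users endpoint, and the field names are placeholders; swap in your own API.

```python
# Minimal endpoint-level integration test: right status code, right shape.
# API_BASE_URL and the /users endpoint are placeholders for your own service.
import os
import requests

BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8000")

def test_list_users_returns_200_and_expected_fields():
    response = requests.get(f"{BASE_URL}/users", timeout=10)

    # At this stage, only assert the return code and the response format.
    assert response.status_code == 200
    body = response.json()
    assert isinstance(body, list)
    if body:
        user = body[0]
        assert "id" in user
        assert "email" in user
```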

Write unit tests for any classes that contain complex or core business logic. Unit tests are inexpensive to write and inexpensive to run. If there's a section of your codebase that is the "brain" of the project, wrapping that logic in unit tests will help you remember what you did when you come back to look at it again in a few months. It will also serve as documentation for the future developers you hope to have on the team.
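As a sketch, a couple of unit tests around a hypothetical pricing rule might look like this; calculate_discount stands in for whatever function holds your core logic.

```python
# Unit tests for the "brain" of the product. The pricing module, the
# calculate_discount function, and the discount rule are illustrative only.
import pytest
from pricing import calculate_discount

def test_volume_discount_applies_over_threshold():
    # In this example rule set, orders of 100+ units earn a 10% discount.
    assert calculate_discount(quantity=100, unit_price=5.00) == pytest.approx(50.00)

def test_no_discount_below_threshold():
    assert calculate_discount(quantity=10, unit_price=5.00) == pytest.approx(0.00)
```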

Leverage AI assistants to write tests more quickly. While AI still has a long way to go before it can be fully relied upon for all of our testing needs, putting unit tests in place often requires writing boilerplate code. AI assistants like GitHub Copilot can generate that boilerplate more quickly so you can focus on the business logic that needs to be tested.

Automated Testing When You're Post-Revenue / Post-Funding

Once you become post-seed / Series A / post-revenue, the emphasis shifts from simply getting something out there to putting in place teams, processes, and tools that can scale and grow with you. You may have customers now, and customers expect that the things they want to do will simply work.

Expand your API integration tests. You already have integration tests for all your API endpoints, but they might not be asserting that the data is correct, checking that records were actually written to the database, or covering different scenarios. Now is a great time to add tests for those additional scenarios and stronger assertions to your existing tests.
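For example, an expanded version of an endpoint test might assert on the response payload and then check the database directly. The /orders endpoint, the get_db_connection helper, and the SQL placeholder style below are all assumptions to adapt to your stack.

```python
# A deeper integration test: create a record through the API, then verify both
# the response payload and the row that should now exist in the database.
import os
import requests
from myapp.db import get_db_connection  # hypothetical helper for your test database

BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8000")

def test_create_order_persists_correct_data():
    payload = {"sku": "WIDGET-1", "quantity": 3}
    response = requests.post(f"{BASE_URL}/orders", json=payload, timeout=10)

    # Assert on the response data itself, not just the status code.
    assert response.status_code == 201
    order = response.json()
    assert order["sku"] == "WIDGET-1"
    assert order["quantity"] == 3

    # Then confirm the write actually reached the database.
    with get_db_connection() as conn:
        cur = conn.cursor()
        cur.execute("SELECT sku, quantity FROM orders WHERE id = %s", (order["id"],))
        row = cur.fetchone()
    assert row == ("WIDGET-1", 3)
```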

Write tests for bug fixes. A good practice to start at this phase is writing a test whenever a bug is fixed. This ensures you don't make the same mistake over and over again. Early adopters may be forgiving when they encounter issues, but if they keep hitting the same issues over and over again, they may bail on you.
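A regression test written alongside a bug fix can be as small as one assertion that reproduces the original failure. The rounding bug and the calculate_total function below are purely illustrative.

```python
# Regression test for a (hypothetical) bug where 3 items at $0.10 with 7% tax
# returned 0.32100000000000006 instead of 0.32. The failing case from the bug
# report becomes a permanent test so the mistake can't quietly come back.
import pytest
from pricing import calculate_total

def test_total_rounds_to_two_decimal_places():
    assert calculate_total(unit_price=0.10, quantity=3, tax_rate=0.07) == pytest.approx(0.32, abs=0.005)
```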

Automate mission-critical / smoke tests. Now that you have customers, what are the things they expect to work after every release? Try to find the 5 to 15 most important user flows and make sure they are well tested. For many organizations, these flows are things like lead generation, sign-ups, shopping carts, and report generation.
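One way to keep these separate from the rest of the suite is to tag them with a pytest marker and run just that subset after each release with `pytest -m smoke`. The sign-up flow and endpoints below are placeholders for whatever your own critical flows are.

```python
# A smoke test for one critical flow (sign-up then log-in), tagged so it can be
# run on its own after every deploy. Register the "smoke" marker in pytest.ini
# to avoid unknown-marker warnings. Endpoints and payloads are placeholders.
import os
import uuid
import pytest
import requests

BASE_URL = os.environ.get("API_BASE_URL", "http://localhost:8000")

@pytest.mark.smoke
def test_new_user_can_sign_up_and_log_in():
    email = f"smoke-{uuid.uuid4().hex[:8]}@example.com"

    signup = requests.post(
        f"{BASE_URL}/signup",
        json={"email": email, "password": "correct-horse"},
        timeout=10,
    )
    assert signup.status_code == 201

    login = requests.post(
        f"{BASE_URL}/login",
        json={"email": email, "password": "correct-horse"},
        timeout=10,
    )
    assert login.status_code == 200
    assert "token" in login.json()
```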