full integration testing and planning your timelines

Something recently occurred to me. Nobody I know who talks about microservices also talks about integration testing and the effect it has on timelines.

By integration testing I don't mean "my service can talk to the database I created at the same time". That's important, but it's not 'real integration testing'; that's just testing.

What I mean by integration testing is taking your dozens of microservices created by various teams, running them all at once, and having them actually talk to each other. No mocks or other cheats, actual communication just like how it will run in production.

That type of testing is hard and expensive. But it needs to be done, or you're in for a nasty surprise when UAT starts.
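To make the "no mocks" part concrete, here is a minimal, self-contained sketch of the idea in Python. The service names, endpoints, and payloads are entirely made up for illustration; a real setup would launch the actual services (e.g. via containers) rather than in-process threads, but the point is the same: the test drives one service, and that service really calls the other over the network.

```python
# Sketch of a "no mocks" integration test: two tiny HTTP services run as
# real servers and talk to each other over real sockets. Service names,
# endpoints, and payloads are hypothetical, purely for illustration.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

def start_server(handler_cls):
    """Start an HTTP server on an OS-assigned port; return its base URL."""
    server = HTTPServer(("127.0.0.1", 0), handler_cls)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return f"http://127.0.0.1:{server.server_address[1]}"

class InventoryService(BaseHTTPRequestHandler):
    """Stand-in 'inventory' microservice."""
    def do_GET(self):
        body = json.dumps({"sku": "abc", "in_stock": 3}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # keep test output quiet
        pass

inventory_url = start_server(InventoryService)

class OrderService(BaseHTTPRequestHandler):
    """Stand-in 'orders' microservice: calls inventory over real HTTP."""
    def do_GET(self):
        with urllib.request.urlopen(f"{inventory_url}/stock/abc") as resp:
            stock = json.load(resp)
        body = json.dumps({"can_order": stock["in_stock"] > 0}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):
        pass

orders_url = start_server(OrderService)

# The integration test: hit the orders service and let it really call
# inventory. No mock of the downstream service is involved.
with urllib.request.urlopen(f"{orders_url}/orders/check") as resp:
    result = json.load(resp)
print(result["can_order"])
```

Scaling this from two toy services to dozens of real ones (with their own databases, queues, and deploy pipelines) is exactly the cost the post is asking about.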

But have any of you actually included it in your timelines? Do you ever reject microservices in favor of a monolith because you don't think you can afford the testing overhead?


Do you mean the impact these integration tests will have on the regular (hopefully daily / multiple times per day) build & deploy time, or the overall delay that needing to build these integration tests will add to the project (i.e. we need to build 12 flows and that will add 3 weeks to the project)? I could see an impact on both; I was curious which you are focused on.

Makes sense. We were in a situation where we had not developed the testing we needed as we started the business (bootstrapped, but it became successful). Going back and putting in tests was painful and incomplete.

It is necessary to understand which system integration testing methodology suits each particular software solution, both to save time and resources and to achieve the most accurate results. Each type of integration testing is intended for differently composed systems.
