Unit tests verify that your code logic is correct when all external dependencies behave perfectly. But in the real world, your microservice must interact with databases that have their own query semantics, message brokers with delivery guarantees, caches with eviction policies, and third-party APIs with rate limits. Integration tests bridge the gap between isolated unit tests and the messy reality of infrastructure.
Integration testing in microservices answers a critical question: Does my service actually work when connected to real dependencies? A perfectly passing unit test suite means nothing if your SQL query syntax is wrong, your Kafka consumer offset management is broken, or your HTTP client timeout configuration doesn't match the upstream service's response time.
By the end of this page, you will understand how to design and implement integration tests that verify your service's interactions with databases, message brokers, caches, and external services. You'll learn about test containers, database testing strategies, message broker testing, and how to structure integration test suites for speed and reliability.
The term 'integration testing' is notoriously overloaded. Different teams use it to mean different things, leading to confusion about what's being tested and how. In the microservices context, we need precise definitions.
Integration Testing Scope Levels:
Service-Infrastructure Integration: Testing a single service against its direct infrastructure dependencies (database, cache, message broker). This is the focus of this page.
Service-to-Service Integration: Testing the interaction between two or more services. This is typically covered by contract testing and component testing.
Full System Integration: Testing the entire system end-to-end. This is covered by end-to-end testing.
For this page, integration testing means: Verifying that a microservice correctly interacts with its infrastructure dependencies—databases, caches, message brokers, file systems—using real or realistic instances of those dependencies.
| Aspect | Unit Tests | Integration Tests | E2E Tests |
|---|---|---|---|
| Dependencies | All mocked | Real infrastructure | Entire system running |
| Speed | Milliseconds | Seconds | Minutes |
| Failure isolation | Exact location | General area | Broad scope |
| Setup complexity | Minimal | Moderate | High |
| Flakiness risk | Low | Medium | High |
| Bug detection | Logic errors | I/O and integration bugs | System behavior issues |
Integration tests occupy a crucial middle ground in the testing pyramid. They're slow enough that you can't run thousands of them, but fast enough to run on every commit. They catch bugs that unit tests miss (wrong SQL, serialization issues, transaction problems) without the setup complexity and flakiness of full E2E tests. A healthy microservice typically has 10-50 integration tests covering its major infrastructure interactions.
The traditional challenge with integration testing was environment consistency. 'Works on my machine' failures often stemmed from differences between local development databases, CI databases, and production databases. Testcontainers revolutionized integration testing by providing Docker-based, disposable instances of real infrastructure.
How Testcontainers Works: your test suite programmatically starts real Docker containers (PostgreSQL, Redis, Kafka, and most other infrastructure) on random free host ports, waits until each container reports ready, hands your tests the connection details, and disposes of the containers when the test process exits.

Benefits:

- Tests run against the same engine versions as production, not in-memory approximations
- Identical behavior on every developer machine and in CI, eliminating "works on my machine" drift
- Disposable, isolated state per run, with no shared test database to corrupt
- No manual environment setup beyond a working Docker daemon
```typescript
import { PostgreSqlContainer, StartedPostgreSqlContainer } from "@testcontainers/postgresql";
import { RedisContainer, StartedRedisContainer } from "@testcontainers/redis";
import { KafkaContainer, StartedKafkaContainer } from "@testcontainers/kafka";
import { PrismaClient } from "@prisma/client";

// Global container instances for the test suite
let postgresContainer: StartedPostgreSqlContainer;
let redisContainer: StartedRedisContainer;
let kafkaContainer: StartedKafkaContainer;
let prisma: PrismaClient;

// Setup before all tests in the suite
beforeAll(async () => {
  // Start containers in parallel for faster setup
  [postgresContainer, redisContainer, kafkaContainer] = await Promise.all([
    new PostgreSqlContainer("postgres:15-alpine")
      .withDatabase("test_db")
      .withUsername("test")
      .withPassword("test")
      .start(),
    new RedisContainer("redis:7-alpine").start(),
    new KafkaContainer("confluentinc/cp-kafka:7.4.0").start(),
  ]);

  // Configure application to use test containers
  process.env.DATABASE_URL = postgresContainer.getConnectionUri();
  process.env.REDIS_URL = `redis://${redisContainer.getHost()}:${redisContainer.getMappedPort(6379)}`;
  process.env.KAFKA_BROKERS = kafkaContainer.getBootstrapServers();

  // Initialize Prisma with test database
  prisma = new PrismaClient();

  // Run migrations to set up schema
  await runMigrations(postgresContainer.getConnectionUri());

  console.log("Test infrastructure ready");
}, 60000); // 60 second timeout for container startup

// Cleanup after all tests
afterAll(async () => {
  await prisma.$disconnect();
  await Promise.all([
    postgresContainer?.stop(),
    redisContainer?.stop(),
    kafkaContainer?.stop(),
  ]);
});

// Reset state between tests
beforeEach(async () => {
  // Clear database tables (faster than recreating containers)
  await prisma.$executeRaw`TRUNCATE TABLE orders, customers, products CASCADE`;

  // Flush Redis
  const redis = getRedisClient();
  await redis.flushAll();
});

// Export for use in tests
export { prisma, postgresContainer, redisContainer, kafkaContainer };
```

Starting containers for every test file is slow. Use a suite-level container lifecycle (start once, clean between tests) to balance isolation with speed. Modern Testcontainers implementations also support container reuse across test runs, further speeding up local development while maintaining CI reproducibility.
Database interactions are the most common integration test target. These tests verify that your repository implementations, queries, and transactions behave correctly against a real database engine.
What Database Integration Tests Should Verify:

- Queries return the expected rows: filters, joins, ordering, and pagination against real data
- Writes persist correctly, including upserts and multi-statement transactions
- Schema constraints (foreign keys, uniqueness, NOT NULL) behave the way the code assumes
- Mapping between database records and domain objects round-trips without data loss
```typescript
// Repository implementation
class PrismaOrderRepository implements OrderRepository {
  constructor(private readonly prisma: PrismaClient) {}

  async save(order: Order): Promise<void> {
    await this.prisma.order.upsert({
      where: { id: order.id },
      update: {
        status: order.status,
        customerId: order.customerId,
        totalAmount: order.totalAmount.toDecimal(),
        updatedAt: new Date(),
      },
      create: {
        id: order.id,
        status: order.status,
        customerId: order.customerId,
        totalAmount: order.totalAmount.toDecimal(),
        createdAt: order.createdAt,
        updatedAt: new Date(),
      },
    });

    // Save order items in a transaction
    await this.prisma.$transaction([
      this.prisma.orderItem.deleteMany({ where: { orderId: order.id } }),
      this.prisma.orderItem.createMany({
        data: order.items.map(item => ({
          orderId: order.id,
          productId: item.productId,
          quantity: item.quantity,
          unitPrice: item.unitPrice.toDecimal(),
        })),
      }),
    ]);
  }

  async findById(id: string): Promise<Order | null> {
    const data = await this.prisma.order.findUnique({
      where: { id },
      include: { items: true },
    });
    return data ? Order.fromPersistence(data) : null;
  }

  async findByCustomerWithStatus(
    customerId: string,
    status: OrderStatus,
    pagination: Pagination
  ): Promise<PaginatedResult<Order>> {
    const [orders, count] = await this.prisma.$transaction([
      this.prisma.order.findMany({
        where: { customerId, status },
        include: { items: true },
        orderBy: { createdAt: "desc" },
        skip: pagination.offset,
        take: pagination.limit,
      }),
      this.prisma.order.count({
        where: { customerId, status },
      }),
    ]);

    return {
      items: orders.map(Order.fromPersistence),
      total: count,
      hasMore: pagination.offset + orders.length < count,
    };
  }
}

// Integration tests for the repository
describe("PrismaOrderRepository Integration Tests", () => {
  let repository: PrismaOrderRepository;

  beforeEach(async () => {
    await prisma.$executeRaw`TRUNCATE TABLE order_items, orders, customers CASCADE`;
    repository = new PrismaOrderRepository(prisma);

    // Seed test customer (foreign key dependency)
    await prisma.customer.create({
      data: { id: "cust-1", email: "test@example.com", name: "Test User" },
    });
  });

  describe("save", () => {
    it("should persist new order with items", async () => {
      const order = OrderBuilder.anOrder()
        .withId("ord-123")
        .withCustomerId("cust-1")
        .withItems([
          { productId: "prod-1", quantity: 2, unitPrice: Money.of(10) },
          { productId: "prod-2", quantity: 1, unitPrice: Money.of(25) },
        ])
        .build();

      await repository.save(order);

      // Verify directly in database
      const stored = await prisma.order.findUnique({
        where: { id: "ord-123" },
        include: { items: true },
      });

      expect(stored).not.toBeNull();
      expect(stored!.customerId).toBe("cust-1");
      expect(stored!.items).toHaveLength(2);
      expect(stored!.totalAmount).toEqual(new Decimal("45.00"));
    });

    it("should update existing order", async () => {
      // Arrange - create initial order
      const order = OrderBuilder.anOrder()
        .withId("ord-456")
        .withCustomerId("cust-1")
        .withStatus(OrderStatus.DRAFT)
        .build();
      await repository.save(order);

      // Act - update status
      order.confirm();
      await repository.save(order);

      // Assert
      const stored = await prisma.order.findUnique({ where: { id: "ord-456" } });
      expect(stored!.status).toBe(OrderStatus.CONFIRMED);
    });

    it("should replace items when order is updated", async () => {
      const order = OrderBuilder.anOrder()
        .withId("ord-789")
        .withCustomerId("cust-1")
        .withItems([{ productId: "prod-1", quantity: 1, unitPrice: Money.of(10) }])
        .build();
      await repository.save(order);

      // Update items
      order.clearItems();
      order.addItem({ productId: "prod-2", quantity: 3, unitPrice: Money.of(15) });
      await repository.save(order);

      const stored = await prisma.order.findUnique({
        where: { id: "ord-789" },
        include: { items: true },
      });

      expect(stored!.items).toHaveLength(1);
      expect(stored!.items[0].productId).toBe("prod-2");
      expect(stored!.items[0].quantity).toBe(3);
    });
  });

  describe("findByCustomerWithStatus", () => {
    beforeEach(async () => {
      // Seed 25 orders for pagination testing
      const orders = Array.from({ length: 25 }, (_, i) =>
        OrderBuilder.anOrder()
          .withId(`ord-${i.toString().padStart(3, "0")}`)
          .withCustomerId("cust-1")
          .withStatus(i < 15 ? OrderStatus.CONFIRMED : OrderStatus.DRAFT)
          .build()
      );
      for (const order of orders) {
        await repository.save(order);
      }
    });

    it("should return paginated results for specific status", async () => {
      const result = await repository.findByCustomerWithStatus(
        "cust-1",
        OrderStatus.CONFIRMED,
        { offset: 0, limit: 10 }
      );

      expect(result.items).toHaveLength(10);
      expect(result.total).toBe(15);
      expect(result.hasMore).toBe(true);
      result.items.forEach(order => {
        expect(order.status).toBe(OrderStatus.CONFIRMED);
      });
    });

    it("should return empty result for non-existent customer", async () => {
      const result = await repository.findByCustomerWithStatus(
        "nonexistent",
        OrderStatus.CONFIRMED,
        { offset: 0, limit: 10 }
      );

      expect(result.items).toHaveLength(0);
      expect(result.total).toBe(0);
      expect(result.hasMore).toBe(false);
    });

    it("should handle pagination correctly at boundaries", async () => {
      const result = await repository.findByCustomerWithStatus(
        "cust-1",
        OrderStatus.CONFIRMED,
        { offset: 10, limit: 10 }
      );

      expect(result.items).toHaveLength(5); // Only 5 remaining
      expect(result.hasMore).toBe(false);
    });
  });
});
```

Don't substitute H2 for PostgreSQL or SQLite for MySQL in integration tests. Each database has unique behavior around NULL handling, collation, JSON, transactions, and query optimization. Testing against an in-memory substitute may miss bugs that only appear with your production database engine.
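One inexpensive safeguard against accidentally pointing the suite at a substitute engine is a guard check on the reported server version. The helper below is a sketch; `assertEngine` and the Prisma query shown in the comment are illustrative names, not from the source.

```typescript
// Hypothetical guard: given the output of PostgreSQL's `SELECT version()`,
// fail fast if the integration suite is not talking to the engine and
// major version that production uses.
function assertEngine(versionString: string, expected = "PostgreSQL 15"): void {
  if (!versionString.includes(expected)) {
    throw new Error(
      `Integration tests must run against ${expected}, but connected to: ${versionString}`
    );
  }
}

// In a suite-level test (assuming the Testcontainers setup above):
// const [{ version }] = await prisma.$queryRaw<{ version: string }[]>`SELECT version()`;
// assertEngine(version);
```

A single test like this costs milliseconds and turns a silent environment mismatch into an immediate, descriptive failure.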
Caching logic is notoriously bug-prone: cache invalidation timing, serialization mismatches, TTL misconfiguration, and race conditions can all cause subtle production issues. Integration tests with real cache infrastructure catch these bugs before deployment.
Key Cache Testing Scenarios:

- A cache miss fetches from the database and populates the cache
- A cache hit serves data without a database query, including stale data until invalidation
- Writes invalidate (or update) the affected cache entries
- TTL expiry triggers a fresh database read
- Rich values (Money, Date) survive the serialization round-trip
```typescript
// Cached repository implementation
class CachedProductRepository implements ProductRepository {
  private readonly CACHE_PREFIX = "product:";
  private readonly TTL_SECONDS = 300; // 5 minutes

  constructor(
    private readonly redis: RedisClient,
    private readonly database: PrismaClient
  ) {}

  async findById(id: string): Promise<Product | null> {
    const cacheKey = `${this.CACHE_PREFIX}${id}`;

    // Try cache first
    const cached = await this.redis.get(cacheKey);
    if (cached) {
      return Product.fromJSON(JSON.parse(cached));
    }

    // Cache miss - fetch from database
    const data = await this.database.product.findUnique({ where: { id } });
    if (!data) return null;

    const product = Product.fromPersistence(data);

    // Populate cache
    await this.redis.setEx(
      cacheKey,
      this.TTL_SECONDS,
      JSON.stringify(product.toJSON())
    );

    return product;
  }

  async save(product: Product): Promise<void> {
    // Save to database
    await this.database.product.upsert({
      where: { id: product.id },
      update: product.toPersistence(),
      create: product.toPersistence(),
    });

    // Invalidate cache (write-through would populate, cache-aside invalidates)
    await this.redis.del(`${this.CACHE_PREFIX}${product.id}`);
  }

  async findByCategory(category: string): Promise<Product[]> {
    const cacheKey = `products:category:${category}`;

    const cached = await this.redis.get(cacheKey);
    if (cached) {
      return JSON.parse(cached).map(Product.fromJSON);
    }

    const products = await this.database.product.findMany({
      where: { category },
    });
    const result = products.map(Product.fromPersistence);

    // Cache with shorter TTL for list queries
    await this.redis.setEx(cacheKey, 60, JSON.stringify(result.map(p => p.toJSON())));

    return result;
  }
}

// Integration tests for caching behavior
describe("CachedProductRepository Integration Tests", () => {
  let repository: CachedProductRepository;
  let redis: RedisClient;

  beforeEach(async () => {
    await prisma.$executeRaw`TRUNCATE TABLE products CASCADE`;
    redis = getRedisClient();
    await redis.flushAll();
    repository = new CachedProductRepository(redis, prisma);
  });

  describe("findById", () => {
    it("should populate cache on first read", async () => {
      // Arrange - product only in database
      await prisma.product.create({
        data: { id: "prod-1", name: "Widget", price: 29.99, category: "widgets" },
      });

      // Act - first read
      const product = await repository.findById("prod-1");

      // Assert - product returned and cached
      expect(product).not.toBeNull();
      expect(product!.name).toBe("Widget");

      const cached = await redis.get("product:prod-1");
      expect(cached).not.toBeNull();
      expect(JSON.parse(cached!).name).toBe("Widget");
    });

    it("should return from cache without database query on second read", async () => {
      await prisma.product.create({
        data: { id: "prod-2", name: "Gadget", price: 49.99, category: "gadgets" },
      });

      // First read - populates cache
      await repository.findById("prod-2");

      // Modify database directly (simulating external change)
      await prisma.product.update({
        where: { id: "prod-2" },
        data: { name: "Updated Gadget" },
      });

      // Second read - should return cached (stale) value
      const product = await repository.findById("prod-2");
      expect(product!.name).toBe("Gadget"); // Cached value, not updated
    });

    it("should respect TTL and refresh from database after expiry", async () => {
      await prisma.product.create({
        data: { id: "prod-3", name: "Temporary", price: 10, category: "temp" },
      });

      await repository.findById("prod-3");

      // Manually expire the cache entry (simulating TTL expiry)
      await redis.expire("product:prod-3", 0);

      // Update database
      await prisma.product.update({
        where: { id: "prod-3" },
        data: { name: "Refreshed" },
      });

      // Should fetch fresh data
      const product = await repository.findById("prod-3");
      expect(product!.name).toBe("Refreshed");
    });
  });

  describe("save", () => {
    it("should invalidate cache when product is updated", async () => {
      // Setup - create and cache product
      const product = new Product("prod-4", "Original", Money.of(20), "test");
      await repository.save(product);
      await repository.findById("prod-4"); // Populate cache

      // Verify cached
      let cached = await redis.get("product:prod-4");
      expect(cached).not.toBeNull();

      // Update product
      product.updateName("Modified");
      await repository.save(product);

      // Cache should be invalidated
      cached = await redis.get("product:prod-4");
      expect(cached).toBeNull();

      // Next read should get fresh data
      const fresh = await repository.findById("prod-4");
      expect(fresh!.name).toBe("Modified");
    });
  });

  describe("cache serialization", () => {
    it("should correctly serialize and deserialize Money values", async () => {
      const product = new Product(
        "prod-5",
        "Expensive Item",
        Money.of(1234.56),
        "luxury"
      );
      await repository.save(product);

      const retrieved = await repository.findById("prod-5");

      // Money object should be correctly reconstituted
      expect(retrieved!.price.toDecimal()).toBe(1234.56);
      expect(retrieved!.price.currency).toBe("USD");
    });

    it("should correctly serialize and deserialize Date values", async () => {
      const specificDate = new Date("2024-06-15T10:30:00Z");
      await prisma.product.create({
        data: {
          id: "prod-6",
          name: "Dated Product",
          price: 10,
          category: "test",
          createdAt: specificDate,
        },
      });

      const product = await repository.findById("prod-6");

      // Date should survive cache round-trip
      expect(product!.createdAt.toISOString()).toBe(specificDate.toISOString());
    });
  });
});
```

Event-driven microservices depend on message brokers (Kafka, RabbitMQ, SQS) for inter-service communication. Integration tests must verify that messages are correctly produced, consumed, and processed—including edge cases like message ordering, duplicate handling, and failure recovery.
Message Broker Testing Challenges:

- Asynchrony: production and consumption happen on separate timelines, so tests must wait for observed state rather than assume it
- Consumer readiness: a consumer that finishes joining its group after messages are published may never see them
- Ordering: delivery-order guarantees typically hold only within a single partition or queue
- Isolation: parallel runs must not consume each other's messages, hence unique topics and consumer groups per run
```typescript
import { Kafka, Producer, Consumer, EachMessagePayload } from "kafkajs";

// Event producer service
class OrderEventProducer {
  constructor(
    private readonly producer: Producer,
    private readonly topic: string
  ) {}

  async publishOrderCreated(order: Order): Promise<void> {
    await this.producer.send({
      topic: this.topic,
      messages: [{
        key: order.id, // Partition by order ID for ordering
        value: JSON.stringify({
          eventType: "OrderCreated",
          orderId: order.id,
          customerId: order.customerId,
          totalAmount: order.totalAmount.toDecimal(),
          timestamp: new Date().toISOString(),
        }),
        headers: {
          "event-type": "OrderCreated",
          "correlation-id": order.correlationId,
        },
      }],
    });
  }

  async publishOrderStatusChanged(
    orderId: string,
    previousStatus: OrderStatus,
    newStatus: OrderStatus
  ): Promise<void> {
    await this.producer.send({
      topic: this.topic,
      messages: [{
        key: orderId,
        value: JSON.stringify({
          eventType: "OrderStatusChanged",
          orderId,
          previousStatus,
          newStatus,
          timestamp: new Date().toISOString(),
        }),
      }],
    });
  }
}

// Integration tests for Kafka producer/consumer
describe("Order Event Integration Tests", () => {
  let kafka: Kafka;
  let producer: Producer;
  let consumer: Consumer;
  let orderEventProducer: OrderEventProducer;
  const testTopic = "orders-test-" + Date.now(); // Unique topic per test run

  beforeAll(async () => {
    kafka = new Kafka({
      clientId: "integration-test",
      brokers: [kafkaContainer.getBootstrapServers()],
    });

    producer = kafka.producer();
    await producer.connect();
    orderEventProducer = new OrderEventProducer(producer, testTopic);

    // Create test topic
    const admin = kafka.admin();
    await admin.connect();
    await admin.createTopics({
      topics: [{ topic: testTopic, numPartitions: 3 }],
    });
    await admin.disconnect();
  });

  afterAll(async () => {
    await producer.disconnect();
    if (consumer) await consumer.disconnect();
  });

  describe("publishOrderCreated", () => {
    it("should publish event with correct structure and headers", async () => {
      const receivedMessages: EachMessagePayload[] = [];

      // Set up consumer
      consumer = kafka.consumer({ groupId: "test-group-" + Date.now() });
      await consumer.connect();
      await consumer.subscribe({ topic: testTopic, fromBeginning: true });

      const messagePromise = new Promise<void>((resolve) => {
        consumer.run({
          eachMessage: async (payload) => {
            receivedMessages.push(payload);
            if (receivedMessages.length === 1) resolve();
          },
        });
      });

      // Publish event
      const order = OrderBuilder.anOrder()
        .withId("ord-test-123")
        .withCustomerId("cust-456")
        .withTotalAmount(Money.of(99.99))
        .withCorrelationId("corr-789")
        .build();

      await orderEventProducer.publishOrderCreated(order);

      // Wait for message to be received (with timeout)
      await Promise.race([
        messagePromise,
        new Promise((_, reject) =>
          setTimeout(() => reject(new Error("Timeout")), 10000)
        ),
      ]);

      // Verify message content
      const message = receivedMessages[0];
      expect(message.message.key?.toString()).toBe("ord-test-123");

      const payload = JSON.parse(message.message.value!.toString());
      expect(payload.eventType).toBe("OrderCreated");
      expect(payload.orderId).toBe("ord-test-123");
      expect(payload.customerId).toBe("cust-456");
      expect(payload.totalAmount).toBe(99.99);
      expect(payload.timestamp).toBeDefined();

      // Verify headers
      expect(message.message.headers?.["event-type"]?.toString()).toBe("OrderCreated");
      expect(message.message.headers?.["correlation-id"]?.toString()).toBe("corr-789");
    });
  });

  describe("message ordering", () => {
    it("should maintain order for events with same partition key", async () => {
      const receivedMessages: string[] = [];

      consumer = kafka.consumer({ groupId: "order-test-" + Date.now() });
      await consumer.connect();
      await consumer.subscribe({ topic: testTopic, fromBeginning: false });

      const allMessagesReceived = new Promise<void>((resolve) => {
        consumer.run({
          eachMessage: async ({ message }) => {
            const payload = JSON.parse(message.value!.toString());
            receivedMessages.push(payload.newStatus || payload.eventType);
            if (receivedMessages.length === 4) resolve();
          },
        });
      });

      // Wait for consumer to be ready
      await new Promise(resolve => setTimeout(resolve, 1000));

      // Publish events for same order (should go to same partition)
      const orderId = "ord-ordering-test";
      await orderEventProducer.publishOrderCreated(
        OrderBuilder.anOrder().withId(orderId).build()
      );
      await orderEventProducer.publishOrderStatusChanged(
        orderId, OrderStatus.DRAFT, OrderStatus.CONFIRMED
      );
      await orderEventProducer.publishOrderStatusChanged(
        orderId, OrderStatus.CONFIRMED, OrderStatus.PAID
      );
      await orderEventProducer.publishOrderStatusChanged(
        orderId, OrderStatus.PAID, OrderStatus.SHIPPED
      );

      await Promise.race([
        allMessagesReceived,
        new Promise((_, reject) =>
          setTimeout(() => reject(new Error("Timeout")), 15000)
        ),
      ]);

      // Verify order is preserved
      expect(receivedMessages).toEqual([
        "OrderCreated",
        OrderStatus.CONFIRMED,
        OrderStatus.PAID,
        OrderStatus.SHIPPED,
      ]);
    });
  });
});
```

Message broker tests are inherently asynchronous. Use promise-based waiting with timeouts rather than arbitrary delays. Tools like wait-for-expect or custom polling helpers can make tests more reliable and faster by completing as soon as the expected state is reached, rather than waiting a fixed duration.
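A minimal sketch of such a polling helper follows; `waitForCondition` is a hypothetical name, not from any particular library. It resolves as soon as a probe reports success and fails after a timeout, so tests finish quickly on the happy path without fixed sleeps.

```typescript
// Hypothetical polling helper: repeatedly evaluates `probe` until it returns a
// defined value, failing if the timeout elapses first.
async function waitForCondition<T>(
  probe: () => Promise<T | undefined>,
  timeoutMs = 5000,
  intervalMs = 50
): Promise<T> {
  const deadline = Date.now() + timeoutMs;
  for (;;) {
    const result = await probe();
    if (result !== undefined) return result;
    if (Date.now() > deadline) {
      throw new Error(`Condition not met within ${timeoutMs}ms`);
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
}

// Example use in the ordering test above: instead of a fixed sleep,
// wait until all four messages have arrived.
// const messages = await waitForCondition(async () =>
//   receivedMessages.length === 4 ? receivedMessages : undefined
// );
```

The same helper works for any broker: the probe inspects whatever state the consumer accumulates, and the interval keeps polling cheap without hammering the event loop.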
Microservices often depend on external APIs—payment gateways, notification services, third-party data providers. Testing these integrations presents unique challenges: you can't control or reset external systems, rate limits apply, and credentials may be restricted.
Strategies for External API Testing:

- Mock servers (MockServer, WireMock): full control over responses, error injection, and request verification, at the risk of drifting from the real API
- Provider sandboxes: real API behavior with test credentials, but subject to rate limits, network availability, and limited error simulation
- A combination: mock servers for fast, deterministic integration tests, plus periodic sandbox runs to confirm the mocks still match reality
```typescript
import * as mockServerNode from "mockserver-node";
import { mockServerClient } from "mockserver-client";

// Payment gateway client
class StripePaymentGateway implements PaymentGateway {
  constructor(
    private readonly apiKey: string,
    private readonly baseUrl: string = "https://api.stripe.com"
  ) {}

  async createPaymentIntent(
    amount: Money,
    currency: string,
    customerId: string
  ): Promise<PaymentIntent> {
    const response = await fetch(`${this.baseUrl}/v1/payment_intents`, {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${this.apiKey}`,
        "Content-Type": "application/x-www-form-urlencoded",
      },
      body: new URLSearchParams({
        amount: amount.toMinorUnits().toString(), // cents
        currency: currency.toLowerCase(),
        customer: customerId,
      }),
    });

    if (!response.ok) {
      const error = await response.json();
      throw new PaymentGatewayError(error.error.message, error.error.code);
    }

    const data = await response.json();
    return {
      id: data.id,
      status: this.mapStatus(data.status),
      clientSecret: data.client_secret,
      amount: Money.fromMinorUnits(data.amount, data.currency.toUpperCase()),
    };
  }

  async capturePayment(paymentIntentId: string): Promise<PaymentResult> {
    const response = await fetch(
      `${this.baseUrl}/v1/payment_intents/${paymentIntentId}/capture`,
      {
        method: "POST",
        headers: {
          "Authorization": `Bearer ${this.apiKey}`,
        },
      }
    );

    if (!response.ok) {
      const error = await response.json();
      throw new PaymentGatewayError(error.error.message, error.error.code);
    }

    const data = await response.json();
    return {
      success: data.status === "succeeded",
      transactionId: data.id,
      capturedAmount: Money.fromMinorUnits(data.amount_received, data.currency),
    };
  }

  private mapStatus(stripeStatus: string): PaymentIntentStatus {
    const statusMap: Record<string, PaymentIntentStatus> = {
      "requires_payment_method": PaymentIntentStatus.PENDING,
      "requires_confirmation": PaymentIntentStatus.PENDING,
      "requires_action": PaymentIntentStatus.REQUIRES_ACTION,
      "processing": PaymentIntentStatus.PROCESSING,
      "requires_capture": PaymentIntentStatus.AUTHORIZED,
      "succeeded": PaymentIntentStatus.SUCCEEDED,
      "canceled": PaymentIntentStatus.CANCELED,
    };
    return statusMap[stripeStatus] ?? PaymentIntentStatus.UNKNOWN;
  }
}

// Integration tests using MockServer
describe("StripePaymentGateway Integration Tests", () => {
  let mockClient: ReturnType<typeof mockServerClient>;
  let gateway: StripePaymentGateway;
  const mockPort = 1080;

  beforeAll(async () => {
    // Start MockServer
    await mockServerNode.start_mockserver({ serverPort: mockPort });
    mockClient = mockServerClient("localhost", mockPort);
    gateway = new StripePaymentGateway(
      "sk_test_fake_key",
      `http://localhost:${mockPort}`
    );
  });

  afterAll(async () => {
    await mockServerNode.stop_mockserver({ serverPort: mockPort });
  });

  beforeEach(async () => {
    await mockClient.reset();
  });

  describe("createPaymentIntent", () => {
    it("should create payment intent and parse response correctly", async () => {
      // Set up mock response
      await mockClient.mockAnyResponse({
        httpRequest: {
          method: "POST",
          path: "/v1/payment_intents",
        },
        httpResponse: {
          statusCode: 200,
          headers: { "Content-Type": ["application/json"] },
          body: JSON.stringify({
            id: "pi_1234567890",
            object: "payment_intent",
            amount: 5000, // $50.00 in cents
            currency: "usd",
            status: "requires_payment_method",
            client_secret: "pi_1234_secret_xyz",
          }),
        },
      });

      const result = await gateway.createPaymentIntent(
        Money.of(50.00),
        "USD",
        "cus_abc123"
      );

      expect(result.id).toBe("pi_1234567890");
      expect(result.status).toBe(PaymentIntentStatus.PENDING);
      expect(result.clientSecret).toBe("pi_1234_secret_xyz");
      expect(result.amount.toDecimal()).toBe(50.00);
    });

    it("should throw PaymentGatewayError on API error", async () => {
      await mockClient.mockAnyResponse({
        httpRequest: {
          method: "POST",
          path: "/v1/payment_intents",
        },
        httpResponse: {
          statusCode: 400,
          headers: { "Content-Type": ["application/json"] },
          body: JSON.stringify({
            error: {
              type: "invalid_request_error",
              code: "parameter_invalid_integer",
              message: "Invalid positive integer",
              param: "amount",
            },
          }),
        },
      });

      await expect(
        gateway.createPaymentIntent(Money.of(-50), "USD", "cus_abc123")
      ).rejects.toThrow(PaymentGatewayError);
    });

    it("should handle rate limiting with retry", async () => {
      // First request matches the 429 expectation (limited to one use);
      // subsequent requests fall through to the success expectation
      await mockClient.mockAnyResponse({
        httpRequest: { method: "POST", path: "/v1/payment_intents" },
        httpResponse: {
          statusCode: 429,
          headers: { "Retry-After": ["1"] },
          body: JSON.stringify({
            error: { message: "Rate limit exceeded", type: "rate_limit_error" },
          }),
        },
        times: { remainingTimes: 1, unlimited: false },
      });
      await mockClient.mockAnyResponse({
        httpRequest: { method: "POST", path: "/v1/payment_intents" },
        httpResponse: {
          statusCode: 200,
          body: JSON.stringify({
            id: "pi_after_retry",
            amount: 1000,
            currency: "usd",
            status: "requires_payment_method",
            client_secret: "secret",
          }),
        },
      });

      // With retry logic in the HTTP client, this should eventually succeed
      const result = await gateway.createPaymentIntent(
        Money.of(10),
        "USD",
        "cus_123"
      );
      expect(result.id).toBe("pi_after_retry");
    });
  });

  describe("request verification", () => {
    it("should send correct authentication headers", async () => {
      await mockClient.mockAnyResponse({
        httpRequest: { path: "/v1/payment_intents" },
        httpResponse: {
          statusCode: 200,
          body: JSON.stringify({
            id: "pi_test",
            amount: 1000,
            currency: "usd",
            status: "requires_payment_method",
            client_secret: "secret",
          }),
        },
      });

      await gateway.createPaymentIntent(Money.of(10), "USD", "cus_123");

      // Verify the request was made with correct auth header
      await mockClient.verify({
        path: "/v1/payment_intents",
        headers: {
          Authorization: ["Bearer sk_test_fake_key"],
        },
      });
    });

    it("should convert amount to minor units", async () => {
      await mockClient.mockAnyResponse({
        httpRequest: { path: "/v1/payment_intents" },
        httpResponse: {
          statusCode: 200,
          body: JSON.stringify({
            id: "pi_test",
            amount: 2599,
            currency: "usd",
            status: "requires_payment_method",
            client_secret: "secret",
          }),
        },
      });

      await gateway.createPaymentIntent(Money.of(25.99), "USD", "cus_123");

      // Verify amount was sent in cents (substring match on the form body)
      await mockClient.verify({
        path: "/v1/payment_intents",
        body: {
          type: "STRING",
          string: "amount=2599",
          subString: true,
        },
      });
    });
  });
});
```

Sandbox environments test against real APIs but may have rate limits, require network access, and can't simulate all error scenarios. Mock servers offer complete control but may drift from actual API behavior over time. The best approach often combines both: mock servers for unit/integration tests, sandbox tests in CI for contract verification.
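The rate-limiting test above presumes the gateway retries after a 429 response, but the client as written performs a single request. A minimal retry wrapper might look like this sketch; `fetchWithRetry` is a hypothetical helper written for illustration, not part of any Stripe SDK.

```typescript
type FetchLike = (url: string, init?: RequestInit) => Promise<Response>;

// Hypothetical helper: retry a request when the server answers 429,
// honoring the Retry-After header (in seconds) up to a bounded attempt count.
async function fetchWithRetry(
  fetchImpl: FetchLike,
  url: string,
  init: RequestInit,
  maxAttempts = 3
): Promise<Response> {
  for (let attempt = 1; ; attempt++) {
    const response = await fetchImpl(url, init);
    if (response.status !== 429 || attempt >= maxAttempts) {
      return response;
    }
    const retryAfterSec = Number(response.headers.get("Retry-After") ?? "1");
    await new Promise(resolve => setTimeout(resolve, retryAfterSec * 1000));
  }
}

// The gateway could then route its calls through the wrapper:
// const response = await fetchWithRetry(fetch, `${this.baseUrl}/v1/payment_intents`, { ... });
```

Injecting the fetch implementation keeps the retry behavior unit-testable with a fake, while the MockServer test above exercises it over a real HTTP round-trip.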
Integration tests are inherently slower than unit tests, but that doesn't mean they should be prohibitively slow. A suite that takes 30 minutes to run will be skipped by developers, negating its value. Optimization strategies keep integration tests fast enough to run regularly.
Performance Optimization Strategies:

- Start containers once per suite or process, not once per test
- Reset state between tests with TRUNCATE or transaction rollback rather than restarting containers
- Run independent test files in parallel, isolated by schema or database
- Start only the containers a test file actually needs (lazy initialization)
- Reuse containers across local runs where the Testcontainers implementation supports it
```typescript
// Optimized test setup with transaction-based isolation
describe("Order Integration Tests", () => {
  let transactionClient: PrismaClient;
  let releaseTransaction: () => void;

  // Suite-level container start (already started in global setup)
  beforeAll(async () => {
    // Pre-seed common reference data once
    await seedReferenceData();
  });

  // Transaction-based isolation - automatic rollback.
  // The interactive transaction is held open for the duration of the test
  // and then aborted, so every write the test makes is rolled back.
  beforeEach(async () => {
    await new Promise<void>((transactionReady) => {
      prisma
        .$transaction(async (tx) => {
          transactionClient = tx as unknown as PrismaClient;
          transactionReady();
          // Keep the transaction open until the test releases it
          await new Promise<void>((resolve) => {
            releaseTransaction = resolve;
          });
          throw new Error("rollback"); // Abort to discard the test's writes
        })
        .catch(() => {
          // The rollback error is expected; swallow it
        });
    });
  });

  afterEach(async () => {
    // Trigger the rollback - faster than TRUNCATE for many tests
    releaseTransaction();
  });
});

// Alternative: parallel-safe schema isolation.
// Each test file creates and migrates its own schema.
beforeAll(async () => {
  const schemaName = `test_${process.pid}_${Date.now()}`;
  await prisma.$executeRawUnsafe(`CREATE SCHEMA IF NOT EXISTS ${schemaName}`);
  await prisma.$executeRawUnsafe(`SET search_path TO ${schemaName}`);
  await runMigrationsForSchema(schemaName);
});

// Parallel test execution configuration (jest.config.js)
module.exports = {
  // Run test files in parallel
  maxWorkers: "50%", // Use half of available CPU cores

  // But run tests within a file serially (they may share state)
  testRunner: "jest-circus/runner",

  // Group integration tests to run together
  projects: [
    {
      displayName: "unit",
      testMatch: ["**/*.test.ts"],
      testPathIgnorePatterns: ["integration"],
    },
    {
      displayName: "integration",
      testMatch: ["**/integration/**/*.test.ts"],
      // Slower timeout for integration tests
      testTimeout: 30000,
      // Run integration tests serially when they share containers,
      // e.g. "jest --selectProjects integration --runInBand"
    },
  ],
};

// Lazy container initialization
const containers = {
  postgres: null as StartedPostgreSqlContainer | null,
  redis: null as StartedRedisContainer | null,
  kafka: null as StartedKafkaContainer | null,
};

export async function getPostgres(): Promise<StartedPostgreSqlContainer> {
  if (!containers.postgres) {
    containers.postgres = await new PostgreSqlContainer().start();
  }
  return containers.postgres;
}

export async function getRedis(): Promise<StartedRedisContainer> {
  if (!containers.redis) {
    containers.redis = await new RedisContainer().start();
  }
  return containers.redis;
}

// Only start containers needed by specific test file
describe("Database-only tests", () => {
  beforeAll(async () => {
    // Only PostgreSQL needed, no Kafka/Redis startup overhead
    const postgres = await getPostgres();
    process.env.DATABASE_URL = postgres.getConnectionUri();
  });

  // ... tests
});
```

| Approach | 100 Tests | 1000 Tests | Notes |
|---|---|---|---|
| Container per test | ~15 min | ~2.5 hours | Unacceptable |
| Container per suite + TRUNCATE | ~2 min | ~20 min | Good baseline |
| Transaction rollback isolation | ~1 min | ~10 min | Better |
| Parallel workers + schema isolation | ~30 sec | ~5 min | Optimal for CI |
Integration tests occupy a critical position in the microservices testing strategy—they verify that your service actually works with its real dependencies, catching bugs that unit tests with mocked dependencies would miss.
What's Next:
Integration tests verify that your service works with its infrastructure, but they don't verify that it correctly interacts with other services. The next page explores Contract Testing—how to verify that services honor their API contracts without running the entire system, enabling independent service evolution while maintaining system-wide compatibility.
You now understand how to design and implement integration tests for microservices. You've learned about Testcontainers, database testing, cache testing, message broker testing, and external API testing. Next, we'll explore how to verify service compatibility through contract testing.