High-level modules — your business rules, domain logic, and policies — represent the intellectual property and competitive advantage of your organization. Yet in many codebases, these precious assets are vulnerable to constant disruption from infrastructure changes: database migrations, library upgrades, API version changes, cloud provider updates.
This vulnerability isn't inevitable. With proper architectural techniques, high-level modules can be insulated from low-level volatility, allowing your business logic to remain stable even as the technical foundation evolves. This is the ultimate goal of the Dependency Inversion Principle.
By the end of this page, you will master the architectural patterns for protecting high-level modules, understand how abstraction boundaries provide stability, implement the interface ownership pattern, and apply these techniques to create systems that gracefully survive infrastructure evolution.
Before diving into techniques, let's understand the cost of not protecting high-level modules. When business logic directly depends on infrastructure, every technical change becomes a business risk.
When PricingService directly uses the PostgreSQL driver, upgrading PostgreSQL requires modifying pricing logic. This creates several problems: business code must be retested for purely technical changes, infrastructure teams cannot move without coordinating with domain teams, and the most valuable code in the system churns for reasons that have nothing to do with the business.
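To make that coupling concrete, here is a minimal hypothetical sketch (the class, method, and 10% discount rule are invented for illustration, and `Pool` is stubbed inline so the snippet stands alone; in real code it would be the `pg` driver's `Pool`):

```typescript
// Hypothetical "before" picture: pricing logic welded to one database driver.
// Pool is stubbed here so the sketch is self-contained.
class Pool {
  async query(
    _sql: string,
    _params: unknown[],
  ): Promise<{ rows: { base_price: number }[] }> {
    return { rows: [{ base_price: 100 }] }; // stand-in for a real DB round-trip
  }
}

class PricingService {
  constructor(private pool: Pool) {}

  // The member-discount rule (a business policy) is tangled with SQL and the
  // driver's result shape, so any PostgreSQL change touches this file.
  async memberPriceFor(productId: string): Promise<number> {
    const result = await this.pool.query(
      'SELECT base_price FROM products WHERE id = $1',
      [productId],
    );
    return result.rows[0].base_price * 0.9;
  }
}
```

Every detail of the driver (SQL placeholders, the `rows` result shape) is visible to the pricing rule, which is exactly the vulnerability the rest of this page removes.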
Our goal is to create a structure where low-level changes (database migration, library upgrade, API version change) require ZERO modifications to high-level business logic. The business rules should be blissfully unaware of infrastructure evolution.
The primary tool for protection is the abstraction barrier — an interface that defines what high-level modules need without specifying how those needs are fulfilled.
An abstraction barrier consists of three elements: the interface itself, expressed in domain terms; the high-level consumer that depends on it; and the low-level implementor that fulfills it. The example below shows all three:
```typescript
// ELEMENT 1: The Interface (Abstraction Barrier)
// Defined in terms of high-level domain concepts
// No mention of databases, HTTP, or specific technologies

interface OrderRepository {
  /**
   * Persist an order to durable storage.
   * @throws OrderPersistenceError if the operation fails
   */
  save(order: Order): Promise<void>;

  /**
   * Retrieve an order by its unique identifier.
   * @returns The order if found, null if no order exists with this ID
   */
  findById(id: OrderId): Promise<Order | null>;

  /**
   * Find all orders for a customer, most recent first.
   */
  findByCustomer(customerId: CustomerId): Promise<Order[]>;

  /**
   * Search orders matching given business criteria.
   */
  search(criteria: OrderSearchCriteria): Promise<PaginatedResult<Order>>;
}

// ELEMENT 2: High-Level Consumer
// Uses the interface, completely unaware of implementation details

class OrderService {
  constructor(
    private orderRepository: OrderRepository, // Depends on abstraction
    private customerRepository: CustomerRepository, // Needed by createOrder below
    private pricingPolicy: PricingPolicy,
    private inventoryChecker: InventoryChecker,
  ) {}

  async createOrder(request: CreateOrderRequest): Promise<Order> {
    // Pure business logic — no awareness of PostgreSQL, MongoDB, etc.
    const customer = await this.customerRepository.findById(request.customerId);

    // Domain validation
    if (!customer.canPlaceOrders()) {
      throw new OrderCreationError('Customer account is suspended');
    }

    // Business rules
    const order = Order.create({
      customerId: request.customerId,
      items: request.items,
      shippingAddress: request.shippingAddress,
    });

    // Price calculation (domain logic)
    const pricedOrder = this.pricingPolicy.applyPricing(order, customer);

    // Inventory verification (domain logic)
    await this.inventoryChecker.verifyAvailability(pricedOrder.items);

    // Persistence via abstraction — OrderService doesn't know HOW this works
    await this.orderRepository.save(pricedOrder);

    return pricedOrder;
  }
}

// ELEMENT 3: Low-Level Implementor
// Fulfills the interface with specific technology

class PostgresOrderRepository implements OrderRepository {
  constructor(private pool: Pool) {}

  async save(order: Order): Promise<void> {
    // PostgreSQL-specific implementation
    const client = await this.pool.connect();
    try {
      await client.query('BEGIN');
      await client.query(
        `INSERT INTO orders (id, customer_id, status, total, created_at)
         VALUES ($1, $2, $3, $4, NOW())
         ON CONFLICT (id) DO UPDATE SET status = $3, total = $4`,
        [order.id.value, order.customerId.value, order.status, order.total.cents]
      );
      for (const item of order.items) {
        await client.query(
          'INSERT INTO order_items (order_id, product_id, quantity, price) VALUES ($1, $2, $3, $4)',
          [order.id.value, item.productId.value, item.quantity, item.price.cents]
        );
      }
      await client.query('COMMIT');
    } catch (error) {
      await client.query('ROLLBACK');
      throw new OrderPersistenceError('Failed to save order', { cause: error });
    } finally {
      client.release();
    }
  }

  // ... other methods
}
```

With this structure, migrating from PostgreSQL to MongoDB requires only creating a new MongoOrderRepository that implements OrderRepository. OrderService never changes. All business logic tests pass without modification.
The abstraction barrier has protected the high-level module.
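As a sketch of what that migration looks like, here is a hypothetical MongoOrderRepository against a pared-down OrderRepository (string IDs, a minimal Order, and an inline stand-in for the MongoDB driver's collection, so the example is self-contained):

```typescript
// Pared-down domain types and port for illustration only.
interface Order { id: string; customerId: string; totalCents: number }
interface OrderDoc { _id: string; customer_id: string; total_cents: number }

interface OrderRepository {
  save(order: Order): Promise<void>;
  findById(id: string): Promise<Order | null>;
}

// Minimal stand-in for the mongodb driver's collection, covering only the
// two calls used below, so the sketch runs without a database.
class Collection {
  private docs = new Map<string, OrderDoc>();
  async replaceOne(filter: { _id: string }, doc: OrderDoc, _opts: { upsert: boolean }) {
    this.docs.set(filter._id, doc);
  }
  async findOne(filter: { _id: string }): Promise<OrderDoc | null> {
    return this.docs.get(filter._id) ?? null;
  }
}

// The new implementor: only this class knows about documents and upserts.
class MongoOrderRepository implements OrderRepository {
  constructor(private orders: Collection) {}

  async save(order: Order): Promise<void> {
    // Translate domain -> document; upsert mirrors the Postgres ON CONFLICT
    await this.orders.replaceOne(
      { _id: order.id },
      { _id: order.id, customer_id: order.customerId, total_cents: order.totalCents },
      { upsert: true },
    );
  }

  async findById(id: string): Promise<Order | null> {
    const doc = await this.orders.findOne({ _id: id });
    return doc
      ? { id: doc._id, customerId: doc.customer_id, totalCents: doc.total_cents }
      : null;
  }
}
```

Nothing above the OrderRepository line changes; the document-vs-row decision lives entirely below it.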
A critical but often overlooked aspect of DIP is who defines the interface. The interface should be defined by — and live with — the high-level consumer, not the low-level provider. This is called interface ownership.
If the database team defines OrderRepository, they'll design it around database capabilities. If the domain team defines it, they'll design it around business needs. The difference is profound:
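A sketch of that contrast, with illustrative method shapes on both sides (the names and simplified types are assumptions for this example):

```typescript
// Provider-owned: mirrors what the database can do; SQL thinking leaks upward.
interface DatabaseGateway {
  executeQuery(sql: string, params: unknown[]): Promise<unknown[]>;
  insert(table: string, row: Record<string, unknown>): Promise<void>;
  update(table: string, row: Record<string, unknown>): Promise<void>;
}

// Consumer-owned: speaks the domain's language; storage stays a detail below.
interface Order { id: string; customerId: string }
interface OrderSearchCriteria { status?: string }

interface OrderRepository {
  save(order: Order): Promise<void>;
  findByCustomer(customerId: string): Promise<Order[]>;
  search(criteria: OrderSearchCriteria): Promise<Order[]>;
}
```

A caller of DatabaseGateway must know tables and SQL; a caller of OrderRepository only knows orders and customers.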
A provider-owned interface exposes database capabilities: `executeQuery()`, `insert()`, `update()`. A consumer-owned interface expresses business needs: `save()`, `findByCustomer()`, `search()`.

Interface ownership should be reflected in your package/module structure:
```
src/
├── domain/                    # High-level layer
│   ├── entities/
│   │   └── Order.ts
│   ├── value-objects/
│   │   └── Money.ts
│   └── repositories/          # Interfaces live HERE, with domain
│       ├── OrderRepository.ts     # 👈 High-level defines interface
│       ├── CustomerRepository.ts  # 👈 High-level defines interface
│       └── ProductRepository.ts   # 👈 High-level defines interface
│
├── application/               # High-level layer
│   └── services/
│       └── OrderService.ts    # Uses interfaces from domain
│
└── infrastructure/            # Low-level layer
    └── persistence/
        ├── PostgresOrderRepository.ts     # 👈 Implements domain interface
        ├── PostgresCustomerRepository.ts  # 👈 Implements domain interface
        └── PostgresProductRepository.ts   # 👈 Implements domain interface

# DEPENDENCY DIRECTION:
# infrastructure/PostgresOrderRepository → domain/OrderRepository
# application/OrderService → domain/OrderRepository
#
# Notice: domain layer has NO dependencies on infrastructure
```

> Source code dependencies must point inward, toward high-level policy. Nothing in an inner circle can know anything about something in an outer circle. The name of something declared in an outer circle must not be mentioned by code in an inner circle. (Robert C. Martin)
When you need to integrate with external APIs or libraries that don't match your domain interface, the Adapter pattern provides the translation layer.
An adapter wraps an external dependency and presents it as a domain interface:
```typescript
// DOMAIN INTERFACE — Defined by high-level needs
interface PaymentProcessor {
  charge(payment: Payment): Promise<PaymentResult>;
  refund(payment: Payment, amount: Money): Promise<RefundResult>;
  getPaymentStatus(paymentId: PaymentId): Promise<PaymentStatus>;
}

// Domain types — pure business concepts
interface Payment {
  id: PaymentId;
  orderId: OrderId;
  amount: Money;
  method: PaymentMethod;
  customer: CustomerPaymentInfo;
}

interface PaymentResult {
  success: boolean;
  transactionId: TransactionId;
  processedAt: Date;
  failureReason?: string;
}

// ADAPTER — Translates between domain and Stripe API
class StripePaymentAdapter implements PaymentProcessor {
  constructor(private stripeClient: Stripe) {}

  async charge(payment: Payment): Promise<PaymentResult> {
    try {
      // Translate domain concepts to Stripe API format
      const stripeCharge = await this.stripeClient.paymentIntents.create({
        amount: payment.amount.cents, // Domain → Stripe
        currency: payment.amount.currency.toLowerCase(),
        payment_method: payment.method.stripeId,
        customer: payment.customer.stripeCustomerId,
        metadata: {
          order_id: payment.orderId.value, // Correlation for debugging
          internal_payment_id: payment.id.value,
        },
        confirm: true,
      });

      // Translate Stripe response to domain result
      return {
        success: stripeCharge.status === 'succeeded',
        transactionId: new TransactionId(stripeCharge.id),
        processedAt: new Date(stripeCharge.created * 1000),
        failureReason: stripeCharge.last_payment_error?.message,
      };
    } catch (error) {
      // Translate Stripe errors to domain errors
      if (error instanceof Stripe.errors.StripeCardError) {
        return {
          success: false,
          transactionId: TransactionId.none(),
          processedAt: new Date(),
          failureReason: this.translateCardError(error.code),
        };
      }
      throw new PaymentProcessingError('Payment failed', { cause: error });
    }
  }

  async refund(payment: Payment, amount: Money): Promise<RefundResult> {
    // Similar translation logic
    // (assumes the stored payment record carries its provider reference)
    const stripeRefund = await this.stripeClient.refunds.create({
      payment_intent: payment.stripePaymentIntentId,
      amount: amount.cents,
    });
    return this.translateRefundResult(stripeRefund);
  }

  private translateCardError(stripeCode: string): string {
    // Map Stripe-specific codes to business-friendly messages
    const errorMap: Record<string, string> = {
      'card_declined': 'Card was declined by the issuing bank',
      'insufficient_funds': 'Insufficient funds on card',
      'expired_card': 'Card has expired',
      'incorrect_cvc': 'Security code is incorrect',
    };
    return errorMap[stripeCode] || 'Payment could not be processed';
  }
}

// Another adapter for a different provider
class AdyenPaymentAdapter implements PaymentProcessor {
  constructor(
    private adyenClient: AdyenClient,
    private merchantAccount = 'YOUR_MERCHANT_ACCOUNT', // referenced below
  ) {}

  async charge(payment: Payment): Promise<PaymentResult> {
    // Translate domain to Adyen API format (different from Stripe!)
    const adyenPayment = await this.adyenClient.payments({
      merchantAccount: this.merchantAccount,
      amount: {
        value: payment.amount.cents,
        currency: payment.amount.currency,
      },
      reference: payment.orderId.value,
      paymentMethod: this.translatePaymentMethod(payment.method),
      shopperReference: payment.customer.id.value,
    });

    return {
      success: adyenPayment.resultCode === 'Authorised',
      transactionId: new TransactionId(adyenPayment.pspReference),
      processedAt: new Date(),
      failureReason: adyenPayment.refusalReason,
    };
  }

  // ... other methods with Adyen-specific translations
}
```

With adapters, switching from Stripe to Adyen requires only swapping the adapter implementation. All business logic using PaymentProcessor is unchanged. The adapter absorbs all vendor-specific complexity, keeping the domain pure.
When integrating with legacy systems, third-party APIs, or external domains with conflicting models, the Anti-Corruption Layer (ACL) provides robust protection.
An ACL is more comprehensive than a simple adapter. It combines several responsibilities: translation between the two models, validation of external data, translation of external errors into domain exceptions, and a facade that simplifies awkward external workflows.
```typescript
/**
 * Anti-Corruption Layer for integrating with legacy inventory system
 *
 * The legacy system has a completely different model:
 * - Uses item numbers instead of product IDs
 * - Tracks quantity in "units" that may differ from our SKU units
 * - Returns warehouse codes we need to map to locations
 * - Has undocumented error responses
 */
class LegacyInventoryACL implements InventoryService {
  constructor(
    private legacyClient: LegacyInventoryClient,
    private productMapper: ProductLegacyMapper,
    private warehouseMapper: WarehouseLocationMapper,
  ) {}

  async checkAvailability(
    products: Product[]
  ): Promise<InventoryAvailability[]> {
    // TRANSLATION: Convert our products to legacy item numbers
    const legacyItems = products.map(p => ({
      itemNumber: this.productMapper.toLegacyItemNumber(p.id),
      requestedUnits: this.productMapper.toLegacyUnits(p.id, p.quantity),
    }));

    // Call legacy system
    let legacyResponse: LegacyStockResponse;
    try {
      legacyResponse = await this.legacyClient.getStockLevels(legacyItems);
    } catch (error) {
      // ERROR TRANSLATION: Legacy errors to domain exceptions
      if (error.code === 'CONN_TIMEOUT') {
        throw new InventoryUnavailableError('Inventory system temporarily unavailable');
      }
      if (error.code === 'INVALID_ITEM') {
        throw new ProductNotFoundError('One or more products not found in inventory');
      }
      throw new InventoryError('Failed to check inventory', { cause: error });
    }

    // VALIDATION: Ensure response makes sense
    if (!legacyResponse.items || !Array.isArray(legacyResponse.items)) {
      throw new InventoryDataCorruptionError('Invalid response from inventory system');
    }

    // TRANSLATION: Convert legacy response to domain model
    return legacyResponse.items.map(legacyItem => {
      const product = this.productMapper.fromLegacyItemNumber(legacyItem.itemNo);

      // Map legacy warehouse codes to our location model
      const locations = legacyItem.stockByWarehouse.map(ws => ({
        location: this.warehouseMapper.toLocation(ws.whCode),
        available: this.productMapper.fromLegacyUnits(product.id, ws.availUnits),
        reserved: this.productMapper.fromLegacyUnits(product.id, ws.rsvdUnits),
      }));

      return new InventoryAvailability({
        productId: product.id,
        totalAvailable: locations.reduce((sum, l) => sum + l.available, 0),
        locationBreakdown: locations,
        lastUpdated: this.parseLegacyTimestamp(legacyItem.lstUpd),
      });
    });
  }

  async reserveInventory(
    reservation: InventoryReservation
  ): Promise<ReservationConfirmation> {
    // FACADE: Simplify complex legacy reservation workflow
    // Legacy system requires: check → lock → reserve → confirm
    // We expose simple: reserve
    const legacyItems = this.translateReservationItems(reservation.items);

    // Step 1: Lock items in legacy system
    const lockId = await this.legacyClient.lockItems(legacyItems);

    try {
      // Step 2: Create reservation
      const reservationId = await this.legacyClient.createReservation({
        lockId,
        items: legacyItems,
        expiresAt: this.toLegacyTimestamp(reservation.expiresAt),
      });

      // Step 3: Confirm reservation
      await this.legacyClient.confirmReservation(reservationId);

      return new ReservationConfirmation({
        id: new ReservationId(reservationId),
        items: reservation.items,
        confirmedAt: new Date(),
        expiresAt: reservation.expiresAt,
      });
    } catch (error) {
      // Cleanup on failure
      await this.legacyClient.releaseLock(lockId).catch(() => {});
      throw this.translateReservationError(error);
    }
  }

  private parseLegacyTimestamp(legacyTs: string): Date {
    // Legacy uses format "YYYYMMDDHHMMSS"
    const match = legacyTs.match(/(\d{4})(\d{2})(\d{2})(\d{2})(\d{2})(\d{2})/);
    if (!match) {
      return new Date(0); // Default for corrupt data
    }
    return new Date(`${match[1]}-${match[2]}-${match[3]}T${match[4]}:${match[5]}:${match[6]}`);
  }
}
```

Use an Anti-Corruption Layer when:

- the external system has a fundamentally different domain model,
- the external API is unstable or poorly documented,
- the external system uses legacy data formats,
- you need to protect your domain from external model changes.

Don't over-engineer simple integrations — a basic adapter may suffice.
Having abstractions is not enough — you also need a mechanism to connect high-level consumers to low-level implementations without the high-level code knowing about specifics. This is where Dependency Injection (DI) comes in.
DI requires a composition root — a single place where all dependencies are wired together. This is the only place in your application that knows about concrete implementations:
```typescript
/**
 * Composition Root — The ONLY place that knows about concrete implementations
 *
 * All dependencies are wired here and passed to consumers via constructor injection.
 * This is where you can swap implementations without touching business logic.
 */
function createApplicationContainer(config: AppConfig): Container {
  const container = new Container();

  // ===== INFRASTRUCTURE SETUP =====
  // These know about specific technologies
  const databasePool = new Pool({
    connectionString: config.databaseUrl,
    max: config.dbPoolSize,
  });

  const redisClient = new Redis({
    host: config.redisHost,
    port: config.redisPort,
  });

  const stripeClient = new Stripe(config.stripeSecretKey, {
    apiVersion: '2023-10-16',
  });

  // ===== REPOSITORY IMPLEMENTATIONS =====
  // Decide which implementations to use based on config/environment
  const readReplicaPool = config.useReadReplica
    ? new Pool({ connectionString: config.readReplicaUrl }) // assumed config field
    : undefined;

  const orderRepository: OrderRepository = config.useReadReplica
    ? new PostgresOrderRepositoryWithReadReplica(databasePool, readReplicaPool!)
    : new PostgresOrderRepository(databasePool);

  const productRepository: ProductRepository = new CachedProductRepository(
    new PostgresProductRepository(databasePool),
    redisClient,
    { ttl: 300 } // 5 minute cache
  );

  // ===== EXTERNAL SERVICE ADAPTERS =====
  const paymentProcessor: PaymentProcessor =
    config.paymentProvider === 'stripe'
      ? new StripePaymentAdapter(stripeClient)
      : new AdyenPaymentAdapter(new AdyenClient(config.adyenConfig));

  const emailService: EmailService =
    config.environment === 'test'
      ? new InMemoryEmailService() // Don't send real emails in tests
      : new SendGridEmailAdapter(config.sendGridApiKey);

  // ===== DOMAIN SERVICES (Pure business logic) =====
  // (assumes these repositories were registered earlier; omitted for brevity)
  const pricingPolicy = new PricingPolicy(
    container.get(DiscountRepository),
    container.get(TaxRateProvider),
  );

  const inventoryService = new InventoryService(
    container.get(InventoryRepository),
  );

  // ===== APPLICATION SERVICES (Use case orchestration) =====
  const orderService = new OrderService(
    orderRepository,    // Abstract interface
    productRepository,  // Abstract interface
    paymentProcessor,   // Abstract interface
    pricingPolicy,      // Domain service
    inventoryService,   // Domain service
    emailService,       // Abstract interface
  );

  // Register in container for later retrieval
  container.register(OrderService, orderService);
  container.register(OrderRepository, orderRepository);
  // ... more registrations

  return container;
}

/**
 * Entry point — creates the composition root
 */
async function main() {
  const config = loadConfig();
  const container = createApplicationContainer(config);

  // The HTTP layer gets wired up with services from container
  const app = createExpressApp(container);
  app.listen(config.port, () => {
    console.log(`Server running on port ${config.port}`);
  });
}
```

With this structure, changing from PostgreSQL to MongoDB means modifying ONE file (the composition root). OrderService, PricingPolicy, and all business logic remain untouched. Tests can inject InMemoryOrderRepository. Production uses PostgresOrderRepository. Same business logic, different infrastructure.
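That test/production swap boils down to one decision point. A minimal runnable sketch (types pared to essentials, names illustrative; the Postgres variant is a stub here since it would need a live database):

```typescript
interface Order { id: string }

interface OrderRepository {
  save(order: Order): Promise<void>;
  findById(id: string): Promise<Order | null>;
}

class InMemoryOrderRepository implements OrderRepository {
  private orders = new Map<string, Order>();
  async save(order: Order) { this.orders.set(order.id, order); }
  async findById(id: string) { return this.orders.get(id) ?? null; }
}

class PostgresOrderRepository implements OrderRepository {
  // The real version would take a pg Pool; stubbed so the sketch stands alone.
  async save(_order: Order): Promise<void> { throw new Error('needs a database'); }
  async findById(_id: string): Promise<Order | null> { throw new Error('needs a database'); }
}

// The ONE decision point: everything downstream sees only OrderRepository.
function createOrderRepository(env: 'test' | 'production'): OrderRepository {
  return env === 'test'
    ? new InMemoryOrderRepository()
    : new PostgresOrderRepository();
}
```

Consumers receive whatever `createOrderRepository` returns and cannot tell the difference, which is the whole point.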
A well-protected architecture enables a powerful testing strategy with clear separation between unit, integration, and end-to-end tests.
| Test Type | What's Tested | Dependencies Used | Speed |
|---|---|---|---|
| Unit Tests (Many) | High-level business logic | In-memory fakes, plain objects | Milliseconds |
| Integration Tests (Some) | Low-level implementations | Real databases, real APIs | Seconds |
| Contract Tests (Some) | Interface compliance | Implementation against interface expectations | Milliseconds-Seconds |
| E2E Tests (Few) | Full system workflows | Everything real | Seconds-Minutes |
```typescript
// ===== UNIT TEST: Business logic with fakes =====
describe('OrderService', () => {
  let orderService: OrderService;
  let fakeOrderRepo: InMemoryOrderRepository;
  let fakePaymentProcessor: FakePaymentProcessor;
  let fakeInventoryService: InMemoryInventoryService; // configured by tests below

  beforeEach(() => {
    // Fast, in-memory implementations
    fakeOrderRepo = new InMemoryOrderRepository();
    fakePaymentProcessor = new FakePaymentProcessor();
    fakeInventoryService = new InMemoryInventoryService();

    orderService = new OrderService(
      fakeOrderRepo,
      new InMemoryProductRepository(testProducts),
      fakePaymentProcessor,
      new PricingPolicy(),
      fakeInventoryService,
      new FakeEmailService(),
    );
  });

  test('applies member discount to order total', async () => {
    // Arrange
    const request = createOrderRequest({
      customerId: premiumCustomer.id,
      items: [{ productId: 'prod-1', quantity: 2 }],
    });

    // Act
    const order = await orderService.createOrder(request);

    // Assert — pure business logic testing
    expect(order.total.amount).toBe(90); // 10% member discount
    expect(order.discounts).toContainEqual(
      expect.objectContaining({ type: 'MEMBER_DISCOUNT' })
    );
  });

  test('rejects order when inventory unavailable', async () => {
    // Fake inventory to return "out of stock"
    fakeInventoryService.setAvailability('prod-1', 0);

    const request = createOrderRequest({
      items: [{ productId: 'prod-1', quantity: 5 }],
    });

    await expect(orderService.createOrder(request))
      .rejects.toThrow(InsufficientInventoryError);
  });
});

// ===== INTEGRATION TEST: Repository with real database =====
describe('PostgresOrderRepository', () => {
  let repository: PostgresOrderRepository;
  let pool: Pool;

  beforeAll(async () => {
    pool = new Pool({ connectionString: process.env.TEST_DATABASE_URL });
    repository = new PostgresOrderRepository(pool);
    await clearTestData(pool);
  });

  afterAll(async () => {
    await pool.end();
  });

  test('persists and retrieves order with items', async () => {
    const order = createTestOrder({
      items: [
        { productId: 'prod-1', quantity: 2, price: Money.of(25, 'USD') },
        { productId: 'prod-2', quantity: 1, price: Money.of(50, 'USD') },
      ],
    });

    await repository.save(order);
    const retrieved = await repository.findById(order.id);

    expect(retrieved).toEqual(order);
    expect(retrieved?.items).toHaveLength(2);
  });
});

// ===== CONTRACT TEST: Ensure implementation matches interface =====
describe('OrderRepository Contract', () => {
  // Run the same contract tests against all implementations
  const implementations = [
    { name: 'PostgreSQL', create: () => new PostgresOrderRepository(testPool) },
    { name: 'InMemory', create: () => new InMemoryOrderRepository() },
  ];

  implementations.forEach(({ name, create }) => {
    describe(`${name} implementation`, () => {
      let repository: OrderRepository;

      beforeEach(() => {
        repository = create();
      });

      test('findById returns null for non-existent order', async () => {
        const result = await repository.findById(OrderId.generate());
        expect(result).toBeNull();
      });

      test('save then findById round-trips correctly', async () => {
        const order = createTestOrder();
        await repository.save(order);
        const found = await repository.findById(order.id);
        expect(found).toEqual(order);
      });

      // ... more contract tests
    });
  });
});
```

With protection in place, you can have thousands of unit tests running in milliseconds, testing all business logic without any infrastructure. Integration tests verify that implementations work. Contract tests ensure implementations are interchangeable. This provides both speed AND confidence.
Let's examine how successful organizations protect their high-level modules in practice.
Hexagonal Architecture, coined by Alistair Cockburn, formalizes this protection: the application core owns its ports (interfaces), and adapters (implementations) at the edges plug into them, on both the driving side (UI, HTTP) and the driven side (databases, external services).
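A toy sketch of the hexagon (all names invented for illustration): the core owns the port, and adapters plug in from outside on both sides.

```typescript
// Driven port — owned by the core, implemented by infrastructure.
interface GreetingStore {
  save(greeting: string): Promise<void>;
}

// The application core: pure logic, depends only on its port.
class GreetingService {
  constructor(private store: GreetingStore) {}

  async greet(name: string): Promise<string> {
    const message = `Hello, ${name}!`;
    await this.store.save(message); // persistence via the port
    return message;
  }
}

// Driven adapter — a database in production; in-memory here.
class InMemoryGreetingStore implements GreetingStore {
  saved: string[] = [];
  async save(greeting: string) { this.saved.push(greeting); }
}

// Driving adapter — an HTTP controller in production; a plain function here.
async function handleGreetRequest(service: GreetingService, name: string): Promise<string> {
  return service.greet(name);
}
```

Swapping either adapter (an Express controller for the function, Postgres for the in-memory store) leaves GreetingService untouched, which is the hexagon's promise in miniature.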
Module Complete:
You now have a comprehensive understanding of the distinction between high-level and low-level modules, and the techniques for protecting your valuable business logic from infrastructure volatility. This knowledge forms the foundation for truly applying the Dependency Inversion Principle — not as an academic exercise, but as a practical tool for building maintainable, testable, and evolvable systems.
You've mastered the critical concepts of high-level vs low-level modules: what they are, how to identify them, and most importantly, how to protect high-level business logic from low-level infrastructure changes. These skills are fundamental to professional software architecture and will serve you throughout your career.