Building a High-Performance Event-Driven App with .NET and a Database

Event-driven architectures (EDA) are powerful for building scalable systems that handle complex workflows. But what if you need to build such a system using only a .NET application and a database—no message brokers like Kafka or RabbitMQ? This article explains how to design a robust, event-driven app capable of handling heavy transaction loads, tracking all states, and automatically retrying failed steps.

Why Use a Database as an Event Store?

While dedicated message brokers offer advanced features, a database can act as a reliable event store for simpler scenarios. By storing events in a database table, you gain:

  • Persistence: Events survive app restarts.
  • Transaction tracking: Every step’s state is recorded.
  • Scalability: Databases like SQL Server or PostgreSQL handle high read/write loads.

Step 1: Designing the Database Schema

Create an Events table to store all transactions and their states:

CREATE TABLE Events (
    EventId UNIQUEIDENTIFIER PRIMARY KEY,
    EventType NVARCHAR(50) NOT NULL,
    Payload NVARCHAR(MAX), -- JSON data
    Status NVARCHAR(20) NOT NULL, -- Pending, Processing, Completed, Failed
    RetryCount INT DEFAULT 0,
    CreatedAt DATETIME DEFAULT GETDATE(),
    LastUpdated DATETIME DEFAULT GETDATE()
);

  • Status: Tracks progress through the pipeline (e.g., PaymentPending → InventoryChecked → Completed).
  • RetryCount: Limits retry attempts to avoid infinite loops.
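
The later C# snippets assume an EF Core entity mapped to this table. A minimal sketch of that Event class might look as follows (the property names mirror the columns above; the class itself is not shown in the original code):

public class Event
{
    public Guid EventId { get; set; }
    public string EventType { get; set; }
    public string Payload { get; set; }   // JSON data
    public string Status { get; set; }    // Pending, Processing, Completed, Failed
    public int RetryCount { get; set; }
    public DateTime CreatedAt { get; set; }
    public DateTime LastUpdated { get; set; }
}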

Step 2: Building the .NET Application

Event Producer

Any part of your app (e.g., an API endpoint) can insert events into the Events table:

public async Task CreateEvent(string eventType, string payload)
{
    var newEvent = new Event
    {
        EventId = Guid.NewGuid(),
        EventType = eventType,
        Payload = payload,
        Status = "Pending"
    };

    await _dbContext.Events.AddAsync(newEvent);
    await _dbContext.SaveChangesAsync();
}
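
As a usage sketch, an API controller could accept an order request and enqueue it as an event. The EventService wrapper, route, and OrderRequest type below are illustrative assumptions, not part of the original code:

// Hypothetical controller that enqueues a "PaymentPending" event via the CreateEvent method above
// (EventService is assumed to be the injectable class holding CreateEvent; JsonSerializer is System.Text.Json)
[ApiController]
[Route("orders")]
public class OrdersController : ControllerBase
{
    private readonly EventService _events;

    public OrdersController(EventService events) => _events = events;

    [HttpPost]
    public async Task<IActionResult> Create(OrderRequest request)
    {
        // Store the request as JSON in the Payload column; the background worker picks it up later
        await _events.CreateEvent("PaymentPending", JsonSerializer.Serialize(request));
        return Accepted();
    }
}

public record OrderRequest(Guid OrderId, decimal Amount);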

Event Consumer (Background Worker)

Use a .NET BackgroundService to poll for pending events and process them:

public class EventProcessor : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            var pendingEvents = await _dbContext.Events
                .Where(e => (e.Status == "Pending" || e.Status == "Failed") && e.RetryCount < 3)
                .OrderBy(e => e.CreatedAt)
                .Take(100) // Batch size for load handling
                .ToListAsync();

            foreach (var @event in pendingEvents)
            {
                try
                {
                    @event.Status = "Processing";
                    await _dbContext.SaveChangesAsync();

                    await ProcessEvent(@event); // Execute pipeline steps

                    @event.Status = "Completed";
                    await _dbContext.SaveChangesAsync();
                }
                catch (Exception ex)
                {
                    // Failed events are picked up again on the next poll until RetryCount reaches 3
                    @event.Status = "Failed";
                    @event.RetryCount++;
                    _logger.LogError(ex, "Event {EventId} failed", @event.EventId);
                    await _dbContext.SaveChangesAsync();
                }
            }

            await Task.Delay(1000, stoppingToken); // Adjust polling interval
        }
    }
}
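
One detail the snippet above glosses over: a BackgroundService is a singleton, while a DbContext is normally registered as scoped, so injecting _dbContext directly will fail at runtime. A common workaround is to create a service scope per polling pass. The sketch below assumes an AppDbContext registered via AddDbContext and the worker registered with AddHostedService (the names are illustrative):

public class EventProcessor : BackgroundService
{
    private readonly IServiceScopeFactory _scopeFactory;
    private readonly ILogger<EventProcessor> _logger;

    public EventProcessor(IServiceScopeFactory scopeFactory, ILogger<EventProcessor> logger)
    {
        _scopeFactory = scopeFactory;
        _logger = logger;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            // Resolve a fresh scoped DbContext for each polling pass
            using var scope = _scopeFactory.CreateScope();
            var dbContext = scope.ServiceProvider.GetRequiredService<AppDbContext>();

            // ... same polling and processing logic as above, using dbContext instead of _dbContext ...

            await Task.Delay(1000, stoppingToken);
        }
    }
}

// In Program.cs:
// builder.Services.AddDbContext<AppDbContext>(options => options.UseSqlServer(connectionString));
// builder.Services.AddHostedService<EventProcessor>();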

Step 3: Implementing the Processing Pipeline

Define a pipeline with multiple steps (e.g., validate payment → update inventory → send confirmation). Use a strategy pattern:

public interface IEventProcessor
{
    Task<bool> ProcessAsync(Event @event);
}

public class PaymentProcessor : IEventProcessor { ... }
public class InventoryProcessor : IEventProcessor { ... }

// In ProcessEvent method:
var processors = new Dictionary<string, IEventProcessor>
{
    { "PaymentPending", new PaymentProcessor() },
    { "InventoryUpdate", new InventoryProcessor() }
};

if (processors.TryGetValue(@event.EventType, out var processor))
{
    await processor.ProcessAsync(@event);
}
else
{
    throw new NotSupportedException($"Event type {@event.EventType} not recognized.");
}
}
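
For illustration, one of the processors might look like the sketch below. The PaymentPayload shape and the validation rule are hypothetical, since the article leaves the processor bodies out:

public class PaymentProcessor : IEventProcessor
{
    public async Task<bool> ProcessAsync(Event @event)
    {
        // Hypothetical payload shape; JsonSerializer is System.Text.Json
        var payment = JsonSerializer.Deserialize<PaymentPayload>(@event.Payload);

        if (payment is null || payment.Amount <= 0)
        {
            // Throwing lets the background worker's catch block mark the event Failed and retry it
            throw new InvalidOperationException($"Invalid payment payload for event {@event.EventId}");
        }

        // Payment gateway call / domain logic would go here (omitted)
        await Task.CompletedTask;
        return true;
    }
}

public record PaymentPayload(Guid OrderId, decimal Amount);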

Step 4: Handling Failures and Retries

  • Auto-Retry: The background worker retries failed events (up to RetryCount limit).
  • Dead-Letter Queue: Move events with RetryCount >= 3 to a separate table for manual review (a sketch follows this list).
  • Idempotency: Ensure each step can be retried safely (e.g., use unique IDs to avoid duplicate processing).
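
A minimal dead-letter move can run inside the polling loop as a copy-and-delete. The DeadLetterEvent entity and its DbSet below are assumptions (they are not defined earlier); they would simply mirror the Events columns:

// Quarantine events that have exhausted their retries, before normal processing
var deadEvents = await _dbContext.Events
    .Where(e => e.Status == "Failed" && e.RetryCount >= 3)
    .ToListAsync();

foreach (var deadEvent in deadEvents)
{
    _dbContext.DeadLetterEvents.Add(new DeadLetterEvent
    {
        EventId = deadEvent.EventId,
        EventType = deadEvent.EventType,
        Payload = deadEvent.Payload,
        FailedAt = DateTime.UtcNow
    });
    _dbContext.Events.Remove(deadEvent);
}

await _dbContext.SaveChangesAsync();

// Hypothetical dead-letter entity for manual review
public class DeadLetterEvent
{
    public Guid EventId { get; set; }
    public string EventType { get; set; }
    public string Payload { get; set; }
    public DateTime FailedAt { get; set; }
}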

Performance Tips for Heavy Loads

  1. Batching: Process events in batches to reduce database roundtrips.
  2. Indexing: Add indexes on Status and CreatedAt for faster queries (see the EF Core configuration sketch after this list).
  3. Concurrency Control: Use ROWVERSION or optimistic concurrency to prevent race conditions.
  4. Async/Await: Avoid blocking threads during I/O operations.
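
Tips 2 and 3 can both be expressed in EF Core model configuration. The sketch below assumes an AppDbContext; the concurrency token is configured as a rowversion shadow property, which also requires adding a ROWVERSION column to the Events table:

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<Event> Events { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Composite index to speed up the polling query on Status + CreatedAt
        modelBuilder.Entity<Event>()
            .HasIndex(e => new { e.Status, e.CreatedAt });

        // Optimistic concurrency via a rowversion shadow property
        // (assumes a ROWVERSION column named RowVersion is added to the Events table)
        modelBuilder.Entity<Event>()
            .Property<byte[]>("RowVersion")
            .IsRowVersion();
    }
}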

Conclusion

By leveraging a database as an event store and a .NET BackgroundService, you can build a scalable, event-driven app without external dependencies. This approach ensures all transaction states are tracked, supports heavy loads through batching and async processing, and provides automatic retries for reliability. While not as fast as dedicated message brokers, it’s a practical solution for many business scenarios.

Need help optimizing your event-driven system? Share your use case, and let’s discuss!