May 12, 2025 - 02:13
Rate Limiting Microservice in Rust

This is a submission for the Amazon Q Developer "Quack The Code" Challenge: Exploring the Possibilities

What I Built

I recently embarked on an ambitious project to build a high-performance, multi-tenant rate limiting service in Rust. Despite being a Rust novice (I'm still learning the language!), I managed to create a sophisticated microservice that serves as a critical entry point for modern distributed architectures.

Rate Limiter v2: A Comprehensive Solution

My rate limiter service isn't just a simple API throttler—it's a full-featured system designed for production environments with the following capabilities:

Multi-Tenant Architecture

The service supports multiple tenants with complete isolation between them. In this context, tenants represent different applications, services, or customers that need rate limiting capabilities. Each tenant can have:

  • Their own API key for authentication
  • Custom rate limiting rules and quotas
  • Different service tiers (free, standard, premium, enterprise)
  • Isolated usage tracking and metrics
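To make the tenant model concrete, here is a minimal sketch of what per-tenant isolation can look like in Rust. The type and field names (`Tenant`, `TenantRegistry`, `requests_per_minute`) are illustrative assumptions, not taken from the actual repository:

```rust
use std::collections::HashMap;

// Hypothetical service tiers mirroring the ones described above.
#[derive(Debug, Clone, Copy, PartialEq)]
enum Tier {
    Free,
    Standard,
    Premium,
    Enterprise,
}

// A minimal tenant record: one API key, a tier, and a per-minute quota.
#[derive(Debug, Clone)]
struct Tenant {
    api_key: String,
    tier: Tier,
    requests_per_minute: u32,
}

// Tenants kept isolated in a map keyed by API key, so one tenant's
// rules and counters never touch another's.
struct TenantRegistry {
    tenants: HashMap<String, Tenant>,
}

impl TenantRegistry {
    fn new() -> Self {
        Self { tenants: HashMap::new() }
    }

    fn register(&mut self, tenant: Tenant) {
        self.tenants.insert(tenant.api_key.clone(), tenant);
    }

    // Authentication lookup: resolve an API key to its tenant, if any.
    fn authenticate(&self, api_key: &str) -> Option<&Tenant> {
        self.tenants.get(api_key)
    }
}

fn main() {
    let mut registry = TenantRegistry::new();
    registry.register(Tenant {
        api_key: "key-alpha".into(),
        tier: Tier::Free,
        requests_per_minute: 60,
    });

    let t = registry.authenticate("key-alpha").expect("known tenant");
    assert_eq!(t.tier, Tier::Free);
    assert_eq!(t.requests_per_minute, 60);
    assert!(registry.authenticate("unknown-key").is_none());
    println!("tenant lookup ok");
}
```

In a real deployment the registry would live behind the storage layer rather than a plain `HashMap`, but the shape of the isolation is the same: every lookup is scoped by API key.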

Multiple Rate Limiting Algorithms

I implemented four different rate limiting algorithms to suit various use cases:

  • Token Bucket: Provides smooth rate limiting with burst capability
  • Leaky Bucket: Enforces a constant outflow rate
  • Fixed Window: Offers simple time-window based limiting
  • Sliding Window: Delivers more accurate time-window limiting with fewer boundary issues
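To illustrate one of the four, here is the textbook form of the token bucket in plain Rust. This is a sketch of the general algorithm, not code from the repository:

```rust
use std::time::Instant;

// A minimal token bucket: requests spend tokens, and tokens refill
// continuously over time, which is what allows controlled bursts.
struct TokenBucket {
    capacity: f64,    // maximum burst size, in tokens
    tokens: f64,      // tokens currently available
    refill_rate: f64, // tokens added per second
    last_refill: Instant,
}

impl TokenBucket {
    fn new(capacity: f64, refill_rate: f64) -> Self {
        Self {
            capacity,
            tokens: capacity,
            refill_rate,
            last_refill: Instant::now(),
        }
    }

    // Top the bucket up in proportion to elapsed time, capped at capacity.
    fn refill(&mut self) {
        let now = Instant::now();
        let elapsed = now.duration_since(self.last_refill).as_secs_f64();
        self.tokens = (self.tokens + elapsed * self.refill_rate).min(self.capacity);
        self.last_refill = now;
    }

    // Allow the request if a whole token is available; spend it if so.
    fn try_acquire(&mut self) -> bool {
        self.refill();
        if self.tokens >= 1.0 {
            self.tokens -= 1.0;
            true
        } else {
            false
        }
    }
}

fn main() {
    // Capacity 3, refilling 1 token/second: a burst of 3 passes, the 4th is limited.
    let mut bucket = TokenBucket::new(3.0, 1.0);
    assert!(bucket.try_acquire());
    assert!(bucket.try_acquire());
    assert!(bucket.try_acquire());
    assert!(!bucket.try_acquire());
    println!("token bucket ok");
}
```

The other three algorithms differ mainly in how they account for time: leaky bucket drains at a fixed rate, fixed window resets a counter at interval boundaries, and sliding window weights the previous window's count to smooth those boundaries.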

Comprehensive Observability

Monitoring and debugging are critical for production services, so I integrated:

  • OpenTelemetry tracing: For distributed request tracing across services
  • Prometheus metrics: For collecting and exposing performance metrics
  • Grafana dashboards: For visualizing metrics with pre-configured dashboards
  • Jaeger UI: For exploring and analyzing traces
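For context on what Prometheus actually scrapes, here is the plain-text exposition format a `/metrics` endpoint returns, rendered by hand for one counter. The metric name and label are made up for illustration; the real service would use a metrics crate rather than formatting strings itself:

```rust
// Render one counter in the Prometheus text exposition format:
// a HELP line, a TYPE line, then the sample with its labels.
fn render_counter(name: &str, help: &str, tenant: &str, value: u64) -> String {
    format!(
        "# HELP {name} {help}\n# TYPE {name} counter\n{name}{{tenant=\"{tenant}\"}} {value}\n"
    )
}

fn main() {
    let body = render_counter(
        "rate_limited_requests_total", // hypothetical metric name
        "Requests rejected by the rate limiter.",
        "alpha",
        42,
    );
    assert!(body.contains("# TYPE rate_limited_requests_total counter"));
    assert!(body.contains("rate_limited_requests_total{tenant=\"alpha\"} 42"));
    println!("{body}");
}
```

Labeling counters per tenant is what makes the Grafana dashboards able to break usage down by customer.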

Resilient Storage Options

The service supports multiple storage backends:

  • In-memory storage for high performance
  • Redis for distributed deployments
  • Hybrid storage with Redis + memory fallback for resilience
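The hybrid backend can be sketched as a trait with a fallback wrapper: try the primary store (Redis in the real service), and fall back to memory on error. All names here are illustrative; a stand-in struct plays the role of an unreachable Redis:

```rust
use std::collections::HashMap;

// A storage abstraction for rate-limit counters.
trait CounterStore {
    fn incr(&mut self, key: &str) -> Result<u64, String>;
}

// In-memory backend: fast, but local to one process.
struct MemoryStore {
    counts: HashMap<String, u64>,
}

impl CounterStore for MemoryStore {
    fn incr(&mut self, key: &str) -> Result<u64, String> {
        let c = self.counts.entry(key.to_string()).or_insert(0);
        *c += 1;
        Ok(*c)
    }
}

// Stand-in for a Redis-backed store that may be unreachable.
struct FlakyStore {
    healthy: bool,
}

impl CounterStore for FlakyStore {
    fn incr(&mut self, _key: &str) -> Result<u64, String> {
        if self.healthy {
            Ok(1)
        } else {
            Err("connection refused".into())
        }
    }
}

// Hybrid store: use the primary, fall back to the secondary on error.
struct HybridStore<P: CounterStore, F: CounterStore> {
    primary: P,
    fallback: F,
}

impl<P: CounterStore, F: CounterStore> CounterStore for HybridStore<P, F> {
    fn incr(&mut self, key: &str) -> Result<u64, String> {
        self.primary.incr(key).or_else(|_| self.fallback.incr(key))
    }
}

fn main() {
    let mut store = HybridStore {
        primary: FlakyStore { healthy: false },
        fallback: MemoryStore { counts: HashMap::new() },
    };
    // Primary is down, so both increments land in the memory fallback.
    assert_eq!(store.incr("tenant:alpha").unwrap(), 1);
    assert_eq!(store.incr("tenant:alpha").unwrap(), 2);
    println!("hybrid fallback ok");
}
```

The trade-off is that counts served from the memory fallback are per-instance rather than cluster-wide, so limits degrade gracefully instead of failing closed when Redis is unavailable.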

API Documentation and Testing

  • Swagger/OpenAPI: For interactive API documentation
  • Load testing scripts: To validate performance under various conditions

Containerization

The entire solution is containerized with Docker and Docker Compose, making it easy to deploy and scale.

The Amazon Q Developer Experience

What makes this project truly remarkable is how I built it. Despite my limited Rust knowledge, I was able to develop this complex system with the help of Amazon Q Developer. The experience was transformative—it felt like I was managing Amazon Q while it handled 90% of the coding work.

We followed an iterative development process where Amazon Q:

  1. Helped plan out the project iterations
  2. Generated the core code structure
  3. Implemented complex algorithms and patterns
  4. Assisted with integrating various technologies
  5. Provided debugging support when issues arose

It was genuinely shocking to experience firsthand the reality of AI-assisted development, where I could focus on architecture decisions and requirements while the AI handled the implementation details. This is the future—developers managing AI agents that generate most of the code, dramatically accelerating development cycles.

For my next iteration, I'm planning to implement gRPC for internal service communication instead of HTTP, which should further improve performance. I'm incredibly excited about continuing this project and exploring more ways to leverage Amazon Q in my development workflow.

Demo

Jaeger Dashboard

Prometheus Targets

Grafana Dashboard

Grafana Dashboards with some metrics

Swagger Doc

Code Repository

https://github.com/tanishparashar/rate-limiter-final

How I Used Amazon Q Developer

Throughout the development of my Rate Limiter v2 service, Amazon Q Developer became my virtual coding partner—a software developer who worked alongside me, executing commands, writing
code, and debugging issues. This collaboration transformed my development workflow in ways I hadn't anticipated.

An Iterative Development Approach

My journey with Amazon Q began when I decided to take an iterative approach to building the rate limiter. I simply prompted, "This is iteration 1, code for me," and Amazon Q responded by
generating the initial codebase. This moment marked the beginning of a new development paradigm for me—one where I could focus on the big picture while Amazon Q handled the implementation
details.

For each iteration, I followed a process that looked something like this:

  1. Planning: I'd describe the features and components needed for the current iteration
  2. Code Generation: Amazon Q would generate the necessary Rust code
  3. Review & Refinement: I'd review the code, ask questions, and request modifications
  4. Integration: We'd integrate the new code with existing components
  5. Testing & Debugging: Amazon Q would help identify and fix issues

Overcoming Challenges

The journey wasn't without its challenges. There was a particularly difficult moment when I faced over 100 compilation errors in my Rust code. I felt overwhelmed and considered abandoning
the project—it seemed impossible to fix so many issues, especially with my limited Rust knowledge.

After taking a break to clear my head, I returned to the project and decided to tackle the errors one by one with Amazon Q's help. I shared the error messages, and Amazon Q:

  1. Explained what each error meant in plain language
  2. Identified patterns among the errors
  3. Suggested systematic fixes for groups of related issues
  4. Provided corrected code snippets
  5. Explained the underlying Rust concepts so I could learn from the experience

This debugging marathon not only saved my project but also significantly improved my understanding of Rust's ownership model, lifetimes, and type system.
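To give a flavor of the ownership lessons, here is a representative (purely illustrative, not from the project) borrow-checker error and its fix:

```rust
// Illustrative only: the kind of borrow-checker error such a debugging
// pass runs into. Holding an immutable borrow across a mutation is rejected:
//
//     let first = limits.first();   // immutable borrow starts
//     limits.push(100);             // ERROR: cannot borrow `limits` as
//     println!("{:?}", first);      // mutable while `first` is alive
//
// One fix is to end the borrow before mutating, e.g. by copying the value out.
fn main() {
    let mut limits: Vec<u32> = vec![60, 120];

    // Copy the value out so no reference into `limits` survives this line.
    let first = limits.first().copied();
    limits.push(100); // fine now: no outstanding borrows

    assert_eq!(first, Some(60));
    assert_eq!(limits.len(), 3);
    println!("borrow fixed: first = {:?}", first);
}
```

Many of those hundred errors follow a handful of patterns like this one, which is why fixing them in related groups works so well.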

Tips for Effective Collaboration with Amazon Q

Based on my experience, here are some insights for working effectively with Amazon Q Developer:

  1. Provide Clear Context: The more context you provide about your project, the better Amazon Q can assist you. Include information about your project structure, dependencies, and goals.

  2. Iterate Incrementally: Break your development into small, manageable iterations rather than trying to build everything at once.

  3. Ask Specific Questions: When you encounter issues, be specific about what you're trying to achieve and what's not working.

  4. Learn from the Code: Don't just copy-paste the generated code. Take time to understand it, which will improve your skills and help you make better requests in the future.

  5. Verify and Test: Always review and test the generated code. Amazon Q is powerful but not infallible.

  6. Use Natural Language: Explain your requirements in plain language, as if you were talking to a human developer.

The Transformation

Amazon Q has helped me become a better and more productive developer. It's not just about writing code faster—it's about having a knowledgeable companion throughout the development process
who can explain complex concepts, suggest best practices, and help troubleshoot issues.

The experience has changed my perspective on AI-assisted development. Rather than replacing developers, tools like Amazon Q amplify our capabilities, allowing us to tackle more complex
projects and focus on the creative aspects of software development.

I'm incredibly grateful for this challenge and the opportunity to work with Amazon Q. It's opened new possibilities for what I can build, even with technologies I'm still learning.