The Great Refactor: The Reality of an AI-Assisted Monolith Migration

By Andrei Roman

Principal Architect, The Foundry

Two weeks ago, I published The Great Refactor, promising to port a production-grade Java Spring Boot CRM (AmbienCRM) to a Serverless architecture (AWS Lambda and DynamoDB) without breaking the frontend.

I set out to apply Safety-Critical Rigor to Cloud Flexibility.

The mission is complete. The system is live. The idle cost has dropped from €50 per month to zero.

But the journey was not a linear march to victory. It was a high-speed collision between two eras of software development.

If you are an engineer looking to use AI to modernize legacy stacks, this is the post-mortem you need to read. It explains why AI does not solve complexity: it just accelerates how fast you crash into it.

The Physics of Time Dilation

In his essay Lost in Bugspace, Venkatesh Rao describes software development as having a deep asymmetry in time. There is a lower bound (complexity) but no upper bound (uncertainty).

When you migrate a legacy system, you traverse different strata of this space.

1. The Serendipitous Implementation (Days 1-3)

Rao describes this layer as the surprisingly lucky case where code works more or less as intended on the first pass.

In the pre-AI era, setting up the boilerplate for a Serverless migration (CDK infrastructure, Zod schemas, 18 API handlers) would have taken me three weeks of manual typing.

With Claude Code and Gemini, I finished it in 3 days.
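
To give a sense of what that boilerplate looks like, here is a minimal sketch of one of those 18 handlers: Zod validation at the API Gateway boundary in front of a DynamoDB write. The endpoint and field names are illustrative, not AmbienCRM's real schema.

```typescript
import { APIGatewayProxyHandler } from "aws-lambda";
import { z } from "zod";

// Illustrative schema -- AmbienCRM's real fields are not shown here.
const CreateContactSchema = z.object({
  name: z.string().min(1),
  email: z.string().email(),
  ownerId: z.string(),
});

export const handler: APIGatewayProxyHandler = async (event) => {
  // Validate at the boundary so the DynamoDB write never sees a bad shape.
  const parsed = CreateContactSchema.safeParse(JSON.parse(event.body ?? "{}"));
  if (!parsed.success) {
    return { statusCode: 400, body: JSON.stringify(parsed.error.flatten()) };
  }
  // ...DynamoDB PutItem with parsed.data would go here...
  return { statusCode: 201, body: JSON.stringify(parsed.data) };
};
```

Multiply that by 18 endpoints, plus the CDK stack wiring them together, and you see why this used to be weeks of typing.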

This is the promise of Software 2.0. The AI handled the Implementation Gunk instantly. I felt invincible. I had a backend that passed unit tests, a deployed database, and a working API Gateway. I thought I was 90% done.

I was wrong. I had just sprinted into a wall at 100 miles per hour.

2. Entering Zemblanity (Days 4-7)

Rao defines Zemblanity as the unsurprising, grinding bad luck of things going wrong in predictable but hard-to-escape ways.

On Day 4, I connected the legacy React Frontend to the new Serverless Backend. The screen turned white. TypeError: Cannot read properties of undefined.

I had hit the Architectural Dissonance:

The Monolith (Old): Relied on Fat Objects. A single SQL query returned a User, their Role, and their Sales History in one JSON blob.

The Serverless (New): Relied on Lean IDs. DynamoDB returned a userId string.

The frontend was expecting a banquet. The Lambdas gave it grocery lists.

Because I insisted on not rewriting the frontend (to save time), I entered a state of Zemblanity. I had to build complex Hydration Layers in my Lambdas to manually stitch data together, mimicking the behavior of Hibernate and JPA.
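
Concretely, a hydration layer looks something like this. The table names and key shapes are my illustration, not the actual AmbienCRM code:

```typescript
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand, QueryCommand } from "@aws-sdk/lib-dynamodb";

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// Hypothetical hydrator: three lean reads stand in for the monolith's
// single fat JOIN.
export async function hydrateUser(userId: string) {
  const user = await ddb.send(new GetCommand({ TableName: "Users", Key: { userId } }));
  const role = await ddb.send(new GetCommand({ TableName: "Roles", Key: { roleId: user.Item?.roleId } }));
  const sales = await ddb.send(new QueryCommand({
    TableName: "Sales",
    KeyConditionExpression: "userId = :u",
    ExpressionAttributeValues: { ":u": userId },
  }));

  // Reassemble the "banquet" the legacy frontend still expects.
  return { ...user.Item, role: role.Item, salesHistory: sales.Items ?? [] };
}
```

None of this code is hard. What is hard is realizing, one white screen at a time, exactly which fields the frontend silently assumes are there.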

3. The Yak-Shaving Basin (Days 8-10)

This is where most projects die. Yak-Shaving is the deliberate decision to treat a local defect as a first-class problem.

I found myself debugging PDF font rendering in a Lambda environment because the legacy system used a specific Java library. I found myself writing adapters to translate boolean true into the string Materiale because the old frontend had a Double Conversion bug.
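
The adapters themselves are trivial; the pain is discovering they are needed. A sketch of the Materiale case (the false branch is a guess, since only the true value came up in debugging):

```typescript
// Sketch of one such adapter. "Materiale" is the string the legacy
// frontend expects where the new backend stores a boolean.
function toLegacyMaterial(isMaterial: boolean): string | null {
  return isMaterial ? "Materiale" : null;
}
```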

In a traditional team, this would have triggered weeks of meetings (Coordination Headwinds). But because I was a Company of One armed with AI, I could Shadow Yak-Shave.

I fed the source code of the old Java PDF service into Gemini Pro. I did not ask it to invent a design from scratch. I acted as the Context Bridge:

"Look at this Java class. It draws borders. Write a TypeScript service using pdfkit that produces the exact same pixel output."

The AI did not understand the System (why we needed this), but it understood the Syntax perfectly. I provided the Judgment. It provided the Grunt Work.
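
For flavor, the TypeScript side of that port looks roughly like this. The coordinates and stroke width below are placeholders; the real values were transcribed from the Java class:

```typescript
import PDFDocument from "pdfkit";
import { createWriteStream } from "fs";

// Sketch of the ported border routine, assuming illustrative geometry.
const doc = new PDFDocument({ size: "A4", margin: 0 });
doc.pipe(createWriteStream("invoice.pdf"));

doc.lineWidth(1.5)
   .rect(40, 40, 515, 762) // outer frame, in points
   .stroke("#000000");

doc.end();
```

Matching the Java output meant reproducing the same stroke widths and rectangles, not "similar" ones; that pixel-exact constraint is exactly the kind of judgment the AI cannot supply on its own.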

The Strategic Exit: Expedient Delivery

Rao identifies three exits from Bugspace:

  1. Restart (Give up).

  2. Yak-Shaving (Fix everything perfectly. Infinite regress).

  3. Expedient Delivery (Accept bounded imperfection).

I chose Exit 3.

I did not refactor the frontend to be clean. I built the Adapters. I accepted that the backend code is slightly heavier than I would like because it has to translate New Data into Old Format.

The Result:

  Total Time: 11 days (previous benchmark: 3 months, with a collaborator and last year's AIs).

  Team Size: 1 (augmented by AI).

  Outcome: A working, serverless CRM.

The Verdict: The New Architect

This refactor proved my hypothesis about the Divergence Machine.

The Mediocre Middle of software development is gone. If I had hired a junior dev to do this migration, they would have drowned in the Zemblanity of the data mismatch. They would have spent a month just understanding why the dates were formatted as arrays.
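
That date quirk deserves a footnote. Jackson, Spring's default serializer, will happily emit a LocalDate as [year, month, day] in its timestamp mode, and a frontend built against that learns to expect it. The shim below is a sketch; the helper name is mine, not from the codebase:

```typescript
// The legacy wire format: a LocalDate serialized by Jackson's array-style
// timestamp mode arrives as [year, month, day].
type JacksonDate = [number, number, number];

// Hypothetical helper: keep emitting dates the way the old frontend
// expects them, instead of ISO strings.
function toJacksonDate(iso: string): JacksonDate {
  const d = new Date(iso);
  return [d.getUTCFullYear(), d.getUTCMonth() + 1, d.getUTCDate()];
}
```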

AI allowed me to hold the entire system context in my head. It removed the Socio-Technical Amplification of bugs (where adding people adds confusion).

What did the AI actually help with?

It compressed the time spent on the Happy Path to near zero. That let me spend 100% of my energy on the Hard Problems (Identity, Hydration, PDF Rendering) rather than typing boilerplate.

I did not avoid the pain of migration. I just got to the pain faster, and had better weapons to fight it.

AmbienCRM serverless is live.

Read the full essay on software time dilation here: https://open.substack.com/pub/contraptions/p/lost-in-bugspace
