Field Aerospace is an aircraft modification company with approximately 250 employees. They design, fabricate, and certify aircraft modifications for U.S. Department of Defense programs and international customers.

Their proposal process looked like most manufacturers' quoting or RFP processes: a government solicitation comes in, a team of three or four people spends two weeks extracting requirements, drafting responses, and iterating before anything goes out the door. Meanwhile, the clock on the solicitation window keeps ticking.

After building an automation system on self-hosted n8n, they now get to an 80%-complete draft in 25 minutes. They also eliminated $30,000 in annual software costs along the way.

That result is real and publicly documented on n8n's case study page. It is also the result of specific architectural decisions, a content library built before any automation was written, and a clear definition of what "done" means for an AI-assisted draft. This post covers what they built, why it worked, and what to prepare for before you attempt something similar in your own operation.

What Field Aerospace Actually Built

The problem was not a lack of effort. It was a lack of infrastructure for repetitive document work.

Every solicitation required manual extraction of requirements from a 50-plus-page government document: identifying the "shall," "must," and "will" statements that define what the proposal needs to address. That extraction alone took hours. Then came the drafting: pulling from previous proposals, reference materials, and institutional knowledge spread across multiple people's heads and drives.

Each proposal was started from scratch, or close to it. There was no systematic way to reuse prior work. There was no process for ensuring the team was responding to every requirement. The timeline was two weeks minimum, with three to four people contributing for most of that time.

Field Aerospace deployed self-hosted n8n and built four interconnected systems:

Proposal generation workflow: n8n ingests a solicitation document, structures the content, and pulls from an internal reference library of past proposals and company capabilities. An AI model generates a draft response. The output goes to Microsoft Teams, where reviewers work from the draft rather than from a blank page.

Requirements extraction: A separate workflow produces a consolidated matrix of every requirement in the solicitation: every "shall," "must," and "will" statement, extracted and organized so reviewers can verify the proposal addresses each one.
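As an illustration of what that extraction step can look like, here is a minimal Python sketch that pulls binding-language sentences out of solicitation text and numbers them for a matrix. The function name, the ID format, and the naive sentence splitting are assumptions for illustration only, not Field Aerospace's actual implementation (which most likely runs inside an n8n workflow against parsed PDF text).

```python
import re

# Binding language in government solicitations: "shall", "must", "will".
BINDING = re.compile(r"\b(shall|must|will)\b", re.IGNORECASE)

def extract_requirements(text: str) -> list[dict]:
    # Naive sentence split on terminal punctuation; real solicitation
    # PDFs need more careful segmentation (numbered clauses, tables).
    sentences = re.split(r"(?<=[.;])\s+", text)
    matrix = []
    for sentence in sentences:
        match = BINDING.search(sentence)
        if match:
            matrix.append({
                "id": f"REQ-{len(matrix) + 1:03d}",
                "keyword": match.group(1).lower(),
                "text": sentence.strip(),
                "addressed": False,  # reviewers flip this during verification
            })
    return matrix
```

The `addressed` flag is the point of the matrix: it turns "did we respond to everything?" from a re-read of the whole solicitation into a checklist.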

Opportunity evaluator: An automated daily workflow connects to the Deltek GovWin API, pulls new solicitations, scores them against Field Aerospace's capability profile, and surfaces only the high-fit opportunities. The team stopped reactively discovering solicitations they had missed and started proactively reviewing a filtered daily list.
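A capability-fit score like the one described can be sketched as simple keyword weighting over each opportunity's title and description. Everything here (the capability terms, the weights, the threshold, the field names) is a hypothetical stand-in; Field Aerospace's actual scoring against GovWin data is not published.

```python
# Hypothetical capability profile: term -> weight. A real profile would be
# richer (NAICS codes, set-aside types, contract vehicles, dollar ranges).
CAPABILITIES = {
    "aircraft modification": 3,
    "avionics": 2,
    "fabrication": 2,
    "certification": 1,
}
THRESHOLD = 3  # only opportunities at or above this score get surfaced

def score_opportunity(title: str, description: str) -> int:
    text = f"{title} {description}".lower()
    return sum(weight for term, weight in CAPABILITIES.items() if term in text)

def filter_daily(opportunities: list[dict]) -> list[dict]:
    # Score every pulled opportunity, keep the high-fit ones, best first.
    scored = [{**opp, "score": score_opportunity(opp["title"], opp["description"])}
              for opp in opportunities]
    return sorted((o for o in scored if o["score"] >= THRESHOLD),
                  key=lambda o: o["score"], reverse=True)
```

Even a crude version of this filter changes the daily behavior: the team reviews a short ranked list instead of scanning a firehose.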

Internal AI chatbot: A company-data assistant orchestrated by n8n gives staff access to internal documents and reference material without manual searching.

The entire system runs on self-hosted n8n, which Field Aerospace chose specifically because of the data sensitivity of government contract work. Their data does not leave their infrastructure.

Quantified results (publicly verified):

  • Proposals: 80% completion in 25 minutes, versus a previous two-week timeline with three to four contributors
  • Requirements extraction: 15 to 20 minutes, versus hours of manual document review
  • Software costs: $30,000 annually eliminated by replacing two legacy tools with n8n
  • Opportunity discovery: from reactive, after-the-fact discovery to an automated, filtered daily list

What the Case Study Doesn't Cover

This is where the published result ends and where an experienced automation engineer picks up. The following five considerations are the ones Field Aerospace almost certainly had to navigate, and the ones you will need to prepare for before replicating this approach.

1. Self-Hosted n8n Is the Right Call for Sensitive Data, but It Has Real Maintenance Requirements

Field Aerospace chose self-hosted n8n because their data cannot leave their infrastructure. For manufacturers dealing with proprietary product data, customer contract terms, or regulated information, this is often the right call.

But self-hosting is not a one-time decision. It requires a server to maintain, a backup strategy, a version management process, and someone internally who owns the infrastructure. The automation logic is only one layer of the system. The server it runs on is another, and that server needs to be managed.

Before choosing self-hosted, confirm who internally will own the infrastructure, how updates will be managed, and what the backup and recovery plan looks like if the server goes down during a critical proposal window.

2. The Reference Library Is the Foundation, Not the Feature

The 25-minute result is powered by the AI draft, which is powered by the reference library: a structured collection of past proposals, capability statements, technical descriptions, and boilerplate content that the automation draws from.

That library does not build itself. Someone at Field Aerospace curated, organized, and structured it before any automation was written. The quality of the automation's output is directly proportional to the quality of the content in that library.

If your equivalent of this automation is a quote generation workflow, the reference library is your product catalog, pricing logic, and standard terms. If it is a customer onboarding workflow, it is your process documentation and standard communications. Whatever form it takes, build and organize it first. The automation amplifies what is already there. It does not create what isn't.
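One way to make "organized and structured" concrete is to give every library entry a small metadata schema that supports retrieval and staleness checks. The fields below are assumptions about what useful curation metadata looks like, not Field Aerospace's actual structure.

```python
from datetime import date

# Illustrative reference-library entry. Tags drive retrieval; the review
# fields guard against the main risk: stale content flowing into drafts.
LIBRARY_ENTRY = {
    "id": "capability-structural-mods",
    "type": "capability_statement",   # vs. past_proposal, boilerplate
    "tags": ["structural", "modification", "certification"],
    "last_reviewed": date(2025, 1, 15).isoformat(),
    "approved_by": "proposals_lead",
    "body": "We design and certify structural modifications for fixed-wing aircraft.",
}

def needs_review(entry: dict, today: date, max_age_days: int = 365) -> bool:
    # Flag entries whose content may no longer reflect current capability.
    age = (today - date.fromisoformat(entry["last_reviewed"])).days
    return age > max_age_days
```

A recurring review pass over entries flagged this way is what keeps the 80% draft trustworthy a year after launch.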

3. An 80% Draft Is Not a Finished Document

Field Aerospace is explicit about this: the automation gets them to an 80% solution. The remaining 20% is human review and judgment. That is the right design for any AI-assisted document workflow.

The failure mode to avoid is treating the 80% as done and skipping the review step. An AI-assisted proposal that misses a critical requirement, quotes the wrong pricing, or includes a statement that doesn't reflect current capability is worse than a slower manual draft, because it looks like a finished document when it isn't.

Before deploying any document generation automation, define exactly what the human review step covers, who performs it, and what "approved to send" means. The automation earns its value by getting to 80% faster. The human review step is where the last 20% happens, and that step is not optional.

4. API Access to Your Opportunity Sources Needs to Be Confirmed Before Scoping

Field Aerospace's opportunity evaluator connects to the Deltek GovWin API. That API exists and is accessible. Your equivalent data source may or may not expose an API.

If your incoming opportunities, RFPs, or sales inquiries come through a portal, a platform, or a third-party system, confirm the API situation before scoping the automation. Some platforms have full REST APIs. Others have partial APIs. Some have no API and require a different approach (email parsing, SFTP file transfer, or a custom integration layer). The answer affects the build timeline and complexity significantly.

5. Track the Software Cancellations or the Savings Aren't Real

Field Aerospace eliminated $30,000 in annual software costs by replacing two legacy tools with n8n. That saving is real, but only if the licenses were actually cancelled.

Forgotten license renewals are the easiest projected saving to lose. The new system is running and the old systems are no longer needed, but the renewal fires automatically, nobody cancels it, and the savings never materialize because you are now paying for both systems at once.

Before go-live, document exactly which existing tools the new automation replaces, who owns the vendor relationship, and when each contract comes up for renewal. Assign the cancellations as a task with a deadline, not as a follow-up item.

Why This Matters Beyond Aerospace

Field Aerospace's proposal workflow is a specific example of a general pattern: a document-heavy process that requires pulling from multiple sources, applying judgment, and producing an output that goes to an external party under time pressure.

For manufacturers, the equivalents are everywhere:

A quoting workflow where a sales rep pulls pricing from the ERP, applies custom discounts, writes a cover note, and sends a PDF. A purchase order process where someone extracts requirements from a customer order, cross-references against inventory and supplier lead times, and drafts a PO. A customer onboarding flow where someone pulls credit data, creates an ERP account, and generates a welcome communication.

The architecture Field Aerospace used is directly applicable to all of these. The reference library, the extraction workflow, the AI-assisted draft, and the human review step are a pattern, not a unique aerospace solution.

Before You Build

Find the one process in your operation that currently requires three or more people to produce a document or a decision. That is your Field Aerospace equivalent.

Then answer: is the source data accessible programmatically? Is the reference content organized and current? Is there a clear definition of what "done" looks like before it goes to the recipient?

If the answer to all three is yes, the workflow is ready to scope.

Book a free call to scope your equivalent build before you start