Military childcare fee assistance

The situation

A federal program serving 200k+ military families across all service branches. The organization had promised a modernized system to the military for years. Not a single line of code existed to back that promise. An engineering firm was brought in to prove it could be done — they shipped the first release, but the approach wasn't sustainable long-term.

76% of families had to resubmit their applications. Operations staff called a complete first-attempt submission a "unicorn." Leadership had absorbed years of perceived failure, and operations teams had developed deep skepticism toward anyone promising modernization.

I came in during the engineering firm's engagement. The work expanded from documentation into design, research, and product strategy. The team built internal capability, and I moved to the organization directly.

What I found

Everyone assumed families were careless or confused. A three-layer diagnostic, comparing what the contract required, what operations actually did, and what the system enforced, revealed something different:

The contract said the LES (Leave and Earnings Statement) must be dated within 90 days of the care start date. Operations collected the LES at application time, regardless of when care started. The system had no mechanism to distinguish "care starts next week" from "care starts in September."

A family applying in April for September care would submit a valid LES. By processing time, it had expired, and they'd be asked for a new one. That's not user error. That's a timing mismatch built into the process.

This pattern was everywhere. Operations assumed wet signatures were required — the contract allows electronic PIN. The system showed 9 generic document categories — families uploaded whatever they guessed and coordinators sorted it out. Operations teams had built 20 years of workarounds that nobody designed but everyone lived with.

How I worked

I ran sequential 1:1 sessions with operations staff: the QC lead, then the Army fee assistance director, then provider services, then non-Army processing. Each session refined what the previous one validated. Working prototypes, not presentations.

The key moment: operations saw their own internal 84-item checklist — the system they'd always used to track what each family needs — surfaced directly to families as a clear, personalized document list. They stopped being skeptical. They weren't reviewing my designs. They were seeing their expertise made visible.

The research practice built relationships with military families directly. One spouse described submitting documents into "the abyss" with no confirmation. The #1 Army recruiter nationally revealed that his provider barrier wasn't administrative — it was structural eligibility that no amount of process improvement could fix. Each family taught us something the system couldn't.

What it produced

  • ~0% → 87% first-attempt completion in pilot across six military branches

  • Weeks → one business day processing turnaround for complete applications

  • Root cause of 76% resubmission rate identified — document timing, not user error

  • Research practice that reached Pentagon level — the #1 Army recruiter nationally defended the work to the Under Secretary for Personnel & Readiness

  • Design Rationale Registry — every design decision traces to a person, a quote, and a result

  • 12 concurrent workstreams managed by one person with AI infrastructure

What I learned

This is where the three-layer diagnostic became a formal methodology. The gap between contract, operations, and system behavior is where every problem lived. Nobody could see it because each layer had a different owner who assumed the other two were aligned.

The research compounds. The person doing family outreach is the same person who found the resubmission root cause is the same person who designed the document automation is the same person who caught a family pausing mid-application and recognized it was the same problem expressing itself differently. That synthesis — holding all threads simultaneously — is what one person with the right tools can do that a fragmented team cannot.

Next

Contractor workforce platform