By Elena Naids and Jessica Woodin
Human-centered design in government requires more than isolated UX efforts—it demands a coordinated, community-driven approach that scales across programs.
Most organizations start their human-centered design journey the same way: a skilled practitioner here, a research effort there, impact that's real but hard to scale. The complexity of government services calls for more: a coordinated, consistent approach to research, design, and delivery that works across programs, not just within them. That's what IronArch Technology set out to build. This article explores how we built that coordinated approach, from community structures and investments in people to cross-program collaboration, and what it means for the agencies we partner with.
IronArch Technology built its reputation as a trusted federal partner modernizing IT infrastructure and optimizing cloud environments. As government agencies began raising the bar for how people experience digital services—from applying for benefits online to managing health care remotely—we integrated human-centered design, modern product delivery, and agile engineering practices into our portfolio to better serve the needs of our partners.
Today, our Digital Services organization supports the U.S. Department of Veterans Affairs across both Veteran-facing and enterprise platforms, including initiatives such as Ask VA, My HealtheVet, Transition Experience, and Benefits Discovery Service. Our users span Veterans, their family members, caregivers, and VA employees—communities we understand deeply. With Veterans making up more than 35% of our workforce, our commitment to this mission is both professional and personal.
As our work continued to grow in scale and complexity, we recognized that a more connected approach to research, design, and delivery would make us better partners, bringing deeper user insight and more consistent outcomes to the people we serve. That led us to build a stronger, more connected Experience practice.
IronArch's investment in human-centered design didn't begin with a fully formed practice; it grew with the work. In the early stages, UX practitioners were embedded within delivery teams wherever the need emerged, helping product teams better understand users, clarify product direction, and improve usability. The work was impactful but largely project-driven, without shared structures for collaboration, mentorship, or consistency. Over time, we set out to build something more intentional: a practice that fosters cross-functional collaboration within and across teams, shares knowledge and resources, and invests in the foundations of how we hire, develop, and sustain well-supported teams.
Cross-program structure: Today, the Experience practice operates as a cross-program community of researchers, designers, accessibility specialists, and content strategists. Practitioners remain embedded within product teams while collaborating across programs, sharing research techniques, exchanging feedback on in-progress designs, and building on one another's work rather than starting from scratch.
Connected practice community: Our team stays connected through deliberate, community-driven activities. Practitioners gather regularly to share design critiques, research findings, and practical tips, including shared Figma templates, research logistics like timelines and note-taking conventions, and emerging approaches to using AI in research and design workflows. What starts as a quick discussion often turns into a cross-team conversation that improves work happening across multiple programs at once.
Standardized recruiting and career paths: We've invested equally in the foundations that make a practice sustainable. Standardized recruiting processes, consistent job descriptions, and structured interview practices help us identify and hire strong practitioners efficiently. Streamlined onboarding gets new team members contributing quickly. Career ladders and dedicated practice leadership ensure everyone has a path to grow and the support to get there. For our government partners, this translates into experienced, well-supported teams with the continuity and quality that complex, long-running programs require.
Through the lens of the Nielsen Norman Group's UX maturity model, this evolution has taken us from early, project-based UX support to Stage 3 maturity, where design and research are embedded within delivery teams. Our next step is to advance further, strengthening how we measure impact, bringing user insight earlier into program decision-making across our portfolio, and building toward a consistent, organization-wide capability that keeps getting better at solving the kinds of problems federal programs face.
When government teams partner with IronArch, they get more than skilled practitioners on their delivery teams. They get the collective knowledge of a connected practice, and that difference shows up in concrete ways: faster delivery, shared insight, and an approach to emerging technology that keeps human lived experience at the center.
Teams move faster without sacrificing quality. Because practitioners are embedded in delivery teams and supported by a shared practice, programs get focused product support and organizational knowledge at the same time. We lead with research, uncovering real user needs before solutions are designed, which reduces uncertainty and aligns stakeholders early. Engineers participate in user research, designers collaborate closely through implementation, and assumptions are tested continuously across disciplines and projects. The result is less rework, stronger alignment, and on-time delivery in complex federal environments.
Teams share insights across programs. Knowledge generated on one program doesn't stay siloed. It flows to teams working on related challenges. When researchers supporting VA's Benefits Discovery Service API shared findings with the team building the Discover Your Benefits tool on VA.gov, it helped improve how transitioning servicemembers understand and act on benefit recommendations, without duplicating the underlying research.
Teams build on what came before. Our teams regularly share artifacts, research, and lessons learned so others can build on existing work rather than starting from scratch. When one team shared existing stakeholder maps with a second team tackling a related project, it gave the second team a strong jumping-off point to visualize their complex stakeholder ecosystem and accelerated their work considerably. Both teams were able to learn from each other, strengthening the practice's shared understanding of this method and adding to the growing body of knowledge that every future team can draw from.
Teams use AI thoughtfully to accelerate delivery. Our teams are actively integrating AI into research and design workflows—using AI tools to synthesize interview transcripts, running hypothetical scenarios through benefit eligibility decision trees to test accuracy, and evaluating mockups against design system standards before they go into review. We track how AI is being used across programs in a living document that grows as our experience does, making it easy for any team to build on what others have already learned. And while AI accelerates our work, human judgment and accountability remain central. For our partners, that means the benefits of AI adoption with the confidence that human expertise is always behind it.
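To make the scenario-testing idea concrete, here is a minimal, hypothetical sketch of what running scenarios through an eligibility decision tree can look like. The rules and function below are invented for illustration only; they are not IronArch's actual tooling and not VA's real eligibility criteria. The pattern is simply: encode the decision tree as a function, then check it against a table of scenarios with expected outcomes agreed on in advance.

```python
def eligible_for_benefit(years_of_service: int, discharge: str,
                         disability_rating: int) -> bool:
    """Hypothetical eligibility decision tree (illustration only,
    not real VA rules)."""
    if discharge != "honorable":
        return False
    if disability_rating >= 30:
        return True
    return years_of_service >= 20

# Scenario table: (inputs, expected outcome) pairs. In practice these
# expected outcomes would be drawn up with subject-matter experts before
# any AI-assisted run, so accuracy is judged against a known baseline.
scenarios = [
    ({"years_of_service": 4,  "discharge": "honorable", "disability_rating": 50}, True),
    ({"years_of_service": 22, "discharge": "honorable", "disability_rating": 0},  True),
    ({"years_of_service": 4,  "discharge": "other",     "disability_rating": 50}, False),
    ({"years_of_service": 4,  "discharge": "honorable", "disability_rating": 10}, False),
]

for inputs, expected in scenarios:
    actual = eligible_for_benefit(**inputs)
    assert actual == expected, f"Mismatch for {inputs}: got {actual}"
```

The same table-of-scenarios shape works whether a human, a script, or an AI tool is generating the candidate cases: the expected outcomes stay human-authored, which is what keeps judgment and accountability with people even as the checking itself is automated.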
For us, that's what meaningful modernization looks like: not just better systems, but better outcomes for the people who rely on them.