Government UX Across 26 Agencies
TL;DR
As a UX design apprentice at the French Ministry of Defense, I orchestrated enterprise-scale research across 26 government agencies -- auditing hundreds of web pages, designing standardized research instruments, facilitating 9+ co-design workshops with 100+ stakeholders from military and civilian organizations, and building the reusable process infrastructure that established user-centered design within the ministry. I was listed as 'Lead UX' on one of my three concurrent projects while holding the most junior title possible. The redesigned website achieved a 6-minute average session duration, up from 4:08 -- a 45% improvement in user engagement.
Context
The French Ministry of Defense (Ministère des Armées) operates one of the most complex digital ecosystems in French government. The public web portal defense.gouv.fr spans 26+ organizational sub-sites -- each managed independently by a different military or civilian agency (DGA, SGA, SCA, BOG, CGA, DIRISI, and others), each with its own communication team, editorial process, and institutional priorities. The portal serves millions of citizens: military personnel, students exploring career options, journalists seeking information, veterans accessing services, job seekers, and the general public.
I joined the DTPM (Delegation for Transformation and Ministerial Performance) -- the ministry's digital transformation unit -- as a UX design apprentice in 2020. "Apprentice" in the French system (alternance, a work-study contract) is a specific educational structure in which students alternate between academic study and professional work. It is a recognized professional role, but it is entry-level. I was the most junior person on the team.
The ministry had decided to redesign defense.gouv.fr. No UX practice existed within the organization. Many agency representatives could not answer basic questions about their own users' behavior. There was no shared digital strategy across agencies, no research methodology, and no mechanism for cross-agency collaboration on design. I stepped into a digital transformation effort that needed to be built from the ground up, in one of the most institutionally complex environments in French government.
The Challenge
The challenge was not a single design problem -- it was an organizational one operating at institutional scale.
26 agencies, zero shared understanding: Each agency sub-site had been built and maintained independently. There was no unified picture of the portal's structure, usability, or user needs. Some agencies had sophisticated web teams; others had a single communications officer managing their entire digital presence. When I deployed standardized questionnaires across agencies, some representatives answered "PAS DE RÉPONSE" (no response) to basic questions about their users -- they simply did not know.
Institutional complexity: The French government has strict digital standards. The DSFR (Système de Design de l'État / State Design System) governs visual and structural consistency across 20,000+ state websites. RGAA accessibility requirements mandate compliance at a level comparable to WCAG AA. Military security constraints separate internal systems (IntraDef) from public-facing platforms. Legacy CMS limitations constrained what was technically achievable. And above all: military hierarchy and protocol shaped every stakeholder interaction.
Cultural unfamiliarity with UX: User-centered design was a new concept within the ministry. Before stakeholders could participate in research, they needed to understand what UX was, why it mattered, and how the process would work. I was not just executing research -- I was introducing a discipline.
Three concurrent products: While the MINARM website redesign was the primary project, I simultaneously led UX for the LPB (Laissez Passer Balard) parking pass management application and contributed to user testing for the Atrium housing request portal. Multi-product ownership at the most junior title level.
My Role
Title: Alternante en UX design (UX Design Apprentice) -- the most junior professional title in the French system
Actual scope: Lead UX researcher and designer orchestrating multi-agency research, workshop facilitation, usability testing, and design across three concurrent products
Team: Worked within the DTPM team alongside project managers, developers, and Lab Design team members; co-facilitated workshops with senior DTPM staff
Recognition: Listed as "Lead UX" on the LPB project team -- a direct, artifact-level contradiction between my title and my function
This is the most dramatic example of operating above title in my career. The gap between "apprentice" and "leading enterprise-scale research across 26 government agencies, facilitating 9+ workshops, managing 3 concurrent products, and presenting at a defense industry trade show" is not subtle. It reflects what I have always believed: do the work the situation demands, regardless of what the title says.
Research & Discovery
Mapping the Entire Digital Territory: My first move was comprehensive. Before proposing any design direction, I conducted a systematic audit of the existing defense.gouv.fr portal. I created a multi-sheet spreadsheet documenting every page, every link, and every navigational pathway across the ecosystem -- 291+ rows cataloguing homepage links, 426+ rows documenting sub-site pages, plus tabs for SEO metrics, content structure, and usability observations.
The audit revealed the baseline: Alexa rank 1,379 in France, 4.0 daily page views per visitor, 4:08 average daily time on site, 42.6% bounce rate, 147 in-page links (96% internal), no custom 404 page, no social media structured data, and 74 images with 5 missing alt attributes. This was the territory I was designing within -- and the evidence base I would need to justify every subsequent decision.
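A link inventory behind figures like "147 in-page links (96% internal)" can be produced by parsing each page and classifying every anchor target. This is a minimal illustrative sketch, not the actual audit tooling; the sample HTML and the use of Python's standard-library parser are assumptions:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    """Collect <a href> targets and classify them as internal or external."""
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links and links on the audited host count as internal.
        if not host or host.endswith(self.site_host):
            self.internal.append(href)
        else:
            self.external.append(href)

# Hypothetical sample standing in for one fetched defense.gouv.fr page.
sample = """
<a href="/actualites">News</a>
<a href="https://www.defense.gouv.fr/dga">DGA</a>
<a href="https://www.service-public.fr/">Service Public</a>
"""
audit = LinkAudit("defense.gouv.fr")
audit.feed(sample)
total = len(audit.internal) + len(audit.external)
print(f"{total} links, {len(audit.internal) / total:.0%} internal")  # -> 3 links, 67% internal
```

Run across every catalogued page, the per-page counts roll up into the spreadsheet rows described above.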
Standardized Research Across Agencies: I designed a standardized questionnaire and deployed it across 5+ agencies (DGA, SGA, SCA, BOG, CGA). The questionnaire covered project objectives, user profiles, user behaviors, pain points, current functionality assessment, improvement priorities, and proposed changes. Every agency received the same instrument, producing comparable data that could be synthesized into a unified understanding of 26 organizations' needs.
This was not ad hoc. I built a reusable research instrument -- a system that any team member could deploy to any new agency. The standardized format ensured that as additional agencies were brought into the research, the data would be directly comparable to what had already been collected.
100-Slide Competitive Benchmark: In March 2022, I produced a comprehensive benchmark analysis spanning 50+ reference websites across 5 categories: events and calendar pages, in memoriam and memorial pages, interactive cartography, key figures displays, and job offer pages. The analysis covered French government sites (numerique.gouv.fr, systeme-de-design.gouv.fr, gouvernement.fr), international institutions, museums, and commercial sites. This benchmark ensured design decisions were grounded in established best practices, not institutional assumptions.
Design Process
9+ Co-Design Workshops Across 3 Weeks: The heart of this project was the workshop program. I designed and co-facilitated 9+ workshops across three weeks in May 2021, organized in three progressive waves:
Wave 1 -- Card Sorting and Experience Mapping (May 11): Three parallel sessions with participants from DGA/BOG, SCA, and SGA groups. Participants sorted content categories, mapped their current experience navigating the portal, and documented pain points and information-seeking patterns. These sessions established the baseline understanding of how 26 agencies experienced their shared digital platform.
Wave 2 -- Tree Testing and Prioritization (May 20): Three parallel sessions where participants tested proposed information architectures through tree testing exercises and used prioritization matrices to rank features and content areas by importance. These sessions validated (or invalidated) the structural direction emerging from Wave 1 findings.
Wave 3 -- Brainstorming and UI Creation (May 27): Three parallel sessions focused on solution generation -- brainstorming exercises, card games to explore content relationships, and collaborative UI sketching where stakeholders drew their ideal interface concepts. These sessions transformed passive stakeholder feedback into active co-creation.
The workshops brought together an extraordinary cross-section of participants: representatives from DGA, SGA, SCA, BOG, CGA, DIRISI, ONISEP, and Unéo -- military officers, civilian administrators, cybersecurity experts, business school professors, and students. In one of the most hierarchical and siloed institutional environments in France, these workshops created a space where a military general and a student could work side by side on a prioritization matrix.
My facilitation philosophy -- displayed at the start of every session -- was explicit: "Everything is worth saying. No judgment. Censorship and self-censorship are forbidden. Prioritize quantity over quality. Whatever happens, don't take it personally." Creating psychological safety within military hierarchy was not optional; it was the prerequisite for honest co-design.
After each wave, I synthesized outputs into standardized RETEX (Return on Experience) documents for each agency -- capturing blocking points, improvement axes, and workshop results in a traceable format that connected stakeholder input to design decisions.
User Testing Across Multiple Products: In parallel with the workshop program, I designed and executed usability testing across all three products:
For the MINARM V.1 site, I designed a structured testing protocol: 60-minute sessions with 5 testers across 3 user profiles (youth, professional, military intern), testing on both desktop and mobile. The protocol followed a 3-phase structure -- free exploration with prompted questions, specific scenario tasks, and a post-test questionnaire covering ergonomics, readability, wording, and hierarchy.
Testing revealed critical navigation problems. Users consistently became confused when moving between the main site and sub-organization sites -- menus changed without warning. Search returned irrelevant and poorly categorized results. Pages were overwhelmingly long with too much news content. Users could not find practical information they came for -- job applications, contact details, recruitment information. Verbatims captured the human cost: "This site is not logical and very long." "Too many articles everywhere." "It's a rabbit hole." "Lots of communication but the rest is impossible to find. I don't go on the military site for PR -- I go there to work."
For the Atrium housing portal, I tested with 4 users including 1 user with a disability -- ensuring accessibility was not an afterthought but an integral part of evaluation. Ratings ranged from "very good" to "medium," surfacing specific improvement areas.
For LPB, I conducted multiple rounds of testing across both regular users and referent (administrator) roles, documenting findings and improvement axes at each iteration.
LPB -- End-to-End Product Design: For the LPB parking pass management application, I owned the complete UX process. I audited the existing application, created information architecture, documented current and proposed user flows for 4 key journeys (create request, renew/modify request, view all requests, personal information/logout), designed low-fidelity wireframes, iterated through multiple rounds of high-fidelity mock-ups (August through October 2021), and facilitated a dedicated dashboard workshop for referents in October. The project produced 15+ distinct screens across multiple iterations, all validated through user testing.
Solution
The project delivered multiple interconnected outcomes:
Redesigned Portal Architecture: The workshop-driven information architecture replaced the fragmented, agency-centric structure with a user-need-centered navigation model. Tree testing validated the new structure with actual stakeholders from across the ministry. The architecture was grounded in evidence from the site audit, agency questionnaires, and three waves of co-design workshops.
Reusable Research Infrastructure: Beyond the specific design deliverables, I built a complete set of process infrastructure:
- Standardized OICD questionnaire -- a research instrument deployable to any agency, producing comparable data across organizational boundaries
- Workshop facilitation templates -- structured agendas with timed exercises, rules, and facilitation guides usable by any DTPM team member
- User test protocol -- a reusable testing framework with defined phases, participant profiling, task scenarios, and analysis structure (applied across MINARM, Atrium, and LPB)
- RETEX documentation framework -- standardized post-workshop synthesis format ensuring systematic knowledge capture
- UX methodology presentation -- a 7-step process framework (Audit, Interviews, Personas, Empathy Maps, Workshops, Wireframes, User Tests) that established how the ministry would approach user-centered design
These are not one-off deliverables. They are organizational infrastructure designed to make user-centered design repeatable and sustainable within the ministry long after my tenure ended.
LPB Application: 15+ distinct screens across multiple iterations for the parking pass management system, validated through user testing and a dedicated referent workshop. The design addressed key user pain points identified through testing, including better button placement, clearer hierarchy, step-based forms instead of long scrolling pages, and improved legibility.
Educational Foundation for UX Practice: I created a 50-slide "What is a UX designer?" presentation tracing UX from ancient Greek ergonomics through Toyota's human-centered production systems to modern ISO standards, designed for ministry personnel with no prior exposure to the discipline. I also created a presentation for the Salon Fabrique Défense -- a defense industry trade show -- detailing specific UX methods including research techniques, workshop facilitation, prototyping, and evaluation scales (SUS, UMUX-lite, DEEP, AttrakDiff). An apprentice presenting methodology at an industry trade show is itself evidence that the work exceeded the title.
Results & Impact
- 6 min -- average session duration (up from 4:08)
- ~45% -- improvement in engagement
- 26+ -- agencies researched
- 9+ -- workshops facilitated
- 100+ -- stakeholders participated
- 50+ -- reference sites benchmarked
- 3 -- concurrent products managed
- 5 -- reusable process artifacts
User Engagement
- Redesigned website achieved 6-minute average session duration, up from 4:08 pre-redesign -- a ~45% improvement in user engagement
- Session duration improvement indicates that users were finding and engaging with relevant content rather than bouncing off an overwhelming portal
Research Scale
- 26+ agencies researched through standardized questionnaires and co-design workshops
- 9+ workshops facilitated across 3 weeks in a structured 3-wave program
- 100+ stakeholders participated across workshops, interviews, and testing
- 5 testers for MINARM V.1 across 3 user profiles (desktop and mobile)
- 4 testers for Atrium including 1 user with disability
- Multiple rounds of LPB testing across user and referent roles
- 100-slide competitive benchmark analyzing 50+ reference websites across 5 content categories
Process Infrastructure
- 5 reusable process infrastructure artifacts built and deployed: standardized questionnaire, workshop facilitation templates, user test protocol, RETEX documentation framework, and UX methodology presentation
- Research instruments designed for redeployment to any new agency entering the redesign process
- RETEX format adopted across all agency workshop documentation (confirmed across BOG, CGA, DGA, SCA, SGA)
UX Practice Establishment
- Introduced and established user-centered design methodology within the ministry
- Built organizational UX literacy through educational presentations to stakeholders and industry trade show
- Created the evidence base and process infrastructure for sustained UX practice beyond the apprenticeship period
Design Scope
- 3 concurrent products managed simultaneously: MINARM website redesign, LPB application (15+ screens), and Atrium portal testing
- User personas (3 for defense.gouv.fr), empathy maps, experience maps for multiple agencies, information architecture validated through tree testing
- State Design System (DSFR) integration contributing to standardization across 20,000+ government websites
Reflections
The Ministry of Defense was where I first discovered the approach that would define the rest of my career. It was the first time I mapped an entire ecosystem before designing in it. The first time I built reusable research instruments instead of one-off questionnaires. The first time I facilitated co-design across organizational silos. The first time I created educational materials to build UX literacy within an organization that did not know it needed it.
Looking back, what strikes me most is the scale of organizational complexity relative to my level of experience. I was an apprentice coordinating research across 26 agencies within a military bureaucracy. I was facilitating workshops that brought together military officers and students. I was presenting at an industry trade show. I was managing three concurrent products. These were not responsibilities assigned to me because of my title -- they were responsibilities I took on because the work needed to be done and I was willing to do it.
The institutional constraints -- RGAA accessibility requirements, DSFR design system standards, military security protocols, COVID-19 remote collaboration -- taught me to design within boundaries rather than against them. Regulated environments force a rigor that unregulated ones do not. Every decision needed justification. Every recommendation needed evidence. That discipline has served me ever since.
If I could change one thing, I would have been more explicit about the "operating above title" narrative at the time. Being listed as "Lead UX" on the LPB project while holding an apprentice title is documented evidence of how the organization perceived my contribution. I did not advocate for myself as clearly as I should have. I have since learned that the work speaks for itself only if you make sure people can hear it.
Core patterns this project demonstrates
- Operating Above Title: Listed as "Lead UX" while holding the title "UX Design Apprentice" -- the most dramatic title-function mismatch in my career. Managing 3 concurrent products, presenting at an industry trade show, and building organizational infrastructure as an apprentice.
- Silo Bridging: 9+ workshops bringing together participants from 9+ agencies -- military officers, civilian administrators, cybersecurity experts, business school professors, and students in structured co-design. The most challenging silo-bridging environment across my entire career.
- Territory Mapping: Comprehensive audit documenting every page and link across 26+ agency sub-sites before proposing any design direction. 100-slide competitive benchmark. Standardized questionnaires across 5+ agencies.
- Infrastructure Building: 5 reusable process artifacts (questionnaire, facilitation templates, test protocol, RETEX framework, UX methodology) designed for organizational persistence beyond any single project.
- Scaling Through Teaching: 50-slide educational presentation for ministry personnel. Salon Fabrique Défense trade show presentation. UX methodology framework establishing how the ministry approaches user-centered design.
The Ministry period established every pattern that defines my professional identity today. The research infrastructure I built here evolved into the more sophisticated practice I built at Jellyfish. The workshop facilitation became the foundation for my cross-cultural research across 7 countries. The ethical design and accessibility rigor became the conviction that led to my Center for Humane Technology certification and my thesis on dark patterns. Nothing was wasted. Everything compounded.
Key Artifacts
Workshop Photographs
Workshop Outputs
LPB High-Fidelity Mock-ups
Site Audit Documentation
RETEX Documentation
UX Methodology Presentation
Salon Fabrique Défense Presentation