How I Work
Methods & Process
Research-first, infrastructure-minded, ethically grounded. My approach to designing in complex environments.
Philosophy
I am a research-first designer. Every project I take on begins with understanding the full system -- the users, the stakeholders, the data flows, the organizational dependencies -- before I open a design tool. This is not perfectionism; it is risk management. At Marigold, a comprehensive UX Foundation study found that 60% of first-time users failed the core task. At Jellyfish, a structured assessment revealed that users had built an entire parallel workflow of workaround tools because the product failed them at critical moments. These are the kinds of findings that save products from launching broken, and they only surface when you map the territory systematically.
I am also an infrastructure-minded designer. I do not just deliver research findings and move on. I build the organizational scaffolding -- research portals, intake forms, methodology guides, design briefs, feedback loops -- that makes research-informed design the default way of working, not a one-off event. At three consecutive organizations, I have built research practices from scratch. The question I always ask is not just "How do I do this work well?" but "How do I build the systems that enable everyone around me to do this work well, even when I am not in the room?"
And I am an ethically grounded designer. My Master's thesis investigated how dark patterns exploit cognitive biases. My Center for Humane Technology certification formalized that commitment. These are not decorative credentials -- they shape how I make design decisions every day, from how I frame research questions to how I evaluate tradeoffs between business goals and user wellbeing.
Research Methods Toolkit
I have applied 20+ research methods across government, adtech, martech, and enterprise data platforms. Below is how I organize them by phase.
Discovery Phase
Methods I use to map the territory before designing in it.
| Method | What I Use It For | Example |
|---|---|---|
| Stakeholder Interviews | Understanding organizational context, competing priorities, and success criteria | SOM: 6 C-suite/VP stakeholders across 4 countries. See SOM Case Study → |
| User Interviews | Deep qualitative understanding of workflows, mental models, and pain points | J+Track: 8 interviews across 4 countries (5+ hours); Paid Media: 26+ interviews across 7 countries |
| Contextual Inquiry / Shadow Sessions | Observing real workflow in context | J+Bidding: shadow sessions revealing that most practitioners lacked the technical skills to use bid multipliers at all |
| Ecosystem Mapping | Documenting the full landscape of tools, data flows, and organizational dependencies | Ministry: multi-sheet audit of 26+ agency sub-sites; J+Lake: mapping 22 connectors, 8 products, 20,000+ daily jobs |
| Competitive Benchmarking | Positioning the product within its market and identifying design patterns | Ministry: 100-slide benchmark analyzing 50+ reference websites; LUX Thesis: 5-competitor analysis with market share data |
| Analytics Review | Establishing quantitative baselines before research | J+Track: Mixpanel behavioral analytics revealing 16% session abandonment and 45.93% export completion; Marigold Loyalty: Amplitude data showing 81% CSR page concentration |
Generative Phase
Methods I use to generate and validate design directions with users and stakeholders.
| Method | What I Use It For | Example |
|---|---|---|
| Co-Design Workshops | Collaborative ideation with cross-functional participants | Ministry: 9 workshops over 3 weeks with 100+ participants from 26 agencies. See Ministry Case Study → |
| Card Sorting | Understanding how users categorize information | J+Report: 5 participants sorting 65 data connectors into categories over 75 minutes |
| Tree Testing | Validating information architecture | Ministry: cross-agency tree testing to validate navigation across organizational boundaries |
| Journey Mapping | Visualizing end-to-end user experience with emotional pain points | J+Bidding: 5-stage user journey with emotional mapping; Marigold Loyalty: multi-product journey maps |
| Persona Development | Creating evidence-based user archetypes | Across all professional roles -- Ministry (3 personas), Jellyfish (2-4 per product), Marigold (3+ per product) |
| Brainstorming & Prioritization | Structured ideation with evaluation frameworks | Crazy 8s, How/Now/Wow, Five Whys, 2x2 Matrix -- all documented as reusable workshop templates |
Evaluative Phase
Methods I use to validate designs against real user behavior.
| Method | What I Use It For | Example |
|---|---|---|
| Usability Testing | Task-based testing with real users | Sailthru Folders: 2 rounds, 31 participants, SUS 90 (best-in-class). See Marigold Case Study → |
| Heuristic Evaluation | Expert assessment against established criteria | J+Track: 50-element Bastien & Scapin audit. See J+Track Case Study → |
| SUS (System Usability Scale) | Standardized usability benchmarking | Applied across every professional role: J+Track (66.6), J+IQ (83.9), Sailthru Folders (90); J+Lake used a 10-point satisfaction scale (8.9/10) |
| A/B Testing | Comparing design alternatives with quantified preferences | SOM: radar vs. scatter charts (80% preferred radar), bar vs. stacked bar (80% preferred stacked) |
| WCAG / RGAA Assessment | Accessibility compliance testing | J+Track: 601 findings including 69 contrast errors; Ministry: RGAA compliance; Marigold: WAVE scans across products |
| Task Completion & Time-on-Task | Measuring efficiency and effectiveness | Sailthru: task success from 86.3% to 96.9%; J+Lake: 100% task completion |
| SEQ (Single Ease Question) | Per-task difficulty assessment | Used in J+Report, SOM, and Paid Media testing protocols |
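The SUS figures cited above (66.6, 83.9, 90) come from the standard scoring formula: odd-numbered (positively worded) items contribute their response minus 1, even-numbered (negatively worded) items contribute 5 minus their response, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch in Python (function name and sample responses are illustrative, not from any study above):

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from one
    participant's ten 1-5 Likert responses.

    Odd-numbered items (positively worded) contribute (response - 1);
    even-numbered items (negatively worded) contribute (5 - response).
    The summed contributions (0-40) are scaled by 2.5.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd-numbered item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Example: a fairly positive participant
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```

Per-participant scores are then averaged across the sample; 68 is the commonly cited benchmark average, which is why a 90 reads as best-in-class.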
Continuous Phase
Methods I use to create ongoing feedback channels, not just project-level research.
| Method | What I Use It For | Example |
|---|---|---|
| Feedback Loop Systems | Persistent bridges between customer-facing teams and design decisions | Marigold: Slack channel with auto-logging for Solution Consultants; weekly rotating meetings with Training teams across 6 product areas |
| Research Intake Forms | Standardized process for requesting research aligned to business goals | Marigold: intake form linking every request to strategic value pillars |
| Research Repository | Searchable, accessible insights archive | Marigold: Glean.ly research repository with tagged, searchable findings |
| Quarterly Newsletter | Proactive distribution of research insights to stakeholders | Marigold: research newsletter distributed to product leadership |
| Success Criteria Frameworks | Quantified thresholds for launch readiness | Liveclicker: defined task completion ≥70%, time on task 10-20% faster, zero critical blockers, interaction parity ≥60%. See Marigold Case Study → |
Process Infrastructure -- My Differentiator
Most designers deliver research findings and design files. I deliver that, plus the organizational systems that make research-informed design sustainable. This is the work I am most proud of, and it is what I believe distinguishes a Staff-level contributor from a Senior one.
Research Intake Systems
A standardized intake form that links every research request to business goals. The one I built at Marigold explicitly connects research to the company's strategic value pillars, embeds stakeholder alignment upfront, and routes requests to the appropriate methodology. It bridges PMs, POs, VPs, Engineering, and external clients -- ensuring that research is not something that happens in a design silo.
See how this works in practice
Design Brief Templates
Before design work begins, I align cross-functional teams on goals, constraints, user needs, and success metrics. The brief template I developed standardizes this alignment -- it is the organizational handshake between research, design, product, and engineering that prevents the most common source of project failure: misaligned expectations.
See the templates and guides
Methodology Guides
I publish structured guides that enable designers to conduct research independently. At Marigold, this took the form of a numbered document series: a 14-page UX Research Fundamentals guide covering when and how to use each method, and a 13-page Surveys guide with question design principles, distribution strategies, and analysis techniques. These are not just documentation -- they are organizational education.
See the full collection
Research Portals and Repositories
At both Jellyfish and Marigold, I built centralized research knowledge hubs -- Google Sites portals with Glean.ly repositories -- where anyone in the organization can access research findings, templates, and methodology guidance. At Marigold, this included a Research Kanban Board for workflow visibility and a quarterly newsletter distributing insights proactively.
See the Marigold infrastructure
Career Development Frameworks
At Marigold, I built a 69-skill career assessment matrix covering 12 categories with a leveling framework from Junior Designer through Design Director -- complete with expectations, responsibilities, behaviors, performance signals, and growth pathways. This is people-infrastructure typically owned by a Head of Design. I built it because my team needed it and it did not exist.
See the leadership case study
Feedback Loop Systems
At Marigold, I created persistent, scalable bridges between customer-facing teams and design decisions: a dedicated Slack channel with auto-logging for Solution Consultants and weekly rotating meetings with Training teams across 6 product areas. These are not project-level collaborations -- they are organizational infrastructure that ensures user pain points reach the people who can solve them.
See the feedback loop systems
Facilitation Approach
I design and facilitate workshops at scales ranging from 5-person card sorting sessions to 100+ participant co-design workshops across government agencies.
Structure creates freedom.
Every workshop has a clear objective, a defined methodology, and a documented synthesis process. I use progressive workshop design -- starting with divergent activities (experience mapping, brainstorming) and moving toward convergent activities (prioritization, tree testing, UI creation). This is how I ran the Ministry of Defense workshops: three waves across three weeks, each building on the outputs of the previous one.
Cross-functional composition is intentional.
The most valuable workshop outcomes emerge when you bring together people who do not normally collaborate. At the Ministry, I facilitated sessions that included military officers, civilian administrators, cybersecurity experts, and students -- participants from 9+ agencies in the same room. At Jellyfish, I conducted workshops across 4 countries and 5 seniority levels. The diversity of perspective is not a complication to manage; it is the entire point.
Workshops produce artifacts, not just conversations.
Every session I facilitate produces documented, synthesizable outputs. At the Ministry, I created standardized RETEX (post-workshop synthesis) documents. At Jellyfish, card sorting workshops produced validated information architectures. At Marigold, workshops produced feature prioritization matrices and design direction recommendations.
Tools & Technologies
I choose tools based on what serves the work, not what is trending.
Design & Prototyping
- Figma (primary -- component libraries, prototyping, design systems)
- FigJam (workshop facilitation, research synthesis)
- Framer (portfolio and marketing pages)
Research & Testing
- UserTesting / Maze / Optimal Workshop
- Glean.ly (research repository)
- Google Sites (research portal hosting)
- SUS, SEQ, UMUX-lite, AttrakDiff
Analytics & Data
- Mixpanel (behavioral analytics)
- Amplitude (product analytics)
- WAVE (accessibility scanning)
Collaboration & Documentation
- Miro / FigJam (workshop facilitation)
- Notion / Confluence (documentation)
- Jira / Linear (product management)
- Slack (feedback loop systems)
Design Systems
- Figma component libraries (built at Jellyfish and Marigold)
- DSFR -- French State Design System (Ministry of Defense)
European UX Perspective
My Paris education and French government experience gave me methodological tools that most US-based designers do not have -- and they consistently prove valuable.
Bastien & Scapin Ergonomic Criteria
An 8-criteria heuristic evaluation framework developed at INRIA (France's national research institute for digital science and technology). I applied it to J+Track as a 50-element audit, producing severity-rated findings that were more granular and actionable than a standard Nielsen heuristic evaluation. This is a distinctive methodological tool -- one that gives me a systematic lens most American industry designers have never encountered.
See Bastien & Scapin in action
RGAA Accessibility Standards
France's accessibility standard (Référentiel Général d'Amélioration de l'Accessibilité), which implements WCAG guidelines within the French legal context. My Ministry of Defense work required RGAA compliance -- giving me practical experience with accessibility regulations beyond WCAG alone. This translates directly to any regulated-environment design work.
See Ministry case study
GDPR-Aware Research
Conducting user research under GDPR requires a different level of rigor around consent, data handling, and participant rights than most US-based research. My bilingual research protocols at Jellyfish (English/French) and my thesis work analyzing dark patterns through a GDPR regulatory lens mean I design research practices with privacy-by-default, not as an afterthought.
See the Responsible Tech thread
Bilingual Facilitation
I facilitate workshops, conduct interviews, and present findings natively in both English and French. At Jellyfish, I designed bilingual interview guides for cross-country research. At the Ministry, all work was conducted in French within a formal institutional context. This is not just a language skill -- it is the ability to navigate cultural expectations around hierarchy, directness, and collaboration that differ significantly between American and European professional environments.
See the Ministry case study