Developer Experience Team Topologies

Growth and scale STARs: what did I improve? I've focused on:

Supporting a product mindset, including a user focus, amongst platform efforts
Building a culture of trust where all stakeholders feel heard, supported, and involved in improving their situation
Gathering and reporting data that is reliable and actionable
Preventing burnout
Encouraging metrics tied to business and stakeholder outcomes
Measuring the business impact of platform efforts
Collaborating with rapidly shifting teams and roles

In both cases, the skills are the same, though the approaches differ: the ability to help others feel heard, to define and measure productivity, and to tie the pieces together in a way that helps engineers feel heard and part of the effort to improve their experience and productivity.

Topology options as the organization scales (see the staffing sketch after this list):

0 platform teams: pt (part-time) person in each product team, plus a pt working group across product teams; 1 person for all techs/pipelines, or .5 people per platform; teams diffuse their own improvements
1 platform team: a lead responsible for DX enablement plus n DX persona people, with n slices covered; 1 person for all techs/pipelines, or .5 people per platform; teams specialize; pt person per persona (including 1 primary DX person)
n platform teams: 1 DX enablement team plus n DX persona teams (1 per persona, each with slice members); pt person per platform
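To make the staffing arithmetic concrete, here is a minimal sketch. It assumes the ".5 people per platform" figure is a rule of thumb and that you take whichever option yields more coverage; that choice, the function name, and the example counts are my assumptions, since the list above presents the two figures as alternatives.

```python
def platform_staffing(num_platforms: int, per_platform: float = 0.5) -> float:
    """Estimate platform-upkeep headcount: one generalist covering all
    techs/pipelines, or a fraction of a person per platform, whichever
    is larger (an assumption; the post lists these as alternatives)."""
    return max(1.0, per_platform * num_platforms)


print(platform_staffing(1))  # 1.0 -> one person covers all techs/pipelines
print(platform_staffing(6))  # 3.0 -> .5 people per platform dominates
```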

DX Enablement Responsibilities

lagging metrics:
median of their platform teams' outcomes (e.g., each team's contribution to Core 4 factors)
experiment management tool satisfaction
leadership report/dashboard satisfaction (insight understandability, actionability, value)
experiment rate (mean, weighted by persona outcomes value*; see the sketch after this list)
experiment success rate (mean, weighted by persona outcomes value*)
experiment adoption time (e.g., median time for a team's experiments to reach 99% adoption)
team familiarity with how their actions connect to outcomes (builds trust)
team metrics satisfaction
survey data quality (accuracy; internal/external/construct validity; completeness)
telemetry data quality
ease for teams to choose metrics
maintaining the metrics inputs-to-outcomes model

* weighted by historical persona outcomes value
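A minimal sketch of two of these definitions, assuming hypothetical per-team data; none of the numbers or names below come from a real system.

```python
from statistics import median

# Hypothetical per-team inputs; the weights stand in for "historical
# persona outcomes value" from the footnote above.
team_experiment_rates = [1.5, 0.75, 2.0]   # experiments per quarter
persona_outcome_values = [3.0, 1.0, 2.0]   # value weights, one per team


def weighted_mean(values, weights):
    """Mean of `values`, weighted by persona outcomes value."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)


experiment_rate = weighted_mean(team_experiment_rates, persona_outcome_values)

# Adoption time: median days for a team's experiments to reach 99% adoption.
days_to_99_pct_adoption = [12, 21, 30, 45]  # hypothetical
adoption_time = median(days_to_99_pct_adoption)
```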

improvement/input metrics:
data quality
surveys
leadership reporting: actionability, satisfaction

responsibilities:
improving experimentation capacity and quality
improving data quality capacity and quality
maintaining shared persona team tools and processes (e.g., surveys, DX telemetry services)
advising teams on metrics choices
maintaining metrics recommendations
maintaining the top-level input-to-outcome model (sketched after this list)
reporting at various levels
coordinating and delivering leadership reports
listening to DX Persona teams
creating the top-level performance dashboard
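One way to picture the input-to-outcome model: a minimal sketch, assuming a plain adjacency map. The metric names are examples pulled from this post, and `connects_to_outcomes` is a hypothetical helper, not an established tool.

```python
# Hypothetical adjacency map: each input metric lists the metrics or
# outcomes it is believed to move. Names are examples from this post.
MODEL = {
    "survey data quality": ["team metrics satisfaction"],
    "experiment adoption time": ["experiment success rate"],
    "experiment success rate": ["product teams' outcomes contribution"],
}

OUTCOMES = {"team metrics satisfaction", "product teams' outcomes contribution"}


def connects_to_outcomes(metric, seen=None):
    """True if a targeted input metric has some path to a top-level outcome."""
    if metric in OUTCOMES:
        return True
    seen = seen or set()
    if metric in seen:
        return False
    seen.add(metric)
    return any(connects_to_outcomes(nxt, seen) for nxt in MODEL.get(metric, []))


# Useful when reviewing experiment quality: does this experiment's
# targeted input actually connect to an outcome?
assert connects_to_outcomes("experiment adoption time")
```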

Platform (DX Persona Focus) Responsibilities

lagging metrics:
team:
median of their product teams' outcomes contribution (e.g., each team's contribution to Core 4 factors)
trend in the above
individual:
p50 teammate rating: trust (see the sketch after this list)
p50 teammate rating: outcomes contribution value
p50 teammate rating trend (in trust and value)
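A minimal sketch of the individual metrics, assuming peer ratings on a 1-5 scale; all data below is hypothetical.

```python
from statistics import median

# Hypothetical peer ratings for one teammate on a 1-5 scale.
trust_ratings = [4, 5, 3, 4, 5]
value_ratings = [3, 4, 4, 2, 5]

p50_trust = median(trust_ratings)   # "p50 teammate rating: trust"
p50_value = median(value_ratings)   # "p50 teammate rating: outcomes contribution value"

# Trend: compare this period's p50 against the prior period's.
prior_p50_trust = 3.5               # hypothetical previous period
trust_trend = p50_trust - prior_p50_trust
```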

improvement/input metrics: adapted based on which lagging metrics we want to improve

experiment quality: ensuring their targeted inputs connect to outcomes

responsibilities:
listening to product teams (and other stakeholders, like PMs) and integrating feedback
prioritizing experiments and integrating feedback
experimenting, listening to feedback, reporting results, and integrating feedback
driving adoption of successful experiments
contributing to leadership reports

Written on July 15, 2025