What the studio of the future is asking for
As the systems we design become increasingly complex and entangled, how do we effectively design for and around them?
We’re no longer just shaping interfaces or strategies; we're designing the relationships between people, platforms, policies and, increasingly, machines that can make decisions on their own.
Agentic AI. Multi-modal workflows. Identity ecosystems. Digital passports.
The work isn’t “about” those things, but they are the terrain now. Are our existing design methodologies sufficient for this new context, or must they too undergo a transformation?
We're likely to find, more and more, that technical capability alone is no longer enough. I keep coming back to this: it’s trust. In motion. Not just as a value but as a posture. A kind of deeper alignment between how we design, what we govern, and the consequences we’re accountable for. It began with a central question:
How can we validate the trustworthiness of a digital signal, whether it's a credential, a model output, or an automated decision?
From this, I've been shaping a process called Digital Trust Alignment. The goal is not to slow down delivery but to create clarity and coherence. Early pilots of this methodology are already underway with government teams working on complex challenges across identity, cybersecurity, AI, and data.
To effectively manage trust, we must first define the basic unit we are assessing. Project descriptions, user stories, OKRs, and PBIs are containers for work, but they often obscure the essential promise being made to a user.
We can cut through this complexity by focusing on the underlying Trust Signal - a clear, simple statement that defines what our work is actually asking people to trust.
Take examples of current work (e.g., user stories, technical tasks, OKRs) and ask a few core questions to help move beyond the jargon:
What is this, in simple terms?
Who does it ultimately serve?
What does it enable, or what risk does it (or should it) protect against?
Distil the answers into a clear, structured sentence describing the signal we are asking people to trust:
A [what it is] for [who it serves] to [what it enables or protects].
Later, this process prompts teams to ask critical questions about those Trust Signals:
Credibility: Is this signal or output verifiable and reliable?
Care: Was the data handled responsibly and ethically at every step?
Consequence: Does the level of trust we're asking the user to place in the system match the potential impact of its failure?
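As a sketch only, the Trust Signal and its three-C review can be captured in a lightweight structure. The sentence template and the three questions come from the method above; the class, field names, and example data here are illustrative assumptions, not a prescribed implementation:

```python
from dataclasses import dataclass, field

@dataclass
class TrustSignal:
    """One unit of trust: what a piece of work asks people to rely on."""
    what_it_is: str            # "What is this, in simple terms?"
    who_it_serves: str         # "Who does it ultimately serve?"
    enables_or_protects: str   # "What does it enable or protect against?"
    # Notes against the three review questions: credibility, care, consequence.
    checks: dict = field(default_factory=dict)

    def statement(self) -> str:
        # The template: "A [what it is] for [who it serves]
        # to [what it enables or protects]."
        return (f"A {self.what_it_is} for {self.who_it_serves} "
                f"to {self.enables_or_protects}.")

# Hypothetical example drawn from the identity/credential space:
signal = TrustSignal(
    what_it_is="verifiable credential wallet",
    who_it_serves="citizens renewing a licence online",
    enables_or_protects="prove identity without oversharing personal data",
    checks={"credibility": "signatures independently verifiable",
            "care": "data minimised at each step",
            "consequence": "failure blocks a renewal, not a livelihood"},
)
print(signal.statement())
```

Writing the signal down this literally is the point: if the sentence cannot be filled in, the team has not yet agreed what it is asking anyone to trust.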
This lays the foundation - a shared orientation across teams and capabilities that are deeply interconnected, but often operate in silos within complex environments and large organisations. Here's the rest of the scaffold:
1. Shared orientation
Clarify direction, posture, and what trust looks like across systems.
A shared narrative around emerging tech: why it matters, how it connects across strategy, and where it’s already showing up in motion.
A readiness-aware portfolio: trust posture helps surface what’s ripe, what needs support, and what’s misaligned with the system’s current state.
2. Field sensing & signal mapping
Listen into the system. Surface where trust is strained or drifting.
Governance that enables: establishing principles for data stewardship and innovation risk. It’s not about saying no - it’s about making “yes” safer and smarter.
Prototyping as sense-making: use trust signals and system maps to reveal strain - and where to begin experiments that test not just whether something works, but its relational fit.
3. Integration & activation
Turn insight into infrastructure. Make coherence operational.
A live operating rhythm: repeatable processes, platforms, and feedback loops that keep coherence active - not just documented.
Emerging tech as capacity, not project: embedded into the organisation’s sensing, deciding, and evolving - an agent of alignment, not hype or just automation.
A new capability model
While working with this new method, I’ve noticed something else: a need for a different kind of studio. Not a lab. Not a delivery team. Something more like a sensing organism. One that doesn’t just build fast - but builds relationally. One that can hold complexity without shutting down. One that listens at every level: technical, emotional, systemic.
This led us to map the key practices that the most adaptive, honest, and humane studios are already using into a capability model - the 27 Capabilities of the Future Studio - which includes, for example:
Systemic sensing: Methods for mapping and identifying second-order effects and unseen connections before they become breakdowns.
Cognitive state design: Shaping interactions that account for users' cognitive load and emotional states, moving beyond screen-based usability alone.
Resonance testing: Advanced qualitative techniques to feel and measure whether a service or intervention truly lands with its intended audience in an authentic way.
This is not a theoretical wishlist but an emerging pattern. As the systems we build become more multi-dimensional and consequential, the studios that shape them must become more conscious and coherent.
These observations point to an unavoidable conclusion: a fundamental paradigm shift in design is not just necessary, it's already underway. The current evolution in our methods is only the beginning. Our next imperative is to refine these emerging practices and validate them at scale.