Design at Scale is People!

I believe the single most interesting challenge and opportunity for design as a practice and function is to figure out how to operate at scale. Upon reflection, our book was essentially about setting up your organization to deliver design at scale.

Much of the “design at scale” discussion is dominated by design systems (most of which are just code-enabled style guides), because they’re an easy thing to talk about and feed the corporate desire for increased productivity. However, starting with systems runs exactly contrary to the true value that design brings to companies: a humanistic and creative approach to framing and solving problems. In other words, the focus on systems could undercut design’s potential within organizations.

I am grateful to Adaptive Path’s UX Week for providing me a platform to share my thoughts and experience with scaling design, in a talk titled “Design at Scale is People!” I hope you enjoy it, and I’d love to hear any feedback you have about it.

Peter Merholz // Design At Scale is People! // UX Week 2018 from Adaptive Path on Vimeo.

 

Conduct better designer portfolio reviews with this tool

TL;DR: Here’s a Portfolio Review Set-Up and Assessment Tool for you to use.


A couple of years ago I contracted with Capital One to help bring some order to their rapidly expanding design organization. I focused on recruiting and hiring practices: when you’ve got ~40 open reqs, you want those processes to be as efficient and effective as possible.

Key to any design recruiting process is the portfolio review, where a designer walks people through a selection of their work, and the thinking and activity that went into creating it.

Every interview loop at Capital One includes two behavioral interviews (BIs), which have a formal, repeatable structure and require training to administer. It turned out that the topics of one of the behavioral interviews were similar to what you’d get out of a portfolio review, and in an effort to reduce the time on-site (which could run to 7 or 8 hours), I had hoped to replace a BI with the portfolio review.

I worked with Capital One’s HR team on this, and learned that, to be a worthwhile tool, it needed rigor and repeatability. This was key to removing bias from the process, focusing it on skills and experience, not personality and camaraderie.

I had never conducted ‘rigorous’ portfolio reviews. I’d always just had a candidate show their work, asked some questions to clarify their role, and that was that. Such a loose approach was not going to fly here.

The goal was to make something that ensured fairness in the process, and that, regardless of who was providing feedback, the assessment would be the same.

I asked the folks at Capital One if I could share the tool, and they said yes. So here you go:

Portfolio Review – Set Up and Assessment Tool

The idea is to not have it just be a free-for-all. The candidate preparation helps candidates know what to expect and how to shape their presentation. The suggested prompts give the interview team a guide for how to probe in a productive fashion.

And key to making this work is the assessment tool. When I first drafted it, for each skill (visual design, interaction design, communication, etc.) you could score someone 1 to 5. However, there was no guide as to what a “1” or a “5” would mean, and so it was too open to interpretation.

So I worked with craft leaders throughout Capital One to come up with language for these skills, to provide clear guidance in scoring candidates. That, for me, was the key ‘innovation’ in this approach.
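To make the anchored-scoring idea concrete, here is a minimal, hypothetical sketch in Python. The skills, anchor wording, and scores below are invented for illustration and are not the actual Capital One rubric; it just shows the shape of the thing: every numeric score maps to an agreed-upon description, and scores from multiple interviewers get rolled up per skill.

```python
# Hypothetical anchored-scoring rubric. Skill names, anchor descriptions,
# and scores are illustrative only, not the real tool.
from statistics import mean

RUBRIC = {
    "interaction design": {
        1: "Flows are unclear; candidate cannot explain design decisions",
        3: "Coherent flows; explains trade-offs when prompted",
        5: "Elegant flows across complex cases; articulates trade-offs unprompted",
    },
    "communication": {
        1: "Cannot walk through the work without heavy prompting",
        3: "Tells a clear story of problem, process, and outcome",
        5: "Tailors the story to the audience and handles probing questions with ease",
    },
}

def summarize(scores_by_interviewer):
    """Average each skill's score across all interviewers who rated it."""
    skills = {skill for scores in scores_by_interviewer.values() for skill in scores}
    return {
        skill: round(mean(s[skill] for s in scores_by_interviewer.values() if skill in s), 1)
        for skill in skills
    }

if __name__ == "__main__":
    scores = {
        "interviewer_a": {"interaction design": 4, "communication": 3},
        "interviewer_b": {"interaction design": 3, "communication": 4},
    }
    print(summarize(scores))  # {'interaction design': 3.5, 'communication': 3.5}
```

The point is less the code than the structure: because each score is tied to a written anchor, two interviewers watching the same presentation should land on similar numbers, which is what makes the assessment comparable across candidates and reviewers.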

I share it with the hopes that this helps make designer interview processes better everywhere. Feel free to copy it and make it your own!

If I were to write a second edition of Org Design for Design Orgs…

It’s been about two years since Org Design for Design Orgs came out. Having worked with it, taught it, spoken with many design leaders, and seen many design orgs, I have a list of things I know I’d want to address if we wrote a second edition (note: no plans to do so).

Make dual-track career growth more explicit. In the book, we presented a single levels framework, with the idea that it could support career growth either as a manager or as an individual contributor. In retrospect, those paths are different enough to warrant calling out separately, as I have in the levels framework we developed at Snagajob. I’m also intrigued by the work athenahealth did on establishing “Dual Track Leadership.”

In the “evolution of design organizations,” go beyond stage 5, to at least stage 7. We charted 5 stages of organizational evolution, from the “initial pair” all the way to “distributed leadership,” where there are about 70–80 people on the team. We yadda-yadda’d beyond that, saying, “just keep doing this, but more.” Since writing the book, an increasing number of design teams have grown beyond 100, and it’s clear that there are patterns in that development. It’s worth addressing what comes into place when the team hits 150 (stage 6), as that’s when Design Operations / Design Management becomes quite robust, and again when it gets to about 250 (stage 7), when it can support deeper craft leadership (“principal” or “distinguished” designers) and when it’s time to consider whether it should remain a single centralized org.

Dig into the crucial role of the Team Lead. In chapter 4, we dedicate a page to the role of the Team Lead, including the line, “the best team leads are a combination of coach, diplomat, and salesman.” That line became the seed for my talk on design leadership, and the process of writing that talk, and sharing these ideas at conferences and inside companies, has shown me that there’s much more to share about this crucial role. In fact, I consider it the most important role in a design organization, more than any VP or Director.  (That will be the subject of a future post on this site).

Go deeper on Design Operations / Design Management as a role and practice. Though the book has been called “the bible of design operations,” we don’t really tackle Design Operations / Design Management head on in a thorough way, particularly around matters of Program Management (budgeting, scheduling, coordinating efforts), Education (internal training and skills building), and Measurement (tools, systems, and approaches for understanding the impact of the work).

Do a better job distinguishing between Product and Communication Design. In our utopian desire to merge all design activities under the rubric of “service design,” and to have product designers and communication designers working side by side on design teams, we neglected to delve into the very real differences between delivering product design and communication design. They operate on different cadences, work with different parts of the organization, and, much of the time, simply don’t interact. That said, there is real value in having product and communication design on the same team (it was essential when we rebranded Snag). This is still a point of contention for many design orgs, and so warrants more honest, pointed discussion rather than our hand-waving of “it’ll be great.”

Soft power as a tool for distributed teams. However much I believe a centralized design organization to be the right way to go for, like, 95% of design teams, the reality is that many function in some kind of distributed, federated, siloed fashion. For those organizations, I’ve been applying the notion of “soft power” as a tool to get these distinct design teams aligned with a common goal, purpose, and set of practices.

Even more about recruiting and hiring: the Portfolio Assessment Tool and design exercises. Even though it’s the longest chapter in the book, it turns out there’s still more to say about recruiting and hiring practices for design. The community is still at odds about the value of design exercises (though we’re not). Also, since writing the book, I’ve had the opportunity to craft a Portfolio Assessment Tool that brings a needed level of rigor to the practice: a clearer set of prompts to guide the discussion, and a guided worksheet that helps people assess a portfolio in a way that removes bias and focuses on the content of the work.

What do you think?

So, these are the ideas I’ve had. And I’m sure Kristin has a bunch of things she’s considering. And I am wondering: for those who’ve read the book, what more could we address that would help you?

 

 

Read about athenahealth’s smart, pragmatic approach to scaling design within an agile product organization

For the kind of nerds who dig this website, I suggest visiting athenahealth Experience Design’s recently updated Medium site, with 5 articles related to design org matters.

Of particular note are:

  • “Embedding Product Design in a Large Agile Organization,” which addresses the challenge of having ~85 designers work across >200 scrum teams while maintaining quality and not losing their minds
  • “How we approach DesignOps at athenahealth,” which covers the different functions DesignOps fills (measurement, research at scale, and design systems) in an attempt to realize efficiencies in an organization where the ratios conspire against you (that whole 85 designers across over 200 scrum teams thing).

There’s good stuff here, and it’s all the better because it’s the real deal (actual application of different org models and structures), recognizes initial shortcomings (they iterated on their dual-track design leadership ladder), and offers details that can help others figure out how to begin considering these approaches on their teams.

 

New Team (and Role) for Big Design Orgs: Design Management (and its head)

(This post was developed with input from Kristin. Like how we wrote the whole book!)

(Also, this post is very much about an idea that is a Work in Progress. I’d love feedback to help sharpen it.)

Design organizations, particularly ones that grow beyond 100 or so people (and definitely beyond 150), find themselves in uncharted territory. Supporting a team at that scale requires establishing a set of roles and practices that are distinct from the practice of design and that serve to enable the health and effectiveness of the design organization. Looking around, I see new roles and sub-teams, such as Design Operations, Design Education, Design Program Management, and People Development. “Design Operations” is emerging as the umbrella for all of this, but I think that’s a mistake, as the word “operations” suggests something more strictly mechanical than what we’re talking about.

What I see is an opportunity for a new sub-org within design teams, Design Management, led by a new role, the Head of Design Management. (Let it be known that Kristin has been arguing for the role/org of “Design Management” for years now, and until recently I’ve fought her on it. I’m evolving.) This role serves as a near-peer to the Head of Design (near because they still report to them), and it addresses all the managerial and operational challenges that a design organization faces at scale, while the Head of Design and their other reports (Design Directors, etc.) focus on design leadership and delivering high-quality work.

Here’s how I picture the scope of the organization:

[Diagram: the scope of the Design Management organization, spanning People, Practice, and Program]

It begins with People, under which there are the three Rs of Recruiting, Retention, and Reputation (I’ve taken this directly from Kristin). This is what tracks most closely with traditional HR and people-management concerns: recruiting and hiring, job descriptions, performance reviews and promotions, developing a “talent brand.” The ultimate objective is Make Designers Happier, which shows up in such measures as speed of hiring (from posting a job to that person’s first day), internal referral rates, internal surveys of employee satisfaction, and retention rates.

Then you have Practice, where the work is to build the skills and capabilities of the design team. In support of an objective of Make Designers Better is a suite of activities dedicated to the content of the work: professional development and skills-building, developing content, training and education, codifying process and methodology, and hosting internal events. I’m stuck on how to measure improvement here; much of the impact will show up in the employee happiness and retention numbers.

Finally there’s Program, which is also what many think of when talking about “design operations,” and the idea here is to Make Designers More Effective. Program management helps the design team with planning and prioritization (including forecasting headcount needs), measuring effectiveness, standardizing the tools and services the design team uses, wrangling facilities to ensure the best working environments, evolving corporate policies that may obstruct the best design practice (particularly around user research), and owning the contractual relationships with external staffing, whether agencies or individual contractors. For measurement, I’ve used internal surveys of cross-functional teams to assess their satisfaction working with the Design team, but I think there should be more. An effective organization is one where the Design team is really humming along, feeling productive, seeing their work in the world. “Amount of work shipped” may be an indicator, though I’m wary of quantity measures.

Originally I had “Culture” as a component of Design Management, thinking primarily on how culture is articulated, codified, and transmitted throughout the design organization. Upon further reflection, I’ve set it apart as a joint responsibility with design leadership.

Roles in this Org

It’s not until a design organization gets to be about 100 that you need to consider a distinct Design Management sub-org. Up until then, the People and Practice activities are the responsibility of practicing design leadership, and there should be a team of Design Program Managers paired with these design leaders (typically at Director-level, maybe at Manager) who handle all the stuff under Program.

Once you get north of 100, and definitely beyond 150, economies of scale set in and it makes sense to have people dedicated to People and Practice, particularly if the design team is continuing on an aggressive growth trajectory. For the former, you may have a Head of People Development (such as Laura Kirkwood’s role on Capital One’s very large design team), and for the latter, a Head of Design Education (my pal Billie Mandel is in this role at Atlassian). And as these teams continue to scale, these heads, in turn, may need their own small teams to keep things going.

Where is the Design System?

Conversations about design operations inevitably turn to design systems, which are not explicitly called out here. I consider a company’s design system a “tool and service”, and thus partly a responsibility of the Program team. From what I’ve seen and heard, the most successful design systems (particularly in large companies) are built and run by fully staffed cross-functional product teams, such as the one that maintains Polaris at Shopify.

What Do You Think? What Do You Do?

I’m keen on hearing about other models for addressing the organizational, managerial, and operational concerns of a design team. Please let us know in the comments!

Design Exercises are a Bad Interviewing Practice

Recruiting and hiring is among the most difficult and time-consuming aspects of a design manager’s job, and wherever design managers gather and share experiences, the subject of design exercises inevitably comes up. We wrote about it briefly in our book:

Design Tests?

A topic of some controversy within product design circles is whether candidate interviews should involve some kind of design test or challenge akin to what happens in engineering interviews. Our firm, resolute response to this is “no.” Design tests set up an unhealthy power dynamic in the interview environment, when instead you should be fostering collegiality. The context in which the challenge is given (typically narrowly time-boxed and with only a little information and little support) is wholly artificial—and so whether a candidate succeeds or fails is not a meaningful indicator of actual practice. There is nothing you will find out in such a test that you couldn’t better learn through probing the candidate about their portfolio.

I had hoped that this would be sufficient and never need to be discussed again. Judging by lengthy multi-party threads on Twitter, I was wrong. Forthwith, a lengthier set of reasons for why design exercises are bad interview practice.

Design Firms Don’t Do Them

At Adaptive Path, we hired world-class designers without ever having them conduct a challenge. Same thing back at the first design firm I worked for, Studio Archetype, which was a standard-bearer for early digital design. These are companies whose sole purpose was the delivery of superlative design, and where the value was the talent of the people on staff. How were we able to assess their abilities? As alluded to in the passage above, portfolio reviews, including discussions of how they tackled design challenges.

They are a waste of time

If there’s nothing you can get from a design exercise that you can’t get from a portfolio review and a well-structured, thoughtful interview, then it follows that they are a waste of time. I call this out because recruiting and hiring is already monumentally time-consuming, and anything that needlessly takes up time should be excised from the process.

Design Is Not Engineering

I can’t say for certain, as I haven’t researched where design exercises emerged as an interviewing practice (it’s not from traditional design practice), but my guess is that they came about in technology companies where software engineering was the dominant discipline. Design had to overcome its perception as squishy, soft, “make it pretty” work by demonstrating rigor, relying on data, and generally making the practice of design operate more like engineering.

And engineering hiring interviews involve technical exercises (coding challenges and the like), so shouldn’t design hiring interviews?

The thing is, coding challenges are waaaaay more straightforward than design exercises. There are demonstrably better and worse ways to solve engineering problems, and in most coding exercises the outcome is predetermined; it’s a matter of how you would realize it.

The same is not at all true for design. You’re not applying process to realize an already-known outcome; you’re taking in a massive amount of input in order to navigate your way through the problem space. Unlike engineers, you need to consider business context, user needs, goals, and capabilities, brand concerns, technical constraints, channels of use, and god knows what else. And good designers know that there are many potential solutions to a problem, and that it takes testing and iteration to get to anything like a good solution.

Design Exercises Bias Towards Facile Problem-Solving

Designers don’t all solve problems the same way. Some take in a lot of data, go off into a cave, noodle on it for a while, and come out with something great. Others iterate and prototype almost from the get-go, uncovering solutions through refinement. Some require thinking out loud and deep collaboration to do their best work. A great design organization has people with a variety of problem-solving modes and approaches, which enables it to better tackle a wide array of challenges.

The artificial constraints of design exercises (typically time-limited; a problem the candidate isn’t already familiar with, but the interviewers are; performing under the scrutiny of others) bias toward a narrow range of problem solving.

A design exercise, by its very nature, is inclined towards facile solutions, and so biases teams towards facile designers. There’s not really any room for grokking depth.

Design Exercises Exacerbate An Already Problematic Power Dynamic

Design exercises ask candidates to perform on demand. In the context of a job interview, this only heightens the fraught power dynamic between an employer and prospective candidate. Even in markets where talent is in high demand, job interviews place candidates in a vulnerable situation. Being expected to perform on demand only adds to the candidate’s stress and anxiety, and makes for a suboptimal candidate experience. This Twitter exchange between my friends Ryan and Jared touches on this…

[Embedded Twitter exchange between Ryan and Jared]

 

As Jared notes later in the thread, design exercises introduce cultural bias, too.

What about take-home exercises?

This is often the response to my ranty diatribes against design exercises. What if they’re take-home? Then people have all the time they need, and the pressure cooker of performing on demand goes away.

Beyond the obvious problem that remains at the root of all my issues with design exercises (for the people in the back: THEY ARE ARTIFICIAL CONSTRUCTS THAT DON’T REFLECT HOW DESIGN ACTUALLY HAPPENS), take-home exercises introduce new issues. Namely, you’re now asking this person to do unpaid work. Young people with savings (i.e., who don’t need the money) and free time will be able to put a lot more effort into take-home exercises than, say, a single parent whose at-home time is focused on their children, and who can only do “homework” after the kids are asleep, when they’re likely exhausted.

Recognizing this, some companies do offer to pay people for taking the time to do a take-home exercise (which can help defray costs like child care), and that’s better than not doing so, but even better…? No exercises. Because you don’t need them. Because they add nothing to the recruiting and hiring process that can’t be figured out through thoughtful, experienced-based interviews, a savvy portfolio review, and speaking with people the candidate has worked with.

Stop separating product and marketing design–how rebranding Snag showed me it’s all one design

On my personal site, http://peterme.com/, I’ve written a four-part series on how we rebranded Snagajob to Snag (Parts 1, 2, 3, 4). In the context of “org design for design orgs,” there’s a key learning worth sharing.

Some context will help. Traditionally, when building internal design capabilities, companies distinguished between design for marketing, which reported up through a marketing executive, and design for the product, which reported up through product management or engineering. This may have made sense in a pre-Web era, where what was developed, and how it was packaged, advertised, and sold, were vastly different activities. Product design involved industrial design, hardware design, and user interface design. Marketing design involved the design of packaging, sales support material, and advertising across a variety of media. These activities were often outsourced to different kinds of firms that specialized in one or the other.

In this post-Web, post-mobile era, where products are becoming services, and the media and modes of use are the same as the media and modes of marketing, these distinctions become blurred. Old Webheads like me saw this repeatedly in the early days of digital transformation. I worked with a number of traditional companies that had their marketing team in charge of the website, because they initially saw it as a platform for acquiring customers, only to then realize that the website was also a way for existing customers to conduct business (think online banking). These companies would have wholly different teams working on the “public” and “private” parts of the website, which would lead to vastly different designs and experiences pre- and post-login.

As digital services evolve, these distinctions become meaningless. Is the home page of Snag a marketing page or a product page? The answer is yes. But if you have two different teams, reporting up to different executives, working on it, their distinct mandates (“drive acquisition!” “drive engagement!”) will conflict, and will cause consternation.

This is why, in the book, we argue for taking a service design mindset, one that orients on the customer journey, and that recognizes that “marketing” and “product” are simply way stations on that customer journey, and what matters more is orchestrating that entire experience.

Getting back to brand

In rebranding Snag, particularly in developing a new brand identity, I had marketing and product design leaders involved throughout. And I couldn’t have imagined it any other way – while marketing is responsible for how we communicate about our business, the product designers are responsible for the day-to-day interactions with our services, and those interactions define how people experience our brand. It might seem obvious to say, but I feel obliged to say it, as I know of rebrands that were wholly run out of marketing, where the product development teams were at best a stakeholder, and typically simply a recipient, of a style guide built without their direct input.

Having marketing and product people work closely together to build our new style guide and design system made the brand identity work much stronger. And this works best when those marketing and product designers are on the same team, with the same boss, and have developed great working relationships over time, and aren’t just thrown together for the first time on something as hairy and arduous as a rebrand.

 

 

UX Research – A dedicated role, or a skill everyone develops? The answer: Yes.

 

Over the holiday break, a Twitter discussion emerged about the role of design research. Jared Spool weighed in, and it spawned a spider’s web of @s and quoted tweets, with folks debating the merits of a dedicated UX research role. Forthwith, my take.

User Research is a skill

I agree that user research is a design skill – it’s one of the 8 core skills we identified in Org Design for Design Orgs. At Adaptive Path, we had no dedicated user researchers – all designers conducted their own research, were expected to derive insights through analysis, and then defined solutions to the issues that arose. Very much what Jonathan Lupo describes in his tweet.

UX Researcher is a role

It should be noted, though, that Jonathan Lupo’s experience is based in design consulting. Like him, I would never have considered dedicated user researchers at Adaptive Path. Design consulting is project-based, and the research conducted is specific to that project, so the designers conduct the research, derive the insights, and drive to new solutions. In house, work is typically less about discrete projects and more about programs that flow. Also, research doesn’t need to be bound by the needs of a single team. Here’s how we wrote about the role in Org Design for Design Orgs:

In leading technical organizations, it is common that once they reach a certain scale, often around the time they have five or six designers, they bring on a dedicated User Experience (UX) Researcher to do everything from out-in-the-world field research to user testing of interfaces.

…This role seeks to understand the totality of the user’s experience, and the insights drawn from such research will inform work across marketing, sales, product, and customer care, as well as design.

The key responsibilities are generative and evaluative research. Generative research, typically field research such as in-home observations or diary studies, leads to insights for framing problems in new ways that stimulate the development of innovative solutions. Evaluative research tests the efficacy of designed solutions, through observing use and seeing where people have problems. Strong organizational skills and keen attention to detail are required, as much of UX research is operational management: screening and recruiting participants; scheduling them; note-taking and other data collection; and analysis and organization of that data.

This role is also commonly called “User Researcher.” We prefer “User Experience Researcher,” as it sounds less clinical and vague, and highlights what about the user is the subject of study: their experience with the service.

Developing a dedicated user experience research function does not absolve others from taking part in research. Researchers who work on their own, delivering reports filled with findings in hopes that others take heed, will find their impact blunted. Instead, the UX research team should remain small, highly leveraged, and supportive of everyone else’s ability to engage with users directly. For larger, more robust studies, involving travel or time-consuming observation, it might not make sense for marketing and product development staff to take that much time away from their primary duties. In these cases, UX researchers will conduct the work. But within an iterative design and development context, most research efforts should be conducted by designers, product managers, and even engineers, with help from the UX research team.

At Snagajob, my design team has within it a UX and Market Research team, staffed with two dedicated researchers. Along with their responsibility of enabling product teams to conduct research, they also Go Deep on issues that cross not only product teams, but marketing and sales as well. Last year we conducted a two-week diary study, an effort that’s too big for a product team (with its delivery expectations) to take on, and which has led to insights and the development of personas that span product and marketing. Later they spearheaded a two-week online community study around the subject of underemployment. They made sure to get marketing, design, and product management involved, but this kind of deep research, which has led to insights many teams are already taking advantage of, simply wouldn’t have happened without dedicated people.

Respecting the skill of user research

One reason the online community study wouldn’t have happened otherwise is simply that the bandwidth required to conduct such an activity is greater than most folks have. But another, and perhaps more important, reason is that it was the brainchild of our lead researcher. When posed the general research question of “how do we better understand underemployment?”, she reached into her toolkit and identified this method, which was new to me and our organization. Dedicated researchers hone, refine, and expand their craft just like any other practitioner. Designers, product managers, and engineers don’t have time to continually grow their user research skills alongside their other responsibilities, and will default to familiar practices. Dedicated researchers can try new things, and that exploration can identify methods that are better suited to answering certain questions.

Something I found ironic in the Twitter discussion of “user research as a skill” was the lack of respect for deepening the practice of that skill, seeing it as simply a phase in a designer’s process. User research can do way more than just help designers solve problems. Dedicated user research teams have an opportunity to deeply impact an entire organization’s awareness of its customers.

 

Coach, Diplomat, Advocate, Architect – My talk from Leading Design

In October, I gave a new talk titled Coach, Diplomat, Advocate, Architect, in which I dissect these four archetypes of the design leader and share the difficult news that, in order to fully succeed, a design leader must embrace all of them. It’s my first new material in a couple of years, and though I ended up having to speed through it at the end (I typically put too much stuff in a new talk), I think it came out well.

Coach, Diplomat, Advocate, and Architect: The Leveraged Design Leader: Peter Merholz, Leading Design 2017 from Clearleft on Vimeo.

Pretty much every talk was worth watching. Some personal favorites include Stuart Frisby sharing his experience growing design at Booking.com from 6 to 100 (and beyond), Kim Lenox’s frank sharing of how her personal growth allowed her to become a better leader, Ben Terrett’s funny and real grappling with being the bottleneck, and Cap Watkins’ confessional on the neuroses of the design VP.

Some videos to queue up for holiday viewing!