May 30, 2024

Steve Portigal on what makes a research practice mature

Steve Portigal joined us for a Rally AMA on May 30 on what makes a research practice mature. If you missed our live event, or want to revisit the highlights, read our recap below. If you’d like to watch a recording of the full AMA, follow this link.

Who is Steve?

I’m an experienced User Researcher who helps organizations to build mature user research practices. Based outside of San Francisco, I’m the principal of Portigal Consulting, and have conducted research with thoracic surgeons, families eating breakfast, rock musicians, home-automation enthusiasts, credit-default swap traders, and real estate agents. My work has informed the development of professional audio gear, wine packaging, medical information systems, design systems, video-conferencing technology, and music streaming services.

I’ve written two books: Interviewing Users: How To Uncover Compelling Insights (now in a second edition) and Doorbells, Danger, and Dead Batteries: User Research War Stories and also host the Dollars to Donuts podcast.

👉 Grab yourself a discounted copy of Steve’s latest book Interviewing Users: How To Uncover Compelling Insights using code SteveAMA. The discount code is valid through July 6, 2024, and only on Rosenfeld Media.

How do you define a mature research practice?

We talk about being lifelong learners and always wanting to grow and change. One of the people I recently spoke with for my podcast characterized their company as an organism. I think that sort of anthropomorphization of companies is useful. Organizations change and evolve, and there's no static point at which we become "mature." Ideally, we're always progressing towards more maturity in various aspects of our lives and careers. Context is important; we need to recognize where we were and where we are now. Celebrating small steps and acknowledging tangible progress is key. Everything is changing, and it doesn't always move in a linear direction, but the goal is continuous growth and development.


I think the context for looking at an organization, part of an organization, or a team is to say, "Hey, we used to be here, and now we're here." If we're going to say "it depends," it depends on where we were at some time. It's nice to take those moments and look back and say, "Oh my goodness, look where we are." We change slowly and develop slowly. I think it's good to have those moments to say, "Wow, we wouldn't have been doing this kind of work a year ago. We wouldn't have had this person show up to our readout six months ago." Treat yourself well by acknowledging tangible specific small steps and remind yourself that we're an organism, we work in organisms, and we evolve.

In terms of what that evolution might look like, I recently had Daniel Escher, Director of UX and Research at Remitly on my podcast, Dollars to Donuts, and he said something I found interesting:

“When I joined the company, the biggest needs were understanding our customer audience and our potential audience. And so we did total market segmentation, and then developed a brand position. A lot of that work in marketing is, it's not like it's complete, but the bigger opportunity in the last couple of years has been on the product side. About a year ago we moved from reporting into marketing to reporting into product. And that was a function of the needs of the business shifting.”

One interesting aspect of maturity that I really like is where research reports. This is a classic question: what's the best place for research to report? Daniel describes an evolution where, when he started leading research, the company's challenge was understanding the audience.

Research was in the marketing organization addressing that need. As the company matured, the biggest opportunities shifted to product improvement. Research then moved to a different part of the organization to have a greater impact.

This isn't just about research maturing but the business overall maturing, with reporting structures changing accordingly. It's about where you're having an impact in the organization and evolving as you build maturity.

I like this example because it’s relevant today.

Two questions are implicit in these conversations: 

  1. Who are the people doing research? (Thanks to Kate Towsey, we now can differentiate using terms like “people who do research” and “researchers.”)
  2. What do we mean by research? 

Chris Avore’s model (see below) touches on staffing, but we should also consider what activities we mean by research and who is doing them.


Are there specific processes or frameworks that you recommend to assess the maturity of a research practice? 

Over the years, we’ve had lots of maturity models for product, UX, and design, but Chris Avore’s model was the first one focused on research. I’ve seen other versions since, but I still like Chris’s the best.

Image from Interviewing Users, 2nd edition, by Steve Portigal, adapted from Chris Avore's model

Let me give you an overview. Along the top, the X-axis in this matrix, is the level of maturity: laggard, early, progressing, mature. The Y-axis represents different areas of research maturity: executive attitude towards research, the scope of research, its purpose, who's doing it, who receives it, and the overall governance perspective.

Chris’s model describes what each cell in the matrix looks like, from executive attitudes to staffing. I'm less concerned with the specifics here, as it's generalized. To be actionable, it has to be specific to you. This matrix doesn't answer how to move to the right, but it's a good start.

How important is it for a team wanting to gauge their maturity to look at the maturity of other teams within the organization and the broader organization's maturity?

I think it's important in terms of setting goals. You can't be a highly mature research organization if your product organization is new, your design organization is new, or if the company's overall cultural sophistication is lacking. 

Is research leading and pulling the organization up, or pushing from the bottom? Is everybody learning together and growing together? 

When companies hire new research leaders, they want to understand where the organization is and how to approach the job of leading research or being the first researcher. If a company has not done foundational research into who its users are and how they currently do the tasks it's building for, that reflects low research and product maturity. It doesn't reflect on the new research team; it's about understanding what the organization needs.


How do you see the relationship between the maturity of Research and Research Ops? How do they line up?

Chris wrote his article a few years ago, and I hypothesize that if he wrote it today, there would be a row for Research Operations. Our practice has a more sophisticated view of the importance of ops, and we can articulate what it looks like. Adding a row for operations in the maturity model would make sense. We could add another row and talk about the laggard having no operations, progressing to dedicated staff and repeatable processes. Research Ops is crucial for driving the maturity of a Research function and product org. 

What other internal and external factors do you think affect research maturity?

Internally (as in, within the research team), staffing is key.

  • How many people are on your research team? 
  • What skills do they have? 

The research skills framework by the Research Ops Community is a rigorous way to assess individual skills and build them. Maturity is skill-based, involving processes, infrastructure, and career development. 

Externally (e.g., beyond the research team itself):

  • The impact of Research
  • The influence of Research
  • Leadership buy-in 

Leadership that values research is key for maturity. It's the top of the mountain, the gold standard, enabling or blocking further maturity.

Are there specific group or team skills that you think are foundational for increasing research maturity?

I have a bias here. My bias is about the work that I do and the stuff that I write about and teach. I'll own that. But I wouldn't say just interviewing skills, survey writing skills, or Qualtrics skills. There's a piece above that which really makes us better as researchers, method and domain aside.

We talk about things like empathy, curiosity, and listening. These are big words, and you have to chew on them a bit to get to where they're meaningful. I think there's a big skill around not just knowing yourself but also hearing yourself. We work in fast-paced environments and are asked to be experts. Developing comfort with not knowing, being confident and curious, and honestly asking questions we think we're supposed to know the answers to are crucial.


These skills are about knowing ourselves and being the person in the meeting who asks the question no one else is willing to ask. It's also about storytelling, empathy, and compassion — being able to do research and communicate it in a way that's impactful and significant, even if it's hard to hear. We've learned that something we're doing isn't going to work, and there's a new problem to solve. Finding that out is great, but presenting it without compassion and nuance can be harmful.

Emotional maturity, hearing your own discomfort, being sensitive to others' discomfort, and being a good storyteller are all in service of relationships. This is what it gets down to: the relationships we build with people. In many environments, we work in different ways, and we want to use self-knowledge and emotional maturity to build relationships because that's how we can achieve what we're trying to accomplish. If I want to talk about upskilling, these are the things I would work on because they pay off across the board. 

But these skills are also side effects from practicing the more technical skills of research. If you practice survey writing and try out a survey, you'll learn humility and empathy. It's baked into everything we do if we're reflective about our own learning.

How do changing times cause relatively mature research practices to become immature, even if they were working well initially?

This hits on the organism point from above – research practices aren’t static. Things are constantly changing. Ideally, we want things to change and improve. But it’s important to realize that things can go backwards.


Here are a few changes that can affect maturity: 

  • Organizational changes
  • Turnover
  • New leadership
  • Restructuring
  • Changes in strategy or OKRs

There's usually a lag before the consequences for the practice and dynamics are apparent. It's important to ask what aspects have changed and why. Sometimes our memory has biases, so using something analytical can help understand the changes. If you feel you’re moving backwards, I recommend taking the time to retro to understand what’s changing and why.

When doing a retro to assess maturity, who should be involved in that discussion?

I would start with the research team for a quick heuristic assessment. For understanding executive buy-in, talk to executives. For audience engagement with your research, look at who attends readouts, who accesses the repository, and who is requesting research (and potentially being turned down). These retros and self-assessments shouldn’t always become a full research project. Start with what you can put your arms around, which is your own team, and seek external information as needed.


How do you recommend bridging the gap between research and stakeholders to advance maturity?

One way to bridge this gap is by moving from reactive to proactive research. 

  • Reactive research often comes too late and is more tactical. 
  • Proactive research involves understanding the business's decision-making timeline and proposing research that aligns with it. 

This changes the dynamic of what people expect from research and helps them see its value. Take every opportunity to do proactive research. 

What are some good questions to ask or signals to look for to gauge an organization's research maturity during job hunting?

  • What types of research are being done? 
  • How is research prioritized? 
  • What’s the process for determining research efforts?
  • What barriers to research are you facing? 
  • What’s an example of research you’ve done that was well-received?
  • How would you describe the maturity of research at your org?
  • What is the reporting structure like? 
  • What does the research model look like? Is it centralized or embedded?

Many of these questions can reveal a lot about the organization’s maturity. You can also simply ask about it directly.

Do you see the ability to articulate the business impact of Research as playing a role in the maturity level?

Absolutely. Articulating the business impact of Research is crucial for maturity. Research leaders should be able to speak to peers about the value of research initiatives and their impact on business decisions. It's important for individual contributors to tie research efforts to business outcomes and communicate that effectively.

Research as an activity has a business impact in general, and also a specific piece of research can have a specific business impact if we do certain things with it. I think addressing the individual piece is a way to achieve that broader impact.

This is a great thing for a research leader to discuss with peers. Explaining why we're saying no to certain requests and why we're proposing a particular research schedule with specific actions is important because it impacts business decisions. These conversations should absolutely be happening, especially when the research leader has a mandate from senior leadership.

It's best when senior leadership hires research leaders specifically for this purpose, so they aren't trying to sell the value of research up the chain or across departments. That's the ideal condition. For individual contributors or those not reporting to a research leader, it's a heavy burden to change people's minds about the business value of research in a broad sense.

However, that doesn't mean individual contributors shouldn’t try to convey the value of research. We care about research and what it can achieve, so we naturally talk about it. But we should set realistic expectations about what we're trying to change and how we position it.


On a project basis or any research initiative, we should talk about the value of the research to the business. 

  • Here's why we're doing this and not that
  • Here's why we're talking to these people. 
  • Here's the business question and the corresponding research question. 
  • Here's the research method that best answers the research question, informing the business challenge. 
  • Then, here's what we learned and how we'll make decisions based on this research.

On a case-by-case basis, individual contributors not responsible for the overall positioning of research should discuss these aspects. This topic can be contentious, particularly around whether researchers should make recommendations; some people believe they should, while others disagree.

It ultimately depends on the context. When researchers co-synthesize or work in project teams spanning disciplines, discussing what the research revealed is crucial. Analyzing, synthesizing, and sense-making together points to what actions to take. So yes, recommendations are made, but collaboratively.

When I hear the recommendations question, I worry about researchers proposing naive solutions. People who understand technical aspects or design deeply know there are many ways to achieve something that can be optimized based on various factors. I don't want us to suggest, "We heard this, so we should do X," without depth.

Naive enthusiasm can hurt our credibility in discussing business realistically. If we're making recommendations, we should facilitate that process. We might have a brainstorm or solicit input, perhaps doing paper prototyping with colleagues to create actionable steps.

Recommendations, business actions, and decisions should come out of a collaborative process, not just be thrown over the wall. It all comes back to relationships.


Any tips for shifting executive attitudes toward research?

I think we should be realistic, and I'm going to be a little cynical here. If you have the mandate, you're in a much better position to manage up, but managing up is hard. We have to be realistic about what we can expect if we don't have someone's ear or they don't see us as having authority.

We want to believe that if we do great work, it will shine on its own, and people will say, "Wow, we asked you to do this work, and it was awesome. It changed our results, and we want more." While this does happen, it's not always the case. It's sort of table stakes for us to do great work, and we need to recognize that.

Whether your research is proactive or reactive, you can prioritize it with that goal in mind. The Salesforce User Research team published a piece a few years ago showing how they segmented their stakeholders using a two-by-two matrix around influence and impact. They chose different strategies for each cell:

  • staying close to influential people who believed in research, and
  • ignoring those who were negative, for now.

Image from “Get the most out of stakeholder collaboration – and maximize your research impact,” by Anna Poznyakov

Going where the interest and energy are to shift executive attitudes is ultimately a long game. Aviva Rosenstein, when she was Senior Manager of User Experience Research at DocuSign, talked on my podcast about prioritizing requests by asking, "What decisions will you make as a result of this research?" This extracted a commitment to act on the research, creating shared cultural knowledge that research changes things. Only doing projects where people agree to change stuff helps avoid the narrative that research was interesting but didn’t lead to action. If you have more requests than you can handle, pick the ones that will change the narrative and culture by telling stories of success.


It's less about addressing reluctant leaders head-on and more about creating visible success stories throughout the organization. You have a matrix, prioritize your leaders, and do work for those who will use and champion it. This creates information that others will talk about, which takes longer but is essential for culture change. Culture changes slowly through relationships, time, and repetition.

What tactical advice would you give for teams that want to scale their maturity while balancing everyday research responsibilities?

How do you eat an elephant? One bite at a time. And maturity is the elephant here. 

  • Focus on small, manageable steps. 
  • Identify what's important to you and prioritize based on effort and value. 
  • Aim to move one cell to the right in the maturity matrix. 
  • Use prioritization methods to determine what's most important and start there.

It's about making incremental progress rather than trying to fix everything at once. 

What keeps you passionate and motivated about User Research after so many years in the field?


Research is just so fascinating. The act of research is wonderful and it just gives me a lot of joy. I love the collaborative aspect of it. I enjoy the discovery process and working with people to figure things out. Those moments of discovery are incredibly rewarding. I love interviews – that’s my method of choice – and collaboratively synthesizing with people, figuring something out by talking aloud that none of us could have reached alone. There's a joyous spark of discovery in those moments, and I love working with people to get there.

Thank you, Steve!

We’re grateful to Steve for joining us and sharing his insights and experiences. If you’d like to watch the full webinar, follow this link.