Stakeholder Research  ·  docu tools  ·  Pin Placement Flow

Before we redesign,
we listen.

A structured interview guide for Sales, Customer Experience, and Support — surfacing the lived truth about how customers use, struggle with, and talk about Pin placement.

VERSION   1.0
SESSIONS   45–60 min each
FORMAT   1:1, recorded
OWNER   UX / Product Design

Three windows on the same product

Each team watches Pin placement from a different vantage point. Interviewing all three triangulates the picture — what customers want (Sales), what they do (CX), and what breaks for them (Support).

SALES

Hears the expectation. What prospects compare us against, which demo moments close deals, and which gaps cost us deals.

CUSTOMER EXPERIENCE

Hears the reality. What happens once the contract is signed — celebrated moments, workarounds, renewal risks.

SUPPORT

Hears the breakage. The actual tickets, the "how do I..." questions, the repeat issues that never quite get solved.

PRINCIPLE 01
Stories over opinions

"Tell me about a time..." beats "Do you think...". Specific memories beat generalizations every time.

PRINCIPLE 02
Frequency + impact

For every pattern, ask how often it happens and how much it hurts. Rare + painful ≠ common + mild.
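When ranking patterns after the interviews, a frequency-times-impact score is one simple way to keep the two axes from collapsing into each other. A minimal sketch — the scales, names, and example patterns are illustrative assumptions, not part of this guide:

```python
from dataclasses import dataclass

@dataclass
class Pattern:
    name: str
    frequency: int  # assumed scale: 1 = yearly ... 5 = daily
    impact: int     # assumed scale: 1 = mild annoyance ... 5 = blocks work

def priority(p: Pattern) -> int:
    # Multiplying keeps "rare + painful" and "common + mild" distinct
    # from "common + painful", which outranks both.
    return p.frequency * p.impact

# Hypothetical patterns for illustration only
patterns = [
    Pattern("sync data loss", frequency=1, impact=5),        # rare + painful  -> 5
    Pattern("hidden long-tap entry", frequency=5, impact=2), # common + mild   -> 10
    Pattern("no Android parity", frequency=4, impact=4),     # common + painful -> 16
]

for p in sorted(patterns, key=priority, reverse=True):
    print(p.name, priority(p))
```

The exact scales matter less than asking both questions in every interview so the score can be filled in at all.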

PRINCIPLE 03
Listen for the workaround

Workarounds are fossils of friction. When someone describes one, dig in — they reveal what the product should do.

PRINCIPLE 04
Follow the platform

Always ask which device and OS. The audit shows Android is the weakest platform — confirm whether stakeholders have felt this too.

S
SALES · 45 MINUTES · 2–3 INTERVIEWEES

Conversations with Sales

Sales sits at the boundary between market expectation and our product reality. They hear what prospects ask for during demos, what alternatives they mention, and what gaps turn into objections. Their stories reveal which parts of Pin placement are selling points and which are liabilities.

Who to talk to: Account Executives, SDRs, Solutions Engineers
Look for: Demo moments, objections, competitor references
Avoid: Feature requests as-is — probe the underlying need
GROUP 01

Demo behaviour & first impressions

Understand what prospects notice and react to when they first see Pin placement.
S1.1

Walk me through how you typically demo Pin placement. Which moments land — and which do you skip past?

  • Which platform do you demo on most often? Why that one?
  • Are there steps you avoid showing? What are they and why?
Why we ask: Sales demos are a curated highlight reel. What they choose to show reveals strengths; what they skip reveals weak spots.
S1.2

When prospects see Pin placement for the first time, what do they typically react to — positively or negatively?

  • Any "wow" moments you can remember specifically?
  • Any "wait, that's it?" moments?
  • What questions do they ask right after the demo of this flow?
Why we ask: First reactions are unfiltered. They surface both delight and confusion before prospects rationalise.
S1.3

Which platform do prospects usually want to see the demo on — and have you ever had to switch platforms mid-call to avoid showing something?

  • Have you ever avoided showing the Android experience specifically?
  • When a prospect is an Android-first company, how does the conversation change?
Why we ask: The audit flagged Android parity as a major gap. Sales can confirm whether this is already affecting deal strategy.
GROUP 02

Objections & competitor comparisons

Understand which Pin-related gaps cost deals and which competitors are used as yardsticks.
S2.1

Have you ever lost a deal — or come close to losing one — because of something specific in how Pins work?

  • What was the exact gap or concern raised?
  • Were they comparing us to a specific competitor?
  • How did you handle it?
Why we ask: Lost-deal stories are the most concrete signal of where the product falls short of market expectation.
S2.2

Which competitor features do prospects mention when discussing Pins or defect-tracking on plans? What do they specifically praise about those tools?

  • PlanRadar, Dalux, Fieldwire, Autodesk — which come up most?
  • What do prospects say those tools do better?
  • Any features you wish we had when you're comparing?
Why we ask: Competitor benchmarking points to specific interaction patterns (like right-click-to-create in PlanRadar) that the audit already identified as gaps.
S2.3

When prospects ask "can we do X with Pins?" — what X comes up most often?

  • Templates? Bulk actions? Integrations? Custom workflows?
  • Any requests you hear weekly vs. once a year?
Why we ask: Recurring feature requests reveal unmet expectations in the category. The "X" patterns inform roadmap priority.
GROUP 03

Persona & industry signal

Understand who uses Pins most intensively and how that shapes expectations.
S3.1

Which types of companies or roles get most excited about Pins during a demo? Which barely react?

  • GC vs. subcontractor? Site manager vs. office-based PM?
  • Any industry verticals where Pins are the make-or-break feature?
Why we ask: Different personas stress different parts of the flow. Knowing who cares most helps prioritise the right improvements.
S3.2

If you had a magic wand and could change one thing about how Pins work before your next demo, what would it be?

  • Why that one specifically?
  • Would it change how you pitch the product?
Why we ask: A focused wish-list question that forces prioritisation and often surfaces the most-felt pain point.
C
CUSTOMER EXPERIENCE / SUCCESS · 60 MINUTES · 3–4 INTERVIEWEES

Conversations with Customer Experience

CX lives in the space between "what customers bought" and "what customers actually do". They see onboarding moments, renewal conversations, and the long tail of adoption patterns. Their stories reveal which parts of Pin placement customers love, tolerate, and route around.

Who to talk to: CSMs, Onboarding Specialists, Account Managers
Look for: Adoption blockers, workarounds, churn signals
Avoid: Generic "customers are happy" — push for specifics
GROUP 01

Onboarding & adoption

Understand where customers stumble when they first learn Pin placement — and what eventually clicks.
C1.1

When you onboard a new customer, what's the hardest part of teaching them to place Pins?

  • Which step takes the most explaining?
  • Any moments where customers say "I would never have found that"?
  • Difference between iOS vs. Android customers?
Why we ask: Teaching points map directly to discoverability gaps. The audit flagged long-tap entry and hidden actions (Groups, Subscribe) — CX will confirm whether these are onboarding bottlenecks.
C1.2

Are there features in the Pin flow that you deliberately don't show during onboarding because they'll confuse more than help?

  • What are they?
  • When — if ever — do you eventually reveal them?
Why we ask: When CX hides features, those features are either too complex or poorly positioned. Either way, a design opportunity.
C1.3

Think of a customer who really nailed Pins and made them core to their workflow. What did they do differently from average customers?

  • How did they discover or learn the advanced behaviours?
  • What setup did they have in place (categories, templates, groups)?
  • Can that pattern be made more accessible to others?
Why we ask: Power users reveal the product's full potential — and the gap between its potential and what average users achieve.
GROUP 02

Workarounds & workflow stories

Surface the fossils of friction — things customers do to work around gaps.
C2.1

What workarounds have you seen customers invent to get Pins to work the way they want?

  • Naming conventions to compensate for missing grouping?
  • Using categories as a workaround for templates?
  • Exporting to Excel for anything?
  • Using external tools alongside Pins?
Why we ask: This is the single most productive question in the whole guide. Workarounds are literal maps of missing features.
C2.2

When customers describe their Pin workflow, how often do they mention categories, templates, groups, or subscriptions? Which of these are live concepts in their workflow, and which are ignored?

  • Any of these four that almost no customer uses?
  • Any that everyone uses but in very different ways?
Why we ask: Several of these concepts surfaced in the audit with unclear value. Adoption data from CX will tell us which are pulling weight.
C2.3

Have you seen customers use Pins differently than we designed them? Any surprising or creative uses?

  • Pins used as bookmarks? As measurement markers? As social notes?
  • Use cases you didn't expect to see?
Why we ask: Emergent use cases reveal real jobs-to-be-done that the original design may not have anticipated.
GROUP 03

Platform & context patterns

Validate the audit's platform findings against lived customer experience.
C3.1

Do customers use one platform for Pins or mix devices? Where does each platform shine — and where do complaints cluster?

  • Is mobile the primary placement device? Is desktop the primary review device?
  • Any patterns by role — site walker vs. office manager?
  • Any complaints specific to Android?
Why we ask: Confirms the real-world consequences of the audit's platform inconsistencies and tells us where to focus.
C3.2

Has Pin placement ever been mentioned in a renewal conversation — positively or negatively?

  • Any customer who renewed partly because of Pins?
  • Any who raised Pin friction as a concern at renewal?
  • Churned customers — was Pins ever cited as a factor?
Why we ask: Renewal and churn are the highest-stakes signals. Anything that shows up there deserves priority.
C3.3

If you could tell the product team one thing about how customers really use Pins, what would it be?

  • Anything that surprised you when you first saw customers using it?
  • Anything that frustrates you about the current design?
Why we ask: An open closer — it invites the interviewee to say what's been on their mind that the structured questions missed.
H
SUPPORT · 45 MINUTES · 2–3 INTERVIEWEES

Conversations with Support

Support sees the product at its most broken. Every ticket is a moment when the design failed someone. The repeat questions, the screenshots where confused users highlight toolbar areas, the "how do I..." tickets — these are the most concrete, unfiltered signals of friction we have.

Who to talk to: Support leads, Tier 1 & Tier 2 agents
Look for: Repeat ticket themes, confusion patterns, bugs
Avoid: Solution-mode ("they should just...") — stay on symptoms
GROUP 01

The ticket pile

Quantify the most common Pin-related issues and where they cluster.
H1.1

What are the top three recurring tickets related to Pin placement?

  • Which is most common? Which is most frustrating to resolve?
  • Are they bugs, missing features, or confusion?
  • Any that come in waves — tied to releases or project types?
Why we ask: Ticket frequency is the most honest measure of UX failure. Three ranked repeat offenders give immediate priority signal.
H1.2

Which platform generates the most Pin-related tickets relative to its user base?

  • Any platform where tickets spike?
  • Any platform that's surprisingly quiet — and is that because it's working, or because customers gave up?
  • Android-specific ticket patterns?
Why we ask: Validates the audit's platform severity ranking from an independent angle. "Surprisingly quiet" is a trap — it could mean abandonment.
H1.3

Can you walk me through a recent Pin ticket you resolved — start to finish? What did the customer think was wrong, and what was actually going on?

  • What was the first message they sent?
  • How many back-and-forths to resolve?
  • Was it a bug, misunderstanding, or missing feature?
Why we ask: Concrete narrative reveals the journey of confusion in specific detail. Often uncovers subtler issues than a ticket-category count.
GROUP 02

Confusion patterns

Map the specific moments where users get stuck.
H2.1

Are there parts of the Pin flow where customers often ask "where is the button to..." or "how do I..."?

  • Any action they repeatedly can't find?
  • Grouping? Subscribing? Changing category? Creating a task?
  • Anything about starting Pin creation on mobile?
Why we ask: Directly tests the audit's hypothesis about hidden actions and the lack of a mobile "New Pin" trigger.
H2.2

Are there moments where customers think a feature is broken but it's actually working as designed?

  • The Android tap-placement behaviour (centred vs. at tap)?
  • Category dropdown that looks like a dropdown but acts like a modal?
  • Attach button that turns into Submit?
  • Any others?
Why we ask "Working as designed" tickets are dead giveaways that the design itself is the problem.
H2.3

What questions do you answer so often that you've built a canned response for them?

  • Are any of them Pin-related?
  • What's the top canned answer for Pins?
Why we ask: A canned response is evidence of a systematic UX failure worth fixing at the design level rather than the ticket level.
GROUP 03

Bugs & edge cases

Capture the long tail of technical issues and unusual scenarios.
H3.1

Are there known bugs in Pin placement that customers hit regularly but haven't been prioritised for fixing?

  • Data loss on sync? Missing attachments? Wrong categories?
  • Any bugs specific to a platform or OS version?
  • Anything you keep expecting to be fixed but isn't?
Why we ask: Support often knows bugs engineering hasn't triaged. Surfaces the backlog hidden in tribal knowledge.
H3.2

What's the weirdest or most frustrating Pin-related ticket you've ever dealt with?

  • What made it so hard?
  • Did it ever actually get resolved?
Why we ask: Extreme cases are memorable and often reveal structural issues, not one-offs.
H3.3

If we could fix one Pin-related ticket theme so it never came in again, which would save you the most time?

  • Why that one specifically?
  • How big a chunk of the Pin ticket volume is it?
Why we ask: Forces ranking and quantification. Produces an immediately actionable priority signal.

Practical guidance for the interviewer

Good interviews are a craft. These habits make the difference between collecting opinions and collecting evidence.

Do

  • Start with warm-up. First 3 minutes: role, tenure, typical day. Builds rapport and calibrates context.
  • Ask for stories. "Tell me about the last time..." beats hypotheticals. Concrete memory beats abstract opinion.
  • Probe frequency. For every pattern: "How often does this happen? Once a week? Once a month?" Quantify.
  • Name the platform. Always clarify — Web, iPad, iPhone, Android tablet, Android phone. Don't let "the app" stay ambiguous.
  • Summarise back. "So what I'm hearing is..." — catches misunderstandings and invites correction.
  • Record (with consent) and take structured notes. Transcribe key quotes verbatim — they're gold for synthesis.
  • Leave 5 minutes for "anything we didn't cover that you think we should know?" — the best insights often come here.

Don't

  • Lead the witness. "Don't you think Android placement is bad?" → "How do customers react to placement on Android?"
  • Accept solutions at face value. "They want a button here" → probe underlying need, not the proposed fix.
  • Defend the product. Even if you disagree, stay curious. Debate kills signal.
  • Rush through silence. Three seconds of silence often produces the most honest answer. Wait.
  • Mix roles. Don't interview a CSM and a Support agent together — their different vantage points will collapse into the loudest voice.
  • Skip the "how often" question. Without frequency, every insight looks equally important. It isn't.
  • Treat the interview as a focus group. This is 1:1 for a reason. Consensus isn't the goal.

Synthesis template

Fill this out within 24 hours of each interview — while the detail is fresh. The goal is 5–10 minutes per session, not a full report. You'll thank yourself during cross-interview synthesis later.

Interview debrief · one per session

Fill immediately after call · keep it short · quote verbatim where possible
Interviewee
Name · Role · Tenure · Date  
Top 3 insights
The three things you'd tell a colleague on the way to lunch. Specific, not general.
Best verbatim quote
One sentence, word-for-word, that captures the session's central tension or truth.
Surprise
What did you hear that you didn't expect? This is often the most valuable part.
Confirmed / contradicted
Which findings from the audit did this session reinforce? Which did it complicate or contradict?
New question
What new research question did this interview create? Carry it forward into the next session.
Quote-worthy tension
Any tension between what the interviewee said and what another team has said? Flag it for cross-session synthesis.
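Teams that want these debriefs to be searchable and comparable across sessions can capture the same fields as a structured record. A minimal sketch — the class, field names, and placeholder values are illustrative assumptions, mirroring the template above rather than prescribing a tool:

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class Debrief:
    interviewee: str          # Name · Role · Tenure · Date
    top_insights: list[str]   # the three things you'd tell a colleague
    best_quote: str           # one verbatim sentence
    surprise: str             # what you didn't expect
    confirmed: list[str]      # audit findings this session reinforced
    contradicted: list[str]   # audit findings it complicated or contradicted
    new_question: str         # carry forward into the next session

def to_json(d: Debrief) -> str:
    # Serialise one debrief so sessions can be diffed and grepped later
    return json.dumps(asdict(d), indent=2)

# Placeholder values only — fill in from the real session
example = Debrief(
    interviewee="(name · role · tenure · date)",
    top_insights=["(insight 1)", "(insight 2)", "(insight 3)"],
    best_quote="(verbatim quote)",
    surprise="(what you didn't expect)",
    confirmed=["(audit finding reinforced)"],
    contradicted=[],
    new_question="(carry-forward question)",
)
print(to_json(example))
```

One record per session keeps the 5–10 minute discipline while making the later cross-interview synthesis a matter of scanning files rather than memory.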