Enhancing Recruiters’ Outreach Messages with AI
Overview
| | |
| --- | --- |
| Company | |
| Product | LinkedIn Recruiter |
| My Role | Sole UX Researcher (Contract) |
| Team | 1 Product Manager, 1 Designer, 2 Engineers |
| Timeline | 7 weeks |
| Methods | Moderated usability study, think-aloud protocol, semi-structured interviews |
Context
LinkedIn Recruiter had long been a leader in talent sourcing, but there was a persistent problem: the outreach messages recruiters sent to potential candidates were often generic and low-quality. Passive candidates—the very people recruiters most wanted to reach—weren’t responding. Response rates were declining, and recruiters were spending significant time crafting messages that failed to stand out in crowded inboxes.
With Generative AI emerging in late 2022, there was a compelling opportunity to explore whether AI could enhance recruiter outreach messaging. These were the early days of Gen AI—the “wild west,” as our team called it—and we had a chance to be among the first to explore how this technology could create personalized, unique messages by drawing on data from LinkedIn member profiles. The initiative would allow us to build an AI-powered composing tool that could reduce time and effort while producing eye-catching outreach messages to passive candidates.
Research Statement
“How might we leverage Gen AI to enhance recruiter outreach messaging, improve message quality, and increase response rates while keeping the user at the center?”
This research statement aligned our team around a common goal. Notice that our North Star wasn’t only about the product—it was about the recruiter experience. This intentional framing ensured we never lost sight of the humans using the tool, even as we explored cutting-edge AI capabilities.
Research Goals
I worked closely with the product team to establish goals that balanced business outcomes with user experience:
Business Goals
- Provide clear value to recruiters through an AI-powered composing tool
- Enhance outreach message quality and increase response rates
Research Goals
- Evaluate the MVP design for intuitiveness and ease of use
- Assess the MVP experience for transparency and user control over AI-generated content
Research Questions
Our questions sat at the intersection of foundational understanding and tactical usability:
Foundational: “How can LinkedIn design an AI messaging tool that enhances recruiter engagement and increases response rates?”
Usability: “How can LinkedIn improve the usability of the prototype’s AI-powered features to help users confidently navigate and operate the recruitment tool?”
This dual focus was intentional—we needed to understand both whether AI aligned with recruiter workflows (foundational) and whether our specific implementation was usable (tactical).
Research Methodology
Given that Gen AI was still a novel technology, I chose moderated usability studies as the primary method. We needed rich qualitative data, and we needed to observe real-time user behavior—especially reactions to AI-generated content, which was an entirely new paradigm for most recruiters.
The study combined a think-aloud protocol with semi-structured interviews. The think-aloud method was particularly effective for capturing recruiters’ immediate, unfiltered reactions to AI-generated messages—moments of surprise, skepticism, delight, and confusion that would have been lost in a survey. The semi-structured interview portion allowed us to explore deeper foundational questions about AI and messaging workflows. Sessions ran 60 minutes, with prototype testing conducted within the LinkedIn Recruiter environment.
Recruitment
I recruited 7 participants with the following criteria:
- Sourcers or recruiters who actively source candidates
- Sent outreach messages at a rate of ≥30 per month
- Mixed age and gender representation
- US-based participants
The focus on high-volume outreach users was deliberate: these were the recruiters most likely to benefit from AI-assisted messaging and most equipped to evaluate whether the tool fit their workflow. Seven participants was our target, both because the study was predominantly usability-focused, where five to eight users typically surface the majority of issues, and because we were operating within a rapid research timeline.
Sample Usability Tasks
Tasks were structured as realistic scenarios to elicit natural behavior:
“You’ve identified a potential candidate. Begin the process of crafting an outreach message using the AI-assisted drafting tool.”
“Your first outreach message didn’t receive a response. Using the same tool, craft a follow-up message to the candidate.”
Sample Interview Questions
During the semi-structured portion, I explored deeper themes around AI and messaging:
“What aspects of the current AI messaging feature do you find most useful?”
“If you could change one thing about the feature, what would it be and why?”
Collaboration & Stakeholder Partnership
One of the most valuable things I learned about approaching rapid research at LinkedIn was making sure stakeholders understood this was their research as much as mine. During kickoff, we workshopped research goals and questions together, set expectations for how rapid research operates, and established a genuine research partnership. I made sure stakeholders—typically the designer—were along for the research journey, observing sessions and participating in synthesis. This investment in partnership paid dividends when it came time to act on findings.
Timeline
The study followed a 7-week timeline:
- Week 1: Kickoff and research plan
- Week 2: Recruitment, prototype preparation, discussion guide
- Weeks 3-4: Conduct research (7 sessions)
- Week 5: Usability data analysis
- Week 6: Foundational data analysis
- Week 7: Craft and deliver two separate presentations (usability insights and foundational insights)
I typically deliver a single presentation, but in this case, prioritization required trade-offs. Delivering usability findings first allowed the team to begin iterating on the design immediately, while the foundational insights provided strategic direction for the product roadmap.
Key Insights & Impact
The research surfaced four critical findings:
- Outreach quality valued over speed — Recruiters emphasized the need for personalized, high-quality messages over quick automation. This insight directly influenced product strategy to prioritize customization features over speed-focused ones.
- Trust is key in AI tools — Users needed transparency about what was AI-generated versus human-written to build trust. This led to design changes that promoted transparency throughout the experience.
- Navigation needs simplification — Several usability issues were identified in the tool’s workflow, driving a targeted redesign of the user flow.
- Recruiter engagement boosted — Foundational insights led to strategic enhancements that increased engagement metrics, informing the product roadmap for 2024.
Reflections
- Leverage existing research — Outreach messaging isn’t a new topic. There was plenty of existing research I could have used to inform our foundational questions. In hindsight, conducting secondary research upfront would have sharpened our research questions and made better use of our limited time with participants.
- Push for extended timelines when complexity demands it — Gen AI is inherently complex. While the synthesis produced a compelling story, stakeholders would have benefited from engaging with both tactical and foundational insights in a single, unified shareout rather than two separate presentations; a slightly longer timeline would have made that possible.
- Embrace mixed-methods deliberately — Rather than trying to capture everything through one methodology, I would have approached things more deliberately with both quantitative and qualitative methods integrated from the start—using surveys to establish baselines and qualitative sessions to explore the “why” behind the numbers.