Every marketing team knows they need to understand their target audience. The research on buyer personas has been published, the ICP workshop has been facilitated, and the demographic profile of the ideal customer has been documented in the brand guidelines. And yet the content the team produces still speaks to everyone in general and no one in particular — because the audience definition that exists is a demographic, not a person, and a demographic cannot be written to.
The difference between knowing who your target audience is at the demographic level and knowing who they are at the depth of specificity that marketing actually requires is the difference between content that is technically relevant and content that resonates. Technically relevant content reaches the right person. Resonant content makes the right person feel that this specific piece of content was made specifically for them — which is the quality that produces the engagement, the trust, and the pipeline movement that technically relevant content does not.
Most audience definition frameworks stop at the demographic. They identify the job title, the company size, the industry, and the geographic market — and call the resulting profile an ICP. This is the starting point of audience definition, not the completion of it. The ICP at scripting depth — the level of specificity required to write a video hook that makes the right viewer stop scrolling, a script that makes the right buyer feel understood, and a CTA that produces action rather than consideration — requires a fundamentally different kind of research and a fundamentally different kind of documentation than a demographic profile produces.
In this video, Dallin Nead walks through the complete framework for knowing your target audience at the depth of specificity that B2B video marketing requires — covering the research methods that surface genuine buyer understanding rather than assumed buyer knowledge, the documentation format that makes the audience definition usable by every team member who produces content, and the specific audience signals that tell you whether your current content is reaching the right person or merely reaching the right demographic.
Why Most Audience Definitions Fail
The demographic trap — and why it produces content that reaches everyone and resonates with no one
The most common audience definition failure mode is the demographic profile — a description of the target audience that specifies who they are without specifying what they are experiencing. A VP of Marketing at a B2B SaaS company with 50 to 500 employees is a demographic. It tells you who to reach. It tells you nothing about what to say when you reach them.
Content that reaches the VP of Marketing demographic without understanding what a VP of Marketing at a specific stage of company growth, with a specific team size, in a specific category, and facing a specific combination of pipeline pressure and resource constraints is actually experiencing is technically aimed at the right person and experientially aimed at nobody. It describes the category the target audience occupies without speaking to the situation the target audience is in. And a person who does not feel spoken to in the first five seconds of a video does not keep watching.
The audience definition that makes content resonate requires knowing not just who the target audience is but what they are experiencing — the specific situation they are in, the specific friction they are navigating, the specific outcome they are trying to achieve, the specific language they use internally to describe the problem before they have encountered the solution's vocabulary, and the specific objections they carry into every evaluation of a potential solution based on the previous solutions that did not deliver.
This is the depth of audience knowledge that makes a video hook land, a script feel personal, and a CTA produce action. And it cannot be produced through demographic profiling alone.
The Four Research Methods That Produce Genuine Audience Understanding
Method one — Sales call recordings
The highest-value source of genuine buyer language is the recordings of conversations between the sales team and prospects who are currently evaluating the product — specifically the moments when the prospect is describing their own situation, their own problem, and their own previous attempts to solve it before encountering this solution.
Sales call recordings contain the specific language buyers use before they have encountered the solution's vocabulary — the words, the phrases, and the analogies they reach for when describing a problem they have been living with long enough to have developed their own internal description of it. This language is the raw material of every high-performing hook, every resonant problem statement, and every CTA that produces action because it meets the buyer in their own cognitive frame rather than asking them to translate the company's language into their own.
The video covers the specific process for mining sales call recordings for buyer language: the categories of statement to listen for, the documentation format that captures the language in a form the content team can use, and the frequency of recording review that keeps the audience definition current as the ICP and the market evolve.
Method two — Customer interviews
Customer interviews are the research method that produces the deepest understanding of the complete buyer journey — from the moment the problem first became significant enough to address through the evaluation process to the specific outcome the engagement produced. Unlike sales call recordings, which capture the prospect's perspective before a decision has been made, customer interviews capture the retrospective account of someone who has completed the journey — which means they can describe what they were thinking at every stage, what almost made them choose a different option, and what specifically made the difference in the final decision.
The video covers the specific interview structure that produces the most useful audience understanding: the question sequence that moves through the before state, the evaluation process, the decision moment, and the after state in a way that surfaces the insights the content team needs; the recording and documentation approach that captures the interview's most useful moments in a searchable, referenceable format; and the interview cadence that produces ongoing audience understanding rather than a one-time research exercise that becomes outdated as the ICP evolves.
Method three — Community and forum research
The specific places where the target ICP discusses their problems in public — LinkedIn posts and comments, industry forums, community Slack groups, subreddits, and the comment sections of content that serves the same audience — are sources of unfiltered buyer language that no interview or sales call can fully replicate. In a sales call, the prospect is speaking to the company and is therefore calibrating their language to the conversational context. In a community forum, they are speaking to peers — and they are using the exact language they use internally, without any consideration of how the company might interpret it.
The video walks through the community and forum research process that surfaces the highest-value audience language: the platforms most likely to contain relevant conversations for a specific ICP, the search and monitoring approach that identifies the most valuable threads and posts, and the documentation method that captures the language in a form the content team can apply without losing the original context in which it appeared.
Method four — Direct audience signals from existing content
For content teams that are already publishing — on LinkedIn, YouTube, email, or other channels — the engagement data from existing content is one of the most direct and most frequently ignored sources of audience understanding available. The videos that generate the highest watch time, the posts that produce the most comments, the emails that generate the most replies, and the content that drives the most inbound enquiries are all signals about what the target audience cares about most — and most content teams do not have a documented process for capturing and acting on those signals.
The video also covers the audience signal review process that extracts actionable audience understanding from existing content performance data: the metrics to track, the pattern recognition method that identifies the content themes and formats the audience responds to most strongly, and the feedback loop that connects content performance data to the audience definition document so both evolve together rather than diverging over time.
The Audience Definition Document — The Format That Makes Audience Understanding Usable
Why the research is only half the work
The most thorough audience research produces no content improvement if it is not documented in a format that every team member who produces content can access, understand, and apply consistently. The audience definition document is the format that bridges the research and the application — translating the buyer language, the situational understanding, and the specific insights produced by the research methods into a reference document that every scriptwriter, every content producer, and every executive who films on-camera content can use to ensure that every piece of content speaks to the same specific person in the same specific situation.
The six components of an audience definition document that makes content better
The specific situation description — a documented narrative of the specific situation the target ICP is in at the moment they are most likely to encounter the content. Not a demographic. Not a job title. A situation — the combination of role, team size, resource constraints, pipeline pressure, and strategic context that makes the target prospect both qualified for the solution and motivated to evaluate it.
The problem statement in buyer language — the specific words, phrases, and analogies the target audience uses to describe the problem when they are talking to peers rather than to vendors. Not the company's description of the problem the solution addresses. The buyer's own language for the friction they are experiencing — drawn directly from the sales call recordings, the customer interviews, and the community research.
The previous solution history — a documented account of what the target audience has typically tried before encountering this solution, why those previous attempts failed to fully resolve the problem, and the specific frustrations and residual skepticism those failures produced. This is the context that makes the reframe in a video script land — the identification of the root cause that explains why every previous solution failed, which sets up the current solution as the logical answer rather than another option on a list of things to try.
The objection inventory — a documented list of the specific objections the target audience carries into every evaluation of a potential solution, drawn from the moments in sales call recordings where prospects express hesitation, the questions that appear most frequently in discovery calls, and the reasons prospects give when they choose a competitor or decide not to move forward. The objection inventory is the reference document that ensures every piece of conversion-stage content addresses the specific barriers standing between the qualified prospect and the decision.
The outcome aspiration — the specific, concrete change in the buyer's professional situation that they are trying to achieve — stated in the terms they use internally rather than in the outcome language the company uses in its marketing. The difference between "better video content" and "a content program that my CMO stops asking me to explain and starts using as evidence of marketing's pipeline contribution" is the difference between a general aspiration and a specific outcome — and the specific outcome is what the transformation promise in every piece of conversion content should be built from.
The buyer language library — a curated collection of the specific phrases, analogies, and vocabulary items drawn from the research that most accurately capture how the target audience describes their situation, their problem, and their desired outcome. This is the reference document that scriptwriters use to ensure that every hook, every problem statement, and every CTA is written in the buyer's language rather than the company's.
The Three Audience Signals That Tell You Whether Your Definition Is Accurate
How to know if the audience definition is working — before the research cycle repeats
An audience definition document is not a static artifact. It is a hypothesis — a documented set of claims about who the target audience is, what they are experiencing, and what they need to hear to engage with the content and act on it. Like every hypothesis, it should be tested against evidence and updated when the evidence contradicts the current assumptions.
Three specific audience signals tell you whether the current audience definition is producing the content performance it should, and they identify which components of the definition need to be revised when performance falls short.
Hook performance data is the first signal. A hook written from an accurate audience definition produces strong three-second view rates — because the opening statement resonates with the right viewer immediately and earns the continued attention that every other metric in the performance report depends on. A hook that produces weak three-second view rates from the right demographic audience is a signal that the problem statement in the audience definition is not specific enough, not resonant enough, or too far removed from the language the target audience uses internally.
Comment and reply quality is the second signal. Content that resonates with a specifically defined audience produces comments and replies that reflect genuine recognition — the specific, personal responses that signal the viewer felt the content was made for them. Comments that are generic, polite, or positive without being specific are a signal that the content reached the right demographic but did not resonate at the situational level that makes content feel personal.
Inbound enquiry quality is the third signal. Content built on an accurate audience definition produces inbound enquiries from prospects who are specifically qualified — who describe their situation, their problem, and their desired outcome in the same language the audience definition documents. Inbound enquiries from prospects who are outside the defined ICP or whose described situation does not match the audience definition are a signal that the content is reaching a broader or different audience than the definition specifies — and that either the audience definition or the content distribution needs to be refined.
Applying Audience Understanding to Video Content
The translation from audience definition to video script
The audience definition document is the strategic foundation. The video script is the creative application. The translation between the two — the process of taking the documented buyer language, the situational understanding, and the objection inventory and applying them to the specific sections of a video script — is the production discipline that ensures audience research produces content improvement rather than sitting in a document that nobody references between the quarterly planning sessions.
The audience definition has the most direct impact on five script sections: the hook, written directly from the problem statement in buyer language; the problem agitation section, written from the previous solution history and the specific frustrations it produced; the reframe, written from the root cause insight that explains why previous solutions failed; the proof section, written from the outcome aspiration and the specific evidence that makes the transformation credible; and the CTA, written from the specific next action the buyer is most ready to take at the stage of the buyer journey the content serves.
The video also covers the script review process that verifies audience definition alignment: the specific questions to ask of every draft script before it goes into production to confirm that the hook speaks the buyer's language, the problem statement resonates with the buyer's actual situation, and the CTA asks for the specific action the buyer is ready to take.
Who This Video Is For
Founders, marketing leaders, and content teams who are producing video content and have experienced the gap between content that is technically aimed at the right audience and content that actually resonates — and who want the specific research methods and documentation frameworks that close that gap.
Any B2B marketing team that has completed an ICP or buyer persona exercise and found that the resulting profile did not meaningfully improve the specificity or the performance of the content it was supposed to inform — and who want to understand what a genuinely usable audience definition looks like and how to build one from research rather than assumption.
And any content producer, scriptwriter, or executive who films on-camera content and wants a documented reference that makes every scripting decision — every hook, every problem statement, every CTA — grounded in genuine audience understanding rather than the company's assumptions about what the audience cares about.