Most in-house link building programs die the same way. A CMO greenlights a budget, a marketing coordinator is handed the project, outreach goes out for a few weeks, a handful of placements trickle in, and by month four, the spreadsheet is stale, and nobody is quite sure who owns it. The program is not dead on paper — nobody has killed it — but nothing is happening. The links have stopped.
This pattern is common enough that it has become a running joke in SEO circles, but the cause is almost never a lack of effort or talent. It is almost always a missing workflow. Link building is a multi-step, repetitive, cross-functional process. When teams run it as artisan work rather than as a documented procedure, it collapses under its own weight. The fix is less glamorous than a new tool or a smarter pitch template: it is a written standard operating procedure that names every step, every owner, and every quality gate.
The Numbers Nobody Plans For
Outreach is a volume game, and the volume required is higher than most in-house teams plan for. Backlinko’s analysis of 12 million outreach emails found that only 8.5% receive any reply at all — not a link, just a reply. The other 91.5% are ignored. More recent benchmarks from Hunter.io’s State of Email Outreach 2026 put the average cold email sequence reply rate at 4.5%, with SEO-focused digital PR campaigns landing around 13%. These numbers assume a sequence — meaning follow-ups — not a single send.
Walk the math forward. If a team wants ten placements in a month at an 8.5% reply rate, and if roughly a third of replies convert to actual links after negotiation and editorial back-and-forth, that team needs to send roughly 350 personalized pitches. With follow-ups, that is closer to 1,000 touch points. One coordinator working outreach part-time, between client calls and reporting, cannot produce that volume at quality. When the volume drops, the pipeline dries up, and the program stalls — not because link building stopped working but because the team stopped doing enough of it.
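That arithmetic is worth making explicit, because it is the calculation the next paragraph says most teams never run. A minimal sketch, using the 8.5% reply rate quoted above and the article's working assumption that roughly a third of replies convert to links (the three-sends-per-prospect figure is likewise the illustrative assumption from the text, not a benchmark):

```python
# Back-of-envelope pipeline math for a monthly placement target.
# reply_rate is the Backlinko figure quoted above; reply_to_link and
# sends_per_prospect are the article's working assumptions.

def required_outreach(target_placements, reply_rate=0.085,
                      reply_to_link=1 / 3, sends_per_prospect=3):
    """Return (pitches, touch_points) needed to hit the target."""
    replies_needed = target_placements / reply_to_link
    pitches = replies_needed / reply_rate
    touch_points = pitches * sends_per_prospect  # initial send + follow-ups
    return round(pitches), round(touch_points)

pitches, touches = required_outreach(10)
print(pitches, touches)  # 353 1059
```

Running this for a ten-placement month gives the roughly 350 pitches and roughly 1,000 touch points cited above, which is the capacity question a team should answer before budgeting a coordinator's part-time hours.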
Most in-house teams never run that math before starting. They budget for an outreach tool and a coordinator’s time and assume the placements will follow. When they do not, the program gets labeled ineffective, when in reality it was simply under-resourced for the outreach volume the response rates require.
Where the Workflow Actually Breaks
The specific failure points are predictable and show up in roughly the same order across dozens of stalled programs. Knowing them is the first step to documenting around them.
Prospecting Is Ad Hoc
Someone pulls a list of target sites one week, then two weeks later, someone else pulls another list using slightly different criteria. There is no shared definition of what qualifies as a relevant prospect, no shared vetting checklist, and no central record of who has already been contacted. Duplicate outreach to the same editor is embarrassing; gaps in coverage are invisible.
Pitches Drift Off-Template
The first month’s pitches get attention and careful personalization. By month three, pitches are copy-pasted from a Google Doc that has been edited by three different people, each of whom softened or sharpened the language based on their own preferences. The message that earned placements in week two no longer resembles what is going out in week ten.
Follow-Ups Fall Through the Cracks
Follow-ups produce a disproportionate share of replies — in many studies, more than the initial send. Teams without a documented cadence simply forget to send them. A pitch goes out Tuesday, gets no response by Thursday, and is quietly abandoned because nobody scheduled the follow-up. The team leaves most of the response rate on the table without ever noticing.
Quality Vetting Is Inconsistent
Without a written vetting standard — traffic thresholds, topical relevance tests, outbound link ratios, content quality checks — vetting drifts toward whatever the person on duty happens to care about that week. Some placements that get approved would have been rejected a month earlier. The backlink profile becomes a collage of conflicting standards, and the program’s overall link quality gets harder to defend at review time.
Reporting Is Reactive
The coordinator compiles a report the day before the stakeholder meeting, which means metrics are defined on the fly and rarely compared against the same metrics from the previous month. Leadership senses the lack of rigor and loses confidence. Budget conversations get harder. The program shrinks.
What the SOP Should Actually Contain
A workable link building SOP is less complicated than people assume, but it has to exist in writing, with owners and timing. At a minimum, it covers five stages. It is worth reviewing what an SOP actually contains before drafting one, because a document that is really just a bulleted checklist will not survive contact with a real outreach week.
Stage one is prospect sourcing. The SOP names the exact tools used (e.g., Ahrefs for backlink gap analysis, a specified database for journalist contacts), the qualification criteria every prospect must pass before entering the sheet, and the owner who reviews borderline cases. It also specifies where the prospect list lives and who has write access.
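The qualification gate in stage one is the part most worth writing down as an unambiguous rule. A minimal sketch of such a gate, where the traffic threshold and field names are illustrative placeholders rather than recommended values:

```python
# A minimal prospect-qualification gate for stage one. The 5,000
# traffic threshold and the field names are hypothetical examples;
# a real SOP would pin down its own criteria.
from dataclasses import dataclass

@dataclass
class Prospect:
    domain: str
    monthly_traffic: int
    topically_relevant: bool
    already_contacted: bool  # checked against the central record

def qualifies(p: Prospect, min_traffic: int = 5_000) -> bool:
    """Every prospect must pass all checks before entering the sheet."""
    return (p.monthly_traffic >= min_traffic
            and p.topically_relevant
            and not p.already_contacted)

print(qualifies(Prospect("example.com", 12_000, True, False)))  # True
```

The value of encoding it this way, even if only in a spreadsheet formula, is that two people pulling lists in different weeks apply identical criteria, which closes the duplicate-outreach and coverage-gap failure modes described earlier.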
Stage two is pitch drafting. The SOP defines a primary pitch template with named variables for personalization, a rule for how many of those variables must be filled per pitch, and a QA step in which a second person reviews a random sample before the batch goes out. It lists the subject-line conventions and the send-time window.
Stage three is the send cadence. Initial send, follow-up one at day three, follow-up two at day seven, close-out at day fourteen. The SOP specifies what changes in each follow-up so that recipients do not receive the same message three times. It also defines the hard stop — nobody gets a fourth follow-up — so coordinators are not agonizing over the decision.
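The cadence in stage three is simple enough to express as data rather than as a habit, which is exactly what keeps follow-ups from falling through the cracks. A sketch, assuming the day-3/day-7/day-14 schedule described above:

```python
# The stage-three cadence as data: initial send, follow-ups at day 3
# and day 7, close-out at day 14, hard stop after that.
from datetime import date, timedelta

CADENCE = [("initial", 0), ("follow_up_1", 3),
           ("follow_up_2", 7), ("close_out", 14)]

def schedule(first_send: date):
    """Return the dated touch points for one prospect.
    There is no fourth follow-up; the list ends at close-out."""
    return [(step, first_send + timedelta(days=offset))
            for step, offset in CADENCE]

for step, when in schedule(date(2025, 3, 4)):
    print(step, when)
```

A team does not need code to run this, but it does need the cadence written down somewhere a tool or a calendar can enforce it, so the Tuesday pitch that gets no reply by Thursday is followed up on schedule rather than quietly abandoned.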
Stage four is placement vetting and acceptance. When a site replies with an offer to place, the SOP runs through the vetting checklist — traffic, relevance, outbound link hygiene, and content standards — and produces a yes or no in under ten minutes. Borderline cases escalate to a named owner. Accepted placements trigger a content brief for the writer handling that piece.
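The ten-minute yes-or-no in stage four works because the checklist produces a decision mechanically, with borderline cases routed to the named owner rather than debated. A sketch of that decision logic, where the thresholds are hypothetical and the checklist items come from the stage description above:

```python
# Stage-four vetting as a mechanical decision. Thresholds are
# hypothetical examples; the checklist items (traffic, relevance,
# outbound link hygiene, content standards) come from the SOP stage.

def vet_placement(traffic, relevant, outbound_links_per_post,
                  content_ok, min_traffic=5_000, max_outbound=10):
    checks = [traffic >= min_traffic,
              relevant,
              outbound_links_per_post <= max_outbound,
              content_ok]
    if all(checks):
        return "accept"
    if sum(checks) == len(checks) - 1:  # exactly one failed check
        return "escalate"               # borderline: named owner decides
    return "reject"

print(vet_placement(8_000, True, 4, True))  # accept
```

The one-failed-check escalation rule here is only an illustration of how "borderline" might be defined; the point is that the SOP defines it somewhere, so the coordinator is never improvising the standard.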
Stage five is tracking and reporting. Every placement is logged with URL, anchor text, referring domain, date live, and target page. The monthly report pulls from this log automatically. Response rate, placement rate, and cost per placement are calculated the same way every month.
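"Calculated the same way every month" is the operative phrase in stage five, and the easiest way to guarantee it is to define the calculations once. A minimal sketch, with illustrative field names and figures:

```python
# Stage-five metrics defined once, so every monthly report computes
# them identically. Field names and the sample figures are illustrative.

def monthly_metrics(pitches_sent, replies, placements, spend):
    return {
        "response_rate": replies / pitches_sent,
        "placement_rate": placements / pitches_sent,
        "cost_per_placement": spend / placements,
    }

m = monthly_metrics(pitches_sent=350, replies=30, placements=10, spend=4_000)
for name, value in m.items():
    print(f"{name}: {value:.3f}")
```

Even as a spreadsheet formula rather than code, fixing the definitions removes the day-before-the-meeting improvisation that the reporting failure mode above describes, and makes month-over-month comparison trivial.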
None of this is exotic. It is the kind of repeatable procedure that any documented-process approach is built for — the five steps to documenting SOPs apply here, the same as they do to onboarding or customer support. The reason so few in-house teams actually do it is that nobody has written workflow documentation into the coordinator’s job description, and nobody gets praised for SOPs until the moment the coordinator leaves and the program implodes.
The Ownership Question Kills More Programs Than the Tactics Do
Even a well-drafted SOP fails if no single person owns its enforcement. In-house teams commonly spread link building across three or four partial owners — the SEO lead reviews strategy, a content writer drafts pitches, a coordinator sends them, and a freelancer handles overflow. Nobody is accountable for hitting the monthly placement number, and nobody has the authority to stop a pitch from going out when the QA process is skipped.
The working pattern is a single named owner who is evaluated on placements per month and is empowered to reject work that violates the SOP. Everyone else contributes, but accountability does not diffuse. This is not a link building insight — it is basic operations — but it gets violated in link building more than almost anywhere else because the work feels creative and nobody wants to micromanage creative work. It is not creative work. It is a pipeline.
When to Keep Building In-House and When Not To
Not every team should run this program in-house. The honest calculation is whether the team can sustain the outreach volume the math requires, at the quality level that earns links from sites worth having. For teams with two or more people who can genuinely commit fifteen to twenty hours a week each to outreach, with an SOP in place and a named owner, in-house link building works and is often cheaper than outsourcing over a long horizon.
For teams that cannot commit that capacity, or that keep seeing the same four failure modes above despite good intentions, the realistic option is to hand the operational layer to a specialist. Resolve’s link building services are an example of what a fully operationalized version of this workflow looks like when a team has been running it for years — vetted prospect databases, tested outreach templates, a documented QA process, and the capacity to maintain pitch volume without letting quality slip. The point of naming one is not that every team should outsource. It is that the alternative to a specialist team running a documented process is, in most in-house cases, an undocumented process running at roughly a quarter of the necessary volume.
The Through Line
In-house link building does not stall because Google changed the algorithm or because journalists are harder to reach than they used to be. It stalls because the work is treated as a heroic effort rather than as a documented procedure. A team that writes down its prospecting criteria, its pitch QA rules, its follow-up cadence, and its vetting standards will outperform a team twice its size that is running on tribal knowledge. The teams that understand this and either invest in their own documented workflow or partner with a team that has already built one are the teams that still have a link building program six months from now.
Everyone else is rebuilding the spreadsheet from scratch every quarter and wondering why the placements are not coming in.