OSINT Methodology

Source triangulation

Source triangulation is the practice of validating claims across multiple independent source types—not just multiple mentions, but genuinely distinct evidence classes (regulatory filings, verified news, direct observation)—to build confidence before treating data as actionable. The central error it guards against is treating high-volume derivative mentions as independent verification.

Without triangulation, you're trusting single-source claims. One LinkedIn update. One press release. One database entry. With triangulation, you're building probabilistic confidence: "Three independent source types confirm this person is CIO—regulatory filing shows title, industry publication quotes them in role, company website lists them." This is the difference between "probably right" and "confident enough to act."

The error is treating volume as validation. Ten mentions from the same root source (syndicated press release, copied database entry) don't triangulate—they amplify the same potential error. Real triangulation requires independent origination: regulatory (Form ADV, SEC filings), verified media (paywalled business press), direct observation (company website, event roster), and cross-validation (multiple regulatory filings converging).
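
As a minimal sketch, derivative mentions can be collapsed to their root origination before counting sources. The record fields (`root`, `tier`) and the identifiers are illustrative, not a standard schema:

```python
def independent_sources(mentions):
    """Collapse derivative mentions: syndicated copies share one root."""
    roots = {}
    for m in mentions:
        roots[m["root"]] = m["tier"]  # one entry per distinct origination
    return roots

# Four mentions, but the two syndicated copies share a single root:
mentions = [
    {"root": "pr-2024-017", "tier": "verified_media"},
    {"root": "pr-2024-017", "tier": "verified_media"},  # syndicated copy
    {"root": "form-adv-2024", "tier": "regulatory"},
    {"root": "company-site", "tier": "direct_observation"},
]
roots = independent_sources(mentions)  # three independent sources remain
```

Counting `roots` rather than `mentions` is what keeps ten syndicated copies from masquerading as ten confirmations.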

How allocators define source triangulation risk drivers

Teams evaluate triangulation quality through:

  • Source independence: distinct origination (not syndicated/copied content)
  • Tier diversity: mixing Tier 1 (regulatory) + Tier 2 (verified press) + Tier 3 (direct observation)
  • Temporal alignment: sources from similar time periods (avoid mixing a 2-year-old filing with yesterday's news)
  • Claim consistency: sources agree on facts, not just mention the same entity
  • Evidence phrases: "confirmed via," "cross-referenced against," "validated through," "triangulated across"
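
A rough sketch of how these criteria might be checked programmatically. The tuple layout, tier labels, example claim, and the one-year window are assumptions for illustration; source independence is assumed to have been established separately (e.g. by root-origination dedup):

```python
from datetime import date

def triangulation_check(sources, max_spread_days=365):
    """Apply the criteria above to a candidate source set.

    Each source is a (tier, claim, published) tuple.
    """
    tiers = {tier for tier, _, _ in sources}
    claims = {claim for _, claim, _ in sources}
    dates = [published for _, _, published in sources]
    spread = (max(dates) - min(dates)).days
    return {
        "tier_diversity": len(tiers) >= 2,      # mixes Tier 1/2/3 evidence
        "claim_consistency": len(claims) == 1,  # sources agree on the fact
        "temporal_alignment": spread <= max_spread_days,
    }

# Three distinct tiers, one consistent claim, all within a year:
result = triangulation_check([
    ("regulatory", "Jane Doe is CIO", date(2024, 1, 15)),
    ("verified_media", "Jane Doe is CIO", date(2024, 2, 1)),
    ("direct_observation", "Jane Doe is CIO", date(2024, 3, 10)),
])
```

Returning a per-criterion breakdown rather than a single boolean keeps the failure mode visible: a set that fails only `temporal_alignment` needs refreshed sources, not new source types.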

Allocator framing:
"Is this data point confirmed across independent sources—or are we seeing the same claim echoed through derivative channels?"

Where triangulation matters most

  • high-stakes fields: role assignments, decision authority, mandate changes, investment activity
  • new relationship targeting where errors are costly
  • contested or ambiguous claims (multiple people with same name, unclear entity structures)
  • verification of significant changes (CIO departure, mandate shift, firm restructuring)

How triangulation changes outcomes

Strong triangulation discipline:

  • eliminates targeting errors from single-source mistakes
  • builds field-level confidence scores (High: 3+ independent sources; Medium: 2 sources; Low: 1 source)
  • surfaces conflicts between sources that reveal change events
  • protects sender reputation by preventing outreach to wrong contacts
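
The field-level scoring rule above maps directly onto a small function; the labels follow the High/Medium/Low scheme in the text, and the count is assumed to already exclude derivative mentions:

```python
def field_confidence(independent_source_count):
    """Field-level confidence from the count of independent sources."""
    if independent_source_count >= 3:
        return "High"    # 3+ independent source types confirm the field
    if independent_source_count == 2:
        return "Medium"
    return "Low"         # single-source claim: verify before acting
```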

Weak triangulation discipline:

  • amplifies errors when copying from unreliable databases
  • creates false confidence from multiple mentions of same root claim
  • misses entity changes because a single outdated source wasn't challenged
  • produces targeting failures that damage GP credibility with LPs

How allocators evaluate triangulation discipline

Confidence increases when teams:

  • specify exact sources used for triangulation (not just "we verified")
  • explain why sources are independent (different origination paths)
  • flag conflicts between sources and show resolution logic
  • assign field-level confidence based on triangulation completeness

What slows decision-making

  • claiming triangulation with derivative sources (same press release syndicated 5x)
  • treating high volume (10 mentions) as equivalent to independence (3 distinct source types)
  • no documentation of source paths and independence verification
  • mixing an aged Tier 1 source with a current Tier 3 source without recency discipline

Common misconceptions

"Three mentions = triangulated." → Three copies of the same source aren't independent.
"Triangulation proves certainty." → It quantifies confidence; even strong triangulation can be wrong if all sources rely on the same outdated root claim.
"Always triangulate everything." → Prioritize high-stakes fields; over-triangulation wastes resources on stable data.

Key allocator questions during diligence

  • What source types did you use to triangulate this claim?
  • Are these sources truly independent or derivative?
  • What is the field-level confidence score (High/Medium/Low)?
  • When was each source published or updated?
  • What conflicts did you find between sources, and how were they resolved?

Key takeaways

  • Source triangulation = confirming across independent source types (regulatory + news + direct), not just multiple mentions
  • Use it for high-stakes fields (roles, mandates, decision authority) where errors are costly; skip for stable identifiers
  • Conflicts between sources are signals—investigate rather than ignore; they often reveal change events or disambiguation needs