Investor Vetting Report
How a 50-credit Investor Vetting Report is produced: the published standards we adhere to, the limits we will not pretend to overcome, and the corrections process if we get something wrong.
Overview
An Investor Vetting Report is a paginated, twelve-section due-diligence document on one investor — VC partner, angel, family office principal, or solo capitalist. It is generated on demand from public OSINT plus the investor's own enriched profile, takes three to five minutes to produce, costs 50 credits (about $20 USD), and is delivered as a shareable HTML report with a printable PDF view.
It is built for the founder considering an offer, the limited partner conducting reverse due diligence, the journalist writing on a fund's track record, and the corporate development lead evaluating a strategic investor. It is the mirror image of the Founder Vetting Report: the founder's-eye view of the investor, evidence-cited, framework-bound, deliberately neutral.
The report is not a credit report, not a background check, and not an FCRA-regulated consumer report. It does not access sealed records, paywalled fund data, or non-public limited-partner agreements. Everything that appears in the report is recoverable by any sufficiently patient researcher with public-internet access; what we sell is the synthesis, the cited audit trail, the framework adherence declared on this page, and the speed of delivery.
The Four Frameworks We Adopt
MentionFox / Verifierce reports align with four published frameworks. None was written for venture due diligence — they were written for the U.S. intelligence community, professional fact-checkers, UK government intelligence assessment, and regulated financial institutions — but they translate cleanly to investment-grade research, and adopting them publicly is the single most credible thing a research vendor can do.
ICD 203 — Analytic Standards (Office of the Director of National Intelligence)
The U.S. Intelligence Community's Directive 203 defines nine tradecraft standards that any piece of finished intelligence must meet: properly described source quality and credibility, proper expression of uncertainty, distinction between underlying intelligence and analysts' assumptions and judgements, incorporation of analysis of alternatives, customer relevance and implications, clear and logical argumentation, explanation of change to or consistency of judgements, accurate judgements and assessments, and effective visual information where appropriate. We treat these as binding for every Vetting Report we produce. The full list is enumerated in the table below.
KSJ Three Models of Fact-Checking (Knight Science Journalism, MIT)
Deborah Blum's KSJ Fact-Checking Project at MIT documents three distinct working models: pre-publication line-by-line verification (the New Yorker model), post-publication rapid response (the Washington Post Fact Checker model), and structured public-knowledge correction (the BBC editorial model). A Vetting Report borrows from all three. Sonnet drafts each section under cited-evidence constraints (pre-publication discipline). The Corrections section below describes our post-publication response. And the methodology page you are reading enacts the third model — public statement of how we know what we claim to know.
UK PHIA Probability Yardstick (Professional Head of Intelligence Assessment, Cabinet Office)
The Professional Head of Intelligence Assessment publishes a seven-band probability yardstick used across UK government intelligence reporting since 2018. It maps natural-language probability words ("highly unlikely", "realistic possibility", "almost certain") to numeric bands (under 5%, 10-20%, 25-35%, 40-50%, 55-75%, 80-90%, over 95%). The yardstick eliminates the well-documented problem that an analyst writing "may" and a reader interpreting "may" can mean wildly different things. Every probabilistic claim in a MentionFox Vetting Report is expressed using the seven yardstick terms, paired with a separate analytical-confidence rating (High / Moderate / Low) per ICD 203.
AML / KYC Sanctions and PEP Screening Framework
Anti-Money-Laundering and Know-Your-Customer compliance regimes — adopted by every regulated financial institution worldwide — define a public-record screening framework: the OFAC Specially Designated Nationals list, the UN Consolidated Sanctions list, the UK consolidated sanctions list maintained by the Office of Financial Sanctions Implementation (OFSI, part of HM Treasury), the EU Consolidated Financial Sanctions list, and Politically Exposed Persons screening. We adopt that screening framework verbatim in Section 4.5 of every Investor Vetting Report: before a term sheet is signed, the report derives a HIGH / MODERATE / LOW Capital-Source Risk Flag for the fund from the public record. The sanctions lists are free, public, authoritative, and classed Primary in the Source Class taxonomy. The output is founder-facing, not a compliance document — translated into "as a founder pitching this firm, you should know..."
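The screening step can be sketched in a few lines of Python. Everything here is an illustrative assumption — the match threshold, the flag rules, and the in-memory name list stand in for the full published OFAC / UN / UK / EU list files the production screen would run against:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace before comparing names."""
    cleaned = "".join(c if c.isalnum() or c.isspace() else " " for c in name.lower())
    return " ".join(cleaned.split())

def screen_principal(name: str, sanctioned_names: list[str],
                     threshold: float = 0.85) -> list[tuple[str, float]]:
    """Return (listed_name, similarity) pairs at or above the match threshold."""
    target = normalize(name)
    hits = []
    for listed in sanctioned_names:
        score = SequenceMatcher(None, target, normalize(listed)).ratio()
        if score >= threshold:
            hits.append((listed, round(score, 2)))
    return sorted(hits, key=lambda h: -h[1])

def capital_source_flag(sanction_hits: int, adverse_media_24m: int) -> str:
    """Illustrative flag rules: any sanctions hit is HIGH; adverse media alone is MODERATE."""
    if sanction_hits > 0:
        return "HIGH"
    if adverse_media_24m > 0:
        return "MODERATE"
    return "LOW"
```

The fuzzy match (rather than exact string equality) reflects how sanctions lists record transliterated names; real screening engines use far more sophisticated matching.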
The Twelve Sections of an Investor Vetting Report
| # | Section | Purpose |
|---|---|---|
| 1 | Executive Summary | One opening sentence on the investor, three "what's good" bullets, three "worth digging into" bullets, one headline recommendation. Built last, from all the other sections, so the verdict reflects the evidence, not the order it was researched. |
| 2 | Investor-Stage Fit | Stage thesis named explicitly (pre-seed, seed, Series A, multi-stage, growth) with cited evidence from public deal announcements, fund prospectuses, and partner statements. |
| 3 | Check Size & Cadence | Modal first-check size, follow-on cadence, fund-deployment pace, lead-versus-participate ratio. Inferred from public deal disclosures with explicit caveats. |
| 4 | Portfolio Pattern Analysis | Sector concentration, geo concentration, founder-archetype concentration. Patterns named, not just listed. Outliers explicitly flagged. |
| 4.5 | Capital Source & Sanctions Risk | AML / KYC screening framework applied to the firm's public record: disclosed LP composition, geographic concentration, OFAC SDN + UN + UK HMT + EU sanctions screening of named principals, adverse media in the last 24 months, and a HIGH / MODERATE / LOW Capital-Source Risk Flag. Founder-facing voice — "as a founder pitching this firm, you should know..." — not a compliance document. |
| 5 | Co-Investor Network | Frequent syndicate partners, high-conviction-led versus follow-on bets, signal-positive co-investors. Ranked, not catalogued. |
| 6 | Founder Treatment Reputation | Public testimony from portfolio founders. Sourced from podcast interviews, blog posts, and named-attribution press where available. Aggregator-class signal flagged. |
| 7 | Decision Speed | First-meeting-to-term-sheet velocity inferred from cited deal-announcement timing and founder-side public retrospectives. Honest range, not point estimate. |
| 8 | Board Behavior | Public reputation as a board member: founder-friendly, intervention pattern, exit-pressure signals if any. Sourced from named-attribution press only. |
| 9 | Term Sheet Patterns | Reported preferred-terms patterns: liquidation preferences, board composition expectations, pro rata rights, information rights. Where terms are private, the section says so. |
| 10 | Exit Track Record | Realised and unrealised exits where publicly disclosed. IRR claims only when the LP has published them or the fund has voluntarily disclosed; otherwise the claim is omitted, not estimated. |
| 11 | Red Flags & Reputation Risk | Public flags only where evidence supports them. The default state is "no public flags found in extensive search as of [date]" — we do not invent flags to look thorough. |
| 12 | References & Source Citations | Aggregated audit trail of every URL cited above, deduplicated, grouped by source class (Primary, Authoritative-Secondary, Aggregator, Unverified) per ICD 206 sourcing standards. |
The Nine ICD 203 Tradecraft Standards
Each Sonnet section prompt is constructed against these nine standards. Compliance is enforced at draft time (banned-words list, cited-URL minimums, mandatory uncertainty-flag phrasing) and reviewed at the executive-summary synthesis pass.
| # | Standard | How a Vetting Report applies it |
|---|---|---|
| 1 | Properly describes quality and credibility of underlying sources | Every URL in section twelve is tagged with a Source Class label (Primary, Authoritative-Secondary, Aggregator, Unverified). Inline citations carry the same badges. Reader can audit reliance distribution at a glance. |
| 2 | Properly expresses and explains uncertainties | Probabilistic claims use the seven UK PHIA bands. Each is paired with a High / Moderate / Low confidence rating that explains the basis for the confidence judgement. |
| 3 | Properly distinguishes between underlying intelligence and analyst's assumptions and judgements | Sections separate "the public record shows" claims (cited inline) from "we infer" claims (introduced with "we assess", "the pattern suggests", or similar). Decision-speed and board-behavior inferences are explicitly labeled as inference, not direct disclosure. |
| 4 | Incorporates analysis of alternatives | Investor-Stage Fit and Portfolio Pattern sections name the alternative thesis they considered and rejected. Red Flags asks whether absence of evidence is evidence of absence and answers honestly. |
| 5 | Demonstrates customer relevance and addresses implications | Each section opens with the founder-relevant question it answers. Executive Summary closes with a Headline Recommendation directly callable in a term-sheet conversation. |
| 6 | Uses clear and logical argumentation | Sections progress from cited evidence through analysis to verdict, with no hidden inferential leaps. Banned hype and vague-hedge terms: "brilliant", "visionary", "controversial", "may be", "could be", "perhaps", "possibly". |
| 7 | Explains change to or consistency of analytic judgements | Where a re-generation of the same subject changes a numeric claim, the diff is preserved on the report's view page (re-generation history). Methodology change-log on this page records framework updates. |
| 8 | Makes accurate judgements and assessments | Accuracy floor is enforced by the banned-invention rule: where evidence is thin, the report writes "[insufficient public evidence as of date]" instead of fabricating. Any reader can audit the cited-URL trail in section twelve. |
| 9 | Incorporates effective visual information where appropriate | Portfolio-pattern tables, co-investor network blocks, term-sheet pattern tables, exit-track-record tables, and source-class reference tables. Print stylesheet preserves tables for the PDF view. |
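The draft-time enforcement described above (banned-words list, cited-URL minimums, mandatory uncertainty-flag phrasing) can be sketched as a simple lint pass. The two-URL citation floor and the exact flag string are illustrative assumptions; the production thresholds are not published here:

```python
import re

# Banned terms per ICD 203 standard six, as listed in this methodology
BANNED = ["brilliant", "visionary", "controversial",
          "may be", "could be", "perhaps", "possibly"]
MIN_CITATIONS = 2  # assumed per-section floor, illustrative only
URL_RE = re.compile(r"https?://\S+")
UNCERTAINTY_FLAG = "[insufficient public evidence as of"

def lint_section(text: str) -> list[str]:
    """Return a list of draft-time violations for one report section."""
    issues = []
    lowered = text.lower()
    for term in BANNED:
        if term in lowered:
            issues.append(f"banned term: {term!r}")
    citations = URL_RE.findall(text)
    # Thin sections must either meet the citation floor or carry the flag
    if len(citations) < MIN_CITATIONS and UNCERTAINTY_FLAG not in lowered:
        issues.append(f"only {len(citations)} cited URLs and no uncertainty flag")
    return issues
```

A section that fails the lint is re-drafted before it reaches the executive-summary synthesis pass.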
UK PHIA Probability Yardstick
Verbatim from the UK Professional Head of Intelligence Assessment guidance. Used in MentionFox Vetting Reports anywhere the underlying claim is probabilistic rather than asserted.
| Yardstick band | Numeric range | Used in a Vetting Report when |
|---|---|---|
| Remote chance | Under 5% | The claim is conceivable but no public evidence supports it. |
| Highly unlikely | 10 - 20% | Public evidence weakly contradicts the claim or strong base rates contradict it. |
| Unlikely | 25 - 35% | Public evidence is mixed and tilts away from the claim. |
| Realistic possibility | 40 - 50% | Public evidence is mixed, neither tilt is decisive, and the claim is operationally relevant. |
| Likely / probably | 55 - 75% | Public evidence supports the claim with one or more corroborating sources. |
| Highly likely | 80 - 90% | Multiple authoritative sources independently support the claim and no countervailing public evidence surfaces. |
| Almost certain | Over 95% | Primary-source evidence is on the record and uncontested. |
Every PHIA-band claim is paired with an Analytical Confidence rating expressed separately:
- High Confidence — multiple independent authoritative sources, recently re-verified.
- Moderate Confidence — one authoritative source or multiple secondary sources.
- Low Confidence — claim is inferred from indirect evidence or relies on uncited assumptions; the reader should weight accordingly.
Example phrasing as it appears in a real report: "Highly likely (~85%) the partner leads the round at typical Series A check size of $8-12M (Confidence: Moderate — based on the last six lead-position deals on Crunchbase plus partner statements in two recorded podcasts)." The phrasing is mandatory; vague hedges ("may", "could", "perhaps", "possibly") are banned in the Sonnet prompts.
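The band-plus-confidence pairing can be expressed as a small lookup over the yardstick table above. The snap-to-nearest-edge rule for values falling in the deliberate gaps between bands (e.g. 22%) is our illustrative choice, not part of the published yardstick:

```python
# (band term, low %, high %) — numeric ranges from the yardstick table
PHIA_BANDS = [
    ("Remote chance", 0, 5),
    ("Highly unlikely", 10, 20),
    ("Unlikely", 25, 35),
    ("Realistic possibility", 40, 50),
    ("Likely / probably", 55, 75),
    ("Highly likely", 80, 90),
    ("Almost certain", 95, 100),
]

def phia_term(probability_pct: float) -> str:
    """Map a numeric probability to its yardstick term, snapping gap values
    to the nearest band edge (the yardstick forbids language between bands)."""
    best = min(PHIA_BANDS,
               key=lambda b: min(abs(probability_pct - b[1]),
                                 abs(probability_pct - b[2])))
    return best[0]

def phrase(claim: str, probability_pct: float, confidence: str, basis: str) -> str:
    """Render the mandatory phrasing: band term, ~%, claim, confidence, basis."""
    return (f"{phia_term(probability_pct)} (~{probability_pct:.0f}%) {claim} "
            f"(Confidence: {confidence} — {basis})")
```

Forcing every probabilistic sentence through this template is what keeps "may" out of the reports entirely.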
Source Class Taxonomy
Per ICD 206, every citation declares its source class. Four classes, ordered by reliability:
| Class | Definition | Examples |
|---|---|---|
| Primary | The investor's own writing on a domain they own; firm-published deal announcements; SEC Form D filings and Form ADV; primary regulatory disclosures. | The fund's own site, the partner's personal site, X / Twitter at exact-match handle, sec.gov, federalregister.gov, court records, wikidata.org subject pages. |
| Authoritative-Secondary | Reputable institutional secondary sources with editorial standards, named authors, and a public corrections process. | Wikipedia, Reuters, Associated Press, New York Times, Wall Street Journal, Financial Times, BBC, NPR, ProPublica, Bloomberg, The Atlantic, Crunchbase company pages, LinkedIn rate-limited verified profiles, The Information, Pitchbook editorial articles where attributable. |
| Aggregator | Useful structured data with weaker editorial review, often crowd-sourced, often sales-funnelled. | Crunchbase rankings, AngelList listings, G2, Capterra, SimilarWeb, BuiltWith, Glassdoor, Indeed, ProductHunt rankings. |
| Unverified | Forum posts, anonymous comments, third-party blogs without named authorship, content on platforms with no editorial review. | Reddit, Hacker News comments, Stack Overflow, Medium posts not authored by the subject, Substack posts not authored by the subject, generic blog hosts. |
Section twelve of every Vetting Report opens with a class-distribution summary of the form "47 citations: 18 Primary, 22 Authoritative-Secondary, 5 Aggregator, 2 Unverified." A report with too few Primary citations or too many Unverified ones is a self-disclosing weakness; the reader can decide what weight to put on it.
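The class-distribution line can be computed mechanically from the citation URLs. The domain-to-class map below is a tiny illustrative subset of the taxonomy table above, and defaulting unknown hosts to Unverified is our assumption, not a stated rule:

```python
from collections import Counter
from urllib.parse import urlparse

# Illustrative subset of the Source Class taxonomy; the real map is far larger
DOMAIN_CLASS = {
    "sec.gov": "Primary",
    "federalregister.gov": "Primary",
    "reuters.com": "Authoritative-Secondary",
    "nytimes.com": "Authoritative-Secondary",
    "crunchbase.com": "Aggregator",
    "reddit.com": "Unverified",
}
CLASS_ORDER = ["Primary", "Authoritative-Secondary", "Aggregator", "Unverified"]

def classify(url: str) -> str:
    """Look up a URL's source class by host; unknown hosts count as Unverified."""
    host = urlparse(url).netloc.lower().removeprefix("www.")
    return DOMAIN_CLASS.get(host, "Unverified")

def distribution_summary(urls: list[str]) -> str:
    """Emit the section-twelve opening line, e.g. '3 citations: 1 Primary, ...'."""
    unique = sorted(set(urls))  # deduplicate, per section-twelve rules
    counts = Counter(classify(u) for u in unique)
    parts = ", ".join(f"{counts.get(c, 0)} {c}" for c in CLASS_ORDER)
    return f"{len(unique)} citations: {parts}"
```

Because the count is derived from the cited URLs themselves, a report cannot claim a stronger reliance distribution than its own audit trail supports.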
Honest Limits
What we DO do that competitors do not
- Synthesis-tier output: 12-section narrative report with cited evidence and verdicts, not a database export with empty fields.
- Public methodology: this page. Our exact frameworks are auditable by anyone, including by competitors. We treat that as a feature.
- Asymmetric pricing: 50 credits (about $20) for a full Investor Vetting Report. Founders evaluating a term sheet from a multi-stage fund typically have no analogous reverse due-diligence product available at any price; the closest institutional equivalents at industry-research vendors run 5x to 50x the price.
- Adopted intelligence-community frameworks (ICD 203, ICD 206, UK PHIA Yardstick) and pharmaceutical-grade data integrity (ALCOA), in writing, openly.
What we DO NOT do
- We do not access sealed court records, sealed limited-partner agreements, non-public side letters, or unfiled regulatory data.
- We do not call investors, send pretextual outreach, or solicit private disclosures from the investor's portfolio founders.
- We do not query paywalled investigative archives or fund databases (Pitchbook, Preqin, etc.) that require institutional credentials we do not hold.
- We do not conduct FCRA-regulated consumer reporting. An Investor Vetting Report is not a credit report and may not be used for employment, housing, or credit decisions covered by the Fair Credit Reporting Act.
- We do not invent claims to fill thin sections. Where evidence is genuinely absent, the report writes "[insufficient public evidence as of date]" and moves on.
- We do not promise predictive accuracy. An Investor Vetting Report is a synthesis of the public record at the moment of generation. Founders' decisions to take or refuse capital remain their own.
Corrections Policy
Modeled on the BBC editorial corrections process. Three commitments:
- Identification window. Errors flagged within thirty days of report generation are corrected on the canonical view URL within five business days. Errors flagged after thirty days are evaluated and corrected at our discretion.
- Re-publication, not silent edit. Corrections do not overwrite the prior text silently. The report's view page preserves a redline diff between the original draft and the corrected draft, time-stamped, with a one-line explanation of the correction. Any reader who saw the original can audit the change.
- Subject right of reply. The investor named in any Vetting Report may submit a one-paragraph factual rebuttal to corrections@mentionfox.com. Verifiable rebuttals attach to the report alongside the original section. Where the investor and our research disagree on a public-record claim, both views are surfaced; we do not silently capitulate, and we do not refuse to publish the investor's view.
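The preserved redline described in the second commitment can be produced with a standard unified diff. The view-page rendering, timestamping, and storage are assumed details outside this sketch; only the auditable diff payload is shown:

```python
import difflib

def correction_diff(original: str, corrected: str, note: str) -> str:
    """Produce the preserved redline: a unified diff plus the one-line
    explanation of the correction. Any reader of the original can audit it."""
    diff = difflib.unified_diff(
        original.splitlines(keepends=True),
        corrected.splitlines(keepends=True),
        fromfile="original draft",
        tofile="corrected draft",
    )
    return "".join(diff) + f"\nCorrection note: {note}\n"
```

Keeping the diff rather than the silently edited text is what makes the "re-publication, not silent edit" commitment verifiable.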
Data integrity floor — ALCOA. Every Vetting Report carries an ALCOA Methodology footer: each factual claim is Attributable to a cited source, presented in Legible plain language, marked with the date it was Contemporaneously verified, sourced from the Original primary record where available, and Accurately reflects the underlying evidence, with explicit uncertainty flags where evidence is thin. ALCOA is the U.S. Food and Drug Administration's data-integrity principle for regulated industries; we adopt it as the floor for synthesis research because it captures the same disciplines without the regulatory overhang.
References
Primary documents referenced throughout this methodology declaration. All publicly available; we encourage readers to read them in the original.
- ICD 203 — Analytic Standards — Office of the Director of National Intelligence (2015).
- ICD 206 — Sourcing Requirements for Disseminated Analytic Products — Office of the Director of National Intelligence.
- KSJ Science Editing Handbook — Knight Science Journalism, MIT.
- UK Professional Head of Intelligence Assessment — Probability Yardstick — UK Government / Cabinet Office.
- FDA Data Integrity and Compliance With Drug CGMP — ALCOA principles — U.S. Food and Drug Administration.
- BBC Editorial Guidelines — Accuracy and Corrections — British Broadcasting Corporation.
- Fair Credit Reporting Act (FCRA) overview — Federal Reserve History project. (Cited as the regulated-reporting regime an Investor Vetting Report is explicitly NOT.)
Methodology v1.0 · Published 2026-05-03 · Verifierce / MentionFox · Founder Vetting Report methodology →