Fact-Checking Framework
Claim Analysis Process
Step 1: Identify the Claim
- What exactly is being asserted?
- Is it a single claim, or several claims bundled together?
- What is asserted explicitly, and what is only implied?
Example:
"Violent crime has doubled since 2020"
Claims embedded:
- Violent crime has increased (verifiable)
- The increase is 100% (verifiable)
- 2020 is the baseline (affects interpretation)
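A minimal sketch of how that decomposition could be recorded, assuming a plain Python data class; the names `Claim` and `SubClaim` are illustrative, not part of any existing tool:

```python
from dataclasses import dataclass, field

@dataclass
class SubClaim:
    text: str            # the single assertion being checked
    verifiable: bool     # can this be checked against evidence?
    note: str = ""       # caveats that affect interpretation

@dataclass
class Claim:
    original: str                                        # the statement as given
    sub_claims: list[SubClaim] = field(default_factory=list)

# "Violent crime has doubled since 2020", decomposed into checkable parts
claim = Claim(
    original="Violent crime has doubled since 2020",
    sub_claims=[
        SubClaim("Violent crime has increased", verifiable=True),
        SubClaim("The increase is 100%", verifiable=True),
        SubClaim("2020 is the baseline", verifiable=True,
                 note="a pandemic-era baseline affects interpretation"),
    ],
)
```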
Step 2: Identify What's Verifiable
| Claim Type | How to Verify |
|---|---|
| Statistics | Find original data source |
| Quotes | Find primary source, check context |
| Events | Cross-reference news, records |
| Causation | Hardest to verify; requires ruling out confounders, not just noting a correlation |
| Predictions | Cannot verify (flag as opinion/forecast) |
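One way to keep these approaches at hand is a simple lookup keyed by claim type; the sketch below mirrors the table above, and the claim-type labels plus the `how_to_verify` helper are assumptions, not a standard taxonomy:

```python
# Claim types mapped to how they can be verified (mirrors the table above)
VERIFICATION_APPROACH = {
    "statistic": "Find the original data source",
    "quote": "Find the primary source and check the surrounding context",
    "event": "Cross-reference news reports and public records",
    "causation": "Look for studies that rule out confounders, not just correlation",
    "prediction": None,  # cannot be verified; flag as opinion/forecast
}

def how_to_verify(claim_type: str) -> str:
    if claim_type not in VERIFICATION_APPROACH:
        raise ValueError(f"unknown claim type: {claim_type}")
    return VERIFICATION_APPROACH[claim_type] or "Cannot verify; flag as opinion or forecast"

print(how_to_verify("prediction"))  # Cannot verify; flag as opinion or forecast
```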
Step 3: Find Sources
Priority order:
- Primary sources (original data, documents)
- Government/official statistics
- Academic research (peer-reviewed)
- Quality journalism (multiple sources, corrections policy)
- Expert statements (credentialed in relevant field)
Step 4: Assess and Rate
Verdict Categories
| Verdict | When to Use | Example |
|---|---|---|
| ✅ TRUE | Claim is accurate as stated | "The Earth orbits the Sun" |
| ⚠️ MOSTLY TRUE | Accurate but context missing | "Jeff Bezos is the richest person" (depends on metric and date) |
| ❓ MIXED | Some elements true, some false | Complex claims with multiple parts |
| ⚠️ MOSTLY FALSE | Grain of truth but misleading | Misattributed quote, wrong numbers |
| ❌ FALSE | Claim is inaccurate | "Humans only use 10% of their brain" |
| 🔍 UNVERIFIABLE | Cannot confirm either way | Claims about private events |
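If verdicts need to be handled programmatically, an enum keeps the labels consistent; a sketch in Python (the class name `Verdict` is an assumption, and the 🎭 SATIRE label from the satire section below is included for completeness):

```python
from enum import Enum

class Verdict(Enum):
    TRUE = "✅ TRUE"
    MOSTLY_TRUE = "⚠️ MOSTLY TRUE"
    MIXED = "❓ MIXED"
    MOSTLY_FALSE = "⚠️ MOSTLY FALSE"
    FALSE = "❌ FALSE"
    UNVERIFIABLE = "🔍 UNVERIFIABLE"
    SATIRE = "🎭 SATIRE"  # see "Handling Satire" below

print(Verdict.MOSTLY_TRUE.value)  # ⚠️ MOSTLY TRUE
```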
Source Credibility Assessment
High Credibility
- Peer-reviewed academic journals
- Government statistics agencies (BLS, Census, etc.)
- Established news organizations with corrections policies
- Primary documents and data
- Expert consensus in relevant field
Medium Credibility
- Think tanks (note ideological leaning)
- Industry reports (note potential bias)
- Individual expert opinions
- News analysis and opinion pieces
- Wikipedia (good starting point, verify sources)
Low Credibility
- Anonymous sources
- Self-published content without citations
- Known partisan sources on partisan topics
- Social media posts without verification
- "Studies show" without specific citation
Red Flags
- No sources cited
- Sources don't say what's claimed
- Cherry-picked data or quotes
- Old data presented as current
- Correlation presented as causation
- Emotional manipulation tactics
Common Fact-Check Patterns
Misleading Statistics
- Out of context: "Murder up 30%!" (a rise from a historic low can still leave the rate below average)
- Wrong baseline: Using 2020 (a pandemic anomaly) as the comparison point
- Cherry-picked timeframe: Choosing start and end dates that fit the narrative
- Relative vs absolute: "a 50% increase" sounds scarier than "the rate rose from 1.0 to 1.5 per 100,000" (see the sketch below)
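The relative-versus-absolute trap is easy to demonstrate with arithmetic; a quick sketch with invented numbers:

```python
old_rate, new_rate = 1.0, 1.5   # incidents per 100,000 residents

def percent_change(old: float, new: float) -> float:
    """Relative change, which can sound dramatic when the baseline is small."""
    return (new - old) / old * 100

print(f"{percent_change(old_rate, new_rate):.0f}% increase")   # 50% increase
print(f"absolute change: {new_rate - old_rate} per 100,000")   # 0.5 per 100,000
```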
Misattributed Quotes
- Said by someone else
- Never said at all (fabricated)
- Taken out of context
- Paraphrased inaccurately
False Causation
- Correlation ≠ causation
- Confounding variables ignored
- Reversed causation
- Coincidence presented as pattern
Limitations to Acknowledge
What Can't Be Fact-Checked
- Future predictions
- Private conversations (unless recorded)
- Subjective opinions
- Claims about intentions
- Real-time events (information still emerging)
Inherent Uncertainty
- Some questions have genuine scientific debate
- Historical events may have incomplete records
- Statistics can be calculated in different ways
- Context can legitimately change interpretation
Always:
- State confidence level
- Acknowledge limitations
- Provide sources for verification
- Note if experts disagree
Detecting Myths and Misattributions
Quote Verification Strategies
When someone attributes a quote to a famous person, actively look for debunking:
- Search for debunking first: Query "did [person] actually say [quote]" or "[quote] misattributed"
- Check specialized sources: Quote Investigator, Wikiquote's "Misattributed" sections, Snopes
- Find earliest occurrence: The quote should appear in sources from the person's lifetime
- Verify primary source: Can you find audio, video, or a published work with the quote?
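A small helper can make the debunking-first habit concrete; the sketch below only builds search queries, and the `debunk_queries` name and query wording are assumptions, not a fixed formula:

```python
def debunk_queries(person: str, quote: str) -> list[str]:
    """Search queries aimed at finding debunking, not confirmation."""
    return [
        f'did {person} actually say "{quote}"',
        f'"{quote}" misattributed',
        f'"{quote}" quote origin earliest source',
    ]

for q in debunk_queries("Einstein", "Everything should be made as simple as possible"):
    print(q)
```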
Red flags for fake quotes:
- Too-perfect phrasing that sounds modern
- Quote perfectly supports a contemporary argument
- No primary source citation, just "Einstein said..."
- Suspiciously witty or profound (sounds like a meme)
Myth Detection Signals
Popular myths often share characteristics:
- Too neat: "Humans only use 10% of their brain" — suspiciously round numbers
- Counterintuitive hook: Designed to be memorable and shareable
- Widespread belief, weak sourcing: Everyone "knows" it but can't cite a study
- Appeals to authority vaguely: "Scientists say..." without specifics
Debunking search strategy:
- Search "[claim] myth" or "[claim] debunked"
- Check Snopes, academic debunking papers, science communication sites
- Look for systematic reviews or meta-analyses that address the claim
Research mindset: Don't just search for confirmation. Actively look for credible sources that contradict the claim. Confidence increases only when you find no credible debunking and do find strong independent support.
Context-Dependent Claims
Some claims cannot be verified without additional context. Identify these early.
Jurisdiction-Dependent Claims
Claims about legality vary by location:
- "It's illegal to collect rainwater" → True in some US states, false in others
- "You can turn right on red" → Varies by country and locality
- Tax rules, age limits, regulations → Almost always jurisdiction-specific
Handling: Ask the user for their location before rendering a verdict. If the location is unknown, explain that the answer varies by jurisdiction and give examples.
Time-Sensitive Claims
Claims with superlatives or rankings change over time:
- "X is the largest company" → By what metric? As of when?
- "Y holds the record for..." → Records get broken
- Statistics and rankings → Require date context
Handling: Verify the claim against the most recent reliable data. Note the date of that data and warn if it may be stale.
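For time-sensitive claims, it helps to record the date of the underlying data and flag it when old; a sketch with an arbitrary one-year threshold (the `staleness_warning` helper and the threshold are assumptions to adjust per claim type):

```python
from datetime import date

def staleness_warning(data_date: date, max_age_days: int = 365) -> str | None:
    """Warn when the data behind a verdict may be out of date."""
    age_days = (date.today() - data_date).days
    if age_days > max_age_days:
        return (f"Underlying data is {age_days} days old (as of {data_date.isoformat()}); "
                "the ranking or record may have changed since.")
    return None

print(staleness_warning(date(2023, 1, 15)))
```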
Conditional Claims
Claims that are true only under specific circumstances:
- "Coffee is bad for you" → Depends on amount, individual health, what "bad" means
- "Electric cars are better for the environment" → Depends on electricity source, manufacturing, comparison baseline
Handling: Explain the conditions under which the claim is true or false. Use MIXED verdict with clear breakdown.
Satire Detection
Satirical content should not be fact-checked as if it were a serious claim.
Known Satire Sources
Identify content from satirical publications:
- The Onion, ClickHole
- The Babylon Bee
- Reductress
- The Daily Mash
- Waterford Whispers News
- Private Eye (some sections)
- The Borowitz Report
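When the content arrives as a URL, a domain check against known satirical outlets is a cheap first test; a sketch in which the domain list is partial and illustrative (publications change hosting, so verify before relying on it):

```python
from urllib.parse import urlparse

# Partial, illustrative list of satirical outlets' domains
SATIRE_DOMAINS = {
    "theonion.com", "clickhole.com", "babylonbee.com", "reductress.com",
    "thedailymash.co.uk", "waterfordwhispersnews.com",
}

def is_known_satire(url: str) -> bool:
    host = urlparse(url).netloc.lower().removeprefix("www.")
    return host in SATIRE_DOMAINS

print(is_known_satire("https://www.theonion.com/some-article"))  # True
```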
Satire Indicators
Even from unknown sources, watch for:
- Absurdly specific or unlikely details
- Too-perfect quotes that are obviously jokes
- Humorous or ironic tone throughout
- Headlines that seem designed to provoke outrage, then relief once the reader realizes it is a joke
- "Area man" or similar obviously fictional framing
Handling Satire
If content appears to be satire:
- Verify the source is satirical
- Use verdict: 🎭 SATIRE
- Explain: "This appears to be from [source], a satirical publication. It is not intended as factual reporting."
- Do NOT apply serious fact-checking to obvious jokes
Edge case: Sometimes real news sounds like satire. If unsure, verify the source's nature before dismissing as satire.