What “Inconclusive” Really Means: A Guide to Understanding HTPBE Verification Results

Code examples verified against the API as of March 2026. If the API has changed since then, check the changelog.
You uploaded a document. HTPBE returned inconclusive. Now what?
If your first reaction was “the tool couldn’t figure it out,” you’re not alone — but that’s not what inconclusive means here. It’s a specific finding with a specific implication, and once you understand it, you’ll realize it’s often the most useful result HTPBE can give you.
This guide explains all three verdicts and — most importantly — what action to take for every result you’ll see.
The Three Verdicts
HTPBE analyzes the internal structure of a PDF: its metadata, cross-reference tables, timestamps, producer software, and incremental save history. Based on what it finds, it assigns one of three statuses.
intact — Document is clean
No post-creation modifications were detected. The document was produced by software consistent with institutional or enterprise systems — banking platforms, university registrar software, enterprise document management systems.
What this means: The file structure is consistent with what you would expect from a legitimately generated document. Proceed with confidence.
What it doesn’t mean: A forensic guarantee of authenticity. Metadata can theoretically be spoofed by someone with technical knowledge. But for the vast majority of document verification scenarios, intact is a strong positive signal.
modified — Post-creation edits detected
The analysis found evidence that the document was altered after its original creation. This could mean timestamps differ between creation and the last save, the cross-reference table shows incremental updates consistent with editing sessions, or the producer software is a PDF editor rather than the original creation tool.
What this means: Do not proceed with this document at face value. Request the original from the sender through a different channel — not a re-upload from the same person, but a fresh copy obtained directly from the source (bank portal, institution records office, official email from a verified domain).
Note on context: Not every modification is malicious. A contract re-saved after adding a signature, or a form with filled-in fields, will sometimes show modification signals. The findings array in the full response will tell you specifically what was detected. Read it before drawing conclusions.
inconclusive — Consumer software origin
This is the verdict that confuses people most. Here is the exact definition:
No post-creation modifications were detected. However, the document was created by consumer software — Microsoft Word, Microsoft Excel, Canva, LibreOffice, Google Docs, or similar personal productivity tools — rather than institutional software.
This matters because of what it implies: anyone with a laptop could have created this document from scratch. You cannot verify it’s authentic based on file structure alone, because there is no institutional fingerprint to verify against.
inconclusive is not a failure. It is a finding about the document’s origin, and that finding has clear implications depending on what type of document you’re looking at.
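To make the three verdicts concrete, here is a minimal dispatch sketch. The status field and its three values come from this guide; the result dict is a stand-in for a parsed API response, not an actual HTPBE client call.

```python
# Minimal sketch: dispatching on the three HTPBE verdicts.
# `result` stands in for a parsed JSON response; only the
# "status" field described in this guide is used.

def summarize(result: dict) -> str:
    status = result["status"]
    if status == "intact":
        return "No post-creation modifications; institutional origin."
    if status == "modified":
        return "Post-creation edits detected; request the original."
    if status == "inconclusive":
        return "No edits detected, but consumer-software origin."
    raise ValueError(f"Unexpected status: {status!r}")

print(summarize({"status": "inconclusive"}))
# No edits detected, but consumer-software origin.
```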
Why inconclusive Is Often the Most Valuable Result
Consider three scenarios:
Scenario 1: Loan applicant submits a “bank statement”
```json
{
  "id": "d8f0b234-56c7-89de-f012-345678901bcd",
  "status": "inconclusive",
  "status_reason": "consumer_software_origin",
  "creator": null,
  "producer": "Microsoft Excel",
  "origin": { "type": "consumer_software", "software": "Microsoft Excel" },
  "xref_count": 1,
  "has_incremental_updates": false,
  "modification_markers": []
}
```
No modifications were detected. But the producer is Microsoft Excel.
No bank generates official statements in Excel. Retail banking systems — whether at a major institution or a credit union — generate PDF statements from core banking software, not from a spreadsheet application. This result tells you: whoever submitted this document created it themselves, in Excel, and exported it as a PDF. Request the statement directly from the bank.
Scenario 2: Job candidate submits an “MIT diploma”
inconclusive, producer: Canva.
MIT does not issue diplomas from Canva. No accredited university does. The producer field alone tells you the document’s entire story: someone opened a template in a design tool, filled in their name, and exported a PDF. Request official verification directly from the institution — MIT, like most universities, offers transcript and credential verification services.
Scenario 3: A vendor sends a contract they drafted
inconclusive, producer: Microsoft Word.
This is completely normal. A vendor who drafted a contract for your review almost certainly used Word, Google Docs, or a similar tool. inconclusive here does not raise a concern about the document’s legitimacy — it just reflects the reality that you cannot verify the “origin” of a document the other party wrote themselves. What you can verify is whether it was modified after they sent it to you, and in this case, no modifications were found.
The lesson: inconclusive does not mean the same thing for every document type. Context determines the implication.
Reading the Status: The Primary Verdict
The status field is the primary verdict. It has three possible values: intact, modified, and inconclusive. Use it as the authoritative finding.
An inconclusive result means no modifications were detected after creation, but the document’s origin cannot be confirmed as institutional. The concern with inconclusive is the origin, not tampering.
An intact result is worth cross-checking with the modification_markers array. If that array is non-empty, modification signals were detected even though the overall verdict is intact; review the markers and confirm they are benign before proceeding.
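That cross-check is a one-liner in practice. A sketch, assuming result is the parsed JSON response with the fields shown in this guide:

```python
# Flag intact results that still carry modification markers,
# so a human reviews them before the document proceeds.

def needs_review(result: dict) -> bool:
    return result["status"] == "intact" and bool(result.get("modification_markers"))

clean = {"status": "intact", "modification_markers": []}
edge = {"status": "intact",
        "modification_markers": ["Different creation and modification dates"]}

print(needs_review(clean))  # False
print(needs_review(edge))   # True
```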
Combining the Signals: A Decision Table
In practice, you’re always reading two things together: status and modification_markers. Here is a simple guide to the most common combinations:
| Status | Recommended Action |
|---|---|
| intact | Proceed with confidence. Review modification_markers for any edge cases. |
| intact (markers present) | Review the markers. If they describe benign events (e.g. LTV signature preservation), proceed. If they describe unexpected editing, investigate. |
| modified | Do not proceed. Request the original document from the source via a channel that bypasses the person who submitted it. |
| inconclusive | Check the context. Is consumer software expected for this document type? (Vendor-drafted contract: yes. Bank statement: no.) Act accordingly. |
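The table above can be sketched as a routing function. The statuses and field names come from this guide; the set of document types where consumer software is plausible, and the action labels, are illustrative assumptions for this sketch, not part of the HTPBE API.

```python
# Illustrative routing for the decision table. Which document types
# plausibly originate from consumer software is a policy assumption.

CONSUMER_OK = {"vendor_contract", "cover_letter", "internal_memo"}

def recommended_action(result: dict, doc_type: str) -> str:
    status = result["status"]
    markers = result.get("modification_markers", [])
    if status == "modified":
        return "reject: request original via an independent channel"
    if status == "intact":
        return "review markers" if markers else "proceed"
    if status == "inconclusive":
        return "proceed" if doc_type in CONSUMER_OK else "flag for follow-up"
    raise ValueError(f"Unexpected status: {status!r}")

print(recommended_action({"status": "inconclusive"}, "bank_statement"))
# flag for follow-up
print(recommended_action({"status": "inconclusive"}, "vendor_contract"))
# proceed
```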
Reading the modification_markers Array
The modification_markers array is the most specific part of any HTPBE response. Each entry is a plain-language string naming one detected modification signal.
For example:
```json
"modification_markers": [
  "Different creation and modification dates"
]
```
This tells you exactly what structural signal triggered the verdict. Use the modification_markers array to distinguish between:
- A single marker (one clear signal) vs. multiple markers (several independent signals fired simultaneously)
- A high-certainty marker like "Digital signature was removed" vs. a structural marker like "Multiple cross-reference tables (incremental updates)"
- An empty array (no modification detected) vs. a non-empty array (evidence present)
The verdict gives you the headline. The modification_markers array gives you the evidence.
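One way to act on that distinction is to separate high-certainty marker strings from structural ones. The two example strings are the ones quoted above; which markers count as high certainty is this sketch's policy assumption, not something the API returns.

```python
# Hypothetical triage of marker strings. The HIGH_CERTAINTY set is
# a local policy decision, not an HTPBE field.

HIGH_CERTAINTY = {"Digital signature was removed"}

def triage(markers: list[str]) -> str:
    if not markers:
        return "no evidence of modification"
    if any(m in HIGH_CERTAINTY for m in markers):
        return "high-certainty modification signal"
    return f"{len(markers)} structural signal(s); review individually"

print(triage(["Multiple cross-reference tables (incremental updates)"]))
# 1 structural signal(s); review individually
```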
A Note on What HTPBE Cannot Detect
PDF metadata analysis is powerful, but it has boundaries worth knowing.
HTPBE analyzes the file’s internal structure, not its visual content. It cannot tell you whether the numbers in a financial document are accurate, whether a name matches an official record, or whether a signature belongs to the right person.
What it can tell you with high reliability is:
- Whether the file was created by software consistent with the type of institution that should have issued it
- Whether the file was modified after its original creation
- The strength and nature of those signals
Think of HTPBE as answering the question: “Does this file’s internal history support the story it’s supposed to tell?” For most document verification workflows, that question — answered quickly and at scale — is exactly where the value lies.
Putting It Together
If you’re building a document review workflow, the practical sequence is:
- Run every incoming PDF through HTPBE.
- Route modified results immediately to a review queue. Do not process them further without requesting the original from the source.
- Route inconclusive results through a context check: is consumer software plausible for this document type? If not, flag for follow-up.
- Pass intact results through to the next stage of your process.
- For anything that shows modified or has notable findings, read the findings array before making a decision.
This approach takes seconds per document and filters out the majority of issues before they consume manual review time.
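The sequence above can be sketched end to end. Here, route stands in for the triage step; actually invoking HTPBE and the queue destinations are placeholders, and the document-type/consumer-software pairing is an assumption as before.

```python
# Illustrative triage pipeline over a batch of already-verified
# results. Queues are just string labels in this sketch.

def route(result: dict, doc_type: str, consumer_ok: set[str]) -> str:
    status = result["status"]
    if status == "modified":
        return "review_queue"
    if status == "inconclusive" and doc_type not in consumer_ok:
        return "follow_up"
    return "next_stage"

batch = [
    ({"status": "intact"}, "bank_statement"),
    ({"status": "inconclusive"}, "bank_statement"),
    ({"status": "modified"}, "diploma"),
]
consumer_ok = {"vendor_contract"}

for result, doc_type in batch:
    print(doc_type, "->", route(result, doc_type, consumer_ok))
# bank_statement -> next_stage
# bank_statement -> follow_up
# diploma -> review_queue
```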
Questions about a specific result, or unsure how to interpret what you’re seeing? Check the API documentation for detailed field descriptions, or use the contact form on the home page to reach out directly.