Know Exactly What AI Wrote.
Prove It.
Law firms, consulting firms, and regulated enterprises face a new question on every document: which sentences came from AI, which came from humans, and which were AI-drafted then edited? An AI policy does not answer that question. Cryptographic provenance does.
The Gap Between Policy and Proof
Most organizations have an AI use policy. Almost none can prove what that policy produced on a specific document.
What "We Have a Policy" Cannot Do
Answer a court order
Federal courts are requiring attorneys to certify AI use. A policy memo does not satisfy a signed certification requirement.
Identify which paragraph was hallucinated
If opposing counsel or a regulator challenges a specific citation, you need sentence-level proof, not an attestation that "AI may have been used."
Establish provenance retroactively
Without provenance embedded at creation time, any claim about a document's origin is unverifiable -- and therefore contestable.
Protect against internal dispute
When a client asks "did your team use AI on this engagement," a policy cannot tell you what actually happened on a specific deliverable.
What Cryptographic Provenance Does
Marks provenance at creation
Every AI-generated or AI-assisted passage is signed with model metadata, timestamp, and author at the moment it is produced -- not reconstructed later.
Sentence-level Merkle tree
Each sentence has independent cryptographic proof. You can show that paragraph 4 was AI-generated while paragraphs 1-3 and 5 were human-authored.
Tamper-evident export
Evidence packages are independently verifiable -- the proof is embedded in the document itself, not on Encypher's servers.
Detects post-signing modification
Any change to a signed passage is cryptographically recorded. You can prove a section was edited after signing, and by whom.
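The sentence-level proof described above can be sketched minimally. This is an illustrative assumption, not Encypher's actual construction: it assumes SHA-256 leaf hashes and a simple binary Merkle tree, and shows how editing a single sentence changes the root, making post-signing modification detectable.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold leaf hashes pairwise up to a single root hash."""
    level = list(leaves)
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last hash on odd levels
        level = [sha256(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# Each sentence gets its own leaf hash, so each has independent proof.
sentences = [
    "The merger closed on March 3.",          # human-authored
    "Projected synergies total $40M.",        # AI-generated
    "Counsel reviewed all representations.",  # human-authored
]
leaves = [sha256(s.encode()) for s in sentences]
root = merkle_root(leaves)

# Changing one sentence changes its leaf, so the root no longer matches:
tampered = list(leaves)
tampered[1] = sha256(b"Projected synergies total $400M.")
assert merkle_root(tampered) != root
```

Because each sentence hashes to its own leaf, a verifier can confirm one paragraph's origin without re-checking, or even seeing, the rest of the document.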
Where Provenance Proof Is Required
The question is no longer hypothetical. Courts, regulators, and clients are asking for documentation that standard AI governance frameworks cannot produce.
Law Firms
Federal and state courts are issuing standing orders requiring attorneys to certify AI use in filings. Bar associations are publishing ethics guidance on disclosure obligations. Attorneys using AI assistants (Harvey, Westlaw AI, Copilot) need to certify not just "AI may have been used" but which specific passages -- and that those passages were reviewed.
- Court filing certification: paragraph-level AI attribution
- Malpractice defense: prove which citations were AI-generated vs. attorney-verified
- Sanctions defense: evidence that AI output was reviewed before filing
- Client billing documentation: distinguish AI-assisted from attorney work
Consulting and Advisory Firms
Enterprise clients -- especially in regulated industries -- are adding AI disclosure requirements to engagement terms. A strategy memo or due diligence report that contains AI-synthesized sections without disclosure creates professional liability. Firms need to demonstrate exactly what was AI-produced and what was partner-level analysis.
- Client deliverable provenance on request
- M&A due diligence: which synthesis was AI, which was analyst judgment
- Engagement audit trail for regulatory review
- Professional standards compliance (AICPA, CFA, etc.)
Financial Services
SEC guidance requires disclosure of AI use in filings. Research reports, prospectuses, and regulatory submissions that use AI-generated content without provenance documentation create material risk. Financial firms need an audit trail that satisfies both internal compliance and external regulatory review.
- SEC filing AI disclosure documentation
- Research report provenance for analyst certification
- Internal audit trail for AI governance frameworks (SR 11-7 equivalent)
- EU AI Act compliance for customer-facing AI outputs
Enterprise Legal and Compliance
General counsel and compliance teams at large enterprises face a discovery problem: when litigation or regulatory investigation touches internal documents, they need to produce provenance information that currently does not exist. Signing documents at creation builds that record before it is needed.
- e-Discovery: identify AI-generated content in document review
- Contract lifecycle: prove which clauses were AI-drafted vs. negotiated
- Board reporting: accurate AI usage disclosure in governance reports
- HR and policy documents: provenance audit for internal investigations
How Sentence-Level Provenance Works
Provenance is embedded at creation time, not reconstructed later. The proof travels with the document wherever it goes.
BYOK: Your Keys. Your Infrastructure.
For law firms and regulated enterprises, attorney-client privilege and data residency requirements mean you cannot send document content to a third party's signing service. Encypher's BYOK model addresses this: your organization registers its own Ed25519 public key, and all signing uses your key. Encypher provides the infrastructure; your key material never leaves your environment.
Key custody stays with you
Encypher never stores, transmits, or accesses your key material. HSM, AWS KMS, and Azure Key Vault are supported.
Independently verifiable
C2PA assertions embed your certificate. Anyone can verify the signature against your public key without trusting Encypher.
Data residency compatible
Signing can run within your infrastructure. Document content does not need to leave your environment.
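The BYOK flow can be sketched with the Python `cryptography` package. The payload fields and the flow itself are illustrative assumptions, not Encypher's actual API: the point is that signing needs only the private key (held in your HSM or KMS), while anyone holding the registered public key can verify independently.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# In production this key lives in your HSM/KMS and never leaves your environment.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()  # registered with Encypher once

# Illustrative per-passage metadata payload (field names are assumptions).
payload = json.dumps({
    "sentence": "Projected synergies total $40M.",
    "model": "example-model-v1",
    "author": "j.doe@firm.example",
    "timestamp": "2025-01-15T12:00:00Z",
}, sort_keys=True).encode()

signature = private_key.sign(payload)

# Independent verification: raises InvalidSignature if anything was altered.
public_key.verify(signature, payload)
```

Verification depends only on the public key and the embedded payload, so a court, regulator, or client can check the signature without access to Encypher's infrastructure or yours.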
Encypher co-chairs the C2PA Text Provenance Task Force. We're building the standard together.
Build the Record Before You Need It
The time to establish document provenance is at creation, not during litigation or regulatory review. Schedule a technical architecture review to see how Encypher fits your document workflow and governance requirements.