Consent infrastructure for AI training data

Put your AI training choice on record.

Creators declare whether their work is available for AI training.

AI teams check the registry before ingestion and build an audit trail.

Fingerprint-based matching · Cryptographically anchored · Queryable API for AI teams

A better AI training ecosystem starts with clarity. Who made what, what's available, and how to reach the right person.

Creators declare

Put your AI training choice on record alongside a clear licensing route.

AI teams check

Query the registry before ingestion and build a defensible audit trail.

Both sides benefit

Clearer choices, visible routes, and verifiable records that travel with the work.

How it works

Three steps. Nothing leaves your device.

Step 1

Fingerprint the work

Select images, documents or video. Sourcemark fingerprints them locally, so the original stays on your device.
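The local step could be sketched with an exact content hash. Sourcemark's actual fingerprinting scheme (including its perceptual component) isn't specified here, so this is a minimal illustration using SHA-256, reading the file in chunks so only the digest would ever leave the machine:

```python
import hashlib

def fingerprint_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute an exact-match fingerprint (SHA-256 hex digest) locally.

    Only this digest would be sent to a registry; the file itself
    never leaves the device.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()
```

Because the digest is deterministic, any party holding the same file can recompute it and look up the matching record.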

Get started — it's free

Drop files here

morning-light-series-04.jpg · Ready
summer-skincare-routine-final.mp4 · Ready
brand-guidelines-v3.pdf · Ready
Step 2

Set your AI training choice

Choose whether the work is not available for AI training or available by licence. Add the right licensing contact where permission is given.

Learn more

AI training availability

Choose how AI companies may use this work for training.

Not available
Available by licence
Applying to 3 files
summer-skincare-routine-final.mp4 · Not available
brand-guidelines-v3.pdf · Not available
morning-light-series-04.jpg · Licensable
Step 3

Publish a verifiable record

Sourcemark creates a timestamped, machine-readable record. Creators share a verification link. AI teams query the registry by fingerprint.

Get started — it's free
For creators
Sam Avery
ceramic-collection-02.jpg
Not available for AI training
For AI teams
POST /api/v1/check
{ "status": "not_available", "similarity": 0.97 }

Two options. One record.

Every declaration states one of two positions.

Not available for AI training

Record that this work is not available for AI training. Sourcemark makes that choice timestamped, attributable and queryable before ingestion.

Available by licence

Record that this work may be used for AI training with permission. Sourcemark provides a route to the relevant rights holder or representative.

For creators

Put your position on record.

AI companies are building with creative work at scale — but there's no consistent way for creators to say what's available and on what terms. Notes in bios are easy to miss. Platform settings vary.

Sourcemark gives you a clearer way to put your AI training choice on record and make licensing routes visible, so your position can travel with the work instead of being left behind.

  • Your files stay on your device — Sourcemark stores fingerprints, not originals
  • Your record is timestamped, versioned and cryptographically anchored
  • Your verification link can be shared wherever your work appears online
  • Images, documents and video are supported from day one
Get started — it's free
[Image: Photographer's workspace — a large print of the mountain landscape alongside a camera and laptop]
File → Fingerprint → Registry → Query

{
  "match": true,
  "status": "not_available",
  "similarity": 0.97,
  "contact": "rights@example-studio.com"
}

Pre-ingestion check complete · Declared
For AI companies

Check before you train.

Teams building with creative work want clear, documented permissions. Without shared infrastructure, consent checking is fragmented and harder to scale.

Sourcemark gives AI teams a machine-readable registry of declared permissions that datasets can be checked against before ingestion.

  • Query by file fingerprint — exact match or perceptual similarity
  • Read the declaration — consent status, licensing contact and timestamp
  • Build an audit trail — a record of what was checked and what was found
  • Support transparency and governance — in procurement, compliance and internal review
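A pre-ingestion sweep over a dataset could look like the sketch below. The `/api/v1/check` path comes from the mock-up above, but the request payload and response field names (`fingerprint`, `match`, `status`) are assumptions inferred from the sample response, and the HTTP call is abstracted behind a `post` callable so any client library can be plugged in:

```python
import hashlib
from typing import Callable, Iterable

def check_dataset(paths: Iterable[str],
                  post: Callable[[str, dict], dict]) -> list[dict]:
    """Check each file's fingerprint against the registry and build an
    audit trail of what was checked and what was found.

    `post(url, payload)` performs the HTTP POST and returns the decoded
    JSON response -- e.g. a thin wrapper around urllib or requests.
    """
    audit_trail = []
    for path in paths:
        with open(path, "rb") as f:
            fp = hashlib.sha256(f.read()).hexdigest()
        # Endpoint and payload shape are assumed, not documented here.
        response = post("/api/v1/check", {"fingerprint": fp})
        audit_trail.append({
            "path": path,
            "fingerprint": fp,
            "match": response.get("match", False),
            "status": response.get("status"),
        })
    return audit_trail
```

Persisting the returned trail (with timestamps) is what gives the process its audit value: a record of what was checked, when, and what the registry said.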
Speak with us

What Sourcemark is not

So you know exactly what you're getting.

Not a licensing marketplace

Sourcemark does not set prices, negotiate deals or process payments. It records consent and provides a licensing contact pathway.

Not a copyright enforcement tool

Sourcemark records creator choices. It does not police, monitor or remove content.

Not a scraper blocker

Sourcemark does not stop scraping. It makes AI training preferences visible and queryable.

Not a rights verification service

Sourcemark records who made a declaration and when. It does not prove ownership or verify authority.

How the record works

What's stored, what stays private, and what makes it verifiable.

What gets recorded

Every Sourcemark contains a fingerprint of the file, the declarer’s identity, the AI training consent status, a licensing contact pathway where relevant, a timestamp and an attestation of authority.
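Serialised, the fields above might look something like this. The key names and values are illustrative only, not Sourcemark's actual schema:

```json
{
  "fingerprint": "sha256:2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
  "declared_by": "Sam Avery",
  "ai_training": "not_available",
  "licensing_contact": null,
  "declared_at": "2025-01-15T09:30:00Z",
  "attestation": "declarer-asserts-authority"
}
```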

What stays private

Your original file is never stored or published. The record is about the fingerprint, not the file. If you choose private visibility, the consent signal is still queryable but your identity is not revealed.

What makes it verifiable

Every record is cryptographically anchored to an independent public ledger. The verification page is public, machine-readable and audit-friendly. If your choice changes, Sourcemark records the update without erasing what came before.
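One common way such anchoring works (an assumption here, not a description of Sourcemark's internals) is to publish a hash of a canonical serialisation of the record to the public ledger; anyone can then recompute the hash and compare. A minimal sketch:

```python
import hashlib
import json

def record_digest(record: dict) -> str:
    """Hash a canonical serialisation of the record.

    Sorting keys and fixing separators makes the digest reproducible
    by any independent verifier.
    """
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_anchor(record: dict, anchored_digest: str) -> bool:
    """True if the record matches the digest published on the ledger,
    i.e. the record has not been altered since it was anchored."""
    return record_digest(record) == anchored_digest
```

Any change to any field produces a different digest, which is what makes tampering after the fact detectable.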

Supported content

Declare the version you publish or share — the one that matters most.

Images

JPEG, PNG, HEIC, TIFF, RAW and WebP. Exact fingerprinting plus perceptual matching for crops, resizes and edits.
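Sourcemark's perceptual matcher isn't specified here, but a common approach compares fixed-length perceptual hashes (such as pHash) by Hamming distance, turning bit agreement into a similarity score like the 0.97 in the sample API response. A pure-stdlib sketch of that comparison step:

```python
def similarity(hash_a: int, hash_b: int, bits: int = 64) -> float:
    """Fraction of bits two perceptual hashes agree on.

    Perceptual hashes map visually similar images -- crops, resizes,
    light edits -- to hashes with small Hamming distance, so
    near-duplicates still score close to 1.0.
    """
    hamming = bin((hash_a ^ hash_b) & ((1 << bits) - 1)).count("1")
    return 1.0 - hamming / bits
```

An exact hash match (similarity 1.0) identifies the identical file; a high-but-imperfect score flags a likely derivative for the same declaration.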

Documents

PDFs, including research papers, articles, reports and manuscripts.

Video

MP4 and MOV. Fingerprinted and declared like any other content.

Compare plans

All plans include declarations, verification and cryptographic anchoring.

Free · Pro · Enterprise
AI training declarations
Verification pages
Cryptographic anchoring
Perceptual matching
Collections & tags
Monthly volume: Generous / Higher / Custom
Bulk workflows
API access
Representative declarations
Admin controls & reporting
For creators

Put your AI training choice on record.

Make licensing routes clear. Sourcemark is free for individual creators.

Get started — it's free
For AI companies

Check declared permissions before training.

Build a more auditable, defensible process.

Speak with us