OpenLedger is an AI-native blockchain aimed at making “data, models, and autonomous agents verifiable, ownable, and economically accountable,” the companies said in a Thursday (Jan. 29) news release announcing the offering.
Story Protocol describes itself as an open protocol for registering, licensing and monetizing intellectual property. Together, the two companies have developed a standard they say lets artificial intelligence (AI) systems train on licensed intellectual property (IP) while “cryptographically proving how that IP is used,” and ensuring creators are compensated.
“Until now, once creative work entered AI training pipelines, it effectively became untraceable. Creators had limited visibility into how their work was used, enterprises lacked reliable auditability, and AI developers operated in an expanding legal gray zone,” the release said.
“Story Protocol and OpenLedger aim to address this by embedding rights, attribution, and payments directly into AI infrastructure itself.”
With this standard, Story Protocol acts as the canonical registry for intellectual property, defining ownership, licensing terms, derivative permissions and economic rights in what the companies call a “machine-readable format.”
OpenLedger serves as the AI execution and verification layer, enforcing those licenses during both training and inference, cryptographically verifying IP usage, and instantly routing payments when licensed content plays a role in model behavior or AI-generated derivatives.
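To make the division of labor concrete, the flow described above can be sketched in code: a registry-style license record carrying ownership, permissions and economic rights, and an enforcement step that checks the license at training or inference time and computes the payment owed. This is a minimal illustration only; the field names, the `IPLicense` record and the `settle_usage` function are assumptions for explanatory purposes, not the actual Story Protocol or OpenLedger schema, which the announcement does not detail.

```python
from dataclasses import dataclass

# Hypothetical sketch of a "machine-readable" IP license record of the kind
# the release describes: ownership, licensing terms, derivative permissions,
# and economic rights. All names here are illustrative assumptions.

@dataclass
class IPLicense:
    ip_id: str               # canonical registry identifier
    owner: str               # payout address of the rights holder
    allow_training: bool     # may the work be used to train models?
    allow_derivatives: bool  # may AI-generated derivatives be produced?
    royalty_bps: int         # royalty, in basis points of attributed revenue

def settle_usage(license: IPLicense, usage: str, revenue: float) -> float:
    """Enforce the license for a given usage ('training' or 'derivative')
    and return the payment owed to the rights holder."""
    if usage == "training" and not license.allow_training:
        raise PermissionError(f"{license.ip_id}: training not licensed")
    if usage == "derivative" and not license.allow_derivatives:
        raise PermissionError(f"{license.ip_id}: derivatives not licensed")
    # Route a royalty proportional to the revenue attributed to this IP.
    return revenue * license.royalty_bps / 10_000

lic = IPLicense("ip-123", "0xCreator", True, True, 500)  # 5% royalty
payment = settle_usage(lic, "training", 200.0)           # → 10.0
```

In this sketch, Story Protocol's role corresponds to defining and storing the `IPLicense` record, while OpenLedger's corresponds to running `settle_usage` at training and inference time.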
“AI cannot scale on legal ambiguity,” said Ram Kumar, core contributor at OpenLedger. “If intelligence is becoming economic infrastructure, then intellectual property must be programmable, enforceable, and monetized by default. Story defines what IP can be used. OpenLedger defines how it’s used in AI.”
PYMNTS wrote last year about how generative artificial intelligence is causing the worlds of copyright law and antitrust law to intersect after decades of operating in parallel.
That report included comments from Daryl Lim, H. Laddie Montague Jr. chair in law at Penn State Dickinson Law and associate dean for research, who said generative AI introduces the need for copyrighted works at industrial scale.
“When you train frontier models, you need to ingest vast repositories of works that may include copyrighted works,” Lim said. “Only a handful of firms can do this, and those firms often control the compute, the data, the cloud infrastructure and distribution simultaneously.”