The AI boom has an ugly side. The major Large Language Models (LLMs) are built on sweatshop-style labor: millions of low-wage workers have put in billions of work hours helping to scrape and clean the internet's data for this revolution, yet they were never credited and rarely earned enough to make ends meet. As trust in the big AI companies erodes, attention is turning to the extractive foundation of the status quo, and to the cracks already forming in that base.

Can emerging technologies be given a pass simply because they disrupt a certain space? Unethical tech can be displaced by ethical tech, in which contributors are rewarded for their efforts. Fair compensation is an idea the industry urgently needs to revisit.

Today, Adewale Opeyemi sat down with Ram Kumar, the CEO of OpenLedger, to explore the technical challenges and market realities of building fair compensation into AI systems.

Ade: Can you please tell us more about your background and your professional journey?

Ram: I have always been obsessed with the idea of systems. How they scale, who they empower, and who they quietly exploit. Before OpenLedger I worked on AI infrastructure and saw how quickly the field was advancing, but also how invisible the actual contributors were. That disconnect stuck with me. OpenLedger is my way of saying that if AI is going to run the future then the builders, the data providers, the human contributors should be in the room, not erased from it.

Ade: It is an open secret that big AI-based LLM projects have been using scraped data from millions of creators who did not get paid. What's the economic impact on creators?

Ram: The economic impact is theft in slow motion. Millions of creators’ work has already been ingested into models worth billions. Think about that for a second. The very foundation of this AI boom is unpaid labor. For the creator it means your art or writing no longer rewards you. For the industry it means we are scaling on exploitation and cracks are already showing. It is not just unfair. It is unsustainable.

Ade: Several high-profile lawsuits have been filed. Does the industry have the capacity to solve this crisis, or is further regulation or innovation needed?

Ram: You cannot sue your way into a fair system. Lawsuits might create pressure, but regulation is always playing catch-up. What we need is innovation that bakes fairness in by design. That is where attribution tech comes in. If we can track who contributed, we can reward them. It is not about patching the old model. It is about replacing it with something better.

Ade: The current training model makes billions. Why would companies abandon it for a fairer approach?

Ram: Because they will have no choice. Extractive models look profitable in the short term, but they collapse under their own weight. Creators push back, lawsuits mount, trust erodes. At some point your cheap data becomes the most expensive liability on your balance sheet. Fair attribution is not charity. It is the only path to sustainability.

Ade: Content creators are skeptical of blockchain. It feels complex and scammy. How do you change that perception?

Ram: By cutting the jargon and showing real outcomes. If a creator sees a payment hit their wallet because their work powered an AI answer, that is the trust signal. People do not care what technology we use behind the scenes; they care if the system treats them fairly. Blockchain just happens to be the only tech that can do this at scale: transparent, tamper-proof, and not controlled by one company.

Ade: Your Proof of Attribution model sounds complex. Can it really scale?

Ram: It has to scale. AI is global, so attribution has to be global. What we are building is not about tagging one artist’s work. It is about tracing contributions across billions of data points and making sure the rewards flow back correctly. That is why we are using blockchains: attribution is not a one-time action, it is a living ledger.
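The core mechanic Ram describes, tracing contributions and flowing rewards back in proportion, can be sketched in a few lines. This is a minimal illustration, not OpenLedger's actual Proof of Attribution implementation; the function name and the weighted-share scheme are assumptions for the example.

```python
from collections import defaultdict

def distribute_rewards(attributions, payment):
    """Hypothetical sketch: split one payment across contributors
    in proportion to their attributed share of a model output.
    `attributions` is a list of (contributor, weight) pairs."""
    payouts = defaultdict(float)
    total_weight = sum(weight for _, weight in attributions)
    for contributor, weight in attributions:
        # Each contributor receives the payment scaled by their share.
        payouts[contributor] += payment * weight / total_weight
    return dict(payouts)

# e.g. an AI answer drew on alice's data 3x as often as... (weights 3 and 2)
payouts = distribute_rewards([("alice", 3), ("bob", 2)], payment=10.0)
# payouts == {"alice": 6.0, "bob": 4.0}
```

In a real system the weights would come from the attribution layer itself and the ledger would record every payout event, but the proportional split is the economic heart of the model.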

Ade: Why blockchain specifically? What does it solve that databases cannot?

Ram: Databases can track records, but they cannot create shared truth. If attribution lives inside a company’s private database, it is just another black box. Blockchain flips that. Everyone can verify the trail, no one can tamper with it, and the system itself becomes accountable. That is the level of trust this space has been missing.
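The tamper-evidence Ram contrasts with private databases comes from chaining each record's hash to the previous one, so no single party can quietly rewrite history. The sketch below is a generic hash-chain illustration of that property, not OpenLedger's protocol; all names here are hypothetical.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash before the first record

def block_hash(record: dict, prev_hash: str) -> str:
    # Hash the record together with the previous block's hash,
    # so altering any earlier entry changes every later hash.
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link attribution records into a hash chain."""
    chain, prev = [], GENESIS
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Anyone can recompute the hashes; tampering breaks the chain."""
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True
```

A private database admin can silently edit a row; here, editing any record invalidates every hash after it, which is exactly the "shared truth" property: verification requires no trust in the operator.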

Ade: Let’s talk economics. If creators get paid fairly, will AI companies see their costs skyrocket?

Ram: The costs do not skyrocket, they rebalance. Right now creators are paid zero and companies are pocketing billions. That is extraction. When attribution enters, companies still profit, but value gets distributed instead of hoarded. Yes, the margins shift, but so does the trust. Long term, it is cheaper to build on a fair foundation than to keep fighting lawsuits and bad press.

Ade: As a core contributor building this infrastructure, what has been your biggest challenge?

Ram: The hardest part is convincing both sides to jump at once. Creators ask if companies will really adopt this. Companies ask if creators will really care. It is the classic chicken and egg. Our approach is to prove it in motion. Launch products like OpenChat, reward contributors directly, and show companies that attribution is not just possible, it is profitable.

Ade: Finally, let’s talk results. OpenChat is your flagship. What are you seeing so far, and what is your message to creators and AI companies still watching from the sidelines?

Ram: With OpenChat we have already proven the model. Real contributors are getting credited and paid in real time. My message is simple. The old system is burning out. If you are a creator this is your shot at ownership. If you are an AI company this is your chance to build on ground that will not collapse under you. The future of AI is not extractive. It is OPEN.