Molecule Labs launch powered by Kamu

June 30, 2025

Molecule Labs Launch on Kamu #︎

Today I’m very excited to announce the launch of Molecule Labs - a biotech research data management solution based on Kamu, developed by our partner Molecule!

Molecule is a biotech and pharmaceutical research funding platform where science is funded directly by the public, and revolutionary drugs and treatments are developed and brought to market by communities with highly aligned incentives.

Molecule Labs is an integral part of the Molecule ecosystem that empowers innovators and researchers to bring their projects to life, track verifiable progress, and share milestone achievements with their communities.

See the product launch announcement by Kevin Noessler.

Role of Kamu #︎

  • Ledgerized data room solution that combines:
    • decentralization
    • privacy
    • time-travel
    • accountability
  • Single solution for unstructured (presentations, videos) and structured (financial, IoT, metrics) data
  • Personal data ownership - all data uploaded to Molecule Labs remains under the full control of project owners
  • Data composition and dissemination

Unlike existing scientific data portals and enterprise data rooms:

  • Kamu can handle real-time, constantly flowing data, while still guaranteeing reproducibility
  • Instead of focusing on files, it works with logical datasets that can be efficiently queried and analyzed without downloading them

Kamu at DeSci.Berlin 2025 #︎

I also had the honor of presenting a talk at the DeSci.Berlin conference titled “From Scientific Data Publishing to Collaborative Data Economy”.

Key points:

  • Research data management (RDM) is in a state of crisis where reproducibility, verifiability, and collaboration on data are fundamentally broken
  • RDM is an integral part of the global data economy, and we cannot fix one without fixing the other. The next stage of RDM requires a holistic solution for multi-party data exchange
  • We identified the right technical foundation:
    • Ledgerized data formats enable reproducibility and accountability of data at the source
    • Temporal processing elevates people from being “cogs in the machine” to designers and stewards of autonomous data pipelines
    • Verifiable computing ensures results are auditable and trustworthy
  • The combination of these technologies will unlock a new phase of data-centric research where:
    • Research is no longer obsolete by the time it’s published, but continues to provide value for months and years afterward
    • Data publishers and pipeline maintainers get the credit they deserve
    • The same data can contribute to hundreds of different research projects through composability
    • Verifiable provenance is used to distribute rewards to all parties
  • Challenges of multi-party data exchange in RDM mirror the ones we see in the AI space. Verifiable provenance is a way to fix broken incentives in both areas.

Kamu’s roots are in the scientific data space, and it was great to talk during the event with many researchers and DAO founders who acutely feel these systemic data problems.

Big thanks to Molecule for organizing the event and to everyone who attended! Let’s continue the conversations on Discord.

Till next time!