Streamr · by ryanwould.eth (0xc6D330E5B7Deb31824B837Aa77771178bD8e6713)

SIP-13: Streamr Log Store Proposal

Voting ended about 3 years ago · Succeeded

Context

Streamr currently does not support indefinite and/or decentralised storage for data streams.

Storage Nodes are Streamr’s current solution for delivering data persistence.

While a good solution for ephemeral data, it

  • lacks controlled and/or indefinite persistence
  • is centralised with a capacity for semi-decentralisation
  • requires re-developing the same value propositions already offered by existing, maintained decentralised storage platforms like Arweave and IPFS/Filecoin.

To decentralise persistence, incorporating existing decentralised storage platforms is an inevitable step forward for the Streamr platform.

Usher is proposing to solve decentralised storage for Streamr by delivering an integration between Streamr and existing decentralised storage/data technologies.

Solution

This solution leverages four technologies:

  1. Streamr

    Necessary for transporting data anywhere.

  2. Kyve

    To securely coordinate a decentralised network of nodes (validators) responsible for performing and validating the movement of data from Streamr data streams to a storage blockchain.

    These validators form a Kyve Pool (example). Each validator can be an independent entity securely participating by staking via Kyve’s platform.

    Data archived with Kyve is also indexed, such that data can be queried through a GraphQL interface.

  3. Arweave

    A blockchain for permanent, accessible and decentralised file storage.

  4. EVM Smart Contract — logstore.sol

    A decentralised authority over which data streams are stored by the validator nodes.

    A value-based stake (in ETH, MATIC, DATA, etc.) must be included in the parameterised Log Store to finance compute and storage costs.

By integrating these technologies, we form a decentralised data pipeline governed by an EVM Smart Contract, such that Streamr data is moved to Arweave for permanent and decentralised storage by a Kyve Pool (a decentralised network of nodes).
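
For illustration only, the moving pieces can be thought of as a few simple data shapes. None of the types below exist in the Streamr, Kyve or Arweave SDKs; they are assumptions used to make the pipeline concrete:

```typescript
// Hypothetical data shapes for the pipeline described above.

// An event captured from a Streamr data stream.
interface StreamEvent {
  streamId: string;   // Streamr stream identifier
  timestamp: number;  // publish time in milliseconds
  payload: unknown;   // the message body
}

// A bundle of events produced by a Kyve Pool validator over a time window.
interface Bundle {
  fromTimestamp: number;
  toTimestamp: number;
  events: StreamEvent[];
}

// What a validator records once a bundle has been settled on Arweave.
interface StoredBundle {
  arweaveTxId: string;  // permanent storage reference
  bundleHash: string;   // digest the Kyve Pool votes on
}
```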

Scopes & Objectives

The development of this solution will be separated into milestones.

Each milestone will begin with a proposal submitted to Streamr’s Snapshot.

This allows the Streamr Community to vote on whether the given milestone proceeds and if it should be funded.

If the milestone is approved to be funded and funding terms & amounts have not been explicitly detailed within the given milestone proposal, a subsequent proposal to approve funding terms & amounts will be shared for open governance.

The purpose of this proposal (SIP) is to determine whether to start the first milestone.

Approval of this proposal (SIP) will set the first milestone’s scope of work in motion.

The scope for each milestone is as follows:

Milestone 1: Proof of Concept

This phase includes developing just enough of the solution to demonstrate that it can work within production parameters, such that ingestion of data into the node network reflects Streamr’s live environment.

The objectives within this scope are to develop the:

  1. Log Store Node — which operates the Kyve Pool.

  2. logstore.sol Smart Contract

    An EVM-compatible Smart Contract that requires:

    1. an arbitrary financial stake in MATIC, ETH or DATA
    2. a list of Streamr data stream identifiers

Milestone 2: Production

This phase includes the development of optimisations, integrations and compatibility mechanisms that further improve UX.

This scope’s objectives, ordered in priority, include:

  1. A solution to improve time to query-ability

    The outcome allows data to be instantly fetched after being published to the Streamr Network.

    Prior to this optimisation, there will be a buffer of time between when an event is captured and when it can be surfaced within a query.

    This is due to the block times of storage blockchains.

    Nonetheless, businesses still require the ability to query data effectively immediately after it is published.

    The interim solution will be the continued use of existing Storage Nodes for immediately accessible data.

    The approach to solving this and unifying both storage mechanisms, as discussed in the Streamr Discord, includes:

    1. Hot & Cold storage, whereby each validator will be required to run a Broker Node with Storage enabled.

      Validators will still be required to bundle data and store it on decentralised storage (Cold store), which yields consensus about the validity of the stored data.

      To deliver on instant query-ability, each validator will effectively store a local ephemeral cache using Streamr’s existing storage technology.

    2. A new time-series query interface that unifies access to both the Hot and Cold storage mechanisms (a sketch follows this list)

  2. Add more Cryptocurrencies as payment methods for the Log Store solution

  3. Add Permanent Storage as a configurable option within the Streamr Core Frontend/UI & Streamr Client JS SDK

  4. Allow the existing timestamp-based event search to surface Cold Stored data — when executed from Streamr Client JS SDK

  5. Allow stored data to surface within the data explorers of Streamr’s Core Frontend/UI.

  6. Compatibility with Data Union Framework

  7. Make the validators’ Broker Nodes publicly available, so as to remove the need for self-hosted Broker Nodes, while restricting access to those staked in the EVM Smart Contract.

    This optimisation leverages the Hot/Cold store and simplifies access to Streamr from any software environment.
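
As a minimal sketch of the unified time-series query interface from objective 1 above, assuming hypothetical hot- and cold-store clients (none of the names below are part of the Streamr or Kyve APIs):

```typescript
// Hypothetical unified time-series query over Hot (Broker/Storage Node cache)
// and Cold (Arweave via Kyve) storage. All client interfaces are assumptions.

interface TimeSeriesStore {
  query(streamId: string, from: number, to: number): Promise<unknown[]>;
}

class UnifiedLogStore implements TimeSeriesStore {
  constructor(
    private hot: TimeSeriesStore,            // validator's local ephemeral cache
    private cold: TimeSeriesStore,           // bundles already settled on Arweave
    private coldHead: () => Promise<number>, // latest timestamp settled cold
  ) {}

  async query(streamId: string, from: number, to: number): Promise<unknown[]> {
    const settledUntil = await this.coldHead();
    // Anything at or before the cold head is served from permanent storage;
    // newer events are served from the hot cache to avoid the block-time lag.
    const [coldPart, hotPart] = await Promise.all([
      from <= settledUntil
        ? this.cold.query(streamId, from, Math.min(to, settledUntil))
        : Promise.resolve([]),
      to > settledUntil
        ? this.hot.query(streamId, Math.max(from, settledUntil + 1), to)
        : Promise.resolve([]),
    ]);
    return [...coldPart, ...hotPart];
  }
}
```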

The extent to which the production platform is developed and optimised will be determined by the funding amount allocated, as discussed further in this proposal.

What is Kyve?

It’s important to establish what Kyve does before diving into our approach to development.

The Kyve Blockchain is a Cosmos-based Blockchain responsible for reaching consensus on votes submitted by Validators within a “Pool”.

A Validator has two primary processes:

  1. Deterministically produce a “bundle” of data from some source.
  2. Compare this produced “bundle” to a “bundle” that has been proposed, and then vote on whether the proposed “bundle” of data is valid.

The result is a series of validated datasets stored on Arweave or some other storage blockchain.

The key outcome occurs when these bundles are combined in chronological order. When combined, they produce a fully decentralised append-only event log.
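
As a rough illustration of those two primary processes, assuming a SHA-256 digest comparison (Kyve's actual bundling and voting logic differs in detail):

```typescript
import { createHash } from "node:crypto";

// Illustrative only: how a pool validator could deterministically produce a
// bundle and check a proposed one. Kyve's real protocol differs in detail.

type Bundle = { fromKey: string; toKey: string; items: unknown[] };

// 1. Deterministically produce a bundle from some source for a given key range.
async function produceBundle(
  fetchItems: (fromKey: string, toKey: string) => Promise<unknown[]>,
  fromKey: string,
  toKey: string,
): Promise<Bundle> {
  return { fromKey, toKey, items: await fetchItems(fromKey, toKey) };
}

// 2. Compare a locally produced bundle with the proposed one and derive a vote.
function voteOnProposal(local: Bundle, proposed: Bundle): "valid" | "invalid" {
  const digest = (b: Bundle) =>
    createHash("sha256").update(JSON.stringify(b)).digest("hex");
  return digest(local) === digest(proposed) ? "valid" : "invalid";
}
```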

Read more about Kyve here.

Development

Log Store Node

Each node follows the same deterministic process:

  1. Pull/refresh data streams from the logstore.sol Smart Contract
  2. Listen to data streams
  3. Move events into a local “bundle”
  4. The Kyve Blockchain selects the Node as a Data Bundle “Proposer” or “Validator”
  5. If Proposer
    1. Propose the bundle of data, comprised of events from different streams by uploading it to Storage (Arweave)
    2. Use the Storage ID to submit a proposal for a new bundle to the Kyve Blockchain
  6. If Validator
    1. Download the proposed bundle of data from Storage (Arweave)
    2. Compare foreign proposed bundle with local bundle
    3. Prepare a vote on whether the data bundle is valid
    4. Submit the vote to the Kyve Blockchain
  7. Clear local “bundle”
  8. Repeat Step 1
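
A compressed sketch of the loop above, with every interface to Streamr, Kyve and Arweave stubbed out as an assumption rather than a real SDK call:

```typescript
// Illustrative pseudocode for the Log Store Node loop above. Every interface
// here (registry, streamr, arweave, kyve) is a stand-in, not a real SDK call.

interface LogStoreNodeDeps {
  registry: { getStreamIds(): Promise<string[]> };                   // logstore.sol
  streamr: { collect(streamIds: string[], ms: number): Promise<unknown[]> };
  arweave: { upload(bundle: unknown[]): Promise<string> };           // returns storage id
  kyve: {
    currentRole(): Promise<"proposer" | "validator">;
    submitProposal(storageId: string): Promise<void>;
    fetchProposal(): Promise<unknown[]>;
    submitVote(valid: boolean): Promise<void>;
  };
}

async function runLogStoreNode(deps: LogStoreNodeDeps): Promise<void> {
  for (;;) {
    // 1-3. Refresh registered streams, listen, and build a local bundle.
    const streamIds = await deps.registry.getStreamIds();
    const localBundle = await deps.streamr.collect(streamIds, 60_000);

    // 4. The Kyve Blockchain assigns this node a role for the round.
    const role = await deps.kyve.currentRole();

    if (role === "proposer") {
      // 5. Upload the bundle to Storage (Arweave) and propose its storage id.
      const storageId = await deps.arweave.upload(localBundle);
      await deps.kyve.submitProposal(storageId);
    } else {
      // 6. Download the proposed bundle and vote on whether it matches.
      const proposed = await deps.kyve.fetchProposal();
      const valid = JSON.stringify(proposed) === JSON.stringify(localBundle);
      await deps.kyve.submitVote(valid);
    }
    // 7-8. The local bundle is discarded each iteration and the loop repeats.
  }
}
```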

logstore.sol

This Smart Contract will allow anyone to create a new decentralised log store, powered by Streamr.

The user experience is as follows:

  1. Create a Log Store by submitting a transaction with Streamr identifiers and a financial stake

  2. Receive an NFT representing ownership over the new log store.

  3. When log store funds deplete, top up your funds to continue log storage.

    Funding Log Stores is open to anyone with a vested interest in maintaining a log store.

  4. Burn the log store NFT to delete the log store.

  5. Use the log store NFT to amend the log store.
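
For illustration, interacting with such a contract from TypeScript (ethers v6 style) could look roughly like the following; the function names and ABI are assumptions about a contract that has not yet been written:

```typescript
import { ethers } from "ethers";

// Hypothetical ABI fragment for logstore.sol; none of these functions exist yet.
const LOG_STORE_ABI = [
  "function createLogStore(string[] streamIds) payable returns (uint256 tokenId)",
  "function topUp(uint256 tokenId) payable",
  "function amendStreams(uint256 tokenId, string[] streamIds)",
  "function burn(uint256 tokenId)",
];

async function createAndMaintainLogStore(signer: ethers.Signer, address: string) {
  const logStore = new ethers.Contract(address, LOG_STORE_ABI, signer);

  // 1-2. Create a log store with stream identifiers and a financial stake;
  //      the caller receives an NFT (tokenId) representing ownership.
  const tx = await logStore.createLogStore(
    ["my-example-stream-id"],                 // placeholder stream identifier
    { value: ethers.parseEther("1.0") },      // the financial stake
  );
  await tx.wait();

  // 3. Anyone with an interest in the log store can top up its funds:
  //    await logStore.topUp(tokenId, { value: ethers.parseEther("0.5") });
  // 4-5. The NFT holder can burn the store or amend its stream list:
  //    await logStore.burn(tokenId);
  //    await logStore.amendStreams(tokenId, [...]);
}
```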

To ensure funds are being re-allocated continuously, the following processes will be included:

  1. A validator node will be selected to notify the Smart Contract when a new dataset/bundle is validated and uploaded.

  2. Within this transaction, an Oracle Network will be tasked to bring validated metadata about the latest dataset onto the Smart Contract.

    This may or may not use Kyve’s GraphQL interface depending on Oracle compatibility.

  3. This data is used to reallocate funds to compensate the validators accordingly.

    Price Feeds provided by Oracles, such as Redstone, will ensure fee accuracy between disparate cryptocurrencies staked and expensed.

Storage Costs and Kyve Staking will be managed by Node Operators participating in the Kyve Pool. Node Operators will earn funds within Kyve and a proportional amount will be reflected as fees in the EVM Smart Contract.

Fee Calculation

The stake requirement to facilitate a decentralised log store compensates the Kyve Pool Validators.

Fees are calculated for each log store.

Fees are reallocated after a given dataset/bundle has been stored.

The total fee is the sum of the:

  • fees incurred from data stored
  • validator rewards for facilitating storage
  • treasury fees to fund the maintenance & development of the platform
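
A back-of-the-envelope version of that sum, with every rate a placeholder rather than a real parameter:

```typescript
// Illustrative fee breakdown only; actual rates and units would follow the
// Kyve/Arweave pricing in force at the time and governable parameters.

interface FeeParams {
  storageCostPerByte: number;  // e.g. Arweave price, converted via a price feed
  validatorRewardRate: number; // share added on top of the storage cost
  treasuryFeeRate: number;     // share routed to platform maintenance
}

function totalFeeForBundle(bytesStored: number, p: FeeParams): number {
  const storageFee = bytesStored * p.storageCostPerByte;
  const validatorReward = storageFee * p.validatorRewardRate;
  const treasuryFee = storageFee * p.treasuryFeeRate;
  return storageFee + validatorReward + treasuryFee;
}

// Example (made-up numbers): a 10 MB bundle with a 20% validator reward and
// a 5% treasury fee.
// totalFeeForBundle(10 * 1024 * 1024, {
//   storageCostPerByte: 1e-9, validatorRewardRate: 0.2, treasuryFeeRate: 0.05,
// });
```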

Validator rewards and treasury fees should & will be governable in a decentralised manner.

While a new governance token offers a breadth of options for scale and purpose, Kyve already offers a governance platform, and therefore, the first version of the Log Store platform will leverage Kyve’s existing governance technology.

This enables validators staking in Kyve’s platform to participate and vote.

Repository

Work on the repository can be found here:

https://github.com/usherlabs/logstore

Timeline for Milestone #1

  • 4 - 6 weeks
    • Log Store Node
    • logstore.sol Smart Contract
    • Creating Documentation
    • Developing security compatibility with Streamr private data streams
  • 2 - 3 weeks — additional time for:
    • e2e testing
    • proposal and governance management across technologies
    • validator onboarding and coordination
    • in case anything goes astray

Execution

As part of its research into merging off-chain & on-chain, Usher has devised the storage solution outlined in this proposal and will be mandated to carry out the delivery and management of the solution.

Deployment

The solution will first be deployed to testnets, such that the entire experience can be tested without real value being reallocated to validators.

Deployment to live networks will be proposed to the validator network upon completion of testing.

Upon proposal approval, and once each validator is prepared for the live deployment, the solution will be deployed to the appropriate live networks.

Funding

While this proposal establishes a model for monetisation through treasury fees, direct funding provides an additional mark of confidence from Streamr and other interested backers.

With funding, delivery of all of the objectives detailed in the given milestone is guaranteed, as the requirement for immediate monetisation is alleviated.

Funding for Milestone #1

This proposal (SIP) enables the Streamr Community to vote on funding the given milestone’s scope of work.

Due to the immediate, small, proof-of-concept nature of the scope, funding will be sized similarly to a Streamr Network grant: up to twenty thousand US dollars.

Voting

This proposal (SIP) will allow DATA token holders to vote on the most appropriate way forward for the delivery of the Log Store platform:

  • Approve & Fund: Prior to project commencement, a new proposal (SIP) will be produced to offer DATA token holders a vote on the funding amount and treasury fee share & terms.
  • Approve: Simple approval by DATA token holders to commence the project.
  • Reject: Reject the proposal and keep Streamr storage exclusively managed by the Streamr Network.

The case AGAINST the proposal

  • The Streamr Network has an existing solution for storage and this solution is already being developed to facilitate the value propositions outlined in this proposal, making the Log Store redundant.
  • The proposed solution creates too much dependency on third-party technology adding a layer of security risk to Streamr Developers and Data Providers.

The case FOR the proposal

  • Inevitable development: The Streamr Network needs a decentralised & fault-tolerant integration with decentralised storage networks for permanent and/or controllable decentralised data persistence.
  • Focus: Decentralised Storage of Streamr Data (Log Store) and the Streamr Network are fundamentally different technologies, and deliver different values.
  • Empowerment: The Log Store platform becomes a self-sustaining, healthy project powered by Streamr. The Log Store also empowers the Data Union Framework to distribute larger stored & validated datasets to Buyers.
  • Retained Vision & Direction: While this new development does set a new precedent for how storage by Streamr will be managed, it does not redirect or change the network’s original vision and purpose, which is to enable a real-time data network.

Off-Chain Vote

  • Approve & Fund: 28.94M DATA (93.6%)
  • Approve: 64.71K DATA (0.2%)
  • Reject: 1.9M DATA (6.2%)

Timeline

  • Nov 22, 2022: Proposal created
  • Nov 22, 2022: Proposal vote started
  • Nov 30, 2022: Proposal vote ended
  • Oct 26, 2023: Proposal updated