
On-chain Secure Aggregation

Concept of On-Chain Secure Aggregation

In federated learning, secure aggregation executes models or calculations across decentralized nodes that each hold a portion of the data. Every node computes a result locally, and those local results are then combined into a global outcome. The central challenge is to perform this combination without exposing any node's raw data, thereby maintaining privacy and security.
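The core idea can be illustrated with additive masking: each node perturbs its local result with pairwise random masks that cancel out in the sum, so the aggregator learns only the total. The sketch below is a simplified illustration, not 2PM.Network's implementation; all names and values are hypothetical.

```python
import random

# Three nodes, each holding a private local result (e.g. a model update).
local_values = [4.0, 7.0, 1.0]
n = len(local_values)

# Pairwise masks: node i adds r[i][j] and subtracts r[j][i] for every peer j.
# Each mask appears once with a plus sign and once with a minus sign across
# the pair, so all masks cancel when the masked values are summed.
rng = random.Random(42)
r = [[rng.uniform(-100, 100) for _ in range(n)] for _ in range(n)]

masked = []
for i in range(n):
    m = local_values[i]
    for j in range(n):
        if i != j:
            m += r[i][j] - r[j][i]
    masked.append(m)

# The aggregator only ever sees `masked`, yet the sum is exact
# (up to floating-point rounding).
aggregate = sum(masked)
assert abs(aggregate - sum(local_values)) < 1e-6
```

No individual entry of `masked` reveals its node's value, but the aggregate equals the true sum; real protocols additionally handle dropouts and malicious nodes.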

Importance of Secure Aggregation

Secure aggregation is pivotal for ensuring that federated processes can utilize data from multiple parties without compromising data privacy. This approach enables participants to collaboratively train models or perform statistical calculations without directly sharing sensitive data.

Centralized vs. Decentralized Aggregation

Traditionally, secure aggregation algorithms require a centralized server to coordinate the participants and compute the final aggregated result. This centralization limits transparency and opens the door to collusion between the server and some participants, raising the cost of trust. To mitigate these issues, 2PM.Network replaces the centralized server with blockchain technology, drastically reducing the need to trust any central authority.

Blockchain as a Coordination Layer

In 2PM.Network, the blockchain serves as a decentralized coordination layer that automates the audit and settlement of computational tasks without human intervention. It verifies zero-knowledge proofs and commitments submitted by nodes to ensure that computations are correct and that no data has been falsified during the process.
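As a simplified model of this audit step, a node can publish a hash commitment to its local result before revealing it, and the coordination layer later checks the reveal against the stored commitment. The Python sketch below is illustrative only: on 2PM.Network the check would run inside a smart contract, and production systems would verify zero-knowledge proofs rather than a bare salted hash.

```python
import hashlib
import json

def commit(update: dict, salt: bytes) -> str:
    """Hash commitment to a local result: H(salt || canonical_update)."""
    payload = salt + json.dumps(update, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

# Node side: compute the local result, post the commitment first,
# reveal (update, salt) only later.
update = {"weights": [0.12, -0.03], "round": 7}
salt = b"node-1-round-7-nonce"   # hypothetical nonce
onchain_commitment = commit(update, salt)

# Coordination-layer side: on reveal, recompute and compare.
def verify(revealed_update: dict, revealed_salt: bytes, stored: str) -> bool:
    return commit(revealed_update, revealed_salt) == stored

assert verify(update, salt, onchain_commitment)
# A tampered result fails verification:
assert not verify({"weights": [9.9], "round": 7}, salt, onchain_commitment)
```

Because the commitment is posted before the reveal, a node cannot adjust its result after seeing other submissions, which is the property the on-chain audit relies on.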

Process Flow of On-Chain Secure Aggregation in 2PM.Network

  1. Task Registration: Developers publish computational tasks on the blockchain, which include zero-knowledge proofs or commitments to the tasks for later verification.

  2. Participant Selection: Nodes opt to join the task based on criteria specified in the blockchain task registration. This selection is crucial as it ensures that only qualified nodes partake in the computation.

  3. Key Exchange and Secret Sharing: Nodes exchange keys to secure their communications and share secrets so the protocol can tolerate node drop-offs. The shared secrets also mask the data during computation, preventing any single node, or any colluding subset, from reconstructing the entire dataset.

  4. Local Computation: Each node runs the computation on its local data using the code and parameters fetched from the blockchain, adhering strictly to the protocol so that results are consistent and valid across all computing nodes.

  5. Data Aggregation: After computation, nodes submit their encrypted results along with zero-knowledge proofs or commitments to the blockchain. The aggregation process is then conducted securely, ensuring that no intermediate results or sensitive data are exposed externally.
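Steps 3 to 5 can be sketched end to end: each pair of nodes derives a shared seed (standing in for the key-exchange step), expands it into a pseudorandom mask, and submits a masked update; summing the submissions cancels the masks. This is a toy model under strong simplifying assumptions: no dropouts, no secret sharing, and a hash of node names in place of a real Diffie-Hellman exchange.

```python
import hashlib
import random

NODES = ["n1", "n2", "n3"]

# Step 3 (simplified): each pair of nodes agrees on a shared seed. Here the
# seed is a hash of the pair's names; a real protocol would derive it from
# an ECDH key exchange and secret-share it to survive node drop-offs.
def pair_seed(a: str, b: str) -> int:
    digest = hashlib.sha256("|".join(sorted((a, b))).encode()).digest()
    return int.from_bytes(digest[:8], "big")

# Step 4: each node computes a private local result.
local_updates = {"n1": 2.5, "n2": -1.0, "n3": 4.0}

# Step 5: each node masks its result with one pseudorandom term per peer.
# Both members of a pair expand the same seed into the same mask; the node
# with the smaller name adds it and the other subtracts it, so every mask
# cancels in the aggregate.
def mask_update(node: str, update: float) -> float:
    masked = update
    for peer in NODES:
        if peer == node:
            continue
        mask = random.Random(pair_seed(node, peer)).uniform(-1000, 1000)
        masked += mask if node < peer else -mask
    return masked

submitted = {n: mask_update(n, u) for n, u in local_updates.items()}

# On-chain aggregation: the contract only ever sees the masked values,
# yet their sum equals the sum of the private updates.
aggregate = sum(submitted.values())
assert abs(aggregate - sum(local_updates.values())) < 1e-6
```

In the full protocol each masked submission would also carry the zero-knowledge proof or commitment from the task registration, letting the chain verify correctness before settling the round.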

Reference:



K. Bonawitz et al., "Practical Secure Aggregation for Privacy-Preserving Machine Learning," ACM CCS 2017.