2020 Frontiers of AI:ML - Lalana Kagal

Conference Video | Duration: 40:10
July 16, 2020
  • Video details
    Federated learning enables collaborating agents to develop a shared model without requiring them to share their underlying data. Because the sensitive data itself is never shared, the privacy risk is reduced; however, naive implementations remain susceptible to privacy and security threats, since information about a training dataset can still leak through the shared model's weights or parameters. In addition, malicious agents who train on random data or, worse, try to poison the model can weaken the shared model and must be identified and held accountable.

    In this talk, I will describe BlockFLow, an initial implementation of an accountable, privacy-preserving federated learning system. BlockFLow incorporates differential privacy to reduce information leakage, introduces a novel auditing mechanism for evaluating each agent's model contribution, and uses Ethereum smart contracts to incentivize good behavior. Its primary goal is to reward agents in proportion to the quality of their contributions while protecting the privacy of the underlying datasets and remaining resilient to malicious adversaries.
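
To make the setup described in the abstract concrete, here is a minimal, hypothetical Python sketch of the basic federated-learning loop: each agent fits a model on its own private data and shares only the resulting weight vector, which a coordinator averages into the shared model. This is an illustration only, not BlockFLow's code, and every function and variable name in it is an assumption made for this summary.

    # Minimal federated-averaging sketch (illustrative only; not BlockFLow's code).
    # Each agent fits a local linear model on private data and shares ONLY its
    # weight vector; the coordinator averages the weights into a shared model.
    import numpy as np

    rng = np.random.default_rng(0)

    def local_train(X, y):
        """Least-squares fit on an agent's private data; returns weights only."""
        w, *_ = np.linalg.lstsq(X, y, rcond=None)
        return w

    def federated_average(weight_list):
        """Coordinator-side aggregation: average the agents' weight vectors."""
        return np.mean(weight_list, axis=0)

    # Three agents with private datasets drawn from the same underlying model.
    true_w = np.array([2.0, -1.0, 0.5])
    local_weights = []
    for _ in range(3):
        X = rng.normal(size=(100, 3))            # private features, never shared
        y = X @ true_w + rng.normal(scale=0.1, size=100)
        local_weights.append(local_train(X, y))  # only the weights leave the agent

    shared_w = federated_average(local_weights)
    print("shared model weights:", np.round(shared_w, 3))

The leakage risk the abstract points to concerns exactly these shared weights: even without access to an agent's data, an adversary may infer information about the training set from the parameters themselves.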

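A second sketch, in the same simplified spirit, illustrates two of the ideas the abstract mentions: clipping and noising an agent's update so the shared parameters reveal less about its local data (the intuition behind the differential privacy step), and splitting a reward pool in proportion to contribution scores. In BlockFLow itself the incentive logic is handled by Ethereum smart contracts and the scores come from its auditing mechanism; the hypothetical Python below, with assumed parameter values, only captures the proportionality idea.

    # Hypothetical simplification of two BlockFLow-style ideas, not the actual protocol:
    # (1) clip-and-noise an agent's update in the spirit of differential privacy, and
    # (2) split a reward pool in proportion to audited contribution scores.
    import numpy as np

    rng = np.random.default_rng(1)

    def privatize_update(w, clip_norm=1.0, noise_std=0.1):
        """Clip the update's L2 norm, then add Gaussian noise so the shared
        parameters reveal less about the agent's local data."""
        scale = min(1.0, clip_norm / (np.linalg.norm(w) + 1e-12))
        return w * scale + rng.normal(scale=noise_std, size=w.shape)

    def proportional_rewards(scores, pool=100.0):
        """Pay each agent a share of the pool proportional to its contribution
        score; zero or negative scores earn nothing."""
        scores = np.clip(np.asarray(scores, dtype=float), 0.0, None)
        total = scores.sum()
        return scores / total * pool if total > 0 else np.zeros_like(scores)

    print("noisy shared update:", np.round(privatize_update(np.array([2.1, -0.9, 0.4])), 3))
    print("reward split:", np.round(proportional_rewards([0.8, 0.5, -0.2]), 2))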