
    m-Privacy for Collaborative Data Publishing


    Abstract

    The paper addresses the new threat posed by colluding data providers and makes several contributions. First, we introduce the notion of m-privacy, which guarantees that the anonymized data satisfies a given privacy constraint against any group of up to m colluding data providers. Second, we present heuristic algorithms that exploit the monotonicity of privacy constraints to efficiently check m-privacy for a group of records. Third, we present a data-provider-aware anonymization algorithm with adaptive m-privacy checking strategies to ensure high utility. Finally, we propose a k-Secure Sum Protocol for computing the sum of individual parties' inputs while preserving the privacy of those inputs.
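
    As a rough illustration of the m-privacy check described above, the following Python sketch verifies a group of records against every coalition of up to m providers by brute force. The function and the caller-supplied privacy_check predicate (for example, a k-anonymity or l-diversity test) are illustrative assumptions for this summary, and the heuristic pruning based on monotonicity is omitted.

        from itertools import combinations

        def is_m_private(records, privacy_check, m):
            # records: list of (provider_id, record) pairs.
            # privacy_check: caller-supplied predicate over a list of records,
            # e.g. a k-anonymity or l-diversity test (hypothetical helper).
            providers = {p for p, _ in records}
            # A coalition of up to m providers can pool its own records as
            # background knowledge, so the remaining records must still
            # satisfy the privacy constraint on their own.
            for size in range(1, min(m, len(providers)) + 1):
                for coalition in combinations(providers, size):
                    remaining = [r for p, r in records if p not in coalition]
                    if not privacy_check(remaining):
                        return False
            return True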


    Existing System

    According to Cynthia Dwork, over the past five years a new approach to privacy-preserving data analysis has borne fruit. This approach differs from much of the related literature in the statistics, databases, theory, and cryptography communities in that a formal and ad omnia privacy guarantee is defined, and the data analysis techniques presented are rigorously proved to satisfy the guarantee. The key privacy guarantee that has emerged is differential privacy; roughly speaking, it ensures that (almost, and quantifiably) no risk is incurred by joining a statistical database. Her survey recalls the definition of differential privacy and two basic techniques for achieving it, then presents applications of these techniques: algorithms for three specific tasks and three general results on differentially private learning. Related work also shows that an attacker can discover the values of sensitive attributes when there is little diversity in those sensitive attributes, and that k-anonymity does not guarantee privacy against attackers with background knowledge. A detailed analysis of these two attacks motivates a novel and powerful privacy definition called ℓ-diversity.
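
    For concreteness, the two notions above can be illustrated with small Python sketches. Both functions are textbook simplifications written for this summary, not code from the cited works: the first releases an epsilon-differentially private count via the Laplace mechanism, and the second applies the simplest (distinct-values) reading of ℓ-diversity to one equivalence class.

        import numpy as np

        def private_count(true_count, epsilon):
            # A counting query changes by at most 1 when a single record is
            # added or removed (sensitivity 1), so Laplace noise with scale
            # 1/epsilon gives epsilon-differential privacy for the count.
            return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

        def is_l_diverse(equivalence_class, sensitive_attr, l):
            # An equivalence class (records sharing quasi-identifier values)
            # is treated as l-diverse here if it contains at least l distinct
            # sensitive values, which blocks the homogeneity attack above.
            values = {record[sensitive_attr] for record in equivalence_class}
            return len(values) >= l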


    Proposed System

    This work considers the collaborative data publishing setting with horizontally partitioned data across multiple data providers, each contributing a subset of records Ti. As a special case, a data provider could be the data owner itself, contributing its own records; this is a very common scenario in social networking and recommendation systems. The goal is to publish an anonymized view of the integrated data such that no data recipient, including the data providers themselves, can compromise the privacy of the individual records provided by other parties. This work proposes a k-Secure Sum Protocol for computing the sum of individual parties' inputs while preserving the privacy of those inputs. The protocol allows parties to break their data inputs into segments and distribute these segments among the parties before computation, so that two colluding neighbors have zero probability of recovering the data of a middle party.
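
    The segmented secure-sum idea can be simulated on a single machine as in the sketch below. The function name, the round-robin distribution of segments, and the segment value range are assumptions made for illustration rather than the paper's exact protocol; the point is that each party's input is split so that no other single party ever holds it in full, while the final total still equals the true sum.

        import random

        def k_secure_sum(inputs, k):
            # Each party splits its private value into k random segments and
            # scatters them round-robin over the parties, so no neighbour
            # ever receives a complete input.
            n = len(inputs)
            held = [[] for _ in range(n)]   # segments each party ends up holding
            for i, value in enumerate(inputs):
                cuts = [random.randint(-1000, 1000) for _ in range(k - 1)]
                segments = cuts + [value - sum(cuts)]  # segments sum to value
                for s, segment in enumerate(segments):
                    held[(i + s) % n].append(segment)
            # Ring phase: a running total over the held segments reveals only
            # the overall sum, as in an ordinary secure-sum round.
            total = 0
            for j in range(n):
                total += sum(held[j])
            return total

        # Three parties with private inputs; only the sum 42 is revealed.
        print(k_secure_sum([10, 25, 7], k=3))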


    Architecture

