Soft privacy technologies

From Wikipedia, the free encyclopedia

Revision as of 01:36, 13 December 2020

Soft privacy technology falls under the category of PET (privacy-enhancing technology) as a method of protecting data. PET has another sub-category, called hard privacy. Soft privacy technology aims to keep information safe and to process data while maintaining full control over how the data are used. It emphasizes the use of third-party programs to protect privacy, relying on audit, certification, consent, access control, encryption, and differential privacy.[1] With the advent of new technology, there is a need to process billions of data points every day in areas such as health care, autonomous cars, smart cards, social media, and more.

Applications

Health care

Some medical devices, such as Ambient Assisted Living systems, monitor and report sensitive information remotely to a cloud.[2] Cloud computing offers a solution that meets the healthcare need for processing and storage at an affordable price.[2] These devices are used to monitor a patient's biometric conditions remotely and can connect with smart technology. In addition to monitoring, the devices can also send a mobile notification when certain readings pass a set point, such as a major change in blood pressure. Because the devices report data constantly and rely on smart technology,[3] they are subject to many privacy concerns. This is where soft privacy comes into question, as third-party cloud services present several privacy risks, including unauthorized access, data leakage, and disclosure of sensitive information.[4]

One proposed solution for cloud computing in health care is access control, which grants partial access to data based on a user's role, such as doctor or family member. Another solution explored for moving data from wireless devices to a cloud is differential privacy.[5] A differential privacy system typically encrypts the data, sends it to a trusted service, and then opens it up to hospital institutions. A common strategy for preventing data leakage and attacks is to add 'noise' to the data, which changes each value slightly, while the real information remains accessible through security questions. A study in Sensors concluded that differential privacy techniques based on added noise achieved mathematically precise privacy guarantees.[5]
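The noise-addition strategy described above can be sketched with the standard Laplace mechanism. This is a minimal illustration, not the protocol from the cited study; the blood-pressure readings, bounds, and epsilon below are invented for the example.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw Laplace(0, scale) noise via inverse transform sampling."""
    u = random.random() - 0.5
    sign = -1.0 if u < 0 else 1.0
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_mean(values, lower, upper, epsilon):
    """Release the mean of bounded readings under epsilon-differential privacy.

    Clipping each reading into [lower, upper] bounds the sensitivity: changing
    one patient's record can move the mean by at most (upper - lower) / n.
    """
    n = len(values)
    clipped = [min(max(v, lower), upper) for v in values]
    true_mean = sum(clipped) / n
    sensitivity = (upper - lower) / n
    return true_mean + laplace_noise(sensitivity / epsilon)
```

A smaller epsilon adds more noise (stronger privacy, less accuracy); a larger epsilon yields an answer close to the true mean.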

In the mid-1990s the Commonwealth of Massachusetts Group Insurance Commission released anonymized health records with some sensitive fields, such as address and phone number, removed. Despite this attempt to hide identifying information while still providing a usable database, privacy was breached when people linked the health records to public voter databases. Without the encryption and noise that differential privacy adds, it is possible to link two datasets that may not seem related.[6]
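The linkage attack described above can be illustrated with a toy join on quasi-identifiers (ZIP code, date of birth, sex). All names and records below are fabricated for the example; the real incident involved far larger datasets.

```python
# "Anonymized" health records: names removed, quasi-identifiers kept.
health_records = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "dob": "1962-03-14", "sex": "M", "diagnosis": "diabetes"},
]

# Public voter list: includes names alongside the same quasi-identifiers.
voter_list = [
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "R. Roe", "zip": "02140", "dob": "1970-01-01", "sex": "M"},
]

def link(health, voters):
    """Match records on (zip, dob, sex); a unique match re-identifies a patient."""
    key = lambda r: (r["zip"], r["dob"], r["sex"])
    by_key = {}
    for v in voters:
        by_key.setdefault(key(v), []).append(v)
    hits = []
    for h in health:
        matches = by_key.get(key(h), [])
        if len(matches) == 1:  # unique quasi-identifier combination
            hits.append((matches[0]["name"], h["diagnosis"]))
    return hits
```

Here the first "anonymous" record shares a unique quasi-identifier combination with one voter, so the diagnosis is re-identified despite the missing name.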

Autonomous cars

Autonomous cars raise concerns as location-tracking devices because they are sensor-based vehicles. Full autonomy would require a massive database of information on the surroundings, possible paths, interactions with other road users, weather, and many other circumstances. This raises the questions of how all the data will be stored, who it is shared with, and what types of data are kept. Much of this data is potentially sensitive and could be used maliciously if it were leaked or hacked. There are concerns over selling the data to companies, since it can help predict products a consumer would like or need; it may also expose health conditions and prompt companies to target customers with location-based spam and products.

Self driving car

In terms of the legal aspects of this technology, there are rules and regulations governing some parts of the car, but much of it remains unregulated and the existing rules are often outdated.[7] Many current laws are vague, leaving them open to interpretation. For example, federal laws dating back many years restrict computer privacy; those rules were simply extended to phones, and are now being extended to the "computer" inside most driverless cars.[7] In addition, there are concerns about the government having access to driverless-car data, creating opportunities for warrantless mass surveillance and tracking. Companies, for their part, use this data to improve their technology and to tailor their marketing to consumers' needs. In response to consumers' concerns and slow-paced government action, automakers created the Automotive Information Sharing and Analysis Center (Auto-ISAC) in August 2015[8] to establish cybersecurity protocols and safe practices for vehicular communication between autonomous cars.

Vehicle-to-grid, known as V2G, plays a big role in energy consumption, cost, and effectiveness for smart grids.[9] Electric vehicles use this system to charge and discharge, communicating with the power grid to fill up to the appropriate level. V2G is extremely important for electric vehicles but opens up location-related privacy issues: a user's home address, workplace, places of entertainment, and frequency of visits may all be reflected in the charging history. If this information leaked, a wide variety of security breaches could occur. For example, someone's health could be deduced from the number of hospital visits, the user might receive location-based spam, or a home address and work schedule could be exploited maliciously. As with health data, a possible solution is to use differential privacy to add noise to the data so that leaked information is less accurate.

Cloud storage

Main Article: Cloud storage

Using the cloud is important for consumers and businesses because it provides a cheap source of virtually unlimited storage space. One challenge cloud storage faced was its search function. A typical cloud computing design encrypts each word before it enters the cloud, which prevents the user from searching by keyword and hinders retrieval of specific files. This is a double-edged sword: it protects privacy but introduces new inconveniences.[10] One solution indexes documents in their entirety rather than just selected keywords, and handles the process by mapping each letter to its encrypted counterpart under the key. Privacy is preserved, and search terms can now easily be matched against the encrypted files. However, a new issue arises: it takes longer to read and match against an encrypted file and then decrypt the whole document for the user.[10]
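The keyword-matching idea above can be sketched as a simplified searchable-encryption scheme: the client derives a deterministic "trapdoor" token per word with a keyed hash, and the server matches tokens without ever seeing plaintext. This is a hedged toy model, not the scheme from the cited paper; real constructions also encrypt the documents themselves and address pattern leakage.

```python
import hmac
import hashlib

def keyword_token(key: bytes, word: str) -> bytes:
    """Deterministic trapdoor for a keyword; the server only ever sees these."""
    return hmac.new(key, word.lower().encode(), hashlib.sha256).digest()

def build_index(key: bytes, text: str) -> set:
    """Client-side: index every word of the document, not just chosen keywords."""
    return {keyword_token(key, w) for w in text.split()}

def search(indexes: dict, token: bytes) -> list:
    """Server-side: match the trapdoor against stored tokens.
    The server learns which documents match, but not the keyword itself."""
    return [doc_id for doc_id, tokens in indexes.items() if token in tokens]
```

Usage: the client uploads `{doc_id: build_index(key, text)}`, then later sends `keyword_token(key, "pressure")` to query; only the key holder can form valid queries.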

Mobile data

VPNs create an encrypted tunnel between a remote user and a home network and encrypt their traffic, giving the user a private network over the public internet. This encrypted tunnel trusts a third party to protect the user's privacy, acting as a virtual leased line over shared public infrastructure. VPNs struggle with mobile applications, however, because the network may constantly change and break connections, endangering the privacy that the VPN's encryption provides.[11]

Mechanism of VPN

Police and other authorities can also use soft privacy technology against criminals by accessing cloud data.[12] There have been many instances where privacy under surveillance was a concern and reached the Supreme Court, including cases where GPS data was used to track suspects' locations and monitored data was aggregated over long periods. Warrantless searches of mobile data are now prevented by the Supreme Court's unanimous decision in Riley v. California.

In the effort to reduce unwanted calls, many people are taken advantage of by apps that promise to block these attacks. Such apps toe the line of privacy consent, as many are known to collect phone data including caller information, phone honeypot call detail records (CDRs), and call recordings. An important consideration when deciding what type of differential privacy to use is budget, since these are typically small-scale apps with varying resources. Differential privacy protocols have proven effective in many fields, but they cost more than operating without differential privacy, because the dataset needed to construct a good algorithm that achieves local differential privacy is much larger.[13]
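The classic primitive behind local differential privacy is randomized response: each user perturbs their own answer before it leaves the device, and the collector debiases the aggregate. The sketch below is generic, not the protocol from the cited blacklisting paper; it also shows why local schemes need large datasets, since the debiasing step amplifies sampling noise.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Report the true answer with probability e^eps / (e^eps + 1), else lie.
    No single report reveals the user's true answer with certainty."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p_truth else not true_bit

def estimate_fraction(reports, epsilon: float) -> float:
    """Collector-side debiasing: invert the known flipping probability.
    observed = f * p + (1 - f) * (1 - p)  =>  f = (observed - (1 - p)) / (2p - 1)
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)
```

With only a few hundred users the estimate is very noisy; with tens of thousands it becomes usable, which is the cost driver mentioned above.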

Danger of using VPNs

VPNs are susceptible to attackers who fabricate, intercept, modify, and interrupt traffic, and they become targets because sensitive information is often transmitted over them. Quick VPNs generally provide faster tunnel establishment and less overhead, but this downgrades the VPN's effectiveness as a security protocol. Changing long usernames (IP addresses) and passwords frequently has also been found to be important for achieving security and protection from malicious attacks.[14]

Smart cards

Main Article: Smart Card

Newer smart cards are a developing technology used to authorize users for certain resources.[15] Using biometrics such as fingerprints, iris images, voice patterns, or even DNA sequences as access control for sensitive documents such as passports can help ensure privacy. Biometrics are important because they act as the access code to virtually everything about a particular person. They are currently used in telecommunications, e-commerce, banking, government applications, healthcare, transportation, and more.

Danger of using biometrics

Biometric data contains unique characteristic details about a person, so if it were leaked, it would be fairly easy to trace the affected user. This poses a great danger. There are some possible solutions to this:

Anonymous Biometric Access Control System (ABAC): authenticates valid users into a system without knowing who they are. For example, a hotel should be able to admit a VIP guest into a VIP room without learning any details about that person, even though the verification process still uses biometrics. Developed by the Center for Visualization and Virtual Environments, the system is composed of Hamming distance computation, bit extraction, comparison, and result aggregation, all implemented with a homomorphic cipher. Essentially, it allows the biometric server to confirm a user without knowing their identity. This is done by encrypting the saved biometrics, with specific processes to de-identify features such as facial data, so that even if the data were leaked there would be no danger of tracing someone's identity.[16]
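The matching pipeline above can be sketched in plaintext: compare a probe template to enrolled templates by Hamming distance, then aggregate the results into a single yes/no. In the actual ABAC system these XOR and popcount steps run under a homomorphic cipher so the server never sees the templates; this toy version only illustrates the logic.

```python
def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two biometric bit-templates."""
    return bin(a ^ b).count("1")

def admit(probe: int, enrolled, threshold: int) -> bool:
    """Result aggregation: grant access iff the probe is close to ANY enrolled
    template, without reporting WHICH template matched. Templates here are
    small invented bit-strings; real systems use long feature vectors."""
    return any(hamming_distance(probe, t) <= threshold for t in enrolled)
```

Because the caller only learns the aggregated boolean, the matched identity stays hidden, which is the anonymity property the system is after.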

Online videos

Online learning using mobile videos has become prevalent in our time. One of the biggest privacy challenges in this field is prefetching: with a sudden increase in video traffic, many providers have turned to prefetching to offload some of this demand and reduce delay. Prefetching loads resources before they are needed in order to decrease the eventual wait time. It seems a perfect solution for the growing demand for videos, but prefetching relies heavily on prediction. Accurate prediction requires access to the user's viewing history and other details about the user; otherwise prefetching wastes more bandwidth than it saves. After learning the user's viewing history and the content the user likes, it is easier to predict the next video to prefetch, but these are valuable and potentially sensitive data.[17]

Proposed privacy solution:

DPDL-SVP (differential privacy-oriented distributed online learning for mobile social video prefetching) separates the main problem into two subproblems, which mobile users solve by communicating only with their mobile neighbors and social members. This provides differential privacy, preserving sensitive viewing information.[17]
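The prediction step that prefetching depends on can be sketched as a first-order transition model over viewing history. This is a hypothetical illustration of the general idea, not the DPDL-SVP algorithm; the privacy point is that the history powering such a model is exactly the sensitive data discussed above, so keeping it on-device (as DPDL-SVP's distributed design aims to) matters.

```python
from collections import Counter, defaultdict

class PrefetchPredictor:
    """Predict the next video from watch-transition counts.
    In a privacy-preserving design, this history would stay on the device;
    only the resulting prefetch requests would leave it."""

    def __init__(self):
        # transitions[prev][next] = number of times next followed prev
        self.transitions = defaultdict(Counter)

    def watch(self, prev_video: str, next_video: str) -> None:
        """Record that next_video was watched right after prev_video."""
        self.transitions[prev_video][next_video] += 1

    def prefetch(self, current: str):
        """Return the most likely next video, or None if there is no history.
        A bad prediction here wastes the bandwidth the prefetch was meant to save."""
        nxt = self.transitions.get(current)
        return nxt.most_common(1)[0][0] if nxt else None
```
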

Certified logos used to instill higher trust (USDA organic certified)

Third party certification

With the growth of online transactions, there have been initiatives aimed at reducing consumers' perception of risk. Firms have found ways to gain the trust of new consumers through seals and certifications from third-party platforms. A study in Electronic Commerce Research found that payment providers can reduce consumers' perception of risk, and enhance visitor conversion, by displaying third-party logos and seals on the checkout page.[18] These logos and certificates signal to consumers that it is safe to enter their payment information and shipping address.

Future of soft privacy technology

Mobile concern and possible solutions

mIPS (mobile intrusion prevention system) is designed to be location-aware and to protect users of future technology such as virtualized environments, where the phone acts as a small virtual machine. Future cases to be wary of include stalking attacks, bandwidth stealing, attacks on cloud hosts, and traffic-analysis attacks on 5G devices within a confined area. Current privacy protection programs are prone to leakage and do not account for changes in Bluetooth, location, and LAN connections, which affect how often leakage can occur.[19]

Public key-based access control

In the context of sensor networks, public key-based access control (PKC) may be a good future solution to some issues in wireless access control. Dangers from attackers on sensor networks include: impersonation, which grants access to malicious users; replay attacks, where the adversary captures sensitive information and replays it; interleaving, which selectively combines messages from previous sessions; reflection, where an adversary sends an identical message back to its originator, similar to impersonation; forced delay, which blocks a message so it is delivered at a later time; and chosen-text attacks, where the attacker tries to extract the keys used to access the sensor. Public key-based cryptography may address these, as a study by Haodong Wang shows that the PKC-based protocol presented outperforms traditional symmetric keys in memory usage, message complexity, and security resilience.[20]
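A minimal sketch of public key-based access control is a Schnorr-style challenge-response: the sensor stores only the user's public key and a fresh random challenge defeats replay, since an old response is valid only for the old challenge. This is a toy illustration, not the protocol from Wang et al.; the group parameters below are demo-only, and real deployments use standardized curves and vetted libraries.

```python
import secrets

# Demo-only parameters: arithmetic mod a Mersenne prime with generator 3.
P = 2**127 - 1
G = 3

def keygen():
    """User keeps x secret; the sensor is provisioned with public key y = G^x mod P."""
    x = secrets.randbelow(P - 1)
    return x, pow(G, x, P)

def prove_commit():
    """Prover picks random r, sends commitment t = G^r mod P, keeps r secret."""
    r = secrets.randbelow(P - 1)
    return r, pow(G, r, P)

def prove_respond(x: int, r: int, c: int) -> int:
    """Response s = r + c*x (mod P-1) to the sensor's fresh challenge c."""
    return (r + c * x) % (P - 1)

def verify(y: int, t: int, c: int, s: int) -> bool:
    """Sensor checks G^s == t * y^c (mod P) -- proves knowledge of x
    without the sensor ever learning x, and a replayed s fails for a new c."""
    return pow(G, s, P) == (t * pow(y, c, P)) % P
```

The check works because G^s = G^(r + c*x) = G^r * (G^x)^c = t * y^c (mod P).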

Social media

Privacy management is a big part of social networks, and several solutions to this issue have been proposed. For example, users of various social networks can control and specify what information they want to share with certain people based on trust levels. Privacy concerns still arise: in 2007, Facebook received complaints about its advertisements, in which a Facebook partner collected information about users and spread it to the users' friends without consent. Some proposed solutions, still at the prototype stage, use a protocol based on cryptographic and digital signature techniques to ensure the right privacy protections are in place.[21]

Massive dataset

As the amount of data a single source collects increases, that source becomes prone to privacy violations and a target for malicious attacks due to the abundance of personal information it holds.[22] One proposed solution is to anonymize the data by building a virtual database that anonymizes both the data providers and the subjects of the data, using a new and developing technique called l-site diversity.[23]
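The underlying notion can be sketched with a plain l-diversity check: every group of records sharing the same generalized quasi-identifiers must contain at least l distinct sensitive values, so membership in a group reveals little. This toy checker illustrates only the base property; l-site diversity extends the idea so that groups also span multiple contributing data providers, which is not modeled here. All records below are invented.

```python
from collections import defaultdict

def is_l_diverse(records, quasi_ids, sensitive, l):
    """True iff every equivalence class (records agreeing on all quasi-identifier
    attributes) contains at least l distinct values of the sensitive attribute."""
    groups = defaultdict(set)
    for r in records:
        key = tuple(r[q] for q in quasi_ids)
        groups[key].add(r[sensitive])
    return all(len(values) >= l for values in groups.values())
```

A release failing this check lets an attacker who can place someone in a group infer that person's sensitive value outright.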

References

  1. ^ Deng, Mina; Wuyts, Kim; Scandariato, Riccardo; Preneel, Bart; Joosen, Wouter (March 2011). "A privacy threat analysis framework: supporting the elicitation and fulfillment of privacy requirements". Requirements Engineering. 16 (1): 3–32. doi:10.1007/s00766-010-0115-7. ISSN 0947-3602.
  2. ^ a b Salama, Usama; Yao, Lina; Paik, Hye-young (June 2018). "An Internet of Things Based Multi-Level Privacy-Preserving Access Control for Smart Living". Informatics. 5 (2): 23. doi:10.3390/informatics5020023.
  3. ^ "Ambient Assisted Living (AAL): Technology for independent life". GlobalRPH. Retrieved 2020-10-29.
  4. ^ Yang, Pan; Xiong, Naixue; Ren, Jingli (2020). "Data Security and Privacy Protection for Cloud Storage: A Survey". IEEE Access. 8: 131723–131740. doi:10.1109/ACCESS.2020.3009876. ISSN 2169-3536.
  5. ^ a b Ren, Hao; Li, Hongwei; Liang, Xiaohui; He, Shibo; Dai, Yuanshun; Zhao, Lian (2016-09-10). "Privacy-Enhanced and Multifunctional Health Data Aggregation under Differential Privacy Guarantees". Sensors. 16 (9): 1463. doi:10.3390/s16091463. ISSN 1424-8220. PMC 5038741. PMID 27626417.
  6. ^ Jain, Priyank; Gyanchandani, Manasi; Khare, Nilay (2018-04-13). "Differential privacy: its technological prescriptive using big data". Journal of Big Data. 5 (1): 15. doi:10.1186/s40537-018-0124-9. ISSN 2196-1115.
  7. ^ a b Salama, Chasel; Lee (2017). "Grabbing the Wheel Early: Moving Forward on Cybersecurity and Privacy Protections for Driverless Cars" (PDF). Informatics. 5 (2): 23. doi:10.3390/informatics5020023.
  8. ^ "Auto-ISAC – Automotive Information Sharing & Analysis Center". Retrieved 2020-10-29.
  9. ^ Li, Yuancheng; Zhang, Pan; Wang, Yimeng (2018-10-01). "The Location Privacy Protection of Electric Vehicles with Differential Privacy in V2G Networks". Energies. 11 (10): 2625. doi:10.3390/en11102625. ISSN 1996-1073.
  10. ^ a b Salam, Md Iftekhar; Yau, Wei-Chuen; Chin, Ji-Jian; Heng, Swee-Huay; Ling, Huo-Chong; Phan, Raphael C-W; Poh, Geong Sen; Tan, Syh-Yuan; Yap, Wun-She (December 2015). "Implementation of searchable symmetric encryption for privacy-preserving keyword search on cloud storage". Human-centric Computing and Information Sciences. 5 (1): 19. doi:10.1186/s13673-015-0039-9. ISSN 2192-1962.
  11. ^ Alshalan, Abdullah; Pisharody, Sandeep; Huang, Dijiang (2016). "A Survey of Mobile VPN Technologies". IEEE Communications Surveys & Tutorials. 18 (2): 1177–1196. doi:10.1109/COMST.2015.2496624. ISSN 1553-877X.
  12. ^ Marshall, Emma W.; Groscup, Jennifer L.; Brank, Eve M.; Perez, Analay; Hoetger, Lori A. (2019). "Police surveillance of cell phone location data: Supreme Court versus public opinion". Behavioral Sciences & the Law. 37 (6): 751–775. doi:10.1002/bsl.2442. ISSN 1099-0798.
  13. ^ "Building a Collaborative Phone Blacklisting System with Local Differential Privacy" (PDF).
  14. ^ Rahimi, Sanaz; Zargham, Mehdi (2011), Butts, Jonathan; Shenoi, Sujeet (eds.), "Security Analysis of VPN Configurations in Industrial Control Environments", Critical Infrastructure Protection V, vol. 367, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 73–88, doi:10.1007/978-3-642-24864-1_6, ISBN 978-3-642-24863-4, retrieved 2020-11-12
  15. ^ Sanchez-Reillo, Raul; Alonso-Moreno, Raul; Liu-Jimenez, Judith (2013), Campisi, Patrizio (ed.), "Smart Cards to Enhance Security and Privacy in Biometrics", Security and Privacy in Biometrics, London: Springer London, pp. 239–274, doi:10.1007/978-1-4471-5230-9_10, ISBN 978-1-4471-5229-3, retrieved 2020-10-29
  16. ^ Ye, Shuiming; Luo, Ying; Zhao, Jian; Cheung, Sen-Ching S. (2009). "Anonymous Biometric Access Control". EURASIP Journal on Information Security. 2009: 1–17. doi:10.1155/2009/865259. ISSN 1687-4161.
  17. ^ a b Wang, Mu; Xu, Changqiao; Chen, Xingyan; Hao, Hao; Zhong, Lujie; Yu, Shui (March 2019). "Differential Privacy Oriented Distributed Online Learning for Mobile Social Video Prefetching". IEEE Transactions on Multimedia. 21 (3): 636–651. doi:10.1109/TMM.2019.2892561. ISSN 1520-9210.
  18. ^ Cardoso, Sofia; Martinez, Luis F. (2019-03-01). "Online payments strategy: how third-party internet seals of approval and payment provider reputation influence the Millennials' online transactions". Electronic Commerce Research. 19 (1): 189–209. doi:10.1007/s10660-018-9295-x. ISSN 1572-9362.
  19. ^ Ulltveit-Moe, Nils; Oleshchuk, Vladimir A.; Køien, Geir M. (2011-04-01). "Location-Aware Mobile Intrusion Detection with Enhanced Privacy in a 5G Context". Wireless Personal Communications. 57 (3): 317–338. doi:10.1007/s11277-010-0069-6. ISSN 1572-834X.
  20. ^ Wang, Haodong; Sheng, Bo; Tan, Chiu C.; Li, Qun (July 2011). "Public-key based access control in sensornet". Wireless Networks. 17 (5): 1217–1234. doi:10.1007/s11276-011-0343-x. ISSN 1022-0038.
  21. ^ Salam, Md Iftekhar; Yau, Wei-Chuen; Chin, Ji-Jian; Heng, Swee-Huay; Ling, Huo-Chong; Phan, Raphael C-W; Poh, Geong Sen; Tan, Syh-Yuan; Yap, Wun-She (December 2015). "Implementation of searchable symmetric encryption for privacy-preserving keyword search on cloud storage". Human-centric Computing and Information Sciences. 5 (1): 19. doi:10.1186/s13673-015-0039-9. ISSN 2192-1962.
  22. ^ Al-Zobbi, Mohammed; Shahrestani, Seyed; Ruan, Chun (December 2017). "Improving MapReduce privacy by implementing multi-dimensional sensitivity-based anonymization". Journal of Big Data. 4 (1): 45. doi:10.1186/s40537-017-0104-5. ISSN 2196-1115.
  23. ^ Jurczyk, Pawel; Xiong, Li (2009), Gudes, Ehud; Vaidya, Jaideep (eds.), "Distributed Anonymization: Achieving Privacy for Both Data Subjects and Data Providers", Data and Applications Security XXIII, vol. 5645, Berlin, Heidelberg: Springer Berlin Heidelberg, pp. 191–207, doi:10.1007/978-3-642-03007-9_13, ISBN 978-3-642-03006-2, retrieved 2020-11-12