Tokenization (data security)

[Image caption: A simplified example of how mobile payment tokenization commonly works via a mobile phone application with a credit card.[1][2] Methods other than fingerprint scanning or PINs can be used at a payment terminal.]

Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. The token is a reference (i.e. identifier) that maps back to the sensitive data through a tokenization system. The mapping from original data to a token uses methods that render tokens infeasible to reverse in the absence of the tokenization system, for example using tokens created from random numbers.[3] A one-way cryptographic function is used to convert the original data into tokens, making it difficult to recreate the original data without access to the tokenization system's resources.[4] To deliver such services, the system maintains a vault database of tokens that are connected to the corresponding sensitive data. Protecting the system vault is vital, and strong processes must be in place to ensure database integrity and physical security.[5]

The tokenization system must be secured and validated using security best practices[6] applicable to sensitive data protection, secure storage, audit, authentication and authorization. The tokenization system provides data processing applications with the authority and interfaces to request tokens, or detokenize back to sensitive data.

The security and risk reduction benefits of tokenization require that the tokenization system is logically isolated and segmented from data processing systems and applications that previously processed or stored sensitive data replaced by tokens. Only the tokenization system can tokenize data to create tokens, or detokenize back to redeem sensitive data under strict security controls. The token generation method must be proven to have the property that there is no feasible means through direct attack, cryptanalysis, side channel analysis, token mapping table exposure or brute force techniques to reverse tokens back to live data.

Replacing live data with tokens in systems is intended to minimize exposure of sensitive data to those applications, stores, people and processes, reducing risk of compromise or accidental exposure and unauthorized access to sensitive data. Applications can operate using tokens instead of live data, with the exception of a small number of trusted applications explicitly permitted to detokenize when strictly necessary for an approved business purpose. Tokenization systems may be operated in-house within a secure isolated segment of the data center, or as a service from a secure service provider.

Tokenization may be used to safeguard sensitive data involving, for example, bank accounts, financial statements, medical records, criminal records, driver's licenses, loan applications, stock trades, voter registrations, and other types of personally identifiable information (PII). Tokenization is often used in credit card processing. The PCI Council defines tokenization as "a process by which the primary account number (PAN) is replaced with a surrogate value called a token. A PAN may be linked to a reference number through the tokenization process. In this case, the merchant simply has to retain the token and a reliable third party controls the relationship and holds the PAN. The token may be created independently of the PAN, or the PAN can be used as part of the data input to the tokenization technique. The communication between the merchant and the third-party supplier must be secure to prevent an attacker from intercepting to gain the PAN and the token.[7]

De-tokenization[8] is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value".[9] The choice of tokenization as an alternative to other techniques such as encryption will depend on varying regulatory requirements, interpretation, and acceptance by respective auditing or assessment entities. This is in addition to any technical, architectural or operational constraint that tokenization imposes in practical use.

Concepts and origins

The concept of tokenization, as adopted by the industry today, has existed since the first currency systems emerged centuries ago as a means to reduce risk in handling high-value financial instruments by replacing them with surrogate equivalents.[10][11][12] In the physical world, coin tokens have a long history of use replacing the financial instrument of minted coins and banknotes. In more recent history, subway tokens and casino chips found adoption for their respective systems to replace physical currency and cash-handling risks such as theft. Exonumia and scrip are terms synonymous with such tokens.

In the digital world, similar substitution techniques have been used since the 1970s as a means to isolate real data elements from exposure to other data systems. In databases, for example, surrogate key values have been used since 1976 to isolate data associated with the internal mechanisms of databases and their external equivalents for a variety of uses in data processing.[13][14] More recently, these concepts have been extended to consider this isolation tactic as a security mechanism for the purposes of data protection.

In the payment card industry, tokenization is one means of protecting sensitive cardholder data in order to comply with industry standards and government regulations.[15]

In 2001, TrustCommerce created the concept of tokenization to protect sensitive payment data for a client, Classmates.com.[16] Classmates.com engaged Rob Caulfield, founder of TrustCommerce, because the risk of storing cardholder data was too great if the systems were ever hacked. TrustCommerce developed TC Citadel®, with which customers could reference a token in place of cardholder data and TrustCommerce would process payments on the merchant's behalf.[17] This billing application allowed clients to process recurring payments without the need to store cardholder payment information. Tokenization replaces the Primary Account Number (PAN) with randomly generated tokens. If intercepted, the data contains no cardholder information, rendering it useless to hackers. The PAN cannot be retrieved, even if the token and the systems it resides on are compromised, nor can the token be reverse engineered to arrive at the PAN.

Tokenization was applied to payment card data by Shift4 Corporation[18] and released to the public during an industry Security Summit in Las Vegas, Nevada in 2005.[19] The technology is meant to prevent the theft of the credit card information in storage. Shift4 defines tokenization as: “The concept of using a non-decryptable piece of data to represent, by reference, sensitive or secret data. In payment card industry (PCI) context, tokens are used to reference cardholder data that is managed in a tokenization system, application or off-site secure facility.”[20]

To protect data over its full lifecycle, tokenization is often combined with end-to-end encryption to secure data in transit to the tokenization system or service, with a token replacing the original data on return. For example, to avoid the risks of malware stealing data from low-trust systems such as point of sale (POS) systems, as in the Target breach of 2013, cardholder data encryption must take place prior to card data entering the POS and not after. Encryption takes place within the confines of a security hardened and validated card reading device and data remains encrypted until received by the processing host, an approach pioneered by Heartland Payment Systems[21] as a means to secure payment data from advanced threats, now widely adopted by industry payment processing companies and technology companies.[22] The PCI Council has also specified end-to-end encryption (certified point-to-point encryption—P2PE) for various service implementations in various PCI Council Point-to-point Encryption documents.
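
A compressed sketch of that flow in Python, assuming the third-party cryptography package and a single pre-shared key between the hardened reader and the processing host (real deployments derive keys in hardware, e.g. via DUKPT, rather than using a static key):

```python
import secrets
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

READER_KEY = AESGCM.generate_key(bit_length=256)  # provisioned inside the device
VAULT = {}                                        # host-side token vault (simplified)

def card_reader(pan: str) -> tuple[bytes, bytes]:
    # Encryption happens inside the validated reading device, before the
    # PAN ever reaches POS software that malware could inspect.
    nonce = secrets.token_bytes(12)
    return nonce, AESGCM(READER_KEY).encrypt(nonce, pan.encode(), None)

def processing_host(nonce: bytes, blob: bytes) -> str:
    # Data stays encrypted until the processing host; the merchant only
    # ever receives a token in place of the original card data.
    pan = AESGCM(READER_KEY).decrypt(nonce, blob, None).decode()
    token = "tok_" + secrets.token_hex(8)
    VAULT[token] = pan
    return token

nonce, blob = card_reader("4111111111111111")
token = processing_host(nonce, blob)   # POS and merchant systems store only this
```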

The tokenization process

The process of tokenization consists of the following steps (a code sketch follows the list):

  • The application sends the tokenization data and authentication information to the tokenization system. If authentication fails, the request is stopped and the details are delivered to an event management system, so that administrators can discover problems and effectively manage the system. If authentication succeeds, the system moves on to the next phase.
  • Using one-way cryptographic techniques, a token is generated and kept in a highly secure data vault.
  • The new token is provided to the application for further use.[23]
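
A minimal sketch of these steps in Python; the vault, event log, and API-key check are hypothetical stand-ins for the secure components described above, not any standard interface:

```python
import secrets

VAULT = {}            # token -> original value (a real vault is encrypted at rest)
EVENT_LOG = []        # event management system consumed by administrators
API_KEYS = {"app-1"}  # applications authorized to request tokens

def tokenize(api_key: str, sensitive_value: str) -> str:
    # Step 1: authenticate the calling application; on failure, stop and
    # hand the event to the event management system.
    if api_key not in API_KEYS:
        EVENT_LOG.append(("auth_failure", api_key))
        raise PermissionError("authentication failed")
    # Step 2: generate a token with no mathematical link to the input.
    token = secrets.token_hex(8)
    # Step 3: keep the token and original value in the data vault.
    VAULT[token] = sensitive_value
    # Step 4: provide the new token to the application for further use.
    return token

token = tokenize("app-1", "4111111111111111")
print(token)  # surrogate value, safe to pass to downstream systems
```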

Tokenization systems share several components according to established standards.

  1. Token Generation is the process of producing a token using any means, such as mathematically reversible cryptographic functions based on strong encryption algorithms and key management mechanisms, one-way non-reversible cryptographic functions (e.g., a hash function with a strong, secret salt), or assignment via a randomly generated number (see the sketch after this list). Random Number Generator (RNG) techniques are often the best choice for generating token values.
  2. Token Mapping – this is the process of assigning the created token value to its original value. To enable permitted look-ups of the original value using the token as the index, a secure cross-reference database must be constructed.
  3. Token Data Store – this is a central repository for the Token Mapping process that holds the original values as well as the related token values after the Token Generation process. On data servers, sensitive data and token values must be securely kept in encrypted format.
  4. Encrypted Data Storage – this is the encryption of sensitive data while it is at rest in the data store.
  5. Management of Cryptographic Keys – strong key management procedures are required for sensitive data encryption on Token Data Stores.[24]
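
The generation approaches in item 1 differ mainly in whether the original value can be recovered from the token itself. A sketch of the three options using Python's standard library (the key and salt here are placeholders):

```python
import hmac, hashlib, secrets

pan = "4111111111111111"

# 1. Reversible cryptographic function: the token is a ciphertext that a
#    key holder can decrypt (requires a cipher such as AES plus key
#    management; omitted here).

# 2. One-way non-reversible function: HMAC with a strong, secret salt.
#    The PAN cannot be computed from the token, only re-derived and compared.
secret_salt = secrets.token_bytes(32)
one_way_token = hmac.new(secret_salt, pan.encode(), hashlib.sha256).hexdigest()

# 3. Randomly generated number: no mathematical link to the PAN at all,
#    so a secure mapping table (vault) is needed for look-ups.
random_token = "".join(secrets.choice("0123456789") for _ in range(16))
```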

Difference from encryption

Tokenization and “classic” encryption effectively protect data if implemented properly, and a computer security system may use both. While similar in certain regards, tokenization and classic encryption differ in a few key aspects. Both are data security methods and they have essentially the same function; however, they work through differing processes and have different effects on the data they are protecting.

Tokenization is a non-mathematical approach that replaces sensitive data with non-sensitive substitutes without altering the type or length of the data. This is an important distinction from encryption because changes in data length and type can render information unreadable in intermediate systems such as databases. Tokenized data can still be processed by legacy systems, which makes tokenization more flexible than classic encryption.
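
The format contrast can be made concrete with a short comparison; the encryption side assumes the third-party Python cryptography package:

```python
import secrets
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

pan = "4111111111111111"                      # 16 numeric characters

# Tokenization: the surrogate keeps the original type (digits) and length.
token = "".join(secrets.choice("0123456789") for _ in range(16))

# Encryption: the ciphertext is binary and longer (AES-GCM appends a
# 16-byte tag), so a field expecting 16 digits would reject it.
key = AESGCM.generate_key(bit_length=128)
nonce = secrets.token_bytes(12)
ciphertext = AESGCM(key).encrypt(nonce, pan.encode(), None)

print(len(token), token.isdigit())            # 16 True
print(len(ciphertext), ciphertext.isdigit())  # 32 False (plus a separate nonce)
```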

In many situations, encryption is a constant consumer of processing power, so such a system can require significant expenditure on specialized hardware and software.[4]

Another difference is that tokens require significantly fewer computational resources to process. With tokenization, specific data is kept fully or partially visible for processing and analytics while sensitive information is kept hidden. This allows tokenized data to be processed more quickly and reduces the strain on system resources, which can be a key advantage in systems that rely on high performance.

In comparison to encryption, tokenization technologies reduce time, expense, and administrative effort while enabling teamwork and communication.[4]

Types of tokens

There are many ways that tokens can be classified, but there is currently no unified classification. Tokens can be: single or multi-use, cryptographic or non-cryptographic, reversible or irreversible, authenticable or non-authenticable, and various combinations thereof.

In the context of payments, the difference between high and low value tokens plays a significant role.

High-value tokens (HVTs)

HVTs serve as surrogates for actual PANs in payment transactions and are used as an instrument for completing a payment transaction. In order to function, they must look like actual PANs. Multiple HVTs can map back to a single PAN and a single physical credit card without the owner being aware of it.

Additionally, HVTs can be limited to certain networks and/or merchants, whereas PANs cannot.

HVTs can also be bound to specific devices so that anomalies between token use, physical devices, and geographic locations can be flagged as potentially fraudulent.
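
A sketch of how such per-token restrictions might be represented and checked; the field names and rules are illustrative only, not taken from any card network's specification:

```python
from dataclasses import dataclass, field

@dataclass
class HighValueToken:
    token_pan: str                       # surrogate that looks like a real PAN
    allowed_merchants: set = field(default_factory=set)
    bound_device_id: str | None = None   # device the token was provisioned to

def authorize(t: HighValueToken, merchant: str, device_id: str) -> bool:
    # Use outside the token's declared merchants or bound device is
    # flagged as potentially fraudulent, as described above.
    if t.allowed_merchants and merchant not in t.allowed_merchants:
        return False
    if t.bound_device_id is not None and device_id != t.bound_device_id:
        return False
    return True

hvt = HighValueToken("4099876543211111", {"coffee-shop"}, "phone-abc")
print(authorize(hvt, "coffee-shop", "phone-abc"))  # True
print(authorize(hvt, "coffee-shop", "phone-xyz"))  # False: device mismatch
```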

Low-value tokens (LVTs) or security tokens

LVTs also act as surrogates for actual PANs in payment transactions, but they serve a different purpose. LVTs cannot be used by themselves to complete a payment transaction. In order for an LVT to function, it must be possible to match it back to the actual PAN it represents, albeit only in a tightly controlled fashion. Using tokens to protect PANs becomes ineffectual if the tokenization system is breached; therefore, securing the tokenization system itself is extremely important.

System operations, limitations and evolution

First generation tokenization systems use a database to map from live data to surrogate substitute tokens and back. This requires storage, management, and continuous backup of every new transaction added to the token database to avoid data loss. Another problem is ensuring consistency across data centers, which requires continuous synchronization of token databases. Significant consistency, availability and performance trade-offs, per the CAP theorem, are unavoidable with this approach. This overhead adds complexity to real-time transaction processing to avoid data loss and to assure data integrity across data centers, and also limits scale. Storing all sensitive data in one service creates an attractive target for attack and compromise, and introduces privacy and legal risk in the aggregation of data (see Internet privacy), particularly in the EU.

Another limitation of tokenization technologies is measuring the level of security of a given solution through independent validation. With the lack of standards, such validation is critical to establish the strength of tokenization offered when tokens are used for regulatory compliance. The PCI Council recommends independent vetting and validation of any claims of security and compliance: "Merchants considering the use of tokenization should perform a thorough evaluation and risk analysis to identify and document the unique characteristics of their particular implementation, including all interactions with payment card data and the particular tokenization systems and processes".[25]

The method of generating tokens may also have limitations from a security perspective. With concerns about security and attacks on random number generators, which are a common choice for the generation of tokens and token mapping tables, scrutiny must be applied to ensure proven and validated methods are used rather than arbitrary designs.[26][27] Random number generators have limitations in terms of speed, entropy, seeding and bias, and their security properties must be carefully analysed and measured to avoid predictability and compromise.
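
The predictability concern can be illustrated in Python: a deterministically seeded general-purpose generator reproduces its entire output for anyone who learns the seed, while a CSPRNG drawing on operating-system entropy does not:

```python
import random, secrets

# Seeded Mersenne Twister: anyone who knows (or guesses) the seed can
# regenerate every "token" this source ever produced.
rng_a = random.Random(1234)
rng_b = random.Random(1234)
assert [rng_a.randint(0, 9999) for _ in range(5)] == \
       [rng_b.randint(0, 9999) for _ in range(5)]

# Cryptographically secure generator: suitable for token values.
tokens = [secrets.randbelow(10000) for _ in range(5)]
```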

With tokenization's increasing adoption, new tokenization technology approaches have emerged to remove such operational risks and complexities and to enable increased scale suited to emerging big data use cases and high performance transaction processing, especially in financial services and banking.[28] In addition to conventional tokenization methods, Protegrity provides additional security through its so-called "obfuscation layer." This creates a barrier that prevents not only regular users but also privileged users who have access, such as database administrators, from viewing information they should not see.[29]

Stateless tokenization enables random mapping of live data elements to surrogate values without needing a database while retaining the isolation properties of tokenization.
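
One way to realize such a vaultless mapping is a keyed pseudorandom permutation over the data's format, so that a secret key alone deterministically maps a value to its surrogate and back. The toy Feistel construction below only illustrates the idea; production systems use vetted format-preserving encryption schemes such as NIST FF1 rather than this ad-hoc sketch:

```python
import hmac, hashlib

def _round_value(key: bytes, half: int, rnd: int, width: int) -> int:
    # Keyed round function: HMAC-SHA256 reduced modulo the half's range.
    digest = hmac.new(key, f"{rnd}:{half}".encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % 10**width

def map_digits(key: bytes, digits: str, inverse: bool = False) -> str:
    # Balanced Feistel permutation over an even-length digit string:
    # reversible with the key alone, so no token database is needed.
    w = len(digits) // 2
    left, right = int(digits[:w]), int(digits[w:])
    if not inverse:
        for i in range(8):
            left, right = right, (left + _round_value(key, right, i, w)) % 10**w
    else:
        for i in reversed(range(8)):
            left, right = (right - _round_value(key, left, i, w)) % 10**w, left
    return f"{left:0{w}d}{right:0{w}d}"

key = b"demo-secret-key"
token = map_digits(key, "4111111111111111")
assert map_digits(key, token, inverse=True) == "4111111111111111"
```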

In November 2014, American Express released its token service, which meets the EMV tokenization standard.[30] Other notable examples of tokenization-based payment systems, according to the EMVCo standard, include Google Wallet, Apple Pay,[31] Samsung Pay, Microsoft Wallet, Fitbit Pay and Garmin Pay. Visa uses tokenization techniques to provide secure online and mobile shopping.[32]

Using blockchain, as opposed to relying on trusted third parties, it is possible to run highly accessible, tamper-resistant databases for transactions.[33][34] With the help of blockchain, tokenization is the process of converting the value of a tangible or intangible asset into a token that can be exchanged on the network.

This enables the tokenization of conventional financial assets, for instance, by transforming rights into a digital token backed by the asset itself using blockchain technology.[35] In addition, tokenization enables the simple and efficient compartmentalization and management of data across multiple users. Individual tokens created through tokenization can be used to split ownership and partially resell an asset.[36][37] Consequently, only entities with the appropriate token can access the data.[35]

Numerous blockchain companies support asset tokenization. In 2019, eToro acquired Firmo and renamed it eToroX. Through its Token Management Suite, which is backed by USD-pegged stablecoins, eToroX enables asset tokenization.[38][39]

The tokenization of equity is facilitated by STOKR, a platform that links investors with small and medium-sized businesses. Tokens issued through the STOKR platform are legally recognized as transferable securities under European Union capital market regulations.[40]

Breakers enables the tokenization of intellectual property, allowing content creators to issue their own digital tokens. Tokens can be distributed to a variety of project participants. Without intermediaries or a governing body, content creators can integrate reward-sharing features into the token.[40]

Application to alternative payment systems

Building an alternate payment system requires a number of entities working together in order to deliver near field communication (NFC) or other technology-based payment services to the end users. One of the issues is interoperability between the players; to resolve this, the role of a trusted service manager (TSM) has been proposed to establish a technical link between mobile network operators (MNOs) and providers of services, so that these entities can work together. Tokenization can play a role in mediating such services.

The value of tokenization as a security strategy lies in the ability to replace a real card number with a surrogate (target removal) and the subsequent limitations placed on the surrogate card number (risk reduction). If the surrogate value can be used in an unlimited fashion or even in a broadly applicable manner, the token gains as much value as the real credit card number. In these cases, the token may be secured by a second dynamic token that is unique for each transaction and also associated with a specific payment card. Examples of dynamic, transaction-specific tokens include the cryptograms used in the EMV specification.
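
A rough sketch of a dynamic, per-transaction value; this uses an HMAC over transaction fields and a counter purely for illustration and is not the EMV cryptogram algorithm:

```python
import hmac, hashlib

CARD_KEY = b"per-card secret provisioned to the device"
atc = 0  # transaction counter, incremented for every payment

def dynamic_value(static_token: str, amount_cents: int, merchant: str) -> str:
    global atc
    atc += 1
    msg = f"{static_token}|{amount_cents}|{merchant}|{atc}".encode()
    # Unique per transaction and bound to one card's key, so a captured
    # value cannot be replayed for a different payment.
    return hmac.new(CARD_KEY, msg, hashlib.sha256).hexdigest()[:16]

c1 = dynamic_value("4099876543211111", 500, "coffee-shop")
c2 = dynamic_value("4099876543211111", 500, "coffee-shop")
assert c1 != c2  # the counter makes each transaction's value unique
```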

Application to PCI DSS standards

The Payment Card Industry Data Security Standard, an industry-wide set of guidelines that must be met by any organization that stores, processes, or transmits cardholder data, mandates that credit card data must be protected when stored.[41] Tokenization, as applied to payment card data, is often implemented to meet this mandate, replacing credit card and ACH numbers in some systems with a random value or string of characters.[42] Tokens can be formatted in a variety of ways. Some token service providers or tokenization products generate the surrogate values in such a way as to match the format of the original sensitive data. In the case of payment card data, a token might be the same length as a Primary Account Number (bank card number) and contain elements of the original data such as the last four digits of the card number. When a payment card authorization request is made to verify the legitimacy of a transaction, a token might be returned to the merchant instead of the card number, along with the authorization code for the transaction. The token is stored in the receiving system while the actual cardholder data is mapped to the token in a secure tokenization system. Storage of tokens and payment card data must comply with current PCI standards, including the use of strong cryptography.[43]
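
A sketch of such a format-preserving surrogate, keeping the length and last four digits of a hypothetical card number (whether original digits may be retained in a token is a policy decision of the token service provider):

```python
import secrets

def format_preserving_token(pan: str) -> str:
    # Same length, digits only; the last four are preserved for receipts
    # and look-ups, the rest is random with no link to the original PAN.
    random_part = "".join(secrets.choice("0123456789")
                          for _ in range(len(pan) - 4))
    return random_part + pan[-4:]

token = format_preserving_token("4111111111111111")
print(len(token), token[-4:])  # 16 1111
```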

Standards (ANSI, the PCI Council, Visa, and EMV)

Tokenization is currently in standards definition in ANSI X9 as X9.119 Part 2. X9 is responsible for the industry standards for financial cryptography and data protection, including payment card PIN management, credit and debit card encryption, and related technologies and processes. The PCI Council has also stated support for tokenization in reducing risk in data breaches when combined with other technologies such as Point-to-Point Encryption (P2PE) and assessments of compliance to PCI DSS guidelines.[44] Visa Inc. released Visa Tokenization Best Practices[45] for tokenization uses in credit and debit card handling applications and services. In March 2014, EMVCo LLC released its first payment tokenization specification for EMV.[46] PCI DSS is the standard most frequently applied to tokenization systems by payment industry players.[24]

Risk reduction

Tokenization can make it more difficult for attackers to gain access to sensitive data outside of the tokenization system or service. Implementation of tokenization may simplify the requirements of the PCI DSS, as systems that no longer store or process sensitive data may be subject to fewer of the controls required by the PCI DSS guidelines.

As a security best practice,[47] independent assessment and validation of any technologies used for data protection, including tokenization, must be in place to establish the security and strength of the method and implementation before any claims of privacy compliance, regulatory compliance, and data security can be made. This validation is particularly important in tokenization, because the tokens are shared externally in general use and thus exposed in high-risk, low-trust environments. The infeasibility of reversing a token or set of tokens to live sensitive data must be established using industry-accepted measurements and proofs by appropriate experts independent of the service or solution provider.

Restrictions on token use

Not all organizational data can be tokenized; data needs to be examined and filtered to determine what is suitable for tokenization.

When token databases are used on a large scale, they expand rapidly, causing the search process to take longer, restricting system performance, and lengthening backup processes. A database that links sensitive information to tokens is called a vault. As new data is added, the vault's maintenance workload increases significantly.

To ensure database consistency, token databases need to be continuously synchronized.

In addition, secure communication channels must be built between the systems handling sensitive data and the vault so that data is not compromised on the way to or from storage.[4]
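A minimal sketch of a client-side detokenization call over such a channel, assuming a hypothetical REST-style vault endpoint (the host name, path, and response shape below are illustrative, not a real API):

    import http.client
    import json

    def detokenize(token: str, api_key: str) -> str:
        """Exchange a token for the original value over an encrypted channel.

        HTTPSConnection verifies the vault's TLS certificate by default,
        so the sensitive value is protected in transit to and from storage.
        """
        conn = http.client.HTTPSConnection("vault.example.internal", timeout=5)
        conn.request(
            "POST",
            "/v1/detokenize",
            body=json.dumps({"token": token}),
            headers={
                "Content-Type": "application/json",
                "Authorization": "Bearer " + api_key,
            },
        )
        response = conn.getresponse()
        return json.loads(response.read())["value"]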

See also

References

  1. ^ "Tokenization demystified". IDEMIA. 2017-09-19. Archived from the original on 2018-01-26. Retrieved 2018-01-26.
  2. ^ "Payment Tokenization Explained". Square. Archived from the original on 2018-01-02. Retrieved 2018-01-26.
  3. ^ CardVault: "Tokenization 101"
  4. ^ a b c d Ogigau-Neamtiu, F. (2016). "Tokenization as a data security technique". Regional Department of Defense Resources Management Studies. Zeszyty Naukowe AON (2(103)). Brasov, Romania: Akademia Sztuki Wojennej: 124–135. ISSN 0867-2245.
  5. ^ Ogîgău-Neamţiu, F. (2017). "Automating the data security process". Journal of Defense Resources Management (JoDRM) (8(2)).
  6. ^ "OWASP Top Ten Project". Archived from the original on 2019-12-01. Retrieved 2014-04-01.
  7. ^ Stapleton, J.; Poore, R. S. (2011). "Tokenization and other methods of security for cardholder data". Information Security Journal: A Global Perspective. 20(2): 91–99.
  8. ^ Habash, Nizar Y. (2010). Introduction to Arabic natural language processing. Morgan & Claypool. ISBN 978-1-59829-796-6. OCLC 1154286658.
  9. ^ PCI DSS Tokenization Guidelines
  10. ^ Rolfe, Alex (May 2015). "The fall and rise of Tokenization". Retrieved 27 September 2022.
  11. ^ Xu, Xiwei; Pautasso, Cesare; Zhu, Liming; Lu, Qinghua; Weber, Ingo (2018-07-04). "A Pattern Collection for Blockchain-based Applications". Proceedings of the 23rd European Conference on Pattern Languages of Programs. EuroPLoP '18. New York, NY, USA: Association for Computing Machinery: 1–20. doi:10.1145/3282308.3282312. ISBN 978-1-4503-6387-7.
  12. ^ Millmore, B.; Foskolou, V.; Mondello, C.; Kroll, J.; Upadhyay, S.; Wilding, D. "Tokens: Culture, Connections, Communities: Final Programme" (PDF). The University of Warwick.
  13. ^ Link, S.; Luković, I.; Mogin, P. (2010). "Performance evaluation of natural and surrogate key database architectures". School of Engineering and Computer Science, Victoria University of Wellington.
  14. ^ Hall, P.; Owlett, J.; Todd, S. (1976). "Relations and entities". In G.M. Nijssen (ed.), Modelling in Database Management Systems.
  15. ^ "Tokenization eases merchant PCI compliance". Archived from the original on 2012-11-03. Retrieved 2013-03-28.
  16. ^ "Where Did Tokenization Come From?". TrustCommerce. Retrieved 2017-02-23.
  17. ^ "TrustCommerce". 2001-04-05. Archived from the original on 2001-04-05. Retrieved 2017-02-23.
  18. ^ "Shift4 Corporation Releases Tokenization in Depth White Paper". Archived from the original on 2014-03-13. Retrieved 2017-07-02.
  19. ^ "Shift4 Launches Security Tool That Lets Merchants Re-Use Credit Card Data". Internet Retailer. Archived from the original on 2005-10-13. {{cite magazine}}: |archive-date= / |archive-url= timestamp mismatch; 2015-02-18 suggested (help)
  20. ^ "Shift4 Corporation Releases Tokenization in Depth White Paper". Archived from the original on 2011-07-16. Retrieved 2010-09-17.
  21. ^ "Lessons Learned from a Data Breach" (PDF). Archived from the original (PDF) on 2013-05-02. Retrieved 2014-04-01.
  22. ^ Voltage, Ingencio Partner on Data Encryption Platform
  23. ^ Ogigau-Neamtiu, F. (2016). "Tokenization as a data security technique". Zeszyty Naukowe AON. 2(103). ISSN 0867-2245.
  24. ^ a b Ozdenizci, Busra; Ok, Kerem; Coskun, Vedat (2016-11-30). "A Tokenization-Based Communication Architecture for HCE-Enabled NFC Services". Mobile Information Systems. 2016: e5046284. doi:10.1155/2016/5046284. ISSN 1574-017X.
  25. ^ PCI Council Tokenization Guidelines
  26. ^ How do you know if an RNG is working?
  27. ^ Gimenez, Gregoire; Cherkaoui, Abdelkarim; Frisch, Raphael; Fesquet, Laurent (2017-07-01). "Self-timed Ring based True Random Number Generator: Threat model and countermeasures". 2017 IEEE 2nd International Verification and Security Workshop (IVSW). Thessaloniki, Greece: IEEE: 31–38. doi:10.1109/IVSW.2017.8031541. ISBN 978-1-5386-1708-3.
  28. ^ Vijayan, Jaikumar (2014-02-12). "Banks push for tokenization standard to secure credit card payments". Computerworld. Retrieved 2022-11-23.
  29. ^ Mark, S. J. (2018). "De-identification of personal information for use in software testing to ensure compliance with the Protection of Personal Information Act".
  30. ^ "American Express Introduces New Online and Mobile Payment Security Services". AmericanExpress.com. 3 November 2014. Archived from the original on 2014-11-04. Retrieved 2014-11-04.
  31. ^ "Apple Pay Programming Guide: About Apple Pay". developer.apple.com. Retrieved 2022-11-23.
  32. ^ "Visa Token Service". usa.visa.com. Retrieved 2022-11-23.
  33. ^ Beck, Roman; Avital, Michel; Rossi, Matti; Thatcher, Jason Bennett (2017-12-01). "Blockchain Technology in Business and Information Systems Research". Business & Information Systems Engineering. 59 (6): 381–384. doi:10.1007/s12599-017-0505-1. ISSN 1867-0202.
  34. ^ Çebi, F.; Bolat, H.B.; Atan, T.; Erzurumlu, Ö. Y. (2021). "International Engineering and Technology Management Summit 2021–ETMS2021 Proceeding Book". İstanbul Technical University & Bahçeşehir University. ISBN 978-975-561-522-6.
  35. ^ a b Morrow; Zarrebini (2019-10-22). "Blockchain and the Tokenization of the Individual: Societal Implications". Future Internet. 11 (10): 220. doi:10.3390/fi11100220. ISSN 1999-5903.
  36. ^ Tian, Yifeng; Lu, Zheng; Adriaens, Peter; Minchin, R. Edward; Caithness, Alastair; Woo, Junghoon (2020). "Finance infrastructure through blockchain-based tokenization". Frontiers of Engineering Management. 7 (4): 485–499. doi:10.1007/s42524-020-0140-2. ISSN 2095-7513.
  37. ^ Ross, Omri; Jensen, Johannes Rude; Asheim, Truls (2019-11-16). "Assets under Tokenization". SSRN. Rochester, NY. doi:10.2139/ssrn.3488344.
  38. ^ Tabatabai, Arman (2019-03-25). "Social investment platform eToro acquires smart contract startup Firmo". TechCrunch. Retrieved 2022-11-23.
  39. ^ "eToroX Names Omri Ross Chief Blockchain Scientist". Financial and Business News | Finance Magnates. Retrieved 2022-11-23.
  40. ^ a b Sazandrishvili, George (2020). "Asset tokenization in plain English". Journal of Corporate Accounting & Finance. 31 (2): 68–73. doi:10.1002/jcaf.22432. ISSN 1044-8136.
  41. ^ The Payment Card Industry Data Security Standard
  42. ^ "Tokenization: PCI Compliant Tokenization Payment Processing". Bluefin Payment Systems. Retrieved 2016-01-14.
  43. ^ "Data Security: Counterpoint – "The Best Way to Secure Data is Not to Store Data"" (PDF). Archived from the original (PDF) on 2009-07-31. Retrieved 2009-06-17.
  44. ^ "Protecting Consumer Information: Can Data Breaches Be Prevented?" (PDF). Archived from the original (PDF) on 2014-04-07. Retrieved 2014-04-01.
  45. ^ Visa Tokenization Best Practices
  46. ^ "EMV Payment Tokenisation Specification – Technical Framework". March 2014.
  47. ^ "OWASP Guide to Cryptography". Archived from the original on 2014-04-07. Retrieved 2014-04-01.
  • Cloud vs Payment - Introduction to tokenization via cloud payments.