Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. The token is a reference (i.e. an identifier) that maps back to the sensitive data through a tokenization system. The mapping from original data to a token uses methods that render tokens infeasible to reverse in the absence of the tokenization system, for example tokens created from random numbers. Where a one-way cryptographic function is used to convert the original data into tokens, it is difficult to recreate the original data without access to the tokenization system's resources. To deliver such services, the system maintains a vault database of tokens that are connected to the corresponding sensitive data. Protecting the system vault is vital, and strong processes must be put in place to ensure database integrity and physical security. The tokenization system must be secured and validated using security best practices applicable to sensitive data protection, secure storage, audit, authentication and authorization. The tokenization system provides data processing applications with the authority and interfaces to request tokens, or to detokenize back to sensitive data. The security and risk-reduction benefits of tokenization require that the tokenization system is logically isolated and segmented from the data processing systems and applications that previously processed or stored the sensitive data replaced by tokens. Only the tokenization system can tokenize data to create tokens, or detokenize to redeem sensitive data, under strict security controls. The token generation method must be proven to have the property that there is no feasible means, through direct attack, cryptanalysis, side channel analysis, token mapping table exposure or brute force techniques, to reverse tokens back to live data. Replacing live data with tokens is intended to minimize exposure of sensitive data to applications, stores, people and processes, reducing the risk of compromise, accidental exposure, and unauthorized access. Applications can operate using tokens instead of live data, with the exception of a small number of trusted applications explicitly permitted to detokenize when strictly necessary for an approved business purpose. Tokenization systems may be operated in-house within a secure, isolated segment of the data center, or as a service from a secure service provider. Tokenization may be used to safeguard sensitive data involving, for example,
bank accounts, financial statements, medical records, criminal records, driver's licenses, loan applications, stock trades, voter registrations, and other types of personally identifiable information (PII). Tokenization is often used in credit card processing. The PCI Council defines tokenization as "a process by which the primary account number (PAN) is replaced with a surrogate value called a token. A PAN may be linked to a reference number through the tokenization process. In this case, the merchant simply has to retain the token and a reliable third party controls the relationship and holds the PAN. The token may be created independently of the PAN, or the PAN can be used as part of the data input to the tokenization technique. The communication between the merchant and the third-party supplier must be secure to prevent an attacker from intercepting to gain the PAN and the token. De-tokenization is the reverse process of redeeming a token for its associated PAN value. The security of an individual token relies predominantly on the infeasibility of determining the original PAN knowing only the surrogate value". The choice of tokenization as an alternative to other techniques such as
encryption
will depend on varying regulatory requirements, interpretation, and acceptance by respective auditing or assessment entities. This is in addition to any technical, architectural or operational constraint that tokenization imposes in practical use.
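The vault-based scheme described above can be sketched in a few lines of Python. This is an illustrative toy, not any particular product: the class name, structure, and use of `secrets.token_hex` are our assumptions, and a real system would add encrypted storage, authentication, auditing, and physical security around the vault.

```python
import secrets

class TokenVault:
    """Minimal sketch of a vault-based tokenization system.

    Illustrative only: a real system wraps the vault database in
    encrypted storage, strict access control, and auditing.
    """

    def __init__(self):
        self._token_to_value = {}   # the protected vault table
        self._value_to_token = {}   # lets a repeated value reuse its token

    def tokenize(self, sensitive_value):
        if sensitive_value in self._value_to_token:
            return self._value_to_token[sensitive_value]
        # A random token carries no information about the original
        # data, so it cannot be reversed without access to the vault.
        token = secrets.token_hex(16)
        self._token_to_value[token] = sensitive_value
        self._value_to_token[sensitive_value] = token
        return token

    def detokenize(self, token):
        # Only the tokenization system can redeem a token.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
```

Downstream applications would store and pass around `token`; only the tokenization system can map it back via `detokenize`.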


Concepts and origins

The concept of tokenization, as adopted by the industry today, has existed since the first
currency
systems emerged centuries ago as a means to reduce risk in handling high value
financial instruments by replacing them with surrogate equivalents. In the physical world, coin tokens have a long history of use, replacing minted coins and
banknotes. In more recent history, subway tokens and casino chips found adoption for their respective systems to replace physical currency and cash handling risks such as theft. Exonumia and
scrip are terms synonymous with such tokens. In the digital world, similar substitution techniques have been used since the 1970s as a means to isolate real data elements from exposure to other data systems. In databases, for example,
surrogate key values have been used since 1976 to isolate data associated with the internal mechanisms of databases and their external equivalents for a variety of uses in data processing. More recently, these concepts have been extended so that this isolation tactic can provide a security mechanism for the purposes of data protection. In the
payment card
industry, tokenization is one means of protecting sensitive cardholder data in order to comply with industry standards and government regulations. In 2001, TrustCommerce created the concept of tokenization to protect sensitive payment data for a client, Classmates.com, which engaged Rob Caulfield, founder of TrustCommerce, because the risk of storing cardholder data was too great if the systems were ever hacked. TrustCommerce developed TC Citadel®, with which customers could reference a token in place of cardholder data and TrustCommerce would process payments on the merchant's behalf. This billing application allowed clients to process recurring payments without the need to store cardholder payment information. Tokenization replaces the Primary Account Number (PAN) with randomly generated tokens. If intercepted, the data contains no cardholder information, rendering it useless to hackers. The PAN cannot be retrieved, even if the token and the systems it resides on are compromised, nor can the token be reverse engineered to arrive at the PAN. Tokenization was applied to payment card data by Shift4 Corporation and released to the public during an industry Security Summit in
Las Vegas, Nevada,
in 2005. The technology is meant to prevent the theft of the credit card information in storage. Shift4 defines tokenization as: “The concept of using a non-decryptable piece of data to represent, by reference, sensitive or secret data. In
payment card industry
(PCI) context, tokens are used to reference cardholder data that is managed in a tokenization system, application or off-site secure facility.” To protect data over its full lifecycle, tokenization is often combined with
end-to-end encryption
to secure
data in transit
to the tokenization system or service, with a token replacing the original data on return. For example, to avoid the risks of
malware
stealing data from low-trust systems such as
point of sale
(POS) systems, as in the Target breach of 2013, cardholder data encryption must take place prior to card data entering the POS and not after. Encryption takes place within the confines of a security-hardened and validated card reading device, and data remains encrypted until received by the processing host, an approach pioneered by
Heartland Payment Systems
as a means to secure payment data from advanced threats, now widely adopted by industry payment processing companies and technology companies. The PCI Council has also specified end-to-end encryption (certified point-to-point encryption, P2PE) for various service implementations in various PCI Council Point-to-Point Encryption documents.


The tokenization process

The process of tokenization consists of the following steps:

* The application sends the tokenization data and authentication information to the tokenization system.
* The request is stopped if authentication fails, and the event is delivered to an event management system, so administrators can discover problems and effectively manage the system. If authentication succeeds, the system moves on to the next phase.
* Using one-way cryptographic techniques, a token is generated and kept in a highly secure data vault.
* The new token is provided to the application for further use.

Tokenization systems share several components according to established standards.

# Token Generation – the process of producing a token using any means, such as mathematically reversible cryptographic functions based on strong encryption algorithms and key management mechanisms, one-way nonreversible cryptographic functions (e.g., a hash function with a strong, secret salt), or assignment via a randomly generated number. Random number generator (RNG) techniques are often the best choice for generating token values.
# Token Mapping – the process of assigning the created token value to its original value. To enable permitted look-ups of the original value using the token as the index, a secure cross-reference database must be constructed.
# Token Data Store – a central repository for the Token Mapping process that holds the original values as well as the related token values after the Token Generation process. On data servers, sensitive data and token values must be securely kept in encrypted format.
# Encrypted Data Storage – the encryption of sensitive data while it is in transit.
# Management of Cryptographic Keys – strong key management procedures are required for sensitive data encryption on Token Data Stores.
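The request-handling steps above can be sketched as a single function. The application identifiers, API keys, and audit log below are hypothetical stand-ins for whatever authentication and event management system a real deployment uses:

```python
import secrets

AUDIT_LOG = []                       # stand-in for an event management system
API_KEYS = {"app-1": "s3cret-key"}   # hypothetical application credentials
VAULT = {}                           # token -> sensitive value mapping

def handle_tokenize_request(app_id, api_key, sensitive_value):
    # Steps 1-2: authenticate the calling application; on failure the
    # request stops and the event goes to the event management system.
    if API_KEYS.get(app_id) != api_key:
        AUDIT_LOG.append(("auth_failure", app_id))
        raise PermissionError("authentication failed")
    # Step 3: generate a token and keep the mapping in the data vault.
    token = secrets.token_hex(8)
    VAULT[token] = sensitive_value
    # Step 4: return the new token to the application.
    return token

token = handle_tokenize_request("app-1", "s3cret-key", "4111111111111111")
```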


Difference from encryption

Tokenization and “classic”
encryption
effectively protect data if implemented properly, and a computer security system may use both. While similar in certain regards, tokenization and classic encryption differ in a few key aspects. Both are
cryptographic
data security methods, and they essentially serve the same function; however, they do so through differing processes and have different effects on the data they protect. Tokenization is a non-mathematical approach that replaces sensitive data with non-sensitive substitutes without altering the type or length of the data. This is an important distinction from encryption, because changes in data length and type can render information unreadable in intermediate systems such as databases. Tokenized data can still be processed by legacy systems, which makes tokenization more flexible than classic encryption. In many situations, the encryption process is a constant consumer of processing power, so such a system needs significant expenditure on specialized hardware and software. Another difference is that tokens require significantly less computational resources to process. With tokenization, specific data is kept fully or partially visible for processing and analytics while sensitive information is kept hidden. This allows tokenized data to be processed more quickly and reduces the strain on system resources, which can be a key advantage in systems that rely on high performance. In comparison to encryption, tokenization technologies reduce time, expense, and administrative effort while enabling teamwork and communication.
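The format point can be shown in a few lines: hashing a 16-digit PAN (as a stand-in for classic cryptographic transformation) changes both its length and character type, while a randomly generated token can preserve both, so a field or column expecting 16 digits keeps working. The PAN value is a standard example number, not real card data:

```python
import hashlib
import secrets
import string

pan = "4111111111111111"             # example 16-digit PAN

# A cryptographic transform changes length and character type:
digest = hashlib.sha256(pan.encode()).hexdigest()   # 64 hex characters

# A token can keep the original type and length, so a database column
# or message field defined for 16 digits continues to work unchanged.
token = "".join(secrets.choice(string.digits) for _ in range(len(pan)))
```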


Types of tokens

There are many ways that tokens can be classified; however, there is currently no unified classification. Tokens can be: single or multi-use,
cryptographic
or non-cryptographic, reversible or irreversible, authenticable or non-authenticable, and various combinations thereof. In the context of payments, the difference between high and low value tokens plays a significant role.


High-value tokens (HVTs)

HVTs serve as surrogates for actual
PANs
in payment transactions and are used as an instrument for completing a payment transaction. In order to function, they must look like actual PANs. Multiple HVTs can map back to a single PAN and a single physical credit card without the owner being aware of it. Additionally, HVTs can be limited to certain networks and/or merchants whereas PANs cannot. HVTs can also be bound to specific devices so that anomalies between token use, physical devices, and geographic locations can be flagged as potentially fraudulent.
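A device-binding check of the kind described might look like the sketch below. The registry layout and the flagging rule are purely illustrative; real binding schemes are defined by the token service provider:

```python
# Hypothetical registry binding each high-value token to a device and
# an expected region; a token service would manage this securely.
TOKEN_BINDINGS = {
    "hvt-123": {"device_id": "phone-A", "region": "US"},
}

def is_suspicious(token, device_id, region):
    """Flag use of an HVT from an unexpected device or location
    as potentially fraudulent (illustrative rule only)."""
    binding = TOKEN_BINDINGS.get(token)
    if binding is None:
        return True   # unknown token: always flag
    return device_id != binding["device_id"] or region != binding["region"]
```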


Low-value tokens (LVTs) or security tokens

LVTs also act as surrogates for actual PANs in payment transactions; however, they serve a different purpose. LVTs cannot be used by themselves to complete a payment transaction. In order for an LVT to function, it must be possible to match it back to the actual PAN it represents, albeit only in a tightly controlled fashion. Using tokens to protect PANs becomes ineffectual if the tokenization system is breached, so securing the tokenization system itself is extremely important.


System operations, limitations and evolution

First-generation tokenization systems use a database to map from live data to surrogate substitute tokens and back. This requires the storage, management, and continuous backup of every new transaction added to the token database to avoid data loss. Another problem is ensuring consistency across data centers, which requires continuous synchronization of token databases. Significant consistency, availability and performance trade-offs, per the
CAP theorem
, are unavoidable with this approach. This overhead adds complexity to real-time transaction processing to avoid data loss and to assure data integrity across data centers, and also limits scale. Storing all sensitive data in one service creates an attractive target for attack and compromise, and introduces privacy and legal risk in the aggregation of data
, particularly in the EU. Another limitation of tokenization technologies is measuring the level of security of a given solution through independent validation. With the lack of standards, the latter is critical to establish the strength of tokenization offered when tokens are used for regulatory compliance. The PCI Council recommends independent vetting and validation of any claims of security and compliance: "Merchants considering the use of tokenization should perform a thorough evaluation and risk analysis to identify and document the unique characteristics of their particular implementation, including all interactions with payment card data and the particular tokenization systems and processes". The method of generating tokens may also have limitations from a security perspective. With concerns about security and attacks on
random number generators
, which are a common choice for the generation of tokens and token mapping tables, scrutiny must be applied to ensure proven and validated methods are used versus arbitrary design.
Random number generators
have limitations in terms of speed, entropy, seeding and bias, and security properties must be carefully analysed and measured to avoid predictability and compromise. With tokenization's increasing adoption, new tokenization technology approaches have emerged to remove such operational risks and complexities and to enable increased scale suited to emerging
big data
use cases and high-performance transaction processing, especially in financial services and banking. In addition to conventional tokenization methods, Protegrity provides additional security through its so-called "obfuscation layer." This creates a barrier that prevents not only regular users from accessing information they would not normally see, but also privileged users who have access, such as database administrators. Stateless tokenization enables random mapping of live data elements to surrogate values without needing a database, while retaining the isolation properties of tokenization. In November 2014,
American Express
released its token service which meets the EMV tokenization standard. Other notable examples of Tokenization-based payment systems, according to the EMVCo standard, include
Google Wallet
,
Apple Pay
,
Samsung Pay
, Microsoft Wallet,
Fitbit Pay
and
Garmin Pay
. Visa uses tokenization techniques to provide secure online and mobile shopping. Using blockchain, as opposed to relying on trusted third parties, it is possible to run highly accessible, tamper-resistant databases for transactions. With the help of blockchain, tokenization is the process of converting the value of a tangible or intangible asset into a token that can be exchanged on the network. This enables the tokenization of conventional financial assets, for instance, by transforming rights into a digital token backed by the asset itself using blockchain technology. Besides that, tokenization enables the simple and efficient compartmentalization and management of data across multiple users. Individual tokens created through tokenization can be used to split ownership and partially resell an asset. Consequently, only entities with the appropriate token can access the data. Numerous blockchain companies support asset tokenization. In 2019,
eToro
acquired Firmo and renamed it eToroX. Through its Token Management Suite, which is backed by USD-pegged stablecoins, eToroX enables asset tokenization. The tokenization of equity is facilitated by STOKR, a platform that links investors with small and medium-sized businesses. Tokens issued through the STOKR platform are legally recognized as transferable securities under European Union capital market regulations. Breakers enables tokenization of intellectual property, allowing content creators to issue their own digital tokens. Tokens can be distributed to a variety of project participants. Without intermediaries or a governing body, content creators can integrate reward-sharing features into the token.


Application to alternative payment systems

Building an alternate payments system requires a number of entities working together in order to deliver
near field communication
(NFC) or other technology-based payment services to end users. One of the issues is interoperability between the players; to resolve this, the role of a trusted service manager (TSM) is proposed to establish a technical link between
mobile network operators (MNOs) and providers of services, so that these entities can work together. Tokenization can play a role in mediating such services. The value of tokenization as a security strategy lies in the ability to replace a real card number with a surrogate (target removal) and the subsequent limitations placed on the surrogate card number (risk reduction). If the surrogate value can be used in an unlimited fashion, or even in a broadly applicable manner, the token gains as much value as the real credit card number. In these cases, the token may be secured by a second, dynamic token that is unique to each transaction and also associated with a specific payment card. Examples of dynamic, transaction-specific tokens include the cryptograms used in the EMV specification.


Application to PCI DSS standards

The
Payment Card Industry Data Security Standard
, an industry-wide set of guidelines that must be met by any organization that stores, processes, or transmits cardholder data, mandates that credit card data must be protected when stored. Tokenization, as applied to payment card data, is often implemented to meet this mandate, replacing credit card and ACH numbers in some systems with a random value or string of characters. Tokens can be formatted in a variety of ways. Some token service providers or tokenization products generate the surrogate values in such a way as to match the format of the original sensitive data. In the case of payment card data, a token might be the same length as a Primary Account Number (
bank card number
) and contain elements of the original data such as the last four digits of the card number. When a payment card authorization request is made to verify the legitimacy of a transaction, a token might be returned to the merchant instead of the card number, along with the authorization code for the transaction. The token is stored in the receiving system while the actual cardholder data is mapped to the token in a secure tokenization system. Storage of tokens and payment card data must comply with current PCI standards, including the use of strong cryptography.
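A format-matching surrogate of the kind described can be sketched as follows. Rejecting candidates that pass the Luhn check, so the token cannot be mistaken for (or collide with) a valid card number, is one common design choice among providers, not a mandate of the standard; the function names are ours:

```python
import secrets
import string

def luhn_ok(number):
    """Standard Luhn checksum used to validate card numbers."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:        # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def format_matching_token(pan):
    """Random-digit surrogate keeping the PAN's length and last four
    digits, regenerated if it happens to pass the Luhn check."""
    while True:
        body = "".join(secrets.choice(string.digits)
                       for _ in range(len(pan) - 4))
        token = body + pan[-4:]
        if not luhn_ok(token):
            return token

token = format_matching_token("4111111111111111")
```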


Standards (ANSI, the PCI Council, Visa, and EMV)

Tokenization is currently in standards definition in ANSI X9 as X9.119 Part 2. X9 is responsible for the industry standards for financial cryptography and data protection, including payment card PIN management, credit and debit card encryption, and related technologies and processes. The PCI Council has also stated support for tokenization in reducing risk in data breaches, when combined with other technologies such as Point-to-Point Encryption (P2PE) and assessments of compliance to PCI DSS guidelines. Visa Inc. released Visa Tokenization Best Practices for tokenization uses in credit and debit card handling applications and services. In March 2014, EMVCo LLC released its first payment tokenization specification for EMV. PCI DSS is the most frequently utilized standard for tokenization systems used by payment industry players.


Risk reduction

Tokenization can render it more difficult for attackers to gain access to sensitive data outside of the tokenization system or service. Implementation of tokenization may simplify the requirements of the PCI DSS, as systems that no longer store or process sensitive data may have a reduction of applicable controls required by the PCI DSS guidelines. As a security best practice, independent assessment and validation of any technologies used for data protection, including tokenization, must be in place to establish the security and strength of the method and implementation before any claims of privacy compliance, regulatory compliance, and data security can be made. This validation is particularly important in tokenization, as the tokens are shared externally in general use and thus exposed in high-risk, low-trust environments. The infeasibility of reversing a token or set of tokens to live sensitive data must be established using industry-accepted measurements and proofs by appropriate experts independent of the service or solution provider.


Restrictions on token use

Not all organizational data can be tokenized; it needs to be examined and filtered. When databases are utilized on a large scale, they expand exponentially, causing the search process to take longer, restricting system performance, and increasing backup overhead. A database that links sensitive information to tokens is called a vault. With the addition of new data, the vault's maintenance workload increases significantly. For ensuring database consistency, token databases need to be continuously synchronized. Apart from that, secure communication channels must be built between sensitive data and the vault so that data is not compromised on the way to or from storage.


See also

* Adaptive Redaction
* PAN truncation
* Format-preserving encryption


References


External links


Cloud vs Payment – Introduction to tokenization via cloud payments.