Tokenized Data

Tokenization is the substitution of sensitive data with a non-sensitive equivalent, known as a token. The token maps back to the original sensitive data only through the tokenization system, making tokens practically impossible to reverse without access to that system. Many such systems generate tokens from random numbers so the token itself carries no information about the underlying value. Tokenization is often used to secure financial records, bank accounts, medical records, and many other forms of personally identifiable information (PII).
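The mapping described above can be sketched as a minimal in-memory token vault (a hypothetical illustration; production systems store the mapping in a hardened, access-controlled service):

```python
import secrets

class TokenVault:
    """Hypothetical tokenization vault: maps sensitive values to random tokens."""

    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the existing token so the same value always maps to one token.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # A random token carries no information about the original value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only the vault can reverse a token; elsewhere it is meaningless.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
original = vault.detokenize(token)
```

Because the token is generated randomly rather than derived from the value, reversing it without the vault's lookup table is not feasible.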

Related Terms

Data Classification

Data Classification is the process of categorizing data so that it can be handled more efficiently. The process applies a higher-level business classification to the data set itself, such as confidential, sensitive, or personally identifiable. This kind of data classification can help implement a data protection policy or other data governance rules.
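A simple way to picture this process is a rule-based tagger that assigns each field a classification level. The rules and labels below are hypothetical examples, not a standard scheme:

```python
import re

# Hypothetical classification rules: each label is paired with a pattern
# that suggests the corresponding sensitivity level.
RULES = [
    ("personally identifiable", re.compile(r"\b\d{3}-\d{2}-\d{4}\b")),  # SSN-like
    ("sensitive", re.compile(r"[^@\s]+@[^@\s]+\.[^@\s]+")),             # email-like
]

def classify(value: str) -> str:
    """Return the first matching classification label, else 'public'."""
    for label, pattern in RULES:
        if pattern.search(value):
            return label
    return "public"

record = {"name": "Ada", "ssn": "123-45-6789", "email": "ada@example.com"}
labels = {field: classify(value) for field, value in record.items()}
```

Once fields carry labels like these, a data protection policy can act on them, for example by restricting access to anything tagged "personally identifiable".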

Data Catalog

An organized inventory of an organization's data assets. Data catalogs use metadata to help organizations manage their data. They also help data professionals collect, organize, access, and enrich metadata to support data discovery and governance.
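A metadata-driven inventory like this can be sketched as follows (a minimal illustration with hypothetical class and field names, not a real catalog API):

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """Hypothetical catalog entry: a data asset described by metadata."""
    name: str
    owner: str
    tags: set = field(default_factory=set)

class Catalog:
    """Minimal data catalog: registers assets and supports tag-based discovery."""

    def __init__(self):
        self._assets = []

    def register(self, asset: Asset) -> None:
        self._assets.append(asset)

    def find_by_tag(self, tag: str) -> list:
        # Data discovery: look up assets by a metadata tag.
        return [a.name for a in self._assets if tag in a.tags]

catalog = Catalog()
catalog.register(Asset("orders", "sales-team", {"finance", "pii"}))
catalog.register(Asset("clickstream", "web-team", {"analytics"}))
pii_assets = catalog.find_by_tag("pii")
```

The key idea is that the catalog stores metadata about the data (name, owner, tags), not the data itself, which is what makes organization-wide discovery practical.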
