
The Fifth Elephant 2024 Annual Conference (12th & 13th July)

Maximising the Potential of Data — Discussions around data science, machine learning & AI

Shadab

@shadab

Design Patterns for Data Masking and Tokenization

Submitted Jun 5, 2024

Outline

In the era of big data, ensuring the privacy and security of sensitive information is more crucial than ever.

This talk will cover fundamental concepts, including the various types of data masking (static, dynamic, deterministic, and non-deterministic) and tokenization (format-preserving and non-format-preserving).
I will present detailed design patterns and best practices, exploring their implementation in different big data environments.
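
To make these categories concrete, below is a minimal Python sketch (illustrative only, not taken from the talk; the names and key handling are assumptions) contrasting deterministic masking, which maps the same input to the same pseudonym so joins still work, with non-deterministic masking, which returns a fresh random value each time.

import hashlib
import hmac
import secrets

# Hypothetical key; in practice it would come from a secrets manager, not source code.
MASKING_KEY = b"replace-with-a-managed-secret"

def mask_email_deterministic(email: str) -> str:
    # Same input + same key => same pseudonym, so referential integrity survives.
    digest = hmac.new(MASKING_KEY, email.lower().encode(), hashlib.sha256).hexdigest()
    return "user_" + digest[:12] + "@masked.example"

def mask_email_random(email: str) -> str:
    # A fresh random value on every call; joins and deduplication are no longer possible.
    return "user_" + secrets.token_hex(6) + "@masked.example"

print(mask_email_deterministic("alice@example.com"))  # stable across runs for a fixed key
print(mask_email_random("alice@example.com"))         # different on every call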

This talk gives data platform engineers practical insights for securing confidential and critical information throughout its lifecycle.

Who is the audience for your session?

  • Data engineers, scientists, and analysts responsible for designing and building data pipelines requiring data security/obfuscation
  • Decision-makers interested in understanding emerging trends and best practices for data privacy and security
  • Security engineers and architects seeking data protection techniques

What problem/pain are you trying to solve (for the audience)?

  • Data engineers in most organizations, while working with data, struggle to balance data security and privacy with functionality.
  • The session will help them understand different challenges while securing sensitive data in big data environments.
    • I will talk about which techniques fit which use cases, so that the approach stays scalable and extensible
    • I will talk about how they can ensure forward and backward compatibility
  • This session aims to equip data engineers with design patterns for data masking and tokenization, enabling them to anonymize sensitive data effectively while preserving its utility for analytics and machine learning (a minimal sketch follows this list)
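
To illustrate the point about preserving utility, here is a small, self-contained sketch (my own illustration, not the session's reference design; the class and method names are hypothetical) of a vault-backed, format-preserving tokenizer that keeps the 16-digit shape and the last four digits of a card number, so downstream format checks and partial-match analytics keep working.

import secrets

class TokenVault:
    """Illustrative in-memory vault; a real one would be a hardened, access-controlled store."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}
        self._value_to_token: dict[str, str] = {}

    def tokenize_card(self, pan: str) -> str:
        # Deterministic per value: the same PAN always maps to the same token.
        if pan in self._value_to_token:
            return self._value_to_token[pan]
        # Format-preserving: 12 random digits plus the original last 4, still 16 digits.
        token = "".join(str(secrets.randbelow(10)) for _ in range(12)) + pan[-4:]
        self._token_to_value[token] = pan
        self._value_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        # Reverse lookup; in practice this path is restricted and audited.
        return self._token_to_value[token]

vault = TokenVault()
token = vault.tokenize_card("4111111111111111")
print(token)                    # a 16-digit token ending in 1111
print(vault.detokenize(token))  # original PAN, only for authorized callers

The fact that detokenization is possible at all is one of the distinctions the outline draws between tokenization and masking.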

What will be the scope of your session?

  • The session will cover fundamental concepts, common challenges and solutions
  • I will delve into the concepts of data masking and tokenization, differentiating between them and exploring their use cases
  • I will discuss security considerations to be aware of when implementing masking and tokenization, including re-identification attacks, token vault security, and key management (a brief sketch follows this list)
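
As one hedged illustration of the key-management point (an assumption on my part, not the talk's prescribed approach), vault contents can be encrypted at rest under a key held outside the data store; the sketch below uses the cryptography package's Fernet as a stand-in for a KMS-managed key.

from cryptography.fernet import Fernet  # third-party package: pip install cryptography

# Stand-in for a key fetched from a KMS/HSM and rotated on a schedule.
key = Fernet.generate_key()
cipher = Fernet(key)

# Store only ciphertext in the vault's backing store.
ciphertext = cipher.encrypt(b"4111111111111111")
print(ciphertext)

# Detokenization decrypts under controlled, audited access; losing the key
# (or failing to rotate it safely) is the main operational risk to plan for.
print(cipher.decrypt(ciphertext).decode())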

How will participants benefit from your session?

  • Understand the nuances of various data masking and tokenization patterns
  • Gain a solid understanding of data masking and tokenization, and learn practical design patterns and implementation techniques
  • Understand common pitfalls and how to avoid them

