arXiv Analytics

arXiv:2209.15421 [cs.LG]

TabDDPM: Modelling Tabular Data with Diffusion Models

Akim Kotelnikov, Dmitry Baranchuk, Ivan Rubachev, Artem Babenko

Published 2022-09-30 (Version 1)

Denoising diffusion probabilistic models are currently becoming the leading paradigm of generative modeling for many important data modalities. While most prevalent in the computer vision community, diffusion models have also recently gained attention in other domains, including speech, NLP, and graph-structured data. In this work, we investigate whether the diffusion framework can be advantageous for general tabular problems, where datapoints are typically represented by vectors of heterogeneous features. The inherent heterogeneity of tabular data makes accurate modeling quite challenging, since individual features can be of completely different natures, e.g., some continuous and some discrete. To address such data, we introduce TabDDPM -- a diffusion model that can be universally applied to any tabular dataset and handles any feature type. We extensively evaluate TabDDPM on a wide set of benchmarks and demonstrate its superiority over existing GAN/VAE alternatives, which is consistent with the advantage of diffusion models in other fields. Additionally, we show that TabDDPM is suitable for privacy-oriented setups, where the original datapoints cannot be publicly shared.
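To make the diffusion framework concrete for continuous tabular features, the sketch below implements the standard closed-form Gaussian forward (noising) process from DDPM, x_t = sqrt(ᾱ_t)·x_0 + sqrt(1 − ᾱ_t)·ε, applied to the numerical columns of a toy table. This is a minimal illustration of the generic DDPM forward step, not the authors' implementation; TabDDPM additionally handles categorical features with a separate (multinomial) diffusion process, which is omitted here. All function and variable names are illustrative.

```python
import numpy as np

def gaussian_forward_diffuse(x0, t, betas, seed=None):
    """Sample x_t ~ q(x_t | x_0) for continuous features via the
    closed-form DDPM forward process:
        x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps,
    where alpha_bar_t is the cumulative product of (1 - beta_s)."""
    rng = np.random.default_rng(seed)
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)[t]        # signal retained at step t
    eps = rng.standard_normal(x0.shape)      # Gaussian noise, same shape
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

# Toy table: 4 rows, 2 standardized numerical features.
x0 = np.array([[0.5, -1.2],
               [1.0,  0.3],
               [-0.7, 2.1],
               [0.0,  0.0]])
betas = np.linspace(1e-4, 0.02, 1000)        # common linear noise schedule

x_early = gaussian_forward_diffuse(x0, t=0, betas=betas, seed=0)    # barely noised
x_late = gaussian_forward_diffuse(x0, t=999, betas=betas, seed=0)   # near pure noise
```

At small t the sample stays close to the original rows; by the final step the signal is almost entirely destroyed and x_t is approximately standard Gaussian, which is what lets the learned reverse process start generation from pure noise.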
