[Paperlist] Awesome paper list of controllable text generation via latent auto-encoders. Contributions of any kind are welcome.
Papers on controllable text generation (CTG) via latent auto-encoders (AEs). The list mainly focuses on open-domain sentence generation, with some style-transfer methods included (dialogue generation is excluded for now).
Each entry follows the format:
Publication info / Paper title with link / TL;DR / Code link (if available) / Chinese blog link (if available)
Entropy (Wuhan Univ.) / A Transformer-Based Hierarchical Variational AutoEncoder Combined Hidden Markov Model for Long Text Generation / G2T, long controllable text (passage) generation; uses both word-level and sentence-level latent variables, and encodes the passage title as the latent prior for controllable passage generation. / N/A
arXiv (EPFL) / Bag-of-Vectors Autoencoders For Unsupervised Conditional Text Generation / G2T, style-transfer task. / N/A
EACL (Waterloo Univ.) / Polarized-VAE: Proximity Based Disentangled Representation Learning for Text Generation / G2T, style-transfer task; uses two separate encoders to capture sentence syntax and semantics, and adds a cosine-based proximity loss on the latent space to push apart dissimilar sentences (those with different labels). / Code
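The proximity idea above can be sketched as a contrastive hinge on cosine similarity. This is an illustrative NumPy version, not Polarized-VAE's exact loss; the margin value and the pairwise formulation are assumptions.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two latent vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def proximity_loss(z_a, z_b, same_label, margin=0.5):
    # Contrastive hinge on cosine similarity: pull same-label latents
    # together, push different-label latents below the margin.
    sim = cosine_sim(z_a, z_b)
    if same_label:
        return 1.0 - sim
    return max(0.0, sim - margin)
```

In a training loop this scalar would be added to the VAE objective for sampled sentence pairs, encouraging latents with the same label to cluster and those with different labels to separate.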
arXiv (Buffalo Univ.) / Transformer-based Conditional Variational Autoencoder for Controllable Story Generation / G2T, explores three ways of injecting the condition into a text VAE that uses GPT-2 as both the encoder (without causal mask) and the decoder. / Code / Chinese Blog
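Two generic ways of combining a latent code with decoder token embeddings are sketched below. These are illustrative NumPy functions, not the paper's exact variants (the paper compares three injection schemes; the names here are hypothetical).

```python
import numpy as np

def inject_latent_add(token_embs, z):
    # Variant A: add the latent code to every token embedding (broadcast
    # over the sequence dimension).
    return token_embs + z

def inject_latent_prefix(token_embs, z):
    # Variant B: prepend the latent code as a pseudo-token that the
    # decoder can attend to like any other position.
    return np.vstack([z[None, :], token_embs])
```

The addition variant keeps the sequence length fixed, while the prefix variant lengthens the sequence by one and lets attention decide how much to use the latent.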
arXiv (UCLA) / Latent Space Energy-Based Model of Symbol-Vector Coupling for Text Generation and Classification / G2T, models the latent prior with an energy-based model and approximates the posterior with variational Bayes; follows a paradigm similar to S-VAE for semi-supervised latent learning. / Code
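Sampling from an energy-based latent prior is commonly done with short-run Langevin dynamics. The sketch below is a generic sampler under that assumption; the step size, step count, and energy function are arbitrary illustrative choices, not the paper's settings.

```python
import numpy as np

def langevin_sample(grad_energy, z0, step=0.1, n_steps=50, seed=0):
    # Short-run Langevin dynamics targeting p(z) proportional to exp(-E(z)).
    # Update rule: z <- z - (step / 2) * dE/dz + sqrt(step) * noise.
    rng = np.random.default_rng(seed)
    z = np.array(z0, dtype=float)
    for _ in range(n_steps):
        z = z - 0.5 * step * grad_energy(z) + np.sqrt(step) * rng.standard_normal(z.shape)
    return z
```

For example, with the quadratic energy E(z) = ||z||^2 / 2 (gradient is z itself), the chain drifts from its initialization toward samples from a standard normal prior.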