Toolkit for Machine Learning, Natural Language Processing, and Text Generation, in TensorFlow. This is part of the CASL project: http://casl-project.ai/
- Add pre-trained tokenizer modules (`BERTTokenizer`, `XLNetTokenizer`, etc.). (#225)
- Add GPT-2 modules (`GPT2Encoder`, `GPT2Decoder`, `GPT2Classifier`, etc.). (#228)
- Update `dropout_strategy=='item'` to support TensorFlow v1.15. (#231)
- Update `.gitignore` and add `.gitignore` files to all examples. (#233)
- Import the TensorFlow version of Texar with `import texar.tf as tx`. (#197)
- Add `texar.modules.BERTEncoder` (doc) and `texar.modules.BERTClassifier` (doc). (#167)
- Refactor `TransformerEncoder` and `TransformerDecoder` to separate position embeddings from the modules. (#126)
- Allow passing a Tensor to `output_layer` of decoders' constructors, used for weight tying between the output layer and the input embedding matrix. (#126)
- Make the `TransformerDecoder` constructor interface exactly the same as the RNN decoders' constructor interfaces. (#126)
- Refactor decoder `Helper`s to allow a two-argument `embedding_fn` (supporting position embedding). (#126)
- Refactor `SinusoidsPositionEmbedder` to enable infinitely large or negative position indexes. (#176)
- Fix `texar.losses.reduce_batch_time` when `sequence` has a dtype other than `tf.float32`. (#143)
- Fix `texar.losses.reduce_dimensions` when `average_axes` or `sum_axes` is an `int`. (#141)
- Fix the case when `translation_length` is 0. (#176)
- Fix `StochasticConnector` and `ReparameterizedStochasticConnector` when `transform=False`. (#179)
- `TFRecordData`: a new data module for reading and processing TFRecord data, with support for, e.g., image data and feature data. (#107)
- `GPT-2`: OpenAI pretrained language model. (#91, example)
- Add `TopKSampleEmbeddingHelper` to perform top-k random sample decoding. (baa09ff)
- Refactor the `BERT` example to use the `TFRecordData` data module.
- `TransformerDecoder` supports `helper` arguments to specify the decoding strategy. (#76)
- Fix bug in `examples/seqgan`. (#110)
- Fix error when calling `beam_search_decode` with `output_layer=tf.identity`. (#77)
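The idea behind passing a Tensor as `output_layer` (weight tying between the output layer and the input embedding matrix, #126) can be sketched in plain Python; this is a minimal illustration of the technique, not Texar's actual implementation, and all names below are illustrative:

```python
# Toy embedding matrix: vocab_size=4, hidden_dim=3.
embedding = [
    [0.1, 0.0, 0.2],   # token 0
    [0.0, 0.5, 0.0],   # token 1
    [0.3, 0.1, 0.0],   # token 2
    [0.0, 0.0, 0.4],   # token 3
]

def embed(token_id):
    """Input side: look up the token's embedding row."""
    return embedding[token_id]

def tied_output_logits(hidden):
    """Output side: logits = hidden @ E^T, reusing the same matrix
    instead of a separately learned vocab-sized projection."""
    return [sum(h * w for h, w in zip(hidden, row)) for row in embedding]

hidden_state = [1.0, 2.0, 3.0]
logits = tied_output_logits(hidden_state)
# One logit per vocabulary entry; no extra output weights were created.
```

Tying halves the number of vocab-sized parameter matrices, which is why decoders expose the embedding matrix as a valid `output_layer` argument.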
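`SinusoidsPositionEmbedder` (#176) refers to the standard sinusoid position encoding of Vaswani et al. (2017). A hedged sketch, using one common channel layout (Texar's exact layout may differ): since sin and cos are defined for any real input, arbitrarily large or negative position indexes are unproblematic by construction.

```python
import math

def sinusoid_position_embedding(position, dim, base=10000.0):
    """Even channels get sin(pos / base^(2i/dim)), odd channels get
    cos of the same angle. `dim` is assumed even here."""
    emb = []
    for i in range(dim // 2):
        angle = position / (base ** (2 * i / dim))
        emb.append(math.sin(angle))
        emb.append(math.cos(angle))
    return emb

vec = sinusoid_position_embedding(0, 8)
# At position 0, every sin channel is 0.0 and every cos channel is 1.0.
```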
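The strategy behind `TopKSampleEmbeddingHelper` (baa09ff) is top-k sample decoding: at each step, sample the next token only from the k highest-logit candidates. A stdlib-only sketch of that sampling step (not the actual Texar helper, which operates on TensorFlow tensors inside the decoding loop):

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token id from the k largest logits only."""
    # Indices of the k largest logits.
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    # Softmax over the surviving logits (shift by the max for stability).
    m = max(logits[i] for i in top)
    weights = [math.exp(logits[i] - m) for i in top]
    return rng.choices(top, weights=weights, k=1)[0]

logits = [0.1, 2.3, -1.0, 1.7, 0.4]
token = top_k_sample(logits, k=2)
# With k=2, only the two most likely ids (1 and 3) can ever be drawn.
```

Compared with greedy decoding (`k=1`), larger `k` trades determinism for diversity while still excluding the low-probability tail.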