textattack.constraints.grammaticality.language_models.learning_to_write package
“Learning To Write”
AdaptiveSoftmax
- class textattack.constraints.grammaticality.language_models.learning_to_write.adaptive_softmax.AdaptiveLoss(cutoffs)[source]
Bases: Module
- forward(inp, target)[source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- training: bool
- class textattack.constraints.grammaticality.language_models.learning_to_write.adaptive_softmax.AdaptiveSoftmax(input_size, cutoffs, scale_down=4)[source]
Bases: Module
- forward(inp)[source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- training: bool
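As an aside, the vocabulary-partitioning idea behind the two classes above also exists as a built-in PyTorch layer, torch.nn.AdaptiveLogSoftmaxWithLoss. The minimal sketch below uses that built-in layer rather than the AdaptiveSoftmax class documented here; the sizes and cutoffs are hypothetical, and div_value appears to play a role similar to scale_down.

    import torch
    import torch.nn as nn

    # Hypothetical sizes: frequent words go in the "head" cluster, rarer
    # words into progressively cheaper "tail" clusters.
    hidden_size = 256
    vocab_size = 50000
    cutoffs = [2000, 10000]  # head: ids 0-1999, tails: 2000-9999 and 10000+

    adaptive = nn.AdaptiveLogSoftmaxWithLoss(
        in_features=hidden_size,
        n_classes=vocab_size,
        cutoffs=cutoffs,
        div_value=4.0,  # shrinks the projection size for the tail clusters
    )

    hidden = torch.randn(8, hidden_size)          # a batch of RNN hidden states
    targets = torch.randint(0, vocab_size, (8,))  # next-word indices
    output, loss = adaptive(hidden, targets)      # per-example log-probs and mean NLL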
Language model helpers
- class textattack.constraints.grammaticality.language_models.learning_to_write.language_model_helpers.QueryHandler(model, word_to_idx, mapto, device)[source]
Bases: object
- query(sentences, swapped_words, batch_size=32)[source]
Since we don’t filter prefixes for OOV ahead of time, it’s possible that some of them will have different lengths. When this is the case, we can’t do RNN prediction in batch.
This method tries to do prediction in batch and, when that fails, falls back to sequential prediction and concatenates the results.
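The fallback strategy can be pictured with the following minimal sketch; predict_with_fallback is a hypothetical helper, not part of QueryHandler, and it only stands in for the batched-versus-sequential logic described above.

    import torch

    def predict_with_fallback(model, prefixes):
        """Hypothetical illustration: try batched prediction first, and fall
        back to one-prefix-at-a-time prediction when the prefixes cannot be
        stacked into a single tensor (e.g. because their lengths differ)."""
        try:
            batch = torch.stack(prefixes)   # fails if prefix lengths differ
            return model(batch)
        except RuntimeError:
            outputs = [model(p.unsqueeze(0)) for p in prefixes]
            return torch.cat(outputs, dim=0)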
- textattack.constraints.grammaticality.language_models.learning_to_write.language_model_helpers.util_reverse(item)[source]
“Learning To Write” Language Model
- class textattack.constraints.grammaticality.language_models.learning_to_write.learning_to_write.LearningToWriteLanguageModel(window_size=5, **kwargs)[source]
Bases: LanguageModelConstraint
A constraint based on the L2W language model.
The RNN-based language model from “Learning to Write With Cooperative Discriminators” (Holtzman et al., 2018).
https://arxiv.org/pdf/1805.06087.pdf
https://github.com/windweller/l2w
Reused by Jia et al., 2019, as a substitute for the Google 1-billion-word language model in a revised version of the attack of Alzantot et al., 2018.
https://worksheets.codalab.org/worksheets/0x79feda5f1998497db75422eca8fcd689
- get_log_probs_at_index(text_list, word_index)[source]
Gets the log probability of the word at index word_index according to the language model.
- CACHE_PATH = 'constraints/grammaticality/language-models/learning-to-write'
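A minimal usage sketch of this constraint follows. It assumes that keyword arguments such as max_log_prob_diff are forwarded through **kwargs to the parent LanguageModelConstraint; the threshold value shown is illustrative only.

    from textattack.constraints.grammaticality.language_models.learning_to_write.learning_to_write import (
        LearningToWriteLanguageModel,
    )

    # Reject word swaps that lower the L2W log-probability of the local
    # window by more than the given threshold (values are illustrative).
    constraint = LearningToWriteLanguageModel(
        window_size=6,
        max_log_prob_diff=5.0,
        compare_against_original=True,
    )
    # The constraint can then be appended to an attack's list of constraints.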
RNN Language Model
- class textattack.constraints.grammaticality.language_models.learning_to_write.rnn_model.RNNModel(rnn_type, ntoken, ninp, nhid, nlayers, cutoffs, proj=False, dropout=0.5, tie_weights=False, lm1b=False)[source]
Bases: Module
Container module with an encoder, a recurrent module, and a decoder.
Based on the official PyTorch examples; a rough sketch of this layout follows this listing.
- forward(input, hidden)[source]
Defines the computation performed at every call.
Should be overridden by all subclasses.
Note
Although the recipe for the forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this, since the former takes care of running the registered hooks while the latter silently ignores them.
- training: bool
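For reference, the encoder/recurrent-module/decoder layout of such a container looks roughly like the sketch below, which follows the official PyTorch word-language-model example; the class name is made up, and the plain linear decoder stands in for the adaptive softmax used by RNNModel.

    import torch.nn as nn

    class WordRNN(nn.Module):
        """Illustrative encoder -> RNN -> decoder container (hypothetical)."""

        def __init__(self, rnn_type, ntoken, ninp, nhid, nlayers, dropout=0.5):
            super().__init__()
            self.drop = nn.Dropout(dropout)
            self.encoder = nn.Embedding(ntoken, ninp)   # token ids -> embeddings
            self.rnn = getattr(nn, rnn_type)(ninp, nhid, nlayers, dropout=dropout)
            self.decoder = nn.Linear(nhid, ntoken)      # hidden states -> vocab logits

        def forward(self, input, hidden):
            emb = self.drop(self.encoder(input))
            output, hidden = self.rnn(emb, hidden)
            return self.decoder(self.drop(output)), hidden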