# ezpz.examples.generate
Interactive text generation loop for Hugging Face causal language models.
Launch with:

```bash
python3 -m ezpz.examples.generate
```
Run `python3 -m ezpz.examples.generate --help` for the full help output.
## main()
Load a model and enter an interactive text generation REPL.
Source code in src/ezpz/examples/generate.py
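The interactive loop described above can be sketched as follows. This is an illustrative structure only, not the actual source: the function name `generation_repl`, the `quit`/`exit` sentinels, and the injectable `read`/`write` callables are all assumptions.

```python
def generation_repl(generate, read=input, write=print):
    """Read prompts until 'quit'/'exit', emitting one completion per prompt.

    `generate` is any callable mapping a prompt string to generated text,
    e.g. a closure over prompt_model(model, tokenizer, ...).
    """
    while True:
        try:
            prompt = read(">>> ")
        except EOFError:
            # Ctrl-D ends the session cleanly.
            break
        if prompt.strip().lower() in {"quit", "exit"}:
            break
        write(generate(prompt))
```

Injecting `read` and `write` keeps the loop testable without a terminal; in normal use the defaults (`input`/`print`) apply.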
## parse_args()
## prompt_model(model, tokenizer, prompt, max_length=64, **kwargs)
Generate text using a model and tokenizer.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `model` | `AutoModelForCausalLM` | Causal LM used for generation. | *required* |
| `tokenizer` | `AutoTokenizer` | Tokenizer that encodes/decodes text. | *required* |
| `prompt` | `str` | Input prompt to seed generation. | *required* |
| `max_length` | `int` | Maximum number of tokens to generate. | `64` |
| `**kwargs` | `object` | Extra parameters forwarded to the underlying generation call. | `{}` |
Returns:

| Type | Description |
|---|---|
| `str` | Decoded text returned by the model. |
Examples:
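A hedged sketch of the likely shape of `prompt_model`, based on the documented signature and the standard Hugging Face encode/generate/decode pattern; the exact `generate()` arguments and decoding options are assumptions, not taken from the source.

```python
def prompt_model(model, tokenizer, prompt, max_length=64, **kwargs):
    """Generate text from `prompt` using a causal LM and its tokenizer.

    Sketch only: encodes the prompt, calls model.generate, and decodes
    the first returned sequence. Extra kwargs are forwarded to generate.
    """
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_length=max_length, **kwargs)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

With real `transformers` objects this would be called as, e.g., `prompt_model(model, tokenizer, "Once upon a time", max_length=64)`.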