LlmInference

public class LlmInference

LlmInference Task Java API

Nested Classes

class LlmInference.LlmInferenceOptions
Options for setting up an LlmInference.

Public Methods

void close()
Closes and cleans up the LlmInference.

static LlmInference createFromOptions(Context context, LlmInference.LlmInferenceOptions options)
Creates an LlmInference Task.

String generateResponse(String inputText)
Generates a response based on the input text.

void generateResponseAsync(String inputText)
Asynchronously generates a response based on the input text.

int sizeInTokens(String text)
Runs only the tokenization for the LLM and returns the size (in tokens) of the result.

Public Methods

public void close ()

Closes and cleans up the LlmInference.
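Because close() releases the underlying native resources, it should run even when generation throws. A minimal sketch, assuming the com.google.mediapipe.tasks.genai.llminference package (the helper class and method names here are illustrative, not part of the API):

```java
import com.google.mediapipe.tasks.genai.llminference.LlmInference;

class Cleanup {
  // Illustrative helper: run one query, then always release native resources.
  static String generateAndClose(LlmInference llmInference, String prompt) {
    try {
      return llmInference.generateResponse(prompt);
    } finally {
      llmInference.close();
    }
  }
}
```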

public static LlmInference createFromOptions (Context context, LlmInference.LlmInferenceOptions options)

Creates an LlmInference Task.

Parameters
context an Android Context.
options the options used to configure the LlmInference task.
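A minimal setup sketch, assuming the com.google.mediapipe.tasks.genai.llminference package; the model path and the setMaxTokens value are placeholder assumptions, not prescribed values:

```java
import android.content.Context;
import com.google.mediapipe.tasks.genai.llminference.LlmInference;

class LlmSetup {
  // Placeholder path; push your own converted on-device model to the device.
  private static final String MODEL_PATH = "/data/local/tmp/llm/model.task";

  static LlmInference create(Context context) {
    LlmInference.LlmInferenceOptions options =
        LlmInference.LlmInferenceOptions.builder()
            .setModelPath(MODEL_PATH) // required: path to the on-device model file
            .setMaxTokens(512)        // assumed example value for the token budget
            .build();
    return LlmInference.createFromOptions(context, options);
  }
}
```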

public String generateResponse (String inputText)

Generates a response based on the input text.

Parameters
inputText a String for processing.
Throws
an exception if the inference fails.
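generateResponse blocks the calling thread until the full response is available, so in an app it belongs off the UI thread. A hedged sketch, assuming an already-created LlmInference instance:

```java
import com.google.mediapipe.tasks.genai.llminference.LlmInference;

class SyncGeneration {
  // Blocks until the complete response is generated; call from a worker thread.
  static String ask(LlmInference llmInference) {
    return llmInference.generateResponse("Summarize on-device LLM inference in one sentence.");
  }
}
```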

public void generateResponseAsync (String inputText)

Asynchronously generates a response based on the input text. Partial results are delivered through the result listener set in LlmInference.LlmInferenceOptions.

Parameters
inputText a String for processing.
Throws
an exception if the inference fails.
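A streaming sketch, assuming the com.google.mediapipe.tasks.genai.llminference package and a listener with the (partialResult, done) shape; the model path is a placeholder:

```java
import android.content.Context;
import com.google.mediapipe.tasks.genai.llminference.LlmInference;

class AsyncGeneration {
  static void stream(Context context, StringBuilder output) {
    LlmInference.LlmInferenceOptions options =
        LlmInference.LlmInferenceOptions.builder()
            .setModelPath("/data/local/tmp/llm/model.task") // placeholder path
            // The listener receives each partial result; `done` is true on the last chunk.
            .setResultListener((partialResult, done) -> {
              output.append(partialResult);
            })
            .build();
    LlmInference llmInference = LlmInference.createFromOptions(context, options);
    // Returns immediately; results arrive on the listener above.
    llmInference.generateResponseAsync("List three uses of on-device LLMs.");
  }
}
```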

public int sizeInTokens (String text)

Runs only the tokenization for the LLM and returns the size (in tokens) of the result. Cannot be called while a generateResponse(String) query is active.

Parameters
text The text to tokenize.
Returns
  • The number of tokens in the resulting tokenization of the text.
Throws
if the tokenization fails.
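One common use of sizeInTokens is checking a prompt against a token budget before generating. A sketch; the 512 budget is an assumed example value, not an API constant:

```java
import com.google.mediapipe.tasks.genai.llminference.LlmInference;

class TokenBudget {
  // Assumed example budget; match it to the max tokens configured in the options.
  private static final int MAX_TOKENS = 512;

  // Tokenizes only; must not run while a generateResponse(String) query is active.
  static boolean fits(LlmInference llmInference, String prompt) {
    return llmInference.sizeInTokens(prompt) <= MAX_TOKENS;
  }
}
```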