50.3 The ai module (10.0+)

The ai module lets you access the LLM integration of QF-Test from scripts.

 
 
String ask(String message, String configName, Double temperature, Double topP, Integer topK, Double frequencyPenalty, Double presencePenalty, Integer maxOutputTokens, List<String> stopSequences, Integer timeoutMs) throws TestException

Sends a message to a configured LLM and returns the response.

print(ai.ask("Are you there?", "my-llm"))

print(ai.ask(
    "Let's try something more complex",
    "my-llm",
    temperature=0.7,
    topK=4,
    topP=0.0,
    presencePenalty=1.0,
    frequencyPenalty=0.9,
    maxOutputTokens=1000,
    stopSequences=['stop!'],
    timeoutMs=10000,
))
Example 50.5:  Usage of ai.ask (Jython script)
Parameters
message The message to send to the LLM.
configName The name of an LLM configuration as set in LLM Configurations in the QF-Test options.
temperature Controls the randomness of the generated text.
topP Nucleus sampling: only the smallest set of most likely tokens whose cumulative probability reaches this value is considered.
topK Controls how many of the most likely tokens should be considered.
frequencyPenalty Penalizes tokens in proportion to how often they have already appeared, discouraging the LLM from repeating words.
presencePenalty Penalizes tokens that have already appeared at all, discouraging the LLM from reusing words.
maxOutputTokens Maximum number of tokens that can be generated in the response.
stopSequences Stops the LLM from generating further text once one of these strings appears in the output.
timeoutMs How long to wait for a response from the LLM, in milliseconds.
Returns The answer from the LLM as a string.
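Since ask declares TestException, scripts may want to guard the call. A minimal sketch (Jython), assuming a configuration named "my-llm" exists; the use of qf.logError alongside qf.logMessage is an assumption:

try:
    answer = ai.ask("Summarize the current test run.", "my-llm", timeoutMs=5000)
    qf.logMessage("LLM answered: " + answer)
except Exception as ex:
    # Also covers a TestException raised on timeout or configuration errors.
    qf.logError("LLM call failed: " + str(ex))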
 
void addCustomModel(final String name, final Object responseFunction)

Adds a custom model to the list of defined AI configurations.

if ("myLLM" not in ai.getConfigNames()) {
                ai.addCustomModel("myLLM", { msg ->
                qf.logMessage("Input for the LLM: " + msg)
                return "Unfortunately, I cannot help you."
                })
                }
Example 50.6:  Usage of ai.addCustomModel (Jython script)
Parameters
name The AI configuration name under which the custom model can be referenced.
responseFunction A function which receives the query: its first parameter is the request itself, an optional second parameter holds the request parameters (see the sketch below).
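The optional second parameter can be used to inspect the request parameters passed to ai.ask. A minimal sketch (Jython); the name "myParamLLM" and printing the parameters via str() are assumptions for illustration:

def respondWithParams(msg, params):
    # msg is the request itself; params holds the request parameters
    # (temperature, topK, ...) that were passed along with the query.
    qf.logMessage("Query: " + msg + ", parameters: " + str(params))
    return "Canned answer for testing."

ai.addCustomModel("myParamLLM", respondWithParams)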
 
void removeCustomModel(final String name)

Removes the custom model from the list of defined models.

Parameters
name The name under which the custom model was registered.
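For example (Jython), guarding against a missing entry via getConfigNames; the guard itself is just a suggested pattern:

if "myLLM" in ai.getConfigNames():
    ai.removeCustomModel("myLLM")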
 
List<String> getConfigNames()

Returns a list of all available AI configuration names.

Returns A list of all available AI configuration names.
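For instance, to log every configured name (Jython):

for name in ai.getConfigNames():
    qf.logMessage("Available AI configuration: " + name)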
 
void setDefaultConfig(String provider, String baseUrl, String apiKey, String modelName, String displayName="Default")

Defines a default LLM configuration to use instead of the first one configured in Options > Artificial Intelligence.

Parameters
provider Currently available provider types: OpenAIGeneric, Anthropic, Gemini and Ollama.
baseUrl The base API URL endpoint of the provider. Usually ends with /v1 or similar.
apiKey The API key to submit to the provider.
modelName The name of the model to use, like gpt-4o or gemini-2.0-flash.
displayName How QF-Test will identify the configuration in logs and error messages; defaults to Default.
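A minimal sketch (Jython) pointing the default configuration at a local Ollama server; the URL, the empty API key and the model name are placeholder assumptions, not recommendations:

ai.setDefaultConfig("Ollama", "http://localhost:11434/v1", "", "llama3", "Local Llama")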
 
void resetDefaultConfig()

Resets the default LLM configuration previously set via ai.setDefaultConfig to use the first one defined in Options > Artificial Intelligence.
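A common pattern is to set a temporary default and restore the configured behavior afterwards, e.g. in a try/finally block (Jython sketch, reusing the placeholder values from above):

ai.setDefaultConfig("Ollama", "http://localhost:11434/v1", "", "llama3", "Temporary default")
try:
    # ... AI-assisted steps run against the temporary default ...
    pass
finally:
    ai.resetDefaultConfig()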