This feature is experimental and subject to change. If you encounter any issues while using this functionality, please report them in our bug tracking system.
Creates a new AI session for interaction with Large Language Models (LLMs).
The session maintains conversation history and context between interactions.
Each session can be configured with different parameters to optimize for specific use cases like code analysis, content generation, or data processing.
Common usage patterns:
- Exception analysis and debugging assistance
- Code documentation generation
- Query optimization suggestions
- Security review assistance
- Performance optimization recommendations
The session persists until explicitly terminated or the application restarts.
LuceeCreateAISession( name=string, systemMessage=string, limit=numeric, temperature=numeric );
Arguments

name (string, required)
Specifies which AI endpoint configuration to use. Can be provided in two formats:
- Direct endpoint name: the name of an endpoint as defined in the Lucee Administrator (similar to how datasource names work)
- Default reference: the format "default:category" uses the endpoint configured as the default for that specific category in the Lucee Administrator.
Currently supported default categories:
- exception: for exception analysis
- documentation: for documentation tasks
The endpoint configurations and their default category assignments are managed in the Lucee Administrator.
Alias: aiName, nameAI
Default: none

systemMessage (string, optional)
Initial instruction set that defines the AI's behavior and expertise for this session. This message sets the context and rules for all subsequent interactions.
Best practices:
- Be specific about the AI's role and expertise
- Define output format requirements clearly
- Include any necessary constraints or rules
- Specify error handling preferences
- Define response structure expectations
The system message persists throughout the session and influences all responses.
Alias: initalMesssage, message
Default: none

limit (numeric, optional)
Maximum number of question-answer pairs to keep in the conversation history. Once the limit is reached, the oldest messages are removed.
Consider:
- Higher limits provide more context but consume more memory
- Lower limits are more memory efficient but may lose important context
- For complex analysis tasks, consider limits of 10-20
- For simple Q&A, limits of 5-10 may suffice
Alias: conversationHistoryLimit
Default: 50

temperature (numeric, optional)
Controls response randomness (0.0 to 1.0). Lower values make responses more focused and deterministic; higher values make them more creative and varied.
Recommended settings:
- 0.0-0.3: technical analysis, debugging, code review
- 0.3-0.5: documentation generation, error explanations
- 0.5-0.7: general-purpose interactions
- 0.7-1.0: creative content generation
For exception analysis and debugging, lower temperatures (0.2-0.3) are recommended for more consistent and precise responses.
Default: 0.7
Examples
There are currently no examples for this function
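In the meantime, a minimal hedged sketch of basic usage. The endpoint name "myOpenAI" is illustrative; it must match an endpoint configured in the Lucee Administrator. Note that `session` is a reserved scope in CFML, so the result is stored under a different variable name.

```cfml
<cfscript>
// Create a session against a named endpoint configured in the Lucee Administrator
aiSession = LuceeCreateAISession(
    name = "myOpenAI",
    systemMessage = "You are an expert in CFML and the Lucee engine.",
    limit = 10,
    temperature = 0.3
);
</cfscript>
```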
See also