AI / LLM Integration
The AI support feature is experimental and its function API is subject to change.
Functions
- LuceeAIgetMetadata() Retrieves metadata about a configured AI endpoint. Returns information about the endpoint's configuration and, optionally, detailed information about the available models and associated files.
- LuceeAIhas() Checks whether a specific AI endpoint is configured and available for use.
- LuceeCreateAIsession() Creates a new AI session for interacting with Large Language Models (LLMs). The session maintains conversation history and context between interactions.
- LuceeInquiryAIsession() Sends a question or message to an AI session and returns the response. The function maintains conversation context from previous interactions within the same session (see the usage sketch after this list).
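The sketch below combines these functions into a short conversation. The endpoint name `mychatgpt` is illustrative, and the argument lists shown are assumptions based on the descriptions above; check the individual function pages for the authoritative signatures.

```cfml
// Guard on availability of the AI endpoint before using it.
// "mychatgpt" is an illustrative endpoint name, not a default.
if ( LuceeAIHas( "mychatgpt" ) ) {
    // Inspect how the endpoint is configured (optionally including model details)
    meta = LuceeAIGetMetadata( "mychatgpt" );
    dump( meta );

    // Create a session; it keeps conversation history between inquiries
    aiSession = LuceeCreateAISession( "mychatgpt" );

    // Ask a question; a follow-up in the same session retains the earlier context
    answer = LuceeInquiryAISession( aiSession, "What is Lucee?" );
    echo( answer );

    followUp = LuceeInquiryAISession( aiSession, "Which scripting dialects does it support?" );
    echo( followUp );
}
```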
Guides
- AI (Experimental) Lucee 6.2 includes experimental support for AI integration, which will be finalized with the release of Lucee 7. This documentation is subject to change, reflecting Lucee's aim to remain adaptable to future advancements. Feedback is welcome to help tailor functionality to users' needs.
- AI Integration for Documentation (Experimental) Guide to configuring AI for use in Lucee's Documentation tab, leveraging retrieval-augmented generation (RAG) and enhanced search functionality.