# AI Prompt - Single-shot LLM Classification/Prompting
Note that this feature may have licensing implications. If you are unsure whether you have full access, please discuss with your account manager.
## Overview
Release 1.9 introduces a structured API and Model for performing simple single-shot classification/prompting using state-of-the-art LLM models.
Examples where these types of models can provide significant benefits:
- Zero-knowledge classification - Typically, classification engines (such as Sofi) require data and examples before they can provide any form of classification. LLMs are excellent at zero-knowledge classification based on their understanding of their base training data (typically billions of documents across many languages). See the example after this list.
- Summarisation - LLMs are excellent at summarising large blocks of text or structured data. Single-shot prompting could be used to:
  - provide a summary of all the interactions on a call
  - summarise a large request into a shorter by-line or Short description
  - provide a concise summary of a knowledge article
- Prioritisation/sentiment analysis - LLMs are good at understanding urgency and sentiment in correspondence.
- Knowledge generation - LLMs can take the information provided in Problem records and produce a starting point for a knowledge article.
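For illustration only, a hypothetical zero-knowledge classification prompt pair (not shipped content) could be as simple as:

```
System prompt:
You are a service desk triage assistant. Classify each request into exactly one
of the following categories: Hardware, Software, Access, Network, Other.
Respond with the category name only.

User prompt:
My laptop will not connect to the office Wi-Fi since this morning's update.
```

With no training data or examples, a capable LLM will typically answer ‘Network’.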
## AI Prompts
AI Prompts allow you to design and test single-shot classification prompts.
### Fields
Field | Description |
---|---|
Name | Display value for the AI Prompt |
Key | Lookup value for the AI Prompt. Used from the AI Prompt API to initiate the Prompt from an action. |
LLM Model | Link to an LLM Model configuration. Contains the information on the provider, model, and authentication tokens to be used. Can be changed to quickly test the results from different providers and models. |
Description | Description/documentation for the Prompt |
User prompt | The main user/action prompt presented to the LLM. This is where the actions you are asking the LLM to take are laid out and explained. This field is a Template, allowing you to include information from associated records. You can use the ‘Pre-processing Script’ to retrieve the information required by the template. |
System prompt | The initial input or instruction given to the model before it generates a response. The system prompt sets the direction and parameters for the model's output, and is typically where you give the model instructions on how it should behave and respond. |
Pre-processing Script | Provides the ability to retrieve information from Servicely and include it in the request to the LLM. This could include information from related records, classifications, groups, etc. The script runs with a Script Context (see the sketch below this table). |
Post-processing Script | Provides the ability to process the response from the LLM, within a Script Context, before it is returned to the AI Prompt API call. |
Test Setup Script | Provides a mechanism to specify context for the ‘Test’ buttons on the AI Prompt form. The ‘Test prompts only’ and ‘Test Model’ buttons allow you to rapidly iterate on the authoring of the Prompt, and the ‘Test Setup Script’ lets you set up the test environment for them. |
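The Pre-processing Script, User prompt template, and Post-processing Script work together: the script gathers data, the template interpolates it, and the post-processing script shapes the response. The following is a minimal sketch only; the Script Context variable names (`current`, `context`, `response`) and the `${...}` template syntax are assumptions for illustration, not confirmed Servicely API.

```javascript
// Hypothetical Pre-processing Script: gather values for the User prompt template.
// 'current' (the record in scope) and 'context' (values exposed to the template)
// are assumed names - check the Script Context for the actual variables.
context.shortDescription = current.short_description;
context.description = current.description;
```

The User prompt template could then reference those values, for example: `Summarise the following request into a one-line Short description: ${shortDescription} - ${description}`. A Post-processing Script can then tidy the raw model output before it is handed back to the API call:

```javascript
// Hypothetical Post-processing Script: normalise the raw LLM response.
// 'response' is an assumed variable holding the model's text output.
var summary = String(response).trim();
// Guard against over-long answers before returning the value to the API call.
return summary.length > 160 ? summary.substring(0, 157) + '...' : summary;
```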
## Testing
### ‘Test prompts only’ button
The ‘Test prompts only’ button allows you to test the System and User Prompts before sending them to the LLM. This includes loading any content retrieved by the Pre-processing Script. The text for the prompts is displayed and can be copied to the clipboard using the ‘Copy’ button.
### ‘Test Model’ button
The ‘Test Model’ button formats the request, sends it to the LLM, and then displays the response along with information on the model, latency, token usage, and cost.
## System LLM Models
The System LLM Models table contains information on the LLM Models available for use. The initial 1.9 release supports OpenAI and Anthropic and provides the current set of available models from those providers. Support for more providers will be added in upcoming releases.
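For illustration only, the actual schema of the System LLM Models table is not shown here, so the field names below are assumptions; conceptually, a model configuration carries something like:

```javascript
// Hypothetical shape of an LLM Model configuration - field names are
// illustrative, not the actual System LLM Models schema.
var llmModel = {
    provider: 'Anthropic',            // OpenAI and Anthropic are supported in 1.9
    model: 'claude-sonnet',           // model identifier at the provider (illustrative)
    apiKey: '***stored credential***' // authentication token used when calling the provider
};
```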
## System LLM Usage
The System LLM Usage table tracks model, provider, token, and cost usage across the LLM providers.
## Initiating from the API
Function | Description |
---|---|
 | Executes the SystemAiPrompt and returns the value returned by the Post-processing Script.<br>aiPromptKey - The unique ‘Key’ field of the AI Prompt.<br>context - A set of key/value pairs containing any context to be passed to the Pre-processing Script. |
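As a hedged sketch, initiating a Prompt from a script might look like the following; `executeAiPrompt` is a hypothetical stand-in, since the actual function name is not shown in the table above:

```javascript
// Hypothetical call - 'executeAiPrompt' stands in for the actual API function,
// and 'current.id' is an assumed way to reference the record in scope.
var result = executeAiPrompt('summarise_request', { // aiPromptKey: the AI Prompt's ‘Key’ field
    recordId: current.id                            // context passed to the Pre-processing Script
});
// 'result' holds whatever the Post-processing Script returned.
```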