Boost Customer Service Efficiency: Using OpenAI Tools for Seamless Support
Are you looking to enhance your customer service and streamline support workflows? Discover how you can leverage OpenAI's tools to build a more efficient and controlled customer service experience. This guide focuses on using the Chat Completions endpoint with tool_choice='required' so that a tool is called on every turn, giving you deterministic outcomes and greater control over customer interactions.
Why Use Tools in Customer Service?
Integrating tools into your customer service workflow offers several key advantages:
- Consistency: Ensure specific support actions are always available.
- Control: Define clear exit points and manage conversations effectively.
- Efficiency: Automate tasks and gather necessary information quickly.
Setting Up Your Customer Service Agent
Here's how to configure your customer service agent using OpenAI's tools:
1. Define Your Tools
Clearly define the tools your customer service LLM (Large Language Model) will use. For example:
- speak_to_user: Sends messages to the customer, providing information or asking for case-related details.
- get_instructions: Retrieves specific instructions based on the customer's problem type (fraud, refund, or information).
tools = [
    {
        "type": "function",
        "function": {
            "name": "speak_to_user",
            "description": "Use this to speak to the user to give them information and to ask for anything required for their case.",
            "parameters": {
                "type": "object",
                "properties": {
                    "message": {
                        "type": "string",
                        "description": "Text of message to send to user. Can cover multiple topics.",
                    }
                },
                "required": ["message"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_instructions",
            "description": "Used to get instructions to deal with the user's problem.",
            "parameters": {
                "type": "object",
                "properties": {
                    "problem": {
                        "type": "string",
                        "enum": ["fraud", "refund", "information"],
                        "description": """The type of problem the customer has. Can be one of:
- fraud: Required to report and resolve fraud.
- refund: Required to submit a refund request.
- information: Used for any other informational queries.""",
                    }
                },
                "required": ["problem"],
            },
        },
    },
]
2. Provide Instructions for Common Issues
Create a set of instructions for your assistant to follow for different customer problems. For example, instructions for fraud reporting, refund requests, and general information queries.
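To make this concrete, here is a minimal sketch of how such instructions might be stored. The INSTRUCTIONS dictionary and its wording are illustrative placeholders rather than a prescribed format; the keys simply mirror the problem enum from the get_instructions tool above.

# Hypothetical canned instructions keyed by problem type.
# The wording below is illustrative, not an official template.
INSTRUCTIONS = {
    "fraud": (
        "1. Ask the customer what happened and when.\n"
        "2. Confirm the affected order or item.\n"
        "3. Report the incident to the security team, then offer a refund "
        "and ask for confirmation before processing it."
    ),
    "refund": (
        "1. Confirm the order and the reason for the refund.\n"
        "2. Ask the customer to confirm before submitting the request.\n"
        "3. Submit the refund and share the expected timeline."
    ),
    "information": (
        "1. Answer the customer's question clearly and concisely.\n"
        "2. Ask whether they need help with anything else."
    ),
}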
3. Craft a System Prompt
Develop a clear and concise system prompt to guide the LLM's behavior. The prompt should instruct the assistant to:
- Understand the customer's problem and fetch relevant instructions.
- Follow the instructions to resolve the issue, confirming actions with the customer.
- Offer further assistance or close the case.
assistant_system_prompt = """You are a customer service assistant. Your role is to answer user questions politely and competently.
You should follow these instructions to solve the case:
- Understand their problem and get the relevant instructions.
- Follow the instructions to solve the customer's problem. Get their confirmation before performing a permanent operation like a refund or similar.
- Help them with any other problems or close the case.

Only call a tool once in a single message.
If you need to fetch a piece of information from a system or document that you don't have access to, give a clear, confident answer with some dummy values."""
4. Implement the submit_user_message Function
This function handles the user's message and orchestrates tool calls, looping until the assistant produces a final response for the customer.
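A minimal sketch of this loop is shown below. It assumes the client and GPT_MODEL used in the API call later in this guide, the tools and assistant_system_prompt defined above, and the execute_function helper described in the next step; the structure is one possible implementation, not the only way to write it.

def submit_user_message(user_query, conversation_messages=None):
    """Handle one customer turn, executing tool calls until the
    assistant speaks to the user via speak_to_user."""
    if conversation_messages is None:
        conversation_messages = [{"role": "system", "content": assistant_system_prompt}]
    conversation_messages.append({"role": "user", "content": user_query})

    respond_to_user = False
    while not respond_to_user:
        response = client.chat.completions.create(
            model=GPT_MODEL,
            messages=conversation_messages,
            temperature=0,
            tools=tools,
            tool_choice='required',
        )
        assistant_message = response.choices[0].message
        conversation_messages.append(assistant_message)

        for tool_call in assistant_message.tool_calls:
            # execute_function (next step) runs the tool and returns its output.
            result = execute_function(tool_call)
            conversation_messages.append(
                {"role": "tool", "tool_call_id": tool_call.id, "content": result}
            )
            # speak_to_user is the exit point: the assistant has replied to the customer.
            if tool_call.function.name == "speak_to_user":
                respond_to_user = True

    return conversation_messages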
5. Implement the execute_function Function
This function executes the tool calls, retrieves relevant information, and formats responses for the user.
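A sketch under the same assumptions (including the hypothetical INSTRUCTIONS dictionary from step 2) might look like this:

import json

def execute_function(tool_call):
    """Run a single tool call and return the text to store in the tool message."""
    function_name = tool_call.function.name
    arguments = json.loads(tool_call.function.arguments)

    if function_name == "get_instructions":
        # Look up the canned instructions for the reported problem type.
        return INSTRUCTIONS.get(arguments["problem"], "No instructions found.")

    if function_name == "speak_to_user":
        # The message itself is the result; the loop in submit_user_message
        # surfaces it to the customer and stops iterating.
        return arguments["message"]

    return f"Unknown function: {function_name}"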
Using tool_choice='required' for Deterministic Tool Use
The tool_choice='required' parameter in the Chat Completions API call ensures that a tool is always used in each interaction. This is particularly useful in customer service scenarios where you want to guarantee that specific actions are taken.
response = client.chat.completions.create(
    model=GPT_MODEL,
    messages=messages,
    temperature=0,
    tools=tools,
    tool_choice='required',
)
Example: Handling a Fraudulent Activity Report
Let's walk through an example where a customer reports a stolen item:
- User: "Hi, I have had an item stolen that was supposed to be delivered to me yesterday."
- Assistant: Uses get_instructions to identify the problem type as "fraud."
- Assistant: Uses the "fraud" instructions to ask for details about the incident.
- Assistant: Sends a message to the user asking for more information about the fraudulent activity.
- User: Provides details about the stolen shirt.
- Assistant: Reports the incident to the security team (hypothetically) and arranges a refund, seeking confirmation.
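With the sketched helpers above in place, this whole exchange can be driven by a couple of calls (the order details below are made up for illustration):

# Open a new case with the customer's first message.
messages = submit_user_message(
    "Hi, I have had an item stolen that was supposed to be delivered to me yesterday."
)

# Continue the same conversation with the follow-up details.
messages = submit_user_message(
    "It was a blue shirt, and it never arrived even though tracking says it was delivered.",
    conversation_messages=messages,
)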
Evaluating the Customer Service Agent
To ensure your customer service agent performs as expected, consider using a GPT-based customer to simulate real interactions. This involves:
- Creating a customer_system_prompt that defines the customer's query and instructions.
- Developing an execute_conversation function that facilitates back-and-forth communication between the customer and the agent (sketched after this list).
- Using a set of predefined questions to test the agent's ability to resolve common issues like refunds or damaged goods.
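One possible wiring is sketched below. The customer_system_prompt wording, the DONE stop token, and the execute_conversation structure are assumptions for illustration; it reuses the submit_user_message sketch from step 4 and the same client and GPT_MODEL.

customer_system_prompt = """You are a customer of an online retailer.
Your problem is: {query}
Answer the agent's questions with plausible details, confirm when asked,
and reply with the single word DONE once your problem is resolved."""


def execute_conversation(customer_query, max_turns=5):
    """Simulate a back-and-forth between a GPT-played customer and the agent."""
    customer_messages = [
        {"role": "system", "content": customer_system_prompt.format(query=customer_query)}
    ]
    agent_messages = None
    agent_reply = "Hi, how can I help you today?"

    for _ in range(max_turns):
        # The simulated customer reacts to the agent's latest message.
        customer_messages.append({"role": "user", "content": agent_reply})
        customer_response = client.chat.completions.create(
            model=GPT_MODEL, messages=customer_messages, temperature=0.5
        )
        customer_reply = customer_response.choices[0].message.content
        customer_messages.append({"role": "assistant", "content": customer_reply})
        print(f"Customer: {customer_reply}")
        if "DONE" in customer_reply:
            break

        # The agent handles the customer's message through the tool-driven loop.
        agent_messages = submit_user_message(customer_reply, conversation_messages=agent_messages)
        agent_reply = agent_messages[-1]["content"]
        print(f"Agent: {agent_reply}")

You can then run it against a predefined test question, for example: execute_conversation("I want to return a damaged item I received last week.")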
Benefits of This Approach:
- Improved Customer Satisfaction: Providing quick and accurate responses.
- Reduced Support Costs: Automating routine tasks and streamlining workflows.
- Enhanced Control: Maintaining consistent and reliable customer interactions.
By implementing these strategies, you can unlock powerful customer service solutions with OpenAI's tools, leading to happier customers and more efficient operations.