Creating Queries
Queries are how you interact with models, tools, agents and teams in ARK. A query sends input to any target, or set of targets, and receives a response. You can create queries using kubectl, the fark CLI, the ARK APIs, or our OpenAI-compatible endpoints.
What is a Query?
A query is a request to execute a task using an agent or team. It contains:
- Input: The prompt or question you want to ask
- Target: Which agent or team should process the request
- Output: The response from the agent or team
User Messages
If a type is not specified, the input of a query is a single user message:
apiVersion: ark.mckinsey.com/v1alpha1
kind: Query
metadata:
  name: query-simple
spec:
  input: "What is the capital of France"
  targets:
    - type: agent
      name: sample-agent
Apply this sample query with:
# Create the 'sample agent' if you haven't already, then run the query.
kubectl apply -f samples/agents/sample-agent.yaml
kubectl apply -f samples/queries/query-simple.yaml
Check the result:
kubectl get query query-simple -o yaml
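The response is written back to the Query resource once processing completes, so it may take a moment to appear. To follow progress instead of re-running the command, you can watch the resource with standard kubectl flags (nothing ARK-specific is assumed here):
kubectl get query query-simple -w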
Structured Conversations
The messages query type can be used to input a structured conversation (an array of messages):
apiVersion: ark.mckinsey.com/v1alpha1
kind: Query
metadata:
  name: conversation-messages
  namespace: default
spec:
  type: messages
  input:
    - role: user
      content: "Calculate 1+1."
    - role: assistant
      content: "2"
    - role: user
      content: "Calculate 2+2."
    - role: assistant
      content: "4"
    - role: user
      content: "What is the sum of all previous sums?"
  targets:
    - type: agent
      name: sample-agent
Apply this query using kubectl:
# Create the 'sample agent' if you haven't already, then run the query.
kubectl apply -f samples/agents/sample-agent.yaml
kubectl apply -f samples/queries/query-conversation-messages.yaml
Check the result:
kubectl get query conversation-messages -o yaml
The messages input type lets you build up conversation context by adding each query's response to an OpenAI-style array of message objects.
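For example, you could continue the conversation above by taking the assistant's reply from the completed query and adding it, along with your next question, to the input of a new Query. This is only a sketch; the query name and the assistant content shown are illustrative, not values produced by ARK:
apiVersion: ark.mckinsey.com/v1alpha1
kind: Query
metadata:
  name: conversation-follow-up   # illustrative name
spec:
  type: messages
  input:
    - role: user
      content: "What is the sum of all previous sums?"
    - role: assistant
      content: "6"   # reply copied from the previous query's result (illustrative)
    - role: user
      content: "Double that result."
  targets:
    - type: agent
      name: sample-agent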
Notes:
- type is optional; if omitted, it defaults to user (string input).
- For type: messages, input must be an array of objects with role and content fields.
- Supported roles: user, assistant, system, and provider-specific roles.
Using the fark CLI
Query an agent directly:
fark agent weather-agent "What's the weather like in New York?"
Query a team:
fark team team-seq "Analyze this data and provide recommendations"
Create a named query:
fark query my-query
Using OpenAI-Compatible Endpoints
The OpenAI-compatible API lets you use familiar tools and libraries to interact with your agents and teams.
Explore the Ark APIs docs to see how to run the APIs in detail.
List Available Targets
You can use the OpenAI List Models API to show all available query targets. This will show entries like tool/get_weather, model/claude-4-opus, agent/weather-reporter, or team/coding-team.
See all available agents, teams, models, and tools:
curl http://ark-api.default.127.0.0.1.nip.io:8080/openai/v1/models
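The same listing is available through any OpenAI client pointed at the ARK endpoint. Here is a minimal sketch using the official Python SDK, assuming the same base URL as the curl example above:
from openai import OpenAI

client = OpenAI(
    api_key="not-needed",  # placeholder, as in the SDK example later on this page
    base_url="http://ark-api.default.127.0.0.1.nip.io:8080/openai/v1"
)

# Each entry is a target reference such as agent/weather-reporter or team/coding-team
for model in client.models.list():
    print(model.id)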
Query an Agent
Query a model, agent, team or tool using the exact same syntax that you would use to query an LLM with the OpenAI SDK:
curl -X POST http://ark-api.default.127.0.0.1.nip.io:8080/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "agent/weather-agent",
    "messages": [
      {"role": "user", "content": "What'\''s the weather like in New York?"}
    ]
  }'
Response:
{
  "id": "query-abc123",
  "object": "chat.completion",
  "model": "agent/weather-agent",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "The current weather in New York is 72°F with partly cloudy skies..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 12,
    "completion_tokens": 25,
    "total_tokens": 37
  }
}
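If you only want the reply text, the same request can be piped through jq (assuming jq is installed); the path below matches the response shape shown above:
curl -s -X POST http://ark-api.default.127.0.0.1.nip.io:8080/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "agent/weather-agent", "messages": [{"role": "user", "content": "What is the weather like in New York?"}]}' \
  | jq -r '.choices[0].message.content'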
Using the OpenAI SDK
Because the ARK APIs are OpenAI-compatible, you can use any OpenAI SDK to issue queries:
from openai import OpenAI

client = OpenAI(
    api_key="not-needed",
    base_url="http://ark-api.default.127.0.0.1.nip.io:8080/openai/v1"
)

response = client.chat.completions.create(
    model="agent/github-repo-accessor",
    messages=[
        {"role": "user", "content": "Find repositories about kubernetes"}
    ]
)

print(response.choices[0].message.content)
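Because the model field is simply a target reference of the form type/name, the same call works for any target returned by the models listing, including model targets. The name below is taken from the example listing earlier and is illustrative; substitute a target that exists in your cluster:
response = client.chat.completions.create(
    model="model/claude-4-opus",  # illustrative target reference
    messages=[{"role": "user", "content": "Summarize what a Kubernetes operator does."}]
)
print(response.choices[0].message.content)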
Query a Team
curl -X POST http://ark-api.default.127.0.0.1.nip.io:8080/openai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "team/team-seq",
    "messages": [
      {"role": "user", "content": "Analyze customer feedback and suggest improvements"}
    ]
  }'
Response:
{
  "id": "query-def456",
  "object": "chat.completion",
  "model": "team/team-seq",
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "Based on the customer feedback analysis:\n1. Product quality is highly rated\n2. Shipping speed needs improvement\n3. Customer service response time should be reduced..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 18,
    "completion_tokens": 45,
    "total_tokens": 63
  }
}
Next Steps
- Explore the Ark APIs for complete endpoint documentation
- Learn about Tools and MCP Servers to extend agent capabilities
- Check out Tips on Building Agentic Use Cases for best practices