# Goose Adapter

`openharness-goose` (Python)

MCP-first architecture with multi-model support and session management.

**Available** · v0.1.0 · MCP Native
## Installation

```bash
pip install openharness-goose
```
## Prerequisites

Goose must be running as a server:

```bash
# Install Goose (macOS)
brew install block/tap/goose

# Start the server (default port 3000)
goose server
```
## Quick Start

```python
from openharness_goose import GooseAdapter
from openharness.types import ExecuteRequest

# Connect to the local Goose server
adapter = GooseAdapter(base_url="http://localhost:3000")

# Start a session with a working directory
session_id = await adapter.start_session(
    working_directory="/path/to/project"
)

# Execute a prompt
result = await adapter.execute(
    ExecuteRequest(
        message="Help me understand this codebase",
        session_id=session_id,
    )
)
print(result.output)

# Stop the session when done
await adapter.stop_session(session_id)
```
## Capabilities

### MCP-First Architecture

Goose's core feature is native Model Context Protocol (MCP) support. Add powerful tool integrations through MCP servers for filesystem access, databases, APIs, and more.
### Supported
| Domain | Capability | Notes |
|---|---|---|
| Sessions | start_session() | With working directory |
| Sessions | list_sessions() | |
| Sessions | get_session() | With history |
| Sessions | stop_session() | |
| Sessions | resume_session() | |
| Sessions | export_session() | JSON format |
| Sessions | import_session() | |
| Execution | execute() | Sync execution |
| Execution | execute_stream() | SSE-based |
| MCP | add_extension() | stdio, SSE, builtin |
| MCP | remove_extension() | |
| Tools | list_tools() | Per-session |
| Tools | invoke_tool() | Direct invocation |
| Models | update_provider() | 25+ providers |
### Not Supported
| Domain | Reason | Workaround |
|---|---|---|
| Agents | Uses sessions | Sessions provide similar functionality |
| Memory | No persistent memory | Session history is maintained |
| Hooks | Not supported | Wrap adapter methods |
| Subagents | Not supported | Use multiple sessions |
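The hooks workaround in the table above can be sketched as a thin wrapper around adapter methods. This is a minimal illustration, not part of the package: the `HookedAdapter` class is hypothetical, and a stub stands in for a live `GooseAdapter` so the example is self-contained.

```python
import asyncio

class HookedAdapter:
    """Hypothetical wrapper adding pre/post hooks around execute()."""

    def __init__(self, adapter, before=None, after=None):
        self._adapter = adapter
        self._before = before or (lambda req: None)
        self._after = after or (lambda res: None)

    async def execute(self, request):
        self._before(request)                        # pre-execution hook
        result = await self._adapter.execute(request)
        self._after(result)                          # post-execution hook
        return result

# Stub standing in for GooseAdapter (assumes the same execute() shape)
class StubAdapter:
    async def execute(self, request):
        return {"output": f"echo: {request['message']}"}

calls = []
hooked = HookedAdapter(
    StubAdapter(),
    before=lambda req: calls.append(("before", req["message"])),
    after=lambda res: calls.append(("after", res["output"])),
)
result = asyncio.run(hooked.execute({"message": "hi"}))
```

The same pattern applies to `execute_stream()` or session methods if you need logging, metrics, or guardrails around them.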
## Session Management

Goose uses sessions to maintain conversation context and working directory:
```python
import json

# Start a new session
session_id = await adapter.start_session(
    working_directory="/my/project",
    recipe_name="developer",  # Optional pre-configured recipe
)

# List all sessions
sessions = await adapter.list_sessions()
for session in sessions:
    print(f"{session.id}: {session.name}")

# Resume an existing session
await adapter.resume_session(session_id)

# Export a session (for backup/sharing)
export_data = await adapter.export_session(session_id)

# Import a session
new_session_id = await adapter.import_session(json.dumps(export_data))

# Delete a session
await adapter.delete_session(session_id)
```
## MCP Extensions

Add powerful tools through Model Context Protocol servers:
```python
from openharness_goose.types import GooseExtension

# Add a built-in extension
await adapter.add_extension(
    session_id,
    GooseExtension(
        name="developer",
        type="builtin",
    ),
)

# Add an MCP server via stdio
await adapter.add_extension(
    session_id,
    GooseExtension(
        name="filesystem",
        type="stdio",
        cmd="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
    ),
)

# Add an MCP server via SSE
await adapter.add_extension(
    session_id,
    GooseExtension(
        name="remote-tools",
        type="sse",
        uri="https://example.com/mcp/sse",
    ),
)

# List extensions for a session
extensions = await adapter.get_session_extensions(session_id)

# Remove an extension
await adapter.remove_extension(session_id, "filesystem")
```
### Extension Types

| Type | Description | Config |
|---|---|---|
| builtin | Built-in Goose extensions | name only |
| stdio | Local MCP server via subprocess | cmd, args, env |
| sse | Remote MCP server via SSE | uri |
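The required fields per extension type can be checked up front before calling `add_extension()`. A minimal sketch, assuming the field names shown on this page (`type`, `name`, `cmd`, `uri`); the `validate_extension` helper is illustrative, not a Goose or openharness API:

```python
# Required config fields per extension type, per the table above
REQUIRED = {
    "builtin": {"name"},
    "stdio": {"name", "cmd"},
    "sse": {"name", "uri"},
}

def validate_extension(config: dict) -> list[str]:
    """Return a list of problems with an extension config (empty if valid)."""
    ext_type = config.get("type")
    if ext_type not in REQUIRED:
        return [f"unknown extension type: {ext_type!r}"]
    missing = REQUIRED[ext_type] - config.keys()
    return [f"missing field: {f}" for f in sorted(missing)]
```

Running this before `add_extension()` turns a server-side failure into an early, descriptive error.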
## Streaming

```python
from openharness.types import ExecuteRequest

async for event in adapter.execute_stream(
    ExecuteRequest(
        message="Explain this code step by step",
        session_id=session_id,
    )
):
    if event.type == "text":
        print(event.content, end="")
    elif event.type == "tool_call_start":
        print(f"\n[Tool: {event.name}]")
    elif event.type == "tool_result":
        print(f"[Result: {event.output}]")
    elif event.type == "done":
        print("\n[Complete]")
```
### Event Types

| Event | Description |
|---|---|
| text | Text content chunk |
| tool_call_start | Tool invocation beginning |
| tool_result | Tool execution result |
| tool_call_end | Tool invocation complete |
| error | Error occurred |
| done | Execution complete |
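Beyond printing as events arrive, a common pattern is to fold the stream into a final transcript. A sketch under the event shape in the table above (`type`, `content`, `name`, `output` attributes); the `Event` dataclass and `fake_stream` generator below are stand-ins for `adapter.execute_stream(...)`, so the example runs without a server:

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Event:
    """Stand-in for a streamed event; fields follow the table above."""
    type: str
    content: str = ""
    name: str = ""
    output: str = ""

async def fake_stream():
    # Simulated event sequence, in place of adapter.execute_stream(...)
    for ev in [
        Event("text", content="Hello, "),
        Event("tool_call_start", name="read_file"),
        Event("tool_result", output="contents"),
        Event("text", content="world"),
        Event("done"),
    ]:
        yield ev

async def collect(stream):
    """Accumulate text chunks and tool outputs from an event stream."""
    text, tool_outputs = [], []
    async for event in stream:
        if event.type == "text":
            text.append(event.content)
        elif event.type == "tool_result":
            tool_outputs.append(event.output)
        elif event.type == "error":
            raise RuntimeError("stream reported an error")
    return "".join(text), tool_outputs

text, tool_outputs = asyncio.run(collect(fake_stream()))
```

The same `collect()` works unchanged against the real stream by passing `adapter.execute_stream(request)` instead of `fake_stream()`.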
## Multi-Model Support

Goose supports 25+ LLM providers:
```python
# Switch to Anthropic
await adapter.update_provider(
    session_id,
    provider="anthropic",
    model="claude-3-5-sonnet-20241022",
)

# Switch to OpenAI
await adapter.update_provider(
    session_id,
    provider="openai",
    model="gpt-4o",
)

# Use local models via Ollama
await adapter.update_provider(
    session_id,
    provider="ollama",
    model="llama3.2",
)
```
### Supported Providers

Anthropic, OpenAI, Google (Gemini), Azure OpenAI, AWS Bedrock, Ollama, Groq, Together AI, Mistral, Cohere, and many more.
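Because `update_provider()` switches models per session, one way to use the provider list is a fallback chain: try providers in order until one succeeds. A minimal sketch using only the `update_provider()`/`execute()` surface shown above; `execute_with_fallback` is a hypothetical helper, and a stub stands in for a live adapter so the example is self-contained:

```python
import asyncio

async def execute_with_fallback(adapter, session_id, request, providers):
    """Try (provider, model) pairs in order; return the first successful result."""
    last_error = None
    for provider, model in providers:
        try:
            await adapter.update_provider(session_id, provider=provider, model=model)
            return await adapter.execute(request)
        except Exception as exc:
            last_error = exc  # fall through to the next provider
    raise last_error

# Stub standing in for GooseAdapter: the first provider "fails"
class StubAdapter:
    async def update_provider(self, session_id, provider, model):
        if provider == "anthropic":
            raise RuntimeError("no API key configured")

    async def execute(self, request):
        return "ok"

result = asyncio.run(execute_with_fallback(
    StubAdapter(), "s1", {"message": "hi"},
    [("anthropic", "claude-3-5-sonnet-20241022"), ("ollama", "llama3.2")],
))
```

Putting a local Ollama model last in the chain gives an offline fallback when hosted providers are unavailable.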
## Configuration

### Adapter Options

```python
adapter = GooseAdapter(
    base_url="http://localhost:3000",  # Goose server URL
    timeout=60.0,                      # Request timeout in seconds
)
```
### Starting Goose Server

```bash
# Default settings
goose server

# Custom port
goose server --port 8080

# With working directory
goose server --working-dir /my/project
```
### Provider Configuration

Configure LLM providers through the Goose CLI:

```bash
goose configure
```