Installation

pip install openharness-goose

Prerequisites

Goose must be running as a server:

# Install Goose
brew install block/tap/goose  # macOS

# Start the server (default port 3000)
goose server

Quick Start

from openharness_goose import GooseAdapter
from openharness.types import ExecuteRequest

# Connect to local Goose server
adapter = GooseAdapter(base_url="http://localhost:3000")

# Start a session with working directory
session_id = await adapter.start_session(
    working_directory="/path/to/project"
)

# Execute a prompt
result = await adapter.execute(
    ExecuteRequest(
        message="Help me understand this codebase",
        session_id=session_id
    )
)
print(result.output)

# Stop the session when done
await adapter.stop_session(session_id)

Capabilities

MCP-First Architecture

Goose's core feature is native Model Context Protocol (MCP) support. Add powerful tool integrations through MCP servers for filesystem access, databases, APIs, and more.

Supported

| Domain | Capability | Notes |
|---|---|---|
| Sessions | `start_session()` | With working directory |
| Sessions | `list_sessions()` | |
| Sessions | `get_session()` | With history |
| Sessions | `stop_session()` | |
| Sessions | `resume_session()` | |
| Sessions | `export_session()` | JSON format |
| Sessions | `import_session()` | |
| Execution | `execute()` | Sync execution |
| Execution | `execute_stream()` | SSE-based |
| MCP | `add_extension()` | stdio, SSE, builtin |
| MCP | `remove_extension()` | |
| Tools | `list_tools()` | Per-session |
| Tools | `invoke_tool()` | Direct invocation |
| Models | `update_provider()` | 25+ providers |

Not Supported

| Domain | Reason | Workaround |
|---|---|---|
| Agents | Uses sessions | Sessions provide similar functionality |
| Memory | No persistent memory | Session history is maintained |
| Hooks | Not supported | Wrap adapter methods |
| Subagents | Not supported | Use multiple sessions |

Session Management

Goose uses sessions to maintain conversation context and the working directory:

# Start a new session
session_id = await adapter.start_session(
    working_directory="/my/project",
    recipe_name="developer"  # Optional pre-configured recipe
)

# List all sessions
sessions = await adapter.list_sessions()
for session in sessions:
    print(f"{session.id}: {session.name}")

# Resume an existing session
await adapter.resume_session(session_id)

# Export session (for backup/sharing)
export_data = await adapter.export_session(session_id)

# Import session (requires the json module)
import json

new_session_id = await adapter.import_session(json.dumps(export_data))

# Delete session
await adapter.delete_session(session_id)

MCP Extensions

Add powerful tools through Model Context Protocol servers:

from openharness_goose.types import GooseExtension

# Add a built-in extension
await adapter.add_extension(
    session_id,
    GooseExtension(
        name="developer",
        type="builtin",
    )
)

# Add an MCP server via stdio
await adapter.add_extension(
    session_id,
    GooseExtension(
        name="filesystem",
        type="stdio",
        cmd="npx",
        args=["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
    )
)

# Add an MCP server via SSE
await adapter.add_extension(
    session_id,
    GooseExtension(
        name="remote-tools",
        type="sse",
        uri="https://example.com/mcp/sse",
    )
)

# List extensions for a session
extensions = await adapter.get_session_extensions(session_id)

# Remove an extension
await adapter.remove_extension(session_id, "filesystem")

Extension Types

| Type | Description | Config |
|---|---|---|
| builtin | Built-in Goose extensions | name only |
| stdio | Local MCP server via subprocess | cmd, args, env |
| sse | Remote MCP server via SSE | uri |

Streaming

from openharness.types import ExecuteRequest

async for event in adapter.execute_stream(
    ExecuteRequest(
        message="Explain this code step by step",
        session_id=session_id
    )
):
    if event.type == "text":
        print(event.content, end="")
    elif event.type == "tool_call_start":
        print(f"\n[Tool: {event.name}]")
    elif event.type == "tool_result":
        print(f"[Result: {event.output}]")
    elif event.type == "done":
        print("\n[Complete]")

Event Types

| Event | Description |
|---|---|
| text | Text content chunk |
| tool_call_start | Tool invocation beginning |
| tool_result | Tool execution result |
| tool_call_end | Tool invocation complete |
| error | Error occurred |
| done | Execution complete |

Multi-Model Support

Goose supports 25+ LLM providers:

# Switch to Anthropic
await adapter.update_provider(
    session_id,
    provider="anthropic",
    model="claude-3-5-sonnet-20241022",
)

# Switch to OpenAI
await adapter.update_provider(
    session_id,
    provider="openai",
    model="gpt-4o",
)

# Use local models via Ollama
await adapter.update_provider(
    session_id,
    provider="ollama",
    model="llama3.2",
)

Supported Providers

Anthropic, OpenAI, Google (Gemini), Azure OpenAI, AWS Bedrock, Ollama, Groq, Together AI, Mistral, Cohere, and many more.

Configuration

Adapter Options

adapter = GooseAdapter(
    base_url="http://localhost:3000",  # Goose server URL
    timeout=60.0,                        # Request timeout in seconds
)

Starting Goose Server

# Default settings
goose server

# Custom port
goose server --port 8080

# With working directory
goose server --working-dir /my/project

Provider Configuration

Configure LLM providers through Goose CLI:

goose configure