Deployment: Dockerfile and Smithery config #3

Open · wants to merge 3 commits into main
40 changes: 40 additions & 0 deletions Dockerfile
@@ -0,0 +1,40 @@
# Generated by https://smithery.ai. See: https://smithery.ai/docs/config#dockerfile
# Use a Node.js image to build the server
FROM node:20-alpine AS builder

# Set working directory
WORKDIR /app

# Copy package.json and package-lock.json to the container
COPY package.json package-lock.json ./

# Install dependencies
RUN npm install --ignore-scripts

# Copy the source code
COPY src ./src
COPY tsconfig.json ./

# Build the project
RUN npm run build

# Use a smaller Node.js image for the final output
FROM node:20-alpine AS release

# Set working directory
WORKDIR /app

# Copy built files, package.json and package-lock.json (the lockfile is required by npm ci)
COPY --from=builder /app/build ./build
COPY --from=builder /app/package.json ./
COPY --from=builder /app/package-lock.json ./

# Install only production dependencies
RUN npm ci --omit=dev

# Set environment variables required by the server
ENV LLAMA_CLOUD_INDEX_NAME="<YOUR_INDEX_NAME>"
ENV LLAMA_CLOUD_PROJECT_NAME="<YOUR_PROJECT_NAME>"
ENV LLAMA_CLOUD_API_KEY="<YOUR_API_KEY>"

# Set the entry point to run the server
ENTRYPOINT ["node", "build/index.js"]
8 changes: 8 additions & 0 deletions README.md
@@ -31,6 +31,14 @@ npm run watch

## Installation

### Installing via Smithery

To install the LlamaCloud MCP server for Claude Desktop automatically via [Smithery](https://smithery.ai/server/mcp-server-llamacloud):

```bash
npx -y @smithery/cli install mcp-server-llamacloud --client claude
```

To use with Claude Desktop, add the server config:

On macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
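
For reference, a typical `claude_desktop_config.json` entry for this server would look roughly like the following; the server name, install path, and placeholder values are assumptions rather than content from this PR:

```json
{
  "mcpServers": {
    "llamacloud": {
      "command": "node",
      "args": ["/absolute/path/to/mcp-server-llamacloud/build/index.js"],
      "env": {
        "LLAMA_CLOUD_INDEX_NAME": "<YOUR_INDEX_NAME>",
        "LLAMA_CLOUD_PROJECT_NAME": "<YOUR_PROJECT_NAME>",
        "LLAMA_CLOUD_API_KEY": "<YOUR_API_KEY>"
      }
    }
  }
}
```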
25 changes: 25 additions & 0 deletions smithery.yaml
@@ -0,0 +1,25 @@
# Smithery configuration file: https://smithery.ai/docs/config#smitheryyaml

startCommand:
  type: stdio
  configSchema:
    # JSON Schema defining the configuration options for the MCP.
    type: object
    required:
      - llamaCloudIndexName
      - llamaCloudProjectName
      - llamaCloudApiKey
    properties:
      llamaCloudIndexName:
        type: string
        description: The index name for the LlamaCloud server.
      llamaCloudProjectName:
        type: string
        description: The project name for the LlamaCloud server.
      llamaCloudApiKey:
        type: string
        description: The API key for accessing the LlamaCloud server.
  commandFunction:
    # A function that produces the CLI command to start the MCP on stdio.
    |-
    (config) => ({ command: 'node', args: ['build/index.js'], env: { LLAMA_CLOUD_INDEX_NAME: config.llamaCloudIndexName, LLAMA_CLOUD_PROJECT_NAME: config.llamaCloudProjectName, LLAMA_CLOUD_API_KEY: config.llamaCloudApiKey } })
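
For illustration, the commandFunction above resolves to launching the built server with the three variables injected into its environment; a rough shell equivalent, with made-up config values, would be:

```bash
# Approximate effect of commandFunction for the config
# llamaCloudIndexName="my-index", llamaCloudProjectName="my-project", llamaCloudApiKey="llx-..."
LLAMA_CLOUD_INDEX_NAME="my-index" \
LLAMA_CLOUD_PROJECT_NAME="my-project" \
LLAMA_CLOUD_API_KEY="llx-..." \
node build/index.js
```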