
Sprocket

Build inference workers for Together's Dedicated Containers. You implement setup() and predict() — Sprocket handles the queue, file transfers, and HTTP server.

Installation

pip install sprocket --extra-index-url https://pypi.together.ai/
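
If you manage dependencies with a requirements file, the extra index URL can be recorded there as well (a minimal sketch; pin a version if you need reproducible installs):

--extra-index-url https://pypi.together.ai/
sprocket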

Example

import sprocket

class MyModel(sprocket.Sprocket):
    def setup(self):
        # Runs once when the worker starts: load weights and other heavy state here.
        self.model = load_your_model()

    def predict(self, args: dict) -> dict:
        # Runs per job: args carries the job inputs; the returned dict is the job result.
        return {"output": self.model(args["input"])}

if __name__ == "__main__":
    sprocket.run(MyModel(), "my-org/my-model")
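
Before deploying, you can sanity-check the handler as plain Python by calling its methods directly (a local smoke-test sketch, assuming the Sprocket base class needs no constructor arguments; it bypasses the queue and HTTP server, so it only exercises your own code):

predictor = MyModel()
predictor.setup()                               # load the model once
print(predictor.predict({"input": "hello"}))    # run a single prediction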

Deploy with the Jig CLI:

together beta jig deploy

Together provisions GPUs, handles autoscaling, and routes jobs to your workers.
