torch-pipeline-parallelism

Guidance for implementing PyTorch pipeline parallelism for distributed model training. Use this skill when a task involves pipeline parallelism, distributed training with the model partitioned across GPUs/ranks, AFAB (All-Forward-All-Backward) scheduling, or inter-rank tensor communication via torch.distributed.
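
To make the scope concrete, the sketch below shows an AFAB schedule over point-to-point torch.distributed communication: each rank owns one stage of the model, all microbatch forward passes run first (activations sent downstream), then all backward passes (gradients sent upstream). This is a minimal illustration, not the skill's implementation; it assumes a process group launched with torchrun, fixed-shape activations, and toy placeholders (HIDDEN, MICRO_BATCHES, MB_SIZE, a single Linear stage, a mean loss) that are not part of the skill itself.

    import torch
    import torch.distributed as dist

    dist.init_process_group(backend="gloo")     # "nccl" for GPU training
    rank, world = dist.get_rank(), dist.get_world_size()

    HIDDEN, MICRO_BATCHES, MB_SIZE = 32, 4, 8   # toy sizes
    stage = torch.nn.Linear(HIDDEN, HIDDEN)     # this rank's model partition
    opt = torch.optim.SGD(stage.parameters(), lr=1e-2)

    inputs, outputs = [], []

    # Phase 1: all forward passes. Each stage receives activations from the
    # previous rank, runs its partition, and sends the result downstream.
    for _ in range(MICRO_BATCHES):
        if rank == 0:
            x = torch.randn(MB_SIZE, HIDDEN)    # stand-in for real input data
        else:
            x = torch.empty(MB_SIZE, HIDDEN)
            dist.recv(x, src=rank - 1)
        x.requires_grad_(True)
        y = stage(x)
        if rank < world - 1:
            dist.send(y.detach(), dst=rank + 1)
        inputs.append(x)
        outputs.append(y)

    # Phase 2: all backward passes. The last stage computes a toy loss; every
    # other stage receives the gradient of the loss w.r.t. its output from
    # downstream, backpropagates locally, and sends the input gradient upstream.
    for x, y in zip(inputs, outputs):
        if rank == world - 1:
            y.mean().backward()
        else:
            grad_y = torch.empty_like(y)
            dist.recv(grad_y, src=rank + 1)
            y.backward(grad_y)
        if rank > 0:
            dist.send(x.grad, dst=rank - 1)

    opt.step()                                  # gradients accumulated over all microbatches
    dist.destroy_process_group()

Because the forward phase only moves data downstream and the backward phase only moves data upstream, the blocking send/recv pairs match in order and the schedule does not deadlock; the optimizer steps once per iteration after gradients from every microbatch have accumulated.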

Third-Party Agent Skill: Review the code before installing. Agent skills execute in your AI assistant's environment and can access your files.

Installing the Agentic Skill

Install with Skilz, adding --agent to target a specific assistant:
skilz install letta-ai/skills/torch-pipeline-parallelism
skilz install letta-ai/skills/torch-pipeline-parallelism --agent opencode
skilz install letta-ai/skills/torch-pipeline-parallelism --agent codex
skilz install letta-ai/skills/torch-pipeline-parallelism --agent gemini

First time? Install Skilz: pip install skilz

Works with 22+ AI coding assistants

Cursor, Aider, Copilot, Windsurf, Qwen, Kimi, and more...

Manual install: download the agent skill ZIP, extract it, and copy it into ~/.claude/skills/, then restart Claude Desktop.

Alternatively, install from the repository:

1. Clone the repository:
git clone https://github.com/letta-ai/skills
2. Copy the agent skill directory:
cp -r skills/ai/benchmarks/letta/terminal-bench-2/trajectory-feedback/torch-pipeline-parallelism ~/.claude/skills/

Need detailed installation help? See the platform-specific installation guides.

Agentic Skill Details

Repository: skills
Stars: 13
Forks: 1
Type: Technical
Meta-Domain: data ai
Primary Domain: machine learning
Market Score: 21
