knowledge-distillation

Compress large language models via knowledge distillation from a teacher model to a student model. Use it when deploying smaller models that retain most of the teacher's performance, transferring GPT-4 capabilities to open-source models, or reducing inference costs. Covers temperature scaling, soft targets, reverse KLD, logit distillation, and MiniLLM training strategies.
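The core losses named above can be illustrated with a minimal, dependency-free sketch. This is not the skill's actual implementation; it assumes per-token logit vectors and shows temperature-softened soft targets with the classic forward KLD, plus the reverse KLD (student-to-teacher) that MiniLLM-style training favors.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T flattens the distribution,
    # exposing the teacher's "dark knowledge" in non-argmax classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def forward_kld(teacher_logits, student_logits, temperature=2.0):
    # Classic Hinton-style distillation loss: KL(teacher || student) on
    # temperature-softened distributions, scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

def reverse_kld(teacher_logits, student_logits, temperature=1.0):
    # Reverse KLD, KL(student || teacher), is mode-seeking: the student
    # concentrates mass on high-probability teacher modes instead of
    # smearing it across the whole vocabulary.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)
```

In a real training loop these would be computed per token over vocabulary-sized logit tensors (e.g. with `torch.nn.functional.kl_div`), and the distillation loss is typically mixed with the ordinary cross-entropy on hard labels.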

Third-Party Agent Skill: Review the code before installing. Agent skills execute in your AI assistant's environment and can access your files.

Installation

skilz install zechenzhangAGI/AI-research-SKILLs/knowledge-distillation
skilz install zechenzhangAGI/AI-research-SKILLs/knowledge-distillation --agent opencode
skilz install zechenzhangAGI/AI-research-SKILLs/knowledge-distillation --agent codex
skilz install zechenzhangAGI/AI-research-SKILLs/knowledge-distillation --agent gemini

First time? Install Skilz: pip install skilz

Works with 22+ AI coding assistants

Cursor, Aider, Copilot, Windsurf, Qwen, Kimi, and more...

Download Agent Skill ZIP

Extract and copy to ~/.claude/skills/, then restart Claude Desktop.

1. Clone the repository:
git clone https://github.com/zechenzhangAGI/AI-research-SKILLs
2. Copy the agent skill directory:
cp -r AI-research-SKILLs/19-emerging-techniques/knowledge-distillation ~/.claude/skills/

Agentic Skill Details

Type: Non-Technical
Meta-Domain: general
Primary Domain: general
Sub-Domain: machine learning models
Market Score: 24
