long-context

Extend context windows of transformer models using RoPE, YaRN, ALiBi, and position interpolation techniques. Use when processing long documents (32k-128k+ tokens), extending pre-trained models beyond original context limits, or implementing efficient positional encodings. Covers rotary embeddings, attention biases, interpolation methods, and extrapolation strategies for LLMs.
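To make the description concrete, here is a minimal NumPy sketch of rotary position embeddings (RoPE) combined with position interpolation, where positions for a longer target sequence are rescaled into the range the model was trained on. All function names and parameters here are illustrative assumptions, not taken from this skill's own code.

```python
import numpy as np

def rope_angles(positions, dim, base=10000.0):
    """Per-position rotation angle for each pair of feature dimensions."""
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)   # (dim/2,)
    return np.outer(positions, inv_freq)               # (seq, dim/2)

def apply_rope(x, positions):
    """Rotate consecutive feature pairs of x (seq, dim) by their position angles."""
    ang = rope_angles(positions, x.shape[1])
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out

def interpolated_positions(target_len, train_len):
    """Position interpolation: squeeze target positions into [0, train_len)."""
    scale = train_len / target_len                     # < 1 when extending context
    return np.arange(target_len) * scale

# Extend a model trained on 2048 tokens to an 8192-token input:
q = np.random.randn(8192, 64)
pos = interpolated_positions(target_len=8192, train_len=2048)
q_rot = apply_rope(q, pos)   # rotated queries; positions stay in the trained range
```

Because each pair of features is rotated, the transform is norm-preserving, and with interpolation the maximum position fed to `rope_angles` never exceeds the pre-training limit — the core idea behind extending context without retraining from scratch.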

Third-Party Agent Skill: Review the code before installing. Agent skills execute in your AI assistant's environment and can access your files.

Installing the Agent Skill

skilz install zechenzhangAGI/AI-research-SKILLs/long-context
skilz install zechenzhangAGI/AI-research-SKILLs/long-context --agent opencode
skilz install zechenzhangAGI/AI-research-SKILLs/long-context --agent codex
skilz install zechenzhangAGI/AI-research-SKILLs/long-context --agent gemini

First time? Install Skilz: pip install skilz

Works with 22+ AI coding assistants

Cursor, Aider, Copilot, Windsurf, Qwen, Kimi, and more...

Download Agent Skill ZIP

Extract and copy to ~/.claude/skills/, then restart Claude Desktop.

1. Clone the repository:
git clone https://github.com/zechenzhangAGI/AI-research-SKILLs
2. Copy the agent skill directory:
cp -r AI-research-SKILLs/19-emerging-techniques/long-context ~/.claude/skills/

Agentic Skill Details

Type
Non-Technical
Meta-Domain
general
Primary Domain
general
Sub-Domain
machine learning models
Market Score
27
