UCIT Continuous Instruction Tuning Dataset
Date
Size
Paper URL
License
CC BY 4.0
UCIT, short for Unseen Continual Instruction Tuning, is a benchmark dataset for continual instruction tuning of multimodal large language models. It was jointly released in 2025 by the Institute of Automation, Chinese Academy of Sciences, together with the School of Frontier Interdisciplinary Sciences and the School of Artificial Intelligence at the University of Chinese Academy of Sciences, accompanying the paper "HiDe-LLaVA: Hierarchical Decoupling for Continual Instruction Tuning of Multimodal Large Language Model".
This dataset contains six carefully selected task subsets, each corresponding to a task type the model did not encounter during supervised fine-tuning. Together they form a challenging "unseen task" test set for fairly evaluating the model's adaptability and knowledge retention. Each example consists of a task prompt (instruction) and the corresponding expected correct output (ground-truth response), which is used to measure the model's performance under zero-shot conditions.
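The instruction/ground-truth pairing described above can be sketched as follows. This is a minimal illustration only: the field names (`instruction`, `response`) and the exact-match scoring rule are assumptions for the sketch, not the dataset's actual schema or official evaluation metric.

```python
# Hypothetical UCIT-style records: each example pairs a task instruction
# with its ground-truth response. Field names are assumed for illustration.
records = [
    {"instruction": "Describe the image in one sentence.",
     "response": "A dog runs across a grassy field."},
    {"instruction": "What color is the car?",
     "response": "red"},
]

def exact_match_accuracy(predictions, records):
    """Zero-shot scoring sketch: fraction of model predictions that
    exactly match the ground-truth response (case- and space-insensitive)."""
    correct = sum(
        pred.strip().lower() == rec["response"].strip().lower()
        for pred, rec in zip(predictions, records)
    )
    return correct / len(records)

# One correct and one incorrect prediction -> accuracy 0.5
print(exact_match_accuracy(["A dog runs across a grassy field.", "blue"], records))
```

In practice a benchmark like this would load examples from the released files and may use task-specific metrics rather than plain exact match; the sketch only shows the record shape implied by the description.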