UCIT Continual Instruction Tuning Dataset

Date: 2 months ago
Size: 6.07 GB
Organization: Chinese Academy of Sciences; University of Chinese Academy of Sciences
Paper: arXiv 2503.12941
License: CC BY 4.0

UCIT, short for Unseen Continual Instruction Tuning, is a benchmark dataset for continual instruction tuning of multimodal large language models. It was jointly released in 2025 by the Institute of Automation of the Chinese Academy of Sciences, the School of Frontier Interdisciplinary Sciences, and the School of Artificial Intelligence of the University of Chinese Academy of Sciences, in the paper "HiDe-LLaVA: Hierarchical Decoupling for Continual Instruction Tuning of Multimodal Large Language Model".

The dataset contains six carefully selected task subsets, each corresponding to a task type the model did not encounter during supervised fine-tuning. Together they form a challenging "unseen task" test set for fairly evaluating the model's adaptability to new instructions and its ability to preserve previously acquired knowledge. Each example consists of a task instruction (prompt) and the corresponding ground-truth response, which is used to measure the model's performance under zero-shot conditions; a minimal loading sketch follows below.
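As a rough illustration of this instruction/response format, the Python sketch below iterates over a hypothetical JSON annotation file in which each entry holds a task label, an instruction, and its ground-truth response. The file name and field names ("task", "instruction", "response") are assumptions for illustration only; the schema described in the dataset's bundled README is authoritative.

```python
import json

# Hypothetical annotation file; take the real file name and schema
# from the README shipped inside the dataset archive.
ANN_FILE = "ucit_annotations.json"

with open(ANN_FILE, "r", encoding="utf-8") as f:
    examples = json.load(f)

# Assumed per-example fields: "task", "instruction", "response".
for ex in examples[:3]:
    print("Task:       ", ex.get("task"))
    print("Instruction:", ex.get("instruction"))
    print("Response:   ", ex.get("response"))
```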

UCIT.torrent
Seeding: 1, Downloading: 0, Completed: 9, Total Downloads: 23
  • UCIT/
    • README.md (1.65 KB)
    • README.txt (3.3 KB)
    • data/
      • UCIT.zip (6.07 GB)
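Assuming the torrent has been downloaded into a local UCIT/ directory that mirrors the listing above, extracting the archive might look like the following sketch; the paths come from the listing, and everything else is an assumption.

```python
import zipfile
from pathlib import Path

# Paths follow the torrent layout shown above; adjust to your download location.
archive = Path("UCIT/data/UCIT.zip")
target = Path("UCIT/data/extracted")

target.mkdir(parents=True, exist_ok=True)
with zipfile.ZipFile(archive) as zf:
    zf.extractall(target)
    print("Extracted", len(zf.namelist()), "files to", target)
```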
