Currently, you can fine-tune the following models through the platform (a minimal request sketch follows the list):

  1. Mistral-7B
  2. Mixtral 8x7B (MoE)
  3. deepseek-coder-instruct-6.7b (an open-source model by DeepSeek; please contact us to fine-tune this model)
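
As a rough sketch of what submitting a fine-tuning job for one of these models could look like, the example below uses a placeholder endpoint, payload fields, and environment variable; these are assumptions for illustration, not the platform's documented API.

```python
import os
import requests

# Hypothetical example: the endpoint URL, payload fields, and auth scheme
# are placeholders, not the platform's documented API.
API_URL = "https://api.example.com/v1/finetune"   # placeholder endpoint
API_KEY = os.environ["PLATFORM_API_KEY"]          # placeholder env var

payload = {
    "model": "Mistral-7B",                 # one of the models listed above
    "training_file": "my_dataset.jsonl",   # previously uploaded training data
    "hyperparameters": {"epochs": 3},      # assumed tunable parameter
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())  # job id / status returned by the platform
```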