LLM Recipes

Ongoing research project for continual pre-training of LLMs (dense models).

Table of Contents

  1. Installation
  2. Instruction Tuning
  3. LLM Continual Pre-Training

Installation

To install the required dependencies, run the following command:

pip install -r requirements.txt

If you want to use the library on multiple nodes, you also need to install the following packages:

module load openmpi/4.x.x

pip install mpi4py
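
As a quick sanity check (a minimal sketch; the launcher and rank count depend on your cluster), you can confirm that mpi4py sees all ranks:

# launch two ranks and print each rank's ID together with the world size
mpirun -np 2 python -c "from mpi4py import MPI; c = MPI.COMM_WORLD; print(c.Get_rank(), c.Get_size())"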

FlashAttention

To install FlashAttention, run the following commands (a GPU is required):

pip install ninja packaging wheel
pip install flash-attn --no-build-isolation
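
To verify that the FlashAttention extension built and imports correctly (a quick check only, assuming the install above succeeded):

python -c "import flash_attn; print(flash_attn.__version__)"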

ABCI

If you use ABCI to run the experiments, an install script is available at llm-recipes/install.sh.
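
For example (a minimal sketch; adjust paths and options to your environment):

git clone https://github.com/okoge-kaz/llm-recipes.git
cd llm-recipes
bash install.sh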

Instruction Tuning

scripts/abci/instruction contains the scripts for running instruction tuning on ABCI.
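
On ABCI, jobs are typically submitted with qsub. The example below is a sketch only: the script name is hypothetical, so substitute one of the actual scripts under scripts/abci/instruction and your own resource group:

# -g selects the ABCI resource group charged for the job
qsub -g $YOUR_ABCI_GROUP scripts/abci/instruction/example.sh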

LLM Continual Pre-Training

Documentation is coming soon.
