
Hi there 👋

I'm a second-year Master's student working at the intersection of HPC and machine learning. My research focuses on distributed training of large models and low-precision training. I also maintain a pre-training library for LLMs and run LLM training experiments.

I am a core contributor to the Swallow Project, a Japanese LLM development initiative.

Popular repositories

  1. llm-recipes

    Ongoing research project for continual pre-training of LLMs (dense models)

    Python · 44 stars · 4 forks

  2. moe-recipes

    Ongoing research project for training Mixture-of-Experts models

    Python · 21 stars · 2 forks

  3. megatron-deepspeed-turing-techblog

    Turing Tech Blog repository

    Python · 5 stars · 1 fork

  4. llm-jp-sakura-ansible

    Jinja · 5 stars · 2 forks

  5. turing-techblog-megatron-deepspeed

    See the linked post for details on setting up the environment

    Python · 2 stars · 1 fork

  6. wandb_watcher

    A tool for monitoring wandb jobs as part of the ABCI large language model development support program

    Python · 2 stars