🎓 Automatically update CV papers daily using GitHub Actions (updated every 24 hours)
Updated Jun 12, 2024 - Python
Train transformer-based models.
Personal Project: MPP-Qwen14B (Multimodal Pipeline Parallel-Qwen14B). Don't let poverty limit your imagination! Train your own 14B LLaVA-like MLLM on an RTX 3090/4090 with 24 GB of VRAM.
Example code for fine-tuning multimodal large language models with LLaMA-Factory (Demo of Finetuning Multimodal LLM with LLaMA-Factory)
Language Modeling Research Hub, a comprehensive compendium for enthusiasts and scholars delving into the fascinating realm of language models (LMs), with a particular focus on large language models (LLMs)
Code example for pretraining an LLM with vanilla PyTorch training loop
Official Repository for the Uni-Mol Series Methods
PITI: Pretraining is All You Need for Image-to-Image Translation
Saprot: Protein Language Model with Structural Alphabet
Customized Pretraining for NLG Tasks
Code repository for the conference paper "Organoid Segmentation Using Self-Supervised Learning: How Complex Should the Pretext Task Be?" published and presented at the International Conference on Biomedical and Bioinformatics Engineering (ICBBE) 2023.
PaddlePaddle large model development kit, providing end-to-end development toolchains for large language models, cross-modal large models, biocomputing large models, and related domains.
This collection of notebooks is based on the Dive into Deep Learning book. All notes are written in PyTorch using the d2l/torch library.
Using Pre-training and Interaction Modeling for ancestry-specific disease prediction using multiomics data from the UK Biobank
Llama Chinese community. The Llama 3 online demo and fine-tuned models are now available, the latest Llama 3 learning resources are aggregated in real time, and all code has been updated for Llama 3. Building the best Chinese Llama LLM, fully open source and commercially usable.
[NeurIPS2022] Egocentric Video-Language Pretraining
[ICCV2023] UniVTG: Towards Unified Video-Language Temporal Grounding
Official implementation of Matrix Variational Masked Autoencoder (M-MAE) for ICML paper "Information Flow in Self-Supervised Learning" (https://arxiv.org/abs/2309.17281)
Official implementation of ICML 2024 paper "Matrix Information Theory for Self-supervised Learning" (https://arxiv.org/abs/2305.17326)
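One entry above mentions pretraining an LLM with a vanilla PyTorch training loop. A minimal sketch of such a next-token-prediction loop is below; the tiny embedding-plus-linear "model" and random token batch are toy stand-ins for illustration only, not code from any listed repository:

```python
# Minimal causal-LM pretraining loop in vanilla PyTorch.
# Toy stand-ins: a tiny embedding + linear head instead of a transformer,
# and one fixed random token batch instead of a tokenized corpus.
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, d_model, seq_len, batch = 50, 32, 16, 8

model = nn.Sequential(
    nn.Embedding(vocab_size, d_model),   # token ids -> vectors
    nn.Linear(d_model, vocab_size),      # per-token logits over the vocab
)
opt = torch.optim.AdamW(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# One fixed batch so the toy model can visibly fit it.
tokens = torch.randint(0, vocab_size, (batch, seq_len + 1))
inputs, targets = tokens[:, :-1], tokens[:, 1:]  # shift by one: predict the next token

losses = []
for step in range(50):
    logits = model(inputs)                                   # (batch, seq_len, vocab)
    loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()
    losses.append(loss.item())
```

A real pretraining run replaces the toy model with a transformer, streams fresh batches from a corpus via a DataLoader, and adds a learning-rate schedule and checkpointing, but the shift-by-one cross-entropy objective is the same.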