llama3
Here are 287 public repositories matching this topic...
The server version of my in-development assistant, written in Go
Updated Jun 2, 2024 - HTML
Testing the capabilities of the Llama 3 language model, specifically the Meta-Llama-3-8B-Instruct variant with 8 billion parameters.
Updated Jun 2, 2024 - Python
Devon: An open-source pair programmer
Updated Jun 2, 2024 - Python
Repo for DermAssist: your AI assistant for skin problems. Powered by a vision model and a reliable RAG system.
Updated Jun 2, 2024 - Python
Pathways: multi-modal AI/ML models on discord
Updated Jun 2, 2024 - JavaScript
Python scraper based on AI
Updated Jun 2, 2024 - Python
Start building LLM-empowered multi-agent applications more easily.
Updated Jun 2, 2024 - Python
I'm developing a personalized LLM using Llama 3 to create tailored emails based on user data. This project aims to improve communication efficiency by generating personalized content, leveraging advanced NLP techniques for more meaningful and relevant interactions.
Updated Jun 2, 2024 - Jupyter Notebook
Llama.cpp is a C++ library for the efficient implementation of large language models, such as Meta's LLaMA. Optimized to run on a wide range of platforms, including resource-constrained devices, it delivers high performance, fast inference, and efficient memory use, all essential for running large models.
Updated Jun 2, 2024 - Python
A lightweight, fast, parallel inference server for Llama
Updated Jun 2, 2024 - Python
Your AI second brain. Get answers to your questions, whether they are online or in your own notes. Use online AI models (e.g. GPT-4) or private, local LLMs (e.g. Llama 3). Self-host locally or use our cloud instance. Access from Obsidian, Emacs, the desktop app, the web, or WhatsApp.
Updated Jun 2, 2024 - Python
Tensor parallelism is all you need. Run LLMs on weak devices or make powerful devices even more powerful by distributing the workload and dividing the RAM usage.
Updated Jun 2, 2024 - C++
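As a rough illustration of the idea behind that entry (a toy sketch, not this project's actual implementation), tensor parallelism splits a layer's weight matrix across devices so each holds only a shard of the weights and computes a partial result:

```python
# Toy sketch of tensor (column) parallelism in pure Python: split a
# layer's weight matrix column-wise across two "devices", each of which
# stores only its shard and computes a partial output.

def matmul(x, W):
    # x: 1 x n activation vector; W: n x m weight matrix (list of rows)
    return [sum(x[i] * W[i][j] for i in range(len(x)))
            for j in range(len(W[0]))]

x = [1.0, 2.0, 3.0]
W = [[1.0,  0.5, -1.0, 2.0],
     [0.0,  1.0,  3.0, 1.0],
     [2.0, -1.0,  0.5, 0.0]]

# Each "device" holds half of W's columns, halving per-device RAM.
W0 = [row[:2] for row in W]
W1 = [row[2:] for row in W]

# Partial outputs are computed independently, then concatenated
# (the concatenation plays the role of an all-gather step).
y = matmul(x, W0) + matmul(x, W1)

assert y == matmul(x, W)  # identical to the single-device result
```

Real implementations shard across machines and use collective communication instead of a local concatenation, but the memory-splitting principle is the same.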
🤖 The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many more model architectures. Generates text, audio, video, and images, with voice-cloning capabilities.
Updated Jun 2, 2024 - C++