Agentic Hub
LLM Tooling · Python

vllm

vllm-project/vllm

A high-throughput and memory-efficient inference and serving engine for LLMs

Stars: 76k
Forks: 15k
Last push: today
License: Apache-2.0
Good first issues: 41
Help wanted: 0

Topics

amd, blackwell, cuda, deepseek, deepseek-v3, gpt, gpt-oss, inference

More LLM Tooling


ollama

ollama/ollama
LLM Tooling

Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.

Stars: 168k · Forks: 15k · Language: Go · Last push: today
No open beginner issues

llama.cpp

ggml-org/llama.cpp
LLM Tooling

LLM inference in C/C++

Stars: 103k · Forks: 17k · Language: C++ · Last push: today
37 good first issues · 35 help wanted

litellm

BerriAI/litellm
LLM Tooling

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

Stars: 43k · Forks: 7.1k · Language: Python · Last push: today
No open beginner issues
Agentic Hub

Your launchpad into the agentic AI open source ecosystem. AI-curated projects, beginner scores, and live status. Updated weekly.

Resources

  • Good First Issue
  • Up For Grabs
  • GitHub: ai-agents

MIT License. Built with Next.js on Cloudflare Pages. Data refreshed weekly via GitHub Actions + AI enrichment.
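The weekly refresh described above could be wired up as a scheduled GitHub Actions workflow. The sketch below is a hypothetical configuration, not the site's actual setup: the workflow name, cron time, and script paths (`scripts/fetch_stats.py`, `scripts/enrich.py`) are assumptions for illustration.

```yaml
# Hypothetical weekly data-refresh workflow (names and paths are assumptions).
name: weekly-data-refresh

on:
  schedule:
    - cron: "0 6 * * 1"   # every Monday at 06:00 UTC
  workflow_dispatch: {}    # also allow manual runs

jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Fetch repo stats from the GitHub API
        run: python scripts/fetch_stats.py   # hypothetical script
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: AI enrichment pass
        run: python scripts/enrich.py        # hypothetical script

      - name: Commit refreshed data
        run: |
          git config user.name "github-actions"
          git config user.email "actions@users.noreply.github.com"
          git commit -am "chore: weekly data refresh" || echo "no changes"
          git push
```

A `schedule` trigger alone cannot be run on demand, so adding `workflow_dispatch` is a common pattern for re-running the refresh manually between weekly ticks.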
