Related Projects
oBeaver is inspired by and builds upon the ideas of the following excellent open-source projects. We are grateful to each of these communities for paving the way.
Ollama
Run large language models locally with a simple CLI. The pioneer in making local LLM inference accessible to everyone.
oMLX
Run large language models on Apple Silicon, ONNX-based. Demonstrating the power of on-device inference on macOS.
vLLM
High-throughput and memory-efficient inference engine for LLMs. Setting the standard for production-grade LLM serving.
Foundry Local
Microsoft's local model inference runtime with NPU/GPU/CPU acceleration. Enabling hardware-optimized local inference across devices.
ONNX Runtime GenAI
Generative AI extensions for ONNX Runtime. Bringing efficient generative model inference to the ONNX ecosystem.
Olive
Microsoft's model optimization toolkit for ONNX Runtime. Streamlining model conversion, quantization, and optimization workflows.