.. _examples_index:

========
Examples
========

.. toctree::
   :maxdepth: 2
   :hidden:

   ai_podcast
   chatbot
   gradio_chatinterface
   pdf_chatbot
   langchain_streamlit_doc_chat

Here you can find examples and resources for learning how to use Xinference.

Demos
=====

End-to-end applications built with Xinference:

* `Voice Conversations with AI Agents on M2 Max `_
* `Interacting with LLM Models: A Command-Line Example `_
* `Interacting with LLM Models: A Gradio ChatInterface Example `_
* `PDF Chatbot with Local LLM and Embeddings `_
* `Local Doc Conversations with LangChain and Streamlit `_

If you come across other examples in your own workflows, we encourage you to contribute a `PR `_!

Tutorials
=========

The following tutorials cover the basics of using Xinference in different scenarios:

* `[Notebook] Question-Answering (QA) Application with Xinference, Milvus and LangChain `_
* `Using Xinference Local LLMs within LlamaIndex `_
* `[Chinese] How to Connect Chatbox to Open-Source LLMs for Free Chatting `_
* `[Chinese] Break Free from OpenAI: Build a Full-Stack AI Application with the Open-Source Ecosystem in 8 Minutes `_
* `[Chinese] Building LLM Applications with a Full Open-Source Toolchain: Calling the Baichuan Open-Source Model in Dify `_

Third-Party Library Integrations
================================

Xinference is designed to seamlessly deploy and serve open-source AI models, so we aim to support mainstream toolkits across the AI landscape. Xinference can be used with the following third-party libraries:

* LangChain `Text Embedding Models `_ and `LLMs `_
* `LlamaIndex Xinference LLM `_