Build Your Own Personal AI Assistant: A Step-by-Step Guide to a Local LLM with Text and Speech
In this tutorial we will create a personal, local LLM assistant that you can talk to. You will be able to record your voice with your microphone and send it to the LLM, and the LLM will return its answer as both text and speech.
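The article walks through that pipeline step by step; below is a minimal sketch of the same record → transcribe → local LLM → speak loop. The specific libraries (sounddevice/soundfile for recording, openai-whisper for speech-to-text, an Ollama server on localhost for the LLM, pyttsx3 for speech output) are assumptions for illustration, not necessarily what the article uses.

```python
# Minimal sketch of the voice-assistant loop, not the article's exact code.
# Assumed stack: sounddevice + soundfile (recording), openai-whisper (STT),
# a local Ollama server (LLM), pyttsx3 (TTS).
import requests
import sounddevice as sd
import soundfile as sf
import whisper
import pyttsx3

SAMPLE_RATE = 16000
RECORD_SECONDS = 5


def record_audio(path: str = "input.wav") -> str:
    """Record a short utterance from the default microphone."""
    audio = sd.rec(int(RECORD_SECONDS * SAMPLE_RATE), samplerate=SAMPLE_RATE, channels=1)
    sd.wait()
    sf.write(path, audio, SAMPLE_RATE)
    return path


def ask_local_llm(prompt: str, model: str = "llama2") -> str:
    """Send the transcribed question to a local LLM served by Ollama."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    return resp.json()["response"]


def speak(text: str) -> None:
    """Read the answer aloud with the system TTS voice."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


if __name__ == "__main__":
    stt = whisper.load_model("base")
    question = stt.transcribe(record_audio())["text"]
    answer = ask_local_llm(question)
    print(answer)  # text answer
    speak(answer)  # spoken answer
```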
Advanced RAG 02: Unveiling PDF Parsing
2024-02-15 additional content: Unveiling PDF Parsing: How to Extract Formulas from Scientific PDF Papers
How to Do RAG Without a Vector Database
Introduction
When it comes to giving Large Language Models (LLMs) long-term memory, the prevalent approach is a Retrieval Augmented Generation (RAG) solution, with a vector database acting as the storage mechanism for that memory. This raises the question: can we achieve the same results without a vector database?
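The article explores that question in detail; as a purely illustrative alternative (not necessarily the article's method), retrieval can also be done with keyword scoring such as BM25 over plain text chunks, with no vector store involved:

```python
# Illustrative only: RAG-style retrieval without a vector database,
# using BM25 keyword scoring over plain text chunks (rank_bm25 library).
from rank_bm25 import BM25Okapi

documents = [
    "Vector databases store embeddings for similarity search.",
    "BM25 ranks documents by term frequency and inverse document frequency.",
    "Retrieval Augmented Generation grounds LLM answers in retrieved context.",
]

tokenized = [doc.lower().split() for doc in documents]
bm25 = BM25Okapi(tokenized)

query = "how does retrieval augmented generation work"
top_chunks = bm25.get_top_n(query.lower().split(), documents, n=2)

# The retrieved chunks are then pasted into the LLM prompt as grounding context.
prompt = "Answer using only this context:\n" + "\n".join(top_chunks) + f"\n\nQuestion: {query}"
print(prompt)
```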
Top 10 Data and AI Trends for 2024
From LLMs transforming the modern data stack to data observability for vector databases, here are my predictions for the top data engineering trends of 2024.
Embeddings + Knowledge Graphs: The Ultimate Tools for RAG Systems
The advent of large language models (LLMs), trained on vast amounts of text data, has been one of the most significant breakthroughs in natural language processing. The ability of these models to generate remarkably fluent and coherent text with just a short prompt has opened up new possibilities for conversational AI, creative writing, and a wide array of other applications.
Finally, a 7B-Parameter Model That Outperforms GPT-4!
We are entering the era of small & highly efficient models!
TimeGPT: The First Foundation Model for Time Series Forecasting
Writing Killer Prompts: Mastering Prompt Engineering for Stunning AI Results
Prompt engineering, or prompt design, is the craft of writing instructions for LLMs so that they produce the desired responses. It is essential for getting accurate, high-quality output from a large language model.
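As a generic illustration (not drawn from the article), a structured prompt typically assigns a role, a task, constraints, and an expected output format:

```python
# A generic, hypothetical example of a structured prompt; the article's own
# templates may differ. The string would be sent to any LLM API or local model.
prompt = """You are a senior Python reviewer.

Task: Explain what the function below does and point out one bug.

Constraints:
- Answer in at most three sentences.
- Quote the buggy line before explaining the fix.

Function:
def mean(xs):
    return sum(xs) / len(xs) - 1
"""
print(prompt)
```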
Advanced RAG 07: Exploring RAG for Tables
Implementing RAG presents a challenge, especially when it comes to effectively parsing and understanding tables in unstructured documents. This is particularly difficult with scanned documents or documents in image format. There are at least three aspects to these challenges:
Building a Chat Application for Complex SQL Database Interaction Using LangChain, LLMs, and Streamlit
In this article we will see how to use large language models (LLMs) to interact with a complex database using LangChain agents and tools, and then deploy the chat application using Streamlit.
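A minimal sketch of that idea is shown below; it assumes a legacy 0.x LangChain API, an OpenAI API key in the environment, and a hypothetical SQLite database path, so it illustrates the shape of the app rather than reproducing the article's code.

```python
# Sketch only: chat with a SQL database via a LangChain SQL agent in Streamlit.
# Assumptions: legacy LangChain 0.x imports, OPENAI_API_KEY set in the
# environment, and a hypothetical local database at sqlite:///chinook.db.
import streamlit as st
from langchain.chat_models import ChatOpenAI
from langchain.sql_database import SQLDatabase
from langchain.agents import create_sql_agent
from langchain.agents.agent_toolkits import SQLDatabaseToolkit

db = SQLDatabase.from_uri("sqlite:///chinook.db")
llm = ChatOpenAI(model="gpt-3.5-turbo", temperature=0)
toolkit = SQLDatabaseToolkit(db=db, llm=llm)
agent = create_sql_agent(llm=llm, toolkit=toolkit, verbose=True)

st.title("Chat with your SQL database")
question = st.text_input("Ask a question about the data")
if question:
    # The agent inspects the schema, writes SQL, runs it, and summarizes the result.
    st.write(agent.run(question))
```

Saved as `app.py`, such a script would be launched with `streamlit run app.py`.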