💡 The story is a primer on MCP (a topic explicitly listed under the agents category) that uses AI tools like Ollama and LangChain and focuses on agent-related content.
💡 The story likely addresses YAML usage in AI development workflows or tools (common in frameworks like LangChain/LlamaIndex), fitting the tools category for development tools & frameworks.
💡 The story introduces LangManus, an open-source autonomous agent built using LangChain and LangGraph, which aligns with the agents category covering autonomous agents and agentic workflows.
💡 The story shows a project built using LLMs for structured data extraction and knowledge graph construction, which involves tools like vector databases or LangChain (categorized under engineering tools).
💡 The story compares a Rust-based indexing and querying pipeline (a GenAI development tool) to LangChain, a well-known AI framework, which falls under the tools category.
💡 The story inquires about AI development tools (LangChain, LlamaIndex) for PDF parsing in the context of RAG, which aligns with the tools category under Engineering.
💡 The story addresses solving the out-of-context chunk problem for Retrieval-Augmented Generation (RAG), a core technique used in AI development tools and frameworks like LangChain or LlamaIndex.
💡 The story discusses discontinuing the use of LangChain, an AI development framework explicitly listed under the tools category, for building AI agents.
💡 The story focuses on methods to obtain structured output from LLMs, which aligns with prompt engineering tools and frameworks (like LangChain) that enable structured output handling, falling under the 'tools' category.
💡 The story focuses on chunking, a critical technique in Retrieval-Augmented Generation (RAG) applications that relies on AI development tools like vector databases and frameworks such as LangChain.
💡 The story inquires about a local-first AI solution for PDF search, which typically relies on RAG frameworks like LlamaIndex or LangChain, or on vector databases, all explicitly listed under the 'tools' category.
💡 The article focuses on learnings from a year of building with LLMs, which typically involves using development tools and frameworks (e.g., LangChain, vector databases) that are categorized under 'tools'.
💡 The article is about systematically improving Retrieval-Augmented Generation (RAG), a technique heavily reliant on AI development tools and frameworks like LangChain or LlamaIndex, which falls under the 'tools' category in Engineering.
💡 The story introduces Cognita, an open-source RAG framework for modular applications, which aligns with the 'tools' category covering AI development frameworks like LangChain and LlamaIndex.
💡 The story focuses on ingestion and preprocessing for LLMs, core functionalities supported by AI development tools (e.g., LangChain, LlamaIndex) for data preparation in LLM workflows.
💡 The story involves LangChainGo (a Go framework for building LLM applications, classified as a tool), alongside Gemma (a model) and Ollama (local inference infrastructure), all AI-related. The most specific category here is tools, because LangChainGo is a core development tool.
💡 The story focuses on building custom chatbots with knowledge bases for websites, which typically involves using AI development tools and frameworks like vector databases or LangChain.