LangGraph Platform: New deployment options for scalable agent infrastructure We've rebranded our service for deploying and scaling LangGraph apps as LangGraph Platform. Learn about the multiple deployment options and what LangGraph Platform entails.
Few-shot prompting to improve tool-calling performance We ran a few experiments showing how few-shot prompting can significantly enhance tool-calling accuracy, especially for complex tasks. Read on for how we did it (and the results).
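The core idea of few-shot prompting for tool calling is to prepend worked examples of correct tool invocations before the real query. A minimal sketch, assuming a hypothetical `multiply` tool and a plain message list rather than any particular framework API:

```python
from dataclasses import dataclass

@dataclass
class Message:
    role: str      # "system", "user", or "assistant"
    content: str

def build_few_shot_prompt(examples, query):
    """Prepend worked tool-call examples so the model can imitate
    the demonstrated argument format on the new query."""
    messages = [Message("system", "Use the `multiply` tool for arithmetic.")]
    for question, tool_call in examples:
        messages.append(Message("user", question))
        # The assistant turn shows the exact tool invocation we expect.
        messages.append(Message("assistant", tool_call))
    messages.append(Message("user", query))
    return messages

examples = [
    ("What is 3 times 7?", "multiply(a=3, b=7)"),
    ("Compute 12 x 5.", "multiply(a=12, b=5)"),
]
prompt = build_few_shot_prompt(examples, "What is 9 times 9?")
```

In practice the few-shot examples would be curated (or retrieved) failure cases with corrected tool calls, and the message list would be passed to a tool-calling model.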
Improving core tool interfaces and docs in LangChain See our latest improvements to our core tool interfaces that make it easy to turn any code into a tool, handle diverse inputs, enrich tool outputs, and handle tool errors effectively.
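To illustrate the pattern of turning arbitrary code into a tool with error handling, here is a minimal framework-free sketch (the `as_tool` and `call_tool` helpers are hypothetical stand-ins, not the LangChain API itself):

```python
import inspect
from typing import Any, Callable

def as_tool(fn: Callable) -> dict:
    """Wrap a plain function as a 'tool': expose its name, docstring,
    and parameter names as a schema a model could see."""
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "parameters": list(sig.parameters),
        "run": fn,
    }

def call_tool(tool: dict, **kwargs: Any) -> dict:
    """Invoke a tool, catching errors so a bad call becomes a structured
    message the model can recover from instead of a crashed run."""
    try:
        return {"ok": True, "output": tool["run"](**kwargs)}
    except Exception as exc:
        return {"ok": False, "error": f"{type(exc).__name__}: {exc}"}

def word_count(text: str) -> int:
    """Count whitespace-separated words in `text`."""
    return len(text.split())

tool = as_tool(word_count)
good = call_tool(tool, text="hello world")
bad = call_tool(tool, missing="oops")  # wrong argument name
```

Returning errors as data rather than raising is what lets an agent loop feed the failure back to the model for a retry.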
Announcing LangGraph v0.1 & LangGraph Cloud: Running agents at scale, reliably Our new infrastructure for running agents at scale, LangGraph Cloud, is available in beta. We also have a new stable release of LangGraph.
Aligning LLM-as-a-Judge with Human Preferences Deep dive into self-improving evaluators in LangSmith, motivated by the rise of LLM-as-a-Judge evaluators plus research on few-shot learning and on aligning evaluators with human preferences.
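One way a judge can self-improve is to store human corrections of its verdicts and feed the most recent ones back into its prompt as few-shot examples. A minimal sketch of that loop, with hypothetical helper names and the LLM call itself left out:

```python
corrections = []  # (input, model_output, human_verdict) triples

def record_correction(inp: str, out: str, verdict: str) -> None:
    """Save a human-reviewed verdict for later reuse as a few-shot example."""
    corrections.append((inp, out, verdict))

def judge_prompt(inp: str, out: str, k: int = 3) -> str:
    """Build an LLM-as-a-Judge prompt that prepends the k most recent
    human-corrected verdicts, aligning the judge with human preferences."""
    parts = ["Grade the response as 'pass' or 'fail'."]
    for ex_in, ex_out, ex_verdict in corrections[-k:]:
        parts.append(f"Input: {ex_in}\nResponse: {ex_out}\nVerdict: {ex_verdict}")
    parts.append(f"Input: {inp}\nResponse: {out}\nVerdict:")
    return "\n\n".join(parts)

record_correction("What is 2+2?", "5", "fail")
record_correction("Capital of France?", "Paris", "pass")
prompt = judge_prompt("What is 3+3?", "6")
```

As more corrections accumulate, the judge's verdicts drift toward the human raters' standards without retraining.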
How Factory used LangSmith to automate their feedback loop and improve iteration speed by 2x How Factory AI uses LangSmith to debug issues and close the product feedback loop, resulting in a 2x improvement in iteration speed.
Workspaces in LangSmith for improved collaboration and organization LangSmith activities and workflows now happen in workspaces that separate resources between teams, business units, or deployment environments.
Announcing LangSmith is now a transactable offering in the Azure Marketplace Today, we’re thrilled to announce that enterprises can purchase LangSmith in the Azure Marketplace as an Azure Kubernetes Application. LangSmith is a unified DevOps…
LangChain Integrates NVIDIA NIM for GPU-optimized LLM Inference in RAG Roughly a year and a half ago, OpenAI launched ChatGPT and the generative AI era really kicked off. Since then we’ve seen rapid growth…
Use Case Accelerant: Extraction Service Today we’re excited to announce our newest OSS use-case accelerant: an extraction service. LLMs are a powerful tool for extracting structured data from unstructured…
Reflection Agents Reflection is a prompting strategy used to improve the quality and success rate of agents and similar AI systems. This post outlines how to build 3 reflection techniques using LangGraph, including implementations of Reflexion and Language Agent Tree Search.
Rakuten Group builds with LangChain and LangSmith to deliver premium products for its business clients and employees Rakuten Group is well known for operating one of the largest online shopping malls in Japan. The company has 70+ businesses in fields such as…