Concavity Team
What RAG can't do, full-document reasoning (FCR) can, with a 4x gain in accuracy!
Through real-world case studies, this post analyzes the bottlenecks of retrieval-augmented generation (RAG) on complex tasks and introduces a new technique, full-document reasoning, that overcomes them.
Read post

Trust Infrastructure for Long Context Reasoning
Founded by researchers from Caltech, Columbia University, and Amazon
Early research release. Models and inference engine available.
RAG is great for search, but it often fails at cross-document reasoning. FCR is a reasoning runtime that constructs a usable full-context environment, then verifies grounding inline as reasoning progresses.
Read post

Attention's quadratic cost is the fundamental bottleneck of long-context inference. We propose Superlinear Multi-Step Attention: a fully trainable attention architecture that achieves O(L^{3/2}) complexity while preserving random context access.

Read post
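As a back-of-the-envelope illustration of the stated asymptotics only (not the authors' implementation, which is not described here), the gap between O(L^2) and O(L^{3/2}) at typical long-context lengths works out to a factor of sqrt(L):

```python
# Compare the operation counts implied by standard attention, O(L^2),
# with the stated O(L^{3/2}) complexity. Pure arithmetic on the
# published asymptotics; constant factors are ignored.
for L in (4_096, 65_536, 1_048_576):
    quadratic = L ** 2        # standard attention cost
    superlinear = L ** 1.5    # stated Superlinear Multi-Step Attention cost
    print(f"L={L:>9,}  O(L^2)={quadratic:.2e}  "
          f"O(L^1.5)={superlinear:.2e}  "
          f"ratio={quadratic / superlinear:.0f}x")
```

At a one-million-token context the asymptotic ratio is about 1024x, which is why the quadratic term, rather than memory or bandwidth, dominates long-context inference cost.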