Day 83 of 133
Advanced RAG: parent-child, GraphRAG, fine-tuned embeddings
Auto-merging retrievers; GraphRAG for multi-hop; when to fine-tune embeddings.
DSA · NeetCode Trees
- Binary Tree Maximum Path Sum
Interview questions to prep
- What does the recursion return vs what it updates globally? Why those two different things?
- What's the time and space complexity, and where does the space go?
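The two questions above map directly onto the standard solution shape: the recursion returns the best *downward* path gain from a node, while a nonlocal variable tracks the best path that may *bend* through a node. A minimal sketch:

```python
# Binary Tree Maximum Path Sum.
# gain(node) RETURNS the best single-branch path starting at node
# (only that can extend into the parent's path); it UPDATES `best`
# with the bent path left + node + right, which cannot extend upward.
from typing import Optional

class TreeNode:
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def max_path_sum(root: Optional[TreeNode]) -> int:
    best = float("-inf")

    def gain(node: Optional[TreeNode]) -> int:
        nonlocal best
        if not node:
            return 0
        # Clamp negative subtree gains to 0: a losing branch is dropped.
        left = max(gain(node.left), 0)
        right = max(gain(node.right), 0)
        # Global update: path bending through this node.
        best = max(best, node.val + left + right)
        # Return value: only one branch may continue to the parent.
        return node.val + max(left, right)

    gain(root)
    return best
```

Time is O(n) (each node visited once); space is O(h) for the recursion stack, where h is the tree height — O(n) worst case on a skewed tree.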
GenAI · Advanced RAG
Interview questions to prep
- When does parent-child retrieval beat naive chunked retrieval?
- What's the failure mode of auto-merging — when does it merge up to a parent chunk that's too large for the context?
Interview questions to prep
- How does GraphRAG help with multi-hop reasoning?
- What's the cost of building and maintaining the knowledge graph in GraphRAG — and when is it not worth it?
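The multi-hop advantage is easiest to see on a toy graph: extract entities from the query, then walk the knowledge graph to collect facts that *connect* entities across documents — the step flat chunk retrieval misses. The graph below is hand-built for illustration; real GraphRAG pipelines construct it with LLM-based entity/relation extraction, which is the cost the second question is probing.

```python
# Toy GraphRAG sketch: 2-hop traversal over (subject, relation,
# object) triples to assemble multi-hop context for a query entity.

TRIPLES = [  # illustrative hand-built knowledge graph
    ("Marie Curie", "worked_at", "University of Paris"),
    ("Marie Curie", "discovered", "radium"),
    ("Pierre Curie", "married_to", "Marie Curie"),
    ("radium", "used_in", "cancer therapy"),
]

def neighbors(entity: str) -> list[tuple[str, str, str]]:
    return [t for t in TRIPLES if t[0] == entity or t[2] == entity]

def multi_hop_context(seed: str, hops: int = 2) -> list[str]:
    """Breadth-first walk: collect facts reachable within `hops`."""
    frontier, seen, facts = {seed}, set(), []
    for _ in range(hops):
        next_frontier = set()
        for entity in frontier:
            for s, r, o in neighbors(entity):
                if (s, r, o) not in seen:
                    seen.add((s, r, o))
                    facts.append(f"{s} {r} {o}")
                    next_frontier.update({s, o} - frontier)
        frontier = next_frontier
    return facts
```

A question like "what medical use came from Marie Curie's discovery?" needs the second hop (radium → cancer therapy), which no single chunk mentioning Marie Curie would contain. The flip side: the graph must be extracted, deduplicated, and kept in sync with the corpus, which is often not worth it for single-hop workloads.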
Interview questions to prep
- When is it worth fine-tuning the embedding model on your domain?
- How would you build training pairs for embedding fine-tuning when you don't have labeled relevance data?
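One common answer to the second question: generate a synthetic query per chunk and treat (query, source chunk) as a positive pair, letting the rest of the batch serve as in-batch negatives (the MultipleNegativesRankingLoss pattern). The sketch below uses a crude keyword stub in place of the LLM query generator; all names are illustrative.

```python
# Building embedding fine-tuning pairs with no labeled relevance data.
# synthetic_query is a stand-in for an LLM prompt like
# "write a question this passage answers".

def synthetic_query(chunk: str) -> str:
    # Crude stub: pick the longest distinct words as a pseudo-query.
    words = sorted(set(chunk.lower().split()), key=len, reverse=True)
    return " ".join(words[:3])

def build_pairs(chunks: list[str]) -> list[tuple[str, str]]:
    """(query, positive_passage) pairs; at training time every other
    passage in the batch acts as a negative for the query."""
    return [(synthetic_query(c), c) for c in chunks]
```

The pairs feed straight into a contrastive objective; quality hinges almost entirely on how realistic the synthetic queries are, which is why the LLM generation step (not the training loop) is where the effort goes.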
References & further reading
- LangChain — RAG concepts
- Pinecone — Vector Databases Explained