Day 83 of 133

Advanced RAG: parent-child retrieval, GraphRAG, fine-tuned embeddings

Auto-merging retrievers; GraphRAG for multi-hop; when to fine-tune embeddings.

DSA · NeetCode Trees

  • Interview questions to prep

    1. What does the recursion return versus what does it update globally, and why are those two different things? (See the diameter sketch after this list.)
    2. What's the time and space complexity, and where does the space go?
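
A canonical instance of the return-vs-global split is binary-tree diameter: the recursion returns each subtree's height (what the parent call needs), while a nonlocal variable tracks the best diameter seen anywhere (what the problem asks for). A minimal sketch; the TreeNode definition is illustrative, not any particular starter code:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TreeNode:
    val: int = 0
    left: Optional["TreeNode"] = None
    right: Optional["TreeNode"] = None

def diameter_of_binary_tree(root: Optional[TreeNode]) -> int:
    best = 0  # updated "globally": longest path (in edges) found so far

    def height(node: Optional[TreeNode]) -> int:
        nonlocal best
        if node is None:
            return 0
        left_h = height(node.left)
        right_h = height(node.right)
        # Global update: the longest path through this node uses both subtrees.
        best = max(best, left_h + right_h)
        # Return value: only what the caller needs, this subtree's height.
        return 1 + max(left_h, right_h)

    height(root)
    return best
```

This also answers the complexity question: O(n) time, since each node is visited once, and the space goes to the recursion stack, O(h) for tree height h, degrading to O(n) on a skewed tree.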

GenAI · Advanced RAG

  • Interview questions to prep: parent-child and auto-merging retrieval

    1. When does parent-child retrieval beat naive fixed-size chunk retrieval?
    2. What's the failure mode of auto-merging, i.e. when does it merge up to a parent chunk that's too large to be useful? (A sketch with a size guard follows this list.)
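
A minimal sketch of the parent-child pattern, with a guard against that failure mode. The search_children and get_parent_text callables are hypothetical stand-ins for a child-chunk vector index and a parent docstore:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Chunk:
    chunk_id: str
    parent_id: str
    text: str

def parent_child_retrieve(
    query: str,
    search_children: Callable[[str, int], list[Chunk]],  # hypothetical vector search
    get_parent_text: Callable[[str], str],               # hypothetical docstore lookup
    k: int = 8,
    max_parent_chars: int = 4_000,
) -> list[str]:
    """Match the query against small, precise child chunks, but return
    their larger parent sections so the LLM sees enough context."""
    results: list[str] = []
    seen_parents: set[str] = set()
    for child in search_children(query, k):
        if child.parent_id in seen_parents:
            continue  # several hits often share one parent; dedupe
        seen_parents.add(child.parent_id)
        parent_text = get_parent_text(child.parent_id)
        # Guard against the auto-merging failure mode: a merged parent so
        # large it crowds out everything else; fall back to the child chunk.
        results.append(parent_text if len(parent_text) <= max_parent_chars else child.text)
    return results
```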
  • Interview questions to prep: GraphRAG

    1. How does GraphRAG help with multi-hop reasoning? (A toy traversal sketch follows this list.)
    2. What's the cost of building and maintaining the knowledge graph in GraphRAG, and when is it not worth it?
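
To make the multi-hop point concrete, here is a toy sketch of the retrieval half of GraphRAG (it skips the expensive half, LLM-based graph construction): seed entities are matched from the query, expanded a bounded number of hops, and the touched triples are serialized as context. Substring matching stands in for a real entity linker:

```python
import networkx as nx

# Toy knowledge graph; GraphRAG builds this offline via LLM extraction.
kg = nx.DiGraph()
kg.add_edge("Marie Curie", "Pierre Curie", relation="married_to")
kg.add_edge("Marie Curie", "radium", relation="discovered")
kg.add_edge("Pierre Curie", "Sorbonne", relation="taught_at")

def multi_hop_context(query: str, graph: nx.DiGraph, hops: int = 2) -> list[str]:
    # Naive entity linking by substring match (real systems use a linker).
    seeds = [n for n in graph.nodes if n.lower() in query.lower()]
    # Bounded k-hop expansion around every seed entity.
    undirected = graph.to_undirected(as_view=True)
    reachable: set[str] = set()
    for seed in seeds:
        reachable |= set(
            nx.single_source_shortest_path_length(undirected, seed, cutoff=hops)
        )
    # Serialize every edge whose endpoints were reached, as triples.
    return [
        f"({u}) --[{d['relation']}]--> ({v})"
        for u, v, d in graph.edges(data=True)
        if u in reachable and v in reachable
    ]

print(multi_hop_context("Where did Marie Curie's husband teach?", kg))
```

The 2-hop expansion surfaces the Pierre Curie -> Sorbonne edge even though "Sorbonne" never appears in the query, which is exactly the hop that pure embedding similarity tends to miss.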
  • Interview questions to prep: embedding fine-tuning

    1. When is it worth fine-tuning the embedding model on your domain?
    2. How would you build training pairs for embedding fine-tuning when you don't have labeled relevance data? (One common recipe is sketched below.)
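
One common answer to the training-pairs question is synthetic supervision: prompt an LLM to write a plausible query for each corpus passage, then treat each (query, passage) pair as a positive and let the rest of the batch act as negatives. A minimal sketch with the sentence-transformers library and its MultipleNegativesRankingLoss; the model name is just a common open baseline, and how you generate the queries is left to you:

```python
from sentence_transformers import SentenceTransformer, InputExample, losses
from torch.utils.data import DataLoader

def finetune_on_synthetic_pairs(pairs: list[tuple[str, str]]) -> SentenceTransformer:
    """pairs: (synthetic_query, passage) tuples, e.g. queries an LLM wrote
    for passages from your own corpus; no human relevance labels needed."""
    model = SentenceTransformer("all-MiniLM-L6-v2")
    examples = [InputExample(texts=[query, passage]) for query, passage in pairs]
    loader = DataLoader(examples, shuffle=True, batch_size=32)
    # In-batch negatives: every other passage in the batch is treated as a
    # negative for each query, so positives alone are enough supervision.
    loss = losses.MultipleNegativesRankingLoss(model)
    model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=100)
    return model
```

A sensible check before swapping the model in: hold out a slice of the synthetic pairs and compare recall@k against the base model; if the base model already retrieves well, fine-tuning is rarely worth the reindexing and serving cost.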

References & further reading