The ‘brownie recipe problem’: why LLMs must have fine-grained context to deliver real-time results

Taryn Plumb
February 4, 2026
Image credit: CleoP, made with Midjourney

Today's LLMs excel at reasoning, but they can still struggle with context. This is particularly true in real-time ordering systems like Instacart's. Instacart CTO Anirban Kundu calls it the "brownie recipe problem": it's not as simple as telling an LLM "I want to make brownies." To be truly assistive in planning the meal, the model must go beyond that simple directive to understand what's available in the user's market based on their preferences (say, organic eggs versus regular eggs) and factor in what's deliverable in their geography so the food doesn't spoil, among other critical factors. For Instacart, the challenge is balancing latency with the right mix of context to provide experiences in, ideally,…

Read more on VentureBeat