The ‘brownie recipe problem’: why LLMs must have fine-grained context to deliver real-time results
Today’s LLMs excel at reasoning, but they can still struggle with context. This is particularly true in real-time ordering systems like Instacart’s.
Instacart CTO Anirban Kundu calls it the "brownie recipe problem."
It’s not as simple as telling an LLM ‘I want to make brownies.’ To be truly assistive in planning the meal, the model must go beyond that simple directive: it has to understand what’s available in the user’s market, factor in their preferences — say, organic eggs versus regular eggs — and account for what’s deliverable in their geography so food doesn’t spoil, among other critical factors.
For Instacart, the challenge is balancing latency against the right mix of context, with the goal of delivering an experience in under one second.
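One common way to balance a strict latency budget against multiple context sources is a deadline-bounded parallel fetch: query every source concurrently, then keep only what returns within the budget and degrade gracefully on the rest. The sketch below is purely illustrative — the source names, timings, and budget are assumptions, not Instacart’s actual architecture.

```python
import concurrent.futures
import time

# Hypothetical context sources; names, payloads, and latencies are
# illustrative assumptions, not Instacart's real services.
def fetch_catalog_availability():
    time.sleep(0.05)  # simulated network latency
    return {"organic_eggs_in_stock": True}

def fetch_user_preferences():
    time.sleep(0.05)
    return {"prefers_organic": True}

def fetch_delivery_window():
    time.sleep(0.05)
    return {"same_day_available": True}

def gather_context(budget_s=0.8):
    """Fetch all context sources in parallel; keep only those that
    finish within the latency budget, dropping slow ones."""
    sources = {
        "availability": fetch_catalog_availability,
        "preferences": fetch_user_preferences,
        "delivery": fetch_delivery_window,
    }
    context = {}
    with concurrent.futures.ThreadPoolExecutor() as pool:
        futures = {pool.submit(fn): name for name, fn in sources.items()}
        done, not_done = concurrent.futures.wait(futures, timeout=budget_s)
        for fut in done:
            context[futures[fut]] = fut.result()
        for fut in not_done:
            fut.cancel()  # degrade gracefully: skip sources that missed the budget
    return context

ctx = gather_context()
```

The key design choice is that a slow source reduces context quality rather than blocking the response, which keeps the end-to-end interaction inside the sub-second target the article describes.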
“If reasoning itself takes 15 seconds, and if every interaction is that slow, you're gonna lose the user,” Kundu said at a recent ...
Copyright of this story solely belongs to VentureBeat.

