Large media company elevates user experience with AI-powered recipe search and conversation

About our client

Our client is Australia’s number one food media brand and ultimate kitchen resource, offering over 50,000 reliable, triple-tested recipes to home cooks. The platform continually evolves to streamline the weekly grocery planning, shopping, and cooking journey for its audience.

The challenge: When 50,000 recipes create search friction

As our client’s recipe library grew, the challenge of efficiently connecting users with the right recipe became increasingly complex.

Traditional keyword search struggled to interpret nuanced requests involving specific ingredients, dietary needs, or desired cooking times, potentially leading to user frustration and hindering our client’s goal of getting people eating better, tastier food.

AI recipe finder: How LLM and RAG delivered highly relevant recipes

Mantel partnered with our client to solve these search challenges, testing several strategies together. The core of the collaboration was an AI-powered search combining LLM-based vector search with Retrieval Augmented Generation (RAG). This allowed users to easily source recipes based on defined requirements such as ingredients, cuisine, portion size, or time to cook.
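At its simplest, vector search embeds both the user's query and every recipe into the same vector space, then ranks recipes by similarity. The sketch below illustrates the idea with a toy bag-of-words embedding and cosine similarity; the recipe strings, function names, and the embedding itself are illustrative stand-ins, not the production system, which would use a learned embedding model and a dedicated vector database.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words "embedding". A production system would call a
    # learned embedding model here instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Illustrative stand-in for the pre-processed recipe corpus.
RECIPES = [
    "30 minute chicken stir fry with vegetables",
    "slow cooked beef ragu pasta",
    "vegetarian lentil curry with rice",
]

def search(query: str, k: int = 2) -> list[str]:
    # Rank all recipes by similarity to the query and return the top k.
    scored = sorted(RECIPES, key=lambda r: cosine(embed(query), embed(r)),
                    reverse=True)
    return scored[:k]

print(search("quick chicken dinner with vegetables", k=1))
# → ['30 minute chicken stir fry with vegetables']
```

In the real system the top-k results both answer the query directly and form the grounding context for the conversational layer.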

Furthermore, the solution incorporated Gemini to give users a conversational capability, carefully limiting the LLM’s context to only the recipes found during the search. The joint solution performed strongly, achieving a median latency of approximately 3 seconds and a cost of roughly $0.004 per query while delivering highly relevant recipe results. The RAG component was underpinned by 48,000 pre-processed recipes loaded into the vector database.
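Limiting the LLM's context to the search results is the core of the RAG pattern: the prompt sent to the model contains only the retrieved recipes, so answers stay grounded in the catalogue. The sketch below shows one plausible way to assemble such a prompt; the function name and wording are assumptions, and the actual call to Gemini is omitted.

```python
def build_grounded_prompt(question: str, retrieved: list[str]) -> str:
    """Assemble a prompt whose context is restricted to the retrieved
    recipes, so the model cannot draw on anything outside the search
    results. Hypothetical helper for illustration only."""
    context = "\n".join(f"- {r}" for r in retrieved)
    return (
        "Answer the user's question using ONLY the recipes listed below. "
        "If the answer is not in them, say you don't know.\n\n"
        f"Recipes:\n{context}\n\n"
        f"Question: {question}"
    )

prompt = build_grounded_prompt(
    "Which of these can I cook in under an hour?",
    ["30 minute chicken stir fry", "slow cooked beef ragu (4 hours)"],
)
# `prompt` would then be sent to the conversational model (Gemini in
# this project); the model call itself is not shown here.
print(prompt)
```

Constraining the context this way also keeps per-query token counts, and therefore cost and latency, low and predictable.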

The outcome

With a focus on a class-leading experience and technology, the joint Mantel and client team deliberately took an experimental approach, allowing multiple techniques to be quickly tested and validated. Model selection and evaluation were primary focus areas, driven by the rapid evolution of LLMs and strict latency requirements. Testing was comprehensive, spanning end-to-end and component-specific tests, with LLMs used to score relevancy where applicable.
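An evaluation harness for this kind of system typically runs a set of test queries, records per-query latency, and scores result relevancy. The sketch below shows that shape; the keyword-overlap judge is a runnable stand-in for the LLM-based relevancy scoring the team used, and all names and test cases are illustrative.

```python
import statistics
import time

def judge_relevance(query: str, result: str) -> int:
    # Stand-in judge: the real evaluation used an LLM to score
    # relevancy; a keyword-overlap heuristic keeps this sketch runnable.
    q, r = set(query.lower().split()), set(result.lower().split())
    return 1 if q & r else 0

def evaluate(cases: list[tuple[str, str]]) -> tuple[float, float]:
    """Return (median latency in seconds, mean relevancy score)."""
    latencies, scores = [], []
    for query, result in cases:
        start = time.perf_counter()
        # In the real harness this would invoke the full search
        # pipeline, then pass the output to the judge.
        scores.append(judge_relevance(query, result))
        latencies.append(time.perf_counter() - start)
    return statistics.median(latencies), sum(scores) / len(scores)

median_latency, relevancy = evaluate([
    ("chicken curry", "easy chicken curry"),
    ("vegan dessert", "chocolate vegan brownie dessert"),
])
print(f"median latency: {median_latency:.6f}s, relevancy: {relevancy:.2f}")
```

Tracking the median rather than the mean latency matches how the project reported its ~3 second figure and is less sensitive to occasional slow outliers.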

The solution was released to the public in November 2024, giving Australians a simple and intuitive way to access our client’s range of recipes, exclusively through their app.