Grounding with Google Search and Vertex AI Search

This blog post explains why grounding matters for large language models (LLMs) and how Vertex AI's Grounding with Google Search can help with very little effort.

Grounding a model's output in data sources it can access reduces content fabrication (hallucinations).

Generative AI models produce new content based on learned patterns; on their own, they cannot give time-sensitive factual answers such as weather forecasts.
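To show how little effort is required, here is a minimal sketch of grounding a Gemini model with Google Search through the Vertex AI Python SDK. The project ID, region, model name, and prompt are placeholders, not values from this post.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

# Placeholder project and region; replace with your own values.
vertexai.init(project="your-project-id", location="us-central1")

# Grounding with Google Search is attached to the request as a tool.
search_tool = Tool.from_google_search_retrieval(grounding.GoogleSearchRetrieval())

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "What is the weather forecast for Zurich tomorrow?",
    tools=[search_tool],
)

# Grounded responses carry metadata pointing at the supporting sources.
print(response.text)
print(response.candidates[0].grounding_metadata)
```

The only change compared to a plain generation call is the extra tool passed with the request; the model decides when to consult Google Search and returns grounding metadata alongside the answer.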

Vertex AI Search can also be used as a data store, so that language models are grounded in your own text data.

With Vertex AI Search, you bring in your own data, regardless of format, to refine the model's output.

Connecting the model to a Vertex AI Search data store improves the accuracy of grounded responses and tailors them to your use case, as shown in the sketch below.
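As a sketch of grounding to your own data, the snippet below points the same SDK call at a Vertex AI Search data store instead of Google Search. It assumes the data store already exists in your project; the resource path and prompt are placeholders.

```python
import vertexai
from vertexai.generative_models import GenerativeModel, Tool, grounding

vertexai.init(project="your-project-id", location="us-central1")

# Placeholder resource path of an existing Vertex AI Search data store.
datastore = (
    "projects/your-project-id/locations/global/"
    "collections/default_collection/dataStores/your-datastore-id"
)

# Ground the model in your own data via a retrieval tool.
retrieval_tool = Tool.from_retrieval(
    grounding.Retrieval(grounding.VertexAISearch(datastore=datastore))
)

model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content(
    "Summarize our return policy for damaged items.",
    tools=[retrieval_tool],
)
print(response.text)
```

Swapping the grounding source is just a matter of swapping the tool, which keeps the application code the same whether you ground on the public web or on private data.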

Vertex AI Search uses a per-request pricing model.

These per-request charges are the main cost of Vertex AI Search, though additional fees may apply for the infrastructure behind your API queries.