BigQuery enhancements to boost gen AI inference
People often think of BigQuery in the context of data warehousing and analytics, but it is a crucial part of the AI ecosystem as well. And today, we're excited to share significant performance improvements to BigQuery that make it even easier to extract insights from your data with generative AI.
In addition to native model inference, where computation takes place entirely in BigQuery, we offer several batch-oriented generative AI capabilities that combine distributed execution in BigQuery with remote LLM inference on Vertex AI, through functions such as:
- ML.GENERATE_TEXT to generate text via Gemini, Google-hosted partner LLMs (e.g., Anthropic Claude, Llama), or open-source LLMs
- ML.GENERATE_EMBEDDING to generate text or multimodal embeddings
- AI.GENERATE_TABLE to generate structured tabular data via LLMs and their constrained decoding capabilities
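As a sketch of the batch pattern, the query below calls ML.GENERATE_TEXT over every row of a table; the model and table names (`mydataset.gemini_model`, `mydataset.reviews`, column `review`) are hypothetical, and a remote model pointing at a Vertex AI endpoint must already exist in the dataset:

```sql
-- Batch inference: one LLM call per input row, executed via Vertex AI.
-- `mydataset.gemini_model` and `mydataset.reviews` are placeholder names.
SELECT
  prompt,
  ml_generate_text_llm_result
FROM ML.GENERATE_TEXT(
  MODEL `mydataset.gemini_model`,
  (SELECT review AS prompt FROM `mydataset.reviews`),
  STRUCT(
    0.2  AS temperature,
    256  AS max_output_tokens,
    TRUE AS flatten_json_output   -- return plain text instead of raw JSON
  )
);
```

The input subquery must expose a column named `prompt`; BigQuery fans the rows out to the remote model and returns the generated text alongside the original row.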
In addition to these table-valued functions, you can use row-wise functions such as AI.GENERATE ...
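Row-wise functions can be called inline in a SELECT list rather than wrapping the whole query. A minimal sketch, assuming a hypothetical connection (`us.example_connection`), table, and column names:

```sql
-- Row-wise inference: AI.GENERATE is invoked per row like an ordinary
-- scalar function. Connection and table names are placeholders.
SELECT
  city,
  AI.GENERATE(
    ('Give a one-line description of the city ', city),
    connection_id => 'us.example_connection',
    endpoint      => 'gemini-2.0-flash'
  ).result AS description
FROM `mydataset.cities`;
```

Because the call returns a struct, `.result` extracts the generated text; this style composes naturally with filters, joins, and other expressions in the same query.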
Copyright of this story belongs to the Google Cloud blog.