To enhance AI applications with Vertex AI Prompt Optimizer and Gemini, configure the optimizer with your prompt template, sample prompts, and system instructions, and select a Gemini model as the optimization target.

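As a minimal sketch of such a configuration (the project ID, bucket, machine type, container image URI, and exact flag spellings below are assumptions; take the real values from the current Vertex AI Prompt Optimizer documentation before running anything):

```python
PROJECT_ID = "my-project"      # hypothetical project ID
LOCATION = "us-central1"
BUCKET = "gs://my-bucket"      # hypothetical staging bucket

# Optimization parameters described below; the optimizer job
# receives them as command-line flags.
params = {
    "system_instruction": "Answer the question concisely and accurately.",
    "prompt_template": "Question: {question}\nAnswer:",
    "target_model": "gemini-1.5-flash-001",          # assumed model version
    "eval_metrics_types": '["summarization_quality"]',
    "optimization_mode": "instruction_and_demo",      # optimize both parts
    "input_data_path": f"{BUCKET}/input/prompts.jsonl",
    "output_data_path": f"{BUCKET}/output/",
}

def build_args(p):
    """Turn the parameter dict into --key=value flags for the job."""
    return [f"--{key}={value}" for key, value in p.items()]

def submit_job(p):
    """Create and run the optimization job as a Vertex AI custom job."""
    # Imported here so the sketch stays importable without the SDK installed.
    from google.cloud import aiplatform

    aiplatform.init(project=PROJECT_ID, location=LOCATION)
    job = aiplatform.CustomJob(
        display_name="prompt-optimizer-job",
        worker_pool_specs=[{
            "machine_spec": {"machine_type": "n1-standard-4"},
            "replica_count": 1,
            "container_spec": {
                # Placeholder image URI; use the optimizer image from the docs.
                "image_uri": "us-docker.pkg.dev/<optimizer-image>",
                "args": build_args(p),
            },
        }],
        staging_bucket=BUCKET,
    )
    job.run()  # blocks until the optimization job finishes

if __name__ == "__main__":
    print(build_args(params))   # inspect the flags before submitting
    # submit_job(params)        # uncomment once credentials are configured
```

This keeps the job submission behind a function, so you can review the generated flags locally before launching a billable job.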
This configuration relies on the following key points:
- Initialization: sets up the Vertex AI client with the specified project and location.
- Optimization parameters:
  - system_instruction: guides the model to provide concise and accurate answers.
  - prompt_template: defines the structure of the prompt, where {question} is a placeholder for the input.
  - target_model: specifies the Gemini model version to optimize for.
  - eval_metrics_types: sets the evaluation metric(s), such as summarization_quality.
  - optimization_mode: indicates whether the system instruction, the sample prompts (demonstrations), or both will be optimized.
  - input_data_path and output_data_path: the Google Cloud Storage paths for the input sample prompts and the optimized output, respectively.
- Prompt optimization job: creates and runs a prompt optimization job with the specified parameters.
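The sample prompts referenced by input_data_path are typically a JSONL file with one record per line, whose keys match the placeholders in the prompt template. A sketch of what such a file might look like (the field names here, including the "target" ground-truth column, are assumptions; match them to your template and metric requirements):

```json
{"question": "What is Vertex AI?", "target": "Vertex AI is Google Cloud's managed machine learning platform."}
{"question": "Which models can the Prompt Optimizer target?", "target": "Gemini models available on Vertex AI."}
```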
Hence, by configuring Vertex AI Prompt Optimizer with appropriate parameters and selecting Gemini as the target model, you can systematically improve prompt performance in your AI applications.