What's a best practice to follow when designing an experiment to test Performance Max campaigns?

Create a new CPA or ROAS goal for Performance Max campaigns to achieve.

Optimize Performance Max to a comparable CPA or ROAS target to other campaigns.

Run the Performance Max experiment for one to two weeks before evaluating results.

Keep Performance Max budget limited to 10% of other campaigns' budgets.

Explanation

Analysis of Correct Answer(s)

  • Optimize Performance Max to a comparable CPA or ROAS target to other campaigns.
    • This is a best practice because it ensures a fair, meaningful comparison. If the goal is to evaluate Performance Max's efficiency and effectiveness relative to existing campaigns or overall marketing goals, it must be held to similar performance standards. Setting a comparable CPA (cost per acquisition) or ROAS (return on ad spend) target lets you assess whether PMax can achieve your desired business outcomes across Google's full inventory, potentially with improved efficiency or scale, without skewing results by setting an easier or harder target.

Analysis of Incorrect Options

  • Run the Performance Max experiment for one to two weeks before evaluating results.

    • This is generally too short for an effective experiment. Performance Max campaigns, like many automated Google Ads campaign types, require a significant learning period to gather data, optimize bids, and interpret audience signals across channels. A minimum of 3–4 weeks, and often longer (e.g., six weeks or a full conversion cycle), is recommended so the campaign can stabilize, exit the learning phase, and produce statistically significant results. Evaluating too early leads to premature, inaccurate conclusions.
  • Create a new CPA or ROAS goal for Performance Max campaigns to achieve.

    • While you technically can create a new goal, experiment design (especially for comparison) should emphasize a comparable target. Creating an entirely new or different goal without reference to existing campaign performance makes it difficult to determine whether PMax is truly performing better or worse than your current strategy, or whether it can meet your established efficiency requirements. The point of an experiment is often to see whether PMax can meet existing targets more efficiently or at greater scale.
  • Keep Performance Max budget limited to 10% of other campaigns' budgets.

    • Limiting the budget this drastically can starve the campaign of the resources it needs to learn, scale, and deliver meaningful performance during an experiment. Performance Max is designed to find conversions across a vast inventory, and a severely restricted budget hinders its ability to explore those opportunities, collect sufficient data, and optimize effectively. Controlled budgeting is wise, but a hard 10% cap relative to all other campaigns' budgets is likely too low to obtain representative results or let the campaign reach its potential during the test period.