FAQ

Do I need to create a campaign to get recommendations?

No, creating a campaign is not mandatory. BayBE offers two entry points for generating recommendations: the stateful Campaign object, which keeps track of your experimentation history for you, and the stateless recommender interface, which computes recommendations directly from the data you pass in.

For more details on when to choose one method over the other, see here.

BayBE recommends A but experimentalists do B. What now?

Don’t panic and grab your towel. Recommendations from BayBE are just … well, “recommendations”. The measurements you feed back to BayBE need not be related to the original recommendation in any way. In fact, requesting recommendations and adding data are two separate actions, and there is no formal requirement to perform these actions in any particular order, nor to “respond” to recommendations in any form.

Note, however, that subsequent recommendations may be affected by earlier steps in your campaign, depending on your settings for the allow_recommending_already_measured and allow_recommending_already_recommended flags.

Checklist for Designing BayBE Optimization Campaigns

This checklist collects common questions that you need to ask yourself when designing a BayBE optimization campaign. It also provides documentation references that will help you with the corresponding setup.

Note that this is not a comprehensive guide of all BayBE’s functionalities, but rather a “quick start” meant to help you with the most basic design principles.

Defining Targets

Should the target value be maximized, minimized, or matched to a specific value?

Specify this when defining the target.

Should multiple targets be optimized simultaneously?

See how to use multi-target objectives.

Defining the Parameter Search Space

Are only some parameter values of interest/possible?

See how to exclude some parameter values from being recommended, such as by defining bounds for continuous parameters or active values for discrete parameters.

Are only some parameter combinations of interest/possible?

See how to exclude some parameter combinations from being considered by using constraints or constrained searchspaces. Alternatively, if the aim is to use only a few specific parameter configurations the search space can be created from a dataframe rather than from the product of all possible parameter combinations.

Are some parameters non-numeric, or do they only take discrete numeric values?

Use discrete rather than continuous parameters.

Is it possible to encode discrete parameters based on domain knowledge to capture relationships between categories (e.g., ordered values, molecular fingerprints, model embeddings)?

See how to encode discrete parameters or provide custom encodings.

Accounting for Data Availability and the Data Acquisition Procedure

Is no prior data available, and should the experiments be done in batches?

Use clustering or sampling recommenders to diversify the first batch of parameter settings to be tested.

Is additional data from historic or other partially related experiments available?

See how to incorporate such data via transfer learning.

Will the outcome measurements of different parameter settings become available at different times?

See how to flag pending experiments in asynchronous workflows.

Advanced: Adjust how Recommendations are Prioritized

Is the aim to reduce the overall uncertainty across different regions of the search space rather than to optimize a specific objective?

See how to use active learning with uncertainty-based acquisition functions.