Many modern machine learning algorithms have a large number of hyperparameters. To effectively use these algorithms, we need to pick good hyperparameter values.

In this article, we talk about Bayesian Optimization, a suite of techniques often used to tune hyperparameters. More generally, Bayesian Optimization can be used to optimize any black-box function.

Let us start with the example of gold mining. Our goal is to mine for gold in an unknown land. Interestingly, our example is similar to one of the first uses of Gaussian Processes (also called kriging), where Prof. Krige modeled gold concentrations using a Gaussian Process.

For now, we assume that the gold is distributed along a line. We want to find the location along this line with the maximum gold while only drilling a few times (as drilling is expensive). Let us suppose that the gold distribution f(x) looks something like the function below. It is bi-modal, with a maximum value around x = 5. For now, let us not worry about the X-axis or the Y-axis units.

Initially, we have no idea about the gold distribution. We can learn the gold distribution by drilling at different locations. Since drilling is expensive, we want to minimize the number of drillings required while still finding the location of maximum gold quickly.

We now discuss two common objectives for the gold mining problem.

Problem 1: Best Estimate of Gold Distribution (Active Learning). In this problem, we want to accurately estimate the gold distribution on the new land. We cannot drill at every location due to the prohibitive cost. Instead, we should drill at locations providing high information about the gold distribution.

Problem 2: Location of Maximum Gold (Bayesian Optimization). In this problem, we want to find the location of the maximum gold content. We, again, cannot drill at every location. Instead, we should drill at locations showing high promise about the gold content.

For many machine learning problems, unlabeled data is readily available. However, labeling (or querying) is often expensive. We will soon see how these two problems are related, but not the same.
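To make the "drill where it looks promising" idea concrete, here is a minimal, self-contained sketch of a Bayesian optimization loop: a Gaussian Process surrogate fit to the drillings so far, and an Upper Confidence Bound rule to pick the next location. Everything here is illustrative rather than from the article: the bi-modal function `f` (built to peak near x = 5), the RBF length scale, the UCB coefficient, and the grid of candidate locations are all assumed for the sake of the example.

```python
import numpy as np

def f(x):
    """Hypothetical bi-modal 'gold distribution' with its maximum near x = 5."""
    return np.exp(-(x - 2.0) ** 2) + 2.0 * np.exp(-(x - 5.0) ** 2 / 0.5)

def rbf_kernel(a, b, length_scale=1.0):
    # Squared-exponential kernel between two 1-D arrays of locations.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard zero-mean GP regression: posterior mean and std at x_query.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_query)
    K_ss = rbf_kernel(x_query, x_query)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    var = np.diag(K_ss - K_s.T @ K_inv @ K_s)
    return mu, np.sqrt(np.maximum(var, 0.0))

grid = np.linspace(0.0, 6.0, 300)     # candidate drilling locations
x_obs = np.array([0.5, 3.0])          # two initial "drillings"
y_obs = f(x_obs)

for _ in range(8):
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    ucb = mu + 2.0 * sigma            # favor high mean OR high uncertainty
    x_next = grid[np.argmax(ucb)]     # next place to "drill"
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, f(x_next))

best = x_obs[np.argmax(y_obs)]        # best location found; near x = 5 here
```

With only ten evaluations in total, the loop homes in on the larger mode: early picks go to high-uncertainty regions (exploration), and once a promising drilling is found, later picks cluster around it (exploitation). This mirrors the contrast drawn above: an active-learning variant would maximize `sigma` alone, whereas Bayesian optimization trades `mu` off against `sigma`.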