Gaussian Process Regression on Synthetic Data Using the scikit Library

I’ve been looking at Gaussian process regression (GPR) recently. GPR is mathematically complex, and there are several libraries that implement it.

Overall, the scikit library has pretty good documentation. But the documentation for the scikit GaussianProcessRegressor module is quite weak in my opinion. So, I decided to spend a few hours exploring.

First, I created some synthetic data to work with. I used a PyTorch 5-10-1 neural network to generate 200 training items and 40 test items. Each item has five predictor values, each between -1.0 and +1.0, followed by the target value to predict, which is between 0.0 and 1.0. The data looks like:

-0.1660   0.4406  -0.9998  -0.3953  -0.7065  0.1672
 0.0776  -0.1616   0.3704  -0.5911   0.7562  0.6022
-0.9452   0.3409  -0.1654   0.1174  -0.7192  0.2288
 0.9365  -0.3732   0.3846   0.7528   0.7892  0.0379
. . .
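The data-generation code isn't shown in this post. As a rough stand-in for the 5-10-1 network idea, here is a minimal NumPy sketch with made-up random weights, a tanh hidden layer, and a sigmoid output. The weights and activation functions are assumptions; the actual PyTorch network used isn't given.

```python
import numpy as np

rng = np.random.default_rng(1)

# made-up random weights for a 5-10-1 network (assumption: the real
# generator's weights and activations are not given in the post)
W1 = rng.uniform(-1, 1, size=(5, 10)); b1 = rng.uniform(-1, 1, size=10)
W2 = rng.uniform(-1, 1, size=(10, 1)); b2 = rng.uniform(-1, 1, size=1)

def make_items(n):
  X = rng.uniform(-1.0, 1.0, size=(n, 5))     # five predictors in [-1, +1]
  h = np.tanh(X @ W1 + b1)                    # hidden layer
  y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output in (0, 1)
  return X, y.ravel()

train_X, train_y = make_items(200)
test_X, test_y = make_items(40)
```

Because the output passes through a sigmoid, every generated target lands strictly between 0.0 and 1.0, matching the format of the data above.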

Note: Gaussian process regression was designed to work with strictly numeric data. However, GPR can be used with categorical predictor variables by using one-hot encoding. For example, if you have a variable Color with possible values red, blue, green, then you could encode red = 100, blue = 010, green = 001.
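The one-hot idea takes only a few lines of NumPy. This is just a sketch with a hypothetical Color column; scikit also provides a OneHotEncoder class in its preprocessing module for the same job.

```python
import numpy as np

categories = ["red", "blue", "green"]
colors = ["red", "blue", "green", "blue"]   # a hypothetical data column

# red = (1,0,0), blue = (0,1,0), green = (0,0,1)
one_hot = np.array([[1.0 if c == cat else 0.0 for cat in categories]
                    for c in colors])
print(one_hot)
```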

Note: In theory, GPR must invert a kernel matrix that has size num_train x num_train. So for 200 training items, the matrix would be 200 x 200, which is pretty big. In practice, the scikit GPR implementation uses engineering tricks (a Cholesky factorization) that exploit the fact that the kernel matrix is symmetric and positive definite. All that said, training cost still grows roughly as the cube of the number of training items, so GPR doesn’t scale well to very large datasets.
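You can look at the kernel matrix directly because scikit kernel objects are callable. A quick sketch, where the 200 x 5 data is random and just stands in for real training data:

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))    # stand-in for 200 training items

krnl = RBF(length_scale=1.0)
K = krnl(X)                              # the 200 x 200 kernel matrix
print(K.shape)                           # (200, 200)
print(np.allclose(K, K.T))               # True: symmetric
mem_mb = K.nbytes / (1024 * 1024)        # about 0.3 MB here; grows as n^2
```

At 200 items the matrix is small, but because memory grows quadratically and solve time cubically, a dataset 100x larger would be a very different story.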

The main challenge when working with GPR is picking a good kernel function and a good set of parameter(s) for the kernel function. After a lot of experimentation, I ended up using:

  print("Creating GPR model with RBF(1.0) kernel ")
  krnl = RBF(length_scale=1.0, length_scale_bounds="fixed") 
  gpr_model = GaussianProcessRegressor(kernel=krnl,
    normalize_y=False, random_state=1, alpha=0.001)
  print("Done ")

There are several things going on here. The length_scale_bounds="fixed" argument to the RBF kernel instructs the module to NOT attempt to programmatically optimize the value of the length_scale parameter. That’s a big discussion that’s outside the scope of this blog post. The alpha=0.001 value controls regularization. GPR is extremely sensitive to the choice of kernel function, the kernel’s parameters, and the value of alpha.
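For contrast, if you leave the length_scale bounds at their default values, fit() tunes length_scale by maximizing the log marginal likelihood, and you can inspect the result afterwards via the fitted kernel_ attribute. A sketch on random stand-in data (not the demo data):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(50, 5))    # stand-in data, not the demo data
y = np.sin(X.sum(axis=1))               # toy target

# no "fixed" bounds: fit() optimizes length_scale automatically
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
  alpha=0.001, random_state=1)
gpr.fit(X, y)
print(gpr.kernel_)                      # shows the tuned length_scale
print(gpr.log_marginal_likelihood_value_)
```

Automatic tuning sounds convenient, but the optimized length_scale maximizes marginal likelihood on the training data, which is not the same thing as maximizing accuracy on held-out test data.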

Anyway, my demo model scored 97.50% accuracy on the training data (195 out of 200 correct) and 75.00% accuracy on the test data (30 out of 40 correct). I defined a correct prediction as one that’s within 10% of the true target value.

GPR is highly susceptible to overfitting. When GPR works, it can work very well. But when GPR doesn’t work, the results are often very poor.
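One GPR feature the demo doesn’t use: predict() can return a standard deviation along with each predicted mean, which gives a built-in measure of how confident the model is. A sketch on stand-in data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(100, 5))   # stand-in training data
y = np.sin(X.sum(axis=1))               # toy target

krnl = RBF(length_scale=1.0, length_scale_bounds="fixed")
gpr = GaussianProcessRegressor(kernel=krnl, alpha=0.001, random_state=1)
gpr.fit(X, y)

x_new = np.array([[-0.5, 0.5, -0.5, 0.5, -0.5]])
mean, std = gpr.predict(x_new, return_std=True)
print("pred = %0.4f  std = %0.4f" % (mean[0], std[0]))
```

A large standard deviation for an input far from any training item is one way to detect when the model is extrapolating and shouldn’t be trusted.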



One of the advantages of using synthetic data for machine learning experiments is that different forms of data can be designed for different purposes. I suppose that synthetic hair has the same characteristic — I’m assuming that these three interesting examples are in fact synthetic and not real hair.


Demo code.

# synthetic_gpr.py
# scikit Gaussian process regression on synthetic data

# Anaconda3-2022.10  Python 3.9.13
# scikit 1.0.2  Windows 10/11 

import numpy as np
import pickle
from sklearn.gaussian_process import GaussianProcessRegressor

from sklearn.gaussian_process.kernels import RBF
# RBF, ConstantKernel, Matern, RationalQuadratic,
# ExpSineSquared, DotProduct

# -----------------------------------------------------------

def accuracy(model, data_X, data_y, pct_close):
  # correct within pct of true target 
  n_correct = 0; n_wrong = 0

  for i in range(len(data_X)):
    X = data_X[i].reshape(1, -1)  # one-item batch
    y = data_y[i]
    pred = model.predict(X)       # predicted target value

    if np.abs(pred - y) < np.abs(pct_close * y):
      n_correct += 1
    else:
      n_wrong += 1
  acc = (n_correct * 1.0) / (n_correct + n_wrong)
  return acc

# -----------------------------------------------------------

def root_mse(model, data_X, data_y):
  preds = model.predict(data_X)
  ss = np.sum((preds - data_y)**2)  # sum of squared errors
  return np.sqrt(ss / len(data_y))

# -----------------------------------------------------------

def main():
  # 0. prepare
  print("\nBegin scikit Gaussian process regression ")
  print("Predict synthetic data ")
  np.random.seed(1)
  np.set_printoptions(edgeitems=5, linewidth=100,
    sign=" ", formatter={'float': '{: 7.4f}'.format})

# -----------------------------------------------------------

  # 1. load data
  print("\nLoading 200 train and 40 test data for GPR ")
  train_file = ".\\Data\\synthetic_train.txt"
  train_X = np.loadtxt(train_file, delimiter="\t", 
    usecols=(0,1,2,3,4),
    comments="#", dtype=np.float64)
  train_y = np.loadtxt(train_file, delimiter="\t", 
    usecols=5, comments="#", dtype=np.float64) 

  test_file = ".\\Data\\synthetic_test.txt"
  test_X = np.loadtxt(test_file, delimiter="\t",
    usecols=(0,1,2,3,4),
    comments="#", dtype=np.float64)
  test_y = np.loadtxt(test_file, delimiter="\t",
    usecols=5, comments="#", dtype=np.float64) 
  print("Done ")

  print("\nFirst four X data: ")
  print(train_X[0:4,:])
  print(". . .")
  print("\nFirst four targets: ")
  print(train_y[0:4])
  print(". . .")

# -----------------------------------------------------------

  # 2. create and train GPR model
  print("\nCreating GPR model with RBF(1.0) kernel ")

  # GaussianProcessRegressor(kernel=None, *, alpha=1e-10,
  #  optimizer='fmin_l_bfgs_b', n_restarts_optimizer=0,
  #  normalize_y=False, copy_X_train=True, random_state=None)
  #
  # default: ConstantKernel(1.0, constant_value_bounds="fixed")
  #  * RBF(1.0, length_scale_bounds="fixed")
  # scikit-learn.org/stable/modules/gaussian_process.html

  # 2a. explore hyperparameters
  scales = [0.1, 0.5, 1.0, 1.5, 2.0]
  alphas = [0.0, 0.0001, 0.001, 0.01, 0.1]

  for s in scales:
    for a in alphas:
      krnl = RBF(length_scale=s, length_scale_bounds="fixed")
      gpr_model = GaussianProcessRegressor(kernel=krnl,
        normalize_y=False, random_state=1, alpha=a)
      gpr_model.fit(train_X, train_y)
      train_acc = accuracy(gpr_model, train_X, train_y, 0.10)
      train_rmse = root_mse(gpr_model, train_X, train_y)
      test_acc = accuracy(gpr_model, test_X, test_y, 0.10)
      test_rmse = root_mse(gpr_model, test_X, test_y)
      print("scale = %0.1f alpha = %0.4f |  train acc = %0.4f \
 rmse = %0.4f  test acc = %0.4f  rmse = %0.4f" % \
(s, a, train_acc, train_rmse, test_acc, test_rmse))

  krnl = RBF(length_scale=1.0, length_scale_bounds="fixed") 
  gpr_model = GaussianProcessRegressor(kernel=krnl,
    normalize_y=False, random_state=1, alpha=0.001)
  print("Done ")

  print("\nTraining model ")
  gpr_model.fit(train_X, train_y)
  print("Done ")

# -----------------------------------------------------------

  # 3. compute model accuracy
  print("\nComputing accuracy (within 0.10 of true) ")
  acc_train = accuracy(gpr_model, train_X, train_y, 0.10)
  print("\nAccuracy on train data = %0.4f " % acc_train)
  acc_test = accuracy(gpr_model, test_X, test_y, 0.10)
  print("Accuracy on test data = %0.4f " % acc_test)

  # 4. use model to predict 
  x = np.array([[-0.5, 0.5, -0.5, 0.5, -0.5]],
    dtype=np.float64)
  print("\nPredicting for x = ")
  print(x)
  y_pred = gpr_model.predict(x)
  print("\nPredicted y = %0.4f " % y_pred)

  # 5. save model
  print("\nSaving trained GPR model ")
  fn = ".\\Models\\gpr_model.pkl"
  with open(fn,'wb') as f:
    pickle.dump(gpr_model, f)

  # load and use model
  # path = ".\\Models\\gpr_model.pkl"
  # with open(path, 'rb') as f:
  #   loaded_model = pickle.load(f)
  # X = (set values for X)
  # y_pred = loaded_model.predict(X)

  print("\nEnd GPR prediction ")

# -----------------------------------------------------------

if __name__ == "__main__":
  main()

Training data. Replace commas with tabs, or modify program.

# synthetic_train.txt
# five predictors in [-1, +1] then target in [0, 1]
# 200 items (with 40 in synthetic_test.txt)
#
-0.1660,0.4406,-0.9998,-0.3953,-0.7065,0.1672
0.0776,-0.1616,0.3704,-0.5911,0.7562,0.6022
-0.9452,0.3409,-0.1654,0.1174,-0.7192,0.2288
0.9365,-0.3732,0.3846,0.7528,0.7892,0.0379
-0.8299,-0.9219,-0.6603,0.7563,-0.8033,0.2519
0.0663,0.3838,-0.3690,0.3730,0.6693,0.1069
-0.9634,0.5003,0.9777,0.4963,-0.4391,0.1664
-0.1042,0.8172,-0.4128,-0.4244,-0.7399,0.1558
-0.9613,0.3577,-0.5767,-0.4689,-0.0169,0.5935
-0.7065,0.1786,0.3995,-0.7953,-0.1719,0.7598
0.3888,-0.1716,-0.9001,0.0718,0.3276,0.1770
0.1731,0.8068,-0.7251,-0.7214,0.6148,0.2912
-0.2046,-0.6693,0.8550,-0.3045,0.5016,0.4202
0.2473,0.5019,-0.3022,-0.4601,0.7918,0.3491
-0.1438,0.9297,0.3269,0.2434,-0.7705,0.0823
0.1568,-0.1837,-0.5259,0.8068,0.1474,0.0665
-0.9943,0.2343,-0.3467,0.0541,0.7719,0.5256
0.2467,-0.9684,0.8589,0.3818,0.9946,0.0878
-0.6553,-0.7257,0.8652,0.3936,-0.8680,0.2929
0.8460,0.4230,-0.7515,-0.9602,-0.9476,0.1252
-0.9434,-0.5076,0.7201,0.0777,0.1056,0.4330
0.9392,0.1221,-0.9627,0.6013,-0.5341,0.0265
0.6142,-0.2243,0.7271,0.4942,0.1125,0.0469
0.4260,0.1194,-0.9749,-0.8561,0.9346,0.5281
0.1362,-0.5934,-0.4953,0.4877,-0.6091,0.1092
0.6937,-0.5203,-0.0125,0.2399,0.6580,0.1057
-0.6864,-0.9628,-0.8600,-0.0273,0.2127,0.5779
0.9772,0.1595,-0.2397,0.1019,0.4907,0.1556
0.3385,-0.4702,-0.8673,-0.2598,0.2594,0.3073
-0.8669,-0.4794,0.6095,-0.6131,0.2789,0.8373
0.0493,0.8496,-0.4734,-0.8681,0.4701,0.3925
0.8639,-0.9721,-0.5313,0.2336,0.8980,0.1118
0.9004,0.1133,0.8312,0.2831,-0.2200,0.0876
0.0991,0.8524,0.8375,-0.2102,0.9265,0.4419
-0.6521,-0.7473,-0.7298,0.0113,-0.9570,0.3835
0.6190,-0.3105,0.8802,0.1640,0.7577,0.1090
0.6895,0.8108,-0.0802,0.0927,0.5972,0.1437
0.1982,-0.9689,0.1870,-0.1326,0.6147,0.2463
-0.3695,0.7858,0.1557,-0.6320,0.5759,0.5365
-0.1596,0.3581,0.8372,-0.9992,0.9535,0.8498
-0.2468,0.9476,0.2094,0.6577,0.1494,0.0496
0.1737,0.5000,0.7166,0.5102,0.3961,0.0958
0.7290,-0.3546,0.3416,-0.0983,-0.2358,0.0699
-0.3652,0.2438,-0.1395,0.9476,0.3556,0.0430
-0.6029,-0.1466,-0.3133,0.5953,0.7600,0.2127
-0.4596,-0.4953,0.7098,0.0554,0.6043,0.3261
0.1450,0.4663,0.0380,0.5418,0.1377,0.0330
-0.8636,-0.2442,-0.8407,0.9656,-0.6368,0.1682
0.6237,0.7499,0.3768,0.1390,-0.6781,0.0463
-0.5499,0.1850,-0.3755,0.8326,0.8193,0.1407
-0.4858,-0.7782,-0.6141,-0.0008,0.4572,0.4887
0.7033,-0.1683,0.2334,-0.5327,-0.7961,0.0941
0.0317,-0.0457,-0.6947,0.2436,0.0880,0.1309
0.5031,-0.5559,0.0387,0.5706,-0.9553,0.0575
-0.3513,0.7458,0.6894,0.0769,0.7332,0.3084
0.2205,0.5992,-0.9309,0.5405,0.4635,0.0901
-0.4806,-0.4859,0.2646,-0.3094,0.5932,0.6241
0.9809,-0.3995,-0.7140,0.8026,0.0831,0.0408
0.9495,0.2732,0.9878,0.0921,0.0529,0.1220
-0.9476,-0.6792,0.4913,-0.9392,-0.2669,0.9015
0.7247,0.3854,0.3819,-0.6227,-0.1162,0.2923
-0.5922,-0.5045,-0.4757,0.5003,-0.0860,0.2289
-0.8861,0.0170,-0.5761,0.5972,-0.4053,0.1817
0.6877,-0.2380,0.4997,0.0223,0.0819,0.0957
0.9189,0.6079,-0.9354,0.4188,-0.0700,0.0571
-0.1428,-0.7820,0.2676,0.6059,0.3936,0.1013
0.5324,-0.3151,0.6917,-0.1425,0.6480,0.1896
-0.8432,-0.9633,-0.8666,-0.0828,-0.7733,0.5548
-0.9444,0.5097,-0.2103,0.4939,-0.0952,0.1387
-0.0520,0.6063,-0.1952,0.8094,-0.9259,0.0177
-0.7487,0.2370,-0.9793,0.0773,-0.9940,0.1966
-0.6247,0.2450,0.8116,0.9799,0.4222,0.0451
0.4636,0.8186,-0.1983,-0.5003,-0.6531,0.1567
-0.7064,-0.4714,0.6382,-0.3788,0.9648,0.7585
-0.4667,0.0673,-0.3711,0.8215,-0.2669,0.0804
0.8778,-0.9381,0.4338,0.7820,-0.9454,0.0195
0.0441,-0.3480,0.7190,0.1170,0.3805,0.1678
-0.4198,-0.9813,0.1535,-0.3771,0.0345,0.5454
0.8328,-0.1471,-0.5052,-0.2574,0.8637,0.2704
-0.3712,-0.6505,0.2142,-0.1728,0.6327,0.4883
-0.6297,0.4038,-0.5193,0.1484,-0.3020,0.1714
0.0380,-0.6506,0.1414,0.9935,0.6337,0.0729
0.1887,0.9520,0.8031,0.1912,-0.9351,0.0640
0.9507,-0.6640,0.9456,0.5349,0.6485,0.0298
0.2652,0.3375,-0.0462,-0.9737,-0.2940,0.4137
-0.0627,-0.0852,-0.7247,-0.9782,0.5166,0.7502
-0.3601,0.9688,-0.5595,-0.3226,0.0478,0.1941
-0.7504,-0.3750,0.0090,0.3477,0.5403,0.3102
-0.7393,-0.9542,0.0382,0.6200,-0.9748,0.2378
-0.1015,0.8296,0.2887,-0.9895,-0.0311,0.6075
0.7186,0.6608,0.2983,0.3474,0.1570,0.0899
0.3435,-0.2951,0.7117,-0.6099,0.4946,0.4180
-0.4208,0.5476,-0.1445,0.6154,-0.2929,0.0689
-0.3827,0.4665,0.4889,-0.5572,-0.5718,0.4225
-0.6021,-0.7150,-0.2458,-0.9467,-0.7782,0.7099
-0.8389,-0.5366,-0.5847,0.8347,0.4226,0.2624
0.1078,-0.3910,0.6697,-0.1294,0.8469,0.3057
-0.7476,0.9521,-0.6803,-0.5948,-0.1376,0.4047
-0.1916,-0.7065,0.4586,-0.6225,0.2878,0.6037
0.2019,0.4979,0.2764,0.1943,-0.4090,0.0832
0.4632,0.8906,-0.1489,0.5644,-0.8877,0.0219
-0.2098,-0.3998,-0.8398,0.8093,-0.2597,0.1390
0.0614,-0.0118,-0.7357,-0.5871,-0.8476,0.1657
-0.2859,-0.7839,0.5751,-0.7868,0.9714,0.8383
-0.6457,0.1448,-0.9103,0.5742,-0.6208,0.1730
-0.7001,0.1022,-0.5668,0.5184,0.4458,0.2293
-0.6469,0.7239,-0.9604,0.7205,0.1178,0.1725
0.4339,0.9747,-0.4438,-0.9924,0.8678,0.4090
0.7158,0.4577,0.0334,0.4139,0.5611,0.1091
0.5012,0.2264,-0.1963,0.3946,-0.9938,0.0315
0.5498,0.7928,-0.5214,-0.7585,-0.5594,0.1785
0.0863,-0.4266,-0.7233,-0.4197,0.2277,0.4115
-0.3517,-0.0853,-0.1118,0.6563,-0.1473,0.0893
-0.5570,-0.0655,-0.3705,0.2537,0.7547,0.2980
-0.1046,0.5689,-0.0861,0.3125,-0.7363,0.0683
0.2110,0.5335,0.0094,-0.0039,0.6858,0.1916
-0.8644,0.1465,0.8855,0.0357,-0.6111,0.3472
0.4015,0.0805,0.8977,0.2487,0.6760,0.1444
-0.9841,0.9787,-0.8446,-0.3557,0.8923,0.4954
-0.4885,0.6054,-0.0443,-0.7313,0.8557,0.7082
0.7919,-0.0169,0.7134,-0.1628,0.3669,0.1911
-0.6209,0.9300,-0.4116,-0.7931,-0.7114,0.3805
-0.9718,0.4319,0.1290,0.5892,0.0142,0.1359
0.5557,-0.1870,0.2955,-0.6404,-0.3564,0.1996
-0.6548,-0.1827,-0.5172,-0.1862,0.9504,0.6260
0.7150,0.2392,-0.4959,0.5857,-0.1341,0.0591
-0.2850,-0.3394,0.3947,-0.4627,0.6166,0.6368
-0.0242,0.7107,0.7768,-0.6312,0.1707,0.5246
0.7964,-0.1078,0.8437,-0.4420,0.2177,0.2204
-0.9725,-0.1666,0.8770,-0.3139,0.5595,0.7465
-0.6505,-0.3161,-0.7108,0.4335,0.3986,0.3074
0.3847,-0.5454,-0.1507,-0.2562,-0.2894,0.1592
-0.8847,0.2633,0.4146,0.2272,0.2966,0.2562
0.0284,0.7507,-0.6321,-0.0743,-0.1421,0.1000
-0.0054,-0.6770,-0.3151,-0.4762,0.6891,0.5823
0.2140,-0.7091,0.0192,-0.4061,0.7193,0.4641
0.3432,0.2669,-0.7505,-0.0588,0.9731,0.1799
-0.6966,0.2783,0.1313,-0.0627,-0.1439,0.2982
0.1985,0.6999,0.5022,0.1587,0.8494,0.2123
0.2473,-0.9040,-0.4308,-0.8779,0.4070,0.6292
0.3369,-0.2428,-0.6236,0.4940,-0.3192,0.0713
0.0513,-0.9430,0.2885,-0.2987,-0.5416,0.2808
-0.1322,-0.2351,-0.0604,0.9590,-0.2712,0.0451
0.7783,-0.2901,-0.5090,0.8220,-0.9129,0.0401
0.9015,0.1128,-0.2473,0.9901,-0.8833,0.0075
0.1424,-0.6391,0.2619,0.9618,0.7498,0.0471
-0.0963,0.4169,0.5549,-0.0103,0.0571,0.1907
-0.7156,0.4538,-0.0460,-0.1022,0.7720,0.4308
0.0552,-0.1818,-0.4622,-0.8560,-0.1637,0.4666
-0.7812,0.3461,-0.0001,0.5542,-0.7128,0.1051
-0.8336,-0.2016,0.5939,-0.6166,0.5356,0.8498
-0.9666,-0.2027,-0.2378,0.3187,-0.8582,0.2150
-0.6948,-0.9668,-0.7724,0.3036,-0.1947,0.4500
0.9869,0.6690,0.3992,0.8365,-0.9205,0.0107
-0.0520,-0.3017,0.8745,-0.0209,0.0793,0.1857
0.7541,-0.4928,-0.4524,-0.3433,0.0951,0.1853
-0.5597,0.3429,-0.7144,-0.8118,0.7404,0.7314
0.1431,0.0516,-0.8480,0.7483,0.9023,0.0800
0.6250,-0.4324,0.0557,-0.3212,0.1093,0.1558
0.5490,-0.3484,0.7797,0.5034,0.5253,0.0667
-0.0610,-0.5785,-0.9170,-0.3563,-0.9258,0.2819
-0.1391,0.5356,0.0720,-0.9203,-0.7304,0.4478
-0.6132,-0.3287,-0.8954,0.2102,0.0241,0.3704
0.6954,-0.0919,-0.9692,0.7461,0.3124,0.0816
0.6460,0.9036,-0.8982,-0.5299,-0.8733,0.1188
-0.8368,-0.0538,-0.7489,0.5458,0.6828,0.3506
-0.9134,-0.0271,-0.5212,0.9049,0.8878,0.2666
-0.3103,0.7957,-0.1308,-0.5284,0.8817,0.4597
0.3684,-0.8702,0.7408,0.4028,0.2099,0.0529
0.2010,0.6292,-0.8918,-0.7390,0.6849,0.3385
0.2367,0.0626,-0.5034,-0.4098,0.7454,0.3716
0.7940,-0.5932,0.6525,0.7635,-0.0265,0.0281
0.1969,0.0545,0.2496,0.7101,-0.4357,0.0339
-0.7698,-0.5460,0.1920,-0.5211,-0.7372,0.6477
-0.6763,0.6897,0.2044,0.9271,-0.3086,0.0684
0.2314,-0.8816,0.5006,0.8964,0.0694,0.0619
-0.6149,0.5059,-0.9854,-0.3435,0.8352,0.4869
0.2093,0.6452,0.7590,-0.3580,-0.7541,0.2543
0.4426,-0.1193,-0.7465,0.1796,-0.9279,0.0717
-0.9758,-0.3933,-0.9572,0.9950,0.1641,0.2834
-0.4132,0.8579,0.0142,-0.0906,0.1757,0.2005
-0.2567,-0.5111,0.1691,0.3917,-0.8561,0.1319
0.9422,0.5061,0.6123,0.5033,-0.8399,0.0167
-0.1025,0.4086,0.3633,0.3943,0.2372,0.0794
-0.6980,0.5216,0.5621,0.8082,-0.5325,0.0590
-0.3589,0.6310,0.2271,0.5200,-0.1447,0.0576
-0.8011,-0.7699,-0.2532,-0.6123,0.6415,0.8838
-0.0178,-0.8237,-0.5298,-0.0768,-0.6028,0.2720
-0.9490,0.4588,0.4498,-0.3392,0.6870,0.6993
0.3141,0.1621,-0.5985,0.0591,0.7889,0.1626
-0.3900,0.7419,0.8175,-0.3403,0.3661,0.4925
0.6995,0.3342,-0.3113,-0.6972,0.2707,0.3227
0.6956,0.6437,0.2565,0.9126,0.1798,0.0235
-0.3265,0.9839,-0.2395,0.9854,0.0376,0.0340
-0.6554,-0.8509,-0.2594,-0.7532,0.2690,0.8467
0.8599,-0.7015,-0.2102,-0.0768,0.1219,0.1032
0.5607,-0.0256,-0.1600,-0.4760,0.8216,0.4237
-0.2896,0.9484,-0.7545,-0.6249,0.7789,0.3187
0.1668,-0.3815,-0.9985,-0.5448,-0.7092,0.2443
0.7462,0.4006,-0.0590,0.6543,-0.0083,0.0368

Test data.

# synthetic_test.txt
#
-0.2730,-0.4488,0.8495,-0.2260,-0.0142,0.3416
-0.2335,-0.4049,0.4352,-0.6183,-0.7636,0.4298
0.6740,0.4883,0.1810,-0.5142,0.2465,0.3049
-0.8650,0.7611,-0.0801,0.5277,-0.4922,0.1211
0.1828,-0.1424,-0.2358,-0.7466,-0.5115,0.2580
0.4834,0.2300,0.3448,-0.9832,0.3568,0.5880
0.0064,-0.5382,-0.6502,-0.6300,0.6885,0.6752
0.0929,0.6329,-0.0325,0.1799,0.5745,0.1265
-0.7995,0.0740,-0.2680,0.2086,0.9176,0.3982
0.5813,0.2902,-0.2122,0.3779,-0.1920,0.0372
-0.7278,-0.0987,-0.3312,-0.5641,0.8515,0.8319
0.4933,0.0839,0.4011,0.8611,0.7252,0.0609
-0.6651,-0.4737,-0.8568,0.9573,-0.5272,0.1747
-0.5785,0.0056,-0.7901,-0.2223,0.0760,0.4402
-0.3216,0.1118,0.0735,-0.2188,0.3925,0.3857
0.1230,-0.2838,0.2262,0.8715,0.1938,0.0483
0.9592,-0.1180,0.4792,-0.9248,0.5295,0.4681
-0.4456,0.0697,0.5359,-0.8938,0.0981,0.7935
0.6020,0.2992,0.8629,-0.8505,-0.4464,0.3471
0.1995,0.6659,0.7921,0.9454,0.9970,0.0833
-0.7207,-0.8589,-0.8531,-0.9705,0.9436,0.9408
-0.3066,-0.2927,-0.4923,0.8220,0.4513,0.1368
-0.9481,-0.0770,-0.4374,-0.9421,0.7694,0.9307
0.5931,-0.3507,-0.3842,0.8562,0.9538,0.0633
0.0471,0.9039,0.7760,0.0361,-0.2545,0.1801
0.2104,0.9808,0.5478,-0.3314,-0.8220,0.2518
-0.6302,0.0537,-0.1658,0.6013,0.8664,0.1972
0.9148,0.9189,-0.9243,-0.8848,-0.9884,0.1336
-0.3746,-0.8886,-0.4123,-0.2880,0.9074,0.6426
0.0060,0.2867,-0.7775,0.5161,0.7039,0.1107
0.6885,0.7810,-0.2363,-0.1124,-0.7968,0.0847
0.7811,0.8450,-0.6877,0.7594,0.2640,0.0410
-0.5787,-0.3098,-0.6802,-0.1113,-0.8325,0.3033
0.3821,0.1476,0.7466,-0.5107,0.2592,0.3741
-0.9311,0.0324,0.7265,0.9683,-0.9803,0.0842
-0.9049,-0.9797,-0.0196,-0.9090,-0.4433,0.8601
0.2799,-0.4106,-0.4607,0.1811,-0.2389,0.1149
0.2664,-0.2932,-0.4259,-0.7336,0.8742,0.6800
0.6097,0.8761,-0.6292,0.8663,0.8715,0.0530
0.1029,-0.6294,-0.1158,-0.6294,0.8948,0.6741