Training the Facemark object sometimes led to segfaults, and I noticed that this was often accompanied by -nan as the logged objective value. Searching through the code, I found the function supportVectorRegression(…), which has some very mysterious variable declarations.
Mainly, there are the single-element arrays lambda and upper_bound. These variables are never passed to any function as arrays; instead, the value at index GETI(i) is used in some calculations. It should be obvious that whenever GETI(i) is not 0, this reads out of bounds and has bad consequences. My question is: does anyone know what the intention was? I figure it has something to do with the support vector regression algorithm itself. I haven't checked the literature yet (sklearn probably has something similar).
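To illustrate the indexing problem in isolation, here is a minimal, self-contained sketch that only mirrors the pattern from the snippet below (the values of y and C are made up):

#include <cstdio>
#include <cmath>

// Same macro as in supportVectorRegression(...)
#define GETI(i) ((int) y[i])

int main() {
    // In regression, y holds real-valued targets, not class indices (made-up values).
    double y[] = { -3.7, 0.0, 12.5 };
    double C = 10.0;
    double lambda[1], upper_bound[1];
    lambda[0] = 0.5 / C;
    upper_bound[0] = HUGE_VAL;
    (void) upper_bound; // only declared here to mirror the original snippet

    for (int i = 0; i < 3; ++i) {
        // GETI(i) evaluates to -3, 0 and 12 here; only 0 is a valid index into the
        // single-element arrays, anything else is an out-of-bounds read
        // (undefined behaviour: garbage values, -nan, or a segfault).
        printf("i=%d  GETI(i)=%d  valid index? %s\n", i, GETI(i), GETI(i) == 0 ? "yes" : "no");
    }
    return 0;
}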
Code snippet:
#define GETI(i) ((int) y[i])
…
double lambda[1], upper_bound[1];
lambda[0] = 0.5 / C;
upper_bound[0] = HUGE_VAL;
…
G = -y[i] + lambda[GETI(i)] * beta[i];   // <-- reads past lambda[0] whenever (int) y[i] != 0
H = QD[i] + lambda[GETI(i)];             // <-- same here
After changing the define to always return 0, the segfaults went away.
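For reference, this is the local change I tested (just redefining the macro so every access stays at index 0; whether that matches the original intent is exactly what I am asking):

// Changed in supportVectorRegression(...): always index the single-element
// arrays at 0 instead of with the cast label value, so lambda[GETI(i)] and
// upper_bound[GETI(i)] always read lambda[0] / upper_bound[0].
#define GETI(i) (0)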