Does svm->train() have a size limit? (segmentation fault)

Does svm->train() have a size limit?
cv::Mat data(134181242, 2, CV_32F, dataVec.data());     // wraps the external float buffer, no copy
cv::Mat label(134181242, 1, CV_32SC1, labelVec.data()); // wraps the external int buffer, no copy
svm->train(data, cv::ml::ROW_SAMPLE, label);
fails with a segmentation fault.
However, if I split the data and label vectors into smaller pieces, everything works fine.
The kernel is LINEAR and the type is C_SVC. data is a vector<float> and label is a vector<int>. Naturally, data.size() == label.size() * 2, since there are two features per sample.

Any chance you're running on 32 bits? (4 GB limit)

Ubuntu 64-bit, 32 GB RAM; the SVM process climbs to 10 GB+ of memory usage.
