`leaves` is a library implementing prediction code for GBRT (Gradient Boosting Regression Trees) models in pure Go. The goal of the project is to make it possible to use models from popular GBRT frameworks in Go programs without C API bindings.
**NOTE**: Before the 1.0.0 release, the API is subject to change.
- General features:
  - support parallel predictions for batches
  - support sigmoid, softmax transformation functions
  - support getting leaf indices of decision trees
- Support LightGBM ([repo](https://github.com/microsoft/LightGBM)) models:
  - read models from `text` format and from `JSON` format
  - support `gbdt`, `rf` (random forest) and `dart` models
  - support multiclass predictions
  - additional optimizations for categorical features (for example, *one hot* decision rule)
  - additional optimizations exploiting prediction-only usage
- Support XGBoost ([repo](https://github.com/dmlc/xgboost)) models:
  - read models from binary format
  - support `gbtree`, `gblinear`, `dart` models
  - support multiclass predictions
  - support missing values (`nan`)
- Support scikit-learn ([repo](https://github.com/scikit-learn/scikit-learn)) tree models (experimental support):
  - read models from pickle format (protocol `0`)
  - support `sklearn.ensemble.GradientBoostingClassifier`
In order to start, `go get` this repository:

```
go get github.com/dmitryikh/leaves
```
Minimal example:

```go
package main

import (
	"fmt"

	"github.com/dmitryikh/leaves"
)

func main() {
	// 1. Read model
	useTransformation := true
	model, err := leaves.LGEnsembleFromFile("lightgbm_model.txt", useTransformation)
	if err != nil {
		panic(err)
	}

	// 2. Do predictions!
	fvals := []float64{1.0, 2.0, 3.0}
	p := model.PredictSingle(fvals, 0)
	fmt.Printf("Prediction for %v: %f\n", fvals, p)
}
```
In order to use an XGBoost model, just change `leaves.LGEnsembleFromFile` to `leaves.XGEnsembleFromFile`.
Documentation is hosted on [godoc](https://godoc.org/github.com/dmitryikh/leaves). The documentation contains complex usage examples and a full API reference. Some additional information about usage examples can be found in `leaves_test.go`.
Most `leaves` features are tested to be compatible with old and coming versions of GBRT libraries. In `compatibility.md` one can find a detailed report about `leaves` correctness against different versions of external GBRT libraries.

Some additional information on new features and backward compatibility can be found in `NOTES.md`.
Below are comparisons of prediction speed on batches (~1000 objects in 1 API call). Hardware: MacBook Pro (15-inch, 2017), 2.9 GHz Intel Core i7, 16 GB 2133 MHz LPDDR3. C API implementations were called from Python bindings, but the large batch size should make the overhead of the Python bindings negligible. `leaves` benchmarks were run by means of the Go test framework: `go test -bench`. See `benchmark` for more details on measurements. See `testdata/README.md` for data preparation pipelines.
Single thread:
| Test Case | Features | Trees | Batch size | C API | leaves |
|---|---|---|---|---|---|
| LightGBM MS LTR | 137 | 500 | 1000 | 49ms | 51ms |
| LightGBM Higgs | 28 | 500 | 1000 | 50ms | 50ms |
| LightGBM KDD Cup 99* | 41 | 1200 | 1000 | 70ms | 85ms |
| XGBoost Higgs | 28 | 500 | 1000 | 44ms | 50ms |
4 threads:
| Test Case | Features | Trees | Batch size | C API | leaves |
|---|---|---|---|---|---|
| LightGBM MS LTR | 137 | 500 | 1000 | 14ms | 14ms |
| LightGBM Higgs | 28 | 500 | 1000 | 14ms | 14ms |
| LightGBM KDD Cup 99* | 41 | 1200 | 1000 | 19ms | 24ms |
| XGBoost Higgs | 28 | 500 | 1000 | ? | 14ms |
(?) - currently I'm unable to utilize multithreading for XGBoost predictions by means of Python bindings

(*) - the KDD Cup 99 problem involves continuous and categorical features simultaneously
- LightGBM models:
  - limited support of transformation functions (only sigmoid and softmax are supported)
- XGBoost models:
  - limited support of transformation functions (only sigmoid and softmax are supported)
  - there could be slight divergence between C API predictions and `leaves` because of floating point conversions and comparison tolerances
- scikit-learn tree models:
  - no support for transformation functions; output scores are *raw scores* (as from `GradientBoostingClassifier.decision_function`)
  - only pickle protocol `0` is supported
  - there could be slight divergence between sklearn predictions and `leaves` because of floating point conversions and comparison tolerances
In case you are interested in the project or if you have questions, please contact me by email: khdmitryi `at` gmail.com