Tensor library for machine learning
Note that this project is under active development.
Some of the development is currently happening in the llama.cpp and whisper.cpp repos
- Low-level cross-platform implementation
- Integer quantization support
- Broad hardware support
- Automatic differentiation
- ADAM and L-BFGS optimizers
- No third-party dependencies
- Zero memory allocations during runtime
git clone https://github.com/ggerganov/ggml
cd ggml
# install python dependencies in a virtual environment
python3.10 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
# build the examples
mkdir build && cd build
cmake ..
cmake --build . --config Release -j 8
# run the GPT-2 small 117M model
../examples/gpt-2/download-ggml-model.sh 117M
./bin/gpt-2-backend -m models/gpt-2-117M/ggml-model.bin -p "This is an example"
For more information, check out the corresponding programs in the examples folder.
# fix the path to point to your CUDA compiler
cmake -DGGML_CUDA=ON -DCMAKE_CUDA_COMPILER=/usr/local/cuda-12.1/bin/nvcc ..
cmake -DCMAKE_C_COMPILER="$(hipconfig -l)/clang" -DCMAKE_CXX_COMPILER="$(hipconfig -l)/clang++" -DGGML_HIPBLAS=ON
# linux
source /opt/intel/oneapi/setvars.sh
cmake -G "Ninja" -DCMAKE_C_COMPILER=icx -DCMAKE_CXX_COMPILER=icpx -DGGML_SYCL=ON ..
# windows
"C:\Program Files (x86)\Intel\oneAPI\setvars.bat"
cmake -G "Ninja" -DCMAKE_C_COMPILER=cl -DCMAKE_CXX_COMPILER=icx -DGGML_SYCL=ON ..
Download and unzip the NDK from this download page. Set the NDK_ROOT_PATH environment variable or provide the absolute path to CMAKE_ANDROID_NDK in the command below.
cmake.. \
-DCMAKE_SYSTEM_NAME=Android \
-DCMAKE_SYSTEM_VERSION=33 \
-DCMAKE_ANDROID_ARCH_ABI=arm64-v8a \
    -DCMAKE_ANDROID_NDK=$NDK_ROOT_PATH \
-DCMAKE_ANDROID_STL_TYPE=c++_shared
# create directories
adb shell 'mkdir /data/local/tmp/bin'
adb shell 'mkdir /data/local/tmp/models'
# push the compiled binaries to the folder
adb push bin/* /data/local/tmp/bin/
# push the ggml library
adb push src/libggml.so /data/local/tmp/
# push model files
adb push models/gpt-2-117M/ggml-model.bin /data/local/tmp/models/
adb shell
cd /data/local/tmp
export LD_LIBRARY_PATH=/data/local/tmp
./bin/gpt-2-backend -m models/ggml-model.bin -p "this is an example"