Friday, March 14, 2025

How to Build TensorFlow Lite C API from Source Inside WSL

TensorFlow Lite is a lightweight, efficient runtime for deploying machine learning models on edge devices. It's ideal for low-power, performance-critical environments such as embedded systems, mobile devices, and microcontrollers.

Building the TensorFlow Lite C API from source inside Windows Subsystem for Linux (WSL) allows you to integrate AI inference into native C applications. This is useful when working on constrained devices, building low-level systems, or working with existing C/C++ codebases.

Step 1: Set Up WSL and Create a .wslconfig File (Optional)

To prevent Bazel crashes from memory exhaustion, increase the memory limit for WSL:

# On Windows (not WSL):
Create C:\Users\<yourname>\.wslconfig with the following content:

[wsl2]
memory=6GB
processors=4

To do this with Notepad:

  • Open the Start menu, type Notepad, and press Enter
  • Paste the above configuration text into the new file
  • Click File > Save As...
  • Set File name: .wslconfig
  • Set Save as type: to All Files
  • Save it to C:\Users\<yourname>\

Then from PowerShell:

wsl --shutdown
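
Once WSL restarts, you can confirm the new limits from inside your distribution. This is a quick sanity check using standard Linux tools; the exact figures will reflect whatever you set in .wslconfig:

```shell
# Total memory visible to WSL (should be close to the 6GB limit)
free -h

# Number of processors available (should match processors=4)
nproc
```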

Step 2: Install Prerequisites

sudo apt update
sudo apt install -y build-essential clang git wget python3-pip

Step 3: Install Numpy

pip install numpy
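
TensorFlow's build scripts import numpy during configuration, so it's worth confirming the module is actually importable before starting a long build:

```python
# Confirm numpy is installed and print its version
import numpy
print(numpy.__version__)
```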

Step 4: Install Bazelisk

wget https://github.com/bazelbuild/bazelisk/releases/download/v1.17.0/bazelisk-linux-amd64 -O bazelisk
chmod +x bazelisk
sudo mv bazelisk /usr/local/bin/bazelisk

Step 5: Set the Required Bazel Version

export USE_BAZEL_VERSION=5.3.0
echo 'export USE_BAZEL_VERSION=5.3.0' >> ~/.bashrc
source ~/.bashrc

Step 6: Clone TensorFlow and Check Out the Version

git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow
git checkout v2.12.0

Step 7: Build TensorFlow Lite C API

Optional but recommended: limit RAM usage to avoid crashes.

export BAZEL_BUILD_OPTS="--local_ram_resources=2048"
cd ~/tensorflow
bazelisk build -c opt $BAZEL_BUILD_OPTS --define=flatbuffer_op_resolver=false //tensorflow/lite/c:libtensorflowlite_c.so

Step 8: Install the Library and Headers

cd ~/tensorflow
sudo cp bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so /usr/local/lib/
sudo ldconfig

# Copy required top-level headers
sudo mkdir -p /usr/local/include/tflite
sudo cp tensorflow/lite/c/c_api.h /usr/local/include/tflite/

# Copy all internal TensorFlow Lite C API dependencies
sudo mkdir -p /usr/local/include/tensorflow/lite/core/c
sudo cp tensorflow/lite/core/c/c_api.h /usr/local/include/tensorflow/lite/core/c/
sudo cp tensorflow/lite/core/c/c_api_types.h /usr/local/include/tensorflow/lite/core/c/

# Copy additional headers required by the C API
sudo mkdir -p /usr/local/include/tensorflow/lite
sudo cp tensorflow/lite/builtin_ops.h /usr/local/include/tensorflow/lite/

Step 9: Verify With a Simple C Program

Save the following as tflite_test.c:

#include "tflite/c_api.h"
#include <stdio.h>

int main() {
    TfLiteModel* model = TfLiteModelCreateFromFile("model.tflite");
    if (!model) {
        printf("Failed to load TensorFlow Lite model\n");
        return 1;
    }
    printf("TensorFlow Lite model loaded successfully!\n");
    TfLiteModelDelete(model);
    return 0;
}

Compile it with:

gcc -o tflite_test tflite_test.c -I/usr/local/include -L/usr/local/lib -ltensorflowlite_c
./tflite_test
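
Once the smoke test passes, the same C API can run a full inference pass. The sketch below extends the program above with interpreter creation, tensor I/O, and cleanup. Note that model.tflite, the one-float input buffer, and the thread count are placeholders, so adapt the buffer sizes and types to your own model:

```c
#include "tflite/c_api.h"
#include <stdio.h>

int main(void) {
    TfLiteModel* model = TfLiteModelCreateFromFile("model.tflite");
    if (!model) {
        printf("Failed to load TensorFlow Lite model\n");
        return 1;
    }

    // Configure the interpreter (e.g., number of CPU threads)
    TfLiteInterpreterOptions* options = TfLiteInterpreterOptionsCreate();
    TfLiteInterpreterOptionsSetNumThreads(options, 2);

    TfLiteInterpreter* interpreter = TfLiteInterpreterCreate(model, options);
    TfLiteInterpreterAllocateTensors(interpreter);

    // Fill the first input tensor; shape and dtype depend on your model
    TfLiteTensor* input = TfLiteInterpreterGetInputTensor(interpreter, 0);
    float dummy_input[1] = {0.0f};  /* placeholder input */
    TfLiteTensorCopyFromBuffer(input, dummy_input, sizeof(dummy_input));

    TfLiteInterpreterInvoke(interpreter);

    // Read back the first output tensor
    const TfLiteTensor* output = TfLiteInterpreterGetOutputTensor(interpreter, 0);
    float result[1];
    TfLiteTensorCopyToBuffer(output, result, sizeof(result));
    printf("First output value: %f\n", result[0]);

    // Release everything in reverse order of creation
    TfLiteInterpreterDelete(interpreter);
    TfLiteInterpreterOptionsDelete(options);
    TfLiteModelDelete(model);
    return 0;
}
```

Compile it the same way as the smoke test, with -ltensorflowlite_c.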

Conclusion

Now that you’ve built the TensorFlow Lite C API from source inside WSL, you're ready to run AI inference directly in your C applications. This setup is well suited to embedded AI workloads such as digital signal processing. By building from source, you gain full control over the runtime and can integrate it into systems where Python or heavy dependencies are impractical.