Tensorflow
The Tensorflow filter plugin allows running machine learning inference tasks on the records of data coming from input plugins or stream processors. This filter uses Tensorflow Lite as the inference engine, and requires the Tensorflow Lite shared library to be present at build time and at runtime.
Tensorflow Lite is a lightweight, open source deep learning framework used for mobile and IoT applications. Tensorflow Lite handles only inference, not training. It loads pre-trained models (.tflite files) that have been converted into the Tensorflow Lite format (FlatBuffer). The Tensorflow documentation describes how to convert models into this format.
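The conversion step can be sketched with the `tf.lite.TFLiteConverter` API. The tiny Keras model below is an illustrative stand-in for your own network; note that it has a single input, matching the plugin's current limitation:

```python
import tensorflow as tf

# Build a trivial single-input Keras model as a placeholder for a real
# pre-trained network, then convert it to the Tensorflow Lite format.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(2, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # bytes of the FlatBuffer

# Write the .tflite file that the filter's model_file parameter points to.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```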
The Tensorflow plugin for Fluent Bit has the following limitations:
Currently supports single-input models
Uses Tensorflow 2.3 header files
The plugin supports the following configuration parameters:
input_field
The name of the field in the record to apply inference on. Default: none.
model_file
Path to the model file (.tflite) to be loaded by Tensorflow Lite. Default: none.
include_input_fields
Include all input fields in the filter's output. Default: True.
normalization_value
Divide input values by normalization_value (for example, 255 to map 8-bit pixel values into the range [0, 1]). Default: none.
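Putting the parameters above together, a filter entry in a Fluent Bit classic-mode configuration file might look like the following (the field name, tag match, and model path are illustrative):

```
[FILTER]
    Name                 tensorflow
    Match                *
    input_field          image
    model_file           /path/to/model.tflite
    include_input_fields false
    normalization_value  255
```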
To create a Tensorflow Lite shared library:
Clone the Tensorflow repository.
Install the Bazel package manager.
Run the following command to create the shared library:
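Assuming a Tensorflow 2.x checkout, the C shared library target is typically built with Bazel like this (run from the root of the Tensorflow source tree):

```
bazel build -c opt //tensorflow/lite/c:tensorflowlite_c
```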
The script creates the shared library bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so.
Copy the library to a location, such as /usr/lib, where Fluent Bit can load it.
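For example, assuming the default Bazel output path from the build step above:

```
cp bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so /usr/lib/
```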
If the Tensorflow plugin initializes correctly, it reports successful creation of the interpreter and prints a summary of the model's input and output types and dimensions at startup.
The Tensorflow filter plugin is disabled by default, so you must build Fluent Bit with the Tensorflow plugin enabled. Compiling it also requires access to the Tensorflow Lite header files, so you must pass the path of the Tensorflow source code on your machine to the build script.
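A typical CMake invocation might look like the following; the `FLB_FILTER_TENSORFLOW` and `Tensorflow_DIR` option names follow Fluent Bit's CMake build options, and the source path is illustrative:

```
cd fluent-bit/build
cmake -DFLB_FILTER_TENSORFLOW=On -DTensorflow_DIR=$HOME/tensorflow ..
make
```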