Examples

  • Example implementation of loading an image, passing it through a model, and receiving the resulting image

  • The constants at the top should be modified by the user:

    • IMG_FILE : Path to the input image
    • ENGINE_FILE : Path to the ONNX file
    • OUTPUT_PATH : Path to the output image
    • GPU_ID : GPU ID to be used
#include "dllengine.h"

#include <opencv2/opencv.hpp>
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>

#include <iostream>
#include <string>

using namespace cv;

const std::string IMG_FILE = "path/to/image.png";
const std::string ENGINE_FILE = "path/to/onnxfile.onnx";
const std::string OUTPUT_PATH = "example.png";
const int GPU_ID = 0; 

int main()
{
    // Create the model; returns nullptr on failure.
    Inferencer* inferencer = get_inferencer(ENGINE_FILE, GPU_ID);

    // Load the input image as an 8-bit grayscale image.
    Mat img_input = imread(IMG_FILE, IMREAD_GRAYSCALE);
    if (img_input.empty())
    {
        std::cout << "Failed to read the input image." << std::endl;
        return 1;
    }
    int input_height = img_input.rows;
    int input_width = img_input.cols;

    // Output buffer for the result: one byte per pixel for a grayscale model.
    auto* output_buffer = new unsigned char[input_height * input_width];

    if (inferencer != nullptr)
    {
        if (do_inference(inferencer, img_input.data, output_buffer, img_input.cols, img_input.rows) != 0)
        {
            std::cout << "An error occurred during inference." << std::endl;
        }
        else
        {
            // Wrap the output buffer in a Mat and write it to disk.
            Mat img_pp = Mat(input_height, input_width, CV_8U, output_buffer);
            imwrite(OUTPUT_PATH, img_pp);
            std::cout << "Inference is done." << std::endl;
        }
        remove_inferencer(inferencer);
    }

    delete[] output_buffer;
    return 0;
}
  • Explanation:

    Inferencer* inferencer = get_inferencer(ENGINE_FILE, GPU_ID)
    
    The model is created through the get_inferencer function. The returned class pointer must be passed to all later calls that run the model.
    If an error occurs while the model is being created, the function returns a null pointer.
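
    Because a null pointer signals a creation failure, the result is typically checked before any inference call. A minimal sketch, using the example's constants; the error message and early return are illustrative, not part of the library:

    Inferencer* inferencer = get_inferencer(ENGINE_FILE, GPU_ID);
    if (inferencer == nullptr)
    {
        std::cout << "Failed to create the inferencer from " << ENGINE_FILE << std::endl; // illustrative handling
        return 1;
    }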


    do_inference(inferencer, img_input.data, output_buffer, img_input.cols, img_input.rows)
    
    The do_inference function performs the image restoration. On successful execution, the resulting grayscale image is written into the output buffer.
    The function returns 0 when it executes successfully.
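
    The output buffer must hold one byte per pixel for a grayscale result (width * height). As a sketch of the same calling pattern, a std::vector (requires #include <vector>) can replace the raw new[] allocation so the buffer is freed automatically:

    std::vector<unsigned char> output(static_cast<size_t>(img_input.cols) * img_input.rows);
    if (do_inference(inferencer, img_input.data, output.data(), img_input.cols, img_input.rows) != 0)
    {
        std::cout << "An error occurred during inference." << std::endl;
    }
    else
    {
        Mat img_pp = Mat(img_input.rows, img_input.cols, CV_8U, output.data());
        imwrite(OUTPUT_PATH, img_pp);
    }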


    remove_inferencer(inferencer)
    
    The remove_inferencer function releases the model, after which the pointer is no longer valid.
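
    It should be called once, after all inference calls for that model have completed. For example:

    remove_inferencer(inferencer);
    inferencer = nullptr; // illustrative: prevents accidental reuse of the released handle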