Use TensorFlow Serving in Symfony 7

Machine learning powers many artificial intelligence applications, but a key challenge is deploying and serving trained models efficiently at scale. TensorFlow Serving addresses this challenge: it is a high-performance system for serving machine learning models in production. This post shows how to call TensorFlow Serving from a Symfony 7 application.

Prepare environment

Ensure that TensorFlow Serving is installed on your system. Refer to the post on how to install TensorFlow Serving within a Docker container.

Code

Open the .env file and add a new environment variable, TENSORFLOW_SERVING_URL, pointing to your TensorFlow Serving instance.

.env

TENSORFLOW_SERVING_URL=http://192.168.0.201:8501/v1/models

Here's how you can use TensorFlow Serving with Symfony:

src/Controller/TestController.php

<?php

namespace App\Controller;

use Symfony\Component\DependencyInjection\Attribute\Autowire;
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\Routing\Attribute\Route;
use Symfony\Contracts\HttpClient\HttpClientInterface;

class TestController
{
    public function __construct(
        #[Autowire(env: 'TENSORFLOW_SERVING_URL')]
        private string $tfUrl,
    ) {
    }

    #[Route('/')]
    public function index(HttpClientInterface $client): Response
    {
        // Build the URL of the model's REST predict endpoint.
        $url = $this->tfUrl.'/half_plus_two:predict';

        // A single input instance with three features.
        $requestData = [
            'instances' => [
                [3.0, 5.0, 7.0],
            ],
        ];

        // Send the input data as JSON and decode the response body.
        $response = $client->request(Request::METHOD_POST, $url, ['json' => $requestData]);
        $responseData = $response->toArray()['predictions'];

        return new JsonResponse($responseData); // Output: [[3.5,4.5,5.5]]
    }
}

Let's analyze the code:

  • We specify the TensorFlow Serving URL, incorporating both the model name (half_plus_two) and the predict endpoint.
  • We set up the input data for the model using a multidimensional array. In this example, there is a single input data instance with three features.
  • An HTTP POST request is sent to the TensorFlow Serving endpoint, with the input data serialized as JSON.
  • The response is decoded into an array with toArray(), and the predictions key is extracted.
  • The predicted values are returned as a JSON response. You can customize this according to your requirements, whether it involves rendering a view or providing an alternative response type.
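
For anything beyond a quick demo, you will probably want to move the HTTP call out of the controller and into a dedicated service. Below is a minimal sketch of such a wrapper; the class name, method name, and structure are illustrative rather than part of any official API.

src/Service/TensorFlowServingClient.php

<?php

namespace App\Service;

use Symfony\Component\DependencyInjection\Attribute\Autowire;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Contracts\HttpClient\HttpClientInterface;

class TensorFlowServingClient
{
    public function __construct(
        private HttpClientInterface $client,
        #[Autowire(env: 'TENSORFLOW_SERVING_URL')]
        private string $tfUrl,
    ) {
    }

    /**
     * Sends the given instances to the model's predict endpoint and
     * returns the "predictions" array from the response.
     */
    public function predict(string $model, array $instances): array
    {
        $url = sprintf('%s/%s:predict', $this->tfUrl, $model);

        $response = $this->client->request(Request::METHOD_POST, $url, [
            'json' => ['instances' => $instances],
        ]);

        // toArray() throws an exception on transport errors and non-2xx
        // responses, so callers can catch the HttpClient exceptions as needed.
        return $response->toArray()['predictions'];
    }
}

With this in place, the controller only needs $predictions = $tfClient->predict('half_plus_two', [[3.0, 5.0, 7.0]]);, which keeps model-specific details in one place and makes the call easier to reuse and test.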
