# triton-identity_backend

**Repository Path**: luo_zhi_cheng/triton-identity_backend

## Basic Information

- **Project Name**: triton-identity_backend
- **Description**: 23.12
- **Primary Language**: Unknown
- **License**: BSD-3-Clause
- **Default Branch**: r23.12
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2024-01-11
- **Last Updated**: 2024-08-22

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

[![License](https://img.shields.io/badge/License-BSD3-lightgrey.svg)](https://opensource.org/licenses/BSD-3-Clause)

# Triton Inference Server Identity Backend

A simple Triton backend that copies input tensors to corresponding output tensors. This backend is used primarily for testing.

To learn more about writing your own Triton backend, including simple examples, see the documentation included in the [backend repo](https://github.com/triton-inference-server/backend). Ask questions or report problems with the Identity backend on the main Triton [issues page](https://github.com/triton-inference-server/server/issues).

## Build

Use cmake to build and install in a local directory.

```
$ mkdir build
$ cd build
$ cmake -DTRITON_ENABLE_GPU=ON -DCMAKE_INSTALL_PREFIX:PATH=`pwd`/install ..
$ make install
```

The following required Triton repositories will be pulled and used in the build. By default the "main" branch/tag will be used for each repo, but the listed CMake argument can be used to override it:

* triton-inference-server/backend: `-DTRITON_BACKEND_REPO_TAG=[tag]`
* triton-inference-server/core: `-DTRITON_CORE_REPO_TAG=[tag]`
* triton-inference-server/common: `-DTRITON_COMMON_REPO_TAG=[tag]`

If you are building on a release branch (or on a development branch that is based off of a release branch), then you must set these cmake arguments to point to that release branch as well.
For example, if you are building the r23.04 identity_backend branch, you need to use the following additional cmake flags:

```
-DTRITON_BACKEND_REPO_TAG=r23.04
-DTRITON_CORE_REPO_TAG=r23.04
-DTRITON_COMMON_REPO_TAG=r23.04
```

## Custom Metric Example

When `TRITON_ENABLE_METRICS` is enabled, this backend implements an example of registering a custom metric to Triton's existing metrics endpoint via the [Metrics API](https://github.com/triton-inference-server/server/blob/main/docs/user_guide/metrics.md#custom-metrics). This metric tracks the cumulative `input_byte_size` of all requests to this backend, per model.

Here's example output of the custom metric from Triton's metrics endpoint after a few requests to each model:

```
# HELP input_byte_size_counter Cumulative input byte size of all requests received by the model
# TYPE input_byte_size_counter counter
input_byte_size_counter{model="identity_uint32",version="1"} 64.000000
input_byte_size_counter{model="identity_fp32",version="1"} 32.000000
```

This example can be referenced to implement custom metrics for other use cases.
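To make the semantics of the metric concrete, the following is a minimal Python sketch of what the backend's counter does conceptually: it accumulates `input_byte_size` per model/version pair and renders it in the Prometheus text format shown above. The class and method names here (`InputByteSizeCounter`, `record_request`, `export`) are illustrative only; the actual backend registers the counter through Triton's C Metrics API, not this code.

```python
from collections import defaultdict


class InputByteSizeCounter:
    """Illustrative sketch: cumulative input bytes per (model, version).

    This mimics the behavior of the backend's custom metric; it is NOT
    the real implementation, which uses Triton's C Metrics API.
    """

    def __init__(self):
        # A Prometheus-style counter only ever increases.
        self._counters = defaultdict(float)

    def record_request(self, model, version, input_byte_size):
        # Called once per inference request handled by the backend.
        self._counters[(model, version)] += input_byte_size

    def export(self):
        # Render in the Prometheus text exposition format, matching the
        # sample output from Triton's /metrics endpoint above.
        lines = [
            "# HELP input_byte_size_counter Cumulative input byte size "
            "of all requests received by the model",
            "# TYPE input_byte_size_counter counter",
        ]
        for (model, version), value in sorted(self._counters.items()):
            lines.append(
                f'input_byte_size_counter{{model="{model}",'
                f'version="{version}"}} {value:f}'
            )
        return "\n".join(lines)


if __name__ == "__main__":
    counter = InputByteSizeCounter()
    # Two 32-byte requests to one model, one to the other, reproducing
    # the 64.000000 / 32.000000 values in the sample output.
    counter.record_request("identity_uint32", "1", 32)
    counter.record_request("identity_uint32", "1", 32)
    counter.record_request("identity_fp32", "1", 32)
    print(counter.export())
```

Because the metric is a counter, values never decrease across requests; restarting the server resets them, as with any Triton metric.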