# TorchScript

TorchScript is a tool in the PyTorch ecosystem that enables *<mark style="color:yellow;">**serialising PyTorch models into a format that can be run independently of Python**</mark>*.

This is particularly useful in production environments where running a Python interpreter adds unwanted overhead or simply isn't an option.

TorchScript allows the use of PyTorch for training models with its flexible and friendly Pythonic interface, while also offering a path to use the trained model in an optimized, potentially multi-threaded, lower-level language runtime.&#x20;
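A minimal sketch of that workflow, using a hypothetical toy module: trace a trained model with an example input, serialise the resulting TorchScript module, and load it back without needing the original Python class definition.

```python
import torch
import torch.nn as nn

# A toy model (hypothetical, for illustration only).
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

model = TinyNet().eval()

# Trace the model with a dummy input to produce a TorchScript module.
example = torch.randn(1, 4)
traced = torch.jit.trace(model, example)

# Serialise to disk; the saved file carries the model's code and weights,
# so loading it does not require the TinyNet class at all.
traced.save("tiny_net.pt")
loaded = torch.jit.load("tiny_net.pt")
print(loaded(example).shape)  # torch.Size([1, 2])
```

Note that `torch.jit.trace` records only the operations executed for the example input; models with data-dependent control flow are better served by `torch.jit.script`.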

{% embed url="https://pytorch.org/docs/stable/jit.html" %}

## <mark style="color:purple;">**Examples of how you could use TorchScript in practice**</mark>

#### <mark style="color:green;">**Deploying AI models in a C++ environment**</mark>

If you have an AI model trained in PyTorch using Python, you can use TorchScript to *serialise your model and then load and run it in a C++ environment* without needing Python at all.&#x20;

This is extremely useful in production scenarios where you might want to eliminate Python dependencies for performance reasons or integrate with existing C++ codebases.
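The Python side of that handoff might look like the following sketch: script a model (here a hypothetical classifier built from stock layers) and save it to a file that a C++ program can open with `torch::jit::load`.

```python
import torch

# torch.jit.script compiles the model, including any Python control flow,
# rather than just recording ops for one example input as tracing does.
@torch.jit.script
def clamp_relu(x: torch.Tensor, limit: float) -> torch.Tensor:
    # Data-dependent branching like this survives scripting.
    if limit > 0:
        return torch.clamp(x, min=0.0, max=limit)
    return torch.relu(x)

# Save the scripted function; a C++ application can load this exact file
# with torch::jit::load and run it with no Python runtime present.
clamp_relu.save("clamp_relu.pt")
reloaded = torch.jit.load("clamp_relu.pt")
print(reloaded(torch.tensor([-1.0, 2.0, 9.0]), 5.0))
```

The file format is the same on both sides, so the C++ loader needs only LibTorch, not your training code.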

#### <mark style="color:green;">**Optimising for mobile and embedded devices**</mark>

TorchScript provides an efficient way to *<mark style="color:yellow;">optimise your models for mobile and embedded devices</mark>* that might not have Python interpreters.&#x20;

For instance, you can train a PyTorch model on a powerful server with a lot of computational resources and then convert the model to TorchScript to deploy it on a *mobile device for on-device inference*.
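As a sketch of that conversion step, PyTorch ships a `mobile_optimizer` utility that fuses operations and strips training-only code paths from a scripted model before it is packaged for a mobile runtime (the model below is a stand-in, not a real production network):

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# A stand-in model; in practice this would be your trained network.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 8, 3), torch.nn.ReLU()).eval()
traced = torch.jit.trace(model, torch.randn(1, 3, 32, 32))

# Fuse conv/ReLU pairs and drop training-only paths for mobile execution.
mobile_model = optimize_for_mobile(traced)

# Save in the lite-interpreter format consumed by PyTorch's mobile runtimes.
mobile_model._save_for_lite_interpreter("model.ptl")
```

The optimised model is still a regular TorchScript module on the desktop side, so you can validate its outputs against the original before shipping it.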

#### <mark style="color:green;">**Multi-threading and GPU Acceleration**</mark>

TorchScript models can be executed in parallel, taking advantage of multi-core CPUs and GPUs.&#x20;

This is useful for tasks such as image processing, where you might want to process multiple images at the same time.

If you've trained a PyTorch model for image classification, for example, you can use TorchScript to serialise your model and then use it in a multi-threaded application to classify multiple images in parallel.
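A minimal sketch of parallel inference from Python, assuming a small stand-in model: TorchScript execution releases the GIL, so a plain thread pool can run separate inference calls genuinely in parallel on a multi-core CPU.

```python
import torch
from concurrent.futures import ThreadPoolExecutor

# A stand-in classifier; scripting it produces a GIL-releasing module.
model = torch.jit.script(torch.nn.Linear(8, 2).eval())

# Four independent batches, classified concurrently by four worker threads.
batches = [torch.randn(16, 8) for _ in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(model, batches))

print(len(results), results[0].shape)
```

The same serialised model can instead be loaded into a multi-threaded C++ service, where each worker thread calls `forward` on its own inputs.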

Remember, TorchScript models are just like regular PyTorch models, and can be *<mark style="color:yellow;">trained, fine-tuned, or used for inference</mark>*. The primary difference is that TorchScript models can run independently from a Python runtime, making them more versatile in different production scenarios.
