# Architecture

## A/V Processing

<figure><img src="https://951110271-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FAQRvnKSkK1SsZB0HeYhh%2Fuploads%2FkUItsZ0yRh1wSAgtMiN6%2Fcore-arch.png?alt=media&#x26;token=818e653b-5762-4200-b0f6-47a29198c6c9" alt=""><figcaption></figcaption></figure>

1. Core launches and monitors FFmpeg processes.
2. FFmpeg can use the HTTP, RTMP, and SRT services as streaming backends for processing incoming and outgoing A/V content.
3. Two storage locations are available for the HTTP service: the in-memory file system (MemFS), which is very fast because it avoids disk I/O, and the disk file system (DiskFS), which stores files on the HDD/SSD of the host system.
4. Optionally, FFmpeg can access host system devices such as GPUs and USB interfaces (requires an FFmpeg build with the corresponding support).
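
The flow above can be sketched as a single FFmpeg invocation of the kind Core might launch: reading a live stream from the built-in RTMP service and writing HLS segments to MemFS via HTTP PUT. The host names, ports, and paths below are illustrative assumptions, not fixed Core defaults.

```shell
# Sketch only: ingest from the RTMP service, transcode, and upload
# the HLS playlist and segments to MemFS through the HTTP service.
ffmpeg -hide_banner \
  -i "rtmp://127.0.0.1:1935/live/mystream" \
  -c:v libx264 -preset veryfast -c:a aac \
  -f hls -hls_time 2 -hls_list_size 6 -hls_flags delete_segments \
  -method PUT \
  "http://127.0.0.1:8080/memfs/live/mystream.m3u8"
```

Writing the playlist to a `memfs` path keeps all segment I/O in memory; pointing the same output at a DiskFS path would instead persist the segments on the host's HDD/SSD.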

{% hint style="success" %}
FFmpeg can also use external input and output URLs.
{% endhint %}
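
For example, FFmpeg can bypass Core's own services entirely and restream between external endpoints. The URLs below are placeholders, and the SRT input assumes FFmpeg was built with libsrt support.

```shell
# Sketch only: pull an external SRT source and push it, without
# re-encoding, to an external RTMP destination.
ffmpeg -hide_banner \
  -i "srt://camera.example.com:6000?mode=caller" \
  -c copy \
  -f flv "rtmp://live.example.com/app/STREAM-KEY"
```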
