by Tobi Lehman [email protected]
This is a demo HLS (HTTP Live Streaming) player written in C. It also includes a basic HLS server.
Apple created HLS as a video streaming format built on top of HTTP(S). In a nutshell, the backend is an HTTP server that serves a playlist; the playlist lists a series of short video segments, and the client fetches those segments, downloads them, decodes them, and plays them.
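For reference, a media playlist is a plain-text M3U8 file that looks roughly like the following. The segment names and durations here are illustrative, not copied from this repo's output; ffmpeg's actual naming depends on the flags used:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:3.500000,
output0.ts
#EXTINF:3.500000,
output1.ts
#EXTINF:3.500000,
output2.ts
#EXT-X-ENDLIST
```

Each `#EXTINF` tag gives the duration of the segment named on the following line, and `#EXT-X-ENDLIST` marks the playlist as complete (video-on-demand rather than live).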
To keep this repo focused on the client side, we take a 12-second MP4 file and produce an HLS stream of 3 segments using FFmpeg (the exact command is shown in the Streaming Server section below).
To build the client, run:

```shell
cmake -B build
cmake --build build
```

Then to run it, run:

```shell
./build/hls_client
```
See the Streaming Server section to get the server running; the client will connect to it.
The client connects to the server, requests the manifest, and then loops over all the segments, requesting them and buffering them for the player.
```mermaid
sequenceDiagram
    client->>server: GET /manifest.m3u8
    server-->>client: 200 OK (manifest.m3u8)
    client->>server: GET /segment_1.ts
    server-->>client: 200 OK (segment_1.ts)
    client->>server: GET /segment_2.ts
    server-->>client: 200 OK (segment_2.ts)
```
The "H" in HLS stands for HTTP; the solid right arrows in the diagram are regular HTTP GET requests.
The streaming server is a basic civetweb server that serves the manifest and segments created by this command:

```shell
(cd data && ffmpeg -i F-35.mp4 -c:v libx264 -g 105 -keyint_min 105 -hls_time 2 -hls_list_size 0 -f hls output.m3u8)
```

Here `-g 105` and `-keyint_min 105` force a keyframe every 105 frames, `-hls_time 2` requests roughly 2-second segments (actual segment boundaries snap to keyframes), and `-hls_list_size 0` keeps every segment in the playlist.
To build the server, run:

```shell
cmake -B build
cmake --build build
```

Then to run it, run:

```shell
./build/hls_server
```
This will create an HLS endpoint at http://localhost:8080/output.m3u8
To extract all the keyframes from an MP4 file, run this ffmpeg command:

```shell
(cd data/ && ffmpeg -i F-35.mp4 -vf "select='eq(pict_type\,I)'" -vsync vfr keyframe_%d.png)
```
On the example video, there were only 4 keyframes. Since each HLS segment must begin on a keyframe, this limits how many segments your streaming server can emit.