Bazaar is a fully async GraphQL server written in Rust using Actix Web, Async-GraphQL and SQLx. It implements the basic functionality you would expect to find on an e-commerce platform (see functionality.md for more information).
My hope is that this project will be a useful example of how to compose a production-ready Rust application from some of the awesome crates in the ecosystem, and of how the various things you would typically see in enterprise applications (e.g. testing, observability, CI) can be implemented.
Although this project is intended to be fully functional, certain parts of it will be mocked, stubbed, or left out of scope to keep the bounds of this side project reasonable.
I'm still very much learning, so if you spot any issues or have any improvements or feedback on anything within the app, please feel free to drop me a message or raise an issue on the repo.
See below for a diagram of the architecture as it currently stands. In reality you most probably wouldn't run all of the telemetry backends shown, but I wanted to have a go at integrating and playing around with them in this project, so I've left the connections in.
- Stock & Stock Management - Will be included via some stub functionality
- Checkout & Payments - This might be included in the future
The app outputs structured JSON logs. If you want a more human-friendly view of them, you can install the node-bunyan CLI and pipe the logs through it.
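For example, a minimal sketch (assuming the `bunyan` CLI has been installed globally via npm):

```bash
# Pretty-print the structured JSON logs through the bunyan CLI
cargo run --bin app | bunyan
```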
There is a demo docker-compose file within the repository that should handle setting up all of the dependencies and the application within the docker environment. It can be run with:

```bash
docker-compose -f docker-compose.demo.yaml up
```
If you want to run all of the dependencies in docker, but run the app itself with cargo, you can use the docker-compose.yaml file. Once docker-compose has spun up, migrate the database by running `SKIP_DOCKER=true ./scripts/init_db.sh`. Once the migration has finished, the application can be run with `cargo run --bin app`.
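Putting those steps together, a typical workflow for this setup looks something like the following sketch (the detached `-d` flag is optional):

```bash
# Spin up the dependencies (Postgres, telemetry backends, etc.) in docker
docker-compose up -d

# Run the database migrations, skipping the docker setup as it's already running
SKIP_DOCKER=true ./scripts/init_db.sh

# Run the application natively
cargo run --bin app
```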
The minimum dependency the app requires is a running instance of Postgres. The easiest way to set this up is to run the `./scripts/init_db.sh` script, which will pull down the latest docker postgres image, start it up, and run the migrations. You can then run the actual application with `cargo run --bin app`.
```bash
docker build --tag bazaar --file Dockerfile .
docker run -p 8000:8000 bazaar
```
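If you need to provide the manually-managed environment variables listed in the table below, one option (just a sketch, assuming the container reads them from its environment) is to forward them from the host with `-e` flags:

```bash
# Forward the required secrets from the host environment into the container
docker run -p 8000:8000 \
  -e SECRET_KEY \
  -e REFRESH_TOKEN_PRIVATE_KEY \
  -e REFRESH_TOKEN_PUBLIC_KEY \
  -e ACCESS_TOKEN_PRIVATE_KEY \
  -e ACCESS_TOKEN_PUBLIC_KEY \
  bazaar
```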
Most environment variables are managed through configuration files found within the configuration directory, however some that are not included in there (and need to be set up manually in order to run the application) can be found below:

Name | Key | Description | Example |
---|---|---|---|
Authentication Secret Key | `SECRET_KEY` | Holds the secret key used while hashing passwords | `KbPeShVmYq3t6w9y$B&E)H@McQfTjWnZ` |
Private key for Refresh Token | `REFRESH_TOKEN_PRIVATE_KEY` | Holds the private key for signing the refresh token JWTs | Typical RSA Private Key (`.pem` format) |
Public key for Refresh Token | `REFRESH_TOKEN_PUBLIC_KEY` | Holds the public key for verifying the refresh token JWTs | Typical RSA Public Key (`.pem` format) |
Private key for Access Token | `ACCESS_TOKEN_PRIVATE_KEY` | Holds the private key for signing the access token JWTs | Typical RSA Private Key (`.pem` format) |
Public key for Access Token | `ACCESS_TOKEN_PUBLIC_KEY` | Holds the public key for verifying the access token JWTs | Typical RSA Public Key (`.pem` format) |
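As a sketch of how these could be supplied locally, here's a hypothetical direnv `.envrc` (the values and key file paths are only placeholders; see the OpenSSL section below for generating the key pairs):

```bash
# .envrc - loaded automatically by direnv; values and paths are illustrative
export SECRET_KEY='KbPeShVmYq3t6w9y$B&E)H@McQfTjWnZ'
export REFRESH_TOKEN_PRIVATE_KEY="$(cat ./keys/refresh_token.pem)"
export REFRESH_TOKEN_PUBLIC_KEY="$(cat ./keys/refresh_token.pub)"
export ACCESS_TOKEN_PRIVATE_KEY="$(cat ./keys/access_token.pem)"
export ACCESS_TOKEN_PUBLIC_KEY="$(cat ./keys/access_token.pub)"
```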
The CI pipeline includes checks on `sqlx-data.json`; if it detects that there have been changes without this file being updated, CI will fail. See preparing for SQLx offline below for more details.
Although it's not pretty, there's a small binary in the workspace which can be used to generate the GraphQL schema for the application and write it to `schema.graphql`. To do so, just run `cargo run --bin schema`. The easiest way to keep it up to date is to create a basic pre-commit hook that runs it for you.
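A minimal sketch of such a hook (assuming it lives at `.git/hooks/pre-commit` and has been made executable):

```bash
#!/bin/sh
# Regenerate the GraphQL schema and stage it before each commit
set -e
cargo run --bin schema
git add schema.graphql
```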
Name | Purpose | Installation |
---|---|---|
SQLx CLI | Database Migrations | `cargo install --version=0.2.0 sqlx-cli` |
PSQL | Used predominantly for the utilities of `psql` | `brew install libpq && brew link --force libpq` |
direnv | A nice way of managing environment variables within projects | `brew install direnv` |
The majority of the functionality is tested through integration tests, found in the `/tests` directory. This is to ensure the public API that's exposed behaves as intended.

As all authentication is done through `HttpOnly` cookies, you'll generally see a pattern where different HTTP clients (currently `reqwest`) are set up throughout the tests, with each client holding the auth cookies as appropriate.
Using OpenSSL to generate key pairs (replace what's in the `<>`):

Private & Public Key Pair

```bash
openssl genpkey -out <name>.pem -algorithm RSA -pkeyopt rsa_keygen_bits:<len>
```

Extracting Public Key

```bash
openssl rsa -in <name>.pem -pubout > <name>.pub
```
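For example, to generate the refresh token key pair (the file name and key length here are only illustrative):

```bash
# Generate a 4096-bit RSA private key for the refresh token
openssl genpkey -out refresh_token.pem -algorithm RSA -pkeyopt rsa_keygen_bits:4096

# Extract the corresponding public key
openssl rsa -in refresh_token.pem -pubout > refresh_token.pub
```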
```bash
# Ensure $DATABASE_URL is correctly set
sqlx migrate add <migration name>
```
These are set up to run manually with `./scripts/init_db.sh`; optionally, you can pass a skip variable if a docker instance is already running: `SKIP_DOCKER=true ./scripts/init_db.sh`.

If you need to run the migrations directly, you can do so with the sqlx CLI:

```bash
sqlx migrate run
```
CI will fail if the offline SQLx schema hasn't been updated when it should have been. Again, it's probably useful to add this as a pre-commit hook on the project:

```bash
cargo sqlx prepare -- --lib
```
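A minimal sketch of a pre-commit hook for this (assuming a database is reachable at `$DATABASE_URL`, which `cargo sqlx prepare` needs):

```bash
#!/bin/sh
# Keep the offline SQLx schema in sync and stage it before each commit
set -e
cargo sqlx prepare -- --lib
git add sqlx-data.json
```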
- Luca Palmieri's Zero to Production In Rust: This is a great resource that I would highly recommend if you're interested in Rust at all, but particularly if you're interested in building web applications/APIs.