I recently started using Turso, a great database option for serverless workloads and for keeping costs low. Its pricing is more than competitive with something like AWS DynamoDB, which makes it an interesting alternative.
One problem, though: there's no support for Turso in any of the good ORMs or similar libraries in the Rust ecosystem.
Instead, I found myself writing something like this, just to get a User out of the database via their session:
```rust
let database_context = ctx.data::<DatabaseContext>().expect("Missing database context");
let conn = database_context.users.get().expect("No connection to users database");
let mut stmt = conn
    .prepare(
        "SELECT users.user_id, users.first_name, users.last_name, users.country, users.onboarding_step, users.demo_user
        FROM auth_sessions AS sessions
        LEFT JOIN users AS users ON users.user_id = sessions.user_id
        WHERE sessions.session_id = :session",
    )
    .await
    .expect("Failed to create the prepared statement to query sessions table");
let row = stmt
    .query(libsql::named_params! { ":session": auth_session.session.clone() })
    .await
    .expect("Failed to query sessions table")
    .next()
    .await
    .expect("Failed to get the first row of the sessions table");
if let Some(row) = row {
    let user: User = de::from_row::<lib_database::User>(&row)
        .expect("Failed to deserialize user")
        .into();
}
```
The above does the following:
Pulls the connection info from the async-graphql request context
Establishes a connection to the users database
Creates a prepared statement
Runs the query with the prepared statement, passing in the named parameters, and tries to get the rows
Tries to get the first row (a Result<Option<Row>>)
And finally, if that row was not None, tries to deserialize the row, first into the manually created lib_database::User type, and then into the User GraphQL type, which has a From implementation converting from the database type to the GraphQL type
…quite verbose, if I'm being honest, and not something I'd like to bloat my codebase with.
There are also some obvious issues here:
Nothing checks that my SQL statement is valid, or that those tables/columns even exist
Nothing verifies that my SQL fetches the data necessary to deserialize into the lib_database::User type
Instead I'd like something closer to the following, checked at compile time:
Verifies that the query only refers to tables and columns that exist, by checking against the database schema file (no running database needed)
Verifies that all named parameters in the query are populated
Creates some form of anonymous record/map with the column name as the key for each row
The code could look something like:
```rust
// Construct a type-safe builder based on the parameters in the query.
let query = query!(
    "SELECT users.user_id, users.first_name, users.last_name, users.country, users.onboarding_step, users.demo_user
    FROM auth_sessions
    LEFT JOIN users ON users.user_id = auth_sessions.user_id
    WHERE auth_sessions.session_id = :session"
)
.session(auth_session.session.clone().into())
.build();

// Easily grab the first row.
let row = query.first(&conn).await;

// Access each selected field in a type-safe manner.
let user = User::new(row.user_id, row.first_name, row.last_name, row.onboarding_step);
```
We want as much as possible to be checked at compile time, and that means we'll very likely need to reach for a procedural macro. In Rust, we have two forms of macros:
Declarative macros: Defined with macro_rules! and expanded by pattern-matching on the input tokens
Procedural macros: Defined in their own crate via #[proc_macro] and similar, depending on which kind you want (e.g. custom!(...), #[derive(CustomDerive)], or #[CustomAttribute])
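To make the contrast concrete, here's a small declarative macro (not from this project): it can only pattern-match and rearrange tokens, so it can't read a schema file or validate SQL at expansion time, which is exactly why we need a proc macro here.

```rust
// A declarative macro expands purely by pattern-matching on its input tokens.
// This one counts its comma-separated arguments via recursion.
macro_rules! count {
    () => { 0usize };
    ($head:expr $(, $tail:expr)*) => { 1usize + count!($($tail),*) };
}

fn main() {
    assert_eq!(count!(), 0);
    assert_eq!(count!(1, 2, 3), 3);
    println!("count!(1, 2, 3) = {}", count!(1, 2, 3));
}
```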
Proc macros cannot be used in the same crate as they are defined, so we'll first need to create a new crate that we can use with our project:
$ cargo new lib-macros --lib
We'll then need to edit our lib-macros/Cargo.toml to add a few crates, both for our proc macro but also for our SQL parsing logic:
```toml
[package]
name = "lib-macros"
version = "0.1.0"
edition = "2021"

[lib]
proc-macro = true

[profile.release]
opt-level = 3     # Optimize for speed.
lto = true        # Enable Link Time Optimization.
codegen-units = 1 # Reduce Parallel Code Generation Units to Increase Optimization.
strip = true      # Automatically strip symbols from the binary.
debug = false

[dependencies]
# Parsing SQL.
sqlparser = "0.51.0"

# Macro dependencies.
syn = "2.0"
quote = "1.0"
proc-macro2 = "1.0"

[dev-dependencies]
# Necessary for testing and also for any project actually using this proc-macro.
```
Note that when using this proc-macro in your project, you'll basically need all the dependencies that are listed in [dev-dependencies], since the generated code will rely on these.
We'll then tackle the functionality in two steps:
Handle parsing SQL from strings and files
Stitch things together in a proc-macro that will generate our code
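To give a feel for the first step, here's a deliberately simplified, string-based sketch of extracting the selected columns from a query. The function name is made up for illustration, and the real helpers walk the AST produced by the sqlparser crate instead of slicing strings, but the information extracted is the same.

```rust
// Simplified sketch: pull the selected column names out of a SELECT statement
// by slicing the text between SELECT and FROM. The actual implementation uses
// the `sqlparser` AST, which handles aliases, subqueries, etc. properly.
fn selected_fields(sql: &str) -> Vec<String> {
    let upper = sql.to_uppercase();
    let start = upper.find("SELECT").map(|i| i + "SELECT".len()).unwrap_or(0);
    let end = upper.find(" FROM").unwrap_or(sql.len());
    sql[start..end]
        .split(',')
        .map(|col| {
            // Strip an optional `table.` qualifier, e.g. `users.user_id` -> `user_id`.
            let col = col.trim();
            col.rsplit('.').next().unwrap_or(col).to_owned()
        })
        .collect()
}

fn main() {
    let fields = selected_fields("SELECT users.user_id, users.first_name FROM auth_sessions");
    assert_eq!(fields, vec!["user_id", "first_name"]);
    println!("{:?}", fields);
}
```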
With our helper functions in place for working with the SQL, we can finally piece together our proc-macro.
Here's a rough overview of what the code does:
Reads the input query as a string literal
Uses our SQL helpers to
Extract the selected fields from the query
Get all table information from a file located at ../database/users.sql (that is, outside of the lib-macros directory, and inside another directory called database)
Validate that the selected fields actually exist in the correct tables
Extract all the params under the assumption that they are the only things starting with :
Prepare some of the code generation we'll need later
Prepare the struct field names for our QueryParams typesafe builder, with a type of libsql::Value
Prepare the parameter name and struct field value pairs for libsql::named_params! which is used to pass the parameters to a prepared statement
Prepare our struct of our return data based on our selected fields and the types that they have, which is found by looking them up in the table information
Finally, we put it all together and set up the various structs with their fields, as well as an impl of our struct that provides some convenience functions
execute: Runs the query and deserializes each libsql::row into our custom return data struct
first: Simply returns the first row if there is any, or None
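As an illustration of the parameter-extraction step, here's a minimal sketch (the function name and exact tokenization are mine, not the macro's): it collects every token starting with `:`, matching the assumption described above.

```rust
// Sketch: extract named parameters of the form `:name` from a SQL string,
// assuming parameters are the only tokens that start with `:`.
fn extract_params(sql: &str) -> Vec<String> {
    sql.split(|c: char| !(c.is_alphanumeric() || c == '_' || c == ':'))
        .filter_map(|token| token.strip_prefix(':'))
        .filter(|name| !name.is_empty())
        .map(str::to_owned)
        .collect()
}

fn main() {
    let params = extract_params(
        "SELECT user_id FROM auth_sessions WHERE session_id = :session AND expires_at > :now",
    );
    assert_eq!(params, vec!["session", "now"]);
    println!("{:?}", params);
}
```

These extracted names are what would feed the generated QueryParams builder and the libsql::named_params! invocation described above.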
Adjust the path to your database schema and save the contents below into lib-macros/src/lib.rs.
To test the proc-macro, we cannot use unit tests; instead we need an integration test. Create the directory lib-macros/tests and add the file lib-macros/tests/lib.rs:
```rust
use lib_macros::query;

const TABLE_SCHEMA: &str = r#"
CREATE TABLE users (
    user_id varchar(36) NOT NULL,
    first_name varchar(255),
    last_name varchar(255),
    country varchar(3),
    onboarding_step varchar(255) NOT NULL DEFAULT 'first',
    demo_user boolean NOT NULL DEFAULT false,
    created_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (user_id)
);

CREATE TABLE auth_sessions (
    session_id varchar(36) NOT NULL,
    user_id varchar(36) NOT NULL,
    created_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    expires_at timestamp NOT NULL,
    PRIMARY KEY (session_id)
);
"#;

async fn populate_db(conn: &libsql::Connection) {
    // Create the table schema.
    let stmts = TABLE_SCHEMA.split(";").filter(|s| !s.trim().is_empty()).collect::<Vec<_>>();
    for stmt in stmts {
        conn.execute(stmt, libsql::params![])
            .await
            .expect(format!("Failed to execute statement: {}", stmt).as_str());
    }

    // Populate some data into the table.
    conn.execute("INSERT INTO users (user_id, first_name, last_name, country, onboarding_step, demo_user) VALUES ('1234', 'John', 'Doe', 'USA', 'first', false)", libsql::params![]).await.expect("Failed to insert row into users table");

    // …

        LEFT JOIN users ON users.user_id = auth_sessions.user_id
        WHERE auth_sessions.session_id = :session"
    )
    .session("some-session-id".into())
    .build();
    let row = query.first(&conn).await;

    // Check that we got a single row back.
    assert!(row.is_some());

    // Check that the row has the expected length.
    let rows = query.execute(&conn).await;
    assert_eq!(rows.len(), 1);
}
```
This will let you validate that the proc-macro actually works as expected. Adjust the test query to something that fits with your own table definition, or copy the example SQL below from the Errors section, and put it into the expected location in database/users.sql:
```sql
CREATE INDEX IF NOT EXISTS auth_sessions_user_lookup ON auth_sessions (user_id);
```
Take the following faulty query (note the field onboarding, which should be onboarding_step):
```rust
let conn = database_context.users.get().expect("No connection to users database");
let query = query!("SELECT user_id, first_name, last_name, country, onboarding FROM users");
let query = query.build();
let rows = query.execute(&conn).await;
let users: Vec<User> = rows
    .into_iter()
    .map(|row| User::new(row.user_id, row.first_name, row.last_name, row.onboarding_step))
    .collect();
```
It will give a compile-time error:
Field 'onboarding' does not exist in table 'users'
Which is very helpful: it blocks us from keeping an invalid query, and points out both the field it cannot find and the table it was trying to find it on.