Improved Turso (libsql) ergonomics in Rust

Christian Kjær

October 1, 2024

I recently started using Turso, which is a great database option for serverless workloads, as well as for keeping costs low. Its pricing is more than competitive with something like AWS DynamoDB, which makes it quite an interesting alternative.

One problem though: there's no support for Turso in any good ORM or similar in the Rust ecosystem.

Instead, I found myself writing something like this, just to get a User out of the database via their session:

let database_context = ctx.data::<DatabaseContext>().expect("Missing database context");
let conn = database_context.users.get().expect("No connection to users database");
let mut stmt = conn
    .prepare(
        "SELECT users.user_id, users.first_name, users.last_name, users.country, users.onboarding_step, users.demo_user
        FROM auth_sessions AS sessions
        LEFT JOIN users AS users ON users.user_id = sessions.user_id
        WHERE sessions.session_id = :session",
    )
    .await
    .expect("Failed to create the prepared statement to query sessions table");
let row = stmt
    .query(libsql::named_params! { ":session": auth_session.session.clone() })
    .await
    .expect("Failed to query sessions table")
    .next()
    .await
    .expect("Failed to get the first row of the sessions table");

if let Some(row) = row {
    let user: User = de::from_row::<lib_database::User>(&row)
        .expect("Failed to deserialize user")
        .into();
}

The above does the following:

  • Pulls the connection info from the async-graphql request context
  • Establishes a connection to the users database
  • Creates a prepared statement
  • Runs the query via the prepared statement, passing in the named parameters, and tries to get the rows
  • Tries to get the first row (a Result<Option<Row>>)
  • And finally, if that row was not None, tries to deserialize the row into the manually created lib_database::User type, and then into the User GraphQL type via a From implementation that converts the database type to the GraphQL type
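That last conversion step leans on a From implementation between the two types. Here's a minimal sketch of the pattern, with hypothetical stand-ins for the database and GraphQL types (the real lib_database::User has more fields):

```rust
// Hypothetical stand-in for the manually created database row type.
struct DbUser {
    user_id: String,
    first_name: Option<String>,
}

// Hypothetical stand-in for the GraphQL-facing type.
#[derive(Debug, PartialEq)]
struct User {
    id: String,
    first_name: String,
}

// Convert the database type into the API-facing type, so `.into()` works.
impl From<DbUser> for User {
    fn from(db: DbUser) -> Self {
        User {
            id: db.user_id,
            // Fall back to an empty string for a missing first name.
            first_name: db.first_name.unwrap_or_default(),
        }
    }
}

fn main() {
    let db = DbUser { user_id: "1234".into(), first_name: Some("John".into()) };
    let user: User = db.into();
    assert_eq!(user.id, "1234");
    println!("{user:?}");
}
```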

…quite verbose, to be honest, and not something that I'd like to bloat my codebase with.

There are also some obvious issues here:

  • Nothing checks that my SQL statement is valid and that those tables/columns even exist
  • Nothing verifies that my SQL fetches the data necessary to deserialize into the lib_database::User type

Instead, I'd like something closer to the following, checked at compile time:

  • Verify that the query only refers to tables and columns that exist, by checking against the database schema file (no running database needed)
  • Verify that all named parameters in the query are also populated
  • Create some form of anonymous record/map with the column name as the key for each row

The code could look something like:

// Construct a type-safe builder based on the parameters in the query.
let query = query!(
    "SELECT users.user_id, users.first_name, users.last_name, users.country, users.onboarding_step, users.demo_user
    FROM auth_sessions
    LEFT JOIN users ON users.user_id = auth_sessions.user_id
    WHERE auth_sessions.session_id = :session"
).session(auth_session.session.clone().into()).build();
// Easily grab the first row.
let row = query.first(&conn).await;
// Access each selected field in a type-safe manner.
let user =
  User::new(row.user_id, row.first_name, row.last_name, row.onboarding_step);

So let's build it!

Compile time checking

We want as much as possible to be checked at compile time, which means we'll very likely need to reach for a procedural macro. In Rust, we have two forms of macros:

  • Declarative macros: Defined using macro_rules! (e.g. println!)
  • Procedural macros: Defined in their own crate via #[proc_macro] and similar, depending on what type you want (e.g. custom!(...), #[derive(CustomDerive)], or #[CustomAttribute])
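To make the distinction concrete, here's a tiny declarative macro sketch (the macro name is made up for illustration). It's pattern-based and lives right next to its usage, whereas a proc macro gets the raw token stream and runs arbitrary Rust at compile time, which is what we need for SQL validation:

```rust
// A declarative macro: expands purely by pattern matching on tokens.
macro_rules! select_fields {
    // Accept one or more identifiers, with an optional trailing comma.
    ($($field:ident),+ $(,)?) => {
        // Builds a `&'static str` at compile time from the token list.
        concat!("SELECT ", stringify!($($field),+))
    };
}

fn main() {
    let sql = select_fields!(user_id, first_name);
    println!("{sql}");
}
```

This can only rearrange tokens; it cannot open the schema file or parse SQL, which is why we reach for a proc macro below.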

Proc macros cannot be used in the same crate in which they are defined, so we'll first need to create a new crate that we can use with our project:

$ cargo new lib-macros --lib

We'll then need to edit our lib-macros/Cargo.toml to add a few crates, both for our proc macro but also for our SQL parsing logic:

[package]
name = "lib-macros"
version = "0.1.0"
edition = "2021"

[lib]
proc-macro = true

[profile.release]
opt-level = 3     # Optimize for speed.
lto = true        # Enable Link Time Optimization.
codegen-units = 1 # Reduce parallel code generation units to increase optimization.
strip = true      # Automatically strip symbols from the binary.
debug = false

[dependencies]
# Parsing SQL.
sqlparser = "0.51.0"
# Macro dependencies.
syn = "2.0"
quote = "1.0"
proc-macro2 = "1.0"

[dev-dependencies]
# Necessary for testing and also for any project actually using this proc-macro.
typed-builder = "0.20.0"
libsql = "0.6.0"
serde = { version = "1.0.209", features = ["serde_derive"] }
serde_json = "1.0.127"
# Only for testing.
tokio = { version = "1", features = ["macros"] }

Note that when using this proc-macro in your project, you'll need essentially all of the dependencies listed under [dev-dependencies], since the generated code relies on them.

We'll then tackle the functionality in two steps:

  1. Handle parsing SQL from strings and files
  2. Stitch things together in a proc-macro that will generate our code

Parsing SQL

We will be reaching for the sqlparser crate to be able to parse the SQLite table definitions as well as our queries.

Here's a rough overview of what the code does:

  • select_names: Extract the selected field names from a SELECT statement.
  • create_table_columns: Extract the column names and types from a CREATE TABLE statement.
  • process_sql_file: Read an SQL file, extract all the CREATE TABLE statements, and process them.
  • validate_fields: Validate that the selected fields are found in the tables and named as expected.

Save the contents below into lib-macros/src/sql.rs.

use sqlparser::ast::{
    ColumnOption, ColumnOptionDef, DataType, SelectItem, SetExpr, Statement, TableFactor,
    TableWithJoins,
};
use sqlparser::dialect::SQLiteDialect;
use sqlparser::parser::Parser;
use std::collections::HashMap;
use std::fs;
use std::io;

#[derive(PartialEq, Eq, Debug)]
pub struct TableInfo {
    pub table_name: String,
    pub columns: Vec<(String, String)>,
}

#[derive(PartialEq, Eq, Debug)]
pub struct FieldInfo {
    pub table_name: String,
    pub alias: String,
    pub field_name: String,
}

/// Validate that the selected fields are found in the tables and named as expected.
pub fn validate_fields(
    tables: &HashMap<String, TableInfo>,
    fields: &Vec<FieldInfo>,
) -> Result<(), Box<dyn std::error::Error>> {
    // Validate that the fields are found and named as expected.
    for field in fields {
        let Some(table) = tables.get(&field.table_name) else {
            return Err(format!(
                "Table '{}' was not found in the SQL schema file.",
                field.table_name
            )
            .into());
        };
        if !table.columns.iter().any(|(name, _)| name == &field.field_name) {
            return Err(format!(
                "Field '{}' does not exist in table '{}'.",
                field.field_name, field.table_name
            )
            .into());
        }
    }

    Ok(())
}

/// Process a SQL file and extract the CREATE TABLE statements.
pub fn process_sql_file(filename: &str) -> io::Result<HashMap<String, TableInfo>> {
    // Read the contents of our SQL file.
    let content = fs::read_to_string(filename)?;

    // Split the content on semicolons to get individual statements.
    let statements: Vec<&str> = content.split(';').collect();

    // Filter and process CREATE TABLE statements.
    let mut table_map = HashMap::new();

    for statement in statements {
        let trimmed_stmt = statement.trim();
        if trimmed_stmt.to_lowercase().starts_with("create table") {
            match create_table_columns(trimmed_stmt) {
                Ok(table_info) => {
                    table_map.insert(table_info.table_name.clone(), table_info);
                }
                Err(e) => {
                    eprintln!("Error processing statement: {trimmed_stmt}. Error: {e}");
                }
            }
        }
    }

    Ok(table_map)
}

/// Extract the columns from a CREATE TABLE statement.
///
/// See the test [`test_create_table_columns`] for an example.
pub fn create_table_columns(query: &str) -> Result<TableInfo, Box<dyn std::error::Error>> {
    let dialect = SQLiteDialect {};
    let mut ast = Parser::parse_sql(&dialect, query)?;

    // We expect one statement, a CREATE TABLE.
    if ast.len() != 1 {
        return Err("Expected a single SQL statement".into());
    }

    let statement = ast.pop().unwrap();

    // Match the AST to drill into the CreateTable statement.
    if let Statement::CreateTable(create_table) = statement {
        let columns = create_table.columns;
        // Extract column names and data types.
        let column_info: Vec<(String, String)> = columns
            .iter()
            .map(|col| {
                let name = col.name.value.clone();
                let data_type = sql_to_rust_type(&col.data_type, &col.options);
                (name, data_type)
            })
            .collect();

        // The table name has the format: ObjectName([Ident { value: "users", quote_style: Some('"') }])
        let table_name =
            create_table.name.0.into_iter().map(|ident| ident.value).collect::<Vec<_>>().join(".");
        Ok(TableInfo { table_name, columns: column_info })
    } else {
        Err("Expected a CreateTable statement".into())
    }
}

/// Convert the SQL data type to the Rust equivalent.
fn sql_to_rust_type(data_type: &DataType, options: &Vec<ColumnOptionDef>) -> String {
    let datatype = match data_type {
        DataType::Int(_) => "i32".to_string(),
        DataType::BigInt(_) => "i64".to_string(),
        DataType::SmallInt(_) => "i16".to_string(),
        DataType::Char(_) => "String".to_string(),
        DataType::Varchar(_) => "String".to_string(),
        DataType::Float(_) => "f32".to_string(),
        DataType::Double => "f64".to_string(),
        DataType::Boolean => "bool".to_string(),
        DataType::Date => "chrono::NaiveDate".to_string(),
        DataType::Time(_, _) => "chrono::NaiveTime".to_string(),
        DataType::Timestamp(_, _) => "chrono::NaiveDateTime".to_string(),
        DataType::Decimal(_) => "rust_decimal::Decimal".to_string(),
        // Add more mappings as needed.
        _ => "String".to_string(), // Default to String for unknown types.
    };

    // Check if the type is optional, indicated by no NOT NULL and no DEFAULT value.
    //
    // Example of optional:
    //   options: []
    //
    // Example of NOT NULL:
    //   options: [ColumnOptionDef { name: None, option: NotNull }]
    //
    // Example of NOT NULL with DEFAULT:
    //   options: [ColumnOptionDef { name: None, option: NotNull }, ColumnOptionDef { name: None, option: Default(Value(Boolean(false))) }]
    let mut has_not_null = false;
    let mut has_default = false;
    for option in options {
        match option {
            ColumnOptionDef { option: ColumnOption::NotNull, .. } => {
                has_not_null = true;
            }
            ColumnOptionDef { option: ColumnOption::Default(_), .. } => {
                has_default = true;
            }
            _ => {}
        }
    }

    if has_not_null || has_default {
        datatype
    } else {
        format!("Option<{datatype}>")
    }
}

/// Extract the table name from a FROM clause.
fn extract_table_name(from: &Vec<TableWithJoins>) -> Option<String> {
    match from.first() {
        Some(table_with_joins) => match &table_with_joins.relation {
            TableFactor::Table { name, .. } => Some(
                name.clone().0.into_iter().map(|ident| ident.value).collect::<Vec<_>>().join("."),
            ),
            _ => None,
        },
        None => None,
    }
}

/// Extract the selected field names from a SELECT statement.
///
/// See the test [`test_select_names`] for an example.
pub fn select_names(sql: &str) -> Result<Vec<FieldInfo>, Box<dyn std::error::Error>> {
    let dialect = SQLiteDialect {};
    let mut ast = Parser::parse_sql(&dialect, sql)?;

    // We expect one statement, a SELECT.
    if ast.len() != 1 {
        return Err("Expected a single SQL statement".into());
    }
    let statement = ast.pop().unwrap();

    // Match the AST to drill into the Query and Select statement.
    if let Statement::Query(query) = statement {
        if let SetExpr::Select(select) = *query.body {
            // Extract field names from the projection.
            let fields: Vec<FieldInfo> = select
                .projection
                .iter()
                .filter_map(|item| match item {
                    SelectItem::UnnamedExpr(expr) => Some((expr.to_string(), expr.to_string())),
                    SelectItem::ExprWithAlias { expr, alias } => {
                        Some((expr.to_string(), alias.value.clone()))
                    }
                    SelectItem::QualifiedWildcard(object_name, _) => {
                        Some((object_name.to_string(), object_name.to_string()))
                    }
                    SelectItem::Wildcard(_) => Some(("*".to_string(), "*".to_string())),
                })
                // Clean up the name if the field name is prefixed with the table name.
                .map(|(name, alias)| {
                    let split_name: Vec<&str> = name.split('.').collect();

                    // Extract the table name from the field, if it's prefixed, and fall back
                    // to the FROM table name.
                    let table_name = if split_name.len() > 1 {
                        name.split('.')
                            .next()
                            .map(|s| s.to_string())
                            .expect("Expected a table name")
                    } else {
                        extract_table_name(&select.from).expect("Expected a table name")
                    };
                    FieldInfo {
                        table_name,
                        alias: alias.split('.').last().map(|s| s.to_string()).unwrap_or(alias),
                        // The field name is of the format "table.field" or "field".
                        field_name: split_name.last().map(|s| s.to_string()).unwrap_or(name),
                    }
                })
                .collect();

            Ok(fields)
        } else {
            Err("Expected a SELECT statement".into())
        }
    } else {
        Err("Expected a Query statement".into())
    }
}

We can add a few tests in the same file that demonstrate and validate our functionality:

#[test]
fn test_create_table_columns() {
    let columns = create_table_columns(
        "
        CREATE TABLE IF NOT EXISTS
          \"users\"
          (
            -- The user id.
            \"user_id\" varchar(36) NOT NULL,
            -- The user's first name.
            \"first_name\" varchar(255),
            -- The user's last name.
            \"last_name\" varchar(255),
            -- The user's country code (e.g. DK, US, etc.).
            \"country\" varchar(3),
            -- If the user is onboarding or not.
            -- Possible values: 'first' | 'second' | 'third' | 'done'
            \"onboarding_step\" varchar(255) NOT NULL DEFAULT 'first',
            -- If the user is a demo user or not.
            \"demo_user\" boolean NOT NULL DEFAULT false,
            -- The time the user was created.
            \"created_at\" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
            -- The time the user was last updated.
            \"updated_at\" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
            PRIMARY KEY (user_id)
          );
        ",
    );

    // Convert the expected columns from &str to String.
    let expected_columns = vec![
        ("user_id", "String"),
        ("first_name", "Option<String>"),
        ("last_name", "Option<String>"),
        ("country", "Option<String>"),
        ("onboarding_step", "String"),
        ("demo_user", "bool"),
        ("created_at", "chrono::NaiveDateTime"),
        ("updated_at", "chrono::NaiveDateTime"),
    ]
    .iter()
    .map(|(name, data_type)| (name.to_string(), data_type.to_string()))
    .collect();

    // Validate that the columns are found as expected, and have the correct types.
    assert_eq!(
        columns.unwrap(),
        TableInfo { table_name: "users".to_string(), columns: expected_columns }
    );
}

#[test]
fn test_select_names() {
    let names = select_names(
        "
        SELECT
          users.user_id as tmp_users_name,
          users.user_id,
          users.first_name,
          users.last_name,
          users.country,
          users.onboarding_step,
          users.demo_user,
          session_id
        FROM auth_sessions
        LEFT JOIN users ON users.user_id = auth_sessions.user_id
        WHERE auth_sessions.session_id = :session",
    );

    let expected_fields = [
        ("users", "user_id", "tmp_users_name"),
        ("users", "user_id", "user_id"),
        ("users", "first_name", "first_name"),
        ("users", "last_name", "last_name"),
        ("users", "country", "country"),
        ("users", "onboarding_step", "onboarding_step"),
        ("users", "demo_user", "demo_user"),
        ("auth_sessions", "session_id", "session_id"),
    ]
    .map(|(tbl, name, alias)| FieldInfo {
        table_name: tbl.to_string(),
        field_name: name.to_string(),
        alias: alias.to_string(),
    });

    // Validate that the fields are found and named as expected.
    assert_eq!(names.unwrap(), expected_fields)
}

#[test]
fn test_validate_fields() {
    let tables = process_sql_file("../database/users.sql").expect("Failed to process SQL file");
    let valid_fields = vec![
        FieldInfo {
            table_name: "users".to_string(),
            alias: "user_id".to_string(),
            field_name: "user_id".to_string(),
        },
        FieldInfo {
            table_name: "users".to_string(),
            alias: "first_name".to_string(),
            field_name: "first_name".to_string(),
        },
    ];
    // Validate that the fields are found and named as expected.
    assert!(validate_fields(&tables, &valid_fields).is_ok());

    let invalid_fields = vec![
        FieldInfo {
            table_name: "users".to_string(),
            alias: "user_id".to_string(),
            field_name: "user_id".to_string(),
        },
        FieldInfo {
            table_name: "users".to_string(),
            alias: "first_name".to_string(),
            field_name: "first_name_does_not_exist".to_string(),
        },
    ];

    // Validate that we fail when the field name is invalid.
    assert!(validate_fields(&tables, &invalid_fields).is_err());

    let invalid_table_name = vec![FieldInfo {
        table_name: "users_does_not_exist".to_string(),
        alias: "user_id".to_string(),
        field_name: "user_id".to_string(),
    }];

    // Validate that we fail when the table name is invalid.
    assert!(validate_fields(&tables, &invalid_table_name).is_err());
}

Proc Macro

With our helper functions in place for working with the SQL, we can finally piece together our proc-macro.

Here's a rough overview of what the code does:

  • Reads the input query as a string literal
  • Uses our SQL helpers to:
    • Extract the selected fields from the query
    • Get all table information from a file located at ../database/users.sql (that is, outside of the lib-macros directory, inside another directory called database)
    • Validate that the selected fields actually exist in the correct tables
  • Extracts all the parameters, under the assumption that they are the only things starting with :
  • Prepares some of the code generation we'll need later:
    • The struct field names for our QueryParams type-safe builder, each with the type libsql::Value
    • The parameter-name and struct-field value pairs for libsql::named_params!, which is used to pass the parameters to a prepared statement
    • The struct for our return data, based on the selected fields and their types, found by looking them up in the table information
  • Finally, puts it all together: sets up the various structs with their fields, as well as an impl on our struct that provides some convenience functions:
    • execute: Runs the query and deserializes each libsql::Row into our custom return-data struct
    • first: Simply returns the first row if there is any, or None
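The parameter-extraction step is worth trying in isolation. Here's a minimal, stdlib-only sketch of the same split-on-`:` heuristic the macro uses below; note that it is only a heuristic, and would also fire on a `:` inside a string literal:

```rust
// Extract :named parameters by splitting on ':' and taking the word that
// follows each occurrence, mirroring the macro's heuristic.
fn extract_params(sql: &str) -> Vec<String> {
    sql.split(':')
        .skip(1) // Everything before the first ':' has no parameter.
        .filter_map(|s| s.split_whitespace().next())
        .map(|s| s.to_string())
        .collect()
}

fn main() {
    let sql = "SELECT * FROM auth_sessions WHERE session_id = :session AND expires_at > :now";
    let params = extract_params(sql);
    assert_eq!(params, vec!["session", "now"]);
    println!("{params:?}");
}
```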

Adjust the path to your database schema and save the contents below into lib-macros/src/lib.rs.

use proc_macro::TokenStream;
use quote::{format_ident, quote};
use syn::{parse_macro_input, parse_str, Error, LitStr, Type};

mod sql;

#[proc_macro]
pub fn query(input: TokenStream) -> TokenStream {
    let input = parse_macro_input!(input as LitStr);
    let sql = input.value();

    // Validate that we are only selecting fields that exist in the SQL schema.
    let select_fields =
        sql::select_names(&sql).expect("Unable to extract select fields from SQL input");
    let table_map = sql::process_sql_file("../database/users.sql")
        .expect("Failed to process SQL file '../database/users.sql'");
    let validation = sql::validate_fields(&table_map, &select_fields);

    // If the validation fails, return a nice compile error.
    if let Err(msg) = validation {
        return Error::new_spanned(input, msg.to_string()).into_compile_error().into();
    }

    // Extract parameters, indicated by anything starting with a : in the SQL query.
    let params: Vec<String> =
        sql.split(':').skip(1).map(|s| s.split_whitespace().next().unwrap().to_string()).collect();

    // Generate the builder struct fields for each parameter.
    let param_struct_name = format_ident!("QueryParams");
    let param_fields = params.iter().map(|param| {
        let param_name = format_ident!("{}", param);
        quote! {
            #param_name: libsql::Value,
        }
    });

    // Generate the parameter identifier and struct value pairs for the query.
    let param_values = params.iter().map(|param| {
        let param_name = format!(":{}", param);
        let param_field = format_ident!("{}", param);
        quote! {
            #param_name: self.#param_field.clone(),
        }
    });

    let data_struct_name = format_ident!("QueryData");
    let data_fields = select_fields.iter().map(|field_info| {
        // Use this alternative to avoid name clashes:
        // let field_name = format_ident!("{}_{}", field_info.table_name, field_info.alias);
        let field_name = format_ident!("{}", field_info.alias);
        let table = table_map.get(&field_info.table_name).expect("Table not found");
        let field_type: Type = parse_str(
            table
                .columns
                .iter()
                .find(|(name, _)| name == &field_info.field_name)
                .expect("Field not found")
                .1
                .clone()
                .as_ref(),
        )
        .expect("Failed to parse field type");
        quote! {
            pub #field_name: #field_type,
        }
    });

    let expanded = quote! {
        {
            #[derive(typed_builder::TypedBuilder)]
            /// Data structure for the query parameters.
            struct #param_struct_name {
                // Unfold the struct fields for each parameter.
                #(#param_fields)*
            }

            #[derive(Clone, Debug, serde::Deserialize)]
            /// Data structure for the query result.
            struct #data_struct_name {
                // Unfold the struct fields for each select field.
                #(#data_fields)*
            }

            /// Convenience functions for running the query, on top of the builder.
            impl #param_struct_name {
                /// Run the query and return the result.
                ///
                /// NOTE: This consumes any errors from Turso, to make the API more ergonomic, since
                /// in the majority of cases, we won't do anything about the errors after all.
                pub async fn execute(&self, conn: &libsql::Connection) -> Vec<#data_struct_name> {
                    let mut stmt = conn.prepare(#sql).await.expect("Failed to set up prepared statement");
                    let mut rows = stmt.query(libsql::named_params! {
                        #(#param_values)*
                    }).await.expect("Failed to execute query");
                    let mut items: Vec<#data_struct_name> = Vec::new();
                    // Deserialize each row from the database into our data struct.
                    while let Ok(Some(row)) = rows.next().await {
                        let item = libsql::de::from_row::<#data_struct_name>(&row).expect("Failed to deserialize row");
                        items.push(item);
                    }
                    items
                }

                /// Run the query and return the first result, or `None` if the query returns no results.
                ///
                /// NOTE: This consumes any errors from Turso, to make the API more ergonomic, since
                /// in the majority of cases, we won't do anything about the errors after all.
                pub async fn first(&self, conn: &libsql::Connection) -> Option<#data_struct_name> {
                    let items = self.execute(conn).await;
                    items.first().cloned()
                }
            }

            // Return the instantiated builder.
            #param_struct_name::builder()
        }
    };

    TokenStream::from(expanded)
}

To test the proc-macro, we cannot use unit tests; instead we need to add an integration test. Create the directory lib-macros/tests and add the file lib-macros/tests/lib.rs:

use lib_macros::query;

const TABLE_SCHEMA: &str = r#"
CREATE TABLE users (
    user_id varchar(36) NOT NULL,
    first_name varchar(255),
    last_name varchar(255),
    country varchar(3),
    onboarding_step varchar(255) NOT NULL DEFAULT 'first',
    demo_user boolean NOT NULL DEFAULT false,
    created_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (user_id)
);

CREATE TABLE auth_sessions (
    session_id varchar(36) NOT NULL,
    user_id varchar(36) NOT NULL,
    created_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    updated_at timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    expires_at timestamp NOT NULL,
    PRIMARY KEY (session_id)
);
"#;

async fn populate_db(conn: &libsql::Connection) {
    // Create the table schema.
    let stmts = TABLE_SCHEMA.split(';').filter(|s| !s.trim().is_empty()).collect::<Vec<_>>();
    for stmt in stmts {
        conn.execute(stmt, libsql::params![])
            .await
            .expect(format!("Failed to execute statement: {}", stmt).as_str());
    }

    // Populate some data into the table.
    conn.execute("INSERT INTO users (user_id, first_name, last_name, country, onboarding_step, demo_user) VALUES ('1234', 'John', 'Doe', 'USA', 'first', false)", libsql::params![]).await.expect("Failed to insert row into users table");
    conn.execute(
        "INSERT INTO auth_sessions (session_id, user_id, expires_at) VALUES ('some-session-id', '1234', '2030-09-07 22:56:52')",
        libsql::params![],
    )
    .await
    .expect("Failed to insert row into auth_sessions table");
}

#[tokio::test]
async fn test_query_macro() {
    // Set up the local database.
    let conn = libsql::Builder::new_local(":memory:")
        .build()
        .await
        .expect("Failed to create in-memory database")
        .connect()
        .expect("Failed to connect to in-memory database");
    populate_db(&conn).await;

    // Test our macro.
    let query = query!(
        "SELECT users.user_id, users.first_name, users.last_name, users.country, users.onboarding_step, users.demo_user
        FROM auth_sessions
        LEFT JOIN users ON users.user_id = auth_sessions.user_id
        WHERE auth_sessions.session_id = :session"
    ).session("some-session-id".into()).build();
    let row = query.first(&conn).await;

    // Check that we got a single row back.
    assert!(row.is_some());

    // Check that the result has the expected length.
    let rows = query.execute(&conn).await;
    assert_eq!(rows.len(), 1);
}

This lets you validate that the proc-macro actually works as expected. Adjust the test query to something that fits your own table definition, or copy the example SQL from the Errors section below and put it into the expected location at database/users.sql:

.
├── database
│   └── users.sql
└── lib-macros
    ├── src/...
    ├── tests/...
    └── Cargo.toml

Errors

Let's explore a case where we query the wrong field. We will build our example on the schema below:

CREATE TABLE IF NOT EXISTS
  "users"
  (
    -- The user id.
    "user_id" varchar(36) NOT NULL,
    -- The user's first name.
    "first_name" varchar(255),
    -- The user's last name.
    "last_name" varchar(255),
    -- The user's country code (e.g. DK, US, etc.).
    "country" varchar(3),
    -- If the user is onboarding or not.
    -- Possible values: 'first' | 'second' | 'third' | 'done'
    "onboarding_step" varchar(255) NOT NULL DEFAULT 'first',
    -- If the user is a demo user or not.
    "demo_user" boolean NOT NULL DEFAULT false,
    -- The time the user was created.
    "created_at" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    -- The time the user was last updated.
    "updated_at" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    PRIMARY KEY (user_id)
  );

CREATE TABLE IF NOT EXISTS
  "auth_sessions"
  (
    -- The unique session id.
    "session_id" varchar(36) NOT NULL,
    -- The user id.
    "user_id" varchar(36) NOT NULL,
    -- The time the session was created.
    "created_at" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    -- The time the session entry was last updated.
    "updated_at" timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
    -- The time the session is set to expire.
    "expires_at" timestamp NOT NULL,
    PRIMARY KEY (session_id)
  );

CREATE INDEX IF NOT EXISTS auth_sessions_user_lookup ON auth_sessions (user_id);

Take the following faulty query (note the field onboarding, which should be onboarding_step):

let conn = database_context.users.get().expect("No connection to users database");
let query = query!("SELECT user_id, first_name, last_name, country, onboarding FROM users");
let query = query.build();
let rows = query.execute(&conn).await;
let users: Vec<User> = rows
    .into_iter()
    .map(|row| User::new(row.user_id, row.first_name, row.last_name, row.onboarding_step))
    .collect();

It will give a compile-time error:

Field 'onboarding' does not exist in table 'users'

Which is very helpful: it blocks us from keeping an invalid query, and points out both the field it cannot find and the table it was trying to find it on.

👉 Let me know what you think over on Medium or in the comments below 👇