Establishing a Postgres Connection

Welcome to part 1 of our Rust Web series! In this series, we're going to be exploring some more advanced concepts in Rust. Specifically, we'll be looking at parallel ideas from our Real World Haskell Series. In these first couple parts, we'll be exploring how to connect Rust and a Postgres database. To start, we'll use the Rust Postgres library. This will help us create a basic database connection so we can make simple queries.

You can see all the code for this article in action by looking at our RustWeb repository. Specifically, you'll want to check out the file pg_basic.rs. If you're already familiar with this library, you can move on to part 2, where we'll get into the Diesel library.

If you're new to Rust, we have a couple beginner resources for you to start out with. You can read our Rust Beginners Series to get a basic introduction to the language. Or for some more in-depth explanations, you can watch our Rust Video Tutorial!

Creating Tables

We'll start off by making a client object to connect to our database. This uses a query string like we would with any Postgres library.

let conn_string = "host=localhost port=5432 user=postgres";
let mut client: Client = Client::connect(conn_string, NoTls)?;

Note that the connect function generally returns a Result<Client, Error>. In Haskell, we would write this as Either Error Client. By using ? at the end of our call, we can immediately unwrap the Client. The caveat on this is that it only compiles if the whole function returns some kind of Result<..., Error>. This is an interesting monadic behavior Rust gives us. Pretty much all our functions in this article will use this ? behavior.
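To make that concrete, here's a minimal sketch of the enclosing function, assuming a main that returns Result<(), Error> (the smoke-test query is just illustrative):

use postgres::{Client, Error, NoTls};

// Because main returns Result<(), Error>, every ? inside can short-circuit
// and return the postgres::Error directly.
fn main() -> Result<(), Error> {
    let conn_string = "host=localhost port=5432 user=postgres";
    let mut client = Client::connect(conn_string, NoTls)?;

    // All of the queries in this article would run here. As a trivial
    // smoke test that the connection works:
    client.batch_execute("SELECT 1")?;

    Ok(())
}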

Now that we have a client, we can use it to run queries. The catch is that we have to know the raw SQL ourselves. For example, here's how we can create a table to store some users:

client.batch_execute("\
    CREATE TABLE users (
        id SERIAL PRIMARY KEY,
        name TEXT NOT NULL,
        email TEXT NOT NULL,
        age INTEGER NOT NULL
    )
")?;

Inserting with Interpolation

A raw query like that with no real result is the simplest operation we can perform. But, any non-trivial program will require us to customize the queries programmatically. To do this we'll need to interpolate values into the middle of our queries. We can do this with execute (as opposed to batch_execute).

Let's try creating a user. As with batch_execute, we need a query string. This time, the query string will contain values like $1, $2 that we'll fill in with variables. We'll provide these variables with a list of references. Here's what it looks like with a user:

let name = "James";
let email = "james@test.com";
let age = 26;
client.execute(
    "INSERT INTO users (name, email, age) VALUES ($1, $2, $3)",
    &[&name, &email, &age],
)?;

Again, we're using a raw query string. All the values we interpolate must implement the trait postgres_types::ToSql. We'll see this a bit later.
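As a quick illustration, we could wrap the insertion in a helper whose parameter types all implement ToSql (this helper is ours, not part of the library); note that execute returns the number of rows it affected:

use postgres::{Client, Error};

// Hypothetical helper: &str and i32 both implement postgres_types::ToSql,
// so they can appear in the parameter slice. execute returns the number
// of rows affected, which we pass straight back to the caller.
fn insert_user(client: &mut Client, name: &str, email: &str, age: i32) -> Result<u64, Error> {
    client.execute(
        "INSERT INTO users (name, email, age) VALUES ($1, $2, $3)",
        &[&name, &email, &age],
    )
}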

Fetching Results

The last main type of query we can perform is to fetch our results. We can use our client to call the query function, which returns a vector of Row objects:

for row in client.query("SELECT * FROM users", &[])? {
  ...
}

For more complicated SELECT statements we would interpolate parameters, as with insertion above. The Row has different Columns we can use for accessing the data, but in our case it's a little easier to call get with an index to access the different fields. Like our raw SQL calls, this is unsafe in a couple of ways. If we use an out-of-bounds index, we'll get a crash. And if we try to cast a field to the wrong data type, we'll also run into problems.

for row in client.query("SELECT * FROM users", &[])? {
  let id: i32 = row.get(0);
  let name: &str = row.get(1);
  let email: &str = row.get(2);
  let age: i32 = row.get(3);

  ...
}

We could then use these individual values to populate whatever data types we wanted on our end.
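For instance, here's a sketch that collects the rows into a User struct of our own (the struct is hypothetical, not something the library provides):

// A plain struct on our side of the boundary.
struct User {
    id: i32,
    name: String,
    email: String,
    age: i32,
}

let users: Vec<User> = client
    .query("SELECT * FROM users", &[])?
    .into_iter()
    .map(|row| User {
        id: row.get(0),
        // TEXT columns can be read as String just as easily as &str.
        name: row.get(1),
        email: row.get(2),
        age: row.get(3),
    })
    .collect();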

Joining Tables

If we want to link two tables together, of course we'll also have to know how to do this with raw SQL. For example, we can make our articles table:

client.batch_execute("\
  CREATE TABLE articles (
    id SERIAL PRIMARY KEY,
    title TEXT NOT NULL,
    body TEXT NOT NULL,
    published_at TIMESTAMP WITH TIME ZONE NOT NULL,
    author_id INTEGER REFERENCES users(id)
  )
")?;

Then, after retrieving a user's ID, we can insert an article written by that user.

use chrono::{DateTime, Utc};

for row in client.query("SELECT * FROM users", &[])? {
  let id: i32 = row.get(0);
  let title: &str = "A Great Article!";
  let body: &str = "You should share this with friends.";
  let cur_time: DateTime<Utc> = Utc::now();
  client.execute(
    "INSERT INTO articles (title, body, published_at, author_id) VALUES ($1, $2, $3, $4)",
    &[&title, &body, &cur_time, &id],
  )?;
}
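And since the whole point of the foreign key is to link the tables, here's a sketch of reading them back with a join. The exact query is our own, but it only uses the columns defined above:

// Fetch each article title together with its author's name.
for row in client.query(
    "SELECT articles.title, users.name FROM articles \
     JOIN users ON articles.author_id = users.id",
    &[],
)? {
    let title: &str = row.get(0);
    let author: &str = row.get(1);
    println!("{} was written by {}", title, author);
}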

One of the tricky parts is that this won't compile if you only use the basic postgres dependency in Rust! There isn't a native ToSql instance for the DateTime<Utc> type. However, Rust dependencies can have specific "features". This concept doesn't really exist in Haskell, except through extra packages. You'll need to specify the with-chrono feature for the version of the chrono library you use. This feature, or sub-dependency, contains the necessary ToSql instance. Here's what the structure looks like in our Cargo.toml:

[dependencies]
chrono = "0.4"
postgres = { version = "0.17.3", features = ["with-chrono-0_4"] }

After this, our code will compile!

Runtime Problems

Now there are lots of reasons we wouldn't want to use a library like this in a formal project. One of the big principles of Rust (and Haskell) is catching errors at compile time. And writing out functions with lots of raw SQL like this makes our program very prone to runtime errors. I encountered several of these as I was writing this small program! At one point, I started writing the SELECT query and absentmindedly forgot to complete it until I ran my program!

At another point, I couldn't decide what timestamp format to use in Postgres. I went back and forth between using a TIMESTAMP or just an INTEGER for the published_at field. I needed to coordinate the SQL for both the table creation query and the fetching query. I often managed to change one but not the other, resulting in annoying runtime errors. I finally discovered I needed TIMESTAMP WITH TIME ZONE and not merely TIMESTAMP. This was a rather painful process with this setup.

Conclusion

In part 2, we'll explore Diesel, a library that lets us use schemas to catch more of these issues at compile time. The framework is more comparable to Persistent in Haskell. It gives us an ORM (Object Relational Mapping) so that we don't have to write raw SQL. This approach is much more suited to languages like Haskell and Rust!

To try out tasks like this in Haskell, take a look at our Production Checklist! It includes a couple different libraries for interacting with databases using ORMs.