Dependencies and Package Databases
Here's the final video in our Haskellings series! We'll figure out how to add dependencies to our exercises. This is a bit trickier than it looks, since we're running GHC outside of the Stack context. But with a little intuition, we can find where Stack is storing its package database and use that when running the exercise!
Next week, we'll do a quick summary of our work on this program, and see what the future holds for it!
Adding Hints
If a user is struggling with a particular exercise, we don't want them to get stuck! In this week's video, we'll see how to add "hints" to the Watcher. This way, a user can get a little extra help when they need it! We can also use this functionality to add more features in the future, like skipping an exercise, or even asking the user questions as part of the exercise!
Executing Executables!
In Haskell, we'd like to think "If it compiles, it works!" But of course this isn't generally the case. So in addition to testing whether or not our exercises compile, we'll also want to run the code the user wrote and see if it works properly! In this week's video, we'll see how we can distinguish between these two different exercise types!
Testing the Watcher
In our Haskellings program, the Watcher is a complex piece of functionality. It has to track changes to particular files, re-compile them on the spot, and keep track of the order in which these exercises should be done. So naturally, writing unit tests for it presents challenges as well! In this video, we'll explore how to simulate file modifications in the middle of a unit test.
Unit Testing Compilation
We've used Hspec before on the blog. But now we'll apply it to our Haskellings program, in conjunction with the program configuration changes we made last week! We'll write a couple simple unit tests that will test the basic compilation behavior of our program. Check out the video to see how!
Using the Handle Abstraction
Our Haskellings program is starting to get a bit more complicated, so it would be nice to write some unit tests for it. But this is very difficult to do for a command line application! To set this up, we'll start by refactoring our program to use the Handle abstraction. This allows the program to work regardless of whether it's using the command line or a file for its input. So the user will see the command line version, but our tests will be able to use files! We'll also use the process of Compile Driven Development to systematically refactor our program to use these new configuration elements!
Sequencing Exercises in the Watcher
Our watcher program can keep tabs on our exercise files, but it doesn't have any notion of an exercise order. In this week's video, we'll fix this! Our watcher will focus on one specific "current" exercise. When we're done with that, it will move us on to the next one automatically! This video involves a good example of using recursion in our Haskell workflow. But even better, we'll see a rudimentary usage of an MVar to pass information between the threads of our program. Take a look!
Overriding Process Handles and Terminal Colors
Right now, our Haskellings program doesn't give the user any information or output aside from the actual compilation stream. For this week's video blog, we'll dig a bit deeper into the process to give the user better information. We'll use the process's exit information and override its handles to control which output the user sees and when. Then as a final flourish, we'll color the terminal output to make the result more helpful!
Watching Files with FS-Notify!
This week we continue laying out the groundwork for our "Haskellings" project. A key part of this program is the automated aspect of "watch" mode. A user should be able to modify their file and then immediately see the results of their changes in the watcher window. This week, we get familiar with the fsnotify package, which lets us watch files and directories and take actions in our program when they change!
Haskellings 2: Better Configuration
This week we'll continue working on our nascent Haskellings project. There are a few interesting things we'll learn here around using the directory package. We'll also explore how to use the Seq data structure to implement a quick Breadth-First Search algorithm!
Starting Haskellings!
After learning about the Rustlings program in the last few weeks, we're now going to try to start replicating it in Haskell! In this week's video blog, we'll learn a little bit about using the ghc command on its own outside of Stack/Cabal, and then how to run it from within our program using the System.Process library.
Rustlings Part 2
This week we continue with another Rustlings video tutorial! We'll tackle some more advanced concepts like move semantics, traits, and generics! Next week, we'll start considering how we might build a similar program to teach beginners about Haskell!
Rustlings Video Blog!
We're doing something very new this week. Instead of doing a code writeup, I've actually made a video! In keeping with the last couple months of content, this first one is still Rust related. We'll walk through the Rustlings tool, which is an interactive program that teaches you the basics of the Rust Language! Soon, we'll start exploring how we might do this in Haskell!
You can also watch this video on our YouTube Channel! Subscribe there or sign up for our mailing list!
Rust Web Series Complete!
We're taking a quick breather this week from new content for an announcement. Our recently concluded Rust Web series now has a permanent spot on the advanced page of our website. You can take a look at the series page here! Here's a quick summary of the series:
- Part 1: Postgres - In the first part, we learn about a basic library to enable integration with a Postgresql Database.
- Part 2: Diesel - Next up, we get a little more formal with our database mechanics. We use the Diesel library to provide a schema for our database application.
- Part 3: Rocket - In part 3, we take the next step and start making a web server! We'll learn the basics of the Rocket server library!
- Part 4: CRUD Server - What do we do once we have a database and server library? Combine them of course! In this part, we'll make a CRUD server that can access our database elements using Diesel and Rocket.
- Part 5: Authentication - If your server will actually serve real users, you'll need authentication at some point. We'll see the different mechanisms we can use with Rocket for securing our endpoints.
- Part 6: Front-end Templating - If you're serving a full front-end web app, you'll need some way to customize the HTML. In the last part of the series, we'll see how Rocket makes this easy!
The best part is that you can find all the code for the series on our Github Repo! So be sure to take a look there. And if you're still new to Rust, you can also get your feet wet first with our Beginners Series.
In other exciting news, we'll be trying a completely new kind of content in the next couple weeks. I've written a bit in the past about using different IDEs like Atom and IntelliJ to write Haskell. I'd like to revisit these ideas to give a clearer idea of how to make our lives easier when writing code. But instead of writing articles, I'll be making a few videos to showcase how these work! I hope that a visual display of the IDEs will help make the content more clear.
Unit Tests and Benchmarks in Rust
For a couple months now, we've focused on some specific libraries you can use in Rust for web development. But we shouldn't lose sight of some other core language skills and mechanics. Whenever you write code, you should be able to show first that it works, and second that it works efficiently. If you're going to build a larger Rust app, you should also know a bit about unit testing and benchmarking. This week, we'll take a couple simple sorting algorithms as our examples to learn these skills.
As always, you can take a look at the code for this article on our Github Repo for the series. You can find this week's code specifically in sorters.rs! For a more basic introduction to Rust, be sure to check out our Rust Beginners Series!
Insertion Sort
We'll start out this article by implementing insertion sort. This is one of the simpler sorting algorithms, though it's rather inefficient. We'll perform this sort "in place". This means our function won't return a value. Rather, we'll pass a mutable reference to our vector so we can manipulate its items. To help out, we'll also define a swap function that exchanges two elements through that same reference:
pub fn swap(numbers: &mut Vec<i32>, i: usize, j: usize) {
let temp = numbers[i];
numbers[i] = numbers[j];
numbers[j] = temp;
}
pub fn insertion_sorter(numbers: &mut Vec<i32>) {
...
}
At its core, insertion sort is a pretty simple algorithm. We maintain the invariant that the "left" part of the array is always sorted. (At the start, with only 1 element, this is clearly true). Then we loop through the array and "absorb" the next element into our sorted part. To absorb the element, we'll loop backwards through our sorted portion. Each time we find a larger element, we switch their places. When we finally encounter a smaller element, we know the left side is once again sorted.
pub fn insertion_sorter(numbers: &mut Vec<i32>) {
for i in 1..numbers.len() {
let mut j = i;
while j > 0 && numbers[j-1] > numbers[j] {
swap(numbers, j, j - 1);
j = j - 1;
}
}
}
Testing
Our algorithm is simple enough. But how do we know it works? The obvious answer is to write some unit tests for it. Rust is actually a bit different from Haskell and most other languages in its canonical approach to unit tests. Most of the time, you'll make a separate test directory. But Rust encourages you to write unit tests in the same file as the function definition. We do this by having a section at the bottom of our file specifically for tests. We mark a test function with the test attribute:
#[test]
fn test_insertion_sort() {
...
}
To keep things simple, we'll define a random vector of 100 integers and pass it to our function. We'll use assert to verify that each number is no larger than the one after it.
#[test]
fn test_insertion_sort() {
let mut numbers: Vec<i32> = random_vector(100);
insertion_sorter(&mut numbers);
for i in 0..(numbers.len() - 1) {
assert!(numbers[i] <= numbers[i + 1]);
}
}
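One thing the test above takes for granted is the random_vector helper, which isn't shown in this article. Here's a minimal sketch of what it might look like, assuming the rand crate is available (the version in the repo may differ):
use rand::prelude::*;
// Hypothetical helper: build a vector of `length` random i32 values.
fn random_vector(length: usize) -> Vec<i32> {
    let mut rng = thread_rng();
    (0..length).map(|_| rng.gen::<i32>()).collect()
}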
When we run the cargo test command, Cargo will automatically detect that we have a test suite in this file and run it.
running 1 test...
test sorter::test_insertion_sort ... ok
Benchmarking
So we know our code works, but how quickly does it work? When you want to check the performance of your code, you need to establish benchmarks. These are like test suites, except that they report the average time it takes to perform a task.
Just as we had a test attribute for marking test suites, we can use the bench attribute for benchmarks. Each benchmark function takes a mutable Bencher object as an argument. To time some code, we'll call iter on that object and pass a closure that will run our function.
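One setup detail worth calling out: the built-in Bencher framework only works on nightly Rust, so the crate root needs the unstable test feature enabled. Roughly, that means something like this (the exact organization in the repo may differ):
// At the crate root (e.g. lib.rs), compiled with a nightly toolchain:
#![feature(test)]
extern crate test;
// Then in the module containing the benchmarks:
use test::Bencher;
With that in place, the benchmark itself looks like this: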
#[bench]
fn bench_insertion_sort_100_ints(b: &mut Bencher) {
b.iter(|| {
let mut numbers: Vec<i32> = random_vector(100);
insertion_sorter(&mut numbers)
});
}
We can then run the benchmark with cargo bench.
running 2 tests
test sorter::test_insertion_sort ... ignored
test sorter::bench_insertion_sort_100_ints ... bench: 6,537 ns/iter (+/- 1,541)
So on average, it took about 6.5 microseconds to sort 100 numbers. On its own, this number doesn't tell us much. But we can get a clearer idea of the runtime of our algorithm by looking at benchmarks of different sizes. Suppose we make lists of 1000 and 10000:
#[bench]
fn bench_insertion_sort_1000_ints(b: &mut Bencher) {
b.iter(|| {
let mut numbers: Vec<i32> = random_vector(1000);
insertion_sorter(&mut numbers)
});
}
#[bench]
fn bench_insertion_sort_10000_ints(b: &mut Bencher) {
b.iter(|| {
let mut numbers: Vec<i32> = random_vector(10000);
insertion_sorter(&mut numbers)
});
}
Now when we run the benchmark, we can compare the results of these different runs:
running 4 tests
test sorter::test_insertion_sort ... ignored
test sorter::bench_insertion_sort_10000_ints ... bench: 65,716,130 ns/iter (+/- 11,193,188)
test sorter::bench_insertion_sort_1000_ints ... bench: 612,373 ns/iter (+/- 124,732)
test sorter::bench_insertion_sort_100_ints ... bench: 12,032 ns/iter (+/- 904)
We see that when we increase the problem size by a factor of 10, we increase the runtime by a factor of nearly 100! This confirms for us that our simple insertion sort has an asymptotic runtime of O(n^2), which is not very good.
Quick Sort
There are many ways to sort more efficiently! Let's try our hand at quicksort. For this algorithm, we first "partition" our array. We'll choose a pivot value, and then move all the numbers smaller than the pivot to the left of the array, and all the greater numbers to the right. The upshot is that we know our pivot element is now in the correct final spot!
Here's what the partition algorithm looks like. It works on a specific sub-segment of our vector, indicated by start and end. We initially move the pivot element to the back, and then loop through the other elements of the array. The i index tracks where our pivot will end up. Each time we encounter a smaller number, we increment it. At the very end we swap our pivot element back into its place, and return its final index.
pub fn partition(
numbers: &mut Vec<i32>,
start: usize,
end: usize,
partition: usize)
-> usize {
let pivot_element = numbers[partition];
swap(numbers, partition, end - 1);
let mut i = start;
for j in start..(end - 1) {
if numbers[j] < pivot_element {
swap(numbers, i, j);
i = i + 1;
}
}
swap(numbers, i, end - 1);
i
}
So to finish sorting, we'll set up a recursive helper that, again, functions on a sub-segment of the array. We'll choose a random element and partition by it:
pub fn quick_sorter_helper(
numbers: &mut Vec<i32>, start: usize, end: usize) {
if start >= end {
return;
}
let mut rng = thread_rng();
let initial_partition = rng.gen_range(start, end);
let partition_index =
partition(numbers, start, end, initial_partition);
...
}
Now that we've partitioned, all that's left to do is recursively sort each side of the partition! Our main API function will call this helper with the full size of the array.
pub fn quick_sorter_helper(
numbers: &mut Vec<i32>, start: usize, end: usize) {
if start >= end {
return;
}
let mut rng = thread_rng();
let initial_partition = rng.gen_range(start, end);
let partition_index =
partition(numbers, start, end, initial_partition);
quick_sorter_helper(numbers, start, partition_index);
quick_sorter_helper(numbers, partition_index + 1, end);
}
pub fn quick_sorter(numbers: &mut Vec<i32>) {
quick_sorter_helper(numbers, 0, numbers.len());
}
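Before benchmarking, here's a quick usage sketch on a small literal vector, just to see the shape of the API (the real tests below use random data):
fn main() {
    let mut numbers: Vec<i32> = vec![5, 3, 8, 1, 9, 2];
    quick_sorter(&mut numbers);
    println!("{:?}", numbers); // Prints [1, 2, 3, 5, 8, 9]
}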
Now that we've got this function, let's add tests and benchmarks for it:
#[test]
fn test_quick_sort() {
let mut numbers: Vec<i32> = random_vector(100);
quick_sorter(&mut numbers);
for i in 0..(numbers.len() - 1) {
assert!(numbers[i] <= numbers[i + 1]);
}
}
#[bench]
fn bench_quick_sort_100_ints(b: &mut Bencher) {
b.iter(|| {
let mut numbers: Vec<i32> = random_vector(100);
quick_sorter(&mut numbers)
});
}
// Same kind of benchmarks for 1000, 10000, 100000
Then we can run our benchmarks and see our results:
running 9 tests
test sorter::test_insertion_sort ... ignored
test sorter::test_quick_sort ... ignored
test sorter::bench_insertion_sort_10000_ints ... bench: 65,130,880 ns/iter (+/- 49,548,187)
test sorter::bench_insertion_sort_1000_ints ... bench: 312,300 ns/iter (+/- 243,337)
test sorter::bench_insertion_sort_100_ints ... bench: 6,159 ns/iter (+/- 4,139)
test sorter::bench_quick_sort_100000_ints ... bench: 14,292,660 ns/iter (+/- 5,815,870)
test sorter::bench_quick_sort_10000_ints ... bench: 1,263,985 ns/iter (+/- 622,788)
test sorter::bench_quick_sort_1000_ints ... bench: 105,443 ns/iter (+/- 65,812)
test sorter::bench_quick_sort_100_ints ... bench: 9,259 ns/iter (+/- 3,882)
Quicksort does much better on the larger values, as expected! Each time we increase the input size by a factor of 10, the runtime only goes up by a factor of slightly more than 10. From these numbers alone it's difficult to prove that the true runtime is actually O(n log n). But we can clearly see that we're much closer to linear time!
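As a rough sanity check on the n log n guess, we can compare the growth factor it predicts to the jumps we measured (using base-2 logarithms; the base only changes a constant factor):
n = 1,000:   n * log2(n) ≈ 1,000 * 10.0   ≈ 10,000
n = 10,000:  n * log2(n) ≈ 10,000 * 13.3  ≈ 133,000    (~13x the previous row)
n = 100,000: n * log2(n) ≈ 100,000 * 16.6 ≈ 1,660,000  (~12.5x the previous row)
A predicted factor of roughly 12-13x per tenfold increase in input size lines up reasonably well with the 11-12x jumps in the quicksort benchmarks above.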
Conclusion
That's all for this intermediate series on Rust! Next week, we'll summarize the skills we learned over the course of these couple months in Rust. Then we'll look ahead to our next series of topics, including some totally new kinds of content!
Don't forget! If you've never programmed in Rust before, our Rust Video Tutorial provides an in-depth introduction to the basics!
Cleaning our Rust with Monadic Functions
A couple weeks ago we explored how to add authentication to a Rocket Rust server. This involved writing a from_request function that was very messy. You can see the original version of that function as an appendix at the bottom. But this week, we're going to try to improve that function! We'll explore functions like map and and_then in Rust. These can help us write cleaner code using similar ideas to functors and monads in Haskell.
For more details on this code, take a look at our Github Repo! For this article, you should look at rocket_auth_monads.rs. For a simpler introduction to Rust, take a look at our Rust Beginners Series!
Closures and Mapping
First, let's talk a bit about Rust's equivalent to fmap and functors. Suppose we have a simple option wrapper and a "doubling" function:
fn double(x: f64) -> f64 {
2.0 * x
}
fn main() -> () {
let x: Option<f64> = Some(5.0);
...
}
We'd like to pass our x value to the double function, but it's wrapped in the Option type. A logical thing to do would be to return None if the input is None, and otherwise apply the function and re-wrap the result in Some. In Haskell, we describe this behavior with the Functor class. Rust's approach has some similarities and some differences.
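Written out by hand, that logic is just a pattern match. This little sketch is exactly what map will do for us below:
// The manual version of the None/Some logic described above.
fn double_option(x: Option<f64>) -> Option<f64> {
    match x {
        None => None,
        Some(v) => Some(double(v)),
    }
}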
Rust doesn't have a Functor trait. The closest analogue is the map method: container types like Option define map directly, and the Iterator trait provides a map for anything we can iterate over. As in Haskell, we provide a function that transforms the underlying items. Here's how we can apply our simple example with an Option:
fn main() -> () {
let x: Option<f64> = Some(5.0);
let y: Option<f64> = x.map(double);
}
One notable difference from Haskell is that map is a member function of the container type. In Haskell of course, there's no such thing as member functions, so fmap exists on its own.
In Haskell, we can use lambda expressions as arguments to higher order functions. In Rust, it's the same, but they're referred to as closures instead. The syntax is rather different as well. We capture the particular parameters within bars, and then provide a brace-delimited code-block. Here's a simple example:
fn main() -> () {
let x: Option<f64> = Some(5.0);
let y: Option<f64> = x.map(|x| {2.0 * x});
}
Type annotations are also possible (and sometimes necessary) when specifying the closure. Unlike Haskell, we provide these on the same line as the definition:
fn main() -> () {
let x: Option<f64> = Some(5.0);
let y: Option<f64> = x.map(|x: f64| -> f64 {2.0 * x});
}
And Then…
Now using map is all well and good, but our authentication example involved using the result of one effectful call in the next effect. As most Haskellers can tell you, this is a job for monads and not merely functors. We can capture some of the same effects of monads with the and_then function in Rust. This works a lot like the bind operator (>>=) in Haskell. It also takes an input function. And this function takes a pure input but produces an effectful output.
Here's how we apply it with Option. We start with a safe_square_root function that produces None when its input is negative. Then we can take our original Option and use and_then to apply the square root function.
fn safe_square_root(x: f64) -> Option<f64> {
if x < 0.0 {
None
} else {
Some(x.sqrt())
}
}
fn main() -> () {
let x: Option<f64> = Some(5.0);
x.and_then(safe_square_root);
}
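These combinators also chain together naturally, much like composing fmap and >>= in Haskell. Here's a small sketch using the two functions we just defined:
fn main() -> () {
    let x: Option<f64> = Some(5.0);
    // Double the value, then take the safe square root of the result.
    let y: Option<f64> = x.map(double).and_then(safe_square_root);
    println!("{:?}", y); // Prints Some(3.1622776601683795)
}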
Converting to Outcomes
Now let's switch gears to our authentication example. Our final result type wasn't Option. Some intermediate results used this. But in the end, we wanted an Outcome. So to help us on our way, let's write a simple function to convert our options into outcomes. We'll have to provide the extra information of what the failure result should be. This is the status_error parameter.
fn option_to_outcome<R>(
result: Option<R>,
status_error: (Status, LoginError))
-> Outcome<R, LoginError> {
match result {
Some(r) => Outcome::Success(r),
None => Outcome::Failure(status_error)
}
}
Now let's start our refactoring process. To begin, let's examine the retrieval of our username and password from the headers. We'll make a separate function for this. This should return an Outcome, where the success value is a tuple of two strings. We'll start by defining our failure outcome, a tuple of a status and our LoginError.
fn read_auth_from_headers(headers: &HeaderMap)
-> Outcome<(String, String), LoginError> {
let fail = (Status::BadRequest, LoginError::InvalidData);
...
}
We'll first retrieve the username out of the headers. Recall that this operation returns an Option. So we can convert it to an Outcome using our function. We can then use and_then with a closure taking the unwrapped username.
fn read_auth_from_headers(headers: &HeaderMap)
-> Outcome<(String, String), LoginError> {
let fail = (Status::BadRequest, LoginError::InvalidData);
option_to_outcome(headers.get_one("username"), fail.clone())
.and_then(|u| -> Outcome<(String, String), LoginError> {
...
})
}
We can then do the same thing with the password field. When we've successfully unwrapped both fields, we can return our final Success outcome.
fn read_auth_from_headers(headers: &HeaderMap)
-> Outcome<(String, String), LoginError> {
let fail = (Status::BadRequest, LoginError::InvalidData);
option_to_outcome(headers.get_one("username"), fail.clone())
.and_then(|u| {
option_to_outcome(
headers.get_one("password"), fail.clone())
.and_then(|p| {
Outcome::Success(
(String::from(u), String::from(p)))
})
})
}
Re-Organizing
Armed with this function we can start re-tooling our from_request function. We'll start by gathering the header results and invoking and_then. This unwraps the username and password:
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>)
-> Outcome<AuthenticatedUser, LoginError> {
let headers_result =
read_auth_from_headers(&request.headers());
headers_result.and_then(|(u, p)| {
...
})
...
}
}
Now for the next step, we'll make a couple database calls. Both of our normal functions return Option values. So for each, we'll create a failure Outcome and invoke option_to_outcome. We'll follow this up with a call to and_then. First we get the user based on the username. Then we find their AuthInfo using the ID.
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>)
-> Outcome<AuthenticatedUser, LoginError> {
let headers_result =
read_auth_from_headers(&request.headers());
headers_result.and_then(|(u, p)| {
let conn_str = local_conn_string();
let maybe_user =
fetch_user_by_email(&conn_str, &String::from(u));
let fail1 =
(Status::NotFound, LoginError::UsernameDoesNotExist);
option_to_outcome(maybe_user, fail1)
.and_then(|user: UserEntity| {
let fail2 = (Status::MovedPermanently,
LoginError::WrongPassword);
option_to_outcome(
fetch_auth_info_by_user_id(
&conn_str, user.id), fail2)
})
.and_then(|auth_info: AuthInfoEntity| {
...
})
})
}
}
This gives us unwrapped authentication info. We can use this to compare the hash of the original password and return our final Outcome!
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>)
-> Outcome<AuthenticatedUser, LoginError> {
let headers_result =
read_auth_from_headers(&request.headers());
headers_result.and_then(|(u, p)| {
let conn_str = local_conn_string();
let maybe_user =
fetch_user_by_email(&conn_str, &String::from(u));
let fail1 =
(Status::NotFound, LoginError::UsernameDoesNotExist);
option_to_outcome(maybe_user, fail1)
.and_then(|user: UserEntity| {
let fail2 = (Status::MovedPermanently,
LoginError::WrongPassword);
option_to_outcome(
fetch_auth_info_by_user_id(
&conn_str, user.id), fail2)
})
.and_then(|auth_info: AuthInfoEntity| {
let hash = hash_password(&String::from(p));
if hash == auth_info.password_hash {
Outcome::Success( AuthenticatedUser{
user_id: auth_info.user_id})
} else {
Outcome::Failure(
(Status::Forbidden,
LoginError::WrongPassword))
}
})
})
}
}
Conclusion
Is this new solution that much better than our original? Well, it avoids the "triangle of death" pattern in our code. But it's not necessarily that much shorter. Perhaps it's a little cleaner on the whole though. Ultimately these code choices are up to you! Next time, we'll wrap up our current exploration of Rust by seeing how to profile our code.
This series has covered some more advanced topics in Rust. For a more in-depth introduction, check out our Rust Video Tutorial!
Appendix: Original Function
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>) -> Outcome<AuthenticatedUser, LoginError> {
let username = request.headers().get_one("username");
let password = request.headers().get_one("password");
match (username, password) {
(Some(u), Some(p)) => {
let conn_str = local_conn_string();
let maybe_user = fetch_user_by_email(&conn_str, &String::from(u));
match maybe_user {
Some(user) => {
let maybe_auth_info = fetch_auth_info_by_user_id(&conn_str, user.id);
match maybe_auth_info {
Some(auth_info) => {
let hash = hash_password(&String::from(p));
if hash == auth_info.password_hash {
Outcome::Success(AuthenticatedUser{user_id: 1})
} else {
Outcome::Failure((Status::Forbidden, LoginError::WrongPassword))
}
}
None => {
Outcome::Failure((Status::MovedPermanently, LoginError::WrongPassword))
}
}
}
None => Outcome::Failure((Status::NotFound, LoginError::UsernameDoesNotExist))
}
},
_ => Outcome::Failure((Status::BadRequest, LoginError::InvalidData))
}
}
}
Rocket Frontend: Templates and Static Assets
In the last few articles, we've been exploring the Rocket library for Rust web servers. Last time out, we tried a couple ways to add authentication to our web server. In this last Rocket-specific post, we'll explore some ideas around frontend templating. This will make it easy for you to serve HTML content to your users!
To explore the code for this article, head over to the "rocket_template" file on our Github repo! If you're still new to Rust, you might want to start with some simpler material. Take a look at our Rust Beginners Series as well!
Templating Basics
First, let's understand the basics of HTML templating. When our server serves out a webpage, we return HTML to the user for the browser to render. Consider this simple index page:
<html>
<head></head>
<body>
<p> Welcome to the site!</p>
</body>
</html>
But of course, each user should see some kind of custom content. For example, in our greeting, we might want to give the user's name. In an HTML template, we'll create a variable of sorts in our HTML, delineated by braces:
<html>
<head></head>
<body>
<p> Welcome to the site {{name}}!</p>
</body>
</html>
Now before we return the HTML to the user, we want to perform a substitution. Where we find the variable {{name}}, we should replace it with the user's name, which our server should know.
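Conceptually, this substitution is just string replacement. Here's a toy sketch of the idea in Rust (real template engines do much more, such as escaping, conditionals, and loops):
// Toy illustration only: fill in a {{name}} placeholder in an HTML string.
fn render_toy(template: &str, name: &str) -> String {
    template.replace("{{name}}", name)
}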
There are many different libraries that do this, often through Javascript. But in Rust, it turns out the Rocket library has a couple easy templating integrations. One option is Tera, which was specifically designed for Rust. Another option is Handlebars, which is more native to Javascript, but also has a Rocket integration. The substitutions in this article are simple, so there's not actually much of a difference for us.
Returning a Template
So how do we configure our server to return this HTML data? To start, we have to attach a "Fairing" to our server, specifically for the Template library. A Fairing is a server-wide piece of middleware. This is how we can allow our endpoints to return templates:
use rocket_contrib::templates::Template;
fn main() {
rocket::ignite()
.mount("/", routes![index, get_user])
.attach(Template::fairing())
.launch();
}
Now we can make our index endpoint. It has no inputs, and it will return Rocket's Template type.
#[get("/")]
fn index() -> Template {
...
}
We have two tasks now. First, we have to construct our context. This can be any "map-like" type with string information. We'll use a HashMap, populating the name value.
#[get("/")]
fn index() -> Template {
let context: HashMap<&str, &str> = [("name", "Jonathan")]
.iter().cloned().collect();
...
}
Now we have to render our template. Let's suppose we have a "templates" directory at the root of our project. We can put the template we wrote above in the "index.hbs" file. When we call the render function, we just give the name of our template and pass the context!
#[get("/")]
fn index() -> Template {
let context: HashMap<&str, &str> = [("name", "Jonathan")]
.iter().cloned().collect();
Template::render("index", &context)
}
Including Static Assets
Rocket also makes it quite easy to include static assets as part of our routing system. We just have to mount the static route to the desired prefix when launching our server:
// StaticFiles lives in rocket_contrib's serve module.
use rocket_contrib::serve::StaticFiles;
fn main() {
rocket::ignite()
.mount("/static", StaticFiles::from("static"))
.mount("/", routes![index, get_user])
.attach(Template::fairing())
.launch();
}
Now any request to a /static/... endpoint will return the corresponding file in the "static" directory of our project. Suppose we have this styles.css file:
p {
color: red;
}
We can then link to this file in our index template:
<html>
<head>
<link rel="stylesheet" type="text/css" href="static/styles.css"/>
</head>
<body>
<p> Welcome to the site {{name}}!</p>
</body>
</html>
Now when we fetch our index, we'll see that the text on the page is red!
Looping in our Database
Now for one last piece of integration with our database. Let's make a page that will show a user their basic information. This starts with a simple template:
<!-- templates/user.hbs -->
<html>
<head></head>
<body>
<p> User name: {{name}}</p>
<br>
<p> User email: {{email}}</p>
<br>
<p> User age: {{age}}</p>
</body>
</html>
We'll compose an endpoint that takes the user's ID as an input and fetches the user from the database:
#[get("/users/<uid>")]
fn get_user(uid: i32) -> Template {
let maybe_user = fetch_user_by_id(&local_conn_string(), uid);
...
}
Now we need to build our context from the user information. This will require a match statement on the resulting user. We'll use Unknown for the fields if the user doesn't exist.
#[get("/users/<uid>")]
fn get_user(uid: i32) -> Template {
let maybe_user = fetch_user_by_id(&local_conn_string(), uid);
let context: HashMap<&str, String> = {
match maybe_user {
Some(u) =>
[ ("name", u.name.clone())
, ("email", u.email.clone())
, ("age", u.age.to_string())
].iter().cloned().collect(),
None =>
[ ("name", String::from("Unknown"))
, ("email", String::from("Unknown"))
, ("age", String::from("Unknown"))
].iter().cloned().collect()
}
};
Template::render("user", &context)
}
And to wrap it up, we'll render the "user" template! Now when users get directed to the page for their user ID, they'll see their information!
Conclusion
Next week, we'll go back to some of our authentication code. But we'll do so with the goal of exploring a more universal Rust idea. We'll see how functors and monads still find a home in Rust. We'll explore the functions that allow us to clean up heavy conditional code just as we could in Haskell.
For a more in-depth introduction to Rust basics, be sure to take a look at our Rust Video Tutorial!
Authentication in Rocket
Last week we enhanced our Rocket web server. We combined our server with our Diesel schema to enable a series of basic CRUD endpoints. This week, we'll continue this integration, but bring in some more cool Rocket features. We'll explore two different methods of authentication. First, we'll create a "Request Guard" to allow a form of Basic Authentication. Then we'll also explore Rocket's amazingly simple Cookies integration.
As always, you can explore the code for this series by heading to our Github repository. For this article specifically, you'll want to take a look at the rocket_auth.rs file.
If you're just starting your Rust journey, feel free to check out our Beginners Series as well!
New Data Types
To start off, let's make a few new types to help us. First, we'll need a new database table, auth_infos, based on this struct:
#[derive(Insertable)]
pub struct AuthInfo {
pub user_id: i32,
pub password_hash: String
}
When the user creates their account, they'll provide a password. We'll store a hash of that password in our database table. Of course, you'll want to run through all the normal steps we did with Diesel to create this table. This includes having the corresponding Entity type.
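For reference, the matching entry in the generated Diesel schema file would look something like this (the column types here are assumptions based on the struct above, plus an auto-generated primary key):
table! {
    auth_infos (id) {
        id -> Int4,
        user_id -> Int4,
        password_hash -> Text,
    }
}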
We'll also want a couple new form types to accept authentication information. First off, when we create a user, we'll now include the password in the form.
#[derive(FromForm, Deserialize)]
struct CreateInfo {
name: String,
email: String,
age: i32,
password: String
}
Second, when a user wants to login, they'll pass their username (email) and their password.
#[derive(FromForm, Deserialize)]
struct LoginInfo {
username: String,
password: String,
}
Both these types should derive FromForm and Deserialize so we can grab them out of "post" data. You might wonder, do we need another type to store the same information that already exists in User and UserEntity? It would be possible to write CreateInfo to have a User within it. But then we'd have to manually write the FromForm instance. This isn't difficult, but it might be more tedious than using a new type.
Creating a User
So in the first place, we have to create our user so they're matched up with their password. This requires taking the CreateInfo in our post request. We'll first unwrap the user fields and insert our User object. This follows the patterns we've seen so far in this series with Diesel.
#[post("/users/create", format="json", data="<create_info>")]
fn create(db: State<String>, create_info: Json<CreateInfo>)
-> Json<i32> {
let user: User = User
{ name: create_info.name.clone(),
email: create_info.email.clone(),
age: create_info.age};
let connection = ...;
let user_entity: UserEntity = diesel::insert_into(users::table)...
…
}
Now we'll want a function for hashing our password. We'll use the SHA3 algorithm, courtesy of the rust-crypto library:
// These imports come from the rust-crypto crate.
use crypto::digest::Digest;
use crypto::sha3::Sha3;
fn hash_password(password: &String) -> String {
let mut hasher = Sha3::sha3_256();
hasher.input_str(password);
hasher.result_str()
}
We'll apply this function on the input password and attach it to the created user ID. Then we can insert the new AuthInfo and return the created ID.
#[post("/users/create", format="json", data="<create_info>")]
fn create(db: State<String>, create_info: Json<CreateInfo>)
-> Json<i32> {
...
let user_entity: UserEntity = diesel::insert_into(users::table)...
let password_hash = hash_password(&create_info.password);
let auth_info: AuthInfo = AuthInfo
{user_id: user_entity.id, password_hash: password_hash};
let auth_info_entity: AuthInfoEntity =
diesel::insert_into(auth_infos::table)...
Json(user_entity.id)
}
Now whenever we create our user, they'll have their password attached!
Gating an Endpoint
Now that our user has a password, how do we gate endpoints on authentication? Well the first approach we can try is something like "Basic Authentication". This means that every authenticated request contains the username and the password. In our example we'll get these directly out of header elements. But in a real application you would want to double check that the request is encrypted before doing this.
But it would be tiresome to apply the logic of reading the headers in every handler. So Rocket has a powerful functionality called "Request Guards". Rocket has a special trait called FromRequest. Whenever a particular type is an input to a handler function, it runs the from_request function. This determines how to derive the value from the request. In our case, we'll make a wrapper type AuthenticatedUser. This represents a user that has included their auth info in the request.
struct AuthenticatedUser {
user_id: i32
}
Now we can include this type in a handler signature. For this endpoint, we only allow a user to retrieve their data if they've logged in:
#[get("/users/my_data")]
fn login(db: State<String>, user: AuthenticatedUser)
-> Json<Option<UserEntity>> {
Json(fetch_user_by_id(&db, user.user_id))
}
Implementing the Request Trait
The trick of course is that we need to implement the FromRequest trait! This is more complicated than it sounds! Our handler will have the ability to short-circuit the request and return an error. So let's start by specifying a couple potential login errors we can throw.
#[derive(Debug)]
enum LoginError {
InvalidData,
UsernameDoesNotExist,
WrongPassword
}
The from_request function will take in a request and return an Outcome. The outcome will either provide our authentication type or an error. The last bit of adornment we need on this is lifetime specifiers for the request itself and the reference to it.
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>)
-> Outcome<AuthenticatedUser, LoginError> {
...
}
}
Now the actual function definition involves several layers of case matching! It consists of a few different operations that have to query the request or query our database. For example, let's consider the first layer. We insist on having two headers in our request: one for the username, and one for the password. We'll use request.headers() to check for these values. If either doesn't exist, we'll send a Failure outcome with invalid data. Here's what that looks like:
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>)
-> Outcome<AuthenticatedUser, LoginError> {
let username = request.headers().get_one("username");
let password = request.headers().get_one("password");
match (username, password) {
(Some(u), Some(p)) => {
...
}
_ => Outcome::Failure(
(Status::BadRequest,
LoginError::InvalidData))
}
}
}
In the main branch of the function, we'll do 3 steps:
- Find the user in our database based on their email address/username.
- Find their authentication information based on the ID
- Hash the input password and compare it to the database hash
If we are successful, then we'll return a successful outcome:
Outcome::Success(AuthenticatedUser{user_id: user.id})
The number of match levels required makes the function definition very verbose. So we've included it at the bottom as an appendix. We know how to take such a function and write it more cleanly in Haskell using monads. In a couple weeks, we'll use this function as a case study to explore Rust's monadic abilities.
Logging In with Cookies
In most applications though, we won't want to include the password in the request each time. In HTTP, "Cookies" provide a way to store information about a particular user that we can track on our server.
Rocket makes this very easy with the Cookies type! We can always include this mutable type in our requests. It works like a key-value store, where we can access certain information with a key like "user_id". Since we're storing auth information, we'll also want to make sure it's encoded, or "private". So we'll use these functions (there's a short usage sketch after the list):
add_private(...)
get_private(...)
remove_private(...)
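In isolation, using these looks something like the following sketch (remember_user here is a made-up helper, not something in the repo):
use rocket::http::{Cookie, Cookies};
// Hypothetical helper: store a user ID as a private cookie and read it back.
fn remember_user(mut cookies: Cookies, user_id: i32) -> Option<String> {
    // add_private encrypts the cookie value before sending it to the client.
    cookies.add_private(Cookie::new("user_id", user_id.to_string()));
    // get_private decrypts it again and returns an Option<Cookie>.
    cookies.get_private("user_id").map(|c| String::from(c.value()))
}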
Let's start with a "login" endpoint. This will take our LoginInfo
object as its post data, but we'll also have the Cookies
input:
#[post("/users/login", format="json", data="<login_info>")]
fn login_post(db: State<String>, login_info: Json<LoginInfo>, mut cookies: Cookies) -> Json<Option<i32>> {
...
}
First we have to make sure a user of that name exists in the database:
#[post("/users/login", format="json", data="<login_info>")]
fn login_post(
db: State<String>,
login_info: Json<LoginInfo>,
mut cookies: Cookies)
-> Json<Option<i32>> {
let maybe_user = fetch_user_by_email(&db, &login_info.username);
match maybe_user {
Some(user) => {
...
}
None => Json(None)
}
}
Then we have to get their auth info again. We'll hash the password and compare it. If we're successful, then we'll add the user's ID as a cookie. If not, we'll return None.
#[post("/users/login", format="json", data="<login_info>")]
fn login_post(
db: State<String>,
login_info: Json<LoginInfo>,
mut cookies: Cookies)
-> Json<Option<i32>> {
let maybe_user = fetch_user_by_email(&db, &login_info.username);
match maybe_user {
Some(user) => {
let maybe_auth = fetch_auth_info_by_user_id(&db, user.id);
match maybe_auth {
Some(auth_info) => {
let hash = hash_password(&login_info.password);
if hash == auth_info.password_hash {
cookies.add_private(Cookie::new(
"user_id", user.id.to_string()));
Json(Some(user.id))
} else {
Json(None)
}
}
None => Json(None)
}
}
None => Json(None)
}
}
A more robust solution of course would loop in some error behavior instead of returning None.
Using Cookies
Using our cookie now is pretty easy. Let's make a separate "fetch user" endpoint using our cookies. It will take the Cookies object and the user ID as inputs. The first order of business is to retrieve the user_id cookie and verify it exists.
#[get("/users/cookies/<uid>")]
fn fetch_special(db: State<String>, uid: i32, mut cookies: Cookies)
-> Json<Option<UserEntity>> {
let logged_in_user = cookies.get_private("user_id");
match logged_in_user {
Some(c) => {
...
},
None => Json(None)
}
}
Now we need to parse the string value as a user ID and compare it to the value from the endpoint. If they're a match, we just fetch the user's information from our database!
#[get("/users/cookies/<uid>")]
fn fetch_special(db: State<String>, uid: i32, mut cookies: Cookies)
-> Json<Option<UserEntity>> {
let logged_in_user = cookies.get_private("user_id");
match logged_in_user {
Some(c) => {
let logged_in_uid = c.value().parse::<i32>().unwrap();
if logged_in_uid == uid {
Json(fetch_user_by_id(&db, uid))
} else {
Json(None)
}
},
None => Json(None)
}
}
And when we're done, we can also post a "logout" request that will remove the cookie!
#[post("/users/logout", format="json")]
fn logout(mut cookies: Cookies) -> () {
cookies.remove_private(Cookie::named("user_id"));
}
Conclusion
We've got one more article on Rocket before checking out some different Rust concepts. So far, we've only dealt with the backend part of our API. Next week, we'll investigate how we can use Rocket to send templated HTML files and other static web content!
Maybe you're more experienced with Haskell but still need a bit of an introduction to Rust. We've got some other materials for you! Watch our Rust Video Tutorial for an in-depth look at the basics of the language!
Appendix: From Request Function
impl<'a, 'r> FromRequest<'a, 'r> for AuthenticatedUser {
type Error = LoginError;
fn from_request(request: &'a Request<'r>) -> Outcome<AuthenticatedUser, LoginError> {
let username = request.headers().get_one("username");
let password = request.headers().get_one("password");
match (username, password) {
(Some(u), Some(p)) => {
let conn_str = local_conn_string();
let maybe_user = fetch_user_by_email(&conn_str, &String::from(u));
match maybe_user {
Some(user) => {
let maybe_auth_info = fetch_auth_info_by_user_id(&conn_str, user.id);
match maybe_auth_info {
Some(auth_info) => {
let hash = hash_password(&String::from(p));
if hash == auth_info.password_hash {
Outcome::Success(AuthenticatedUser{user_id: 1})
} else {
Outcome::Failure((Status::Forbidden, LoginError::WrongPassword))
}
}
None => {
Outcome::Failure((Status::MovedPermanently, LoginError::WrongPassword))
}
}
}
None => Outcome::Failure((Status::NotFound, LoginError::UsernameDoesNotExist))
}
},
_ => Outcome::Failure((Status::BadRequest, LoginError::InvalidData))
}
}
}
Joining Forces: An Integrated Rust Web Server
We've now explored a couple different libraries for some production tasks in Rust. A couple weeks ago, we used Diesel to create an ORM for some database types. And then last week, we used Rocket to make a basic web server to respond to basic requests. This week, we'll put these two ideas together! We'll use some more advanced functionality from Rocket to make some CRUD endpoints for our database type. Take a look at the code on Github here!
If you've never written any Rust, you should start with the basics though! Take a look at our Rust Beginners Series!
Database State and Instances
Our first order of business is connecting to the database from our handler functions. There are some direct integrations you can check out between Rocket, Diesel, and other libraries. These can provide clever ways to add a connection argument to any handler.
But for now we're going to keep things simple. We'll re-generate the PgConnection within each endpoint. We'll maintain a "stateful" connection string to ensure they all use the same database.
Our Rocket server can "manage" different state elements. Suppose we have a function that gives us our database string. We can pass that to our server at initialization time.
fn local_conn_string() -> String {...}
fn main() {
rocket::ignite()
.mount("/", routes![...])
.manage(local_conn_string())
.launch();
}
Now we can access this String from any of our endpoints by giving an input of the State<String> type. This allows us to create our connection:
#[get(...)]
fn fetch_all_users(database_url: State<String>) -> ... {
let connection = PgConnection::establish(&database_url)
.expect("Error connecting to database!");
...
}
Note: We can't use the PgConnection itself because stateful types need to be thread safe.
So any of our other endpoints can now access the same database. Before we start writing these, we need a couple things first though. Let's recall that for our Diesel ORM we made a User type and a UserEntity type. The first is for inserting/creating, and the second is for querying. We need to add some instances to those types so they are compatible with our endpoints. We want to have JSON instances (Serialize, Deserialize), as well as FromForm for our User type:
#[derive(Insertable, Deserialize, Serialize, FromForm)]
#[table_name="users"]
pub struct User {
...
}
#[derive(Queryable, Serialize)]
pub struct UserEntity {
...
}
Now let's see how we get these types from our endpoints!
Retrieving Users
We'll start with a simple endpoint to fetch all the different users in our database. This will take no inputs, except our stateful database URL. It will return a vector of UserEntity objects, wrapped in Json.
#[get("/users/all")]
fn fetch_all_users(database_url: State<String>)
-> Json<Vec<UserEntity>> {
...
}
Now all we need to do is connect to our database and run the query function. We can make our users vector into a Json object by wrapping with Json(). The Serialize instance lets us satisfy the Responder trait for the return value.
#[get("/users/all")]
fn fetch_all_users(database_url: State<String>)
-> Json<Vec<UserEntity>> {
let connection = PgConnection::establish(&database_url)
.expect("Error connecting to database!");
Json(users.load::<UserEntity>(&connection)
.expect("Error loading users"))
}
Now for getting individual users. Once again, we'll wrap the response in JSON. But this time we'll return a single, optional user. We'll use a dynamic capture parameter in the URL for the User ID.
#[get("/users/<uid>")]
fn fetch_user(database_url: State<String>, uid: i32)
-> Option<Json<UserEntity>> {
let connection = ...;
...
}
We'll want to filter on the users table by the ID. This will give us a list of different results. We want to specify this vector as mutable. Why? In the end, we want to return the first user. But Rust's memory rules mean we must either copy or move this item. And we don't want to move a single item from the vector without moving the whole vector. So we'll remove the head from the vector entirely, which requires mutability.
#[get("/users/<uid>")]
fn fetch_user(database_url: State<String>, uid: i32)
-> Option<Json<UserEntity>> {
let connection = ...;
use rust_web::schema::users::dsl::*;
let mut users_by_id: Vec<UserEntity> =
users.filter(id.eq(uid))
.load::<UserEntity>(&connection)
.expect("Error loading users");
...
}
Now we can do our case analysis. If the list is empty, we return None. Otherwise, we'll remove the user from the vector and wrap it.
#[get("/users/<uid>")]
fn fetch_user(database_url: State<String>, uid: i32) -> Option<Json<UserEntity>> {
let connection = ...;
use rust_web::schema::users::dsl::*;
let mut users_by_id: Vec<UserEntity> =
users.filter(id.eq(uid))
.load::<UserEntity>(&connection)
.expect("Error loading users");
if users_by_id.len() == 0 {
None
} else {
let first_user = users_by_id.remove(0);
Some(Json(first_user))
}
}
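As an aside, there's a slightly terser alternative that avoids the mutable vector entirely: consume it with into_iter and take the first element. This is just a sketch of the alternative ending for the function above, not what the repo does:
// Consuming the vector moves its elements, so we can take ownership
// of the first user without any mutable borrow gymnastics.
let first_user: Option<UserEntity> = users_by_id.into_iter().next();
first_user.map(Json)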
Create/Update/Delete
Hopefully you can see the pattern now! Our queries are all pretty simple. So our endpoints all follow a similar pattern. Connect to the database, run the query and wrap the result. We can follow this process for the remaining three endpoints in a basic CRUD setup. Let's start with "Create":
#[post("/users/create", format="application/json", data = "<user>")]
fn create_user(database_url: State<String>, user: Json<User>)
-> Json<i32> {
let connection = ...;
let user_entity: UserEntity = diesel::insert_into(users::table)
.values(&*user)
.get_result(&connection).expect("Error saving user");
Json(user_entity.id)
}
As we discussed last week, we can use data together with Json to specify the form data in our post request. We de-reference the user with * to get it out of the JSON wrapper. Then we insert the user and wrap its ID to send back.
Deleting a user is simple as well. It has the same dynamic path as fetching a user. We just make a delete call on our database instead.
#[delete("/users/<uid>")]
fn delete_user(database_url: State<String>, uid: i32) -> Json<i32> {
let connection = ...;
use rust_web::schema::users::dsl::*;
diesel::delete(users.filter(id.eq(uid)))
.execute(&connection)
.expect("Error deleting user");
Json(uid)
}
Updating is the last endpoint, which takes a put request. The endpoint mechanics are just like our other endpoints. We use a dynamic path component to get the user's ID, and then provide a User body with the updated field values. The only trick is that we need to expand our Diesel knowledge a bit. We'll use update and set to change individual fields on an item.
#[put("/users/<uid>/update", format="json", data="<user>")]
fn update_user(
database_url: State<String>, uid: i32, user: Json<User>)
-> Json<UserEntity> {
let connection = ...;
use rust_web::schema::users::dsl::*;
let updated_user: UserEntity =
diesel::update(users.filter(id.eq(uid)))
.set((name.eq(&user.name),
email.eq(&user.email),
age.eq(user.age)))
.get_result::<UserEntity>(&connection)
.expect("Error updating user");
Json(updated_user)
}
The other gotcha is that we need to use references (&) for the string fields in the input user. But now we can add these routes to our server, and it will manipulate our database as desired!
Conclusion
There are still lots of things we could improve here. For example, we're still using .expect in many places. From the perspective of a web server, we should be catching these issues and wrapping them with "Err 500". Rocket also provides some good mechanics for fixing that. Next week though, we'll pivot to another server problem that Rocket solves adeptly: authentication. We should restrict certain endpoints to particular users. Rocket provides an authentication scheme that is neatly encoded in the type system!
For a more in-depth introduction to Rust, watch our Rust Video Tutorial. It will take you through a lot of key skills like understanding memory and using Cargo!
Rocket: Web Servers in Rust!
Welcome back to our series on building simple apps in Rust. Last week, we explored the Diesel library which gave us an ORM for database interaction. For the next few weeks, we'll be trying out the Rocket library, which makes it quick and easy to build a web server in Rust! This is comparable to the Servant library in Haskell, which we've explored before.
This week, we'll be working on the basic building blocks of using this library. The reference code for this article is available here on Github!
Rust combines some of the neat functional ideas of Haskell with some more recognizable syntax from C++. To learn more of the basics, take a look at our Rust Beginners Series!
Our First Route
To begin, let's make a simple "hello world" endpoint for our server. We don't specify a full API definition all at once like we do with Servant. But we do use a special macro before the endpoint function. This macro describes the route's method and its path.
#[get("/hello")]
fn index() -> String {
String::from("Hello, world!")
}
So our macro tells us this is a "GET" endpoint and that the path is /hello. Then our function specifies a String as the return value. We can, of course, have different types of return values, which we'll explore more as the series goes on.
Launching Our Server
Now this endpoint is useless until we can actually run and launch our server. To do this, we start by creating an object of type Rocket with the ignite() function.
fn main() {
let server: Rocket = rocket::ignite();
...
}
We can then modify our server by "mounting" the routes we want. The mount function takes a base URL path and a list of routes, as generated by the routes macro. This function returns us a modified server:
fn main() {
let server: Rocket = rocket::ignite();
let server2: Rocket = server.mount("/", routes![index]);
}
Rather than create multiple server objects, we'll just compose these different functions. Then to launch our server, we use launch on the final object!
fn main() {
rocket::ignite().mount("/", routes![index]).launch();
}
And now our server will respond when we ping it at localhost:8000/hello! We could, of course, use a different base path. We could even assign different routes to different bases!
fn main() {
rocket::ignite().mount("/api", routes![index]).launch();
}
Now it will respond at /api/hello.
Query Parameters
Naturally, most endpoints need inputs to be useful. There are a few different ways we can do this. The first is to use path components. In Servant, we call these CaptureParams. With Rocket, we'll format our URL to have brackets around the variables we want to capture. Then we can assign them a basic type in our endpoint function:
#[get("/math/<name>")]
fn hello(name: &RawStr) -> String {
format!("Hello, {}!", name.as_str())
}
We can use any type that satisfies the FromParam trait, including a RawStr. This is a Rocket-specific type wrapping string-like data in a raw format. With these strings, we might want to apply some sanitization processes on our data. We can also use basic numeric types, like i32.
#[get("/math/<first>/<second>")]
fn add(first: i32, second: i32) -> String {
String::from(format!("{}", first + second))
}
This endpoint will now return "11" when we ping /math/5/6.
We can also use "query parameters", which all go at the end of the URL. These need the FromFormValue
trait, rather than FromParam
. But once again, RawStr
and basic numbers work fine.
#[get("/math?<first>&<second>)]
fn multiply(first: i32, second: i32) {
String::from(format!("{}", first * second)
}
Now we'll get "30" when we ping /math?5&6
.
Post Requests
The last major input type we'll deal with is post request data. Suppose we have a basic user type:
struct User {
name: String,
email: String,
age: i32
}
We'll want to derive various classes for it so we can use it within endpoints. From the Rust "Serde" library we'll want Deserialize and Serialize so we can make JSON elements out of it. Then we'll also want FromForm to use it as post request data.
#[derive(FromForm, Deserialize, Serialize)]
struct User {
...
}
Now we can make our endpoint, but we'll have to specify the "format" as JSON and the "data" as using our "user" type.
#[post("/users/create", format="json", data="<user>")]
fn create_user(user: Json<User>) -> String {
...
}
We need to provide the Json wrapper for our input type, but we can use it as though it's a normal User. For now, we'll just return a string echoing the user's information back to us. Don't forget to add each new endpoint to the routes macro in your server definition!
#[post("/users/create", format="json", data="<user>")]
fn create_user(user: Json<User>) -> String {
String::from(format!(
"Created user: {} {} {}", user.name, user.email, user.age))
}
Conclusion
Next time, we'll explore making a more systematic CRUD server. We'll add database integration and see some other tricks for serializing data and maintaining state. Then we'll explore more advanced topics like authentication, static files, and templating!
If you're going to be building a web application in Rust, you'd better have a solid foundation! Watch our Rust Video Tutorial to get an in-depth introduction!