
Commit 8129da9

Merge pull request #62 from dcarosone/book-typos-and-small-edits
Book typos and small edits
2 parents 3e22dc2 + 1fe8a28 commit 8129da9

File tree: 4 files changed, +87 -67 lines changed


docs/src/concepts/futures.md

Lines changed: 63 additions & 47 deletions
@@ -6,13 +6,15 @@ Futures abstract over *computation*. They describe the "what", independent of th

 ## Send and Sync

-Luckily, concurrent Rust already has two well-known and effective concepts abstracting over sharing between concurrent parts of a program: Send and Sync. Notably, both the Send and Sync traits abstract over *strategies* of concurrent work, compose neatly, and don't prescribe an implementation.
+Luckily, concurrent Rust already has two well-known and effective concepts abstracting over sharing between concurrent parts of a program: `Send` and `Sync`. Notably, both the `Send` and `Sync` traits abstract over *strategies* of concurrent work, compose neatly, and don't prescribe an implementation.

-As a quick summary, `Send` abstracts over passing data in a computation over to another concurrent computation (let's call it the receiver), losing access to it on the sender side. In many programming languages, this strategy is commonly implemented, but missing support from the language side expects you to enforce the "losing access" behaviour yourself. This is a regular source of bugs: senders keeping handles to sent things around and maybe even working with them after sending. Rust mitigates this problem by making this behaviour known. Types can be `Send` or not (by implementing the appropriate marker trait), allowing or disallowing sending them around, and the ownership and borrowing rules prevent subsequent access.
+As a quick summary:

-Note how we avoided any word like *"thread"*, but instead opted for "computation". The full power of `Send` (and subsequently also `Sync`) is that they relieve you of the burden of knowing *what* shares. At the point of implementation, you only need to know which method of sharing is appropriate for the type at hand. This keeps reasoning local and is not influenced by whatever implementation the user of that type later uses.
+- `Send` abstracts over *passing data* in a computation to another concurrent computation (let's call it the receiver), losing access to it on the sender side. In many programming languages, this strategy is commonly implemented, but missing support from the language side expects you to enforce the "losing access" behaviour yourself. This is a regular source of bugs: senders keeping handles to sent things around and maybe even working with them after sending. Rust mitigates this problem by making this behaviour known. Types can be `Send` or not (by implementing the appropriate marker trait), allowing or disallowing sending them around, and the ownership and borrowing rules prevent subsequent access.

-`Sync` is about sharing data between two concurrent parts of a program. This is another common pattern: as writing to a memory location or reading while another party is writing is inherently unsafe, this access needs to be moderated through synchronisation.[^1] There are many common ways for two parties to agree on not using the same part in memory at the same time, for example mutexes and spinlocks. Again, Rust gives you the option of (safely!) not caring. Rust gives you the ability to express that something *needs* synchronisation while not being specific about the *how*.
+- `Sync` is about *sharing data* between two concurrent parts of a program. This is another common pattern: as writing to a memory location or reading while another party is writing is inherently unsafe, this access needs to be moderated through synchronisation.[^1] There are many common ways for two parties to agree on not using the same part in memory at the same time, for example mutexes and spinlocks. Again, Rust gives you the option of (safely!) not caring. Rust gives you the ability to express that something *needs* synchronisation while not being specific about the *how*.
+
+Note how we avoided any word like *"thread"*, but instead opted for "computation". The full power of `Send` and `Sync` is that they relieve you of the burden of knowing *what* shares. At the point of implementation, you only need to know which method of sharing is appropriate for the type at hand. This keeps reasoning local and is not influenced by whatever implementation the user of that type later uses.

 `Send` and `Sync` can be composed in interesting fashions, but that's beyond the scope here. You can find examples in the [Rust Book][rust-book-sync].
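
As an aside from the diff itself, here is a minimal sketch of the two traits in action, using only the standard library (an editor's illustration, not text from the book). Moving the `String` into the spawned thread relies on `String: Send`; sharing the counter through `Arc<Mutex<_>>` relies on that combination being `Sync`. Threads are just one possible "receiver" here; the traits themselves don't prescribe one.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    // `Send`: the String is moved into the new thread; the sender loses access,
    // enforced by ownership rules rather than by convention.
    let greeting = String::from("hello");
    let sender = thread::spawn(move || println!("{greeting}"));

    // `Sync`: Arc<Mutex<u64>> can be shared between threads; the Mutex expresses
    // that access *needs* synchronisation without fixing how callers schedule it.
    let counter = Arc::new(Mutex::new(0u64));
    let shared = Arc::clone(&counter);
    let adder = thread::spawn(move || *shared.lock().unwrap() += 1);

    sender.join().unwrap();
    adder.join().unwrap();
    println!("counter = {}", counter.lock().unwrap());
}
```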

@@ -30,12 +32,12 @@ While computation is a subject to write a whole [book](https://computationbook.c

 ## Deferring computation

-As mentioned above `Send` and `Sync` are about data. But programs are not only about data, they also talk about *computing* the data. And that's what [`Futures`][futures] do. We are going to have a close look at how that works in the next chapter. Let's look at what Futures allow us to express, in English. Futures go from this plan:
+As mentioned above, `Send` and `Sync` are about data. But programs are not only about data, they also talk about *computing* the data. And that's what [`Futures`][futures] do. We are going to have a close look at how that works in the next chapter. Let's look at what Futures allow us to express, in English. Futures go from this plan:

 - Do X
-- If X succeeds, do Y
+- If X succeeded, do Y

-towards
+towards:

 - Start doing X
 - Once X succeeds, start doing Y
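
To make that shift concrete, a small editor's sketch (not part of the commit; `do_x` and `do_y` are hypothetical stand-ins for "X" and "Y"). Building the future below runs nothing; it merely *describes* "start doing X, once X succeeds start doing Y", and an executor drives it later.

```rust
use std::future::Future;
use std::io;

// Hypothetical steps standing in for "X" and "Y".
async fn do_x() -> Result<u32, io::Error> {
    Ok(42)
}

async fn do_y(x: u32) -> Result<u32, io::Error> {
    Ok(x + 1)
}

// Returns a description of the plan; no I/O or computation happens here.
fn plan() -> impl Future<Output = Result<u32, io::Error>> {
    async {
        let x = do_x().await?;
        do_y(x).await
    }
}
```

Running such a plan is the job of a runtime, which the rest of the diffed chapter gets to.
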
@@ -48,67 +50,81 @@ Remember the talk about "deferred computation" in the intro? That's all it is. I

 Let's have a look at a simple function, specifically the return value:

-    fn read_file(path: &str) -> Result<String, io::Error> {
-        let mut file = File.open(path)?;
-        let mut contents = String::new();
-        file.read_to_string(&mut contents)?;
-        contents
-    }
+```rust
+fn read_file(path: &str) -> Result<String, io::Error> {
+    let mut file = File.open(path)?;
+    let mut contents = String::new();
+    file.read_to_string(&mut contents)?;
+    contents
+}
+```
+
+You can call that at any time, so you are in full control on when you call it. But here's the problem: the moment you call it, you transfer control to the called function until it returns a value - eventually.
+Note that this return value talks about the past. The past has a drawback: all decisions have been made. It has an advantage: the outcome is visible. We can unwrap the results of the program's past computation, and then decide what to do with it.
+
+But we wanted to abstract over *computation* and let someone else choose how to run it. That's fundamentally incompatible with looking at the results of previous computation all the time. So, let's find a type that *describes* a computation without running it. Let's look at the function again:
+
+```rust
+fn read_file(path: &str) -> Result<String, io::Error> {
+    let mut file = File.open(path)?;
+    let mut contents = String::new();
+    file.read_to_string(&mut contents)?;
+    contents
+}
+```

-You can call that at any time, so you are in full control of when you call it. But here's the problem: the moment you call it, you transfer control to the called function. It returns a value.
-Note that this return value talks about the past. The past has a drawback: all decisions have been made. It has an advantage: the outcome is visible. We can unwrap the presents of program past and then decide what to do with it.
+Speaking in terms of time, we can only take action *before* calling the function or *after* the function returned. This is not desirable, as it takes from us the ability to do something *while* it runs. When working with parallel code, this would take from us the ability to start a parallel task while the first runs (because we gave away control).

-But here's a problem: we wanted to abstract over *computation* to be allowed to let someone else choose how to run it. That's fundamentally incompatible with looking at the results of previous computation all the time. So, let's find a type that describes a computation without running it. Let's look at the function again:
+This is the moment where we could reach for [threads](https://en.wikipedia.org/wiki/Thread_). But threads are a very specific concurrency primitive and we said that we are searching for an abstraction.

-    fn read_file(path: &str) -> Result<String, io::Error> {
-        let mut file = File.open(path)?;
-        let mut contents = String::new();
-        file.read_to_string(&mut contents)?;
-        contents
-    }
+What we are searching for is something that represents ongoing work towards a result in the future. Whenever we say "something" in Rust, we almost always mean a trait. Let's start with an incomplete definition of the `Future` trait:

-Speaking in terms of time, we can only take action *before* calling the function or *after* the function returned. This is not desirable, as it takes from us the ability to do something *while* it runs. When working with parallel code, this would take from us the ability to start a parallel task while the first runs (because we gave away control).
+```rust
+trait Future {
+    type Output;

-This is the moment where we could reach for [threads](https://en.wikipedia.org/wiki/Thread_). But threads are a very specific concurrency primitive and we said that we are searching for an abstraction.
-What we are searching is something that represents ongoing work towards a result in the future. Whenever we say `something` in Rust, we almost always mean a trait. Let's start with an incomplete definition of the `Future` trait:
+    fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output>;
+}
+```
+
+Looking at it closely, we see the following:

-    trait Future {
-        type Output;
-
-        fn poll(self: Pin<&mut Self>, cx: &mut Context) -> Poll<Self::Output>;
-    }
+- It is generic over the `Output`.
+- It provides a function called `poll`, which allows us to check on the state of the current computation.
+- (Ignore `Pin` and `Context` for now, you don't need them for high-level understanding.)

-Ignore `Pin` and `Context` for now, you don't need them for high-level understanding. Looking at it closely, we see the following: it is generic over the `Output`. It provides a function called `poll`, which allows us to check on the state of the current computation.
 Every call to `poll()` can result in one of these two cases:

-1. The future is done, `poll` will return [`Poll::Ready`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Ready)
-2. The future has not finished executing, it will return [`Poll::Pending`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Pending)
+1. The computation is done, `poll` will return [`Poll::Ready`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Ready)
+2. The computation has not finished executing, it will return [`Poll::Pending`](https://doc.rust-lang.org/std/task/enum.Poll.html#variant.Pending)

-This allows us to externally check if a `Future` has finished doing its work, or is finally done and can give us the value. The most simple way (but not efficient) would be to just constantly poll futures in a loop. There's optimisations here, and this is what a good runtime does for you.
-Note that calling `poll` after case 1 happened may result in confusing behaviour. See the [Future docs](https://doc.rust-lang.org/std/future/trait.Future.html) for details.
+This allows us to externally check if a `Future` still has unfinished work, or is finally done and can give us the value. The most simple (but not efficient) way would be to just constantly poll futures in a loop. There are optimisations possible, and this is what a good runtime does for you.
+Note that calling `poll` again after case 1 happened may result in confusing behaviour. See the [futures-docs](https://doc.rust-lang.org/std/future/trait.Future.html) for details.
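
To make `poll` a bit more tangible, here is a hand-written future along the lines of the trait shown above (an editor's sketch, not part of the commit; it uses only the `std` items the snippet names). It is finished immediately, so its first `poll` returns `Poll::Ready`; a pending future would return `Poll::Pending` until its work completes, and a runtime keeps polling it until then.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context, Poll};

/// A future that already knows its value: the first call to `poll`
/// hands it out via `Poll::Ready`.
struct Ready(Option<String>);

impl Future for Ready {
    type Output = String;

    fn poll(mut self: Pin<&mut Self>, _cx: &mut Context<'_>) -> Poll<Self::Output> {
        // Taking the value out means a second poll after completion would panic,
        // matching the "confusing behaviour" caveat above.
        Poll::Ready(self.0.take().expect("polled after completion"))
    }
}
```

An executor such as the one in `async-std` drives this for you: `async_std::task::block_on(Ready(Some("hello".into())))` would evaluate to the `String`.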

 ## Async

-While the `Future` trait has existed in Rust for a while, it was inconvenient to build and describe them. For this, Rust now has a special syntax: `async`. The example from above, implemented in `async-std`, would look like this:
+While the `Future` trait has existed in Rust for a while, it was inconvenient to build and describe them. For this, Rust now has a special syntax: `async`. The example from above, implemented with `async-std`, would look like this:
+
+```rust
+use async_std::fs::File;

-    use async_std::fs::File;
-
-    async fn read_file(path: &str) -> Result<String, io::Error> {
-        let mut file = File.open(path).await?;
-        let mut contents = String::new();
-        file.read_to_string(&mut contents).await?;
-        contents
-    }
+async fn read_file(path: &str) -> Result<String, io::Error> {
+    let mut file = File.open(path).await?;
+    let mut contents = String::new();
+    file.read_to_string(&mut contents).await?;
+    contents
+}
+```

 Amazingly little difference, right? All we did is label the function `async` and insert 2 special commands: `.await`.

-This function sets up a deferred computation. When this function is called, it will produce a `Future<Output=Result<String, io::Error>>` instead of immediately returning a `Result<String, io::Error>`. (Or, more precisely, generate a type for you that implements `Future<Output=Result<String, io::Error>>`.)
+This `async` function sets up a deferred computation. When this function is called, it will produce a `Future<Output=Result<String, io::Error>>` instead of immediately returning a `Result<String, io::Error>`. (Or, more precisely, generate a type for you that implements `Future<Output=Result<String, io::Error>>`.)

 ## What does `.await` do?

-The `.await` postfix does exactly what it says on the tin: the moment you use it, the code will wait until the requested action (e.g. opening a file or reading all data in it) is finished. `.await?` is not special, it's just the application of the `?` operator to the result of `.await`. So, what is gained over the initial code example? Were getting futures and then immediately waiting for them?
+The `.await` postfix does exactly what it says on the tin: the moment you use it, the code will wait until the requested action (e.g. opening a file or reading all data in it) is finished. The `.await?` is not special, it's just the application of the `?` operator to the result of `.await`. So, what is gained over the initial code example? We're getting futures and then immediately waiting for them?

-The `.await` points act as a marker. Here, the code will wait for a `Future` to produce its value. How will a future finish? You dont need to care! The marker allows the code later *executing* this piece of code (usually called the “runtime”) when it can take some time to care about all the other things it has to do. It will come back to this point when the operation you are doing in the background is done. This is why this style of programming is also called *evented programming*. We are waiting for *things to happen* (e.g. a file to be opened) and then react (by starting to read).
+The `.await` points act as a marker. Here, the code will wait for a `Future` to produce its value. How will a future finish? You don't need to care! The marker allows the component (usually called the “runtime”) in charge of *executing* this piece of code to take care of all the other things it has to do while the computation finishes. It will come back to this point when the operation you are doing in the background is done. This is why this style of programming is also called *evented programming*. We are waiting for *things to happen* (e.g. a file to be opened) and then react (by starting to read).

 When executing 2 or more of these functions at the same time, our runtime system is then able to fill the wait time with handling *all the other events* currently going on.
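
Finally, a runnable end-to-end sketch of the async version (again an editor's addition, not part of the commit). It assumes `async-std` 1.x as a dependency, spells the call `File::open` rather than the `File.open` shown in the snippet above, wraps the result in `Ok(...)` so the body matches the declared return type, and reads two placeholder files (`a.txt`, `b.txt`) concurrently so the runtime can fill one read's wait time with progress on the other.

```rust
use async_std::fs::File;
use async_std::prelude::*; // brings the async `read_to_string` into scope
use async_std::task;
use std::io;

// Same shape as the chapter's example, with `File::open` and `Ok(contents)`.
async fn read_file(path: &str) -> Result<String, io::Error> {
    let mut file = File::open(path).await?;
    let mut contents = String::new();
    file.read_to_string(&mut contents).await?;
    Ok(contents)
}

fn main() -> Result<(), io::Error> {
    task::block_on(async {
        // Spawn both reads, then await their results; while one file is still
        // pending, the runtime is free to make progress on the other.
        let a = task::spawn(read_file("a.txt"));
        let b = task::spawn(read_file("b.txt"));
        let (a, b) = (a.await?, b.await?);
        println!("read {} and {} bytes", a.len(), b.len());
        Ok(())
    })
}
```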
