Rust and Direct2D

I have been hard at work improving the API coverage of Rust for Windows and reached a little milestone with the windows crate being advanced enough to build a complete desktop app with hardware-accelerated rendering and animation using Direct2D. This is actually a Rust port of a C++ sample I wrote for one of my Pluralsight courses. Enjoy!

https://github.com/kennykerr/samples-rs

Rust for Windows

I am excited to finally talk about the grand plan we have been working on for some time, namely the unification of the Windows API. No more Win32 here, WinRT there, COM this, UWP that. Just stop it. Rust for Windows lets you use any Windows API directly and seamlessly via the windows crate.

Whether it’s timeless functions like CreateEvent and WaitForSingleObject, powerful graphics engines like Direct3D, traditional windowing functions like CreateWindowEx and DispatchMessage, or more recent UI frameworks like Composition and Xaml, the windows crate has you covered.
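
Here’s a rough sketch of what that looks like in practice for the Win32 side, along the lines of the examples in the repo. It assumes a bindings crate generated with the windows crate’s build macro, and the module paths and call signatures below are assumptions that may well differ from whatever the current preview generates:

use bindings::{
    windows::win32::system_services::{CreateEventW, SetEvent, WaitForSingleObject},
    windows::win32::windows_programming::CloseHandle,
};

fn main() -> windows::Result<()> {
    unsafe {
        // Classic Win32: create an event, signal it, wait on it, and close the handle.
        // Exact parameter types depend on the generated bindings.
        let event = CreateEventW(std::ptr::null_mut(), true.into(), false.into(), None);
        SetEvent(event).ok()?;
        WaitForSingleObject(event, 0);
        CloseHandle(event).ok()?;
    }

    Ok(())
}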

This is an early preview, but finally having metadata for the entire Windows API is a huge step toward making Windows development easier and more approachable for all developers.

The repo has everything you need to get started:

https://github.com/microsoft/windows-rs/

In particular, the readme has a short guide to getting started. There are also some simple examples that you can follow. And of course, we have updated Robert Mikhayelyan’s Minesweeper port.

If having the entire Windows API at your fingertips seems a little daunting, I have also published some Rust documentation for the Windows API. This lets you browse or search for just the API you need and makes it a lot easier to find what you are looking for.

If you have questions or run into issues, please use the GitHub repo to get in touch.

Repo: https://github.com/microsoft/windows-rs

API docs: https://microsoft.github.io/windows-docs-rs

Samples: https://github.com/kennykerr/samples-rs

https://blogs.windows.com/windowsdeveloper/2021/01/20/making-win32-apis-more-accessible-to-more-languages

Improving the IDE for Rust/WinRT

We’ve looked at the basics of getting started with Rust/WinRT and how to optimize your inner loop by reducing the build time thanks to Cargo’s caching of dependencies. Now let’s look at improving the quality of the development experience inside VS Code.

The main problem with Rust/WinRT’s import macro is that it doesn’t generate Rust code. Instead, it generates a token stream that is directly ingested by the Rust compiler. While this is quite efficient, it can be less than desirable from a developer’s perspective. Without Rust code, it becomes very difficult to debug into the generated code. Rust code is also required by the rust-analyzer VS Code extension in order to provide code completion hints.

Fortunately, Rust/WinRT provides a build macro to complement the import macro we’ve already been using. Both macros accept the same input syntax for describing dependencies and types to be imported. The difference is that the build macro helps to generate actual Rust code that you can read, include, and step through in a debugger.

The first thing we’ll do is add a build script to the bindings sub crate we created in the last installment. Note that it is intentionally called a build script and not a source file. Even though it’s just Rust code, it’s compiled separately. Cargo knows it’s the build script because the file is named build.rs and placed in the root of the package, not inside the src folder. Now add the following to the build.rs file you created in the bindings package:

winrt::build!(
    dependencies
        os
    types
        windows::system::diagnostics::*
);

fn main() {
    build();
}

Again, the tokens within the build macro are exactly the same as those we previously used with the import macro. Only the name of the macro itself has changed. Due to certain limitations in the Rust compiler, the code must be generated from within the main function, yet the build macro itself cannot be placed inside the main function. So we have this awkward dance where the build macro really just generates a build function that the main function then calls to generate the Rust code for the package. Once the issues with the Rust compiler have been resolved, we should be able to streamline this process.

To underscore that the build script is compiled separately, we need to add winrt as a distinct build dependency inside the bindings project’s Cargo.toml file:

[dependencies]
winrt = "0.7.0"

[build-dependencies]
winrt = "0.7.0"

This ensures that the dependency is available to the build script without having to be available to the project’s source code. You can of course have the same dependency in both sections if needed, as is the case here.

Now inside the bindings project’s src/lib.rs file we can include the Rust code generated by the build script, instead of calling the import macro:

include!(concat!(env!("OUT_DIR"), "/winrt.rs"));

Note that the OUT_DIR environment variable is only available if the project has a build script. It’s also why the build macro couldn’t just generate the winrt.rs file directly: the OUT_DIR environment variable is only set when the build script is executed and not when it is compiled, which is when the build macro is executed.
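
For context, this is just the standard Cargo mechanism at work: a build script writes generated sources into OUT_DIR and the package includes them, which is essentially what the build macro automates for us. Here’s a generic sketch of that mechanism, unrelated to Rust/WinRT itself:

// build.rs - a generic illustration of the OUT_DIR mechanism
use std::{env, fs, path::Path};

fn main() {
    // OUT_DIR is only set by Cargo while the build script is running.
    let out_dir = env::var("OUT_DIR").expect("OUT_DIR is set by Cargo for build scripts");
    let code = "pub fn generated() -> &'static str { \"hello from OUT_DIR\" }";
    fs::write(Path::new(&out_dir).join("generated.rs"), code).expect("failed to write generated code");
}

The package then pulls that file in with the same include!/concat!/env! pattern shown above.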

And that’s all we need to do to switch from using the import macro to the build macro. You can now recompile the sample project and you should find it works just as it did before. The difference is that now we can both debug the code and make use of code completion hints.

Debugging can be achieved either with the Microsoft C/C++ VS Code extension or with the CodeLLDB extension in combination with the rust-analyzer extension. Once you’ve picked an appropriate extension, you can simply begin debugging and step into any of the generated code, and you’ll end up somewhere inside the generated winrt.rs source file. The build macro even ensures that the Rust code is properly formatted for readability.

Code completion also works reasonably well with the rust-analyzer extension, but it does have a few limitations and can struggle a bit with the sheer amount of code that Rust/WinRT generates. I’ll give you two tips to help you get started.

The first is to ensure that rust-analyzer can find the generated code. That’s what the “Load Out Dirs From Check” setting is for, so make sure it is enabled.

The second is to place any use declarations at the top of your Rust source file, otherwise rust-analyzer will fail to correctly produce code completion hints. Last time, we wrote the use declaration inside the main function. That won’t do. Instead, update the main.rs source file as follows:

use bindings::windows::system::diagnostics::*;

fn main() -> winrt::Result<()> {
    for process in ProcessDiagnosticInfo::get_for_processes()?
        .into_iter()
        .take(5)
    {
        println!(
            "id: {:5} packaged: {:5} name: {}",
            process.process_id()?,
            process.is_packaged()?,
            process.executable_file_name()?
        );
    }
 
    Ok(())
}

Now, you should be able to rely on rust-analyzer to provide code completion hints.

If code completion isn’t working too well or you just want to browse available APIs, you still have options. You can go to the official documentation for Windows APIs. Of course, you’ll need to translate the C#/C++-specific naming conventions to Rust. Alternatively, you can get Cargo to generate documentation for the generated bindings:

C:\sample\bindings>cargo doc --open
    Updating crates.io index
  Downloaded quote v1.0.7
  Downloaded serde_json v1.0.54
   Compiling proc-macro2 v1.0.18
.
.
.
    Finished dev [unoptimized + debuginfo] target(s) in 32.09s
     Opening C:\sample\bindings\target\doc\bindings\index.html

Cargo will open the browser, where you can search or browse any of the available APIs. Naturally, this will only include APIs that were generated by the build macro. If you’d like to see more, simply add more types to the build macro and rerun Cargo.
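
For instance, hypothetically widening the bindings to include the Windows.Storage namespace would just mean adding another line to the macro and rebuilding (windows::storage is used here purely as an illustration; list whatever namespaces or types you actually need):

winrt::build!(
    dependencies
        os
    types
        windows::system::diagnostics::*
        windows::storage::*
);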

Optimizing the build with Rust/WinRT

In Getting started with Rust/WinRT we used the import macro to generate Rust bindings for Windows APIs directly into the Rust module where the import macro is used. This can be a nested module if you wish. Here’s an example using the Windows.System.Diagnostics namespace, which is documented here.

mod bindings {
    winrt::import!(
        dependencies
            os
        types
            windows::system::diagnostics::*
    );
}

Notice how in the following main function, I now use bindings as part of the Rust path for the windows::system::diagnostics module:

fn main() -> winrt::Result<()> {
    use bindings::windows::system::diagnostics::*;

    for process in ProcessDiagnosticInfo::get_for_processes()? {
        println!(
            "id: {:5} packaged: {:5} name: {}",
            process.process_id()?,
            process.is_packaged()?,
            process.executable_file_name()?
        );
    }

    Ok(())
}

This will give you a quick dump of the processes currently running on your machine:

C:\sample>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 10.54s
     Running `target\debug\sample.exe`
id:     4 packaged: false name: System
id:   176 packaged: false name: Secure System
id:   284 packaged: false name: Registry
id:  8616 packaged: true  name: RuntimeBroker.exe
id: 10732 packaged: false name: svchost.exe
.
.
.

As I pointed out last time, the time the import macro adds to every build can quickly become prohibitive. One option is to implement the bindings module in a separate bindings.rs source file.
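
For reference, that separate-file option is nothing more than a module declaration pointing at a second source file containing the same import macro (a quick sketch using this sample’s types):

// src/main.rs
mod bindings; // Rust looks for the module in src/bindings.rs

// src/bindings.rs
winrt::import!(
    dependencies
        os
    types
        windows::system::diagnostics::*
);

While this gives a marginal improvement, Cargo is far better at caching the results if you stick the code in its own crate. Back in the console, let’s add a sub crate to house the generated bindings.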

C:\sample>cargo new --lib bindings
     Created library `bindings` package

We then need to update the outer project to tell Cargo that it now depends on this new bindings library. To do that, we need to add bindings as a dependency in the Cargo.toml file for the sample project:

[dependencies]
winrt = "0.7.0"
bindings = { path = "bindings" }

While the first dependency is resolved via crates.io, the bindings dependency uses a relative path to find the sub crate. This is all it takes to ensure that Cargo will automatically build and cache the new dependency. Now let’s get the bindings library configured to import the WinRT types. Inside the bindings project, open the Cargo.toml file where we can add the winrt dependency:

[dependencies]
winrt = "0.7.0"

We can then simply remove the import macro from the original project’s main.rs source file and add it to the bindings project’s lib.rs source file:

winrt::import!(
    dependencies
        os
    types
        windows::system::diagnostics::*
);

The first time I run the example, it completes in about 10 seconds:

C:\sample>cargo run
   Compiling bindings v0.1.0 (C:\sample\bindings)
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 10.61s
     Running `target\debug\sample.exe`
id:     4 packaged: false name: System
id:   176 packaged: false name: Secure System
.
.
.

Now let’s make a small change to see whether incremental build time improves. Back in the sample project’s main function, we can turn the vector returned by get_for_processes into an iterator and limit the results to the first 5 processes:

fn main() -> winrt::Result<()> {
    use bindings::windows::system::diagnostics::*;

    for process in ProcessDiagnosticInfo::get_for_processes()?
        .into_iter()
        .take(5)
    {
        println!(
            "id: {:5} packaged: {:5} name: {}",
            process.process_id()?,
            process.is_packaged()?,
            process.executable_file_name()?
        );
    }

    Ok(())
}

Cargo makes quick work of recompiling and gets us running in under a second:

C:\sample>cargo run
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 0.69s
     Running `target\debug\sample.exe`
id:     4 packaged: false name: System
id:   176 packaged: false name: Secure System
id:   284 packaged: false name: Registry
id:  1064 packaged: false name: smss.exe
id:  1484 packaged: false name: csrss.exe

Now that’s much better. But there’s more! The import macro still has a few drawbacks, so next time I’ll talk about creating your own build script. Stay tuned!

Getting started with Rust/WinRT

Getting started with Rust/WinRT is quite simple thanks in large part to the polished toolchain that Rust developers enjoy. Here are a few links if you don’t need any help getting started. Read on if you’d like to learn some tips and tricks to make the most out of the Windows Runtime.

GitHub: https://github.com/microsoft/winrt-rs
Docs.rs: https://docs.rs/winrt/
Crates.io: https://crates.io/crates/winrt

Install the following prerequisites:

Visual Studio 2019 – be sure to install the C++ tools, as these are required by the Rust compiler (only the linker is actually needed).
Visual Studio Code – this is the default IDE used for Rust.
Python – be sure to install the x64 version as this is required for debugging support.
Git – Rust has deep support for Git.
Rust – this installs `rustup` which is a tool for installing Rust toolchains and common Rust related tooling.

Now open VS Code and type `Ctrl+Shift+X` to open the extensions panel and install the following extensions:

rust-analyzer – there are others, but this is the only Rust extension that I’ve tried that actually works reliably most of the time.
CodeLLDB – you can also use the Microsoft C++ extension for debugging, but this one does a better job of integrating with the IDE.
C/C++ – the Microsoft C++ extension doesn’t integrate as well with the IDE, but provides superior debugging information, so you may want to have that on hand for an emergency.

You should be prompted to download and install the Rust language server. Go ahead and let that install. You may need to restart VS Code and give it a few moments to load, after which it should all be ready and working pretty well.

Let’s now start real simple with a new cargo package:

C:\>cargo new sample
     Created binary (application) `sample` package

Cargo is Rust’s package manager. This command will create a minimal project that you can open with VS Code:

C:\>cd sample
C:\sample>code .

Open the Cargo.toml file that cargo created for the project and add the WinRT crate as a dependency:

[dependencies]
winrt = "0.7.0"

That’s the current version as I write this, but you can check Crates.io for the latest version. You can use cargo once again to build the application:

C:\sample>cargo build
    Updating crates.io index
   Compiling proc-macro2 v1.0.18
   Compiling unicode-xid v0.2.0
   Compiling syn v1.0.30
   Compiling ryu v1.0.5
   Compiling serde v1.0.111
   Compiling itoa v0.4.5
   Compiling sha1 v0.6.0
   Compiling quote v1.0.6
   Compiling serde_json v1.0.53
   Compiling winrt_gen_macros v0.7.0
   Compiling winrt_gen v0.7.0
   Compiling winrt_macros v0.7.0
   Compiling winrt v0.7.0
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 19.65s

The first time you run cargo, it goes ahead and downloads any dependencies recursively. This might seem like a lot, but it is not unusual for Rust crates to depend on a variety of other crates. The good news is that cargo will cache the compiled crates and reuse those results, ensuring subsequent builds are very snappy. You can also have cargo run the application directly, which will rebuild the application if necessary:

C:\sample>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 0.06s
     Running `target\debug\sample.exe`
Hello, world!

Notice how the initial build took about 20 seconds, while the subsequent run hardly took any time at all. Of course, nothing had changed, so let’s change that. In the project’s src folder you’ll find a main.rs source file. There you can see the source of the Hello world greeting. Let’s now use the winrt::import macro to generate Rust bindings for WinRT APIs.

winrt::import!(
    dependencies
        os
    types
        windows::data::xml::dom::*
);

It doesn’t really matter where in main.rs you put this code, but I usually put it at the top as it logically includes a bunch of Rust code that you can then use in your application. The import macro has two parts. There are the dependencies that identify the WinRT components you wish to make use of in your application and the specific subset of types within those dependencies that you actually want to use. In this case, I’ve used “os” to make use of all available operating system APIs. Those correspond to the version of Windows that you happen to be running on. It’s also possible to target a specific Windows SDK version. I’ve also constrained the import macro to just those types in the windows::data::xml::dom module. This corresponds to the Windows.Data.Xml.Dom namespace in the official documentation. As you might have guessed, this is a Rust path and you can certainly constrain it further to include only specific types within different modules if you wish.
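
For example, a narrower import that pulls in only the XmlDocument type might look like the following; exactly which types you need to list is something you’d confirm against your own code, so treat this as a sketch:

winrt::import!(
    dependencies
        os
    types
        windows::data::xml::dom::XmlDocument
);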

Let’s now replace the main function provided by cargo with something a little more interesting. Here I’m using the XmlDocument struct generated by the import macro, which is documented here.

fn main() -> winrt::Result<()> {
    use windows::data::xml::dom::*;

    let doc = XmlDocument::new()?;
    doc.load_xml("<html>hello world</html>")?;

    let root = doc.document_element()?;
    assert!(root.node_name()? == "html");
    assert!(root.inner_text()? == "hello world");

    Ok(())
}

If you were to recompile at this point, you may notice it taking just a little while:

C:\sample>cargo run
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 8.71s
     Running `target\debug\sample.exe`

8 seconds isn’t so bad, but as you add more types the import macro will naturally have more work to do and the Rust compiler will spend more time processing the results. The time it takes can quickly become prohibitive and a more scalable solution is needed. Still, the import macro is handy for getting started or just quickly calling a specific Windows API. Another option is to use a Rust build script to generate and cache the results of importing WinRT types.

We’ll cover that next time, so stay tuned!

Rust/WinRT is now on GitHub

We are excited to announce that the Rust/WinRT project finally has a permanent and public home on GitHub:

https://github.com/microsoft/winrt-rs

A lot has happened around the world since my last update. I hope this finds you and your family well. We have also had to adjust the way we work at Microsoft. Ryan Levick and I have been hard at work getting things ready for us to continue the development of Rust/WinRT out in the open. We look forward to hearing from all of you whether in the form of feedback or contributions. There is much to be done, but we are excited to be able to share what we have accomplished thus far.

For more information, please see the official announcement and don’t forget to try Robert Mikhayelyan’s very cool demo.

Rust/WinRT coming soon

My Rust adventure continues as I have been furiously working on Rust/WinRT for the last five months or so. I am indebted to Ryan Levick for patiently answering all of my questions and also jumping in and getting deeply involved in the project early on. I am also looking forward to opening it up to the community as soon as possible. Even then, it will be early days with much still to do. I remember chatting with Martyn Lovell about this a few years ago and we basically agreed that it takes about three years to build a language projection. Naturally, you can get value out of it before then, but that’s what you need to keep in mind when you consider completeness.

Still, I’m starting to be able to make API calls with Rust/WinRT and it’s very satisfying to see this come together. So, I’ll leave you with a sneak peek to give you a sense of what calling Windows APIs looks like in Rust. Here’s the venerable Windows.Foundation.Uri class:

use windows::foundation::*;

let uri = Uri::create_uri("https://kennykerr.ca")?;
assert!(uri.domain()? == "kennykerr.ca");
assert!(uri.port()? == 443);
assert!(uri.to_string()? == "https://kennykerr.ca/");

Immediately you’ll notice this looks far more like Rust (if you’re familiar with Rust) than it looks like C++ or C#. Notice the snake_case on module and method names and the ? operator for error propagation. The Uri class has a constructor that’s implemented by a factory method called CreateUri. Since Rust lacks constructors, we simply take that CreateUri method and project it as create_uri to conform to Rust’s naming conventions. The to_string method comes from the IStringable interface that the Uri class implements. Even though Rust doesn’t support type inheritance, Rust/WinRT ensures that you get the same classy type system that WinRT is built on. Under the hood, Rust/WinRT will naturally use QueryInterface to query for the IStringable interface so that it just works. You can also expect the same on-the-metal performance and efficiency as you do from C++/WinRT.

Here’s another example using the Windows.ApplicationModel.DataTransfer namespace to copy some value onto the clipboard:

use windows::application_model::data_transfer::*;

let content = DataPackage::new()?;
content.set_text("Rust/WinRT")?;

Clipboard::set_content(content)?;
Clipboard::flush()?;

Here we’re calling the DataPackage’s default constructor, but of course Rust doesn’t have constructors. The default constructor is thus replaced with the conventional new method.

And finally, here’s an example of using the Windows.UI.Composition API:

use windows::foundation::numerics::*;
use windows::ui::composition::*;
use windows::ui::*;

let compositor = Compositor::new()?;
let visual = compositor.create_sprite_visual()?;
let red = Colors::red()?;
assert!(red == Color { a: 255, r: 255, g: 0, b: 0 });

let brush = compositor.create_color_brush_with_color(red)?;
visual.set_brush(brush)?;

visual.set_offset(Vector3 { x: 1.0, y: 2.0, z: 3.0, })?;
assert!(visual.offset()? == Vector3 { x: 1.0, y: 2.0, z: 3.0 });

Here you can see we’re creating a Compositor. We use the compositor to create a sprite visual with a red brush and then set the visual’s offset. This seems simple, but that’s a testament to the sheer amount of work that’s already gone into Rust/WinRT to make it seem so natural and native to Rust. The Composition API is one of only two type hierarchies in the Windows API and requires special attention to get right in any language projection, let alone a language that lacks traditional inheritance.

My point here is not to claim these are superb APIs. There may well be a better way to do these tasks in Rust. The point is that Rust/WinRT lets you call any Windows API, past, present, and future, using code generated on the fly directly from the canonical metadata describing the API, right into your Rust package, where you can call it as if it were just another Rust module.

I’m looking forward to sharing more about Rust/WinRT.

Rust: an IDE for your project

So, you’ve built your first package with Cargo. It doesn’t take long before the lack of an IDE really becomes a problem. As is customary these days, there are many choices. I’ve been using VS Code.

https://code.visualstudio.com/

Go ahead and install that. Note that Code is not really an IDE in the traditional sense. Even though it shares a name with Visual Studio, it really has nothing to do with Visual Studio. Rather, it is more of a do-it-yourself IDE builder if you will. Still, it’s pretty neat and is better than the alternatives I have come across. Next, you will want to install the C/C++ extension by Microsoft. This will enable things like debugging support.

And then install the Rust (rls) extension.

This provides basic IntelliSense (when it works) as well as IDE support for things like building and debugging.

It’s early days for the IDE support, but you’ll get syntax highlighting and something that feels like an IDE. I’ll talk about debugging and IntelliSense, building, and much more in due course. But first, what’s more pertinent is how to set up your environment to model what a traditional C++ developer on Windows might think of as a collection of projects. Think of a Visual Studio solution and the enormous complexity that Visual Studio attempts to orchestrate when it presents a set of projects in its Solution Explorer, with project references and build dependencies all administered by msbuild. VS Code certainly has no notion of a solution in that sense, but what it does offer is surprisingly refreshing and, combined with Rust’s built-in support for dependencies, turns out to be superior in many ways. So, find an empty folder and give this a try.

Start by creating a binary package:

C:\projects>cargo new app
     Created binary (application) `app` package

Now create a library package that the app will use:

C:\projects>cargo new --lib fruit
     Created library `fruit` package

Notice I used the --lib option to tell Cargo to use its library template rather than the default binary template. Recall that Cargo infers the type of package based on the presence of a file named src/main.rs or src/lib.rs respectively. So these two packages are almost identical so far. And just for good measure, let’s create a third package:

C:\projects>cargo new --lib apples
     Created library `apples` package

Right, you should now have three folders that look something like this:

C:\projects>dir
    <DIR>          app
    <DIR>          apples
    <DIR>          fruit

Notice there’s no “solution” file, but VS Code is quite happy to simply open a folder directly:

C:\projects>code .

Here I’m using the “.” shortcut to indicate the current folder.

Notice that VS Code’s Explorer panel simply presents the three folders and their contents, and as you can see, the only difference between the binary and library packages is the name of the source file in each package’s src folder. Before looking at the code, let’s set up our package dependencies. We would like the app to depend on the fruit library, and the fruit library in turn to depend on the apples library package. Such dependencies are described in the Cargo.toml file that each package contains in its package root.

Open app/Cargo.toml and you’ll see that Cargo filled in some basics. Add the fruit dependency such that app/Cargo.toml looks something like this:

[package]
name = "app"
version = "0.1.0"

[dependencies]
fruit = { path = "../fruit" }

Cargo lets you specify dependencies in a variety of ways, but this simply tells Cargo that this package depends on a package in a relative location in the file system. Cargo will thus ensure that the fruit package is built and ready for consumption by the app package. Now you should know how to update fruit/Cargo.toml to depend on the apples package:

[package]
name = "fruit"
version = "0.1.0"

[dependencies]
apples = { path = "../apples" }

And that’s it. In Visual Studio parlance, you’ve set up a solution with a few projects that include build dependencies. Don’t believe me? Change to the app folder and let’s see what happens when we build it:

C:\projects>cd app

We can now simply tell Cargo to build the app package as we learned previously. But notice what happens:

C:\projects\app>cargo build
   Compiling apples v0.1.0 (C:\projects\apples)
   Compiling fruit v0.1.0 (C:\projects\fruit)
   Compiling app v0.1.0 (C:\projects\app)
    Finished dev [unoptimized + debuginfo] target(s) in 0.99s

Cargo chased down the package dependencies recursively, first building the apples package, then the fruit package, and finally the app itself. As you might expect, any time you make a change to any source in any of these packages, Cargo will automatically rebuild as needed. On the other hand, if nothing has changed it will simply reuse build artifacts or do the equivalent of an incremental build:

C:\projects\app>cargo build
    Finished dev [unoptimized + debuginfo] target(s) in 0.08s

Now for good measure, let’s see if we can actually call some code down through these package dependencies. We’ll start at the bottom. Open apples/src/lib.rs and replace whatever’s there with a simple apple function:

pub fn apple() -> &'static str {
    "Granny Smith"
}

Don’t worry too much about the syntax at this stage. We will shortly switch gears and talk about the Rust language itself. Essentially, this is just saying that we would like a public (pub) function (fn) called apple that doesn’t have any parameters and returns a static string slice, typically a string literal.
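
If the &'static str return type is unfamiliar, this generic snippet (unrelated to the sample itself) shows why a string literal satisfies it:

// String literals are baked into the program's binary for its entire run,
// so a reference to one is valid for the 'static lifetime.
fn greeting() -> &'static str {
    "hello"
}

fn main() {
    let s: &'static str = greeting();
    println!("{}", s);
}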

Now open fruit/src/lib.rs and replace whatever’s there with a fruit function:

pub fn fruit() -> &'static str {
    apples::apple()
}

This function has the same signature as before, but simply calls the apple function and forwards its return value. Finally, we can call the fruit function from our app’s main function by replacing whatever’s in app/src/main.rs with this main function:

fn main() {
    println!("Hello, {}!", fruit::fruit());
}

And that’s it. Save your work and go back to the command prompt and build the app:

C:\projects\app>cargo build
   Compiling apples v0.1.0 (C:\projects\apples)
   Compiling fruit v0.1.0 (C:\projects\fruit)
   Compiling app v0.1.0 (C:\projects\app)
    Finished dev [unoptimized + debuginfo] target(s) in 0.79s

You’ll notice that Cargo once again understands that it should rebuild all dependencies since they have all changed. As I mentioned before, rather than Cargo’s build command we could more easily just use Cargo’s run command to both build and run the app:

C:\projects\app>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 0.02s
     Running `target\debug\app.exe`
Hello, Granny Smith!

Well hello there, Granny Smith. Wasn’t that fun! We’ve gone from building individual Rust source files, to building a Rust package, to being able to build a set of Rust packages with really very little effort on our part. VS Code is not without issues and there are certainly more and different ways to manage packages and editors, but this is enough to get you started.

Join me next time for the adventures of a C++ developer learning Rust.