Improving the IDE for Rust/WinRT

We’ve looked at the basics of getting started with Rust/WinRT and how to optimize your inner loop by reducing the build time thanks to Cargo’s caching of dependencies. Now let’s look at improving the quality of the development experience inside VS Code.

The main problem with Rust/WinRT’s import macro is that it doesn’t generate Rust code. Instead, it generates a token stream that is directly ingested by the Rust compiler. While this is quite efficient, it can be less than desirable from a developer’s perspective. Without Rust code, it becomes very difficult to debug into the generated code. Rust code is also required by the rust-analyzer VS Code extension in order to provide code completion hints.

Fortunately, Rust/WinRT provides a build macro to complement the import macro we’ve already been using. Both macros accept the same input syntax for describing dependencies and types to be imported. The difference is that the build macro helps to generate actual Rust code that you can read, include, and step through in a debugger.

The first thing we’ll do is add a build script to the bindings sub crate we created in the last installment. Note that it is intentionally called a build script and not a source file. Even though it’s just Rust code, it’s compiled separately. Cargo knows it’s the build script because the file is named build.rs and placed in the root of the package, not inside the src folder. Now add the following to the build.rs file you created in the bindings package:

winrt::build!(
    dependencies
        os
    types
        windows::system::diagnostics::*
);

fn main() {
    build();
}

Again, the tokens within the build macro are exactly the same as those we previously used with the import macro. Only the name of the macro itself has changed. Due to certain limitations in the Rust compiler, we need to generate the code inside the main function but cannot actually place the build macro inside the main function. So we have this awkward dance where the build macro really just generates a build function that the main function then calls to generate the Rust code for the package. Once the issues with the Rust compiler have been resolved, we should be able to streamline this process.

To underscore that the build script is compiled separately, we need to add winrt as a distinct dependency of the build inside the bindings project’s Cargo.toml file:

[dependencies]
winrt = "0.7.0"

[build-dependencies]
winrt = "0.7.0"

This ensures that the dependency is available to the build script without having to be available to the project’s source code. You can of course have the same dependency in both sections if needed, as it is in this case.

Now inside the bindings project’s src/lib.rs file we can include the Rust code generated in the build script, instead of calling the import macro:

include!(concat!(env!("OUT_DIR"), "/winrt.rs"));

Note that the OUT_DIR environment variable is only available if the project has a build script. It’s also why the build macro couldn’t just generate the winrt.rs file directly: the OUT_DIR environment variable is only set when the build script is executed and not when it is compiled, which is when the build macro is executed.
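
If you haven’t used build scripts before, here is a minimal, generic sketch to make that timing concrete. This is my own illustration rather than anything the winrt crate generates: any build script can write generated code into OUT_DIR while it runs, and the package’s source can then pull that file in with the same include! pattern shown above.

// build.rs (generic illustration only): OUT_DIR is set by Cargo while this
// script runs, so this is the only place the generated file can be written.
use std::{env, fs, path::Path};

fn main() {
    let out_dir = env::var("OUT_DIR").expect("OUT_DIR is set by Cargo for build scripts");
    let dest = Path::new(&out_dir).join("generated.rs");
    // The file name and contents here are made up for the example.
    fs::write(&dest, "pub fn answer() -> i32 { 42 }").expect("failed to write generated code");
}

The build macro follows the same model; it just writes the generated WinRT bindings into winrt.rs instead of a toy function.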

And that’s all we need to do to switch from using the import macro to the build macro. You can now recompile the sample project and you should find it works just as it did before. The difference is that now we can both debug the code and make use of code completion hints.

Debugging can be achieved either with the Microsoft C/C++ VS Code extension or with the CodeLLDB extension in combination with the rust-analyzer extension. Once you’ve picked an appropriate extension, you can simply begin debugging, step into any of the generated code, and you’ll end up somewhere inside the generated winrt.rs source file. The build macro even ensures that the Rust code is properly formatted for readability.

Code completion also works reasonably well with the rust-analyzer extension, but it does have a few limitations and can struggle a bit with the sheer amount of code that Rust/WinRT generates. I’ll give you two tips to help you get started.

The first is to ensure that rust-analyzer can find the generated code. That’s what the “Load Out Dirs From Check” setting is for, so make sure it is checked in the rust-analyzer extension settings.

The second is to place any use declarations at the top of your Rust source file, otherwise rust-analyzer will fail to correctly produce code completion hints. Last time, we wrote the use declaration inside the main function. That won’t do. Instead, update the main.rs source file as follows:

use bindings::windows::system::diagnostics::*;

fn main() -> winrt::Result<()> {
    for process in ProcessDiagnosticInfo::get_for_processes()?
        .into_iter()
        .take(5)
    {
        println!(
            "id: {:5} packaged: {:5} name: {}",
            process.process_id()?,
            process.is_packaged()?,
            process.executable_file_name()?
        );
    }
 
    Ok(())
}

Now, you should be able to rely on rust-analyzer to provide code completion hints.

If code completion isn’t working well, or you just want to browse the available APIs, you still have options. You can go to the official documentation for the Windows APIs. Of course, you’ll need to translate the C#/C++-specific naming conventions to Rust. Alternatively, you can have Cargo generate documentation for the generated bindings:

C:\sample\bindings>cargo doc --open
    Updating crates.io index
  Downloaded quote v1.0.7
  Downloaded serde_json v1.0.54
   Compiling proc-macro2 v1.0.18
.
.
.
    Finished dev [unoptimized + debuginfo] target(s) in 32.09s
     Opening C:\sample\bindings\target\doc\bindings\index.html

Cargo will open the browser, where you can search or browse the available APIs. Naturally, this will only include APIs that were generated by the build macro. If you’d like to see more, simply add more types to the build macro and rerun Cargo.
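
For example, to also generate bindings (and documentation) for the Windows.Foundation namespace, I believe you can simply list another path under types. Treat this as a sketch and check the winrt crate’s documentation for the exact syntax; the extra windows::foundation line is my own addition:

// build.rs: a sketch of listing an additional namespace under types.
winrt::build!(
    dependencies
        os
    types
        windows::system::diagnostics::*
        windows::foundation::*
);

fn main() {
    build();
}

Rerun cargo doc --open in the bindings package and the new namespace should show up alongside the others.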

Optimizing the build with Rust/WinRT

In Getting started with Rust/WinRT we used the import macro to generate Rust bindings for Windows APIs directly into the Rust module where the import macro is used. This can be a nested module if you wish. Here’s an example using the Windows.System.Diagnostics namespace, which is documented here.

mod bindings {
    winrt::import!(
        dependencies
            os
        types
            windows::system::diagnostics::*
    );
}

Notice how in the following main function, I now use bindings as part of the Rust path for the windows::system::diagnostics module:

fn main() -> winrt::Result<()> {
    use bindings::windows::system::diagnostics::*;

    for process in ProcessDiagnosticInfo::get_for_processes()? {
        println!(
            "id: {:5} packaged: {:5} name: {}",
            process.process_id()?,
            process.is_packaged()?,
            process.executable_file_name()?
        );
    }

    Ok(())
}

This will give you a quick dump of the processes currently running on your machine:

C:\sample>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 10.54s
     Running `target\debug\sample.exe`
id:     4 packaged: false name: System
id:   176 packaged: false name: Secure System
id:   284 packaged: false name: Registry
id:  8616 packaged: true  name: RuntimeBroker.exe
id: 10732 packaged: false name: svchost.exe
.
.
.

As I pointed out last time, the time the import macro adds to the build can quickly become prohibitive. One option is to implement the bindings module in a separate bindings.rs file. While that gives a marginal improvement, Cargo is far better at caching the results if you put the code in its own crate. Back in the console, let’s add a sub crate to house the generated bindings.

C:\sample>cargo new --lib bindings
     Created library `bindings` package

We then need to update the outer project to tell Cargo that it now depends on this new bindings library. To do that, we need to add bindings as a dependency in the Cargo.toml file for the sample project:

[dependencies]
winrt = "0.7.0"
bindings = { path = "bindings" }

While the first dependency is resolved via crates.io, the bindings dependency uses a relative path to find the sub crate. This is all it takes to ensure that Cargo will automatically build and cache the new dependency. Now let’s get the bindings library configured to import the WinRT types. Inside the bindings project, open the Cargo.toml file where we can add the winrt dependency:

[dependencies]
winrt = "0.7.0"

We can then simply remove the import macro from the original project’s main.rs source file and add it to the bindings project’s lib.rs source file:

winrt::import!(
    dependencies
        os
    types
        windows::system::diagnostics::*
);

The first time I run the example, it completes in about 10 seconds:

C:\sample>cargo run
   Compiling bindings v0.1.0 (C:\sample\bindings)
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 10.61s
     Running `target\debug\sample.exe`
id:     4 packaged: false name: System
id:   176 packaged: false name: Secure System
.
.
.

Now let’s make a small change to see whether incremental build time improves. Back in the sample project’s main function, we can turn the vector returned by get_for_processes into an iterator and limit the results to the first 5 processes:

fn main() -> winrt::Result<()> {
    use bindings::windows::system::diagnostics::*;

    for process in ProcessDiagnosticInfo::get_for_processes()?
        .into_iter()
        .take(5)
    {
        println!(
            "id: {:5} packaged: {:5} name: {}",
            process.process_id()?,
            process.is_packaged()?,
            process.executable_file_name()?
        );
    }

    Ok(())
}

Cargo makes quick work of recompiling and gets us running in under a second:

C:\sample>cargo run
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 0.69s
     Running `target\debug\sample.exe`
id:     4 packaged: false name: System
id:   176 packaged: false name: Secure System
id:   284 packaged: false name: Registry
id:  1064 packaged: false name: smss.exe
id:  1484 packaged: false name: csrss.exe

Now that’s much better. But there’s more! The import macro still has a few drawbacks, so next time I’ll talk about creating your own build script. Stay tuned!

Getting started with Rust/WinRT

Getting started with Rust/WinRT is quite simple thanks in large part to the polished toolchain that Rust developers enjoy. Here are a few links if you don’t need any help getting started. Read on if you’d like to learn some tips and tricks to make the most out of the Windows Runtime.

GitHub: https://github.com/microsoft/winrt-rs
Docs.rs: https://docs.rs/winrt/
Crates.io: https://crates.io/crates/winrt

Install the following prerequisites:

Visual Studio 2019 – be sure to install the C++ tools, as these are required by the Rust compiler (really only the linker).
Visual Studio Code – this is the default IDE used for Rust.
Python – be sure to install the x64 version as this is required for debugging support.
Git – Rust has deep support for Git.
Rust – this installs `rustup` which is a tool for installing Rust toolchains and common Rust related tooling.

Now open VS Code and type `Ctrl+Shift+X` to open the extensions panel and install the following extensions:

rust-analyzer – there are others, but this is the only Rust extension that I’ve tried that actually works reliably most of the time.
CodeLLDB – you can also use the Microsoft C++ extension for debugging, but this one does a better job of integrating with the IDE.
C/C++ – the Microsoft C++ extension doesn’t integrate as well with the IDE, but provides superior debugging information, so you may want to have that on hand for an emergency.

You should be prompted to download and install the Rust language server. Go ahead and let that install. You may need to restart VS Code and give it a few moments to load, after which it should all be ready and working pretty well.

Let’s now start real simple with a new cargo package:

C:\>cargo new sample
     Created binary (application) `sample` package

Cargo is Rust’s package manager. This command will create a minimal project that you can open with VS Code:

C:\>cd sample
C:\sample>code .

Open the Cargo.toml file that cargo created for the project and add the WinRT crate as a dependency:

[dependencies]
winrt = "0.7.0"

That’s the current version as I write this, but you can check Crates.io for the latest version. You can use cargo once again to build the application:

C:\sample>cargo build
    Updating crates.io index
   Compiling proc-macro2 v1.0.18
   Compiling unicode-xid v0.2.0
   Compiling syn v1.0.30
   Compiling ryu v1.0.5
   Compiling serde v1.0.111
   Compiling itoa v0.4.5
   Compiling sha1 v0.6.0
   Compiling quote v1.0.6
   Compiling serde_json v1.0.53
   Compiling winrt_gen_macros v0.7.0
   Compiling winrt_gen v0.7.0
   Compiling winrt_macros v0.7.0
   Compiling winrt v0.7.0
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 19.65s

The first time you run cargo, it goes ahead and downloads any dependencies recursively. This might seem like a lot, but it is not unusual for Rust crates to depend on a variety of other crates. The good news is that cargo will cache the compiled crates and reuse those results, ensuring subsequent builds are very snappy. You can also have cargo run the application directly, which will rebuild the application if necessary:

C:\sample>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 0.06s
     Running `target\debug\sample.exe`
Hello, world!

Notice how the initial build took 20 seconds, while the subsequent run hardly took any time at all. Of course, nothing had changed, so let’s change that. In the project’s src folder you’ll find a main.rs source file. There you can see the source of the Hello world greeting. Let’s now use the winrt::import macro to generate Rust bindings for WinRT APIs.

winrt::import!(
    dependencies
        os
    types
        windows::data::xml::dom::*
);

It doesn’t really matter where in main.rs you put this code, but I usually put it at the top as it logically includes a bunch of Rust code that you can then use in your application. The import macro has two parts. There are the dependencies that identify the WinRT components you wish to make use of in your application and the specific subset of types within those dependencies that you actually want to use. In this case, I’ve used “os” to make use of all available operating system APIs. Those correspond to the version of Windows that you happen to be running on. It’s also possible to target a specific Windows SDK version. I’ve also constrained the import macro to just those types in the windows::data::xml::dom module. This corresponds to the Windows.Data.Xml.Dom namespace in the official documentation. As you might have guessed, this is a Rust path and you can certainly constrain it further to include only specific types within different modules if you wish.
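
As a sketch of that last point, here is what constraining the import to a single type rather than a whole namespace might look like. The specific-type form is my own example rather than something from the sample, so check the winrt crate’s documentation if it doesn’t build as written:

// A sketch: importing just the XmlDocument type instead of the whole
// windows::data::xml::dom module.
winrt::import!(
    dependencies
        os
    types
        windows::data::xml::dom::XmlDocument
);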

Let’s now replace the main function provided by cargo with something a little more interesting. Here I’m using the XmlDocument struct generated by the import macro, which is documented here.

fn main() -> winrt::Result<()> {
    use windows::data::xml::dom::*;

    let doc = XmlDocument::new()?;
    doc.load_xml("<html>hello world</html>")?;

    let root = doc.document_element()?;
    assert!(root.node_name()? == "html");
    assert!(root.inner_text()? == "hello world");

    Ok(())
}

If you were to recompile at this point, you may notice it taking just a little while:

C:\sample>cargo run
   Compiling sample v0.1.0 (C:\sample)
    Finished dev [unoptimized + debuginfo] target(s) in 8.71s
     Running `target\debug\sample.exe`

8 seconds isn’t so bad, but as you add more types the import macro will naturally have more work to do and the Rust compiler will spend more time processing the results. The time it takes can quickly become prohibitive and a more scalable solution is needed. Still, the import macro is handy for getting started or just quickly calling a specific Windows API. Another option is to use a Rust build script to generate and cache the results of importing WinRT types.

We’ll cover that next time, so stay tuned!

Rust/WinRT is now on GitHub

We are excited to announce that the Rust/WinRT project finally has a permanent and public home on GitHub:

https://github.com/microsoft/winrt-rs

A lot has happened around the world since my last update. I hope this finds you and your family well. We have also had to adjust the way we work at Microsoft. Ryan Levick and I have been hard at work getting things ready for us to continue the development of Rust/WinRT out in the open. We look forward to hearing from all of you whether in the form of feedback or contributions. There is much to be done, but we are excited to be able to share what we have accomplished thus far.

For more information, please see the official announcement and don’t forget to try Robert Mikhayelyan’s very cool demo.

Rust/WinRT coming soon

My Rust adventure continues as I have been furiously working on Rust/WinRT for the last five months or so. I am indebted to Ryan Levick for patiently answering all of my questions and also jumping in and getting deeply involved in the project early on. I am also looking forward to opening it up to the community as soon as possible. Even then, it will be early days with much still to do. I remember chatting with Martyn Lovell about this a few years ago and we basically agreed that it takes about three years to build a language projection. Naturally, you can get value out of it before then, but that’s what you need to keep in mind when you consider completeness.

Still, I’m starting to be able to make API calls with Rust/WinRT and it’s very satisfying to see this come together. So, I’ll leave you with a sneak peek to give you a sense of what calling Windows APIs looks like in Rust. Here’s the venerable Windows.Foundation.Uri class:

use windows::foundation::*;

let uri = Uri::create_uri("https://kennykerr.ca")?;
assert!(uri.domain()? == "kennykerr.ca");
assert!(uri.port()? == 443);
assert!(uri.to_string()? == "https://kennykerr.ca/");

Immediately you’ll notice this looks far more like Rust (if you’re familiar with Rust) than it looks like C++ or C#. Notice the snake_case on module and method names and the ? operator for error propagation. The Uri class has a constructor that’s implemented by a factory method called CreateUri. Since Rust lacks constructors, we simply take that CreateUri method and project it as create_uri to conform to Rust’s naming conventions. The to_string method comes from the IStringable interface that the Uri class implements. Even though Rust doesn’t support type inheritance, Rust/WinRT ensures that you get the same classy type system that WinRT is built on. Under the hood, Rust/WinRT will naturally use QueryInterface to query for the IStringable interface so that it just works. You can also expect the same on-the-metal performance and efficiency as you do from C++/WinRT.
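
One note on the ? operator in the snippet above: it assumes the surrounding function returns winrt::Result. A complete version, following the same main pattern used in the other posts on this page, would look something like this (my own sketch; it also assumes bindings for windows::foundation have been generated, for example with the import macro):

// A complete sketch: the Uri snippet wrapped in a main function that
// returns winrt::Result so the ? operator can propagate errors.
use windows::foundation::*;

fn main() -> winrt::Result<()> {
    let uri = Uri::create_uri("https://kennykerr.ca")?;
    assert!(uri.domain()? == "kennykerr.ca");
    assert!(uri.port()? == 443);
    assert!(uri.to_string()? == "https://kennykerr.ca/");
    Ok(())
}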

Here’s another example using the Windows.ApplicationModel.DataTransfer namespace to copy some value onto the clipboard:

use windows::application_model::data_transfer::*;

let content = DataPackage::new()?;
content.set_text("Rust/WinRT")?;

Clipboard::set_content(content)?;
Clipboard::flush()?;

Here we’re calling the DataPackage’s default constructor, but of course Rust doesn’t have constructors. The default constructor is thus replaced with the conventional new method.

And finally, here’s an example of using the Windows.UI.Composition API:

use windows::foundation::numerics::*;
use windows::ui::composition::*;
use windows::ui::*;

let compositor = Compositor::new()?;
let visual = compositor.create_sprite_visual()?;
let red = Colors::red()?;
assert!(red == Color { a: 255, r: 255, g: 0, b: 0 });

let brush = compositor.create_color_brush_with_color(red)?;
visual.set_brush(brush)?;

visual.set_offset(Vector3 { x: 1.0, y: 2.0, z: 3.0, })?;
assert!(visual.offset()? == Vector3 { x: 1.0, y: 2.0, z: 3.0 });

Here you can see we’re creating a Compositor. We use the compositor to create a sprite visual with a red brush and then set the visual’s offset. This seems simple, but that’s a testament to the sheer amount of work that’s already gone into Rust/WinRT to make it seem so natural and native to Rust. The Composition API is one of only two type hierarchies in the Windows API and requires special attention to get right in any language projection, let alone a language that lacks traditional inheritance.

My point here is not to claim these are superb APIs. There may well be a better way to do these tasks in Rust. The point is that Rust/WinRT lets you call any Windows API past, present, and future using code generated on the fly directly from the canonical metadata describing the API and right into your Rust package where you can call them as if they were just another Rust module.

I’m looking forward to sharing more about Rust/WinRT.

Rust: an IDE for your project

Next: Coming soon
Previous: Your first package

So, you’ve built your first package with Cargo. It doesn’t take long before the lack of an IDE really becomes a problem. As is customary these days, there are many choices. I’ve been using VS Code.

https://code.visualstudio.com/

Go ahead and install that. Note that Code is not really an IDE in the traditional sense. Even though it shares a name with Visual Studio, it really has nothing to do with Visual Studio. Rather, it is more of a do-it-yourself IDE builder if you will. Still, it’s pretty neat and is better than the alternatives I have come across. Next, you will want to install the C/C++ extension by Microsoft. This will enable things like debugging support.

And then install the Rust (rls) extension.

This provides basic IntelliSense (when it works) as well as IDE support for things like building and debugging.

It’s early days for the IDE support, but you’ll get syntax highlighting and something that feels like an IDE. I’ll talk about debugging and IntelliSense, building, and much more in due course. But first, what’s more pertinent is how to set up your environment to model what a traditional C++ developer on Windows might think of as a collection of projects. Think of a Visual Studio solution and the enormous complexity that Visual Studio attempts to orchestrate when it presents a set of projects in its Solution Explorer, with project references and build dependencies all administered by msbuild. VS Code certainly has no notion of a solution in that sense, but what it does offer is surprisingly refreshing and, combined with Rust’s built-in support for dependencies, turns out to be superior in many ways. So, find an empty folder and give this a try.

Start by creating a binary package:

C:\projects>cargo new app
     Created binary (application) `app` package

Now create a library package that the app will use:

C:\projects>cargo new --lib fruit
     Created library `fruit` package

Notice I used the --lib option to tell Cargo to use its library template rather than the default binary template. Recall that Rust infers the type of package based on the presence of a file named src/main.rs or src/lib.rs respectively. So these two projects are almost identical so far. And just for good measure, let’s create a third package:

C:\projects>cargo new --lib apples
     Created library `apples` package

Right, you should now have three folders that look something like this:

C:\projects>dir
    <DIR>          app
    <DIR>          apples
    <DIR>          fruit

Notice there’s no “solution” file, but VS Code is quite happy to simply open a folder directly:

C:\projects>code .

Here I’m using the “.” shortcut to indicate the current folder.

Notice that VS Code’s Explorer panel simply presents the three folders and their contents and as you can see, the only difference between the binary and library packages is the name of the source file in each package’s src folder. Before looking at the code, let’s set up our package dependencies. We would like the app to depend on the fruit library and the fruit library in turn to depend on the apples library package. Such dependencies are described in the Cargo.toml file that each package contains in its package root.

Open app/Cargo.toml and you’ll see that Cargo filled in some basics. Add the fruit dependency such that app/Cargo.toml looks something like this:

[package]
name = "app"
version = "0.1.0"

[dependencies]
fruit = { path = "../fruit" }

Cargo lets you specify dependencies in a variety of ways, but this simply tells Cargo that this package depends on a package in a relative location in the file system. Cargo will thus ensure that the fruit package is built and ready for consumption by the app package. Now you should know how to update fruit/Cargo.toml to depend on the apples package:

[package]
name = "fruit"
version = "0.1.0"

[dependencies]
apples = { path = "../apples" }

And that’s it. In Visual Studio parlance, you’ve set up a solution with a few projects that include build dependencies. Don’t believe me? Change to the app folder and let’s see what happens when we build it:

C:\projects>cd app

We can now simply tell Cargo to build the app package as we learned previously. But notice what happens:

C:\projects\app>cargo build
   Compiling apples v0.1.0 (C:\projects\apples)
   Compiling fruit v0.1.0 (C:\projects\fruit)
   Compiling app v0.1.0 (C:\projects\app)
    Finished dev [unoptimized + debuginfo] target(s) in 0.99s

Cargo chased down the package dependencies recursively, first building the apples package, then the fruit package, and finally the app itself. As you might expect, any time you make a change to any source in any of these packages, Cargo will automatically rebuild as needed. On the other hand, if nothing has changed it will simply reuse build artifacts or do the equivalent of an incremental build:

C:\projects\app>cargo build
    Finished dev [unoptimized + debuginfo] target(s) in 0.08s

Now for good measure, let’s see if we can actually call some code down through these package dependencies. We’ll start at the bottom. Open apples/src/lib.rs and replace whatever’s there with a simple apple function:

pub fn apple() -> &'static str {
    "Granny Smith"
}

Don’t worry too much about the syntax at this stage. We will shortly switch gears and talk about the Rust language itself. Essentially, this is just saying that we would like a public (pub) function (fn) called apple that doesn’t have any parameters and returns a static string slice, typically a string literal.
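
If that return type looks odd, here is a tiny standalone aside of my own, showing that a &'static str is just a borrowed view of text baked into the binary, and that returning an owned String is the alternative when the text has to be computed at run time:

// An aside on the return type (illustration only, not part of the sample).
fn apple() -> &'static str {
    "Granny Smith" // a string literal lives for the whole program: 'static
}

fn computed_apple(count: u32) -> String {
    format!("{} Granny Smith apples", count) // owned, heap-allocated string
}

fn main() {
    println!("{}", apple());
    println!("{}", computed_apple(3));
}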

Now open fruit/src/lib.rs and replace whatever’s there with a fruit function:

pub fn fruit() -> &'static str {
    apples::apple()
}

This function has the same signature as before, but simply calls the apple function and forwards on its return value. Finally, we can call the fruit function from our app’s main function by opening app/src/main.rs and replacing whatever’s there with this main function:

fn main() {
    println!("Hello, {}!", fruit::fruit());
}

And that’s it. Save your work and go back to the command prompt and build the app:

C:\projects\app>cargo build
   Compiling apples v0.1.0 (C:\projects\apples)
   Compiling fruit v0.1.0 (C:\projects\fruit)
   Compiling app v0.1.0 (C:\projects\app)
    Finished dev [unoptimized + debuginfo] target(s) in 0.79s

You’ll notice that Cargo once again understands that it should rebuild all dependencies since they have all changed. As I mentioned before, rather than Cargo’s build command we could more easily just use Cargo’s run command to both build and run the app:

C:\projects\app>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 0.02s
     Running `target\debug\app.exe`
Hello, Granny Smith!

Well hello there, Granny Smith. Wasn’t that fun! We’ve gone from building individual Rust source files, to building a Rust package, to being able to build a set of Rust packages with really very little effort on our part. VS Code is not without issues and there are certainly more and different ways to manage packages and editors, but this is enough to get you started.

Join me next time for the adventures of a C++ developer learning Rust.

Rust: your first package

Next: An IDE for your project
Previous: Getting started

So, you’ve installed the tools and you’re ready to get started. What now? Hello world is satisfying but not very practical. It’s time to create your first package. Open the Visual Studio x64 command prompt; it’s important that the architecture (x64) matches what Rust assumes as its default.

Now change to a directory where you can create a few projects. I tend to use a root directory called git for all my projects:

C:\git>

There are a few reasons for this. The obvious one being that I tend to use Git for version-control. The other is that a default installation of Windows does not include any other folder starting with a G in the root of the C drive, making it easy to switch to the git folder with tab completion, otherwise I might have gone with something like “projects” if not for the two or three folders that start with a P.

Last time I introduced the Cargo tool just enough to print out the Rust version. In a moment, we’re going to use it to do something useful. As a Windows developer, you might fire up Visual Studio and use that to create a new project. There are numerous project templates for all manner of languages and applications. As a Rust developer, you’re going to use Cargo to do that and much more.

But first, I want to dispel any mystery around what a Rust package is. If you’ve spent any time with Visual Studio, particularly as a C++ developer, you know that managing project configurations can be a nightmare. Fortunately, Rust packages are quite simple to the point where you can easily create them by hand. It helps to do this at least once so that you understand just what makes up a Rust package.

There are two main kinds of packages in Rust. I’ll talk about a third category a little later in this series. You can create a binary package or a library package. A binary package produces an executable that you can run, just like any Windows executable. A library package is similar to a C++ library.

Start by creating a folder for your package and switching to that folder:

C:\git>md hello

C:\git>cd hello

Now create a text file called cargo.toml in this folder:

C:\git\hello>notepad cargo.toml

A .toml file is similar to a .ini file, but with an actual specification, making for a very simple way to define key/value pairs for describing and configuring your Rust package and its dependencies. Unlike the horrendously complicated XML format used by Visual Studio, you can actually write your package’s cargo.toml file by hand without any difficulty. Here’s all you’ll need to get started:

[package]
name = "hello"
version = "0.1.0" 

That’ll do. Be sure to save that in the cargo.toml file in the root of the package folder. You’ll notice that it doesn’t say anything about what kind of package this is or how it ought to be built. Much of that will be inferred. Cargo will figure this out based on what Rust source files are present. Having created the cargo.toml file, you can now create a src subfolder for the package’s source code.

C:\git\hello>md src

Within the src folder, you can create one of two Rust source files. If you create a file named main.rs then cargo will assume you are creating a binary package. If on the other hand you create a file named lib.rs then cargo will assume you are creating a library package. Let’s start with a binary package.

C:\git\hello>notepad src\main.rs

A .rs file is where you write your Rust source code, much like .cs files for C# and .cpp/.h files for C++. Here again is the simple Hello World example we used last time:

fn main() {
    println!("Hello world!");
}

Be sure to save that in the src\main.rs source file. And that’s it: you’ve created your first Rust package! You can now use Cargo to build the package as follows:

C:\git\hello>cargo build

You’ll notice that running the Cargo build command caused a few other artifacts to be created, but you should only consider the cargo.toml file, along with any source files in the src subfolder, to be the actual source code for your project that you might include in version control. Cargo also created a file called Cargo.lock in your package folder. This file is entirely managed by Cargo and you should not edit it yourself. There are reasons why you may or may not want to include this file in version control, but we’ll talk about that later. More importantly, the Cargo build command created an executable from your package:

C:\git\hello>dir /s /b hello.exe
C:\git\hello\target\debug\hello.exe

You can run this directly:

C:\git\hello>target\debug\hello.exe
Hello world!

And if it’s more convenient, you can have Cargo run it for you:

C:\git\hello>cargo run
    Finished dev [unoptimized + debuginfo] target(s) in 0.03s
     Running `target\debug\hello.exe`
Hello world!

The cargo tool also provides commands for creating packages so you can save yourself a few moments if so desired. You can ask Cargo to create a binary package as follows:

C:\git>cargo new hello
     Created binary (application) `hello` package

C:\git>cd hello

C:\git\hello>cargo build
   Compiling hello v0.1.0 (C:\git\hello)
    Finished dev [unoptimized + debuginfo] target(s) in 0.71s

Of course, you already know that there’s no magic here. It just created the cargo.toml file, the src subfolder, and a main.rs file with a simple main function to get you started. You can also have Cargo create a package in the current folder, rather than a subfolder, using the cargo init command. Note that Cargo will also default to turning your package into a Git repo. If you’d rather not have it do that, you can opt out as follows:

C:\git>cargo new hello --vcs none

OK, you’ve created your first Rust package. Wasn’t that easy! Now what about library packages? And where’s the IDE?! Join me next time for the adventures of a C++ developer learning Rust.

Rust: getting started

Next: Your first package
Previous: My Rust adventure begins

So, you’re a C++ developer and want to learn Rust. You’ve come to the right place. 😊 It’s not hard to get started with Rust, but as a C++ developer and especially one with a preference for Windows you are likely to run into some of the same challenges I have. I hope that sharing my experience will help you to get started with Rust a little more quickly than I did.

On Windows, Rust requires the Microsoft C++ build tools. You guessed right: Rust depends on the thing you love most about C++… the linker! And a few other things… You can download the C++ build tools separately, or just install Visual Studio 2019. I recommend the latter.

https://visualstudio.microsoft.com/downloads/

Visual Studio provides a dizzying array of options, most of which are completely irrelevant to the C++ developer. You can painstakingly pick any of the “individual components” the installer provides or just pick some “workloads”. I suggest you do the latter. Here’s what I do to keep things simple.

Pick all three “Windows” workloads. You may not think you need them all but invariably, some dependency will arise where these are required, and this just keeps things simple. The only “individual component” I add to the mix is “Git for Windows”. That’s important because, well, Git rules the world and you’re going to need that real soon.

You can scroll through the endless list or just use the search box to narrow things down. And that’s it. Hit the install button and then go and grab a coffee.

Once Visual Studio is installed, you can head over to the Rust website to get started.

https://www.rust-lang.org/

You should notice the prominent “Get Started” button. Go ahead and click that. You should then see a few options. You can, for example, “try Rust without installing”, but what’s the fun in that? Instead, go for the “RUSTUP-INIT.EXE” button that links to https://win.rustup.rs/ and run the resulting executable. This will pop up a console installer.

Go ahead and pick option 1 and the installer will get on with downloading the various Rust build tools. Soon enough that will be done and you will be ready to create your first project!

To get started you need a console window. You can use any command prompt, but I suggest using the Visual Studio tools command prompt. You can then confirm that Rust is installed with a simple version command: cargo --version

What’s cargo? Well you could also run the same command with rustc, the Rust compiler, but to be honest you’ll almost never run the Rust compiler directly. So, just get used to running Cargo for all your build needs. Cargo is officially Rust’s package manager, whatever that means. Package managers may be a little foreign to the average C++ developer. Unlike C++, Rust has an integrated packaging story and Cargo is the tool that you use to manage packages. It might help if I told you that practically every Rust project is a package. In that light, Cargo is really the Rust project manager. You can use Cargo to create projects, build projects, test projects, publish projects, and much more. It’s just that Rust projects are called packages… Except when they’re called crates, but that’s enough about that confusing topic for one day.

As I mentioned, rustc is the Rust compiler. We can quickly put Hello World behind us as follows. Create a text file, with the .rs extension, and include a simple main function:

C:\hello_world>type app.rs

fn main() {
    println!("Hello world!");
}

Not a fan of that formatting? You’re not alone, but more about formatting later. You can now compile this program as follows:

C:\hello_world>rustc app.rs

And lo and behold you’ve created your first Rust program:

C:\hello_world>dir

    150,528 app.exe
  1,363,968 app.pdb
         49 app.rs

Notice there’s both an executable and a .pdb file for debugging. Naturally, you can simply run the executable from the console:

C:\hello_world>app
Hello world!

OK, you’ve confirmed that Rust is installed and managed to compile your first bit of Rust code. Awesome! What about Cargo and building actual projects? Please tell me there’s an IDE and Notepad isn’t going to be my Rusty lot in life? What about testing and debugging? Oh there’s so much to explore. So join me next time for the adventures of a C++ developer learning Rust.

My Rust adventure begins

Next: Getting started

I have come to the point with C++/WinRT where I am largely satisfied with how it works and leverages C++ to the best of its ability. There is always room for improvement and I will continue to evolve and optimize C++/WinRT as the C++ language itself advances. But as a technology, the Windows Runtime has always been about more than just one language and we have started working on a few different projects to add support for various languages. None of these efforts could however draw me away from C++… that is until Rust showed up on my radar.

Rust is an intriguing language for me. It so closely resembles C++ in many ways, hitting all the right notes when it comes to compilation and runtime model, type system, and deterministic finalization, that I could not help but get a little excited about this fresh new take on language design. I have spent almost every waking moment over the last few months (that I’m not hanging out with my family) exploring, studying, and experimenting with Rust. I looked for the usual signs of a language that is not really geared for a systems programmer like myself, but found none. To the contrary, I found that while it has its own unique and dramatic learning curve, it also has the potential to solve some of the most vexing issues with C++’s relationship to WinRT. Imagine C++/WinRT without any need for IDL, faster build times, and a simple and integrated build system.

And so it is that I have started building the WinRT language projection for Rust. I’m just getting started and have much to learn, but the plan is to build complete and deep support for WinRT in a way that is natural and familiar for the Rust developer. This is not going to look very much like C++/WinRT because idiomatic Rust does not look and feel like C++, but I plan to apply the same level of rigor in producing WinRT support for Rust that is both very efficient and a joy to use.

I’ll be sharing more about my adventures with Rust right here on kennykerr.ca but if you’d like to follow along more closely, take a look at the Rust winmd parser I wrote to get things started:

https://github.com/microsoft/winmd-rs

This is largely based on the C++ winmd parser library. While certainly not complete, it has just enough in place to allow me to now spend some time exploring and laying the groundwork for the WinRT support. The plan is to turn this Rust crate into a complete winmd parser for both reading and generating winmd files. A separate Rust crate will then provide the actual support for consuming and producing WinRT APIs.

But I’m getting ahead of myself. Do let me know what you think. I’d love to hear from you. And don’t forget to check back soon as I will probably start writing about the adventures of a C++ developer learning Rust. 🙂

C++/WinRT and xlang repos

If you follow along on GitHub, you may have noticed a few changes in the C++/WinRT and xlang world. It became clear that having one repo for a variety of projects and languages just wasn’t practical. Developers interested in working on one language or library inevitably had to deal with all of it, creating an unnecessarily steep learning curve. And while we have a lot of ambitions for the xlang project, it’s clear that C++/WinRT remains our flagship project. To that end, and to make it easier to work with the two most popular projects under the xlang umbrella, we’ve split the projects up as follows:

C++/WinRT
Repository: https://github.com/microsoft/cppwinrt
Documentation: https://aka.ms/cppwinrt
NuGet package: http://aka.ms/cppwinrt/nuget
Visual Studio extension: http://aka.ms/cppwinrt/vsix

C++ winmd parser library
Repository: https://github.com/microsoft/winmd
NuGet package: http://aka.ms/winmd/nuget

Existing projects related to cross-platform support
Repository: https://github.com/microsoft/xlang

You may also have noticed some new GitHub repos for Java and C# language support. Obviously, we’d love to add support for every popular language, but our resources are limited. We have experimented with adding support for both. The C# project might seem a little curious given that C# currently supports WinRT directly, but we have discovered through our experience with C++/WinRT that we can provide a far better experience by separating the WinRT support from the compiler itself. On a personal note, I’m spending a lot of my time working with the Rust language and can’t wait to share more about that eventually. 😉