DemesneGH commented on code in PR #173:
URL: https://github.com/apache/incubator-teaclave-trustzone-sdk/pull/173#discussion_r1991362395


##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.

Review Comment:
   Please give quick build-and-install guidance, such as `make`, and a link to
the "run in QEMUv8" documentation.



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.

Review Comment:
   I'd prefer more details on the memory settings; please refer to the code in build.rs.
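   For example, the README could quote the knobs from the example's build.rs. The sketch below follows the `optee-utee-build` crate as I recall it; the method names and sizes are assumptions, and the example's actual build.rs is authoritative:

```rust
// Hypothetical sketch of the Train TA's build.rs; the concrete sizes and
// method names must be taken from the real examples/mnist-rs build.rs.
use optee_utee_build::{Builder, RustEdition, TaConfig};

fn main() -> Result<(), optee_utee_build::Error> {
    let config = TaConfig::new_default_with_cargo_env("1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6")?
        // Training holds the dataset and gradients in memory, so the Train TA
        // needs a much larger heap (ta_data_size) than the Inference TA.
        .ta_data_size(32 * 1024 * 1024)
        .ta_stack_size(128 * 1024);
    Builder::new(RustEdition::Before2024, config).build()
}
```

   Quoting these two numbers in the README would make the "consumes more memory" remark concrete.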



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.
+
+## Running the Host

Review Comment:
   It's good to have detailed usage of the CA, as in the current documentation.
   Additionally, could we have a description of the TAs' interfaces, and of
   which parts run in the TAs and which in the CA? That is not very clear at
   the moment.



##########
examples/mnist-rs/host/src/tee.rs:
##########
@@ -0,0 +1,150 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+use optee_teec::{Context, ErrorKind, Operation, ParamNone, ParamTmpRef, Session, Uuid};

Review Comment:
   Format: add a blank line after the license header.



##########
examples/mnist-rs/host/src/commands/mod.rs:
##########
@@ -0,0 +1,19 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+pub mod infer;

Review Comment:
   Format: add a blank line between line 16 and line 17.



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.
+
+## Running the Host
+
+There are three subcommands in the host:
+
+1. Train
+
+    Trains a new model and exports it to the given path.
+
+    ``` shell
+    cargo run -- train -o model.bin
+    ```
+
+    This subcommand downloads the MNIST dataset, trains a new model, and outputs
+    the model to the given path.
+
+    For detailed usage, run: `cargo run -- train --help`.
+
+2. Infer
+
+    Loads a model from the given path, tests it with a given image, and prints
+    the inference result.
+
+    ```shell
+    # cargo run -- infer -m model.bin -b samples/7.bin -i samples/7.png
+    cargo run -- infer [-b ${binary_path} | -i ${image_path}]
+    ```
+
+    This subcommand loads the model the model from the given path and tests it
+    with the given binaries and images, and prints the inference results. For
+    convenience, you can use the sample binaries and images in the `samples`
+    folder.
+
+    For detailed usage, run: `cargo run -- infer --help`.

Review Comment:
   Same consideration as line 33.



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.
+
+## Running the Host
+
+There are three subcommands in the host:
+
+1. Train
+
+    Trains a new model and exports it to the given path.
+
+    ``` shell
+    cargo run -- train -o model.bin
+    ```
+
+    This subcommand downloads the MNIST dataset, trains a new model, and outputs
+    the model to the given path.
+
+    For detailed usage, run: `cargo run -- train --help`.

Review Comment:
   If the output isn't too long, I'd prefer to include it here so others can
see the full usage without needing to build or check out the code.



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.
+
+## Running the Host
+
+There are three subcommands in the host:
+
+1. Train
+
+    Trains a new model and exports it to the given path.
+
+    ``` shell
+    cargo run -- train -o model.bin
+    ```
+
+    This subcommand downloads the MNIST dataset, trains a new model, and outputs
+    the model to the given path.
+
+    For detailed usage, run: `cargo run -- train --help`.
+
+2. Infer
+
+    Loads a model from the given path, tests it with a given image, and prints
+    the inference result.
+
+    ```shell
+    # cargo run -- infer -m model.bin -b samples/7.bin -i samples/7.png
+    cargo run -- infer [-b ${binary_path} | -i ${image_path}]
+    ```
+
+    This subcommand loads the model the model from the given path and tests it
+    with the given binaries and images, and prints the inference results. For
+    convenience, you can use the sample binaries and images in the `samples`
+    folder.
+
+    For detailed usage, run: `cargo run -- infer --help`.
+
+3. Serve
+
+    Loads a model from the given path, starts a web server and serves it as an
+    API.
+
+    ```shell
+    cargo run -- serve -m model.bin
+    ```
+
+    This subcommand loads the model the model from the given path and starts a
+    web server to provide inference APIs.
+
+    **Available APIs**:
+
+    | Method | Endpoint | Body |
+    | ---- | ---- | ---- |
+    | POST | `/inference/image` | an image with dimensions 28x28 |
+    | POST | `/inference/binary` | a 784-byte binary |
+
+    You can test the server with the following commands:
+
+    ```shell
+    # Perform inference using an image
+    curl --data-binary "@./samples/7.png" http://localhost:3000/inference/image
+    # Perform inference using a binary file
+    curl --data-binary "@./samples/7.bin" http://localhost:3000/inference/binary
+    ```
+
+    For detailed usage, run: `cargo run -- serve --help`.
+
+## Credits
+
+This demo project is inspired by the crates and examples from
+[tracel-ai/burn](https://github.com/tracel-ai/burn), including:
+
+1. [crates/burn-no-std-tests](https://github.com/tracel-ai/burn/tree/v0.16.0/crates/burn-dataset)
+2. [examples/custom-training-loop](https://github.com/tracel-ai/burn/tree/v0.16.0/examples/custom-training-loop)
+3. [examples/mnist-inference-web](https://github.com/tracel-ai/burn/tree/v0.16.0/examples/mnist-inference-web)

Review Comment:
   We need to address licensing when referencing code from third-party
projects. Once the v0.4.0 release is complete, we should have clearer
guidelines on this. For now, just leave a note here and do nothing else;
adjusting the license for this example can be a separate PR.



##########
examples/mnist-rs/host/src/commands/infer.rs:
##########
@@ -0,0 +1,80 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agreements.  See the NOTICE file
+// distributed with this work for additional information
+// regarding copyright ownership.  The ASF licenses this file
+// to you under the Apache License, Version 2.0 (the
+// "License"); you may not use this file except in compliance
+// with the License.  You may obtain a copy of the License at
+//
+//   http://www.apache.org/licenses/LICENSE-2.0
+//
+// Unless required by applicable law or agreed to in writing,
+// software distributed under the License is distributed on an
+// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+// KIND, either express or implied.  See the License for the
+// specific language governing permissions and limitations
+// under the License.
+
+use clap::Parser;
+use image::EncodableLayout;
+use optee_teec::Context;
+use proto::{Image, IMAGE_SIZE};
+
+#[derive(Parser, Debug)]
+pub struct Args {
+    /// The path of the model.
+    #[arg(short, long)]
+    model: String,
+    /// The path of the input binary, must be 768 byte binary, can be multiple
+    #[arg(short, long)]
+    binary: Vec<String>,
+    /// The path of the input image, must be dimension of 28x28, can be multiple
+    #[arg(short, long)]
+    image: Vec<String>,
+}
+
+pub fn execute(args: &Args) {
+    let model_path = std::path::absolute(&args.model).unwrap();
+    println!("Load model from \"{}\"", model_path.display());
+    let record = std::fs::read(&model_path).unwrap();
+    let mut ctx = Context::new().unwrap();
+    let mut caller = crate::tee::Model::new(&mut ctx, &record).unwrap();

Review Comment:
   Could we handle the error instead of calling `unwrap()`, which panics?
   There are some `unwrap()` calls in other files too.
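   One possible shape, as a sketch: define a host-side error type and propagate with `?` instead of unwrapping. `HostError` and `load_record` below are illustrative names, not from the PR; `anyhow` would also work.

```rust
use std::fmt;

// Illustrative error type for the host; a crate like `anyhow` or the
// SDK's own error type could serve the same role in the real code.
#[derive(Debug)]
enum HostError {
    Io(std::io::Error),
    Tee(String),
}

impl fmt::Display for HostError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            HostError::Io(e) => write!(f, "I/O error: {e}"),
            HostError::Tee(msg) => write!(f, "TEE error: {msg}"),
        }
    }
}

// Read the serialized model record, mapping the I/O failure into our
// error type instead of panicking on it.
fn load_record(path: &str) -> Result<Vec<u8>, HostError> {
    std::fs::read(path).map_err(HostError::Io)
}

// `?` propagates errors up to the caller, which can report them and
// return cleanly rather than unwinding with a panic backtrace.
fn execute(model: &str) -> Result<(), HostError> {
    let record = load_record(model)?;
    if record.is_empty() {
        return Err(HostError::Tee("empty model record".into()));
    }
    println!("Loaded {} bytes", record.len());
    Ok(())
}

fn main() {
    if let Err(e) = execute("model.bin") {
        eprintln!("error: {e}");
    }
}
```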



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.
+
+## Running the Host
+
+There are three subcommands in the host:
+
+1. Train
+
+    Trains a new model and exports it to the given path.
+
+    ``` shell
+    cargo run -- train -o model.bin
+    ```
+
+    This subcommand downloads the MNIST dataset, trains a new model, and outputs
+    the model to the given path.
+
+    For detailed usage, run: `cargo run -- train --help`.
+
+2. Infer
+
+    Loads a model from the given path, tests it with a given image, and prints
+    the inference result.
+
+    ```shell
+    # cargo run -- infer -m model.bin -b samples/7.bin -i samples/7.png
+    cargo run -- infer [-b ${binary_path} | -i ${image_path}]

Review Comment:
   Is `-m` mandatory? It appears on line 41 but not on line 42.



##########
examples/mnist-rs/README.md:
##########
@@ -0,0 +1,92 @@
+# MNIST-rs
+
+This demo project demonstrates how to train and perform inference in TEE.
+
+## Install the TAs
+
+There are two TAs in the project:
+
+| TA | UUID | Usage |
+| ---- | ---- | ---- |
+| Train | 1b5f5b74-e9cf-4e62-8c3e-7e41da6d76f6 | for training new Model|
+| Inference | ff09aa8a-fbb9-4734-ae8c-d7cd1a3f6744 | for performing reference|
+
+The `Train TA` consumes more memory than `Inference TA`.
+
+Make sure to install them before attempting to use any functions.
+
+## Running the Host
+
+There are three subcommands in the host:
+
+1. Train
+
+    Trains a new model and exports it to the given path.
+
+    ``` shell
+    cargo run -- train -o model.bin

Review Comment:
   Run `./mnist-rs train` after building the CA, rather than `cargo run`?
Consider cross-building and then running on a different architecture.
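   Agreed. A possible sketch for the README; the paths, binary name, and Makefile target here are assumptions based on the usual example layout, not verified against this PR:

```shell
# Cross-build the CA and both TAs with the SDK toolchains (hypothetical target)
make -C examples/mnist-rs

# Copy the built CA binary (e.g. host/target/<target>/release/mnist-rs) and the
# two .ta files into the QEMUv8 shared directory, then inside the guest run:
./mnist-rs train -o model.bin
./mnist-rs infer -m model.bin -b samples/7.bin
```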



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: dev-unsubscr...@teaclave.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

