Hello openjfx-dev,

Because Windows is very open hardware-wise, it is difficult to prepare the 
backend for all possible hardware combinations, especially from a performance 
perspective. To make sure JavaFX performance does not degrade compared to the 
old D3D backend, we would like to call for volunteers to help performance-test 
the new backend. These tests will give us a general idea of how the backend 
behaves on different hardware and which areas of the backend to focus on as we 
move forward with the optimization effort for Direct3D 12.

So far we have tested the backend quite extensively on Intel integrated GPUs 
and done some testing on a machine with a recent discrete Nvidia GPU. We are 
primarily looking for tests of the Direct3D 12 backend on systems with a 
discrete AMD GPU, but any hardware combination is welcome - the more the 
merrier :). Also note that these tests require a Windows machine, as the D3D12 
backend is Windows-only.

We run performance testing using the RenderPerfTest JavaFX app located in the 
JavaFX repository under "tests/performance/animation/RenderPerfTest". I wrote a 
bash script to use with this app for performance-testing the backend. The 
script runs all available RenderPerfTest demos with a set number of objects 
per test, averages the FPS results from each run, and writes the results to a 
CSV file.
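As a rough illustration of the aggregation step (a simplified sketch for this 
email, not the actual script's code), averaging the FPS samples of one test 
could look like:

```shell
# Sketch only: average a list of per-run FPS values with awk, the way the
# script aggregates the results of repeated runs of one test case.
avg_fps() {
  # prints the arithmetic mean of its arguments with one decimal place
  printf '%s\n' "$@" | awk '{ sum += $1 } END { printf "%.1f\n", sum / NR }'
}

avg_fps 58.2 60.1 59.7   # prints 59.3
```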


How to run perf tests:

  1.  Download the test script - 
https://gist.github.com/lukostyra/bc354a5fd845b82805ffb3380caebe9a
  2.  Get a JavaFX Direct3D 12 build - either download the EA2 SDK from [ 
https://jdk.java.net/javafxdirect3d12/ ] or build it from scratch from the 
direct3d12 [ https://github.com/openjdk/jfx-sandbox/tree/direct3d12 ] branch 
(make sure to build with -PCONF=Release; at the time of writing there is no 
functional difference between the sandbox repo and the EA2 build).
  3.  Put the script in the same directory as renderperf/RenderPerfTest.java - 
I usually copy the contents of tests/performance/animation/RenderPerfTest/src 
into a separate directory where the script is located. The script looks in the 
current directory specifically for a "renderperf" directory containing the 
"RenderPerfTest.java" source file.
  4.  Ensure nothing else is running on your system.
  5.  Perform a baseline test run with D3D backend:
   ./run_renderperf_all.sh -j <path_to_jfx_sdk> -b d3d -r 3
When the script is done, it should produce a 
"RenderPerf_results_d3d-<date>-<time>.csv" file. This will let us establish 
how your machine performs on RenderPerf's test cases with the old D3D backend.
  6.  Perform a perf test run for D3D12 backend:
   ./run_renderperf_all.sh -j <path_to_jfx_sdk> -b d3d12 -r 3
This should produce a "RenderPerf_results_d3d12-<date>-<time>.csv" file. These 
results will be compared against the D3D baseline.
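If you want a quick look at how the two runs compare before replying, 
something like the following works, assuming each results CSV has lines of the 
form "TestName,AvgFPS" and both files list the tests in the same sorted order 
(an assumption about the CSV layout, so adjust the column indices if yours 
differs):

```shell
# Prints the per-test FPS ratio (d3d12 / d3d); values below 1.0 mean the
# D3D12 backend was slower on that test.
compare_csv() {
  join -t, "$1" "$2" | awk -F, '{ printf "%s,%.2f\n", $1, $3 / $2 }'
}

# Tiny illustrative inputs (not real results):
printf 'Arcs,60.0\nCircles,30.0\n' > d3d.csv
printf 'Arcs,90.0\nCircles,45.0\n' > d3d12.csv
compare_csv d3d.csv d3d12.csv   # prints "Arcs,1.50" and "Circles,1.50"
```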

Once you are done, reply to this email thread with details of what hardware 
you ran the tests on and the contents of both CSV files, specifying which came 
from which test run - I'm pretty sure attachments will be stripped when 
sending to the mailing list, so to make sure the results reach us, please 
paste them inline. Also include the console output of running any JavaFX app 
on the D3D12 backend with "-Dprism.verbose=true".
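For gathering that verbose output, an invocation along these lines should 
work (the SDK path and application name are placeholders to fill in; 
"-Dprism.verbose=true" and "-Dprism.order" are the Prism properties mentioned 
in this email):

```shell
# Placeholder invocation -- substitute your own SDK path and application.
# With prism.verbose enabled, Prism prints the selected pipeline and
# device information to the console at startup.
java -Dprism.verbose=true -Dprism.order=d3d12 \
     --module-path <path_to_jfx_sdk>/lib \
     --add-modules javafx.controls \
     YourApp
```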

Note that these tests take quite a while to run. A single run of each test 
case takes 15 seconds - 5 seconds of warm-up and 10 seconds of actual 
measurement where FPS is recorded. Testing every case, 3 runs each, for a 
single backend takes approximately 40 minutes - make sure your computer won't 
lock or turn off the screens during that process.
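The ~40-minute figure is easy to sanity-check. Assuming on the order of 50 
test cases (an illustrative count; the exact number depends on the script's 
test array), the arithmetic works out as:

```shell
# 50 cases x 3 runs x 15 s per run, in seconds and whole minutes.
# The case count is an assumption for illustration only.
total=$(( 50 * 3 * 15 ))
echo "$total seconds, ~$(( total / 60 )) minutes"   # prints "2250 seconds, ~37 minutes"
```

The remaining few minutes come from JVM startup and teardown between runs.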


Test script reference:
    ./run_renderperf_all.sh -j/--jfx <path_to_jfx_sdk> [-b/--backend <backend>] 
[-r/--runs <runs>]

Where:
    -j, --jfx  -  required; path to the directory where the JavaFX SDK is 
located (the one containing the JavaFX bin and lib folders)
    -b, --backend  -  optional, defaults to "d3d12"; shorthand for which Prism 
backend to use. Whatever is set here is forwarded to the "-Dprism.order" 
property.
    -r, --runs  -  optional, defaults to 3; how many times each test case 
should run. The script averages the FPS results across these runs.

Notes:
  * Closing the currently running test during the warm-up stage (the first 5 
seconds after a test case starts) should stop the test script early.
  * The script starts by declaring an array of test cases and object counts to 
run. We recommend leaving those as they are; however, if a test fails to load 
with a "Timeout" message (the script will then exit early) or performs 
significantly worse on the D3D backend (below 20 FPS), you can lower its 
object count and retry.
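For reference, the arrays at the top of the script have roughly the following 
shape (the test names and counts below are made up for illustration, not the 
script's actual values):

```shell
# Hypothetical shape of the declarations in run_renderperf_all.sh.
TESTS=("Arcs" "Circles" "Images")
OBJECT_COUNTS=(1000 1000 500)

# Lowering the count for one problematic test is a one-element edit:
OBJECT_COUNTS[2]=250
echo "${TESTS[2]} -> ${OBJECT_COUNTS[2]}"   # prints "Images -> 250"
```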


Let me know if you run into any problems running the tests. And thanks in 
advance for your help!

Regards,
Lukasz
