From: Giovanni Mascellani <gmascellani@codeweavers.com>
---
 gitlab/README    |  8 ++++----
 gitlab/build.yml | 19 +++++++++++++++++--
 2 files changed, 21 insertions(+), 6 deletions(-)

diff --git a/gitlab/README b/gitlab/README
index b04a3c8f..ef4d0bed 100644
--- a/gitlab/README
+++ b/gitlab/README
@@ -11,10 +11,10 @@ testing, and uploads it to the GitLab container registry. The Docker
 script is in the file image.docker.
 
 The file build.yml contains the actual testing targets. Currently
-vkd3d is tested on Linux x86-64, with two different Vulkan drivers
-(both from Mesa): llvmpipe (a software implementation) and RADV (a
-hardware implementation backed by an AMD GPU). The testing logs are
-available as CI artifacts.
+vkd3d is tested on Linux, on x86-64 and i386, each architecture with
+two different Vulkan drivers (both from Mesa): llvmpipe (a software
+implementation) and RADV (a hardware implementation backed by an AMD
+GPU). The testing logs are available as CI artifacts.
 
 Some custom runner configuration is required in order to run the tests
 on an AMD GPU. Specifically, a runner tagged with `amd-gpu' must be
diff --git a/gitlab/build.yml b/gitlab/build.yml
index c10d64a6..780d645b 100644
--- a/gitlab/build.yml
+++ b/gitlab/build.yml
@@ -25,15 +25,30 @@
     paths:
       - artifacts
 
-build-radv:
+build-radv-64:
   extends: .build
   tags:
     - amd-gpu
   variables:
     VK_LOADER_DRIVERS_SELECT: 'radeon_*'
 
-build-llvmpipe:
+build-llvmpipe-64:
   extends: .build
   allow_failure: true
   variables:
     VK_LOADER_DRIVERS_SELECT: 'lvp_*'
+
+build-radv-32:
+  extends: .build
+  tags:
+    - amd-gpu
+  variables:
+    VK_LOADER_DRIVERS_SELECT: 'radeon_*'
+    CC: 'gcc -m32'
+
+build-llvmpipe-32:
+  extends: .build
+  allow_failure: true
+  variables:
+    VK_LOADER_DRIVERS_SELECT: 'lvp_*'
+    CC: 'gcc -m32'
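For reference, VK_LOADER_DRIVERS_SELECT is the Vulkan loader's driver-selection filter; it is what pins each job above to either RADV (`radeon_*`) or lavapipe (`lvp_*`). Below is a minimal sketch of how one might check locally which devices a given filter leaves visible. This is not part of the patch; the file name and build commands are hypothetical, and only standard Vulkan API calls are used.

```c
/* vk_list_devices.c -- hypothetical helper, not part of the patch.
 * Lists the physical devices the Vulkan loader exposes, so the effect of
 * VK_LOADER_DRIVERS_SELECT can be checked locally, e.g.:
 *   gcc vk_list_devices.c -lvulkan     (or "gcc -m32" for the 32-bit case,
 *                                        provided 32-bit ICDs are installed)
 *   VK_LOADER_DRIVERS_SELECT='lvp_*' ./a.out
 */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app_info =
    {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "vk_list_devices",
        .apiVersion = VK_API_VERSION_1_0,
    };
    VkInstanceCreateInfo instance_info =
    {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app_info,
    };
    VkPhysicalDeviceProperties properties;
    VkPhysicalDevice devices[16];
    VkInstance instance;
    uint32_t i, count;

    if (vkCreateInstance(&instance_info, NULL, &instance) != VK_SUCCESS)
    {
        fprintf(stderr, "Failed to create a Vulkan instance.\n");
        return 1;
    }

    /* With a non-NULL array, "count" is in/out: capacity in, devices written out. */
    count = sizeof(devices) / sizeof(*devices);
    if (vkEnumeratePhysicalDevices(instance, &count, devices) < 0)
    {
        fprintf(stderr, "Failed to enumerate physical devices.\n");
        vkDestroyInstance(instance, NULL);
        return 1;
    }

    for (i = 0; i < count; ++i)
    {
        vkGetPhysicalDeviceProperties(devices[i], &properties);
        printf("%u: %s\n", i, properties.deviceName);
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Run with `VK_LOADER_DRIVERS_SELECT='lvp_*'` it should list only the lavapipe device, and with `'radeon_*'` only the RADV one, mirroring how the jobs above pick their driver.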
How valuable is this? Are there any behaviours that are actually architecture-specific that we should be concerned about?
On Thu Aug 31 21:28:08 2023 +0000, Zebediah Figura wrote:
> How valuable is this? Are there any behaviours that are actually
> architecture-specific that we should be concerned about?
At least once I've found a bug that only happened on i386 (a quirk with `scanf()`, if I remember correctly), so I'd say there is at least some value. It's also quite a simple change, so to me it doesn't require a particularly high added-value threshold. Also, a non-trivial number of programs using vkd3d are indeed 32-bit, so it makes sense to keep an eye on that.
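To make the architecture-specific angle concrete, here is a purely illustrative sketch (not the `scanf()` bug mentioned above, and not code from vkd3d) of the kind of thing that differs between the two builds: type widths change under `gcc -m32`, and with default flags i386 also generates x87 rather than SSE floating-point code.

```c
/* i386_quirks.c -- purely illustrative; not the scanf() bug mentioned above.
 * Build the same source twice and compare the output:
 *   gcc i386_quirks.c && ./a.out
 *   gcc -m32 i386_quirks.c && ./a.out
 */
#include <limits.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Linux x86-64 is LP64, i386 is ILP32: long, pointers and size_t shrink
     * from 64 to 32 bits with -m32, so overflow and truncation bugs can be
     * i386-only.  time_t and long double change size as well. */
    printf("long: %zu, void *: %zu, size_t: %zu, time_t: %zu, long double: %zu\n",
            sizeof(long), sizeof(void *), sizeof(size_t), sizeof(time_t),
            sizeof(long double));

    /* Range differences follow directly; e.g. a value read with scanf("%ld")
     * can overflow on i386 while fitting comfortably on x86-64. */
    printf("LONG_MAX = %ld\n", LONG_MAX);

    return 0;
}
```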
Yeah, I'd like to have 32-bit runs.
Incidentally, I'm much less convinced about the value of llvmpipe/lavapipe runs. It could have been a valuable baseline that everyone can have access to without requiring specific hardware, in the same way that swrast and softpipe used to be for OpenGL, but it's not. That's largely a consequence of its target being desktop applications like GNOME 3, Firefox, Chrome, and so on, and conformance largely being a non-goal.
I'm willing to humour Giovanni to see if we can make the tests pass on llvmpipe and provide something of value there, but if there are concerns about the number of configurations we're running, I'd much rather have radv-32 than llvmpipe-*.
This merge request was approved by Henri Verbeet.
On Mon Sep 4 13:24:07 2023 +0000, Henri Verbeet wrote:
> Yeah, I'd like to have 32-bit runs.
>
> Incidentally, I'm much less convinced about the value of llvmpipe/lavapipe
> runs. It could have been a valuable baseline that everyone can have access
> to without requiring specific hardware, in the same way that swrast and
> softpipe used to be for OpenGL, but it's not. That's largely a consequence
> of its target being desktop applications like GNOME 3, Firefox, Chrome, and
> so on, and conformance largely being a non-goal.
>
> I'm willing to humour Giovanni to see if we can make the tests pass on
> llvmpipe and provide something of value there, but if there are concerns
> about the number of configurations we're running, I'd much rather have
> radv-32 than llvmpipe-*.
I'm not exactly sure how conformance management differs between llvmpipe and swrast or softpipe, but at least some work is being done towards llvmpipe Vulkan conformance, and [at least some results have been achieved](https://www.khronos.org/conformance/adopters/conformant-products#submission_...). For the moment I don't know whether the failing tests are our fault or llvmpipe's (well, at least for !289 it seems the fault is indeed ours, but there are many other failures). Hopefully I'll eventually find some time to investigate.