By exporting it, we always make the new directories available
to subprocesses, regardless of whether the environment
variable existed before `nix-shell` was invoked.
This avoids the scenario where strictDeps is off and cross-compiled
XDG_DATA_DIRS content leaks into the environment. While that is
probably harmless for data such as manpages and completion scripts, it
would cause problems when XDG_DATA_DIRS is used to locate executables
or plugins. The Qt framework is known to use XDG_DATA_DIRS this way
and might have run into incompatibilities.
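A minimal sketch of the mechanism, assuming setup.sh's addToSearchPath
helper ($pkg here stands for the dependency being processed; the real
hook differs in detail):

    # Accumulate the dependency's /share directory, then export
    # unconditionally so the variable reaches subprocesses even if it
    # was unset before nix-shell started.
    addToSearchPath XDG_DATA_DIRS "$pkg/share"
    export XDG_DATA_DIRS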
XDG_DATA_DIRS is to /share as PATH is to /bin.
It is defined in the XDG Base Directory Specification:
https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html
While it originated with the X Desktop Group, it is not limited to the
X11 ecosystem, as evidenced by its use in bash-completion.
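To make the analogy concrete (store paths abbreviated, values purely
illustrative):

    # Both are colon-separated search paths; PATH is searched for
    # executables, XDG_DATA_DIRS for shared data:
    export PATH="/nix/store/...-hello/bin:$PATH"
    export XDG_DATA_DIRS="/nix/store/...-hello/share:$XDG_DATA_DIRS"
    # bash-completion, for example, looks for completion scripts under
    # <entry>/bash-completion/completions for each entry in the list.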
Removing ` && -d "$pkg/bin"` is safe, because `addToSearchPath`
already performs this directory-existence check.
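For reference, a simplified sketch of addToSearchPath as defined in
nixpkgs' setup.sh (the real helper delegates to
addToSearchPathWithCustomDelimiter):

    # Append $2 to the colon-separated variable named by $1, but only
    # if the directory actually exists -- the same check the removed
    # `-d "$pkg/bin"` test performed.
    addToSearchPath() {
        local varName="$1" dir="$2"
        if [ -d "$dir" ]; then
            export "$varName=${!varName:+${!varName}:}$dir"
        fi
    }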
Fixes cross-compilation when build == host != target == ppc64le.
Glibc invokes objcopy while cross-compiling to ppc64le, and the build
fails when the unprefixed objcopy does not understand the target
format.
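One way to see the mismatch (the target prefix shown is an assumption
following the usual nixpkgs triple; the exact invocation glibc makes
is not reproduced here):

    # `objcopy --info` lists the object formats and architectures a
    # given objcopy supports. The build platform's unprefixed objcopy
    # may not list the ppc64le target, while the prefixed cross tool
    # does.
    objcopy --info
    powerpc64le-unknown-linux-gnu-objcopy --info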
Pyspark switched to pinning py4j==0.10.9 in v3.0.0; see this commit:
https://github.com/apache/spark/\
commit/fc4e56a54c15e20baf085e6061d3d83f5ce1185d
Consequently, ever since nixpkgs bumped to pyspark v3.0.0 in this
commit:
https://github.com/NixOS/nixpkgs/\
commit/5181547ae6624b462919a806c4d0888e6e4630f4
the patch has no longer matched the 'py4j==0.10.7' string it
previously targeted.
The broken patch went unnoticed because the version of py4j pinned by
pyspark>=3.0.0 was the same as the py4j provided by nixpkgs.
However, a recent PR (#101636) bumped the version of py4j to 0.10.9.1 in
this commit:
https://github.com/NixOS/nixpkgs/\
commit/43a91282d66223c5cb978d53fbe1033f56dd7f2b
which caused the version pinned by pyspark to no longer match the
version provided by nixpkgs. FWIW, @jonringer flagged this issue on
another PR that tried to bump py4j: #100623.
My solution here is to update the patch's target string to match the
version found in pyspark's current setup.py.
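A hedged sketch of the resulting substitution in the derivation (the
replacement value shown is illustrative, not the exact change):

    # Match the pin string pyspark 3.x actually ships in setup.py.
    # Note that --replace silently does nothing when the string is
    # absent, which is exactly how the stale patch went unnoticed.
    substituteInPlace setup.py \
      --replace "py4j==0.10.9" "py4j"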