L4T R32.3.1 Notes
As of 01 Aug 2021, the dunfell-l4t-r32.3.1 branch was created off the last L4T R32.3.1-based commit into the dunfell branch, for users wishing to continue with the older BSP. The dunfell branch has been updated with L4T R32.6.1/JetPack 4.6.
As of 26 Apr 2020, the dunfell and zeus-l4t-r32.3.1 branches support L4T R32.3.1/JetPack 4.3 content for Jetson TX1, Jetson TX2, Jetson Nano, and Jetson AGX Xavier. (There is also thud-l4t-r32.3.1, but it is not actively maintained.)
There are several changes in this version of L4T that required updates to Tegra platform support in this layer.
Support for bootloader updates has been added to tegra210 (Jetson TX1 and Jetson Nano) platforms. The tegra186-redundant-boot recipe has been renamed to tegra-redundant-boot, which installs the l4t_payloader_updater_t210 script on tegra210 platforms. Note that bootloader redundancy on tegra210 differs from tegra186/tegra194 (for example, there are no A/B slots with failover). See the Bootloader chapter of the L4T documentation for details.
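As a minimal sketch, assuming your image does not already pull in the renamed recipe and you want the updater tooling installed, you could add it in conf/local.conf (dunfell-era append syntax):
IMAGE_INSTALL_append = " tegra-redundant-boot"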
With the wider support for bootloader updates and more module variants that may need different boot-time configuration files, BUP payload packages now support all variants for a MACHINE. Also added is a service that runs at boot time to populate the TNSPEC field of the /etc/nv_boot_control.conf file based on the contents of the EEPROM on the module, so the update tools can select the correct files out of the BUP payload for the specific module in the system. This differs from stock L4T, where the configuration file is written into the rootfs after the module's EEPROM has been read during the flashing process, but should result in the same TNSPEC as would be present after using L4T's flash.sh.
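To confirm that the service has populated the field, you can inspect the file on the target after boot; the TNSPEC value you see will depend on your module's EEPROM contents:
$ grep TNSPEC /etc/nv_boot_control.conf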
As of L4T R32.3.1, NVIDIA has stopped providing the Tegra Multimedia API kit with the BSP, so if you need the Multimedia API in your builds, you must download the kit to your NVIDIA_DEVNET_MIRROR directory. The NVIDIA-specific OpenGL extension header files that used to be extracted from the Multimedia API kit are now obtained from the graphics demos source package in the L4T BSP.
The MACHINE name for the original Jetson Nano developer kit (using SPI flash and an SDcard) has been changed from jetson-nano to jetson-nano-qspi-sd. This aligns with NVIDIA's naming and will make it easier to distinguish between the older kit and the upcoming newer kit based on the 0002 SKU that uses eMMC.
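To build for the renamed machine, set it in your build configuration, e.g. in conf/local.conf under your build directory:
MACHINE = "jetson-nano-qspi-sd"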
Builds for jetson-nano-qspi-sd now use only the unified SPI+SDcard flash layout XML file (flash_l4t_t210_spi_sd_p3448.xml), as this layout file has been updated for compatibility with bootloader updates.
Also changed are the workflows for flashing and creating SDcard images for the Nano. The tegraflash.zip package now includes two shell scripts: doflash.sh for flashing via USB (which now flashes both the QSPI flash and an SDcard mounted on the device), and dosdcard.sh for either creating a file containing an SDcard image or writing directly to an SDcard mounted on your development host.
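A sketch of the two workflows, assuming an image named demo-image built for jetson-nano-qspi-sd (the archive and image file names here are illustrative, and the exact arguments each script accepts may differ; check the scripts' usage output):
$ unzip demo-image-jetson-nano-qspi-sd.tegraflash.zip -d flashdir
$ cd flashdir
$ # With the device connected over USB in recovery mode, flash QSPI and SDcard:
$ sudo ./doflash.sh
$ # Or, on the development host, create an SDcard image file instead:
$ sudo ./dosdcard.sh demo-image.img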
TensorRT 6.0.1 has a more complicated packaging layout than prior versions, but has the same issue as prior versions: NVIDIA uses the exact same .deb package names for the Xavier-specific packages and the non-Xavier packages. To make it clearer which packages are which, the tensorrt recipe looks for the Xavier-specific packages in ${NVIDIA_DEVNET_MIRROR}/DLA and the non-Xavier packages in ${NVIDIA_DEVNET_MIRROR}/NoDLA. You must move the packages yourself once you have downloaded them using SDK Manager.
Example for Xavier:
$ cd ~/Downloads/nvidia/sdkm_downloads
$ mkdir DLA
$ mv tensorrt*.deb *libnvinfer*.deb libnv*parsers*.deb uff*.deb graphsurgeon*.deb DLA/
Example for all other platforms:
$ cd ~/Downloads/nvidia/sdkm_downloads
$ mkdir NoDLA
$ mv tensorrt*.deb *libnvinfer*.deb libnv*parsers*.deb uff*.deb graphsurgeon*.deb NoDLA/
The following notes from prior releases also apply.
JetPack 4.3 content cannot be downloaded anonymously from NVIDIA's servers. You must use NVIDIA SDK Manager to download the JetPack 4.3 Debian packages to your build host, then add this setting to your build configuration (e.g., in conf/local.conf under your build directory):
NVIDIA_DEVNET_MIRROR = "file://path/to/downloads"
By default, the SDK Manager downloads to a directory called Downloads/nvidia/sdkm_downloads under your $HOME directory, so use that path in the above setting.
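For example, with the default download location, the setting might look like the following (/home/builder is a placeholder; substitute your own home directory):
NVIDIA_DEVNET_MIRROR = "file:///home/builder/Downloads/nvidia/sdkm_downloads"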
If you ran the SDK Manager on Ubuntu 16.04 to download the JetPack packages, you should add the following setting to your build configuration:
CUDA_BINARIES_NATIVE = "cuda-binaries-ubuntu1604-native"
By default, the recipes assume you used Ubuntu 18.04 and reference that version of the CUDA host-side tools.