
AVNET 96Boards Dual Camera Mezzanine + Ultra96-V2 - Review

Scoring

Product Performed to Expectations: 10
Specifications were sufficient to design with: 10
Demo Software was of good quality: 8
Product was easy to use: 8
Support materials were available: 10
The price to performance ratio was good: 10
Total Score: 56 / 60
  • RoadTest: AVNET 96Boards Dual Camera Mezzanine + Ultra96-V2
  • Evaluation Type: Development Boards & Tools
  • Was everything in the box required?: Yes
  • Comparable Products/Other parts you considered: NVIDIA Jetson with two cameras, Raspberry Pi with Dual Camera adapter with Intel Neural Compute Stick, OAK-D OpenCV AI kit
  • What were the biggest problems encountered?: The example designs are well laid out and easy to follow; however, the video output uses the obsolete Video On-Screen Display IP core. This limits building the example projects to Vivado 2020.1 until the core is replaced by the LogiCORE Video Mixer IP core.

  • Detailed Review:

    Road Test Review: AVNET 96Boards Dual Camera Mezzanine + Ultra96-V2

    Introduction

    The Avnet Ultra96-V2 is a single board computer based on the Xilinx Zynq UltraScale+ MPSoC ZU3EG that conforms to the Linaro 96Boards CE specification.  The Ultra96-V2 features four Arm Cortex-A53 “application cores” suitable for embedded Linux, two Arm Cortex-R5 “real-time cores” suitable for RTOS and bare-metal mission-critical applications, as well as programmable logic resources.  The 96Boards ON Semiconductor Dual Camera Mezzanine is a camera accessory board compatible with the Ultra96-V2.  The camera mezzanine board features an ON Semiconductor AP1302 image processor with a high-speed MIPI-compliant interface.  Together, these boards represent a low-cost entry point into edge AI, computer vision, robotics, custom IP development, and embedded multi-core Arm development.

     

    In this RoadTest Review, I will walk through the hardware and software setup and demonstrate a first boot of the system.  Then I demonstrate the steps to build and/or modify the Xilinx hardware, generate a bootable PetaLinux image, and deploy it to the board.  I also show how to use the JTAG and UART interfaces to create a debug session over USB to a host PC.  Finally, I discuss some possibilities for future computer vision and AI inference projects that take advantage of the AVNET 96Boards Dual Camera Mezzanine + Ultra96-V2.

    Ultra96-V2 with Dual Camera Mezzanine

    Unboxing

    This Road Test Review included three components: the Ultra96-V2 Zynq UltraScale+ MPSoC board (AES-ULTRA96-V2-G), the 96Boards Dual Camera Mezzanine Card (AES-ACC-U96-ONCAM-MEZ), and the power supply (AES-ACC-U96-4APWR). Each board comes with standard ESD packaging and a quick start guide. The Ultra96-V2 also includes a 16GB micro SD card and a voucher for an SDSoC license from Xilinx, which is required to build the Ultra96-V2 dual camera examples.  The power supply provides 4 A at 12 V, is auto-ranging 100-240 VAC, 50/60 Hz, and includes international power cords.

    Ultra96 and 96Boards Dual Camera Mezzanine Boxes open

    Ultra96-V2 box contents

    Dual Camera Mezzanine box contents

    Setup and First Boot

    Setup is straightforward.  With the power off, the dual camera mezzanine board attaches to the Ultra96 board by aligning the 60-pin high-speed expansion header (white) and the 40-pin low-speed expansion header (black), then applying force until the 60-pin connector is firmly seated and the boards are roughly parallel.  I found that placing my thumbs on the Ultra96-V2 heat sink, my left-hand fingers on the mezzanine 40-pin expansion header, and my right-hand fingers between the cameras in the unpopulated areas of the expansion board worked best.

    Connector alignment

    Pressing down on dual cam mezzanine

    Prebuilt “Out of Box” Image

     

    Avnet provides a prebuilt “out of box” image that contains a demo application that streams video from each camera to a 1920x1080 monitor over DisplayPort. This boot image demonstrates some of the capabilities of the board and provides a test to make sure the hardware is functional and hooked up correctly.  The prebuilt demo image and instructions are available on the Element14 Dual Camera Mezzanine product page as well as in the Getting Started Guide.  While booting this image is a worthwhile exercise, I will instead walk through the build process using the Avnet source code and build scripts.
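    For those who do want to try the prebuilt image first, one way to write it to the SD card from Linux is with dd (a sketch: the image filename and /dev/sdX are placeholders for the downloaded image and your SD card device node; a GUI tool such as Balena Etcher works as well):

```shell
# WARNING: this overwrites the entire card.  <prebuilt_image>.img and
# /dev/sdX are placeholders for the downloaded image and the SD card device;
# double-check the device node with lsblk before running.
sudo dd if=<prebuilt_image>.img of=/dev/sdX bs=4M status=progress conv=fsync
```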

    Building a PetaLinux Image

    While downloading the prebuilt image is a straightforward way to verify the hardware is functional, the next step is to build a project from source and boot from that image.  The source code for the “out of box” PetaLinux image and accompanying hardware is available in Avnet’s GitHub repos (https://github.com/avnet).  This turns out to be a great resource for getting started with PetaLinux project configuration and video signal processing hardware designs using Xilinx IP cores.  The configuration files and build scripts are also a good reference for building projects via the Xilinx command line tools.

     

    For this section, I am using Vitis and Vivado 2020.1 in the /tools/Xilinx directory of my Ubuntu development system.  PetaLinux was installed to /tools/petalinux.  The Vitis, Vivado, and PetaLinux tools are available from the Xilinx website (registration required).  For the following steps, I used the SDSoC license provided with the Ultra96-V2 board because several IP components are not available with the free WebPACK edition.  The following steps assume the Vivado 2020.1 and PetaLinux environments are activated by sourcing settings64.sh and settings.sh from the respective tool directories.
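    For reference, activating the tool environments looks like the following on my system (a sketch assuming the install paths above; the exact version directory names may differ on your setup):

```shell
# Assumed install locations from this review; adjust for your system.
source /tools/Xilinx/Vivado/2020.1/settings64.sh
source /tools/Xilinx/Vitis/2020.1/settings64.sh
source /tools/petalinux/settings.sh
```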

     

    A very good description of the source directory structure and setup for this project is given in the following blog post:

    https://www.element14.com/community/groups/fpga-group/blog/2020/05/01/petalinux-git-howto.  I modified it slightly to reflect my working directory.

    $ mkdir -p ${HOME}/workspace/avnet

    $ cd ${HOME}/workspace/avnet/

    $ git clone https://github.com/Avnet/bdf.git

    $ git clone https://github.com/Avnet/hdl.git

    $ git clone https://github.com/Avnet/petalinux.git

    $ cd bdf

    $ git checkout master

    $ cd ../petalinux

    $ git checkout 2020.1

    $ cd ../hdl

    $ git checkout 2020.1

    I also closely followed the build process described in this blog post:

    https://www.element14.com/community/groups/fpga-group/blog/2021/01/14/ultra96-v2-dual-camera-mezzanine-petalinux-build-instructions.  Here the target is an SD card formatted with a 1GB FAT32 boot partition, with the remaining 15GB formatted as ext4.
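    One way to create that partition layout is with parted and mkfs (a sketch: /dev/sdX is a placeholder for your SD card device; verify the device node with lsblk before running, since these commands are destructive):

```shell
# WARNING: destructive.  /dev/sdX is a placeholder for the SD card device.
sudo parted /dev/sdX --script \
    mklabel msdos \
    mkpart primary fat32 1MiB 1GiB \
    mkpart primary ext4 1GiB 100%
sudo mkfs.vfat -F 32 -n BOOT /dev/sdX1   # 1GB FAT32 boot partition
sudo mkfs.ext4 -L rootfs /dev/sdX2       # remaining space as ext4 rootfs
```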

     

    $ cd ${HOME}/workspace/avnet/petalinux

    $ ./scripts/make_ultra96v2_dualcam.sh

     

    $ cd ${HOME}/workspace/avnet/petalinux/projects/ultra96v2_dualcam_2020_1

    $ cp ./images/linux/BOOT.BIN /media/${USER}/<UUID of FAT32 partition>/.

    $ cp ./images/linux/boot.scr /media/${USER}/<UUID of FAT32 partition>/.

    $ cp ./images/linux/image.ub /media/${USER}/<UUID of FAT32 partition>/.

    $ sudo rm -rf /media/${USER}/<UUID of ext4 partition>/*

    $ sudo tar xvf ./images/linux/rootfs.tar.gz -C /media/${USER}/<UUID of ext4 partition>/

    $ sync; sync

     

    After booting from the SD card and running the run_1920_1080 application with a monitor attached to the Ultra96 mini DisplayPort, the resulting images from both cameras are successfully displayed.

     

    Hardware Design

    The hardware design for the IP-based video processing chain is built as one of the steps of the “out of box” reference design using the Vivado command line tools.  The generated hardware design can be opened in Vivado, and serves as a useful reference design that demonstrates how the processing system, AXI bus, MIPI camera video input stream, and video output streams are implemented using Xilinx IP.  The design has been cleanly divided into a video capture block and a live video output block, as well as a PS system block, AXI interconnect, clock generation, GPIO, etc.

    Hardware Block Diagram

    Notes on Video IP Cores

    At the time of this writing, the PetaLinux build script, make_ultra96v2_dualcam.sh, will fail with Vivado 2020.2 or newer.  The build fails because this base design uses the Video On-Screen Display IP core, which has been obsoleted and is not supported after Vivado 2020.1.  New designs should use the LogiCORE Video Mixer IP core included with Vivado.

    OSD Video IP block

    JTAG Booting and Debug Session

     

    Booting the Ultra96-V2 from an SD card is the method described in the “out of box” instructions and is the default target for building the Ultra96-V2 Dual Camera example.  This means that every time we want to try out a modification to the system, we need to power down the Ultra96, eject the SD card, burn the new image on the development PC, insert the SD card with the new image into the Ultra96, and run the new image.  This is a very manual process, a little inconvenient for remote development, and very limiting for debug.  My preferred method for embedded development is to use a JTAG interface connected to a small “lab computer” on my workbench that I can access remotely from my development computer.

     

    The Ultra96-V2 JTAG and UART headers are located along the side of the PCB.  The headers use a 2 mm pitch and do not fit my USB-to-JTAG interface.  So rather than create an interface board, I opted to purchase the Ultra96 USB-to-JTAG/UART Pod Adapter Board (AES-ACC-U96-JTAG).

     


    USB JTAG+UART

     

    I removed the Dual Camera Mezzanine for the next steps to simplify the debug tests.  To test booting over JTAG to a tmpfs ramdisk, I modified the “out of box” project provided by Avnet, making the following change to the Avnet build script, ~/workspace/avnet/petalinux/scripts/make_ultra96v2.sh:

     

    #BOOT_METHOD='EXT4'

    BOOT_METHOD='INITRD'

     

    Running the build, similar to the steps above, will generate the necessary output files; however, the image will now be named ‘image_INITRD.ub’ instead of ‘image_EXT4.ub’.

     

    $ cd ${HOME}/workspace/avnet/petalinux

    $ ./scripts/make_ultra96v2.sh

    $ # wait for build to complete

    $ cd projects/ultra96v2_oob_2020_1

    JTAG Boot

    Avnet provides a Tcl script to boot via JTAG, which is invoked with the Xilinx xsdb command line tool:

    $ xsdb boot_jtag_INITRD.tcl

     

    attempting to launch hw_server                                                                                                                                        

    ****** Xilinx hw_server v2020.1

      **** Build date : May 27 2020 at 20:33:44

        ** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

     

    INFO: hw_server application started

    INFO: Use Ctrl-C to exit hw_server application

     

    INFO: To connect to this hw_server instance use url: TCP:127.0.0.1:3121

     

    INFO: Configuring the FPGA...

    INFO: Downloading bitstream: ./pre-built/linux/implementation/download.bit to the target.

    INFO: Downloading ELF file: ./pre-built/linux/images/pmufw.elf to the target.                                                                         

    INFO: Downloading ELF file: ./pre-built/linux/images/zynqmp_fsbl.elf to the target.                                                                   

    INFO: Downloading ELF file: ./pre-built/linux/images/u-boot.elf to the target.                                                                        

    INFO: Loading image: ./pre-built/linux/images/image_INITRD.ub at 0x04000000                                                                           

    INFO: Loading image: ./pre-built/linux/images/avnet-boot/avnet_jtag.scr at 0x20000000                                                                 

    INFO: Downloading ELF file: ./pre-built/linux/images/bl31.elf to the target.    

    $

     

    On a separate screen attach to the USB UART device to see the boot output and console login prompt.  In my case, the UART is /dev/ttyUSB1.  I used picocom, but any serial console emulator should work.

     

    $ picocom -b 115200 /dev/ttyUSB1

    Starting internet superserver: inetd.

    Starting syslogd/klogd: done

    Starting tcf-agent: OK

     

    PetaLinux 2020.1 ultra96v2-2020-1 /dev/ttyPS0

     

    ultra96v2-2020-1 login: root

    Password:

    root@ultra96v2-2020-1:~#

     

    We can check the available RAM (about 2GB), filesystem usage, and verify we see the four A53 ARM cores:

     

    root@ultra96v2-2020-1:~# free

                  total        used        free      shared  buff/cache  available

    Mem:        2037460      43344    1967304        208      26812    1937596

    Swap:            0          0          0

     

    root@ultra96v2-2020-1:~# df -h

    Filesystem                Size      Used Available Use% Mounted on

    devtmpfs                733.7M      4.0K    733.7M  0% /dev

    tmpfs                  994.9M    108.0K    994.7M  0% /run

    tmpfs                  994.9M    96.0K    994.8M  0% /var/volatile

     

    root@ultra96v2-2020-1:~# cat /proc/cpuinfo

    processor      : 0

    BogoMIPS        : 200.00

    Features        : fp asimd evtstrm aes pmull sha1 sha2 crc32 cpuid

    CPU implementer : 0x41

    CPU architecture: 8

    CPU variant    : 0x0

    CPU part        : 0xd03

    CPU revision    : 4

     

    processor      : 1

    BogoMIPS        : 200.00

    Features        : fp asimd evtstrm aes pmull sha1 sha2 crc32 cpuid

    CPU implementer : 0x41

    CPU architecture: 8

    CPU variant    : 0x0

    CPU part        : 0xd03

    CPU revision    : 4

     

    processor      : 2

    BogoMIPS        : 200.00

    Features        : fp asimd evtstrm aes pmull sha1 sha2 crc32 cpuid

    CPU implementer : 0x41

    CPU architecture: 8

    CPU variant    : 0x0

    CPU part        : 0xd03

    CPU revision    : 4

     

    processor      : 3

    BogoMIPS        : 200.00

    Features        : fp asimd evtstrm aes pmull sha1 sha2 crc32 cpuid

    CPU implementer : 0x41

    CPU architecture: 8

    CPU variant    : 0x0

    CPU part        : 0xd03

    CPU revision    : 4

     

    JTAG Debug

    Now that the target system is running, the Xilinx debugger can probe the system and control execution.  Below is the output from a simple debug session to view the available targets, halt the Cortex A53 cores, and read the stack pointer:

     

    Start the debugger

     

    $ xsdb

    ****** Xilinx System Debugger (XSDB) v2020.1

      **** Build date : May 27 2020-20:33:44

        ** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

     

    Connect to the hw_server

     

    xsdb% connect

    attempting to launch hw_server                                                                                                                        

                                                                                                                                                          

    ****** Xilinx hw_server v2020.1

      **** Build date : May 27 2020 at 20:33:44

        ** Copyright 1986-2020 Xilinx, Inc. All Rights Reserved.

     

    INFO: hw_server application started

    INFO: Use Ctrl-C to exit hw_server application

     

    INFO: To connect to this hw_server instance use url: TCP:127.0.0.1:3121

     

    tcfchan#0

     

    List the JTAG targets

     

    xsdb% jtag targets                  

      1  Avnet USB-to-JTAG/UART Pod V1 1234-oj1A                                                                                                          

         2  xczu3 (idcode 14710093 irlen 12 fpga)

         3  arm_dap (idcode 5ba00477 irlen 4)

    xsdb% jtag targets 1                                                                                                                                  

    xsdb% jtag targets                                                                                                                                    

      1* Avnet USB-to-JTAG/UART Pod V1 1234-oj1A

         2  xczu3 (idcode 14710093 irlen 12 fpga)

         3  arm_dap (idcode 5ba00477 irlen 4)

     

    Show the running cores:

    xsdb% targets                                                                                                                                         

      1  PS TAP

         2  PMU

            3  MicroBlaze PMU (Sleeping. No clock)

         4  PL

      5  PSU

         6  RPU

            7  Cortex-R5 #0 (Halted)

            8  Cortex-R5 #1 (Lock Step Mode)

         9  APU

           10  Cortex-A53 #0 (Running)

           11  Cortex-A53 #1 (Running)

           12  Cortex-A53 #2 (Running)

           13  Cortex-A53 #3 (Running)

     

    The four Cortex-A53 cores are in the running state (which agrees with /proc/cpuinfo from the Linux session).  From here we can do standard debug operations such as setting breakpoints, peeking/poking memory locations, resetting cores, etc.  See Xilinx UG1209 for XSCT/XSDB information: https://www.xilinx.com/support/documentation/sw_manuals/xilinx2019_1/ug1209-embedded-design-tutorial.pdf

     

    xsdb% targets

      1  PS TAP

        2  PMU

            3  MicroBlaze PMU (Sleeping. No clock)

        4  PL

      5  PSU

        6  RPU

            7  Cortex-R5 #0 (Halted)

            8  Cortex-R5 #1 (Lock Step Mode)

        9* APU

          10  Cortex-A53 #0 (External Debug Request, EL1(NS)/A64)

          11  Cortex-A53 #1 (External Debug Request, EL1(NS)/A64)

          12  Cortex-A53 #2 (External Debug Request, EL1(NS)/A64)

          13  Cortex-A53 #3 (External Debug Request, EL1(NS)/A64)

    xsdb% target 10                           

    xsdb% rrd -defs

          r0: (RW)        r1: (RW)        r2: (RW)        r3: (RW)        r4: (RW)

          r5: (RW)        r6: (RW)        r7: (RW)        r8: (RW)        r9: (RW)

        r10: (RW)      r11: (RW)      r12: (RW)      r13: (RW)      r14: (RW)

        r15: (RW)      r16: (RW)      r17: (RW)      r18: (RW)      r19: (RW)

        r20: (RW)      r21: (RW)      r22: (RW)      r23: (RW)      r24: (RW)

        r25: (RW)      r26: (RW)      r27: (RW)      r28: (RW)      r29: (RW)

        r30: (RW)        sp: (RW)        pc: (RW)      cpsr: (RW)      vfp     

        sys            dbg        acpu_gic     

     

    xsdb% rrd sp
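    Building on the session above, a few other common operations look like the following (a sketch: the address shown is purely illustrative; stop, mrd, bpadd, and con are standard XSDB commands documented in UG1209):

```
xsdb% stop                      # halt the selected core
xsdb% mrd 0x00100000            # read a word of memory (illustrative address)
xsdb% bpadd -addr 0x00100000    # set a breakpoint (illustrative address)
xsdb% con                       # resume execution
```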

     

    Additional information on the system architecture and the processing system is in the Zynq UltraScale+ MPSoC Software Developer Guide (UG1137).

    Summary and Next Steps

     

    The Avnet 96Boards Dual Camera Mezzanine + Ultra96-V2 is a versatile and affordable entry-level computer vision platform.  The Zynq UltraScale+ ZU3EG MPSoC has enough resources to support advanced IP-based video signal processing chains.  The Dual Camera Mezzanine Card attaches directly to the Ultra96-V2 and conforms to the 96Boards electrical and physical specifications.  The ON Semiconductor AP1302 image processor and controller conforms to MIPI specifications, allowing video signal processing chains to be built using Xilinx IP cores for fast, flexible development and prototyping.

     

    The example design projects from Avnet turned out to be a great way to get started with design and debug on the Ultra96-V2 and Dual Camera Mezzanine Card.  The Zynq UltraScale+ MPSoC is more powerful, but also more complex, than Zynq-7000 or Artix-7 systems, and having functional examples of PetaLinux PS configurations with IP-based PL designs, complete with build scripts, was very useful.  I will draw inspiration from these examples for my future Ultra96 projects.

     

    There are several projects that I would like to explore after this Road Test Review.  An interesting application is deep learning inference using the Xilinx Deep Learning Processing Unit (DPU) for convolutional neural networks (https://www.xilinx.com/products/intellectual-property/dpu.html).  The Ultra96 supports the DPU IP, and there are several worked examples available using deep learning models such as YOLO, SSD, and many others.  This would also be a good opportunity to become familiar with the Video Mixer IP core to overlay detection results on the displayed images.  The Dual Camera Mezzanine could also be used for stereo photography and stereo photogrammetry applications to extract 3D models.  One could envision this being integrated into a 3D navigation system for robotics or autonomous vehicles.  These topics will be discussed in future posts.

