
fpga-drive-aximm-pcie

This repo contains the example designs for the FPGA Drive FMC mated with several FPGA and MPSoC evaluation boards.

FPGA Drive FMC

Requirements

This project is designed for version 2019.2 of the Xilinx tools (Vivado/Vitis/PetaLinux). If you are using an older version of the Xilinx tools, then refer to the release tags to find the version of this repository that matches your version of the tools.
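
For Git users, a minimal sketch of fetching the repo and switching to a release tag that matches your tool version (run git tag to see which tags exist; the checkout target below is a placeholder):

  git clone https://github.com/fpgadeveloper/fpga-drive-aximm-pcie.git
  cd fpga-drive-aximm-pcie
  # List the release tags, then check out the one matching your Xilinx tools version:
  git tag
  git checkout <tag-name>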

In order to test this design on hardware, you will need the following:

  • Vivado 2019.2
  • Vitis 2019.2
  • PetaLinux SDK 2019.2
  • FPGA Drive - for connecting a PCIe SSD
  • M.2 PCIe Solid State Drive
  • One of the supported carriers listed below

Supported carrier boards

Description

These are the example designs for the FPGA Drive and FPGA Drive FMC adapters that allow connecting NVMe SSDs to FPGAs via PCIe edge connectors and FPGA Mezzanine Card (FMC) connectors.

The bare-metal software application reports the status of the PCIe link and performs enumeration of the detected PCIe endpoints (i.e. the SSDs). The project also contains scripts to build PetaLinux for these platforms, allowing the SSDs to be accessed from the Linux operating system.

Single SSD designs

FPGA Drive FMC single load

The projects in this repo without the "_dual" suffix are intended to be used with only one SSD loaded, as shown in the above image. The SSD should be loaded into the first M.2 slot, labelled SSD1. If you are using the older version of the FPGA Drive FMC (Rev-B), which has only one M.2 connector, you will only be able to use the single SSD designs.

Dual SSD designs

FPGA Drive FMC dual load

The projects in this repo with the "_dual" suffix are intended to be used with two SSDs loaded, as shown in the above image. The dual designs may not function as expected if only one SSD is loaded. If you are using the older version of the FPGA Drive FMC (Rev-B), which has only one M.2 connector, you will not be able to use the dual designs.

At the moment there are dual designs for these carriers:

  • KCU105
  • ZCU106
  • ZCU111

Build instructions

To use the sources in this repository, please follow these steps:

Windows users

  1. Download the repo as a zip file and extract the files to a directory on your hard drive, or, if you are a Git user, clone the repo to your hard drive.
  2. Open Windows Explorer and browse to the repo files on your hard drive.
  3. In the Vivado directory, you will find multiple batch files (*.bat). Double-click the batch file that is appropriate to your hardware; for example, double-click build-zedboard.bat if you are using the ZedBoard. This will generate a Vivado project for your hardware platform (a sketch of what the batch file runs is shown after this list).
  4. Run Vivado and open the project that was just created.
  5. Click Generate bitstream.
  6. When the bitstream is successfully generated, select File->Export->Export Hardware. In the window that opens, tick "Include bitstream" and "Local to project".
  7. Return to Windows Explorer and browse to the Vitis directory in the repo.
  8. Double click the build-vitis.bat batch file. The batch file will run the build-vitis.tcl script and build the Vitis workspace containing the hardware design and the software application.
  9. Run Xilinx Vitis and select the workspace to be the Vitis directory of the repo.
  10. Connect and power up the hardware.
  11. Open a PuTTY terminal to view the UART output.
  12. In Vitis, select Xilinx Tools->Program FPGA.
  13. Right-click on the application and select Run As->Launch on Hardware (Single Application Debug)
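
Note: the batch files are thin wrappers that launch Vivado in batch mode and run the corresponding Tcl build script. A minimal sketch of the equivalent command, assuming Vivado is on your PATH (the actual batch files in the repo may set additional options):

  vivado -mode batch -source build-zedboard.tcl

This produces the same project as double-clicking the batch file in step 3.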

Linux users

  1. Download the repo as a zip file and extract the files to a directory on your hard drive, or, if you are a Git user, clone the repo to your hard drive.
  2. Launch the Vivado GUI.
  3. Open the Tcl console from the Vivado welcome page. In the console, cd to the repo files on your hard drive and into the Vivado subdirectory. For example: cd /media/projects/fpga-drive-aximm-pcie/Vivado.
  4. In the Vivado subdirectory, you will find multiple Tcl files. To list them, type exec ls {*}[glob *.tcl]. Determine the Tcl script for the example project that you would like to generate (for example: build-zcu104.tcl), then source the script in the Tcl console. For example: source build-zcu104.tcl (a condensed sketch of this console flow appears after this list).
  5. Vivado will run the script and generate the project. When it's finished, click Generate bitstream.
  6. When the bitstream is successfully generated, select File->Export->Export Hardware. In the window that opens, tick "Include bitstream" and "Local to project".
  7. To build the Vitis workspace, open a Linux command terminal and cd to the Vitis directory in the repo.
  8. The Vitis directory contains the build-vitis.tcl script that will build the Vitis workspace containing the hardware design and the software application. Run the build script by typing the following command: <path-of-xilinx-vitis>/bin/xsct build-vitis.tcl. Note that you must replace <path-of-xilinx-vitis> with the actual path to your Xilinx Vitis installation.
  9. Run Xilinx Vitis and select the workspace to be the Vitis subdirectory of the repo.
  10. Connect and power up the hardware.
  11. Open a PuTTY terminal to view the UART output.
  12. In Vitis, select Xilinx Tools->Program FPGA.
  13. Right-click on the application and select Run As->Launch on Hardware (Single Application Debug)
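
For reference, here is a condensed sketch of the Vivado Tcl console flow from steps 3-6 above, assuming the repo was cloned to /media/projects/fpga-drive-aximm-pcie and you are targeting the ZCU104; the wrapper file name is illustrative, and steps 5-6 can equally be done from the GUI as described:

  # In the Vivado Tcl console:
  cd /media/projects/fpga-drive-aximm-pcie/Vivado
  source build-zcu104.tcl
  # Tcl equivalents of steps 5 and 6 (generate the bitstream, then export the hardware):
  launch_runs impl_1 -to_step write_bitstream
  wait_on_run impl_1
  # write_hw_platform is the Tcl counterpart of File->Export->Export Hardware;
  # the .xsa file name below is illustrative:
  write_hw_platform -fixed -include_bit -force zcu104_wrapper.xsa

From there, continue with steps 7-8 in a regular shell to build the Vitis workspace.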

Stand-alone software application

A stand-alone software application can be built for this project using the build script contained in the Vitis subdirectory of this repo. The build script creates a Vitis workspace containing the hardware platform (exported from Vivado) and a stand-alone application. The application originates from an example provided by Xilinx, located in the Vitis installation files. The program demonstrates basic usage of the stand-alone driver, including how to check link-up, link speed and the number of lanes used, as well as how to perform PCIe enumeration. The original example applications can be found here (a sketch of the kind of XSCT script the build uses follows the list):

  • For the AXI PCIe designs: C:\Xilinx\Vitis\2019.2\data\embeddedsw\XilinxProcessorIPLib\drivers\axipcie_v3_1\examples\xaxipcie_rc_enumerate_example.c
  • For the XDMA designs: C:\Xilinx\Vitis\2019.2\data\embeddedsw\XilinxProcessorIPLib\drivers\xdmapcie_v1_0\examples\xdmapcie_rc_enumerate_example.c
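
As a point of reference, build-vitis.tcl drives the Vitis XSCT (Xilinx Software Command-line Tool) API to assemble the workspace. The following is a minimal sketch of that style of script, not the repo's actual script: the workspace path, platform and application names, .xsa path and target processor are all illustrative, and the real script also copies in the modified driver and example sources:

  # Run with: xsct build-sketch.tcl
  setws ./workspace
  # Create a platform from the hardware exported by Vivado (path is an example):
  platform create -name {zcu104_platform} -hw {../Vivado/zcu104/zcu104_wrapper.xsa}
  # Add a bare-metal domain targeting the first A53 core (processor is board-dependent):
  domain create -name {standalone_domain} -os {standalone} -proc {psu_cortexa53_0}
  platform generate
  # Create and build the application project:
  app create -name {pcie_enumerate} -platform {zcu104_platform} -domain {standalone_domain} -template {Empty Application}
  app build -name {pcie_enumerate}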

PetaLinux

This repo contains a script and configuration files for a PetaLinux project for each one of the hardware platforms. To build the PetaLinux project, please refer to the "README.md" file in the PetaLinux subdirectory of this repo.

Board Specific Notes

KCU105

  • To keep these designs free of paid IP, the KCU105's on-board Ethernet port is not connected in this design. The KCU105's Ethernet PHY has an SGMII interface which is not supported by the free AXI EthernetLite IP.

VC709 and KCU105

These designs are based on the AXI Bridge for PCI Express Gen3 Subsystem. To generate an example stand-alone application for these boards, the Vitis build script makes a local copy of the driver for the AXI Memory Mapped to PCIe Gen2 IP, with a few small modifications to make it work with the Gen3 core. If you use or modify these applications, be aware that they refer to the locally copied and modified driver located in EmbeddedSw/XilinxProcessorIPLib/drivers, and that this driver was originally designed for the Gen2 core. In other words, you can expect the driver to work for the example application that checks link-up, link speed/width and enumerates the endpoints, but anything else may fail due to differences between the driver code and the Gen3 IP specs.

PicoZed

Installation of PicoZed board definition files

To use this project on the PicoZed, you must first install the board definition files for the PicoZed into your Vivado installation.

The following folders contain the board definition files and can be found in this project repository at this location:

https://github.com/fpgadeveloper/fpga-drive-aximm-pcie/tree/master/Vivado/boards/board_files

  • picozed_7015_fmc2
  • picozed_7030_fmc2

Copy those folders and their contents into the C:\Xilinx\Vivado\2019.2\data\boards\board_files folder (the exact path may differ on your machine, depending on your Vivado installation directory).
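
Alternatively, if you would rather not copy files into your Vivado installation, Vivado can be pointed at the board files in the cloned repo with the board.repoPaths parameter. A sketch, assuming the repo was cloned to C:\projects (add the line to your Vivado_init.tcl to make it persistent):

  # Tell Vivado where to find additional board definition files:
  set_param board.repoPaths [list "C:/projects/fpga-drive-aximm-pcie/Vivado/boards/board_files"]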

PicoZed FMC Carrier Card V2

On this carrier, the GBTCLK0 of the LPC FMC connector is routed to a clock synthesizer/MUX, rather than being directly connected to the Zynq. In order to use the FPGA Drive FMC on the PicoZed FMC Carrier Card V2, you will need to reconfigure the clock synthesizer so that it feeds the FMC clock through to the Zynq. To change the configuration, you must reprogram the EEPROM (U14) where the configuration is stored. Avnet provides an SD card boot file that can be run to reprogram the EEPROM to the configuration we need for this project. The boot files have been copied to the links below for your convenience:

Just boot up your PicoZed FMC Carrier Card V2 using one of those boot files, and the EEPROM will be reprogrammed as required for this project. For more information, see the Hardware User Guide for the PicoZed FMC Carrier Card V2.

ZCU106

The ZCU106 has two HPC FMC connectors, HPC0 and HPC1. The HPC0 connector has enough connected gigabit transceivers to support 2x SSDs, each with an independent 4-lane PCIe interface. The HPC1 connector has only 1x connected gigabit transceiver, so it can only support 1x SSD (SSD1) with a 1-lane PCIe interface. This repo contains designs for both of these connectors.

ZCU111

The ZCU111 has a single FMC+ connector that can support 2x SSDs, each with an independent 4-lane PCIe interface.

Troubleshooting

Check the following if the project fails to build or generate a bitstream:

1. Are you using the correct version of Vivado for this version of the repository?

Check the version specified in the Requirements section of this readme file. Note that this project is regularly updated to the latest version of Vivado, so you may have to refer to an earlier commit of this repo if you are using an older version of the tools.

2. Did you follow the Build instructions in this readme file?

All the projects in the repo are built, synthesised and implemented to a bitstream before being committed, so if you follow the instructions, there should not be any build issues.

3. Did you copy/clone the repo into a short directory structure?

Vivado doesn't cope well with long directory structures, so copy/clone the repo into a short directory structure such as C:\projects\. When working in long directory structures, you can get errors relating to missing files, particularly files that are normally generated by Vivado (FIFOs, etc.).

Contribute

We encourage contribution to these projects. If you spot issues or you want to add designs for other platforms, please make a pull request.

About us

This project was developed by Opsero Inc., a tight-knit team of FPGA experts delivering FPGA products and design services to start-ups and tech companies. Follow our blog, FPGA Developer, for news, tutorials and updates on the awesome projects we work on.