genicam/harvesters

🎉 TRIAL VERSION IS AVAILABLE! 🎉 Add support for ARM 64 to the `genicam` package

Opened this issue · 48 comments

Hey guys, I am sorry for having kept you waiting so long. I have finally taken the first step! Could you tell me the following information, please:

  1. Your ARM 64 target,
  2. Its default compiler and its version, which would eventually build its Python 3 (it would be great if you could additionally paste the whole output of gcc -v, if possible), and
  3. The OS on the target and its version.

Anybody is welcome; please feel free to leave a comment in this thread. I would like to find the right target/toolchain. I may face a technical difficulty that blocks us from getting the right binary, but I can tell you now that I have at least confirmed I can build a binary, though it is not yet tested on any target.

Please keep your fingers crossed!

Thanks, Kazunari.

(from @clintlombard)

We're running the following:
Python version: 3.9
Python compiler: GCC 7.5.0
OS: Ubuntu 18.04 (L4T 32.4)
Target platform: Nvidia Xavier AGX

I'd appreciate it if you could give me the output from gcc -v. Thanks, Kazunari.

gcc -v output:

~$  gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)

@clintlombard Thank you for taking the action so quickly! I appreciate that!

running on Jetson Xavier NX:

$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 

@monzelr Thank you very much!

running on the Jetson Nano 4GB:

$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 

(Both of my Jetsons are running JetPack 4.4 with Ubuntu 18.04 LTS. To check this, use `sudo apt-cache show nvidia-jetpack`.)

Running in a Docker container (python:3.8-buster) on a Raspberry Pi 4 with an Ubuntu 20.04 base system

gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/8/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Debian 8.3.0-6' --with-bugurl=file:///usr/share/doc/gcc-8/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-8 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --disable-libphobos --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 8.3.0 (Debian 8.3.0-6)

Thanks for your efforts, kazunarikudo!

@tilman19 Hi, thank you for the information! Regards, Kazunari.

foxsr commented

running on Jetson NX with Jetpack 4.4 on Ubuntu 18.04.6 LTS

Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)

or a Docker container based on arm64v8/ubuntu-20.04:

Using built-in specs.
COLLECT_GCC=/usr/bin/gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/9/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 9.3.0-17ubuntu1~20.04' --with-bugurl=file:///usr/share/doc/gcc-9/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++,gm2 --prefix=/usr --with-gcc-major-version-only --program-suffix=-9 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib=auto --enable-objc-gc=auto --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 9.3.0 (Ubuntu 9.3.0-17ubuntu1~20.04)

Thank you for all the great work @kazunarikudo, looking forward to trying the arm64 port!

Hi, I'm using Ubuntu 20.04 inside the MacPro with M1:

Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/9/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu 9.3.0-17ubuntu1~20.04' --with-bugurl=file:///usr/share/doc/gcc-9/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++,gm2 --prefix=/usr --with-gcc-major-version-only --program-suffix=-9 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --with-target-system-zlib=auto --enable-objc-gc=auto --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 9.3.0 (Ubuntu 9.3.0-17ubuntu1~20.04)

Any updates on arm64 support?

@mahwhat Hi, thank you for sharing the information with me. I appreciate that. I have already built a package for an ARM target, and one of my good colleagues, Stefan B., and I are planning to work on it. Since there is always an official due date for every GenICam reference implementation release, I can't guarantee that we can release it this year, but it should be completed next year at the latest. Thank you for paying attention to the activity! Regards, Kazunari.

@foxsr Hi, thank you very much for sharing the information with me! Regards, Kazunari.

@kazunarikudo Hi, thank you for the update! If you need someone to help you build and test the package, I would like to help. I have already used the package in some projects on my old PC with Ubuntu 18.04 (amd64), and it works perfectly.

@kazunarikudo Hey, thanks for the effort you've put into Harvesters. It's helped me a lot. It would be great to see this working on ARM. If there is anything I can do to help with the development, please do let me know. Thanks again!

Any updates on this?

~$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 

Hi @kazunarikudo ,

Have you had any updates on ARM 64 compatibility or if we can help you with development?

@mahwhat and the other people who have been interested in it,

Thank you for your message. I can build the binary, but I do not have a target to run it on; I am happy to develop Harvester and the genicam package, but I am not a person who privately purchases a Jetson board...

Could you speak up if you are willing to try it, please? The binary would be provided under the license that the GenICam committee defines, and you should not complain about the quality. Quality-wise, you should never use it in production, of course.

I am willing 🙋‍♂️
Have been waiting for this for so long, great to see it finally come together!
Have a raspi 4 ready to go

🙋‍♂️

I'd like to test, too!
We have both Jetson Nano and Raspberry Pi here and already some experience with vendor drivers and Docker containers. So this could be a nice switch over!

Thank you for your efforts!!!

Hey @kazunarikudo ,

You can count on me to test the Harvester and genicam package for the ARM architecture.

Hi @kazunarikudo

I'd like to test the Harvester and GenICam ARM package as well. I have a Jetson Xavier NX and a Jetson Nano at my disposal for testing.

Hi @kazunarikudo
Great work on these libraries. I'm developing on Raspberry Pi 4 / Ubuntu Server and would be happy to test anything you have. I've done a lot with the libraries on Windows and I'm very impressed (and happy) - just need to get it going on an RPi4.
Thanks!

Hi @kazunarikudo
Thanks for your great work! I would also be very interested in an arm64 version, since I would like to use your package on an NVIDIA Xavier board. I would be very happy if I could support you with testing!
Thanks and looking forward to an arm64 version :)

Running on an NVIDIA Jetson AGX Xavier with Python 3.6.9 under NVIDIA Jetson Linux 34.1, which includes the Linux kernel 5.10, a UEFI-based bootloader, an Ubuntu 20.04-based root file system, NVIDIA drivers, necessary firmware, the toolchain, and more; see https://docs.nvidia.com/jetson/archives/r34.1/DeveloperGuide/index.html

gcc output is:

gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.4.0-1ubuntu1~18.04.1' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --with-sysroot=/ --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.4.0 (Ubuntu/Linaro 7.4.0-1ubuntu1~18.04.1) 

Hi @kazunarikudo
Thanks for your great work!
I would be interested in an arm64 version; I would like to use the package on an NVIDIA Jetson Nano with Ubuntu 18.04.

Dear Ladies and Gentlemen,

Thank you very much for your patience and I am so sorry for having kept you all waiting so long. I highly appreciate that.

Please find attached an archive that contains the AArch64 version of the genicam package, which should allow you to run Harvester on your AArch64 target. The supported Python versions are 3.7, 3.8, 3.9, and 3.10.

To get Harvester running on the target, you'll need to take care of the installation process yourself; excuse me, but this is inevitable until I upload the package to PyPI.

First, please install the harvesters package: clone the harvesters repository into an arbitrary directory and copy the harvesters directory into your Python's site-packages directory; assume we're working with Python 3.9:

$ git clone https://github.com/genicam/harvesters.git
$ cd harvesters
$ cp -r src/harvesters path/to/lib/python3.9/site-packages

Then you should be able to confirm that the site-packages directory now contains the harvesters directory. After that, install numpy:

$ python3.9 -m pip install numpy

Finally, install the genicam package that I have shared this time, using the following command:

$ python3.9 -m pip install path/to/genicam-1.2.0-cp39-cp39-manylinux2014_aarch64.whl

Now, you are ready to launch Harvester!
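
Since the shared wheel is tagged `cp39-cp39-manylinux2014_aarch64`, it can be worth checking that your interpreter actually matches that tag before installing; a minimal stdlib sketch (the expected values in the comments assume the Python 3.9 setup above):

```python
import platform
import sys

# The wheel tag cp39-cp39-manylinux2014_aarch64 encodes three requirements;
# print what this interpreter provides so a mismatch is easy to spot:
impl = platform.python_implementation()        # must be "CPython" for a "cp" tag
pyver = "{}{}".format(*sys.version_info[:2])   # must be "39" for "cp39"
machine = platform.machine()                   # must be "aarch64"
print(impl, pyver, machine)
```

If any of the three values differs, pip will reject the wheel as "not a supported wheel on this platform".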

Note that the genicam package is more or less experimental; it does not guarantee anything. I hope it works for you and gives you some pleasant excitement. Feel free to give me a report if you observe any unexpected behavior. It may take some time, but I will try to fix issues when I have spare time. If I can make it to the end, the package should be officially released with, hopefully, the next GenApi reference implementation release in late 2022.

Cheers,
Kazunari

Fantastic @kazunarikudo ! I can't wait to start working with this version on a Raspberry Pi 4! Thank you very much for your expertise and effort!

Sincerely,
-Dino

Great! Much appreciated, Kazunari!

I have tried installing it on a Jetson Nano, and it worked well with Python 3.9. I was able to grab images. Is it possible to provide wheels for GENICAM2? I see the wheels provided above are for GENICAM1, so I cannot install the latest version of harvesters.

Sincerely,
-Vignesh

Thank you for providing ARM64 support for this great library! This whole project is a real time saver!

I installed the trial version with Python 3.8.0 and am trying to run the following code on a Jetson Xavier. GCC output:

COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04) 

I am using the mvIMPACT_Acquire GenTL producer and an mvBlueFOX3 camera.

I have issues relating to the fetch_buffer() process. At first I thought it was related to #239, since I also get a harvesters.core.PayloadUnknown object loaded into the buffer, but only on the second iteration of my for loop. The first iteration loads the buffer perfectly, as expected. Only on the second iteration (i=1) of the loop below is the buffer filled with an unknown object. I know this because I stepped through the loop multiple times while debugging.

My code looks something like this:

for i in range(number_of_frames):
    # Load a buffer
    with self.ia.fetch_buffer() as buffer:
        # Grab the image component from the buffer
        component = buffer.payload.components[0]

        # Create a 2D array from the image data and save it as a
        # .tiff image (16-bit unsigned integer pixels = Mono16)
        _2d = component.data.reshape(component.height, component.width)
        Image.fromarray(_2d, mode='I;16').save("out" + str(i) + ".tiff")

I am unsure whether I should have posted this here or in #239, but since this problem only appeared when I moved to the Jetson, I thought it would fit here. Please correct me if I am wrong.

EDIT

I fixed my issue. The USB buffer size on my Jetson was the problem. This is easily resolved; there is a page documenting the process for a MatrixVision BlueFOX camera, but it should also apply to other camera models.

So far Harvester has been working nicely on ARM64; I will report if I come across any more problems. :) Thanks to all!
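
For anyone hitting the same symptom: the usual fix on Jetson-class boards is to raise the usbfs buffer memory limit, whose small default can truncate USB3 Vision transfers. A sketch of the commonly documented procedure (the value of 1000 MB is a typical suggestion, not a requirement; the change lasts until the next reboot):

```shell
# Show the current usbfs buffer limit (the default is often 16 MB)
cat /sys/module/usbcore/parameters/usbfs_memory_mb

# Raise it to 1000 MB until the next reboot (requires root)
sudo sh -c 'echo 1000 > /sys/module/usbcore/parameters/usbfs_memory_mb'
```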

Hi, is it possible to provide the wheel for Python 3.6 on ARM? Unfortunately, our Jetson device is still stuck with the default Python version of JetPack 4.4.

The library works amazingly well on x86, btw!

$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)

No cameras get listed for me on Ubuntu 20 ARM using the same script as on Windows 10, where it works. Are there any known CTI files that work for ARM?

I have built the CTI file on my ARM device using the scripts provided by Matrix Vision.

Thanks for the response. I just built the CTI file successfully. I don't get any cameras listed yet, but I will keep trying!

Edit:
I don't know why, but it gets detected now. Must have been a typo or something in the filename. Thanks again!

@hungpham2511 Hi, we as the GenICam committee have stopped supporting Python 3.6 since it has reached its end of life. I know that's unfortunate for you, but I hope you understand. (Thank you for trying out Harvester, anyway!)

@Dotile Hi, thank you for trying out Harvester with the ARM genicam package and for the practical report. Cheers, Kazunari.

@vigneshgarrapally Hi, we as the GenICam committee are finally about to bundle the ARM version of the genicam packages. The expected release will be at the end of this year. Yes, I know I have been taking a long time, and I admit I have postponed the expected release date many times. However, I hope you understand that there actually is a binary as of today and that we as the committee are working to ship it. Thanks, Kazunari.

Thank you very much, kazunarikudo, for your effort. I tried the trial package on a Jetson Xavier NX Developer Kit with Ubuntu 20.04.4 LTS and two different USB3 cameras from the vendors FLIR and Matrix Vision, and so far I can acquire images without problems. I will do additional operations (reading and setting parameters and deeper debugging).
I will stay in touch for the "official" release of the package.

I'm trying to use the trial version, but I'm getting an error saying:
ERROR: genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.

System is JetPack 4.5, ARM64, Python 3.8. Here is my gcc -v:

gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)

Am I doing something wrong? Is there an older version of harvesters I need to work with this?

@rvanderwerf I believe the default Python version is 3.6 on that JetPack. Can you confirm you are actually running Python 3.8?

If you are not using a venv for Python, can you try to install the package in the following way:

python3.8 -m pip install genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl

@MathijsNL Yes, using a Python 3.8.16 venv:

~/harvesters$ python --version
Python 3.8.16 | packaged by conda-forge | (a9dbdca6, Jan 29 2023, 10:19:50)
[PyPy 7.3.11 with GCC 11.3.0]

python -m pip install ~/GenICam_V3_4_0-Linux64_ARM_gcc75-PythonWheels/genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl
ERROR: genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.

python3.8 -m pip install ~/GenICam_V3_4_0-Linux64_ARM_gcc75-PythonWheels/genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl
ERROR: genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.

Really scratching my head here. I am using miniforge instead of miniconda, as I couldn't get miniconda running properly.

With -vvv for more verbosity (I don't see anything useful there, though):
python3.8 -m pip install -vvv ~/GenICam_V3_4_0-Linux64_ARM_gcc75-PythonWheels/genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl
Using pip 23.0 from /home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip (python 3.8)
Non-user install because site-packages writeable
Created temporary directory: /tmp/pip-build-tracker-2km0229r
Initialized build tracking at /tmp/pip-build-tracker-2km0229r
Created build tracker: /tmp/pip-build-tracker-2km0229r
Entered build tracker: /tmp/pip-build-tracker-2km0229r
Created temporary directory: /tmp/pip-install-gdin98et
Created temporary directory: /tmp/pip-ephem-wheel-cache-q9cijyj1
ERROR: genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.
Exception information:
Traceback (most recent call last):
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/cli/base_command.py", line 160, in exc_logging_wrapper
    status = run_func(*args)
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/cli/req_command.py", line 247, in wrapper
    return func(self, options, args)
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/commands/install.py", line 415, in run
    requirement_set = resolver.resolve(
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 73, in resolve
    collected = self.factory.collect_root_requirements(root_reqs)
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 491, in collect_root_requirements
    req = self._make_requirement_from_install_req(
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 452, in _make_requirement_from_install_req
    self._fail_if_link_is_unsupported_wheel(ireq.link)
  File "/home/ryvan/miniforge3/envs/genicam/lib/pypy3.8/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 138, in _fail_if_link_is_unsupported_wheel
    raise UnsupportedWheel(msg)
pip._internal.exceptions.UnsupportedWheel: genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.
Remote version of pip: 23.0
Local version of pip: 23.0
Was pip installed by pip? False
Removed build tracker: '/tmp/pip-build-tracker-2km0229r'
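
For what it's worth, pip can list every wheel tag the current interpreter accepts, which makes this kind of rejection easier to diagnose; note that the `python --version` output above reports a PyPy build, and PyPy advertises `pp38` tags rather than the `cp38` tag this wheel carries. A quick check (assuming your pip is recent enough to have the `debug` subcommand):

```shell
# List the wheel tags this interpreter accepts; the wheel's own tag
# (cp38-cp38-manylinux2014_aarch64) must appear in the "Compatible tags"
# list for pip to install it.
python3 -m pip debug --verbose
```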

Tried a 2nd Jetson Xavier AGX device (dev kit, newer JetPack version), same result with Python 3.8.0:

gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)

cat /etc/nv_tegra_release

R32 (release), REVISION: 5.2, GCID: 27767740, BOARD: t186ref, EABI: aarch64, DATE: Fri Jul 9 16:05:07 UTC 2021

 python3.8 -m pip install ~/GenICam_V3_4_0-Linux64_ARM_gcc75-PythonWheels/genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl
genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.

That is really strange; both Python and the OS seem to be OK.

I don't have such a device lying around so I can't test. Maybe someone else with such a device knows how to get it working.

And unrelated to your issue, you can add output or code in code blocks using this:
```
some code
```

Which will result in:

some code

That will make things more readable.

tried a 3rd device, a standard Jetson Xavier NX devkit:

gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/lib/gcc/aarch64-linux-gnu/7/lto-wrapper
Target: aarch64-linux-gnu
Configured with: ../src/configure -v --with-pkgversion='Ubuntu/Linaro 7.5.0-3ubuntu1~18.04' --with-bugurl=file:///usr/share/doc/gcc-7/README.Bugs --enable-languages=c,ada,c++,go,d,fortran,objc,obj-c++ --prefix=/usr --with-gcc-major-version-only --program-suffix=-7 --program-prefix=aarch64-linux-gnu- --enable-shared --enable-linker-build-id --libexecdir=/usr/lib --without-included-gettext --enable-threads=posix --libdir=/usr/lib --enable-nls --enable-bootstrap --enable-clocale=gnu --enable-libstdcxx-debug --enable-libstdcxx-time=yes --with-default-libstdcxx-abi=new --enable-gnu-unique-object --disable-libquadmath --disable-libquadmath-support --enable-plugin --enable-default-pie --with-system-zlib --enable-multiarch --enable-fix-cortex-a53-843419 --disable-werror --enable-checking=release --build=aarch64-linux-gnu --host=aarch64-linux-gnu --target=aarch64-linux-gnu
Thread model: posix
gcc version 7.5.0 (Ubuntu/Linaro 7.5.0-3ubuntu1~18.04)

nvidia version: 
# R32 (release), REVISION: 5.2, GCID: 27767740, BOARD: t186ref, EABI: aarch64, DATE: Fri Jul  9 16:05:07 UTC 2021

python3.8 -m pip install ../GenICam_V3_4_0-Linux64_ARM_gcc75-PythonWheels/genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl
genicam-1.2.0-cp38-cp38-manylinux2014_aarch64.whl is not a supported wheel on this platform.

I don't have a Xavier to spare, but I can donate a Jetson Nano to the cause for someone to test/debug :)
Or, if I can get the toolchain and repo info, I can try to build the wheels myself. Thanks!

Hi guys, could you show me the output from the following command first, please?

pip --version

Then, could you give it another try after the following command is executed?

pip install --upgrade pip

Note that you may need to call pip3 instead of pip, depending on your setup.
Thanks, Kazunari.

Did the latest update (1.4.2) include ARM as well or is the trial version still the one to use for ARM?

Edit:
What I meant is whether the latest update also fixes the genicam dependency on ARM. It seems it doesn't yet.

@kazunarikudo do you know when the Genicam update will be ready (approximately)?

Also wondering about this

Are there any updates to the arm64 version after GenICam_V3_4_0-Linux64_ARM_gcc75-PythonWheels.zip, or is that still the latest available?