dji-sdk/Guidance-SDK-ROS

Segmentation fault (core dumped)

ascslab opened this issue · 21 comments

Hi

When I try step 2:
rosrun guidance guidanceNode

the following segmentation fault was encountered:
vc4@VC4:~/catkin_ws$ rosrun guidance guidanceNode
Segmentation fault (core dumped)

A segmentation fault can be caused by many possible reasons, e.g. you don't have read/write permission on the USB port, or your Guidance device is not properly set up and connected. Please provide more information for us to diagnose.
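If it does turn out to be a USB permission problem, one generic workaround (this is only a sketch, not an official SDK instruction) is to confirm it once from a root shell and then add a udev rule so the device is readable without root. The idVendor/idProduct values below are placeholders; replace them with whatever lsusb reports for your Guidance unit.

lsusb
sudo -s
source ~/catkin_ws/devel/setup.bash
rosrun guidance guidanceNode      # if this runs, it was a permission issue
exit

echo 'SUBSYSTEM=="usb", ATTR{idVendor}=="XXXX", ATTR{idProduct}=="YYYY", MODE="0666"' | sudo tee /etc/udev/rules.d/99-guidance.rules
sudo udevadm control --reload-rules && sudo udevadm trigger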

Hi.

With reference to guidanceNode.cpp line 203, the if (e_motion == datatype) branch that prints the motion frame index and publishes the position:

It seems that position data is not available. Please advise how to obtain position data from Guidance. Thanks.

By calling select_motion in main(), you can obtain position data. I've modified the code. See if this version works.
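For anyone hitting the same thing, here is a rough sketch of the ordering in main() that this implies, using the usual calls from DJI_guidance.h (treat the exact function names and the my_callback handler as assumptions to check against your SDK version, not as the repository's exact code): the motion stream has to be selected before start_transfer() is called.

#include "DJI_guidance.h"

int my_callback(int data_type, int data_len, char *content);   // data callback, defined elsewhere

int main(int argc, char **argv)
{
    /* ... ROS node handle and publishers set up here ... */

    int err = init_transfer();           // open the connection to Guidance
    reset_config();                      // clear any previous subscriptions

    select_imu();                        // IMU stream
    select_ultrasonic();                 // ultrasonic stream
    select_velocity();                   // velocity in body frame (old data)
    select_motion();                     // global position/velocity (needs firmware 1.4.0+)

    set_sdk_event_handler(my_callback);  // register the data callback
    err = start_transfer();              // data starts flowing after this call

    /* ... ros::spin() or a sleep loop, then stop_transfer() and release_transfer() ... */
    return err;
}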

Hi.

I did not find any modified code attached to this email. Thanks.

Chan

@ascslab it's not in the email; the email is just a GitHub notification.

Please pull the latest code from the repository.
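Concretely, that means updating the checkout inside the catkin workspace and rebuilding. The path below assumes the package was cloned into ~/catkin_ws/src (adjust it to wherever you actually cloned Guidance-SDK-ROS):

cd ~/catkin_ws/src/Guidance-SDK-ROS
git pull
cd ~/catkin_ws
catkin_make
source devel/setup.bash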

Hi.

I downloaded the new code with select_motion(). However, the program never reaches line 206, which prints px, py, pz. It seems that e_motion == datatype is never true.

May I know what position data is provided? What is its accuracy, and how is the position obtained?

The five sensors seem to be independent. Will taking out one sensor affect the overall performance?

Chan.

Did you update the firmware of your Guidance? To use motion (position) data, you have to update to the latest version, 1.4.0, with the new Guidance Assistant software.

Motion data provides the velocity and position of the drone in a global frame (whereas the old velocity data provides only the velocity in the body frame). The accuracy depends on the environment.

The five sensors are not independent when calculating the motion data. But since the major sensor is the downward-facing one, taking out the lateral sensors will only affect accuracy slightly.
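For illustration, here is a sketch of how the motion branch of the data callback can read the global-frame position. The field names (frame_index, position_in_global_x/y/z) are taken from my recollection of the motion struct in DJI_guidance.h and should be verified against the header; this is not the repository's exact code.

#include <stdio.h>
#include "DJI_guidance.h"

// sketch of the motion branch inside the Guidance SDK data callback
int my_callback(int data_type, int data_len, char *content)
{
    if (e_motion == data_type && content != NULL)
    {
        motion *m = (motion *)content;   // struct defined in DJI_guidance.h
        printf("frame index: %u, global position (px,py,pz)=(%.2f,%.2f,%.2f)\n",
               m->frame_index,
               m->position_in_global_x, m->position_in_global_y, m->position_in_global_z);
        // publish the global position on the guidance/position topic here
    }
    return 0;
}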

Hi.

Can you advise how to update the software?

Download the Guidance Assistant Software (v1.3) from the developer website: http://developer.dji.com/guidance-sdk/downloads/

I created a pull request that fixes a message mismatch fatal error after upgrading to 1.4.0 #12

Hi

I have updated the firmware to the latest version 1.4.0 with the new Guidance Assistant software. I have also downloaded the new code with select_motion(). I have attached only one camera, on Bus 1. Does position() work with only one camera attached to Bus 1?

However, the program never reaches line 206, which prints px, py, pz. It seems that e_motion == datatype is never true, so px, py, pz are never printed.

Jason Chan

e_motion == datatype is never true because of a message mismatch error in our code. I have merged Asiron's and another developer's pull requests, so this problem should now be fixed.

Only one camera on VBUS1 is not sufficient for position calculation; the downward-facing camera is a must. With more cameras connected, the accuracy and robustness of the position data increase.

Hi.

May I know what accuracy position() provides?

Is there a demo for the position function?

Has the updated code on GitHub been tested for position?

Jason

Hi

I downloaded a new set of code from GitHub today and compiled it again. I ran the following:

rosrun guidance guidanceNode
rosrun guidance guidanceNodeTest

and I am able to view the left, right, and disparity images. The problems are the following:

  1. The disparity image does not look correct: objects near the camera do not appear darker in colour. Is there any calibration to be done? If yes, how do I do it? Thanks.

  2. The position printf("(px,py,pz)=(%.2f,%.2f,%.2f)\n") does not get printed out, as shown in the attachment. All five cameras are set up, with one facing downward. Only IMU and ultrasonic data get printed out.

  3. There is nothing on the topic when I run rostopic echo guidance/position.

Please see the attached code and advise. Thanks.

@ascslab I can't see any attachments.
As to your questions:

  1. Nearer objects have larger disparity values, so it is correct that they look brighter in the disparity image.
  2. You need to connect Guidance to the N1 flight controller with the latest firmware updated. You also need to place Guidance properly so that at least the downward-facing camera has a good view.
  3. Same as 2.
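On point 1, the usual pinhole stereo relation makes this concrete: depth Z = f * B / d, so a small distance to the camera means a large disparity value, which is rendered brighter. A tiny illustration (the focal length and baseline here are made-up placeholders; the real values come from the Guidance calibration):

#include <cstdio>

int main()
{
    const float f = 240.0f;   // focal length in pixels (placeholder)
    const float B = 0.15f;    // stereo baseline in metres (placeholder)

    // depth from disparity: nearer objects -> larger disparity -> brighter pixel
    for (float d = 4.0f; d <= 64.0f; d *= 2.0f)
        std::printf("disparity %5.1f px  ->  depth %.2f m\n", d, f * B / d);
    return 0;
}

On points 2 and 3, once Guidance is connected to the N1 and motion data is actually arriving, rostopic hz guidance/position (the topic name used in the question above) should report a nonzero rate; if it stays silent, the e_motion branch is still never being entered.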

Hi.

May I know what the N1 flight controller is?

Can you provide another email address? This email address prevents you from viewing attachments, I guess.

The N1 flight controller is the flight controller of the M100.

Hi

I still have problems getting the position data. I downloaded a new set of code from GitHub today and compiled it again. I ran the following:

rosrun guidance guidanceNode
rosrun guidance guidanceNodeTest

and I am able to view the left, right, and disparity images. The problems are the following:

  1. The disparity image does not look correct: objects near the camera do not appear darker in colour. Is there any calibration to be done? If yes, how do I do it? Thanks.

  2. The position printf("(px,py,pz)=(%.2f,%.2f,%.2f)\n") does not get printed out, as shown in the attachment. All five cameras are set up, with one facing downward. Only IMU and ultrasonic data get printed out.

  3. There is nothing on the topic when I run rostopic echo guidance/position.

Please see the attached code and pictures and advise. Thanks.

@ascslab I have answered exactly the same questions 7 days ago!