aws-solutions/distributed-load-testing-on-aws

CSV for input


I am using a JMeter test that uses a csv file containing x,y coordinates that the test will use to send requests. In JMeter, in the CSV Data Set Config element, I have the following format as the filename: ./File_Name
If I understand correctly, that is what the documentation states about using a relative path, but when I zip up the .jmx and the .csv and load the archive into the AWS testing rig, the test fails.

When I run the test with the input csv file in JMeter outside of AWS, it runs with no problem. Any help would be greatly appreciated.

You will just need to leave the path as the filename only, for example: File_Name. Then put your script and data file in the same location, zip them, and upload the archive to the test run. It works for me.
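To expand on the "same location" step: both files need to sit at the root of the zip, with no leading directory component, so that a bare filename in CSV Data Set Config resolves. A minimal Python sketch of building such a flat archive (the filenames here are made up; substitute your real .jmx and .csv):

```python
import zipfile

# Hypothetical names -- substitute your actual test plan and data file.
jmx = "test_plan.jmx"
csv = "File_Name.csv"

# Create stand-in files so the example runs end to end.
with open(jmx, "w") as f:
    f.write("<jmeterTestPlan/>")
with open(csv, "w") as f:
    f.write("-32546559.06,9854649.265\n")

# arcname=<bare name> keeps each entry at the archive root,
# which is what the filename-only reference in the .jmx expects.
with zipfile.ZipFile("test_bundle.zip", "w") as zf:
    zf.write(jmx, arcname=jmx)
    zf.write(csv, arcname=csv)

# Verify the archive contains flat entries only.
with zipfile.ZipFile("test_bundle.zip") as zf:
    print(zf.namelist())  # ['test_plan.jmx', 'File_Name.csv']
```

If `namelist()` shows entries like `subdir/File_Name.csv`, the data file was zipped inside a folder and the bare-filename lookup will fail.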

@YellowBird218 - thank you for the reply. I tried your suggestion, but it is still not working for me. I am using the "bzm - Concurrency Thread Group" with an HTTP request under it, and then the "CSV Data Set Config" element, where in the filename field I entered just the filename with no extension or prefix. Then I zipped up the .jmx and .csv, but no luck when I uploaded and ran the test. Are you doing anything different?

I figured out that the issue I am having is because I am using x,y coordinate points as the csv input. I have no issues using this type of data in JMeter, but it appears that the AWS load testing rig is not accepting the coordinate point data. My csv looks something like this:

-32546559.06,9854649.265
-32679512.52,9864519.895
-32148952.26,9862145.125

Does anyone have experience with using coordinate points as csv input within AWS load testing?

Hi @chgar101

The issue is that the default delimiter for the csv in JMeter is a comma. Have you tried changing the delimiter to a newline in the JMeter script you are uploading?
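To illustrate the point above: with the default comma delimiter, a row like `-32546559.06,9854649.265` is split into two separate fields, whereas a newline delimiter leaves the whole coordinate pair in a single variable. A tiny Python sketch of the same splitting behaviour (this is just an analogy for the delimiter setting, not JMeter itself):

```python
# One row from the sample csv above.
line = "-32546559.06,9854649.265"

# Default behaviour: comma delimiter -> two separate fields,
# so x and y land in different variables.
print(line.split(","))   # ['-32546559.06', '9854649.265']

# Newline delimiter: the full coordinate pair stays intact
# as one value per line.
print(line.split("\n"))  # ['-32546559.06,9854649.265']
```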

We are keeping this issue open pending a comment from the client.