mrodrig/json-2-csv

Question: How can the JSON be converted to CSV completely?

Closed this issue · 9 comments

I tried to use this package in my project, but I ran into an issue.

Here is my JSON:

{
  id: '123',
  list: [
    {a: 'a'},
    {b: 'b'}
  ]
}

The converted CSV is:

id,list
123,[{a:'a'},{b:'b'}]

That's why I describe this as an incomplete conversion.

I found another npm package, jsonexport, which converts the above JSON as follows:

id,list.a,list.b
123,a,b

This behavior is excellent! I wonder whether json-2-csv can do the same with some tweaks?

Thanks in advance!

Hi @Jeff-Tian. That's a good point. I actually changed the module around for v3 so that it uses JSON.stringify(...) for arrays, replacing the previous behavior that would convert arrays to something like "[a;b]". I felt that JSON.stringify was better than the previous behavior, but I still wasn't the biggest fan of leaving some values in JSON format, especially after seeing the use case mentioned in #97 (converting data for app users, not necessarily developers). The way json-2-csv is set up, it will already support this behavior if the underlying key generation module (deeks) and path evaluation module (doc-path) support it. Since those modules are essentially functionality from the original version of json-2-csv broken out into their own modules for reusability, I think I should be able to get this implemented pretty easily. 🙂

In fact, I already have the key generation module (deeks) changes necessary to identify and include keys for objects inside an array. Now I just need to update the path evaluation module (doc-path) to add support; then I should be able to update the dependencies here, add a few tests, and push out a new version. I'm hoping to have a new release either tonight (US Eastern Time) or tomorrow afternoon. Thanks again for reporting this! I'll post back once it's all set.

(Same functionality as requested in #80, so it's probably a good time to add it 🙂)
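
To illustrate the idea (this is only a conceptual sketch, not the actual deeks/doc-path code), here is roughly what "including keys for objects inside an array" means: key paths found on the array's elements get folded under the array's own path.

// Conceptual sketch only -- not the actual deeks/doc-path implementation.
// Collects dotted key paths, descending into objects found inside arrays.
function collectKeys(value, prefix = '') {
    if (Array.isArray(value)) {
        // Merge the keys of every object element under the array's own path.
        return [...new Set(value.flatMap((item) =>
            item && typeof item === 'object' ? collectKeys(item, prefix) : [prefix]
        ))];
    }
    if (value && typeof value === 'object') {
        return Object.keys(value).flatMap((key) =>
            collectKeys(value[key], prefix ? `${prefix}.${key}` : key)
        );
    }
    return [prefix];
}

console.log(collectKeys({ id: '123', list: [{ a: 'a' }, { b: 'b' }] }));
// => [ 'id', 'list.a', 'list.b' ]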

There are still a few cases that I need to handle in the key extraction module, so unfortunately I won't have an update for json-2-csv out tonight. Hopefully it'll be out tomorrow night. Sorry for the delay!

@mrodrig thank you very much, take your time.

Thanks for your understanding @Jeff-Tian. I was able to get the key extraction portion fixed up tonight, and that's published. There's one more enhancement I'd like to add, though: an option on this module that lets users specify the desired behavior (either a JSON.stringify'd value, which is more reliable when converting back to JSON from CSV, or the behavior described in your initial post). Sorry that this is taking longer than I initially expected!

Just wanted to give a quick update: it's almost there. I have json-2-csv (locally) generating the desired output from the given input example. Once I've added an option to deeks to specify whether array objects should be included in the generated keys, I'll finalize the json-2-csv behavior and get the changes out in a PR.

Just released this in NPM version 3.2.0.

If you specify {expandArrayObjects: true} in the options you pass to json2csv(...), you'll get the behavior you're looking for. Thanks for opening the issue and for your patience while I worked through the various issues I ran into along the way!
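
For the JSON from the original post, a minimal sketch (using the same callback-style json2csv(data, callback, options) call that appears in the example further down; the input is wrapped in an array since the examples in this thread pass a list of documents):

const converter = require('json-2-csv');

// Input from the original post, wrapped in an array of documents.
const docs = [{ id: '123', list: [{ a: 'a' }, { b: 'b' }] }];

converter.json2csv(docs, (err, csv) => {
    if (err) {
        return console.error(err);
    }
    // Per this thread, this should print:
    // id,list.a,list.b
    // 123,a,b
    console.log(csv);
}, { expandArrayObjects: true });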

Hey guys, can someone demonstrate how to apply the option parameter {expandArrayObjects: true}?

Thanks

Hi @svrebelo-eth. Sure, happy to help.

Here's a quick snippet showing the behavior of just the expandArrayObjects option:

const converter = require('json-2-csv');

const data = [
    {
        author: "mrodrig",
        modules: [ {name: "json-2-csv"} ]
    },
    {
        author: "jsmith",
        modules: []
    },
    {
        author: "anonymous_user_123",
        modules: [
            {name: "some_module"},
            {name: "another_module"},
            {name: "this other cool module"}
        ]
    }
];

converter.json2csv(data, (error, csv) => {
    if (error) {
        console.error('An error occurred.', error);
        return;
    }

    console.log(csv);
}, {
    expandArrayObjects: true,
});

This outputs:

author,modules.name
mrodrig,json-2-csv
jsmith,
anonymous_user_123,"[""some_module"",""another_module"",""this other cool module""]"

The key thing this feature enables is iterating over the keys of objects inside array values, which allows a deeper conversion of the data. For reference, if the option were not provided, the output for this case would be:

author,modules
mrodrig,"[{""name"":""json-2-csv""}]"
jsmith,[]
anonymous_user_123,"[{""name"":""some_module""},{""name"":""another_module""},{""name"":""this other cool module""}]"

One other common use case that I've heard of is to "unwind" arrays so that one CSV line appears for each value contained in the array. For example, using the following options, you would get this output (a full call is sketched after the output below):

Options:

    expandArrayObjects: true,
    unwindArrays: true,
    keys: ["author", "modules.name"], // specify the desired output keys to remove and extra array field

Output:

author,modules.name
mrodrig,json-2-csv
jsmith,null
anonymous_user_123,some_module
anonymous_user_123,another_module
anonymous_user_123,this other cool module
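
Putting that together, a sketch of the full call for the unwind case (same callback-style API and data as the earlier snippet; the option names and keys come straight from the options listed above):

const converter = require('json-2-csv');

const data = [
    { author: 'mrodrig', modules: [{ name: 'json-2-csv' }] },
    { author: 'jsmith', modules: [] },
    {
        author: 'anonymous_user_123',
        modules: [
            { name: 'some_module' },
            { name: 'another_module' },
            { name: 'this other cool module' }
        ]
    }
];

converter.json2csv(data, (error, csv) => {
    if (error) {
        return console.error('An error occurred.', error);
    }
    console.log(csv); // should match the unwound output shown above
}, {
    expandArrayObjects: true,
    unwindArrays: true,
    keys: ['author', 'modules.name'] // drop the extra array column from the output
});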

Hope this helps!

Thank you for the explanation! I've written some code, but it seems my JSON response is more complex than I thought.

[screenshot omitted]

What could be the reason for this case?