piprate/json-gold

Is there any support for performing markup validation on JSON-LD against a JSON-LD/JSON schema?

vibhordubey333 opened this issue · 6 comments

Looking to validate the JSON-LD below against a JSON-LD/JSON schema. If there is no support for this, is there any other library that can fulfill my purpose?

```json
{
  "@context": "http://www.w3.org/ns/odrl.jsonld",
  "@type": "Offer",
  "uid": "d84a3427-735d-4cdc-8e4c-51d9dc8c15ee",
  "permission": [
    {
      "uid": "d8bd23da-8e2d-4e3e-97f0-6e59e1793b14",
      "target": "d8bd23da-8e2d-4e3e-97f0-6e59e1793b00",
      "action": "concurrentUse",
      "assigner": "TEST",
      "assignee": "TSTORNG",
      "constraint": [
        {
          "uid": "1ce11f45-f3a3-49a7-8d23-90d25d0fde5e",
          "leftOperand": "count",
          "rightOperand": {
            "@value": "42",
            "@type": "xsd:nonNegativeInteger"
          },
          "operator": "gteq",
          "unit": "Metric_1.2"
        }
      ]
    }
  ],
  "prohibition": [
    {
      "uid": "6d269472-9f2b-4191-bad7-fef05be6cb53",
      "target": "6d269472-9f2b-4191-bad7-fef05be6cb01",
      "action": "acceptTracking",
      "assigner": "NEWORNG",
      "assignee": "TEST",
      "constraint": [
        {
          "uid": "43f1e066-ff85-4fe8-b4f0-087351c2bafb",
          "leftOperand": "count",
          "rightOperand": {
            "@value": "90",
            "@type": "xsd:nonNegativeInteger"
          },
          "operator": "hasPart",
          "unit": "Metric_1.2"
        }
      ]
    }
  ]
}
```

@vibhordubey333 it very much depends on your requirements for validation. If you are looking for classic JSON schema validation (structural and semantic conformance to a rigid schema), then the answer is no. It would be a good feature for some applications, but strictly, it's out of scope for JSON-LD processing.

The way I see it, JSON-LD context describes what the given data is, not what it should be. Unfortunately, I have seen many examples of systems abusing this principle and using JSON-LD context as an aspiration, not description. For these cases, JSON-LD based validation wouldn't provide any help, because the description is inherently broken.

If you are looking for something more practical, for example, the ability to check that your own JSON structures match the desired context (that there are no terms/fields undefined in the context, whether due to typos or to the introduction of new fields), I'd recommend using the expansion algorithm, or compaction (which includes expansion), depending on your data and the desired level of automation. It will be easy to spot if anything is missing.

Thanks for the response, @kazarena.
Regarding the last part of what you said:

> If you are looking for something more practical, for example, an ability to check that your own JSON structures match the desired context (there are no terms/fields that aren't defined in the context, either due to typos or introduction of new fields) I'd recommend using the expansion algorithm, or compaction (with inherent expansion), depending on your data and the desired level of automation. It will be easy to spot if anything is missing.

I checked the `JsonLdOptions` struct and the `NewJsonLdOptions` constructor in `options.go`, but couldn't find any options I could use to serve my purpose. Can you provide an example where I can validate the above-mentioned JSON-LD against a JSON-LD/JSON schema?

Or any other library ?

@vibhordubey333, no additional options are needed for this kind of validation. If you pass your JSON through a vanilla expansion algorithm (see this example), you will get an error if it's invalid from a JSON-LD perspective (a check that you have a valid JSON-LD document). Then, by examining the expanded shape, you will be able to see whether the terms were expanded as expected (a check that the provided context describes the data as intended).
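To illustrate the idea without pulling in the full processor, here is a stdlib-only Go sketch of the check that expansion gives you for free: terms not defined in the context are detected. The hand-written `definedTerms` map and `undefinedKeys` helper below are purely illustrative; in practice json-gold's `Expand` performs the real term mapping via the `@context`, and undefined terms simply disappear from the expanded output.

```go
package main

import "fmt"

// definedTerms stands in for the terms a real @context would define.
// With json-gold you would not build this by hand: expansion maps each
// term via the context and silently drops undefined ones, so comparing
// the expanded output against the input reveals typos.
var definedTerms = map[string]bool{
	"uid": true, "permission": true, "target": true,
	"action": true, "assigner": true, "assignee": true,
}

// undefinedKeys walks a decoded JSON object and collects keys that are
// neither JSON-LD keywords (@-prefixed) nor defined terms.
func undefinedKeys(doc map[string]interface{}) []string {
	var missing []string
	for k, v := range doc {
		if len(k) > 0 && k[0] == '@' {
			continue // JSON-LD keyword such as @type or @context
		}
		if !definedTerms[k] {
			missing = append(missing, k)
		}
		if child, ok := v.(map[string]interface{}); ok {
			missing = append(missing, undefinedKeys(child)...)
		}
	}
	return missing
}

func main() {
	doc := map[string]interface{}{
		"@type":   "Offer",
		"uid":     "d84a3427-735d-4cdc-8e4c-51d9dc8c15ee",
		"asignee": "TSTORNG", // typo: not a defined term
	}
	fmt.Println(undefinedKeys(doc)) // [asignee]
}
```

The same comparison applies to the real library: expand the document, and any field that vanished in the expanded form was not covered by your context.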

But this may not be the validation you are looking for. If you describe your use case, whether the data you are validating comes from internal or external sources, what quality issues you may expect and what you would do with the data if validation fails, I may recommend a strategy.

Hi @kazarena,

Take the code snippet below, from the example above. For instance, we need to verify the "unit" field: its value should come from the defined schema, i.e. ["Metric_1.1", "Metric_1.2", "Metric_1.3"]. If the value is outside the aforementioned set, the document should be marked as invalid.
The data we're trying to validate will come from an internal source, and the schema we will use for validation will be hosted on our organization's internal network.

```json
"permission": [
  {
    "uid": "d8bd23da-8e2d-4e3e-97f0-6e59e1793b14",
    "target": "d8bd23da-8e2d-4e3e-97f0-6e59e1793b00",
    "action": "concurrentUse",
    "assigner": "TEST",
    "assignee": "TSTORNG",
    "constraint": [
      {
        "uid": "1ce11f45-f3a3-49a7-8d23-90d25d0fde5e",
        "leftOperand": "count",
        "rightOperand": {
          "@value": "42",
          "@type": "xsd:nonNegativeInteger"
        },
        "operator": "gteq",
        "unit": "Metric_1.2"
      }
    ]
  }
],
```
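For reference, the kind of schema we have in mind would look roughly like this (a sketch in standard JSON Schema; the enum values are the ones listed above, and the structure is hypothetical):

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "unit": {
      "type": "string",
      "enum": ["Metric_1.1", "Metric_1.2", "Metric_1.3"]
    }
  },
  "required": ["unit"]
}
```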

Hello @kazarena,

Any update?

@vibhordubey333, sorry for the delay. I've looked at your example above. This kind of validation is out of scope for JSON-LD.

I still don't have enough information about the data you process and your architectural requirements to advise a specific validation library. The example above looks like business-domain validation, and the most efficient way to handle it would be to validate in code, since the code needs to be aware of the unit semantics anyway.