Esri/arcgis-pbf

Document transforming geometry


Related to #7.

It is easy to create a parser for the pbf format; however, it is not documented how to go from the integer-encoded coordinates to floating point coordinates. The instructions state

"it is neccessary to multiply by the FeatureCollection's Transform"

however, there is no example to go along with this. Multiplying by the scale and adding the translate does not result in correct coordinates. Perhaps the quantize_origin_position has a role to play here, but it is not documented.

It would be exceptionally helpful to have an example that demonstrates taking these integer coordinate values and returning floating point values.

Comments in #2 are also related.

@JosiahParry the pbf format in this repo is just a pbf version of the json format already returned by feature services, just with geometries flattened into a coords & lengths array. How you interpret the geometries depends on the query parameters you pass to the server. For more information, I would take a look at the documentation on QuantizationParameters here: https://developers.arcgis.com/rest/services-reference/enterprise/query-feature-service-layer-.htm. How you query the service depends on your use case. For instance, in the JSAPI, for our most common query path we set up the QuantizationParameters such that we get back 512x512 pixel tiles.
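
A minimal sketch of what building such a query could look like; the layer URL, helper name, and where/outFields values are placeholders, and the one real invariant is that tolerance = extent width / 512, which ties the quantization grid to tile pixels:

// Sketch only: builds a 512x512 tile query against a hypothetical feature
// layer URL. Real requests should percent-encode the JSON parameter and
// include the spatialReference, both omitted here for brevity.
fn tile_query_url(base: &str, xmin: f64, ymin: f64, xmax: f64, ymax: f64) -> String {
    // One quantization cell per output pixel of a 512x512 tile.
    let tolerance = (xmax - xmin) / 512.0;
    let quantization = format!(
        r#"{{"extent":{{"xmin":{xmin},"ymin":{ymin},"xmax":{xmax},"ymax":{ymax}}},"mode":"view","originPosition":"upperLeft","tolerance":{tolerance}}}"#
    );
    format!("{base}/query?f=pbf&where=1%3D1&outFields=*&quantizationParameters={quantization}")
}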

The json format returned by the queries gives the coordinates as-is, without the delta encoding, and as an array rather than interleaved. Undoing the delta encoding and the interleaving requires a little bit of mental gymnastics, and how this is done is not documented in this repository. I was able to figure it out (not without a few tufts of hair being pulled out).

Below is an example of how I accomplished this in Rust (for posterity). It does not take into account the possibility of z or m dimensions being returned, though.

// `Translate` and `Scale` stand in for the translate and scale parts of
// the FeatureCollection's Transform message.
struct Translate {
    x_translate: f64,
    y_translate: f64,
}

struct Scale {
    x_scale: f64,
    y_scale: f64,
}

fn delta_decode(x: &mut [i64], trans: &Translate, scale: &Scale) -> Vec<[f64; 2]> {
    // Coordinates are interleaved [x0, y0, x1, y1, ...] and every value
    // after the first vertex is a delta from the previous value in the
    // same dimension, so accumulate with a stride of 2.
    for i in 2..x.len() {
        x[i] += x[i - 2];
    }

    // Undo the interleaving and apply the Transform. The sign flip on y
    // reflects the upperLeft quantization origin.
    x.chunks(2)
        .map(|c| {
            let x = c[0] as f64 * scale.x_scale + trans.x_translate;
            let y = -(c[1] as f64 * scale.y_scale - trans.y_translate);
            [x, y]
        })
        .collect()
}
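
For illustration, a hypothetical call with made-up Transform values (the numbers below are invented, not from a real service):

fn main() {
    // Hypothetical inputs: scale (2, 2), translate (100, 400), upperLeft origin.
    let scale = Scale { x_scale: 2.0, y_scale: 2.0 };
    let trans = Translate { x_translate: 100.0, y_translate: 400.0 };
    // Interleaved, delta-encoded vertices: absolute first vertex (10, 20),
    // then per-dimension deltas.
    let mut coords = [10, 20, 5, -3, -2, 4];
    let decoded = delta_decode(&mut coords, &trans, &scale);
    // First vertex: (10 * 2 + 100, -(20 * 2 - 400)) = (120.0, 360.0)
    assert_eq!(decoded[0], [120.0, 360.0]);
}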

@JosiahParry, f=json queries are delta encoded. In general I would use quantization if you care about performance (e.g., presumably when you are using f=pbf). I don't think f=pbf is valid on all server implementations without quantization parameters specified. I think you need to look for the edit mode quantization flag.

https://services.arcgis.com/P3ePLMYs2RVChkJx/arcgis/rest/services/ACS_Population_by_Race_and_Hispanic_Origin_Boundaries/FeatureServer/2/query?f=json&geometry=%7B%22spatialReference%22%3A%7B%22latestWkid%22%3A3857%2C%22wkid%22%3A102100%7D%2C%22xmin%22%3A-8140237.764258992%2C%22ymin%22%3A4383204.949986987%2C%22xmax%22%3A-7514065.62854699%2C%22ymax%22%3A5009377.085698988%7D&maxRecordCountFactor=3&outFields=B03002_003E%2CB03002_004E%2CB03002_005E%2CB03002_006E%2CB03002_007E%2CB03002_008E%2CB03002_009E%2CB03002_012E%2COBJECTID&outSR=102100&quantizationParameters=%7B%22extent%22%3A%7B%22spatialReference%22%3A%7B%22latestWkid%22%3A3857%2C%22wkid%22%3A102100%7D%2C%22xmin%22%3A-8140237.764258992%2C%22ymin%22%3A4383204.949986987%2C%22xmax%22%3A-7514065.62854699%2C%22ymax%22%3A5009377.085698988%7D%2C%22mode%22%3A%22view%22%2C%22originPosition%22%3A%22upperLeft%22%2C%22tolerance%22%3A1222.992452562501%7D&resultType=tile&returnCentroid=true&returnExceededLimitFeatures=false&spatialRel=esriSpatialRelIntersects&where=1%3D1&geometryType=esriGeometryEnvelope&inSR=102100
(512x512 tile request)
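
URL-decoded, the quantizationParameters in that request are:

{
  "extent": {
    "spatialReference": {"latestWkid": 3857, "wkid": 102100},
    "xmin": -8140237.764258992,
    "ymin": 4383204.949986987,
    "xmax": -7514065.62854699,
    "ymax": 5009377.085698988
  },
  "mode": "view",
  "originPosition": "upperLeft",
  "tolerance": 1222.992452562501
}

Note that the tolerance is (xmax - xmin) / 512, i.e., one quantization cell per pixel of a 512x512 tile.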

I plan on making some end-to-end examples showing how to set up the server query and parse into either tile or world coordinates, as I think a lot of people aren't aware of how to actually set up their queries, and f=pbf kind of presumes you are making modern queries. Unfortunately, I've been in the middle of a major refactor for the last year that's been taking most of my time. Once that's in, though, I hope I can set aside some time!

Merging with #2