Issues
❗ Get rid of `sv.BoxAnnotator` from `InferencePipeline` sink before it gets removed in `supervision==0.22.0`
#430 opened by PawelPeczek-Roboflow - 4
NotImplementedError
#341 opened by xiaolu-luu - 3
`inference` crash during installation
#432 opened by SkalskiP - 0
Add Python 3.12 support
#433 opened by PawelPeczek-Roboflow - 2
How to handle cropping in `workflows`?
#425 opened by PawelPeczek-Roboflow - 0
Get rid of `asyncio` in `workflows` and in some parts of `inference`
#424 opened by PawelPeczek-Roboflow - 0
Improve `workflows` docs to explain how init parameters are passed to blocks
#422 opened by PawelPeczek-Roboflow - 0
Make the descriptions of all model parameters easier to understand (especially for people without in-depth CV context)
#421 opened by PawelPeczek-Roboflow - 0
More metadata in core `workflows` blocks
#420 opened by PawelPeczek-Roboflow - 0
`DocTR` model output is missing important information that the model produces
#419 opened by PawelPeczek-Roboflow - 0
Discussion: How to structure LLM, LMM blocks regarding structured usage of outputs in other blocks
#417 opened by PawelPeczek-Roboflow - 0
`workflows` Clip block - do not recompute text embeddings over and over
#413 opened by PawelPeczek-Roboflow - 0
Remove PARENT_ID from top-level outputs and keep only in sv.Detections.data
#406 opened by grzegorz-roboflow - 0
Store image metadata in sv.Detections.data
#405 opened by grzegorz-roboflow - 0
Remove parametrization of field names
#404 opened by grzegorz-roboflow - 0
Keypoints kind
#401 opened by grzegorz-roboflow - 2
If I run inference on a local RTSP stream, do I still need an API_KEY and a running server?
#383 opened by YoungjaeDev - 0
Add `image_id` to images metadata
#391 opened by grzegorz-roboflow - 0
ImportError
#389 opened by ChristopherMarais - 6
Run multiple models together
#365 opened by amankumarchagti - 2
Current constellation of dependencies inconsistently breaks the installation process
#373 opened by grzegorz-roboflow - 1
Workflows block `DetectionOffsetBlock` is not preserving parent coordinates
#380 opened by PawelPeczek-Roboflow - 9
Refactor `inference.core.models.base.Model.infer_from_request` to always return `List[InferenceResponse]`
#376 opened by grzegorz-roboflow - 4
inference.core.exceptions.ModelArtefactError: Unable to load ONNX session. Cause: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from /tmp/cache\chinese-calligraphy-recognition-sl0eb/2\best.onnx failed:Protobuf parsing failed.
#361 opened by Sui-25 - 5
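The mixed path separators in the #361 error message (`/tmp/cache\chinese-calligraphy-recognition-sl0eb/2\best.onnx`) suggest path components joined with hard-coded backslashes rather than the platform separator. A minimal sketch (all names hypothetical, not taken from the `inference` codebase) of why `os.path.join` avoids producing such paths:

```python
import os

cache_dir = "/tmp/cache"
model_id = "some-model/2"  # hypothetical model id

# Hard-coded backslashes yield mixed separators on Linux:
broken = cache_dir + "\\" + model_id + "\\" + "best.onnx"

# os.path.join uses the platform's separator consistently:
fixed = os.path.join(cache_dir, model_id, "best.onnx")

print(broken)  # /tmp/cache\some-model/2\best.onnx
print(fixed)
```

On POSIX systems `fixed` comes out as `/tmp/cache/some-model/2/best.onnx`; the broken variant is what a loader would then fail to open or would open the wrong file from.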
Failed to install `inference` package: "AttributeError: module 'pkgutil' has no attribute 'ImpImporter'"
#370 opened by sridharreddyu - 2
inference --version
#338 opened by yeldarby - 7
Error Running cogvlm model on Self-Hosted GPU Server with Roboflow Inference (Transformer Version)
#355 opened by YoungjaeDev - 1
Can streams be added, deleted, or updated in real time when only one pipeline has been started?
#362 opened by nmditai - 1
Can't run inference on AWS Lambda
#356 opened by DominiquePaul - 0
`Docker Getting Started` link in docs returns 404
#353 opened by hansent - 0
Add error status code to benchmark output
#350 opened by grzegorz-roboflow - 0
Update description for workflows steps
#344 opened by grzegorz-roboflow - 0
LMM workflow step - rename to LLM
#346 opened by grzegorz-roboflow - 5
How can I run inference locally to obtain prediction labels and bounding-box coordinates?
#334 opened by KingBoyAndGirl - 1
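Several questions above (#334, #383) ask how to read prediction labels and bounding-box coordinates from local inference results. As a hedged sketch: Roboflow-style detection responses report each prediction as a center point (`x`, `y`) plus `width`/`height` in pixels, along with `class` and `confidence`; the field names below follow that format, but the sample values are invented.

```python
import json

# Hypothetical detection response in the Roboflow JSON format
# (center-point x/y plus width/height, in pixels).
response = json.loads("""
{
  "predictions": [
    {"x": 320.0, "y": 240.0, "width": 100.0, "height": 80.0,
     "class": "dog", "confidence": 0.91}
  ]
}
""")

def to_xyxy(pred):
    """Convert a center-based prediction to (x_min, y_min, x_max, y_max)."""
    x0 = pred["x"] - pred["width"] / 2
    y0 = pred["y"] - pred["height"] / 2
    x1 = pred["x"] + pred["width"] / 2
    y1 = pred["y"] + pred["height"] / 2
    return (x0, y0, x1, y1)

for pred in response["predictions"]:
    print(pred["class"], pred["confidence"], to_xyxy(pred))
# → dog 0.91 (270.0, 200.0, 370.0, 280.0)
```

The corner-coordinate form is what most drawing and evaluation utilities expect, so this conversion is usually the first post-processing step.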
YOLOWorld strange behavior
#329 opened by maxwelllwang