Issues
[Feature Request] `max serve`: Validate uvicorn server settings before executing model pipelines
#296 opened by nlaanait - 1
[Feature Request] OpenAI API Compatibility: Only the first element of a list of prompts is considered during generation
#293 opened by nlaanait - 1
[Feature Request] OpenAI API Compatibility: Text generation does not stop at the specified `stop` argument
#292 opened by nlaanait - 2
[Docs] Inconsistent HF repo-id for Llama3.1
#291 opened by nlaanait - 0
[BUG]: The `mistral` pipeline command is missing
#280 opened by clafollett - 1
[BUG]: No module named architectures
#286 opened by gnodar01 - 1
[BUG]: Running "modularai/llama-3.1" Crashes on CPU
#279 opened by Dasor - 3
Instructions to use conda do not work... I'd rather use conda, not use WSL.
#290 opened by richlysakowski - 5
[Docs] This is really a forum issue
#284 opened by ltorres6 - 4
[Feature Request] Make `tensor.Tensor` implement `tensor_utils.TensorLike`
#274 opened by owenhilyard - 4
[Docs] "magic run llama3" failed with numpy error "No module named 'numpy.core._multiarray_umath"
#283 opened by liuzhishan - 1
[Docs] MAX serve cloud scale to zero?
#282 opened by ishaangandhi - 7
[Feature Request] Add option to cache model compilation for `modular/max-openai-api`
#271 opened by remorses - 3
[Docs] Tensor value has no attribute 'argmax'
#278 opened by Str-Gen - 17
[BUG]: `magic project channel add pytorch --prepend` results in an invalid project setup (example: run-onnx-with-python)
#270 opened by bmerkle - 1
[BUG]: Jupyter Kernel fails to start in MAX 24.6
#277 opened by oforero - 0
[Feature Request] Additional `max.engine.Model.execute` variants to aid composability
#275 opened by owenhilyard - 1
[Feature Request] Single compilation unit kernels and/or improved error messages
#269 opened by owenhilyard - 1
[Feature Request] Bundling code using Magic as a single binary or container
#247 opened by ChetanBhasin - 0
[BUG]: VS Code - Mojo Nightly extension doesn't respect a project's local Python environment.
#253 opened by johnsoez4 - 2
[BUG]: 403 when I try to 'magic init'
#255 opened by dcromster - 3
[BUG]: MAX fills up cache
#256 opened by martinvuyk - 2
[BUG]: missing dependencies in example max/examples/inference/yolo-python-onnx
#261 opened by bmerkle - 1
[BUG]: examples/graph-api/pipelines/weights/download.🔥 does not check the downloaded file
#265 opened by lesoup-mxd - 1
[BUG]: example /max/examples/inference/mistral7b-python-onnx: killed during loading of checkpoint shards
#262 opened by bmerkle - 0
[Feature Request] magic cli: control verbose logging (e.g. display download/install progress, training progress, etc.)
#263 opened by bmerkle - 4
[BUG]: Install Magic on Linux
#250 opened by corvinux - 2
[Docs] Unable to access magic projects, or mojo executable after initial "getting started"
#251 opened by russfellows - 2
[BUG]: Error running an example per https://builds.modular.com/builds/llama3/python
#249 opened by skoppisetti - 4
[Docs] magic search returning outdated package metadata for pytorch (M2 MacBook Air)
#245 opened by friediisch - 2
[BUG]: Failed to build installable wheels for some pyproject.toml based projects (geventhttpclient)
#244 opened by HamxAnwar