-
👋 Hello @hdi200, thank you for your interest in Ultralytics 🚀! For new users, we recommend checking out the Docs, which include many helpful Python and CLI usage examples. These may already answer some of your questions. If this is a 🐛 Bug Report, please provide a minimum reproducible example to help us diagnose the issue effectively. If you're exploring a custom integration ❓, please share more details about your approach and any relevant logs or error messages. Join the Ultralytics community where it suits you best: for real-time chat, head to Discord 🎧; for in-depth discussions, check out Discourse; you can also join our Subreddit to learn and share insights with the community.

Upgrade: Ensure you are on the latest version with pip install -U ultralytics.

Environments: YOLO can be run in various environments, each with dependencies like CUDA and CUDNN already installed.

Status: If this badge is green, all Ultralytics CI tests are currently passing. These tests verify correct operation across macOS, Windows, and Ubuntu for all YOLO Modes and Tasks every 24 hours and on each commit. This is an automated response, and an Ultralytics engineer will assist you soon 😊.
-
@hdi200 yes, you can define class names at runtime on iOS by using the YOLO model with Core ML. Integrating CLIP is not necessary for this functionality.
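For example, here is a minimal Swift sketch of handling class names at runtime, assuming the model was exported with Ultralytics (format="coreml", nms=True) so that Vision parses detections as VNRecognizedObjectObservation results. The bundle resource name yolov8s_worldv2 and the class names in the usage comment are placeholders, not part of any official API:

```swift
import CoreGraphics
import CoreML
import Foundation
import Vision

/// Runs the bundled YOLO Core ML model on an image and keeps only the
/// detections whose top label is in a class list chosen at runtime.
func detectObjects(in image: CGImage, allowedClasses: Set<String>) throws {
    // "yolov8s_worldv2" is a placeholder resource name; Xcode compiles a
    // .mlpackage added to the project into a .mlmodelc bundle resource.
    guard let url = Bundle.main.url(forResource: "yolov8s_worldv2",
                                    withExtension: "mlmodelc") else { return }
    let visionModel = try VNCoreMLModel(for: MLModel(contentsOf: url))

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in results {
            // Keep only detections matching the runtime-selected class names.
            guard let top = observation.labels.first,
                  allowedClasses.contains(top.identifier) else { continue }
            print(top.identifier, top.confidence, observation.boundingBox)
        }
    }
    try VNImageRequestHandler(cgImage: image).perform([request])
}

// Usage: the class list comes from the app's UI at runtime, not from Python.
// try detectObjects(in: someCGImage, allowedClasses: ["person", "backpack"])
```

In this sketch, "runtime" class names means selecting among the labels baked into the exported model; swap in your own selection logic as needed.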
-
Hey! Just wondering, is it possible to convert the base yolov8s-worldv2.pt to an ML Model and define the class names to look for on iOS itself, on-device at runtime, rather than pre-preparing the class_names in Python? Would I need to integrate CLIP into my app somehow?