mirror of https://github.com/RootKit-Org/AI-Aimbot.git

fix some typos (#182)

parent 0768cf8139
commit fccb5de13c

@@ -43,6 +43,6 @@ conda update conda
That's it! You have successfully installed Miniconda on your system.
-Now when you open up a terminal you should see a prompt and (base) to indicate no conda enviroment is active.
+Now when you open up a terminal you should see a prompt and (base) to indicate no conda environment is active.

@@ -127,7 +127,7 @@ Follow these sparkly steps to get your TensorRT ready for action! 🛠️✨
You can use one of the .engine models we supply. But if it doesn't work, then you will need to re-export it. Grab the `.pt` file here for the model you want. We recommend `yolov5s.pt` or `yolov5m.pt` [HERE 🔗](https://github.com/ultralytics/yolov5/releases/tag/v7.0).
12. **Run the Export Script** 🏃♂️💻
-Time to execute `export.py` with the following command. Patience is key; it might look frozen, but it's just concentrating hard! Can take up to 20 mintues.
+Time to execute `export.py` with the following command. Patience is key; it might look frozen, but it's just concentrating hard! Can take up to 20 minutes.
```
python .\export.py --weights ./yolov5s.pt --include engine --half --imgsz 320 320 --device 0
```
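
Once it finishes you should see a `yolov5s.engine` file next to your `.pt`. 🔍 As a quick sanity check, here's a minimal sketch (assuming the `tensorrt` Python package from your TensorRT install is importable) that just tries to deserialize the engine; remember, engines are tied to the GPU and TensorRT version they were built with:

```
import tensorrt as trt  # ships with the TensorRT installation

logger = trt.Logger(trt.Logger.WARNING)

# An engine only deserializes on a compatible GPU + TensorRT version,
# which is why a supplied .engine may need re-exporting on your machine
with open("yolov5s.engine", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

print("engine loaded OK" if engine is not None else "engine failed to deserialize")
```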

@@ -1,7 +1,7 @@
# Performance optimizations
This version aims to achieve the best performance possible on AMD hardware.
-To achieve this, the script acts more as an aim assist insted of a full fledged aimbot.
+To achieve this, the script acts more as an aim assist instead of a full fledged aimbot.
The user will still need to do most of the aiming.
Changes that have been made:

@@ -330,7 +330,7 @@ def classify_albumentations(
        if vflip > 0:
            T += [A.VerticalFlip(p=vflip)]
        if jitter > 0:
-            color_jitter = (float(jitter),) * 3  # repeat value for brightness, contrast, satuaration, 0 hue
+            color_jitter = (float(jitter),) * 3  # repeat value for brightness, contrast, saturation, 0 hue
            T += [A.ColorJitter(*color_jitter, 0)]
    else:  # Use fixed crop for eval set (reproducibility)
        T = [A.SmallestMaxSize(max_size=size), A.CenterCrop(height=size, width=size)]
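
A side note on the line being touched: the tuple trick feeds the same `jitter` value to brightness, contrast and saturation, with hue pinned to 0. A minimal sketch of the equivalent call (assuming `albumentations` is installed; the value is illustrative):

```
import albumentations as A

jitter = 0.4  # illustrative value
color_jitter = (float(jitter),) * 3  # -> (0.4, 0.4, 0.4)

# *color_jitter unpacks as brightness, contrast, saturation; the trailing 0 is hue
t = A.ColorJitter(*color_jitter, 0)
# equivalent to A.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4, hue=0)
```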

@@ -68,7 +68,7 @@ Run information streams from your environment to the W&B cloud console as you tr
You can leverage W&B artifacts and Tables integration to easily visualize and manage your datasets, models and training evaluations. Here are some quick examples to get you started.
<details open>
-<h3> 1: Train and Log Evaluation simultaneousy </h3>
+<h3> 1: Train and Log Evaluation simultaneously </h3>
This is an extension of the previous section, but it'll also run training after uploading the dataset. <b>This also logs an evaluation Table.</b>
The evaluation table compares your predictions and ground truths across the validation set for each epoch. It uses references to the already uploaded datasets,
so no images will be uploaded from your system more than once.
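
For a feel of what such a table amounts to, here is a minimal, hypothetical sketch built directly with the `wandb` API (the project and column names are illustrative, not what YOLOv5 uses internally):

```
import wandb

run = wandb.init(project="yolov5-demo")  # hypothetical project name

# One row per validation image; columns are illustrative
table = wandb.Table(columns=["epoch", "image_id", "prediction", "ground_truth"])
table.add_data(0, "img_0001", "person 0.91", "person")

run.log({"evaluation": table})
run.finish()
```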

@@ -102,7 +102,7 @@ You can leverage W&B artifacts and Tables integration to easily visualize and ma
</details>
<h3> 4: Save model checkpoints as artifacts </h3>
-To enable saving and versioning checkpoints of your experiment, pass `--save_period n` with the base cammand, where `n` represents checkpoint interval.
+To enable saving and versioning checkpoints of your experiment, pass `--save_period n` with the base command, where `n` represents checkpoint interval.
You can also log both the dataset and model checkpoints simultaneously. If not passed, only the final model will be logged.
<details>

@@ -330,7 +330,7 @@ def classify_albumentations(
        if vflip > 0:
            T += [A.VerticalFlip(p=vflip)]
        if jitter > 0:
-            color_jitter = (float(jitter), ) * 3  # repeat value for brightness, contrast, satuaration, 0 hue
+            color_jitter = (float(jitter), ) * 3  # repeat value for brightness, contrast, saturation, 0 hue
            T += [A.ColorJitter(*color_jitter, 0)]
    else:  # Use fixed crop for eval set (reproducibility)
        T = [A.SmallestMaxSize(max_size=size), A.CenterCrop(height=size, width=size)]