Assistant for _Echoes Of Vision_. Image recognition + simulation control to free your hands! Powered by MaaFramework!
[MAA Star Resonance](https://github.com/26F-Studio/maa-star-resonance)
A helper for Star Resonance. Built with Electron + text/image recognition + ADB simulation control to free your hands! Powered by MaaFramework and Quasar.
## Discussion
Developers are welcome to join the official QQ group (595990173) for integration and development discussions. The group is reserved for engineering topics; product-usage support is not provided, and off-topic or spam accounts may be removed to keep the channel focused.
- Retains the low-code advantage of JSON; core flows remain visual and easy to edit
- Hosts custom recognition/actions in the Agent process, making it easier to encapsulate advanced logic
- Seamlessly integrates with the [⭐ boilerplate](https://github.com/MaaXYZ/MaaPracticeBoilerplate) to provide scaffolding and examples

```jsonc
// A minimal sketch of a pipeline node that delegates to the custom
// recognition/action registered with the Agent (the task name is illustrative).
{
    "MyTask": {
        "recognition": "Custom",
        "custom_recognition": "MyReco",
        "action": "Custom",
        "custom_action": "MyAct"
    }
}
```
💡 The General UI automatically connects to your Agent process and invokes the registered recognition/action implementations when executing `MyReco`/`MyAct`.
```python
# Python pseudo-code example — a sketch of the Agent child process.
# Class/decorator names follow the maa Python binding as of writing; treat exact
# signatures as assumptions and check the binding's documentation before use.
import sys

from maa.agent.agent_server import AgentServer
from maa.custom_action import CustomAction
from maa.custom_recognition import CustomRecognition

@AgentServer.custom_recognition("MyReco")
class MyRecognition(CustomRecognition):
    def analyze(self, context, argv):
        ...  # analyze the screenshot and return the recognition result

@AgentServer.custom_action("MyAct")
class MyAction(CustomAction):
    def run(self, context, argv):
        return True  # perform clicks/swipes or other operations here

# The General UI launches this script and passes a connection id as the last argument.
AgentServer.start_up(sys.argv[-1])
AgentServer.join()
AgentServer.shut_down()
```

### Approach 3: Full-Code Development
> [!NOTE]
> MaaFramework offers full multi-language APIs, but code-only workflows lose ecosystem tools (visual editor, visual debugger, General UI). In most cases, the custom extensions in Approach 2 already cover advanced requirements without sacrificing those capabilities.
**Applicable Scenarios**:
- Deep customization requirements
## Resource Preparation
After you confirm your development approach, prepare the corresponding resource files. The example below uses the project boilerplate as a baseline.
> [!TIP]
>
> - If you use the project boilerplate, follow the `TIP` marks below for a ready-made path.
> - If you choose full-code development, you still need resource files such as image assets and OCR models; otherwise the related image-recognition features will be unavailable.

### File Structure Specification
> [!TIP]
>
> ⭐ If you use the boilerplate, modify the [folder](https://github.com/MaaXYZ/MaaPracticeBoilerplate/tree/main/assets/resource) directly.

```tree
my_resource/
├── pipeline/
├── image/
└── model/
    └── ocr/
```

### Pipeline Files

The files in `my_resource/pipeline` contain the main script execution logic and reference the image and model files in the other folders.

You can refer to the [Task Pipeline Protocol](3.1-PipelineProtocol.md) for writing these files. You can find a simple [demo](https://github.com/MaaXYZ/MaaFramework/blob/main/sample/resource/pipeline/sample.json) for reference.
A no-code visual editor is also available, with drag-and-drop node editing and JSON import/export.
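For orientation, a minimal pipeline node might look like the sketch below. The node name, image file, and ROI values are illustrative; the field names (`recognition`, `template`, `roi`, `action`, `next`) come from the Pipeline protocol linked above.

```jsonc
{
    // Illustrative node: find a template image on screen, click it, then move on.
    "StartGame": {
        "recognition": "TemplateMatch",
        "template": "start_button.png",   // an image under my_resource/image (made-up name)
        "roi": [0, 0, 1280, 720],         // search region on the 720p screenshot
        "action": "Click",
        "next": ["CollectReward"]         // node(s) to try after this one succeeds
    }
}
```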
### Image Files
The files in `my_resource/image` are primarily used for template matching images, feature detection images, and other images required by the pipeline. They are read based on the `template` and other fields specified in the pipeline.
Use lossless source images scaled to 720p before cropping. Unless you're very familiar with MaaFramework's processing, use the capture tools below to obtain images.
- [Image Cropping and ROI Extraction Tool](https://github.com/MaaXYZ/MaaFramework/tree/main/tools/ImageCropper)

### OCR Model Files

The files in `my_resource/model/ocr` are ONNX models obtained from [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR) after conversion.
You can use our pre-converted files: [MaaCommonAssets](https://github.com/MaaXYZ/MaaCommonAssets/tree/main/OCR). Choose the language you need and store them according to the [directory structure above](#file-structure-specification).
If needed, you can also fine-tune the official pre-trained models of PaddleOCR yourself (please refer to the official PaddleOCR documentation) and convert them to ONNX files for use. You can find conversion commands [here](https://github.com/MaaXYZ/MaaCommonAssets/tree/main/OCR#command).
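Once the model files are in place, they are consumed through pipeline nodes that use the `OCR` recognition type. The sketch below is illustrative; the node name, ROI, and expected text are made up, while the field names follow the Pipeline protocol.

```jsonc
{
    "ReadTitle": {
        "recognition": "OCR",
        "roi": [400, 20, 480, 80],     // illustrative region of the screen to read
        "expected": ["Main Menu"],     // text (or regex) the OCR result should contain
        "action": "DoNothing"
    }
}
```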
## Debug
After you finish preparing resources, you can start debugging.
> [!NOTE]
> If you choose full-code development, some tools in this section may not work; consider writing your own debug helpers instead.
Use the [development tools](https://github.com/MaaXYZ/MaaFramework/blob/main/README_en.md#development-tool) listed in the README to debug your resources.

Most tools will generate a `config/maa_option.json` file in the same directory, including:
- `logging`: Save the log and generate `debug/maa.log`. Default true.
- `save_draw`: Save visualized image-recognition results during runtime. Default false.
- `stdout_level`: Console log level. Default 2 (Error); set 0 to silence logs, or 7 to show all logs.
- `save_on_error`: Save the current screenshot when a task fails. Default true.

If you integrate it yourself, you can enable debugging options through the `Toolkit.init_option` / `MaaToolkitConfigInitOption` interface. The generated json file is the same as above.
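Based on the defaults above, a freshly generated `config/maa_option.json` looks roughly like this (a sketch for orientation, not a verbatim dump of the file):

```jsonc
{
    "logging": true,       // write debug/maa.log
    "save_draw": false,    // dump recognition visualizations
    "stdout_level": 2,     // 0 = silent, 2 = Error, 7 = everything
    "save_on_error": true  // keep a screenshot when a task fails
}
```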
## Run
You can run your project with a General UI (MaaPiCli, MFA, MFW, etc.) or by writing your own integration code.

> [!NOTE]
> If you choose full-code development, the UI apps in this chapter may not work; consider writing your own interaction UI.
*⭐ If you use the boilerplate, follow its documentation directly and run `install.py` to automatically package the relevant files.*
We define a [ProjectInterface protocol](3.3-ProjectInterfaceV2.md) to describe the resource files and runtime configuration so General UI can correctly load and run your project.
To use MaaPiCli, take it from the `bin` folder of the Release package, write an `interface.json`, and place the file in the same directory.

*In short, write an `interface.json` to tell the General UI where your resources are and which tasks can be executed, so it can run for you.*
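An `interface.json` typically declares the controllers, resource bundles, and task entries that the General UI should expose. The snippet below is only a rough sketch with made-up names; consult the [ProjectInterface protocol](3.3-ProjectInterfaceV2.md) for the authoritative schema.

```jsonc
{
    "controller": [
        { "name": "ADB Controller", "type": "Adb" }
    ],
    "resource": [
        { "name": "Default", "path": ["{PROJECT_DIR}/resource"] }
    ],
    "task": [
        { "name": "Start Game", "entry": "StartGame" }  // entry refers to a pipeline node
    ]
}
```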
If you write the integration code yourself instead, refer to the [Integration Documentation](2.1-Integration.md) and the [Integrated Interface Overview](2.2-IntegratedInterfaceOverview.md).
## Communication
Developers are welcome to join the official QQ group (595990173) for integration and development discussions. The group is reserved for engineering topics; product-usage support is not provided, and off-topic or spam accounts may be removed to keep the channel focused.
docs/en_us/4.1-BuildGuide.md

> [!TIP]
> _You only need to read this chapter if you are ready to develop MaaFramework itself. If you only want to develop applications based on MaaFramework, please refer to [Quick Started](1.1-QuickStarted.md)._
Before you start, ensure Git, Python 3, and CMake 3.24+ are installed, along with the toolchain for your platform (Windows: MSVC 2022; Linux/macOS: Ninja + g++/clang).
## Local Development
1. Clone the repo with all submodules (recommended)
Submodules contain third-party dependencies and must be initialized before building. You can re-run this step anytime to keep them in sync.
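For example, using standard Git commands (nothing MaaFramework-specific): clone with `git clone --recursive https://github.com/MaaXYZ/MaaFramework.git`, or run `git submodule update --init --recursive` inside an existing checkout to fetch or refresh the submodules.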