[WIP] doc: update main and Android README docs; add self to code owners
commit f10a45f530
parent f94efbacbb
CODEOWNERS
@@ -30,7 +30,7 @@
 /examples/export-docs/ @ggerganov
 /examples/gen-docs/ @ggerganov
 /examples/gguf/ @ggerganov
-/examples/llama.android/ @ggerganov
+/examples/llama.android/ @ggerganov @hanyin-arm
 /examples/llama.swiftui/ @ggerganov
 /examples/llama.vim @ggerganov
 /examples/lookahead/ @ggerganov
README.md
@@ -189,6 +189,7 @@ Instructions for adding support for new models: [HOWTO-add-model.md](docs/develo
 - Swift [ShenghaiWang/SwiftLlama](https://github.com/ShenghaiWang/SwiftLlama)
 - Delphi [Embarcadero/llama-cpp-delphi](https://github.com/Embarcadero/llama-cpp-delphi)
 - Go (no CGo needed): [hybridgroup/yzma](https://github.com/hybridgroup/yzma)
+- Android: [llama.android](/examples/llama.android)
 
 </details>
 
examples/llama.android/README.md
@@ -1,6 +1,22 @@
 
 # Android
 
+## Build with Android Studio
+
+Import the `examples/llama.android` directory into Android Studio, then perform a Gradle sync and build the project.
+
+
+This Android binding supports hardware acceleration up to `SME2` for **Arm** and `AMX` for **x86-64** CPUs on Android and ChromeOS devices.
+It automatically detects the host's hardware to load compatible kernels. As a result, it runs seamlessly on both the latest premium devices and on older devices that may lack modern CPU features or have limited RAM, without requiring any manual configuration.
+
+A minimal Android app frontend is included to showcase the binding's core functionality:
+1. **Parse GGUF metadata** via `GgufMetadataReader` from either a `ContentResolver`-provided `Uri` or a local `File`.
+2. **Obtain a `TierDetection` or `InferenceEngine`** instance through the high-level facade APIs.
+3. **Send a raw user prompt** for automatic template formatting, prefill, and decoding, then collect the generated tokens in a Kotlin `Flow`.
+
+For a production-ready experience that leverages advanced features such as system prompts and benchmarks, check out [Arm AI Chat](https://play.google.com/...) on Google Play.
+This project is made possible through a collaborative effort by Arm's CT-ML, CE-ML and STE groups.
+
 ## Build on Android using Termux
 
 [Termux](https://termux.dev/en/) is an Android terminal emulator and Linux environment app (no root required). As of writing, Termux is available experimentally in the Google Play Store; otherwise, it may be obtained directly from the project repo or on F-Droid.
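To make the three-step workflow in the new README section concrete, here is a minimal Kotlin sketch of how an app might drive the binding. Only the class names `GgufMetadataReader`, `TierDetection`, and `InferenceEngine` are taken from the README text above; the package, factory functions, method names, and parameters are illustrative assumptions, not the binding's actual API.

```kotlin
// Hypothetical usage sketch: GgufMetadataReader, TierDetection, and
// InferenceEngine are named in the README, but every method, field, and
// signature below is an assumption for illustration only.
import android.content.Context
import android.net.Uri
import kotlinx.coroutines.flow.Flow

class LlamaAndroidSketch(private val context: Context) {

    // 1. Parse GGUF metadata from a ContentResolver-provided Uri
    //    (readFromUri and the metadata fields are assumed names).
    fun modelSummary(modelUri: Uri): String {
        val metadata = GgufMetadataReader.readFromUri(context.contentResolver, modelUri)
        return "arch=${metadata.architecture}, params=${metadata.parameterCount}"
    }

    // 2. Obtain TierDetection / InferenceEngine through the high-level facade
    //    (detect() and create() are assumed entry points); per the README, the
    //    binding loads kernels matching the detected CPU features on its own.
    val tier: TierDetection = TierDetection.detect(context)
    private val engine: InferenceEngine = InferenceEngine.create(context)

    // 3. Send a raw user prompt; the binding applies the chat template,
    //    prefills, decodes, and streams generated tokens back as a Kotlin Flow.
    fun generate(modelUri: Uri, prompt: String): Flow<String> =
        engine.generate(modelUri = modelUri, userPrompt = prompt)
}
```

A caller would collect the returned `Flow` from a coroutine scope (for example, `generate(uri, prompt).collect { token -> append(token) }` inside `viewModelScope.launch { ... }`), which is what lets the UI render tokens as they are decoded.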