8 changes: 8 additions & 0 deletions docs/source/android-section.md
@@ -16,8 +16,16 @@ Deploy ExecuTorch on Android devices with hardware acceleration support.

- {doc}`android-examples` — Explore Android Examples & Demos

## API Reference

- [Java API Reference (Javadoc)](https://pytorch.org/executorch/main/javadoc/index.html) — Full Java class and method reference

```{toctree}
:maxdepth: 1
:hidden:

using-executorch-android
android-backends
android-examples
Java API Reference (Javadoc) <https://pytorch.org/executorch/main/javadoc/index.html>
```
3 changes: 2 additions & 1 deletion docs/source/using-executorch-android.md
@@ -21,6 +21,7 @@ All ExecuTorch Android libraries are packaged into an Android library (AAR), exe
The AAR artifact contains the Java library for users to integrate with their Java/Kotlin application code, as well as the corresponding JNI library (.so file), which is loaded by the Java code during initialization.

- [Java library](https://github.com/pytorch/executorch/tree/main/extension/android/executorch_android/src/main/java/org/pytorch/executorch)
- [Java API Reference (Javadoc)](https://pytorch.org/executorch/main/javadoc/index.html)
- JNI contains the JNI binding for the corresponding Java code, and ExecuTorch native library, including
- Core ExecuTorch runtime libraries
- XNNPACK backend
@@ -240,4 +241,4 @@ using ExecuTorch AAR package.

## Java API reference

Please see [Java API reference](https://pytorch.org/executorch/main/javadoc/).
Please see [Java API reference](https://pytorch.org/executorch/main/javadoc/index.html).
14 changes: 14 additions & 0 deletions extension/android/executorch_android/build.gradle
@@ -114,3 +114,17 @@ repositories {
url "https://central.sonatype.com/repository/maven-snapshots/"
}
}

android.libraryVariants.all { variant ->
task("generate${variant.name.capitalize()}Javadoc", type: Javadoc) {
source = variant.javaCompileProvider.get().source
classpath += project.files(android.getBootClasspath().join(File.pathSeparator))
classpath += variant.javaCompileProvider.get().classpath
options {
overview = "src/main/javadoc/overview.html"
windowTitle = "ExecuTorch Android Java API"
docTitle = "ExecuTorch Android Java API"
links("https://docs.oracle.com/en/java/javase/11/docs/api/")
}
}
}
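With the task above registered per build variant, the variant's Javadoc can be generated from the command line. A minimal sketch of deriving the task name, assuming a standard `release` variant (actual variant names depend on the project's build types and flavors, and the Gradle module path shown is an assumption based on this file's location):

```shell
# The Gradle task name is "generate" + capitalized variant name + "Javadoc",
# matching the task("generate${variant.name.capitalize()}Javadoc", ...) above.
variant=release
task="generate${variant^}Javadoc"   # ${var^} capitalization needs bash 4+
echo "$task"   # generateReleaseJavadoc

# Then invoke it from the repository root (hypothetical module path):
# ./gradlew :extension:android:executorch_android:"$task"
```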
@@ -1,2 +1,49 @@
/** Extension for LLM related use cases for ExecuTorch Android Java/JNI package. */
/**
* ExecuTorch LLM extension for Android.
*
* <p>This package provides Java bindings for running large language models (LLMs)
* on Android using ExecuTorch. It supports text generation, tokenization,
* and streaming token callbacks.
*
* <h2>Quick Start</h2>
*
* <pre>{@code
* import org.pytorch.executorch.extension.llm.LlmModule;
*
* // Load a Llama model
* LlmModule llm = new LlmModule(
* "/data/local/tmp/llama.pte",
* "/data/local/tmp/tokenizer.bin",
* 0.8f
* );
* llm.load();
*
* // Generate text token by token
* llm.generate("Hello, my name is", 200, new LlmCallback() {
* public void onResult(String token) {
* System.out.print(token);
* }
* public void onStats(String stats) {
* System.out.println("\nStats: " + stats);
* }
* });
* }</pre>
*
* <h2>Key Classes</h2>
*
* <ul>
* <li>{@link org.pytorch.executorch.extension.llm.LlmModule} — load and run an LLM</li>
* <li>{@link org.pytorch.executorch.extension.llm.LlmModuleConfig} — configure model paths and settings</li>
* <li>{@link org.pytorch.executorch.extension.llm.LlmGenerationConfig} — control generation (temperature, seq length)</li>
* </ul>
*
* <h2>More Resources</h2>
*
* <ul>
* <li><a href="https://github.com/pytorch/executorch/tree/main/examples/demo-apps/android/LlamaDemo">
* Llama Android Demo App</a> — full working app with UI</li>
* <li><a href="https://pytorch.org/executorch/main/using-executorch-android.html">
* Using ExecuTorch on Android</a></li>
* </ul>
*/
package org.pytorch.executorch.extension.llm;
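The streaming callback pattern in the Quick Start above composes naturally into a collector that assembles the full response. A self-contained sketch — `TokenSink` and `StreamCollector` are hypothetical stand-ins mirroring only the `onResult` shape of `LlmCallback`, so the snippet runs without the ExecuTorch AAR:

```java
// TokenSink mirrors the onResult shape of LlmCallback (hypothetical stand-in;
// the real interface is org.pytorch.executorch.extension.llm.LlmCallback).
interface TokenSink {
    void onResult(String token);
}

// Accumulates streamed tokens into the complete generated text.
class StreamCollector implements TokenSink {
    private final StringBuilder text = new StringBuilder();

    @Override
    public void onResult(String token) {
        text.append(token);
    }

    String text() {
        return text.toString();
    }
}

public class LlmStreamSketch {
    public static void main(String[] args) {
        StreamCollector collector = new StreamCollector();
        // In a real app, llm.generate(prompt, maxTokens, collector)
        // would drive these onResult calls as tokens are produced.
        for (String token : new String[] {"Hello", ",", " world"}) {
            collector.onResult(token);
        }
        System.out.println(collector.text()); // prints "Hello, world"
    }
}
```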
@@ -1,2 +1,59 @@
/** ExecuTorch Android Java/JNI package. This is the main package for generic use cases. */
/**
* ExecuTorch Android Java API.
*
* <p>This package provides Java bindings for running ExecuTorch models on Android.
* Use these classes to load a {@code .pte} model file and run inference directly
* from your Java or Kotlin Android app — no C++ required.
*
* <h2>Quick Start</h2>
*
* <p><b>Step 1.</b> Add the dependency to your {@code app/build.gradle.kts}:
*
* <pre>{@code
* dependencies {
* implementation("org.pytorch:executorch-android:${executorch_version}")
* }
* }</pre>
*
* <p><b>Step 2.</b> Load your model and run inference:
*
* <pre>{@code
* import org.pytorch.executorch.EValue;
* import org.pytorch.executorch.Module;
* import org.pytorch.executorch.Tensor;
*
* // Load your exported .pte model file
* Module module = Module.load("/data/local/tmp/model.pte");
*
 * // Build an input tensor, e.g. a 1x3x224x224 image
* float[] inputData = new float[1 * 3 * 224 * 224];
* Tensor inputTensor = Tensor.fromBlob(inputData, new long[]{1, 3, 224, 224});
*
* // Run inference
* EValue[] output = module.forward(EValue.from(inputTensor));
*
* // Read the result
* float[] scores = output[0].toTensor().getDataAsFloatArray();
* }</pre>
*
* <h2>Key Classes</h2>
*
* <ul>
* <li>{@link org.pytorch.executorch.Module} — load and run a {@code .pte} model</li>
* <li>{@link org.pytorch.executorch.Tensor} — create input tensors and read outputs</li>
* <li>{@link org.pytorch.executorch.EValue} — wrap inputs and unwrap outputs</li>
* <li>{@link org.pytorch.executorch.DType} — supported data types (FLOAT, INT32, etc.)</li>
* </ul>
*
* <h2>More Resources</h2>
*
* <ul>
* <li><a href="https://pytorch.org/executorch/main/using-executorch-android.html">
* Using ExecuTorch on Android</a> — full setup guide, AAR install, build from source</li>
* <li><a href="https://github.com/pytorch/executorch/tree/main/examples/demo-apps/android">
* Android Demo Apps</a> — working example apps you can build and run immediately</li>
* <li><a href="https://pytorch.org/executorch/main/cross-compilation-for-android.html">
* Cross Compilation for Android</a> — using C++ APIs from Android native code</li>
* </ul>
*/
package org.pytorch.executorch;
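The Quick Start above ends with a `float[]` of scores; for a classifier, the usual next step is an argmax over that array to pick the predicted class. A plain-Java sketch with no ExecuTorch dependency:

```java
// Picking the top class index from the scores array produced by
// output[0].toTensor().getDataAsFloatArray() in the Quick Start above.
public class TopScore {
    static int argmax(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) {
                best = i;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        float[] scores = {0.1f, 2.5f, 0.7f};
        System.out.println(argmax(scores)); // prints 1
    }
}
```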
@@ -0,0 +1,89 @@
<!DOCTYPE html>
<html lang="en">
<head>
<title>ExecuTorch Android Java API</title>
</head>
<body>

<p>
The ExecuTorch Android Java API lets you run PyTorch models on Android
devices using a simple Java or Kotlin interface.
</p>

<p>
ExecuTorch is PyTorch's solution for on-device AI — from smartphones to
microcontrollers. The Java API wraps the native ExecuTorch runtime and gives
you clean Java classes to load models, build tensors, and run inference.
</p>

<h2>Quick Start</h2>

<p>Add the library to your app:</p>

<pre>
// app/build.gradle.kts
dependencies {
implementation("org.pytorch:executorch-android:${executorch_version}")
}
</pre>

<p>Load a model and run inference:</p>

<pre>
import org.pytorch.executorch.EValue;
import org.pytorch.executorch.Module;
import org.pytorch.executorch.Tensor;

// Load your exported .pte model
Module module = Module.load("/data/local/tmp/model.pte");

// Create an input tensor (1x3x224x224 image)
float[] data = new float[1 * 3 * 224 * 224];
Tensor input = Tensor.fromBlob(data, new long[]{1, 3, 224, 224});

// Run inference
EValue[] output = module.forward(EValue.from(input));
float[] scores = output[0].toTensor().getDataAsFloatArray();
</pre>

<h2>Packages</h2>

<table>
<tr>
<td><b>org.pytorch.executorch</b></td>
<td>Core API. Contains Module to load and run models, Tensor for tensor operations,
and EValue to wrap inputs and outputs.</td>
</tr>
<tr>
<td><b>org.pytorch.executorch.extension.llm</b></td>
<td>LLM extension. Contains LlmModule for running large language models like Llama
with streaming token generation.</td>
</tr>
<tr>
<td><b>org.pytorch.executorch.annotations</b></td>
  <td>API annotations. The Experimental annotation marks APIs that may change in future releases.</td>
</tr>
</table>

<h2>Resources</h2>

<ul>
<li>
<a href="https://pytorch.org/executorch/main/using-executorch-android.html">
Using ExecuTorch on Android
</a> — setup guide, Maven install, build from source
</li>
<li>
<a href="https://github.com/pytorch/executorch/tree/main/examples/demo-apps/android">
Android Demo Apps
</a> — working example apps
</li>
<li>
<a href="https://pytorch.org/executorch/main/cross-compilation-for-android.html">
Cross Compilation for Android
</a> — using C++ APIs from native Android code
</li>
</ul>

</body>
</html>