const fs = await import("node:fs/promises");

const result = await repo
  .createCommit({
    targetBranch: "main",
    commitMessage: "Update dashboard docs",
    expectedHeadSha: currentHeadSha, // optional safety check
    author: { name: "Docs Bot", email: "docs@example.com" },
  })
  .addFileFromString("docs/changelog.md", "# v2.1.0\n- refresh docs\n")
  .addFile("public/logo.svg", await fs.readFile("assets/logo.svg"))
  .deletePath("docs/legacy.txt")
  .send();

console.log(result.commitSha);
console.log(result.refUpdate.newSha);
console.log(result.refUpdate.oldSha);
If the backend rejects the update (for example, the branch moved past expectedHeadSha), repo.createCommit().send() throws a RefUpdateError containing the status, reason, and ref details.
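A hedged sketch of recovering from a rejected ref update follows; it assumes RefUpdateError is exported by the same SDK package the client comes from (the import path below is hypothetical), and the exact error property names may differ, so consult your SDK typings:

import { RefUpdateError } from "@example/sdk"; // hypothetical import path; use your SDK's package name

try {
  await repo
    .createCommit({
      targetBranch: "main",
      commitMessage: "Update dashboard docs",
      expectedHeadSha: currentHeadSha, // guard against the branch moving underneath us
      author: { name: "Docs Bot", email: "docs@example.com" },
    })
    .addFileFromString("docs/changelog.md", "# v2.1.0\n- refresh docs\n")
    .send();
} catch (error) {
  if (error instanceof RefUpdateError) {
    // The branch tip no longer matches expectedHeadSha (or the update was otherwise rejected).
    // Inspect the error's status, reason, and ref details, then refetch the head SHA and retry,
    // or surface the conflict to the caller.
    console.error("Ref update rejected", error);
  } else {
    throw error;
  }
}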

Builder Methods

| Method | Description |
| --- | --- |
| addFile(path, source, options) | Attach bytes, async iterables, readable streams, or buffers. |
| addFileFromString(path, contents, options) | Add UTF-8 text files. |
| deletePath(path) | Remove files or folders. |
| send() | Finalize the commit and receive metadata about the new commit. |

Options

| Parameter | Type | Description |
| --- | --- | --- |
| targetBranch | Required | Branch name that will receive the commit (for example main). |
| commitMessage | Required | The commit message. |
| author | Required | Provide name and email for the commit author. |
| expectedHeadSha | Optional | Commit SHA that must match the remote tip; omit to fast-forward unconditionally. |
| baseBranch | Optional | Mirrors the base_branch metadata field. Point to an existing branch whose tip should seed targetBranch if it does not exist. When bootstrapping a new branch, omit expectedHeadSha so the service copies from baseBranch; if both fields are provided and the branch already exists, the expectedHeadSha guard still applies. |
| ephemeral | Optional | Store the branch under the refs/namespaces/ephemeral/... namespace. When enabled, the commit is kept out of the primary Git remotes (for example, GitHub) but remains available through storage APIs. |
| ephemeralBase | Optional | Use alongside baseBranch when the seed branch also lives in the ephemeral namespace. Requires baseBranch to be set. |
| committer | Optional | Provide name and email. If omitted, the author identity is reused. |
| signal | Optional | Abort an in-flight upload with an AbortController (see the sketch after this table). |
| targetRef | Deprecated, Optional | Fully qualified ref (for example refs/heads/main). Prefer targetBranch. |
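For the signal option, here is a minimal sketch of cancelling an in-flight upload; the 30-second timeout, branch name, and file paths are illustrative:

import { createReadStream } from "node:fs";

const controller = new AbortController();
// Illustrative guard: abort the upload if it has not completed within 30 seconds.
const timeout = setTimeout(() => controller.abort(), 30_000);

try {
  await repo
    .createCommit({
      targetBranch: "main",
      commitMessage: "Upload that can be cancelled",
      author: { name: "Docs Bot", email: "docs@example.com" },
      signal: controller.signal,
    })
    .addFile("data/export.bin", createReadStream("/tmp/export.bin"))
    .send();
} finally {
  clearTimeout(timeout);
}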
Ephemeral commits are ideal for short-lived review environments or intermediate build artifacts. They stay isolated from upstream mirrors while still benefiting from the same authentication and storage guarantees.
Files are chunked to 4 MiB segments under the hood, so you can stream large assets without buffering them entirely in memory. File paths are normalized relative to the repository root.

The targetBranch must already exist on the remote repository unless you provide baseBranch (or the repository has no refs). To initialize an empty repository, point to its default branch and omit expectedHeadSha. To seed a missing branch inside an existing repo, set baseBranch to the branch you want to copy and omit expectedHeadSha so the service clones that tip before applying your changes.
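As a concrete sketch of that last case (branch and file names are illustrative), bootstrapping a missing branch from main looks like this:

await repo
  .createCommit({
    targetBranch: "feature/docs-refresh", // does not exist on the remote yet
    baseBranch: "main",                   // seed the new branch from main's current tip
    // expectedHeadSha is omitted so the service can create the branch from baseBranch
    commitMessage: "Start docs refresh",
    author: { name: "Docs Bot", email: "docs@example.com" },
    // ephemeral: true would keep the new branch out of upstream mirrors (see the table above)
  })
  .addFileFromString("docs/refresh-plan.md", "# Refresh plan\n")
  .send();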

Streaming Large Files

Use streams or async generators to upload large files without loading them into memory:
import { createReadStream } from "node:fs";

// Simplest approach: Node.js ReadableStream
await repo
  .createCommit({
    targetBranch: "assets",
    expectedHeadSha: "abc123...",
    commitMessage: "Upload latest design bundle",
    author: { name: "Assets Uploader", email: "assets@example.com" },
  })
  .addFile("assets/design-kit.zip", createReadStream("/tmp/large-file.zip"))
  .send();

// Alternative: Async generator for custom chunking
async function* fileChunks() {
  const fs = await import("node:fs/promises");
  const file = await fs.open("/tmp/large-file.zip", "r");
  const chunkSize = 1024 * 1024; // 1MB chunks

  try {
    while (true) {
      const buffer = Buffer.alloc(chunkSize);
      const { bytesRead } = await file.read(buffer, 0, chunkSize);
      if (bytesRead === 0) break;
      yield buffer.subarray(0, bytesRead);
    }
  } finally {
    await file.close();
  }
}

await repo
  .createCommit({
    targetBranch: "assets",
    commitMessage: "Upload with custom chunking",
    author: { name: "Assets Uploader", email: "assets@example.com" },
  })
  .addFile("assets/design-kit.zip", fileChunks())
  .send();
The SDK automatically chunks files to 4 MiB segments, so you can stream large assets (videos, archives, datasets) without buffering them entirely in memory.

Response

The send() method returns the following:
| Field | Type | Description |
| --- | --- | --- |
| commitSha (TypeScript) / commit_sha (Python) | String | The SHA of the created commit |
| treeSha (TypeScript) / tree_sha (Python) | String | The SHA of the commit's tree object |
| targetBranch (TypeScript) / target_branch (Python) | String | The branch that received the commit |
| packBytes (TypeScript) / pack_bytes (Python) | Number | Size of the uploaded pack in bytes |
| blobCount (TypeScript) / blob_count (Python) | Number | Number of blobs in the commit |
| refUpdate (TypeScript) / ref_update (Python) | Object | Contains branch, oldSha/old_sha, and newSha/new_sha |