Hugging Face JS libraries

This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.

  • @huggingface/hub: Interact with huggingface.co to create or delete repos and commit / download files
  • @huggingface/inference: Use the Inference API to make calls to 100,000+ Machine Learning models!

With more to come, like @huggingface/endpoints to manage your HF Endpoints!

We use modern features to avoid polyfills and dependencies, so the libraries will only work on modern browsers / Node.js >= 18 / Bun / Deno.

The libraries are still very young; please help us by opening issues!

Installation

From NPM

To install via NPM, you can install each library individually as needed:

npm install @huggingface/hub
npm install @huggingface/inference

Then import the libraries in your code:

import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
import { HfInference } from "@huggingface/inference";
import type { RepoId, Credentials } from "@huggingface/hub";
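
The exported types can also be reused in your own signatures. As a small sketch (the helper function below is hypothetical, but the RepoId and Credentials shapes match the usage example further down):

import type { Credentials, RepoId } from "@huggingface/hub";

// Hypothetical helper: reuse the library's exported types in your own code.
function repoLabel(repo: RepoId, credentials: Credentials): string {
  return `${repo.type}/${repo.name} (token set: ${Boolean(credentials.accessToken)})`;
}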

From CDN or Static hosting

You can run our packages with vanilla JS, without any bundler, by using a CDN or static hosting. Using ES modules, i.e. <script type="module">, you can import the libraries in your code:

<script type="module">
    import { HfInference } from 'https://cdn.jsdelivr.net/npm/@huggingface/inference@1/+esm';
    import { createRepo, commit, deleteRepo, listFiles } from "https://cdn.jsdelivr.net/npm/@huggingface/hub@0/+esm";
</script>
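
As a minimal end-to-end sketch of the CDN build (the hf_... token is a placeholder for your own access token), the imported class can be used directly in the page:

<script type="module">
    // Import from the CDN and call a hosted model straight from the browser.
    import { HfInference } from 'https://cdn.jsdelivr.net/npm/@huggingface/inference@1/+esm';

    const inference = new HfInference('hf_...'); // your access token
    const result = await inference.translation({
        model: 't5-base',
        inputs: 'My name is Wolfgang and I live in Berlin'
    });
    console.log(result);
</script>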

Usage example

import { createRepo, commit } from "@huggingface/hub";
import { HfInference } from "@huggingface/inference";

// use an access token from your free account
const HF_ACCESS_TOKEN = "hf_...";

await createRepo({
  repo: {type: "model", name: "my-user/nlp-test"},
  credentials: {accessToken: HF_ACCESS_TOKEN}
});

await commit({
  repo: {type: "model", name: "my-user/nlp-test"},
  credentials: {accessToken: HF_ACCESS_TOKEN},
  title: "Add model file",
  operations: [{
    operation: "addOrUpdate",
    path: "pytorch_model.bin",
    content: new Blob(...) // Can work with native File in browsers
  }]
});

const inference = new HfInference(HF_ACCESS_TOKEN);

await inference.translation({
  model: 't5-base',
  inputs: 'My name is Wolfgang and I live in Berlin'
})

await inference.textToImage({
  inputs: 'award winning high resolution photo of a giant tortoise/((ladybird)) hybrid, [trending on artstation]',
  model: 'stabilityai/stable-diffusion-2',
  parameters: {
    negative_prompt: 'blurry',
  },
})
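
listFiles (seen in the imports earlier) is not shown in use; here is a sketch of listing what the commit produced, assuming it returns an async iterable of file entries exposing path and size (check the hub README for the exact shape):

import { listFiles } from "@huggingface/hub";

// List the files now present in the repo created above.
// Assumption: listFiles yields entries with at least `path` and `size`.
for await (const file of listFiles({
  repo: {type: "model", name: "my-user/nlp-test"},
  credentials: {accessToken: HF_ACCESS_TOKEN}
})) {
  console.log(file.path, file.size);
}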

There are more features, of course; check each library's README!

Formatting & testing

pnpm install

pnpm -r format
pnpm -r test
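
To format or test a single package rather than the whole workspace, pnpm's --filter flag can target one package by name (a sketch, assuming the standard pnpm workspace setup):

pnpm --filter @huggingface/hub test
pnpm --filter @huggingface/inference format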

Building

pnpm -r build

This will generate ESM and CJS JavaScript files in packages/*/dist, e.g. packages/inference/dist/index.mjs.
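
For reference, the CJS output is what a CommonJS project would load via require, while the .mjs build is what the import statements above resolve to (a sketch, assuming the package's exports map routes require to the CJS file):

// CommonJS consumer of the built package (Node.js >= 18).
const { HfInference } = require("@huggingface/inference");

const inference = new HfInference("hf_..."); // your access token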