Build Machine Learning Apps with Hugging Face's Docker Spaces (2023)

The Hugging Face Hub is an open-source, collaborative machine learning (ML) platform. The Hub acts as a central place where users can explore, experiment, collaborate, and build ML technology. On the Hub, you'll find more than 140,000 models, 50,000 ML applications (called Spaces), and 20,000 community-shared datasets.

Using Spaces makes it easy to build and deploy ML-powered apps and demos in minutes. Recently, the Hugging Face team added support for Docker Spaces, which lets users build any custom app they want by simply writing a Dockerfile.

Another great advantage of Spaces is that once your app is running, you can easily share it with anyone in the world. 🌍

This guide will walk you through the basics of creating a Docker Space, configuring it, and deploying code to it. We'll show you how to build a basic FastAPI text generation application that demonstrates the google/flan-t5-small model, which can generate text from a given input text. Models like this are used to power text completion in all sorts of applications. (You can check out a finished version of the app on Hugging Face.)


Requirements

To follow the steps in this article, you'll need to be signed in to the Hugging Face Hub; you can sign up for free if you don't have an account yet.

Create a new Docker Space 🐳

To begin, create a new Space as shown in Figure 1.

[Figure 1: Creating a new Space]

You can choose any name you like for your project, select a license, and choose Docker as the software development kit (SDK), as shown in Figure 2.

Spaces provides pre-built Docker templates such as Argilla and Livebook to help you get started quickly on your ML projects with open source tools. Choosing the "Blank" option means you want to create your Dockerfile manually; don't worry, though, we'll provide a Dockerfile to copy and paste later. 😅

[Figure 2: The new Space form, with Docker selected as the SDK]

After you fill out the form and click the Create Space button, a new repository will be created in your Spaces account. This repository will be associated with the new Space you just created.


Note: If you're new to the Hugging Face Hub 🤗, check out Introduction to repositories for a good introduction to repositories on the Hub.

Writing the app

Okay, now that you have an empty Space repository, it's time to write some code. 😎

The sample application consists of the following three files:

  • requirements.txt – Lists the dependencies of a Python project or application
  • app.py – A Python script where we'll write our FastAPI app
  • Dockerfile – Sets up our environment, installs requirements.txt, and launches app.py

To get started, create each of the files shown below via the web interface. To do so, navigate to your Space's Files and versions tab, then select Add file → Create a new file (Figure 3). Note that you can also use Git if you prefer.

[Figure 3: Creating a new file from the Files and versions tab]

Be sure to name each file exactly as shown here. Then, copy the contents of each file from this guide and paste them into the corresponding file in the editor. After you have created and populated all the necessary files, commit each new file to your repository by clicking the Commit new file to main button.
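If you'd rather work locally with Git instead, keep in mind that every Space is itself a Git repository hosted on the Hub. As a rough sketch (the username and Space name below are placeholders for your own), you could clone the repository, add the three files described above, and push:

git clone https://huggingface.co/spaces/<your-username>/<your-space-name>
cd <your-space-name>
# Create requirements.txt, app.py, and Dockerfile locally, then:
git add requirements.txt app.py Dockerfile
git commit -m "Add FastAPI text generation app"
git push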

Python dependency list

It's time to list all the Python packages, and their specific versions, that the project needs to work properly. The contents of requirements.txt typically include package names and version numbers, which can be specified in various formats, for example exact versions, version ranges, or compatible releases. The file lists fastapi, requests, and uvicorn for the API, along with sentencepiece, torch, and transformers for the text generation model.

fastapi==0.74.*
requests==2.27.*
uvicorn[standard]==0.17.*
sentencepiece==0.1.*
torch==1.11.*
transformers==4.*

Define the FastAPI web application

The following code defines a FastAPI web application that uses the transformers library to generate text based on user input. The app itself is a simple, single-endpoint API. The /generate endpoint takes in text and uses a transformers pipeline to generate a completion, which it then returns as a response.

To give people something to see, we redirect FastAPI's interactive Swagger documentation from the default /docs endpoint to the root of the app. That way, anyone visiting your Space can play with it without writing any code.


from fastapi import FastAPI
from transformers import pipeline

# Create a new FastAPI app instance
app = FastAPI()

# Initialize the text generation pipeline
# This function will be able to generate text
# given an input.
pipe = pipeline("text2text-generation", model="google/flan-t5-small")

# Define a function to handle the GET request at `/generate`
# The generate() function is defined as a FastAPI route that accepts a
# string parameter called text. The function generates text based on the
# input using the pipeline() object and returns a JSON response
# containing the generated text under the key "output"
@app.get("/generate")
def generate(text: str):
    """
    Generate text from the given input text using the `transformers`
    text2text-generation pipeline. The model used is `google/flan-t5-small`,
    which can be found [here](https://huggingface.co/google/flan-t5-small).
    """
    # Use the pipeline to generate text from the given input text
    output = pipe(text)

    # Return the generated text in a JSON response
    return {"output": output[0]["generated_text"]}
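The listing above doesn't show the redirect itself. One minimal way to do it, assuming you simply want visitors to the root URL to land on the interactive docs (the finished Space may handle this differently), is to add a root route that returns a RedirectResponse:

from fastapi.responses import RedirectResponse

# Send visitors of the app root straight to the interactive Swagger docs.
# This is a sketch; the finished Space may implement the redirect differently.
@app.get("/", include_in_schema=False)
def root():
    return RedirectResponse(url="/docs")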

Writing the Dockerfile

In this section, we'll write a Dockerfile that sets up a Python 3.9 environment, installs the packages listed in requirements.txt, and starts a FastAPI application on port 7860.

Let's see this process step by step:

FROM python:3.9

The line above indicates that we will be using the official Python 3.9 Docker image as the base image for our container. This image is provided by Docker Hub and contains all the files needed to run Python 3.9.

WORKDIR /code

This line sets the working directory inside the container to /code. This is where we'll copy our application's code and dependencies later on.

COPY ./requirements.txt /code/requirements.txt

The line above copies the requirements.txt file from our local directory into the /code directory inside the container. This file lists the Python packages our application depends on.


RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

This line uses pip to install the packages listed in requirements.txt. The --no-cache-dir flag tells pip not to use cached packages, the --upgrade flag tells pip to upgrade any already-installed packages if newer versions are available, and the -r flag specifies the requirements file to use.

RUN useradd -m -u 1000 user
USER user
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

These lines create a new user named user with user ID 1000, switch to that user, and set the home directory to /home/user. The ENV instruction sets the HOME and PATH environment variables. PATH is modified to include the .local/bin directory in the user's home directory so that any binaries installed by pip are available on the command line. See the documentation for more information on user permissions.

WORKDIR $HOME/app

This line sets the working directory inside the container to $HOME/app, which is /home/user/app.

COPY --chown=user . $HOME/app

The line above copies the contents of our local directory into the /home/user/app directory inside the container, with ownership of the files set to the user we created earlier.

CMD ["uvicorn", "application:application", "--host", "0.0.0.0", "--port", "7860"]

This line specifies the command to run when the container starts. It launches the FastAPI application with uvicorn and listens on port 7860. The --host flag specifies that the application should listen on all available network interfaces, and the app:app argument tells uvicorn to look for the app object in the app module of our code.

Here is the complete Dockerfile:

# Use the official Python 3.9 image
FROM python:3.9

# Set the working directory to /code
WORKDIR /code

# Copy the requirements file into the container at /code
COPY ./requirements.txt /code/requirements.txt

# Install requirements.txt
RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt

# Set up a new user named "user" with user ID 1000
RUN useradd -m -u 1000 user

# Switch to the "user" user
USER user

# Set home to the user's home directory
ENV HOME=/home/user \
    PATH=/home/user/.local/bin:$PATH

# Set the working directory to the user's home directory
WORKDIR $HOME/app

# Copy the current directory contents into the container at $HOME/app, setting the owner to the user
COPY --chown=user . $HOME/app

# Start the FastAPI app on port 7860, the default port expected by Spaces
CMD ["uvicorn", "app:app", "--host", "0.0.0.0", "--port", "7860"]
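The Space builds the image for you when you commit, but if you'd like to sanity-check the Dockerfile locally first (assuming you have Docker installed; the image tag below is just an illustrative name), you can build and run it yourself and then open http://localhost:7860/docs in your browser:

docker build -t flan-t5-api .
docker run -p 7860:7860 flan-t5-api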

Once you've committed this file, your Space will switch to Building, and you'll see the container's build logs appear so you can monitor its status. 👀

If you want to double-check the files, you can find them all in our app's Space.

Note: For a more basic introduction to using Docker with FastAPI, refer to the official guide in the FastAPI docs.

Using the app 🚀

If all goes well, your Space should switch to Running once the build is complete, and the Swagger docs generated by FastAPI should appear in the App tab. Because these docs are interactive, you can try out the endpoint by expanding the details of the /generate endpoint and clicking Try it out! (Figure 4).
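You can also call the endpoint programmatically once the Space is running. Here's a small sketch using the requests package; the URL is a placeholder, so substitute your own Space's direct URL:

import requests

# Placeholder: replace with your Space's direct URL
API_URL = "https://your-username-your-space-name.hf.space"

# Call the /generate endpoint with an input text
response = requests.get(
    f"{API_URL}/generate",
    params={"text": "Translate to German: Hello, how are you?"},
)
response.raise_for_status()
print(response.json()["output"])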

[Figure 4: Trying out the /generate endpoint in the interactive Swagger docs]

Conclusion

This article covered the basics of creating a Docker Space and building and configuring a basic FastAPI text generation application that uses the google/flan-t5-small model. You can use this guide as a starting point to develop more complex and exciting applications that harness the power of machine learning.

For more information about Docker templates and to see curated examples, check out the Docker examples page. There you'll find a variety of templates to use as a starting point for your own projects, as well as tips and tricks for getting the most out of Docker templates. Happy coding!
