Introduction
In this blog, we will see how to create a text summarization app using Hugging Face and the Gradio library. Let's get started!
Project Intro
Definition
The text summarization app summarizes the given text, producing a summary with a minimum length of 30 and a maximum length of 130 tokens.
Model Used
This app is built on Facebook's BART large model, fine-tuned on the CNN/Daily Mail dataset. The model is integrated using Hugging Face's pipeline function.
Import the necessary libraries
For this app, we only need two imports: pipeline and gradio.
The pipeline function loads a generative AI model hosted on Hugging Face, and Gradio is a user interface library for testing and deploying AI models.
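A minimal import block for these two libraries could look like the following (assuming the transformers and gradio packages are already installed, for example via pip):
# pipeline loads pretrained models from the Hugging Face Hub
from transformers import pipeline
# gradio provides the web user interface
import gradio as gr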
Import the summarization model
Load Facebook's BART model with the following code snippet:
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
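Before building the UI, you can sanity-check the pipeline directly. The snippet below is only an illustrative sketch; the sample text is a placeholder and can be replaced with any paragraph:
# A placeholder paragraph to verify the summarizer works
sample_text = (
    "Hugging Face's transformers library provides thousands of pretrained "
    "models for tasks such as summarization, translation, and question "
    "answering. The pipeline function wraps model loading, tokenization, "
    "and inference into a single call, which makes it easy to prototype "
    "applications before wiring them into a user interface."
)
result = summarizer(sample_text, max_length=130, min_length=30, do_sample=False)
print(result[0]["summary_text"])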
Create a Gradio Interface
Create a user interface that takes text as input and returns the summarized paragraph as output.
# Define a summarize function to summarize the input text
def summarize(input):
    output = summarizer(input, max_length=130, min_length=30, do_sample=False)
    return output[0]['summary_text']

# Close all existing Gradio interface tabs
gr.close_all()

# Create a Gradio interface that takes the input text
demo = gr.Interface(
    fn=summarize,
    inputs="text",
    outputs="text",
    allow_flagging="never"
)

# Launch the Gradio interface locally
demo.launch()
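By default, the "text" shortcuts render plain, unlabeled text boxes. If you prefer larger, labeled text areas, you can pass Gradio components explicitly. The sketch below is one possible variant; the labels, title, and line counts are arbitrary choices:
# An alternative interface with labeled, multi-line text boxes
demo = gr.Interface(
    fn=summarize,
    inputs=gr.Textbox(label="Text to summarize", lines=8),
    outputs=gr.Textbox(label="Summary", lines=4),
    title="Text Summarization App",
    allow_flagging="never"
)
demo.launch()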
Deploy
To deploy the app temporarily (the public share link stays active for 72 hours), use the code snippet below in the Colab notebook.
demo.launch(share=True)
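When you are finished testing, the running server can be shut down; a minimal sketch:
# Stop the running interface and release the port
demo.close()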