230902 - Deploying Gradio into an Existing FastAPI App and Server


1. Official Example

  • run.py
from fastapi import FastAPI
import gradio as gr

CUSTOM_PATH = "/gradio"

app = FastAPI()


@app.get("/")
def read_main():
    return {"message": "This is your main app"}


io = gr.Interface(lambda x: "Hello, " + x + "!", "textbox", "textbox")
app = gr.mount_gradio_app(app, io, path=CUSTOM_PATH)


# Run this from the terminal as you would normally start a FastAPI app: `uvicorn run:app`
# and navigate to http://localhost:8000/gradio in your browser.

How to run: uvicorn run:app, then open http://localhost:8000/gradio in your browser.
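
To confirm the mount works, you can hit both endpoints once the server is up. The sketch below is not part of the official example; it assumes the requests library, uvicorn's default port 8000, and a made-up file name verify.py.

# verify.py - a minimal sketch; run it after starting the server with uvicorn run:app
import requests

BASE = "http://localhost:8000"  # uvicorn's default port; adjust if you pass --port

# The plain FastAPI route returns the JSON defined in read_main()
r = requests.get(BASE + "/")
print(r.status_code, r.json())  # expected: 200 {'message': 'This is your main app'}

# The Gradio UI is mounted at /gradio; a 200 response with an HTML content type means the mount works
r = requests.get(BASE + "/gradio")
print(r.status_code, r.headers.get("content-type"))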

2. YouTube Example

  • gradio_ui.py
import gradio as gr


def greet(text: str) -> str:
    return text


demo = gr.Interface(
    fn=greet,
    inputs=gr.components.Textbox(label='Input'),
    outputs=gr.components.Textbox(label='Output'),
    allow_flagging='never'
)
  • run.py
from fastapi import FastAPI
import gradio as gr

from gradio_ui import demo

app = FastAPI()

@app.get('/')
async def root():
    # FastAPI serializes the return value as JSON; the Flask-style tuple would be returned as a list,
    # so return a plain dict instead
    return {'message': 'Gradio app is running at /gradio'}

app = gr.mount_gradio_app(app, demo, path='/gradio')
  • How to run
uvicorn run:app --host 0.0.0.0 --port 5000
  • Notes
1. The command-line argument follows the format <module name>:<app object> (an equivalent programmatic launch is sketched below).
2. The module name must not include the .py extension, i.e. run:app rather than run.py:app.
3. gradio_ui.py and run.py must be in the same root directory, and uvicorn should be run from that directory so both the import and the module lookup resolve.
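
As mentioned in note 1, the same <module name>:<app object> string that uvicorn takes on the command line can also be passed to uvicorn.run(), so the server starts with a plain python run.py. The block below is a sketch of that alternative, meant to be appended to the end of run.py; host and port mirror the command above.

# Optional: append to the end of run.py for a programmatic launch (python run.py)
import uvicorn

if __name__ == '__main__':
    # 'run:app' follows the same <module name>:<app object> convention as the command line
    uvicorn.run('run:app', host='0.0.0.0', port=5000)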

3. Video Demo

Video: 230920-部署Gradio到已有FastAPI及服务器中 (Deploying Gradio into an Existing FastAPI App and Server)

4. References

  • Gradio docs: Mounting within another FastAPI app (mounting-within-another-fast-api-app)
  • GitHub: RajKKapadia/YouTube-Gradio-Deploy-Demo
  • YouTube: How to deploy Gradio application on Server | Render | Gradio | Python
