Paddle Serving cannot be installed natively on FreeBSD, so we install it inside Docker instead.
Paddle Serving project page: GitHub - PaddlePaddle/Serving: A flexible, high-performance serving framework for machine learning models
Wheel package downloads: Serving/doc/Latest_Packages_CN.md at develop · PaddlePaddle/Serving · GitHub
Quite a few problems surface once you actually start installing; they are dealt with at the end.
Start Docker on FreeBSD
Start boot2docker.
Search for the paddle serving image
docker search paddle
paddlepaddle/serving
Pull the image
docker pull paddlepaddle/serving
If the pull fails, use the Baidu registry mirror instead:
docker pull registry.baidubce.com/paddlepaddle/serving
Create a container and enter the Linux environment
docker run -p 9292:9292 --name test -dit registry.baidubce.com/paddlepaddle/serving:0.7.0-devel
docker exec -it test bash
Install Paddle Serving
Whether inside the Docker container on FreeBSD or on an ordinary Linux machine, the Paddle Serving installation steps are the same.
Installation
Installing paddle-serving-server with pip takes some care; dependencies such as grpcio are tricky to build.
Install the Paddle Serving server
To keep the image small, the Serving package is not pre-installed in it; install it with:
pip install paddle-serving-server -i https://pypi.tuna.tsinghua.edu.cn/simple
Install the client
pip install paddle-serving-client -i https://mirror.baidu.com/pypi/simple
Install the app components
pip install paddle-serving-app -i https://mirror.baidu.com/pypi/simple
Test the inference service
First clone the Paddle Serving repository:
git clone https://github.com/PaddlePaddle/Serving
Start the housing-price prediction service
cd ~/Serving/examples/C++/fit_a_line
sh get_data.sh
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9393 --gpu_id 0
Call the service from a client (the same curl works from a remote machine if you replace 0.0.0.0 with the server's address):
curl -XPOST http://0.0.0.0:9393/GeneralModelService/inference -d ' {"tensor":[{"float_data":[0.0137,-0.1136,0.2553,-0.0692,0.0582,-0.0727,-0.1583,-0.0584,0.6283,0.4919,0.1856,0.0795,-0.0332],"elem_type":1,"name":"x","alias_name":"x","shape":[1,13]}],"fetch_var_names":["price"],"log_id":0}'
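The curl call above can also be issued from Python. Below is a minimal sketch using only the standard library; it assumes the fit_a_line server from the previous step is listening locally on port 9393, and simply reproduces the same JSON body.

```python
import json
from urllib import request

# The 13 normalized UCI housing features, copied from the curl example above.
features = [0.0137, -0.1136, 0.2553, -0.0692, 0.0582, -0.0727, -0.1583,
            -0.0584, 0.6283, 0.4919, 0.1856, 0.0795, -0.0332]

# Request body in the shape the example server's HTTP endpoint expects.
payload = {
    "tensor": [{
        "float_data": features,
        "elem_type": 1,            # float type code used in the curl example
        "name": "x",
        "alias_name": "x",
        "shape": [1, 13],
    }],
    "fetch_var_names": ["price"],
    "log_id": 0,
}

def predict(host="http://127.0.0.1:9393"):
    """POST the payload to the (assumed local) fit_a_line server."""
    req = request.Request(
        host + "/GeneralModelService/inference",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Calling `predict()` returns the parsed JSON response, which should contain the fetched `price` tensor.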
Deploying an online inference service: the full workflow
The "full" workflow still consists of the same three steps: obtain a model that can be deployed as an online service, start the server, and query the server from a client. Here each step is covered in more detail, and some of them require writing a small amount of code. For continuity, the last two steps (starting the server and querying it from a client) are presented together.
1. Obtain a model that can be deployed as an online service
An example invocation of the model-conversion tool:
python -m paddle_serving_client.convert --dirname $MODEL_DIR --model_filename $MODEL_FILENAME --params_filename $PARAMS_FILENAME --serving_server $SERVING_SERVER_DIR --serving_client $SERVING_CLIENT_DIR
The parameters are as follows:
- dirname (str) – Path to the model files to convert; both the Program structure file and the parameter files live in this directory.
- serving_server (str, optional) – Output path for the converted model and server-side configuration files. Default: serving_server.
- serving_client (str, optional) – Output path for the converted client-side configuration files. Default: serving_client.
- model_filename (str, optional) – Name of the file storing the Inference Program structure of the model to convert. If set to None, the file name model is used. Default: None.
- params_filename (str, optional) – Name of the file storing all parameters of the model to convert. It needs to be specified if and only if all parameters are saved in a single binary file; if the parameters are stored in separate files, set it to None. Default: None.
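The parameter rules above (optional flags, single-binary-file condition) can be made explicit with a small helper that assembles the command line. This is a hypothetical helper for illustration, not part of Paddle Serving itself.

```python
def build_convert_cmd(dirname,
                      model_filename=None,
                      params_filename=None,
                      serving_server="serving_server",
                      serving_client="serving_client"):
    """Assemble the paddle_serving_client.convert command as an argv list.

    model_filename / params_filename are only passed through when set,
    mirroring the None defaults described in the parameter list above.
    """
    cmd = ["python", "-m", "paddle_serving_client.convert",
           "--dirname", dirname,
           "--serving_server", serving_server,
           "--serving_client", serving_client]
    if model_filename is not None:
        cmd += ["--model_filename", model_filename]
    if params_filename is not None:
        # Only needed when all parameters live in one binary file.
        cmd += ["--params_filename", params_filename]
    return cmd
```

For example, `build_convert_cmd("inference_model", model_filename="model.pdmodel", params_filename="params.pdiparams")` produces the argv list you could hand to `subprocess.run`.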
Example
Start the server
Run the following in terminal 1 to start the server:
cd ~/Serving/examples/C++/PaddleDetection/faster_rcnn_r50_fpn_1x_coco
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/pddet_demo/2.0/faster_rcnn_r50_fpn_1x_coco.tar
tar xf faster_rcnn_r50_fpn_1x_coco.tar
# GPU variant (only with a GPU-enabled image):
# python -m paddle_serving_server.serve --model serving_server --port 9494 --gpu_id 0
python -m paddle_serving_server.serve --model serving_server --port 9494
2. Start the client
Run the client in terminal 2:
cd ~/Serving/examples/C++/PaddleDetection/faster_rcnn_r50_fpn_1x_coco
python test_client.py 000000570688.jpg
Troubleshooting
Error: distutils.errors.CompileError: command '/usr/bin/gcc' failed with exit code 1
File "/tmp/pip-install-brtc79ep/grpcio_d048c90f8c614ad4809a63239d7761a0/src/python/grpcio/commands.py", line 247, in new_compile
return old_compile(obj, src, ext, cc_args, extra_postargs,
File "/home/linuxskywalk/py310/lib/python3.10/distutils/unixccompiler.py", line 120, in _compile
raise CompileError(msg)
distutils.errors.CompileError: command '/usr/bin/gcc' failed with exit code 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for grpcio
Running setup.py clean for grpcio
Building wheel for grpcio-tools (setup.py) ...
Compiling grpcio from source is a real hurdle. Try upgrading to prebuilt wheels first:
pip install grpcio grpcio-tools --upgrade --user
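After the pip command above, a quick sanity check helps, because the import names differ from the package names (grpcio imports as grpc, grpcio-tools as grpc_tools). A small sketch using only the standard library:

```python
import importlib.util

def installed(module_name):
    """Return True if `module_name` can be imported in this environment."""
    return importlib.util.find_spec(module_name) is not None

# Map pip package name -> importable module name.
for pkg, module in [("grpcio", "grpc"), ("grpcio-tools", "grpc_tools")]:
    status = "OK" if installed(module) else "MISSING"
    print(f"{pkg:14s} (import {module}): {status}")
```

If either line prints MISSING, the wheel build failed silently and the pip install needs another look.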
Error: required to install pyproject.toml-based projects
Running cd Serving/python && pip install -r requirements.txt fails with: Failed to build av ERROR: Could not build wheels for av, which is required to install pyproject.toml-based projects
Installing this package got past it: pip install pyproject
Error: Could not build wheels for av
The key line buried in the output is: pkg-config is required for building PyAV
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for av
Failed to build av
ERROR: Could not build wheels for av, which is required to install pyproject.toml-based projects
Installing av or pyav separately does not help either; this step is still blocked for now.
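Since the PyAV build log complains that pkg-config is missing, it is worth confirming whether the binary is even on PATH before retrying the install. A minimal check, assuming nothing beyond the standard library:

```python
import shutil

def have_pkg_config():
    """Return True if the pkg-config binary is on PATH."""
    return shutil.which("pkg-config") is not None

if have_pkg_config():
    print("pkg-config found - the av build failure lies elsewhere")
else:
    # On Debian/Ubuntu-based images: apt-get install pkg-config
    print("pkg-config missing - install it and retry `pip install av`")
```

If pkg-config turns out to be present and av still fails, the problem is likely missing FFmpeg development headers rather than pkg-config itself.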