Running a Scrapy crawler on a schedule with Docker.

When deployed on Rancher, the container would exit as soon as its one-shot command finished. Rather than bothering with a proper process supervisor, I just keep the container alive by appending CMD ["tail", "-f", "/dev/null"] at the end.

Dockerfile:

FROM ubuntu:latest
RUN apt-get update \
  && apt-get install -y python3-pip python3-dev \
  && cd /usr/local/bin \
  && ln -s /usr/bin/python3 python \
  && pip3 install --upgrade pip
RUN apt-get install -y cron vim
COPY . /code
WORKDIR /code
RUN pip3 install -r requirements.txt
RUN chmod +x ./start.sh
#ENTRYPOINT ["python3"]
ENTRYPOINT ["sh", "crontab.sh"]
CMD ["tail", "-f", "/dev/null"]
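
With the three files in the project root, the image can be built and started roughly like this (the image and container names below are illustrative, not from the original post):

docker build -t scrapy-cron .
docker run -d --name scrapy-cron scrapy-cron

The CMD ["tail", "-f", "/dev/null"] is passed through to exec "$@" at the end of crontab.sh, so the container stays in the Running state while cron fires the crawl once a day.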

crontab.sh:

#!/bin/bash
set -e
echo "59 23 * * * root cd /code && ./start.sh" >> /etc/crontab
service cron restart
exec "$@"
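
One pitfall worth knowing: cron runs jobs with a minimal environment (PATH is typically just /usr/bin:/bin), so /etc/crontab entries should use absolute paths or set PATH explicitly. The append step above can be sketched against a temporary file instead of /etc/crontab, so it runs anywhere (the temp-file path is illustrative):

```shell
# Sketch of the crontab.sh append step, pointed at a temp file.
CRON_FILE="$(mktemp)"
# System crontab format: minute hour day month weekday user command.
echo "59 23 * * * root cd /code && ./start.sh" >> "$CRON_FILE"
grep -c "start.sh" "$CRON_FILE"   # one matching entry
```

Note that /etc/crontab (unlike a per-user crontab) takes a user field ("root" here) between the schedule and the command.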

start.sh:

#!/bin/bash
# Append both stdout and stderr to the log so crawl errors are captured too.
scrapy crawl infospider >> start.log 2>&1
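
Adding 2>&1 to the redirection is worth it because Scrapy tracebacks go to stderr and would otherwise vanish. The pattern can be checked in isolation (the log path and messages here are illustrative):

```shell
# Capture both stdout and stderr in one log file, as start.sh should.
LOG="$(mktemp)"
{ echo "crawl ok"; echo "crawl error" >&2; } >> "$LOG" 2>&1
grep -c "crawl" "$LOG"   # both lines landed in the log
```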
