
Celery Asynchronous and Scheduled Tasks with Redis on Windows 11 (Python 3.8)

Contents

I. Theory
  1. Celery
II. Experiments
  1. Installing Redis on Windows 11
  2. Configuring Celery in a Python 3.8 environment
  3. Asynchronous execution with a multi-directory Celery layout
  4. Scheduled tasks with a simple Celery layout
III. Problems
  1. Celery command errors
  2. Error when running the Celery command
  3. ValueError when starting Celery on Windows 11
  4. PyCharm cannot import a .py file or custom module from the same directory

I. Theory

1. Celery

(1) Concept

Celery is a distributed system written in Python: a simple, flexible, and reliable asynchronous task queue focused on real-time processing of large volumes of messages, with support for task scheduling as well.

(2) Architecture

Celery's architecture has three parts: the message broker, the task execution units (workers), and the task result store.

1) Message broker: Celery does not provide a message service itself, but it integrates easily with third-party brokers such as RabbitMQ and Redis.
2) Task execution unit: the worker is Celery's unit of task execution; workers run concurrently across the nodes of a distributed system.
3) Task result store: holds the results of tasks executed by workers. Celery supports several result backends, including AMQP and Redis.

(3) Features

1) Simple: Celery is easy to use and maintain, requires no configuration files, and is straightforward to set up.
2) Highly available: if a task fails or the connection drops mid-execution, Celery automatically retries the task.
3) Fast: a single Celery process can handle millions of tasks per minute while keeping round-trip latency at the sub-millisecond level.
4) Flexible: almost every part of Celery can be extended or used on its own, and each part can be customized.

(4) Use cases

Celery is a powerful distributed task queue and asynchronous processing framework. It decouples task execution entirely from the main program, and tasks can even be dispatched to other hosts. It is typically used for asynchronous tasks (async task) and scheduled tasks (crontab).

1) Asynchronous tasks: hand time-consuming operations to Celery to run asynchronously, e.g. sending SMS/email, push notifications, audio/video processing.
2) Scheduled tasks: run something on a schedule, e.g. daily statistics.

II. Experiments

1. Installing Redis on Windows 11

(1) Download the latest Redis-x64-xxx.zip archive to the D drive, extract it, and rename the folder to Redis.

(2) List the directory:

```
D:\Redis> dir
```

(3) Open a cmd window, cd to D:\Redis, and run:

```
redis-server.exe redis.windows.conf
```

(4) Add the Redis path to the system PATH environment variable.

(5) Open a second cmd window (keep the first one open — it is running the Redis server):

```
# switch to the redis directory, then run
redis-cli.exe -h 127.0.0.1 -p 6379
```

(6) Verify the connection:

```
# set a key-value pair
set firstKey 123

# read it back
get firstKey

# quit
exit
```

The Redis database returns the value as expected.

(7) Press Ctrl+C to stop the server started earlier.

(8) Register Redis as a Windows service. From a cmd window in the Redis install directory, run:

```
redis-server.exe --service-install redis.windows.conf --loglevel verbose
```

(9) Start the Redis service:

```
redis-server --service-start
```

(10) Redis now appears among the Windows services.

(11) Open the Redis service and set its startup type to Automatic so it starts on boot.

2. Configuring Celery in a Python 3.8 environment

(1) Install celery and redis in PyCharm.

Celery is a classic producer/consumer pattern: producers create tasks and put them on a queue, and consumers take tasks off the queue and execute them. It is mostly used for asynchronous or scheduled tasks.

```
# option 1
pip install celery
pip install redis

# option 2: via the Douban mirror
pip install -i https://pypi.douban.com/simple celery
pip install -i https://pypi.douban.com/simple redis
```
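Before wiring in a real broker, the producer/consumer flow that Celery implements can be sketched with the standard library alone. This is a toy illustration of the pattern, not Celery's actual machinery: the `queue.Queue` stands in for the broker (Redis) and the dict for the result backend.

```python
import queue
import threading

task_queue = queue.Queue()  # stands in for the broker (e.g. Redis)
results = {}                # stands in for the result backend

def worker():
    # the "worker" consumes tasks from the queue and stores their results
    while True:
        task_id, func, args = task_queue.get()
        results[task_id] = func(*args)
        task_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

# the "producer" enqueues a task and returns immediately
task_queue.put(('task-1', lambda name: 'sent to %s' % name, ('david',)))
task_queue.join()  # wait until the worker has processed the queue
print(results['task-1'])  # sent to david
```

In real Celery the producer and worker are separate processes (often separate hosts), and the broker is what lets them communicate.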
(2) Create the asynchronous task file celery_task.py; this effectively registers the Celery app:

```python
# -*- coding: utf-8 -*-
from celery import Celery
import time

app = Celery('demo',
             backend='redis://localhost:6379/1',
             broker='redis://localhost:6379/2')

@app.task
def send_email(name):
    print('Sending email to %s...' % name)
    time.sleep(5)
    print('Email to %s done' % name)
    return 'ok'
```

(3) Start a worker in the project directory to consume tasks:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 17:26:39
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         test:0x1e6fa358550
- ** ---------- .> transport:   redis://127.0.0.1:6379/2
- ** ---------- .> results:     redis://127.0.0.1:6379/1
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email

[2023-11-22 17:26:39,265: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine
[2023-11-22 20:30:08,249: INFO/MainProcess] mingle: searching for neighbors
[2023-11-22 20:30:15,379: INFO/MainProcess] mingle: all alone
[2023-11-22 20:30:25,608: INFO/MainProcess] celery@node1 ready.
```

(4) Press Ctrl+C to exit.

(5) Modify celery_task.py to add a second task:

```python
# -*- coding: utf-8 -*-
from celery import Celery
import time

app = Celery('demo',
             backend='redis://localhost:6379/1',
             broker='redis://localhost:6379/2')

@app.task
def send_email(name):
    print('Sending email to %s...' % name)
    time.sleep(5)
    print('Email to %s done' % name)
    return 'ok'

@app.task
def send_msg(name):
    print('Sending SMS to %s...' % name)
    time.sleep(5)
    print('Email to %s done' % name)  # note the message still says "Email" in this task
    return 'ok'
```
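Conceptually, @app.task is an ordinary Python decorator that attaches queueing helpers such as .delay to the function. A toy stand-in shows the shape of the API (not Celery's real implementation, which serializes the call and publishes it to the broker instead of running it inline):

```python
import functools

def task(func):
    """Toy stand-in for Celery's @app.task decorator."""
    @functools.wraps(func)
    def delay(*args, **kwargs):
        # real Celery would serialize the call and hand it to the broker;
        # here the call simply runs inline to show the API shape
        return func(*args, **kwargs)
    func.delay = delay
    return func

@task
def send_email(name):
    return 'ok:%s' % name

print(send_email.delay('david'))  # ok:david
```

This is why a Celery task remains callable as a plain function (send_email('david')) while send_email.delay('david') is the queued form.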
(6) Start the worker again from the project directory:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 21:01:43
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         demo:0x29cea446250
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 32 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email
  . celerypro.celery_task.send_msg

[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-23] child process 23988 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-17] child process 16184 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-21] child process 22444 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-27] child process 29480 calling self.run()
[2023-11-22 21:01:43,612: INFO/SpawnPoolWorker-24] child process 5844 calling self.run()
[2023-11-22 21:01:43,631: INFO/SpawnPoolWorker-25] child process 8896 calling self.run()
[2023-11-22 21:01:43,634: INFO/SpawnPoolWorker-29] child process 28068 calling self.run()
[2023-11-22 21:01:43,634: INFO/SpawnPoolWorker-28] child process 18952 calling self.run()
[2023-11-22 21:01:43,636: INFO/SpawnPoolWorker-26] child process 13680 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-31] child process 25472 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-30] child process 28688 calling self.run()
[2023-11-22 21:01:43,638: INFO/SpawnPoolWorker-32] child process 10072 calling self.run()
[2023-11-22 21:01:45,401: INFO/MainProcess] Connected to redis://localhost:6379/2
```
```
[2023-11-22 21:01:45,401: WARNING/MainProcess] d:\soft\python38\lib\site-packages\celery\worker\consumer\consumer.py:507: CPendingDeprecationWarning: The broker_connection_retry configuration setting will no longer determine whether broker connection retries are made during startup in Celery 6.0 and above. If you wish to retain the existing behavior for retrying connections on startup, you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 21:01:49,477: INFO/MainProcess] mingle: searching for neighbors
[2023-11-22 21:01:56,607: INFO/MainProcess] mingle: all alone
[2023-11-22 21:02:04,753: INFO/MainProcess] celery@node1 ready.
```

(6) Press Ctrl+C to exit, then create the producer file produce_task.py:

```python
# -*- coding: utf-8 -*-
from celerypro.celery_task import send_email, send_msg

result = send_email.delay('david')
print(result.id)
result2 = send_msg.delay('mao')
print(result2.id)
```

(7) Run produce_task.py.

(8) Both task ids are printed.
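The ids printed by .delay() (such as de30d70b-9110-4dfb-bcfd-45a61403357f) are UUID strings that Celery generates per task. The same shape can be produced with the standard library (a sketch; the specific format below is plain UUID4):

```python
import uuid

# Celery assigns each task a UUID like the ones printed above
task_id = str(uuid.uuid4())
print(task_id)
print(len(task_id))  # 36: 32 hex digits plus 4 hyphens
```

The id is what AsyncResult later uses to look the task's state and return value up in the result backend.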
(9) If you hit an error here, install the eventlet package:

```
PS D:\soft\pythonProject> pip install eventlet
```

(10) Start the worker again from the project directory, this time with the eventlet pool:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet

 -------------- celery@node1 v5.3.5 (emerald-rush)
--- ***** -----
-- ******* ---- Windows-10-10.0.22621-SP0 2023-11-22 21:29:34
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         demo:0x141511962e0
- ** ---------- .> transport:   redis://localhost:6379/2
- ** ---------- .> results:     redis://localhost:6379/1
- *** --- * --- .> concurrency: 32 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery

[tasks]
  . celerypro.celery_task.send_email
  . celerypro.celery_task.send_msg

broker_connection_retry configuration setting will no longer determine whether broker connection retries are made during startup in Celery 6.0 and above. If you wish to retain the existing behavior for retrying connections on startup, you should set broker_connection_retry_on_startup to True.
  warnings.warn(
[2023-11-22 21:29:48,022: INFO/MainProcess] pidbox: Connected to redis://localhost:6379/2.
[2023-11-22 21:29:52,117: INFO/MainProcess] celery@node1 ready.
```

(11) Run produce_task.py.

(12) Task ids are generated.

(13) Inspect the task messages in the worker output:

```
[2023-11-22 21:30:35,194: INFO/MainProcess] Task celerypro.celery_task.send_email[c1a473d5-49ac-4468-9370-19226f377e00] received
[2023-11-22 21:30:35,195: WARNING/MainProcess] Sending email to david...
[2023-11-22 21:30:35,197: INFO/MainProcess] Task celerypro.celery_task.send_msg[de30d70b-9110-4dfb-bcfd-45a61403357f] received
[2023-11-22 21:30:35,198: WARNING/MainProcess] Sending SMS to mao...
[2023-11-22 21:30:40,210: WARNING/MainProcess] Email to david done
[2023-11-22 21:30:40,210: WARNING/MainProcess] Email to mao done
[2023-11-22 21:30:42,270: INFO/MainProcess] Task celerypro.celery_task.send_msg[de30d70b-9110-4dfb-bcfd-45a61403357f] succeeded in 7.063000000001921s: 'ok'
[2023-11-22 21:30:42,270: INFO/MainProcess] Task celerypro.celery_task.send_email[c1a473d5-49ac-4468-9370-19226f377e00] succeeded in 7.063000000001921s: 'ok'
```

(14) Create result.py to check a task's execution result, using the second id, de30d70b-9110-4dfb-bcfd-45a61403357f:

```python
# -*- coding: utf-8 -*-
from celery.result import AsyncResult
from celerypro.celery_task import app

async_result = AsyncResult(id='de30d70b-9110-4dfb-bcfd-45a61403357f', app=app)

if async_result.successful():
    result = async_result.get()
    print(result)
elif async_result.failed():
    print('task failed')
elif async_result.status == 'PENDING':
    print('task waiting to be executed')
elif async_result.status == 'RETRY':
    print('task is being retried after an error')
elif async_result.status == 'STARTED':
    print('task has started executing')
```

(15) Run result.py.

(16) It prints ok.

(17) Check the last two tasks in a Redis GUI client.

3. Asynchronous execution with a multi-directory Celery layout

(1) Original directory structure.

(2) Optimized directory structure.

(3) The consumer, celery_main.py:

```python
# -*- coding: utf-8 -*-
# consumer
from celery import Celery

app = Celery('celery_demo',
             backend='redis://localhost:6379/1',
             broker='redis://localhost:6379/2',
             include=['celery_tasks.task01', 'celery_tasks.task02'])

# time zone
app.conf.timezone = 'Asia/Shanghai'
# whether to use UTC
app.conf.enable_utc = False
```
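With enable_utc turned off and timezone set to 'Asia/Shanghai', naive datetimes are interpreted in local Chinese time. The offset in play can be checked with the standard library (a sketch; zoneinfo is standard from Python 3.9, so the article's Python 3.8 would need the backports.zoneinfo package for the same API):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; backports.zoneinfo on 3.8

shanghai = ZoneInfo('Asia/Shanghai')
now = datetime.now(tz=shanghai)
print(now.utcoffset())  # 8:00:00 — China Standard Time, no DST
```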
(2) Task one, task01.py:

```python
# -*- coding: utf-8 -*-
# task01
import time
from celery_tasks.celery_main import app

@app.task
def send_email(res):
    print('Finished sending email task to %s' % res)
    time.sleep(5)
    return 'email done'
```

(3) Task two, task02.py:

```python
# -*- coding: utf-8 -*-
# task02
import time
from celery_tasks.celery_main import app

@app.task
def send_msg(name):
    print('Finished sending SMS task to %s' % name)
    time.sleep(5)
    return 'SMS done'
```

(4) Start a worker in the project directory:

```
PS D:\soft\pythonProject\CeleryMTask> celery --app=celery_tasks.celery_main worker -n node1 -l INFO -P eventlet
```

(5) The producer, produce_task.py:

```python
# -*- coding: utf-8 -*-
# producer
from celery_tasks.task01 import send_email
from celery_tasks.task02 import send_msg

result = send_email.delay('jack')
print(result.id)
result2 = send_msg.delay('alice')
print(result2.id)
```

(5) Run the producer, produce_task.py.

(6) The asynchronous result check, check_result.py:

```python
# -*- coding: utf-8 -*-
# async result check
from celery.result import AsyncResult
from celery_tasks.celery_main import app

async_result = AsyncResult(id='bb906153-822a-4bd5-aacd-2f2172d3003a', app=app)

if async_result.successful():
    result = async_result.get()
    print(result)
elif async_result.failed():
    print('task failed')
elif async_result.status == 'PENDING':
    print('task waiting to be executed')
elif async_result.status == 'RETRY':
    print('task is being retried after an error')
elif async_result.status == 'STARTED':
    print('task has started executing')
```

(7) Run check_result.py.

(8) Check the run results and the Terminal output.

4. Scheduled tasks with a simple Celery layout

(1) This builds on the simple celerypro structure; the source directory layout is as before.

(2) Have Celery run a task at a set time; create time.py:

```python
# -*- coding: utf-8 -*-
from celerypro.celery_task import send_email
from datetime import datetime

v1 = datetime(2023, 11, 24, 20, 7, 00)
print(v1)

# convert the timestamp to UTC with utcfromtimestamp
v2 = datetime.utcfromtimestamp(v1.timestamp())
print(v2)

# use delay() for plain async calls; for scheduled calls use apply_async,
# which accepts more parameters: args takes a list, eta takes a datetime
result = send_email.apply_async(args=['leon'], eta=v2)
print(result.id)
```

(3) Start a worker in the project directory:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet
```

(4) Run time.py.

(5) Check the Run output.

(6) Wait until the scheduled time and check the Terminal.
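The eta computation above re-expresses the intended local send time in UTC by round-tripping through a POSIX timestamp. A quick sketch (reusing the names from time.py) checks that v1 and v2 denote the same instant:

```python
from datetime import datetime, timezone

v1 = datetime(2023, 11, 24, 20, 7, 0)           # intended local send time (naive)
v2 = datetime.utcfromtimestamp(v1.timestamp())  # same instant, expressed in UTC (naive)

# the two naive datetimes differ by the machine's local UTC offset;
# on a machine in China (UTC+8) this prints 8:00:00
print(v1 - v2)

# same instant: attaching UTC to v2 recovers v1's timestamp exactly
print(v2.replace(tzinfo=timezone.utc).timestamp() == v1.timestamp())  # True
```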
(7) Alternatively, schedule the task with a time offset; modify time.py:

```python
# -*- coding: utf-8 -*-
from celerypro.celery_task import send_email
from datetime import datetime
from datetime import timedelta

# current time (naive local time)
ctime = datetime.now()
# convert to UTC via the timestamp (utcfromtimestamp)
utc_ctime = datetime.utcfromtimestamp(ctime.timestamp())
# the offset
time_delay = timedelta(seconds=10)
task_time = utc_ctime + time_delay

# use apply_async with the computed eta
result = send_email.apply_async(args=['bale'], eta=task_time)
print(result.id)
```

(8) Run time.py.

(9) Check the Run output.

(10) Wait 10 seconds and check the Terminal.

(11) Check the last two tasks in a Redis GUI client.

III. Problems

1. Celery command errors

(1) The command reports an error.

(2) Analysis: the command syntax differs between Celery versions. Check the help:

```
PS D:\soft\pythonProject> celery --help
Usage: celery [OPTIONS] COMMAND [ARGS]...

  Celery command entrypoint.

Options:
  -A, --app APPLICATION
  -b, --broker TEXT
  --result-backend TEXT
  --loader TEXT
  --config TEXT
  --workdir PATH
  -C, --no-color
  -q, --quiet
  --version
  --skip-checks          Skip Django core checks on startup. Setting the
                         SKIP_CHECKS environment variable to any non-empty
                         string will have the same effect.
  --help                 Show this message and exit.

Commands:
  amqp     AMQP Administration Shell.
  beat     Start the beat periodic task scheduler.
  call     Call a task by name.
  control  Workers remote control.
  events   Event-stream utilities.
  graph    The celery graph command.
  inspect  Inspect the worker at runtime.
  list     Get info from broker.
  logtool  The celery logtool command.
  migrate  Migrate tasks from one broker to another.
  multi    Start multiple worker instances.
  purge    Erase all messages from all known task queues.
  report   Shows information useful to include in bug-reports.
  result   Print the return value for a given task id.
  shell    Start shell session with convenient access to celery symbols.
  status   Show list of workers that are online.
  upgrade  Perform upgrade between versions.
  worker   Start worker instance.

PS D:\soft\pythonProject> celery worker --help
Usage: celery worker [OPTIONS]

  Start worker instance.

  Examples
  --------

  $ celery --app=proj worker -l INFO
  $ celery -A proj worker -l INFO -Q hipri,lopri
  $ celery -A proj worker --concurrency=4
  $ celery -A proj worker --concurrency=1000 -P eventlet
  $ celery worker --autoscale=10,0
```
```
Worker Options:
  -n, --hostname HOSTNAME         Set custom hostname (e.g., 'w1@%%h').
                                  Expands: %%h (hostname), %%n (name) and %%d,
                                  (domain).
  -D, --detach                    Start worker as a background process.
  -S, --statedb PATH              Path to the state database. The extension
                                  '.db' may be appended to the filename.
  -l, --loglevel [DEBUG|INFO|WARNING|ERROR|CRITICAL|FATAL]
                                  Logging level.
  -O, --optimization [default|fair]
                                  Apply optimization profile.
  --prefetch-multiplier <prefetch multiplier>
                                  Set custom prefetch multiplier value for
                                  this worker instance.

Pool Options:
  -c, --concurrency <concurrency>
                                  Number of child processes processing the
                                  queue. The default is the number of CPUs
                                  available on your system.
  -P, --pool [prefork|eventlet|gevent|solo|processes|threads|custom]
                                  Pool implementation.
  -E, --task-events, --events     Send task-related events that can be
                                  captured by monitors like celery events,
                                  celerymon, and others.
  --time-limit FLOAT              Enables a hard time limit (in seconds
                                  int/float) for tasks.
  --soft-time-limit FLOAT         Enables a soft time limit (in seconds
                                  int/float) for tasks.
  --max-tasks-per-child INTEGER   Maximum number of tasks a pool worker can
                                  execute before it's terminated and replaced
                                  by a new worker.
```
```
  --max-memory-per-child INTEGER  Maximum amount of resident memory, in KiB,
                                  that may be consumed by a child process
                                  before it will be replaced by a new one. If
                                  a single task causes a child process to
                                  exceed this limit, the task will be
                                  completed and the child process will be
                                  replaced afterwards. Default: no limit.
  --scheduler TEXT

Daemonization Options:
  -f, --logfile TEXT  Log destination; defaults to stderr
  --pidfile TEXT
  --uid TEXT
  --gid TEXT
  --umask TEXT
  --executable TEXT

Options:
  --help  Show this message and exit.
```

(3) Fix: adjust the command to the new syntax:

```
PS D:\soft\pythonProject> celery --app=celerypro.celery_task worker -n node1 -l INFO
```

It now works.

2. Error when running the Celery command

(1) The error:

```
AttributeError: 'NoneType' object has no attribute 'Redis'
```

(2) Analysis: the redis package is not installed in the PyCharm interpreter environment.

(3) Fix: install the redis package.

3. ValueError when starting Celery on Windows 11

(1) The error: when developing Celery async tasks on Windows, the service starts normally with

```
celery --app=celerypro.celery_task worker -n node1 -l INFO
```

but calling a task with delay() raises:

```
Task handler raised error: ValueError('not enough values to unpack (expected 3, got 0)')
```

(2) Analysis: the eventlet package is not installed (the default prefork pool does not work properly on Windows).

(3) Fix: install the eventlet package:

```
pip install eventlet
```

and start the service with the eventlet pool:

```
celery --app=celerypro.celery_task worker -n node1 -l INFO -P eventlet
```

4. PyCharm cannot import a .py file or custom module from the same directory

(1) The import reports an error.

(2) Analysis: by default, PyCharm only searches for .py files under the project root; importing a .py file that lives outside the root fails.

(3) Fix: add the folder containing the .py file to the default search path.

Method 1: right-click the folder containing the .py file, then choose Mark Directory as -> Sources Root.

Method 2: treat the folder as a package: create an __init__.py file in it and add a line of the form:

```python
from [folder name].[module file name] import [name]
```
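Both fixes above amount to putting the package's parent directory on the interpreter's module search path; Sources Root does it through PyCharm, but the same effect can be had directly in code (a sketch; project_root here is a hypothetical stand-in for the directory that contains your celery_tasks/ package):

```python
import os
import sys

# hypothetical: the directory that contains the celery_tasks/ package
project_root = os.getcwd()

if project_root not in sys.path:
    sys.path.insert(0, project_root)

# `import celery_tasks.task01` would now resolve, assuming that package exists
print(project_root in sys.path)  # True
```

This is also why the worker must be started from the project directory: Celery resolves --app=celerypro.celery_task against the current working directory's place on sys.path.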
