ImportError: cannot import name Process when running on Windows

萧乐颜 2014-10-08 05:14:30
I've been learning Python recently.
On Windows I wrote this single line: from multiprocessing import Process
But when I run it, it fails with: ImportError: cannot import name Process

I've read several posts online saying that this exact line, from multiprocessing import Process, is how you do multiprocessing on Windows. So why does it still raise an error when I run it?
10 replies
zhmyi000 2016-11-27
Quoting reply #3 from angel_su:
My guess is the script has the same name as the module, so what should normally be there is missing...
Thanks for the pointer.
哈喽世界 2015-12-28
Quoting reply #3 from angel_su:
My guess is the script has the same name as the module, so what should normally be there is missing...
I didn't understand this at first, so I checked my Python version and the installed modules and found nothing wrong. When I came back and thought about what this comment really meant, I suddenly realized I had created a test file named exactly multiprocessing.py. I renamed it to test_multiprocessing.py, ran it again, and everything worked. Lesson learned!
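A quick way to confirm this kind of shadowing is to print where the module was actually loaded from; the file name mentioned in the comment below is only an example:

import multiprocessing

# If this prints a path inside your own project (e.g. your own multiprocessing.py)
# instead of a path inside the Python installation, your script is shadowing the
# standard-library module.
print(multiprocessing.__file__)

On Python 2, also delete any leftover multiprocessing.pyc sitting next to the renamed file, otherwise the stale bytecode can keep shadowing the real module.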
qq_23665387 2015-11-20
Total beginner here. If my file has the same name, what exactly should I change? Thanks, everyone!
wyf1020 2015-08-18
Quoting reply #3 from angel_su:
My guess is the script has the same name as the module, so what should normally be there is missing...
Yep, that was it.
tao01230 2015-06-01
Quoting reply #3 from angel_su:
My guess is the script has the same name as the module, so what should normally be there is missing...
Exactly right.
xxxiaoxiami 2015-01-13
Quoting reply #3 from angel_su:
My guess is the script has the same name as the module, so what should normally be there is missing...
Same situation here.
萧乐颜 2014-10-11
I ran it with 2.7 and still get the same error.
angel_su 2014-10-10
My guess is the script has the same name as the module, so what should normally be there is missing...
zhoujiping1234 2014-10-08
I'm on Windows with Python 2.7, and from multiprocessing import Process works fine. Check your Python version first, then look at your code; the problem is probably in the code.
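For reference, a minimal script that should run cleanly on Windows under both 2.7 and 3.x looks roughly like this (the worker function and its argument are just placeholders); the if __name__ == '__main__': guard matters on Windows because child processes re-import the main script:

from multiprocessing import Process

def worker(name):
    # trivial placeholder task
    print('hello from ' + name)

if __name__ == '__main__':  # required on Windows: children re-import this file
    p = Process(target=worker, args=('child',))
    p.start()
    p.join()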
dbbruce 2014-10-08
0. Your Python version: my 2.7 doesn't have this problem.
1. Make sure the multiprocessing module actually exists on your system.
2. If it does, use dir() to see what it exposes and whether Process is in there, as in the sketch below.
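A minimal sketch of that check; run it from the same folder as the failing script so you see exactly what that script would import:

import sys
import multiprocessing

print(sys.version)                         # confirm which interpreter is running
print('Process' in dir(multiprocessing))   # True if the real stdlib module was imported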
File:

import scrapy
from demo1.items import Demo1Item
import urllib
from scrapy import log

# Spider for job postings on the BOSS Zhipin site
class DemoSpider(scrapy.Spider):
    # Spider name, required when launching the crawler
    name = 'demo'
    # Allowed domains the spider may crawl (optional)
    allowed_domains = ['zhipin.com']
    # Start URL for the spider
    start_urls = ['https://www.zhipin.com/c101280600/h_101280600/?query=测试']

    def parse(self, response):
        node_list = response.xpath("//div[@class='job-primary']")
        # Used to collect all item fields
        # items = []
        for node in node_list:
            item = Demo1Item()
            # extract() converts the XPath objects to Unicode strings
            href = node.xpath("./div[@class='info-primary']//a/@href").extract()
            job_title = node.xpath("./div[@class='info-primary']//a/div[@class='job-title']/text()").extract()
            salary = node.xpath("./div[@class='info-primary']//a/span/text()").extract()
            working_place = node.xpath("./div[@class='info-primary']/p/text()").extract()
            company_name = node.xpath("./div[@class='info-company']//a/text()").extract()
            item['href'] = href[0]
            item['job_title'] = job_title[0]
            item['sa

Error:

C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\python.exe "C:\Users\xieqianyun\PyCharm Community Edition 2019.2.5\helpers\pydev\pydevconsole.py" --mode=client --port=55825
import sys; print('Python %s on %s' % (sys.version, sys.platform))
sys.path.extend(['C:\\Users\\xieqianyun\\demo1', 'C:/Users/xieqianyun/demo1'])
Python 3.6.5 (v3.6.5:f59c0932b4, Mar 28 2018, 17:00:18) [MSC v.1900 64 bit (AMD64)]
Type 'copyright', 'credits' or 'license' for more information
IPython 7.10.0 -- An enhanced Interactive Python. Type '?' for help.
PyDev console: using IPython 7.10.0
Python 3.6.5 (v3.6.5:f59c0932b4, Mar 28 2018, 17:00:18) [MSC v.1900 64 bit (AMD64)] on win32
runfile('C:/Users/xieqianyun/demo1/demo1/begin.py', wdir='C:/Users/xieqianyun/demo1/demo1')
Traceback (most recent call last):
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\IPython\core\interactiveshell.py", line 3319, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "", line 1, in <module>
    runfile('C:/Users/xieqianyun/demo1/demo1/begin.py', wdir='C:/Users/xieqianyun/demo1/demo1')
  File "C:\Users\xieqianyun\PyCharm Community Edition 2019.2.5\helpers\pydev\_pydev_bundle\pydev_umd.py", line 197, in runfile
    pydev_imports.execfile(filename, global_vars, local_vars)  # execute the script
  File "C:\Users\xieqianyun\PyCharm Community Edition 2019.2.5\helpers\pydev\_pydev_imps\_pydev_execfile.py", line 18, in execfile
    exec(compile(contents+"\n", file, 'exec'), glob, loc)
  File "C:/Users/xieqianyun/demo1/demo1/begin.py", line 3, in <module>
    cmdline.execute('scrapy crawl demo'.split())
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\cmdline.py", line 145, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\crawler.py", line 267, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\crawler.py", line 145, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\crawler.py", line 347, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\utils\misc.py", line 73, in walk_modules
    submod = import_module(fullpath)
  File "C:\Users\xieqianyun\AppData\Local\Programs\Python\Python36\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "C:\Users\xieqianyun\demo1\demo1\spiders\demo.py", line 4, in <module>
    from scrapy import log
ImportError: cannot import name 'log'
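The traceback above is another name-related import failure, this time in Scrapy: recent Scrapy releases have dropped the old scrapy.log module in favour of Python's standard logging, so from scrapy import log fails with exactly this ImportError. A minimal sketch of the usual replacement (the spider below is a stripped-down stand-in for the one above, with a placeholder start URL):

import logging
import scrapy

logger = logging.getLogger(__name__)

class DemoSpider(scrapy.Spider):
    name = 'demo'
    start_urls = ['https://www.zhipin.com/']  # placeholder start URL

    def parse(self, response):
        # Spiders also carry a built-in logger, exposed as self.logger
        self.logger.info('parsed %s', response.url)
        logger.debug('module-level logging works as well')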
