Python
isaac-21, 2021-01-22 12:53:11

How do I fix an error when running a Python script from the terminal?

If the script is run from PyCharm, everything works, but as soon as I try to run it from the terminal, it gives an error.
The script runs the Scrapy spider.

The script itself:

import os
import subprocess

# Switch into the Scrapy project directory and launch the spider
# through the "scrapy" command-line tool found on PATH.
os.chdir('brainboy')
subprocess.call(
    ['scrapy', 'crawl', 'brainboy']
)
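
One quick check (a sketch added here, not part of the original script) is to print which interpreter and which scrapy launcher the terminal session actually resolves; if they differ from the ones PyCharm uses, the subprocess will import a different set of installed packages:

import shutil
import sys

# Which Python is running this script, and which "scrapy" binary
# would subprocess.call pick up from PATH?
print(sys.executable)
print(shutil.which('scrapy'))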


The error from the terminal:
[email protected]:~/PycharmProjects/parsing_kviz$ python3 run_parsing.py 
2021-01-22 12:50:50 [scrapy.utils.log] INFO: Scrapy 1.7.3 started (bot: brainboy)
2021-01-22 12:50:50 [scrapy.utils.log] INFO: Versions: lxml 4.5.0.0, libxml2 2.9.10, cssselect 1.1.0, parsel 1.5.2, w3lib 1.21.0, Twisted 18.9.0, Python 3.8.5 (default, Jul 28 2020, 12:59:40) - [GCC 9.3.0], pyOpenSSL 19.0.0 (OpenSSL 1.1.1f  31 Mar 2020), cryptography 2.8, Platform Linux-5.8.0-38-generic-x86_64-with-glibc2.29
2021-01-22 12:50:50 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'brainboy', 'NEWSPIDER_MODULE': 'brainboy.spiders', 'SPIDER_MODULES': ['brainboy.spiders']}
2021-01-22 12:50:50 [scrapy.extensions.telnet] INFO: Telnet Password: 6ef7b251b0293985
2021-01-22 12:50:50 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.memusage.MemoryUsage',
 'scrapy.extensions.logstats.LogStats']
2021-01-22 12:50:50 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2021-01-22 12:50:50 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
Unhandled error in Deferred:
Temporarily disabling observer LegacyLogObserverWrapper(<bound method PythonLoggingObserver.emit of <twisted.python.log.PythonLoggingObserver object at 0x7f7ffc8daf40>>) due to exception: [Failure instance: Traceback: <class 'TypeError'>: _findCaller() takes from 1 to 2 positional arguments but 3 were given
/usr/lib/python3/dist-packages/twisted/internet/defer.py:953:__del__
/usr/lib/python3/dist-packages/twisted/logger/_logger.py:270:critical
/usr/lib/python3/dist-packages/twisted/logger/_logger.py:144:emit
--- <exception caught here> ---
/usr/lib/python3/dist-packages/twisted/logger/_observer.py:131:__call__
/usr/lib/python3/dist-packages/twisted/logger/_legacy.py:93:__call__
/usr/lib/python3/dist-packages/twisted/python/log.py:595:emit
/usr/lib/python3/dist-packages/twisted/logger/_legacy.py:154:publishToNewObserver
/usr/lib/python3/dist-packages/twisted/logger/_stdlib.py:115:__call__
/usr/lib/python3.8/logging/__init__.py:1500:log
/usr/lib/python3.8/logging/__init__.py:1565:_log
]
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 953, in __del__
    log.critical("Unhandled error in Deferred:",
  File "/usr/lib/python3/dist-packages/twisted/logger/_logger.py", line 270, in critical
    self.emit(LogLevel.critical, format, **kwargs)
  File "/usr/lib/python3/dist-packages/twisted/logger/_logger.py", line 144, in emit
    self.observer(event)
--- <exception caught here> ---
  File "/usr/lib/python3/dist-packages/twisted/logger/_observer.py", line 131, in __call__
    observer(event)
  File "/usr/lib/python3/dist-packages/twisted/logger/_legacy.py", line 93, in __call__
    self.legacyObserver(event)
  File "/usr/lib/python3/dist-packages/twisted/python/log.py", line 595, in emit
    _publishNew(self._newObserver, eventDict, textFromEventDict)
  File "/usr/lib/python3/dist-packages/twisted/logger/_legacy.py", line 154, in publishToNewObserver
    observer(eventDict)
  File "/usr/lib/python3/dist-packages/twisted/logger/_stdlib.py", line 115, in __call__
    self.logger.log(
  File "/usr/lib/python3.8/logging/__init__.py", line 1500, in log
    self._log(level, msg, args, **kwargs)
  File "/usr/lib/python3.8/logging/__init__.py", line 1565, in _log
    fn, lno, func, sinfo = self.findCaller(stack_info, stacklevel)
builtins.TypeError: _findCaller() takes from 1 to 2 positional arguments but 3 were given


Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/scrapy/crawler.py", line 184, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "/usr/lib/python3/dist-packages/scrapy/crawler.py", line 188, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 1613, in unwindGenerator
    return _cancellableInlineCallbacks(gen)
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 1529, in _cancellableInlineCallbacks
    _inlineCallbacks(None, g, status)
--- <exception caught here> ---
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/usr/lib/python3/dist-packages/scrapy/crawler.py", line 86, in crawl
    self.engine = self._create_engine()
  File "/usr/lib/python3/dist-packages/scrapy/crawler.py", line 111, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "/usr/lib/python3/dist-packages/scrapy/core/engine.py", line 70, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/lib/python3/dist-packages/scrapy/core/scraper.py", line 71, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/usr/lib/python3/dist-packages/scrapy/middleware.py", line 53, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/lib/python3/dist-packages/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/usr/lib/python3/dist-packages/scrapy/utils/misc.py", line 46, in load_object
    mod = import_module(module)
  File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
    
  File "<frozen importlib._bootstrap>", line 991, in _find_and_load
    
  File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
    
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
    
  File "<frozen importlib._bootstrap_external>", line 783, in exec_module
    
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
    
  File "/home/huston/PycharmProjects/parsing_kviz/brainboy/brainboy/pipelines.py", line 8, in <module>
    from itemadapter import ItemAdapter
builtins.ModuleNotFoundError: No module named 'itemadapter'

Temporarily disabling observer LegacyLogObserverWrapper(<bound method PythonLoggingObserver.emit of <twisted.python.log.PythonLoggingObserver object at 0x7f7ffc8daf40>>) due to exception: [Failure instance: Traceback: <class 'TypeError'>: _findCaller() takes from 1 to 2 positional arguments but 3 were given
/usr/lib/python3/dist-packages/twisted/internet/defer.py:962:__del__
/usr/lib/python3/dist-packages/twisted/logger/_logger.py:190:failure
/usr/lib/python3/dist-packages/twisted/logger/_logger.py:144:emit
--- <exception caught here> ---
/usr/lib/python3/dist-packages/twisted/logger/_observer.py:131:__call__
/usr/lib/python3/dist-packages/twisted/logger/_legacy.py:93:__call__
/usr/lib/python3/dist-packages/twisted/python/log.py:595:emit
/usr/lib/python3/dist-packages/twisted/logger/_legacy.py:154:publishToNewObserver
/usr/lib/python3/dist-packages/twisted/logger/_stdlib.py:115:__call__
/usr/lib/python3.8/logging/__init__.py:1500:log
/usr/lib/python3.8/logging/__init__.py:1565:_log
]
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/twisted/internet/defer.py", line 962, in __del__
    log.failure(format,
  File "/usr/lib/python3/dist-packages/twisted/logger/_logger.py", line 190, in failure
    self.emit(level, format, log_failure=failure, **kwargs)
  File "/usr/lib/python3/dist-packages/twisted/logger/_logger.py", line 144, in emit
    self.observer(event)
--- <exception caught here> ---
  File "/usr/lib/python3/dist-packages/twisted/logger/_observer.py", line 131, in __call__
    observer(event)
  File "/usr/lib/python3/dist-packages/twisted/logger/_legacy.py", line 93, in __call__
    self.legacyObserver(event)
  File "/usr/lib/python3/dist-packages/twisted/python/log.py", line 595, in emit
    _publishNew(self._newObserver, eventDict, textFromEventDict)
  File "/usr/lib/python3/dist-packages/twisted/logger/_legacy.py", line 154, in publishToNewObserver
    observer(eventDict)
  File "/usr/lib/python3/dist-packages/twisted/logger/_stdlib.py", line 115, in __call__
    self.logger.log(
  File "/usr/lib/python3.8/logging/__init__.py", line 1500, in log
    self._log(level, msg, args, **kwargs)
  File "/usr/lib/python3.8/logging/__init__.py", line 1565, in _log
    fn, lno, func, sinfo = self.findCaller(stack_info, stacklevel)
builtins.TypeError: _findCaller() takes from 1 to 2 positional arguments but 3 were given

1 answer

vascodogama, 2021-01-22
@isaac-21

Most likely the terminal runs it in a different environment (or even with a different Python interpreter) than PyCharm does.
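
To rule this out, here is a minimal sketch of a fix (assuming the PyCharm project uses a virtualenv that has Scrapy and itemadapter installed, and a Scrapy release recent enough to be run as a module via "python -m scrapy"): launch the crawl through the same interpreter that runs the script, instead of whatever "scrapy" happens to be first on PATH.

import os
import subprocess
import sys

os.chdir('brainboy')
# Running Scrapy as a module of sys.executable keeps the crawl in the same
# environment as this script, so starting the script with the project's
# interpreter (e.g. venv/bin/python3 run_parsing.py) also fixes the subprocess.
subprocess.call([sys.executable, '-m', 'scrapy', 'crawl', 'brainboy'])

Alternatively, installing the missing package for the interpreter the terminal uses (python3 -m pip install itemadapter) removes the mismatch without changing the script.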
