Scrapy template failure can pass tests #309

Open
Pijukatel opened this issue Jan 14, 2025 · 0 comments
Labels
t-tooling Issues with this label are in the ownership of the tooling team.

Comments


Pijukatel commented Jan 14, 2025

In a situation where the Scrapy template explodes and produces no results, the template test still passes. This is because the test framework checks only the return code of the template script, and the Scrapy code from the template swallows all exceptions, so even in the case of such a critical failure the template script returns status code 0.

Adapt the test (or the template?) to make it fail in such scenarios.
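
Rough sketch of what a test-side check could look like (the helper name and dataset path below are placeholders, not the real test framework's API): instead of trusting the return code alone, also require that the run wrote at least one item to the default dataset.

import json
from pathlib import Path

# Placeholder path; the real test framework's storage layout may differ.
DEFAULT_DATASET_DIR = Path('storage/datasets/default')

def assert_run_produced_items() -> None:
    # Do not trust exit code 0 alone: require at least one scraped item on disk.
    items = [json.loads(f.read_text()) for f in DEFAULT_DATASET_DIR.glob('*.json')]
    assert items, 'Template script exited with 0 but produced no dataset items.'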

Error example:

Unhandled error in Deferred:
[twisted] CRITICAL Unhandled error in Deferred:

Traceback (most recent call last):
  File "C:\Users\RUNNER~1\AppData\Local\Temp\python-scrapyz9IcFM\.venv\lib\site-packages\twisted\internet\defer.py", line 2017, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "C:\Users\RUNNER~1\AppData\Local\Temp\python-scrapyz9IcFM\.venv\lib\site-packages\scrapy\crawler.py", line 154, in crawl
    yield self.engine.open_spider(self.spider, start_requests)
  File "C:\Users\RUNNER~1\AppData\Local\Temp\python-scrapyz9IcFM\.venv\lib\site-packages\twisted\internet\defer.py", line 2017, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "C:\Users\RUNNER~1\AppData\Local\Temp\python-scrapyz9IcFM\.venv\lib\site-packages\scrapy\core\engine.py", line 393, in open_spider
    if d := scheduler.open(spider):
  File "C:\Users\RUNNER~1\AppData\Local\Temp\python-scrapyz9IcFM\.venv\lib\site-packages\apify\scrapy\scheduler.py", line 59, in open
    self._rq = nested_event_loop.run_until_complete(open_queue())
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\asyncio\base_events.py", line 623, in run_until_complete
    self._check_running()
  File "C:\hostedtoolcache\windows\Python\3.9.13\x64\lib\asyncio\base_events.py", line 585, in _check_running
    raise RuntimeError(
builtins.RuntimeError: Cannot run the event loop while another loop is running
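
For context, this RuntimeError is standard asyncio behaviour: run_until_complete() refuses to drive a loop while another loop is already running in the same thread (here, the scheduler's nested_event_loop inside the asyncio-based reactor). A minimal, self-contained illustration of the same failure, not the template code itself:

import asyncio

async def main() -> None:
    # Driving a second loop synchronously from a thread whose own loop is
    # already running trips the same check in BaseEventLoop._check_running().
    nested_loop = asyncio.new_event_loop()
    nested_loop.run_until_complete(asyncio.sleep(0))

asyncio.run(main())
# RuntimeError: Cannot run the event loop while another loop is running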

No results example:

[scrapy.statscollectors] INFO  Dumping Scrapy stats:
{'items_per_minute': None,
 'log_count/ERROR': 2,
 'log_count/INFO': 8,
 'log_count/WARNING': 1,
 'responses_per_minute': None} 
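
On the template side, one possible direction (a sketch only; spider_cls is a placeholder and the real template wires the crawl through the Apify Actor and custom scheduler, so the exact hook would differ) is to inspect the crawler stats after the crawl and exit non-zero when errors were logged or nothing was scraped:

import sys
from scrapy.crawler import CrawlerProcess

def run_spider(spider_cls) -> None:
    process = CrawlerProcess()
    crawler = process.create_crawler(spider_cls)
    process.crawl(crawler)
    process.start()  # blocks until the crawl finishes

    stats = crawler.stats.get_stats()
    # Fail loudly instead of returning exit code 0 after a broken crawl.
    if stats.get('log_count/ERROR', 0) or not stats.get('item_scraped_count', 0):
        sys.exit(1)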
Pijukatel added the t-tooling label Jan 14, 2025