C:\Users\NF\Desktop\8888\conifers\conifers>C:\Python27\Scripts\scrapy.exe crawl conifers
2017-10-23 02:19:24 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: conifers)
2017-10-23 02:19:24 [scrapy.utils.log] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'conifers.spiders', 'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['conifers.spiders'], 'RETRY_TIMES': 10, 'BOT_NAME': 'conifers', 'RETRY_HTTP_CODES': [500, 503, 504, 400, 403, 404, 408]}
2017-10-23 02:19:24 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
2017-10-23 02:19:24 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.retry.RetryMiddleware',
 'scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware',
 'scrapy_proxies.RandomProxy',
 'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
 'scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
 'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
 'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
 'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
 'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
 'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
 'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
 'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
 'scrapy.downloadermiddlewares.stats.DownloaderStats']
2017-10-23 02:19:24 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
 'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
 'scrapy.spidermiddlewares.referer.RefererMiddleware',
 'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
 'scrapy.spidermiddlewares.depth.DepthMiddleware']
2017-10-23 02:19:24 [scrapy.middleware] INFO: Enabled item pipelines:
[]
2017-10-23 02:19:24 [scrapy.core.engine] INFO: Spider opened
2017-10-23 02:19:24 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2017-10-23 02:19:24 [scrapy.extensions.telnet] DEBUG: Telnet console listening on 127.0.0.1:6023
2017-10-23 02:19:24 [scrapy.proxies] DEBUG: Proxy user pass not found
2017-10-23 02:19:24 [scrapy.proxies] DEBUG: Using proxy <http://142.4.214.9:88>, 1 proxies left
2017-10-23 02:19:25 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://httpbin.org/robots.txt> (referer: None)
2017-10-23 02:19:25 [scrapy.proxies] DEBUG: Proxy user pass not found
2017-10-23 02:19:25 [scrapy.proxies] DEBUG: Using proxy <http://142.4.214.9:88>, 1 proxies left
2017-10-23 02:19:25 [scrapy.core.engine] DEBUG: Crawled (200) <GET https://httpbin.org/ip> (referer: None)
2017-10-23 02:19:25 [scrapy.core.engine] INFO: Closing spider (finished)
2017-10-23 02:19:25 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'downloader/request_bytes': 430,
 'downloader/request_count': 2,
 'downloader/request_method_count/GET': 2,
 'downloader/response_bytes': 571,
 'downloader/response_count': 2,
 'downloader/response_status_count/200': 2,
 'finish_reason': 'finished',
 'finish_time': datetime.datetime(2017, 10, 22, 23, 19, 25, 936000),
 'log_count/DEBUG': 7,
 'log_count/INFO': 7,
 'response_received_count': 2,
 'scheduler/dequeued': 1,
 'scheduler/dequeued/memory': 1,
 'scheduler/enqueued': 1,
 'scheduler/enqueued/memory': 1,
 'start_time': datetime.datetime(2017, 10, 22, 23, 19, 24, 897000)}
2017-10-23 02:19:25 [scrapy.core.engine] INFO: Spider closed (finished)

C:\Users\NF\Desktop\8888\conifers\conifers>
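For reference, the "Overridden settings" line and the middleware list in the log imply a project settings.py roughly like the sketch below. The values printed by Scrapy are taken verbatim from the log; the PROXY_LIST path, PROXY_MODE value, and the exact middleware priorities are assumptions, since scrapy_proxies needs them to produce the "Using proxy" lines above but Scrapy does not echo them in this output.

```python
# conifers/settings.py -- sketch reconstructed from the log output above.
# Settings not shown in the "Overridden settings" dump are assumptions.

BOT_NAME = 'conifers'
SPIDER_MODULES = ['conifers.spiders']
NEWSPIDER_MODULE = 'conifers.spiders'

# Matches the log: robots.txt was fetched before the first real request.
ROBOTSTXT_OBEY = True

# Retry transient errors and typical "proxy blocked" status codes.
RETRY_TIMES = 10
RETRY_HTTP_CODES = [500, 503, 504, 400, 403, 404, 408]

# Assumed ordering (per the scrapy_proxies README): RandomProxy must sit
# between RetryMiddleware and HttpProxyMiddleware, which is consistent
# with the "Enabled downloader middlewares" list in the log.
DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': 90,
    'scrapy_proxies.RandomProxy': 100,
    'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware': 110,
}

# Hypothetical: a text file with one proxy URL per line,
# e.g. http://142.4.214.9:88 (the proxy seen in the log).
PROXY_LIST = 'proxies.txt'
PROXY_MODE = 0  # assumed: 0 = pick a random proxy for each request
```

The "Proxy user pass not found" DEBUG lines are expected with a proxy list like this one: they just mean the proxy URL carries no `user:pass@` credentials.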