Document that MailSender.send() returns a Deferred #3478
Comments
I ran into the same error, but I haven't figured it out. |
Same problem here. |
By the way, I'm using Python 3.7 and Scrapy 1.5.1. |
I met the same problem when trying to send an email from a pipeline. It throws the error into the logs, but my email is sent successfully. The console outputs:
2019-02-21 21:32:58 [scrapy.mail] INFO: Mail sent OK: To=['xxxxxxxx@outlook.com'] Cc=[] Subject="test" Attachs=0
Unhandled Error
Traceback (most recent call last):
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/python/log.py", line 103, in callWithLogger
return callWithContext({"system": lp}, func, *args, **kw)
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/python/log.py", line 86, in callWithContext
return context.call({ILogContext: newCtx}, func, *args, **kw)
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/python/context.py", line 122, in callWithContext
return self.currentContext().callWithContext(ctx, func, *args, **kw)
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/python/context.py", line 85, in callWithContext
return func(*args,**kw)
--- <exception caught here> ---
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/internet/posixbase.py", line 614, in _doReadOrWrite
why = selectable.doRead()
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/internet/tcp.py", line 243, in doRead
return self._dataReceived(data)
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/internet/tcp.py", line 249, in _dataReceived
rval = self.protocol.dataReceived(data)
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/protocols/tls.py", line 330, in dataReceived
self._flushReceiveBIO()
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/protocols/tls.py", line 300, in _flushReceiveBIO
self._flushSendBIO()
File "/home/xu/.local/share/virtualenvs/converse-OK57Cjbh/lib/python3.6/site-packages/twisted/protocols/tls.py", line 252, in _flushSendBIO
bytes = self._tlsConnection.bio_read(2 ** 15)
builtins.AttributeError: 'NoneType' object has no attribute 'bio_read'
Python and scrapy version:
(converse) xu@xu-ThundeRobot ~/Projects/temp/spider/converse python -V
Python 3.6.7
(converse) xu@xu-ThundeRobot ~/Projects/temp/spider/converse pipenv graph
Pillow==5.4.1
Scrapy==1.6.0
  - cssselect [required: >=0.9, installed: 1.0.3]
  - lxml [required: Any, installed: 4.3.1]
  - parsel [required: >=1.5, installed: 1.5.1]
    - cssselect [required: >=0.9, installed: 1.0.3]
    - lxml [required: >=2.3, installed: 4.3.1]
    - six [required: >=1.5.2, installed: 1.12.0]
    - w3lib [required: >=1.19.0, installed: 1.20.0]
      - six [required: >=1.4.1, installed: 1.12.0]
  - PyDispatcher [required: >=2.0.5, installed: 2.0.5]
  - pyOpenSSL [required: Any, installed: 19.0.0]
    - cryptography [required: >=2.3, installed: 2.5]
      - asn1crypto [required: >=0.21.0, installed: 0.24.0]
      - cffi [required: >=1.8,!=1.11.3, installed: 1.12.1]
        - pycparser [required: Any, installed: 2.19]
      - six [required: >=1.4.1, installed: 1.12.0]
    - six [required: >=1.5.2, installed: 1.12.0]
  - queuelib [required: Any, installed: 1.5.0]
  - service-identity [required: Any, installed: 18.1.0]
    - attrs [required: >=16.0.0, installed: 18.2.0]
    - cryptography [required: Any, installed: 2.5]
      - asn1crypto [required: >=0.21.0, installed: 0.24.0]
      - cffi [required: >=1.8,!=1.11.3, installed: 1.12.1]
        - pycparser [required: Any, installed: 2.19]
      - six [required: >=1.4.1, installed: 1.12.0]
    - pyasn1 [required: Any, installed: 0.4.5]
    - pyasn1-modules [required: Any, installed: 0.2.4]
      - pyasn1 [required: >=0.4.1,<0.5.0, installed: 0.4.5]
  - six [required: >=1.5.2, installed: 1.12.0]
  - Twisted [required: >=13.1.0, installed: 18.9.0]
    - attrs [required: >=17.4.0, installed: 18.2.0]
    - Automat [required: >=0.3.0, installed: 0.7.0]
      - attrs [required: >=16.1.0, installed: 18.2.0]
      - six [required: Any, installed: 1.12.0]
    - constantly [required: >=15.1, installed: 15.1.0]
    - hyperlink [required: >=17.1.1, installed: 18.0.0]
      - idna [required: >=2.5, installed: 2.8]
    - incremental [required: >=16.10.1, installed: 17.5.0]
    - PyHamcrest [required: >=1.9.0, installed: 1.9.0]
      - setuptools [required: Any, installed: 40.8.0]
      - six [required: Any, installed: 1.12.0]
    - zope.interface [required: >=4.4.2, installed: 4.6.0]
      - setuptools [required: Any, installed: 40.8.0]
  - w3lib [required: >=1.17.0, installed: 1.20.0]
    - six [required: >=1.4.1, installed: 1.12.0]

And my code:

class SendEmailPipeLine(object):
    def __init__(self, settings):
        self.mailer = MailSender.from_settings(settings)
        self.pools = []

    @classmethod
    def from_crawler(cls, crawler):
        settings = crawler.settings
        return cls(settings)

    def process_item(self, item, spider):
        self.pools.append(item)
        return item

    def close_spider(self, spider):
        self.mailer.send('xxxxxxxxxx@outlook.com', 'test', 'asdfghjkjbvcxzqwertyuiop')

Email settings:

MAIL_FROM = 'xxxxxxxxxx@outlook.com'
MAIL_HOST = 'smtp.office365.com'
MAIL_PORT = 587
MAIL_USER = 'xxxxxxxxxxxxx@outlook.com'
MAIL_PASS = 'xxxxxxxxxxxxxx'
MAIL_TLS = True |
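The pipeline above drops the Deferred that MailSender.send() returns, so the crawler (and its TLS connection) can be torn down while the SMTP exchange is still in flight. As a minimal sketch, assuming the same SendEmailPipeLine class and placeholder addresses, close_spider can hand that Deferred back to Scrapy so shutdown waits for it:

    def close_spider(self, spider):
        # Returning the Deferred from MailSender.send() makes Scrapy wait for
        # the SMTP exchange to finish before the reactor is shut down.
        return self.mailer.send(
            to=['xxxxxxxxxx@outlook.com'],
            subject='test',
            body='asdfghjkjbvcxzqwertyuiop',
        )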
I have the same issue when I use Scrapy to send email; the email is still sent successfully.
|
I have the same issue when I use Scrapy to send email; the email is still sent successfully. |
Same problem here. When I try to send email via the email module I get the same error. I'm using Python 3.6.2 and Scrapy 1.6.0. |
test_spider.py
Hello, MailSender.send() returns a Twisted Deferred (see line 106 in the scrapy.mail module) with the callbacks _sent_ok and _sent_failed for success and failure respectively (line 102 in scrapy.mail). Using MailSender.send() in spider_closed produces logs where the spider is closed first and the mail is sent afterwards, which looks like expected behaviour:
2019-06-02 19:54:08 [scrapy.core.engine] INFO: Spider closed (finished)
However, you then get the error shown in the traceback.
My explanation of the error: it is the result of TLSMemoryBIOProtocol.connectionLost(), triggered at the end of the crawler's work, where the attribute _tlsConnection is assigned None (see line 407 of twisted.protocols.tls).
As a workaround, and based on my very limited knowledge of Twisted's Deferred class and Scrapy, I can propose making the handler return the Deferred that send() gives back:
def spider_closed(self):
In this case the reactor/main-loop shutdown will wait, as you can see from the logs:
2019-06-02 20:00:20 [scrapy.core.engine] INFO: Closing spider (finished)
My question to the Scrapy owners, @Gallaecio: can we consider this workaround a fix and change the documentation for MailSender.send()? |
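A sketch of that workaround, assuming the handler is connected to the spider_closed signal in from_crawler and that the addresses are placeholders; returning the Deferred from send() makes the shutdown sequence wait until _sent_ok or _sent_failed has fired:

import scrapy
from scrapy import signals
from scrapy.mail import MailSender

class ExampleSpider(scrapy.Spider):
    name = 'example'
    start_urls = ['https://example.com']

    @classmethod
    def from_crawler(cls, crawler, *args, **kwargs):
        spider = super(ExampleSpider, cls).from_crawler(crawler, *args, **kwargs)
        spider.mailer = MailSender.from_settings(crawler.settings)
        crawler.signals.connect(spider.spider_closed, signal=signals.spider_closed)
        return spider

    def parse(self, response):
        pass

    def spider_closed(self, spider):
        # Returning the Deferred keeps the reactor alive until the mail has
        # actually been handed off to the SMTP server.
        return self.mailer.send(
            to=['someone@example.com'],
            subject='crawl finished',
            body='the spider has closed',
        )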
I have the same issue when I use Scrapy to send email; the email is still sent successfully. |
self._send_mail(body,subject).addCallback(lambda x: x) |
I can verify that this is still an issue. The email goes through, but a fatal error gets thrown with the following traceback:
|
I have an email pipeline that sends email during process_item and I get the same error.
Changing the function to async and using await seems to solve it for me.
Not sure if this is the right way to solve it, but it seems to work for me. |
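A sketch of that approach, assuming a Scrapy version recent enough to accept process_item defined as a coroutine, and placeholder addresses. With the default Twisted reactor the Deferred can be awaited directly (for the asyncio reactor, see the note further down the thread):

from scrapy.mail import MailSender

class SendEmailPipeline:
    def __init__(self, settings):
        self.mailer = MailSender.from_settings(settings)

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler.settings)

    async def process_item(self, item, spider):
        # Awaiting the Deferred means the item is not considered processed
        # until the SMTP dialogue has finished, so nothing is torn down early.
        await self.mailer.send(
            to=['someone@example.com'],
            subject='item scraped',
            body=str(item),
        )
        return item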
Same issue here. @iveney's solution works. |
Same issue here. @iveney's solution really works for me. |
It's not a workaround but the correct usage of this function, or of other functions that return a Deferred instead of waiting until the action is done. It indeed makes sense to mention in the docs that you are supposed to wait for the deferred instead of just calling this function and assuming it's synchronous. |
Wait, @iveney, how are you using |
You definitely can, it even inherits from |
You're right. I was unable to get this working initially because I wasn't using the right setup in my test script. I thought I needed to use a different approach, but I finally came up with this, which runs as expected:
import scrapy.mail
import scrapy.settings
import scrapy.utils.defer
import twisted.internet.reactor


async def send_mail():
    mail_settings = {
        'MAIL_FROM': 'from@example.com',
        'MAIL_HOST': 'smtp.example.com',
        'MAIL_PORT': 587,
        'MAIL_USER': 'user@example.com',
        'MAIL_PASS': 'yourpassword',
        'MAIL_TLS': True,
        'MAIL_SSL': True,
    }
    settings = scrapy.settings.Settings(values=mail_settings)
    mailer = scrapy.mail.MailSender.from_settings(settings)
    deferred = mailer.send(
        to=['recipient@example.com'],
        subject='Test email',
        body='This is a test email',
        cc=['cc@example.com'],
    )
    assert deferred is not None
    await deferred


if __name__ == '__main__':
    main_deferred = scrapy.utils.defer.ensureDeferred(send_mail())
    twisted.internet.reactor.callWhenRunning(
        lambda: main_deferred.addBoth(lambda _: twisted.internet.reactor.stop())
    )
    twisted.internet.reactor.run() |
(if you use asyncio read https://docs.scrapy.org/en/latest/topics/asyncio.html#awaiting-on-deferreds) |
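For that asyncio case, a sketch assuming a Scrapy version that ships scrapy.utils.defer.maybe_deferred_to_future and placeholder addresses; it mirrors the pipeline sketch above, and only the await changes:

from scrapy.mail import MailSender
from scrapy.utils.defer import maybe_deferred_to_future

class SendEmailAsyncioPipeline:
    @classmethod
    def from_crawler(cls, crawler):
        pipeline = cls()
        pipeline.mailer = MailSender.from_settings(crawler.settings)
        return pipeline

    async def process_item(self, item, spider):
        # With TWISTED_REACTOR set to the asyncio reactor, a Deferred has to be
        # wrapped into an asyncio Future before it can be awaited.
        await maybe_deferred_to_future(
            self.mailer.send(to=['someone@example.com'], subject='item scraped', body=str(item))
        )
        return item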
Hi, I'm new to Scrapy and I want to send some emails after the spider closes, but I got some errors. Does anyone know why? I'm using Python 2.7 and Scrapy 1.5.1.
Here is my code:
I want to send two e-mails after the spider closes, but I get the errors below:
(By the way, there is no problem if I just send one e-mail)
File "C:\Software\Python27\lib\site-packages\twisted\internet\selectreactor.py", line 149, in _doReadOrWrite why = getattr(selectable, method)() File "C:\Software\Python27\lib\site-packages\twisted\internet\tcp.py", line 243, in doRead return self._dataReceived(data) File "C:\Software\Python27\lib\site-packages\twisted\internet\tcp.py", line 249, in _dataReceived rval = self.protocol.dataReceived(data) File "C:\Software\Python27\lib\site-packages\twisted\protocols\tls.py", line 330, in dataReceived self._flushReceiveBIO() File "C:\Software\Python27\lib\site-packages\twisted\protocols\tls.py", line 300, in _flushReceiveBIO self._flushSendBIO() File "C:\Software\Python27\lib\site-packages\twisted\protocols\tls.py", line 252, in _flushSendBIO bytes = self._tlsConnection.bio_read(2 ** 15) exceptions.AttributeError: 'NoneType' object has no attribute 'bio_read'
It seems that twisted doesn't close the IO, but I can't find any close method in the MailSender class, so has anyone else met this error?
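For the two-email case reported here, a sketch under the same assumption as the workaround earlier in the thread (the handler is connected to the spider_closed signal, addresses and subjects are placeholders): combining both Deferreds from send() with twisted.internet.defer.DeferredList and returning the combined Deferred keeps the connections alive until both messages have been handed off.

from twisted.internet.defer import DeferredList

    def spider_closed(self, spider):
        # Returning the DeferredList makes shutdown wait until both messages
        # have been delivered to the SMTP server.
        d1 = self.mailer.send(to=['first@example.com'], subject='report 1', body='first report')
        d2 = self.mailer.send(to=['second@example.com'], subject='report 2', body='second report')
        return DeferredList([d1, d2])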