
How to Send Emails From Databricks

I have taken the code from Send email from Databricks Notebook with attachment and attempted to send an email from my Databricks Community Edition workspace.

I have used the following code:

import smtplib
from pathlib import Path
from email.mime.multipart import MIMEMultipart
from email.mime.base import MIMEBase
from email.mime.text import MIMEText
from email.utils import COMMASPACE, formatdate
from email import encoders

from_email = "myname@myemail.co.uk"
to_email = "myname@myemail.co.uk"


def send_mail(send_from = from_email, send_to = to_email, subject = "Test", message = "Test", files=["/FileStore/tables2/"],
              server="smtp.hosts.co.uk", port=587, username='myusername.co.uk', password='!L3qGWGuyw',
              use_tls=True):

    msg = MIMEMultipart()
    msg['From'] = send_from
    msg['To'] = COMMASPACE.join(send_to)
    msg['Date'] = formatdate(localtime=True)
    msg['Subject'] = subject

    msg.attach(MIMEText(message))

    for path in files:
        part = MIMEBase('application', "octet-stream")
        with open(path, 'rb') as file:
            part.set_payload(file.read())
        encoders.encode_base64(part)
        part.add_header('Content-Disposition',
                        'attachment; filename="{}"'.format(Path(path).name))
        msg.attach(part)

    smtp = smtplib.SMTP(server, port)
    if use_tls:
        smtp.starttls()
    smtp.login(username, password)
    smtp.sendmail(send_from, send_to, msg.as_string())
    smtp.quit()

As you can see the code is almost identical. However, when I run the code I get the following error:

FileNotFoundError: [Errno 2] No such file or directory: '/FileStore/tables2/'

Is this error also because I’m running on Databricks Community Edition, as was the case in the previous question?
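(For what it’s worth, the attachment loop can be reproduced locally without any SMTP server, which shows the failure happens at `open()` before anything email-related runs. The temporary file below is a stand-in for a real attachment, not part of the original code:)

```python
import os
import tempfile
from pathlib import Path
from email import encoders
from email.mime.base import MIMEBase
from email.mime.multipart import MIMEMultipart

# Create a throwaway file to stand in for an attachment.
with tempfile.NamedTemporaryFile(suffix=".txt", delete=False) as tmp:
    tmp.write(b"hello from databricks")
    attachment_path = tmp.name

# Same attachment-handling steps as in send_mail above.
msg = MIMEMultipart()
part = MIMEBase("application", "octet-stream")
with open(attachment_path, "rb") as f:   # this open() is where the question's error occurs
    part.set_payload(f.read())
encoders.encode_base64(part)
part.add_header("Content-Disposition",
                'attachment; filename="{}"'.format(Path(attachment_path).name))
msg.attach(part)

# Decoding the attached part recovers the original bytes.
print(msg.get_payload()[0].get_payload(decode=True))  # b'hello from databricks'
os.remove(attachment_path)
```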


Answer

/FileStore/tables2/ is just a placeholder for the name of a file that you want to send as an attachment. You need to put your actual file names there, or make the list empty if you don’t want to send attachments. It should be a local file path, so on Azure Databricks use /dbfs/...., and on Community Edition use dbutils.fs.cp to copy the file from DBFS to the local file system first.
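To make that concrete, here is a minimal sketch. The file names, the /tmp destination, and the dbfs_to_local helper are all illustrative assumptions, not part of the original answer:

```python
# Sketch: turning a DBFS path into something open() can read.
# All paths below are placeholders.

def dbfs_to_local(dbfs_path: str) -> str:
    """On a full (e.g. Azure) Databricks workspace, DBFS is mounted at
    /dbfs, so '/FileStore/tables2/report.csv' can be read locally as
    '/dbfs/FileStore/tables2/report.csv'."""
    return "/dbfs" + dbfs_path

print(dbfs_to_local("/FileStore/tables2/report.csv"))
# /dbfs/FileStore/tables2/report.csv

# On Community Edition the /dbfs mount is not available, so copy the
# file to the driver's local disk first, then attach the local copy:
# dbutils.fs.cp("dbfs:/FileStore/tables2/report.csv", "file:/tmp/report.csv")
# send_mail(files=["/tmp/report.csv"])
```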

User contributions licensed under: CC BY-SA