I am using
import requests
requests.post(url='https://foo.com', data={'bar':'baz'})
but I get a requests.exceptions.SSLError. The website has an expired certificate, but since I am not sending sensitive data, it doesn't matter to me. I would imagine there is an argument like verify=False that I could use, but I can't seem to find it.
This question is related to: python, https, python-requests
Use requests.packages.urllib3.disable_warnings() and verify=False on requests methods:
import requests
from urllib3.exceptions import InsecureRequestWarning
# Suppress only the single warning from urllib3 needed.
requests.packages.urllib3.disable_warnings(category=InsecureRequestWarning)
# Set `verify=False` on `requests.post`.
requests.post(url='https://example.com', data={'bar':'baz'}, verify=False)
To add to Blender's answer, you can disable SSL certificate validation for every request made through a session by setting Session.verify = False:
import requests
session = requests.Session()
session.verify = False
session.post(url='https://example.com', data={'bar':'baz'})
Note that urllib3 (which Requests uses internally) strongly discourages making unverified HTTPS requests and will raise an InsecureRequestWarning when you do.
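If you prefer not to silence the warning process-wide, a scoped alternative (a sketch using the standard warnings module) is to suppress InsecureRequestWarning only around the unverified call:

```python
import warnings

from urllib3.exceptions import InsecureRequestWarning

# Suppress InsecureRequestWarning only inside this block, instead of
# disabling it process-wide with disable_warnings().
with warnings.catch_warnings():
    warnings.simplefilter('ignore', InsecureRequestWarning)
    # The unverified request would go here, e.g.:
    # requests.post('https://example.com', data={'bar': 'baz'}, verify=False)
    # Any InsecureRequestWarning raised inside the block is silenced:
    warnings.warn('simulated unverified request', InsecureRequestWarning)
```

When the `with` block exits, the previous warning filters are restored, so unverified requests made elsewhere still warn.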
Verification can also be disabled via an environment variable:
export CURL_CA_BUNDLE=""
If you just want to send a single POST request with verify=False and without building a session, the quickest way is:
import requests
requests.request('post', 'https://example.com', data={'bar':'baz'}, verify=False)
If you are writing a scraper and really don't care about SSL certificates, you can disable verification globally:
import ssl
ssl._create_default_https_context = ssl._create_unverified_context
DO NOT USE IN PRODUCTION
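What this monkey-patch does: stdlib clients such as urllib.request and http.client build their default HTTPS context via ssl._create_default_https_context, so pointing it at _create_unverified_context hands out contexts that skip certificate checks. (Requests/urllib3 build their own SSL contexts, so this patch generally does not affect requests itself.) A quick check of the resulting context:

```python
import ssl

# Replace the stdlib's default HTTPS context factory with the unverified one.
ssl._create_default_https_context = ssl._create_unverified_context

# Any stdlib code asking for the default HTTPS context now gets one that
# neither verifies certificates nor checks hostnames.
ctx = ssl._create_default_https_context()
print(ctx.verify_mode == ssl.CERT_NONE)  # True: certificates are not verified
print(ctx.check_hostname)                # False: hostnames are not checked
```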
Source: Stackoverflow.com