[linux] Unable to establish SSL connection upon wget on Ubuntu 14.04 LTS

... right now it happens only to the website I'm testing. I can't post it here because it's confidential.

Then I guess it is one of the sites that are incompatible with TLS 1.2. The OpenSSL used in 12.04 does not use TLS 1.2 on the client side, while the one in 14.04 does, which might explain the difference. To work around this, try explicitly using --secure-protocol=TLSv1. If this does not help, check whether you can access the site with openssl s_client -connect ... (probably not) and with openssl s_client -tls1 -no_tls1_1 -no_tls1_2 ....
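
For example, assuming the confidential site were reachable as secure.example.com (a placeholder hostname, not the real one), the workaround and the two checks would look roughly like this:

    # workaround: make wget announce TLS 1.0 as its best version
    wget --secure-protocol=TLSv1 https://secure.example.com/

    # check 1: default handshake, which announces TLS 1.2 (probably refused)
    openssl s_client -connect secure.example.com:443

    # check 2: handshake restricted to TLS 1.0 (probably accepted)
    openssl s_client -connect secure.example.com:443 -tls1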

Please note that there might be other causes, but this one is the most probable, and without access to the site everything is just speculation anyway.

The assumed problem in detail: clients usually use the most compatible handshake to access a server. This is the SSLv23 handshake, which is compatible with older SSL versions but announces the best TLS version the client supports, so that the server can pick the best version it knows. In this case wget would announce TLS 1.2. But there are some broken servers which never assumed that one day there would be something like TLS 1.2, and which refuse the handshake if the client announces support for this hot new version (from 2008!) instead of just responding with the best version the server supports. To access these broken servers the client has to lie and claim that it supports only TLS 1.0 as its best version.
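
If you want to see this behaviour directly, you can compare the two kinds of hello with openssl s_client; when the handshake succeeds, the negotiated version is shown in the Protocol line of the SSL-Session block (again, the hostname is only a placeholder):

    # compatible (SSLv23-style) hello announcing TLS 1.2 as the client's best:
    # a well-behaved old server simply picks the version it supports,
    # a version-intolerant server aborts with an alert or a closed connection
    echo | openssl s_client -connect secure.example.com:443

    # hello that claims TLS 1.0 is the best the client can do:
    # against the broken server this one typically succeeds, and the
    # SSL-Session output reports "Protocol : TLSv1"
    echo | openssl s_client -connect secure.example.com:443 -no_tls1_1 -no_tls1_2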

Is Ubuntu 14.04 or wget 1.15 not compatible with TLS 1.0 websites? Do I need to install/download any library/software to enable this connection?

The problem is the server, not the client. Most browsers work around these broken servers by retrying with a lower protocol version. Most other applications fail permanently if the first connection attempt fails, i.e. they do not downgrade by themselves, and you have to enforce another version through application-specific settings.
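
For wget that setting is the --secure-protocol option shown above; other clients have their own switches. As an illustration only (the option name is taken from curl's manual, the URL is a placeholder, and behaviour can vary between curl builds):

    # curl equivalent of wget --secure-protocol=TLSv1:
    # announce TLS 1.0 instead of TLS 1.2 for this one request
    curl --tlsv1.0 https://secure.example.com/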
