[python] Checking network connection

I want to see if I can access an online API, but for that I need to have Internet access.

How can I see if there's a connection available and active using Python?

This question is related to python networking

The answer is


I added a few things to Joel's code.

    import socket, time

    mem1 = 0
    while True:
        try:
            host = socket.gethostbyname("www.google.com")  # Change to personal choice of site
            s = socket.create_connection((host, 80), 2)
            s.close()
            mem2 = 1
            if mem2 == mem1:
                pass  # Add commands to be executed on every check
            else:
                mem1 = mem2
                print("Internet is working")  # Executed only on state change
        except Exception:
            mem2 = 0
            if mem2 == mem1:
                pass
            else:
                mem1 = mem2
                print("Internet is down")
        time.sleep(10)  # Time interval between checks

Here's my version

import requests

try:
    if requests.get('https://google.com').ok:
        print("You're Online")
except:
    print("You're Offline")

Taking Ivelin's answer and adding an extra check, because my router delivers its own IP address, 192.168.0.1, and returns a HEAD response for google.com even when it has no internet connection.

import socket

try:
    import httplib  # Python 2
except ImportError:
    import http.client as httplib  # Python 3

def haveInternet():
    try:
        # first check if we get the correct IP-Address or just the router's IP-Address
        info = socket.getaddrinfo("www.google.com", None)[0]
        ipAddr = info[4][0]
        if ipAddr == "192.168.0.1" :
            return False
    except:
        return False

    conn = httplib.HTTPConnection("www.google.com", timeout=5)
    try:
        conn.request("HEAD", "/")
        conn.close()
        return True
    except:
        conn.close()
        return False

Just to update what unutbu said, for new code in Python 3.2:

import urllib.request

def check_connectivity(reference):
    try:
        urllib.request.urlopen(reference, timeout=1)
        return True
    except urllib.request.URLError:
        return False

And, just to note, the input here (reference) is the URL that you want to check: I suggest choosing something that connects fast where you live. For example, I live in South Korea, so I would probably set reference to http://www.naver.com.
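For example, a quick usage sketch (the URL here is just an illustration; pick whichever site responds fastest for you):

if check_connectivity("http://www.naver.com"):
    print("Connected")
else:
    print("No connection")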


You can just try to download data, and if the connection fails you will know that something with the connection isn't fine.

Basically you can't check whether the computer is connected to the internet. There can be many reasons for failure: wrong DNS configuration, firewalls, NAT. So even if some tests pass, you have no guarantee that you will have a connection to your API until you try.
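As a minimal sketch of that advice, assuming the requests package and a placeholder API URL, just attempt the real call and handle the failure:

import requests

API_URL = "https://api.example.com/data"  # placeholder; substitute your real endpoint

try:
    response = requests.get(API_URL, timeout=5)
    response.raise_for_status()  # treat HTTP error codes as failures too
    data = response.json()       # proceed with the actual work
except requests.RequestException as exc:
    # covers DNS failures, refused connections, timeouts and HTTP errors
    print("Could not reach the API:", exc)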


This works for me in Python3.6

import urllib
from urllib.request import urlopen


def is_internet():
    """
    Query internet using python
    :return:
    """
    try:
        urlopen('https://www.google.com', timeout=1)
        return True
    except urllib.error.URLError as Error:
        print(Error)
        return False


if is_internet():
    print("Internet is active")
else:
    print("Internet disconnected")

As an alternative to unutbu's/Kevin C's answers, I use the requests package like this:

import requests

def connected_to_internet(url='http://www.google.com/', timeout=5):
    try:
        _ = requests.head(url, timeout=timeout)
        return True
    except requests.ConnectionError:
        print("No internet connection available.")
    return False

Bonus: this can be extended to the following function, which checks whether a particular website is up.

def web_site_online(url='http://www.google.com/', timeout=5):
    try:
        req = requests.head(url, timeout=timeout)
        # HTTP errors are not raised by default, this statement does that
        req.raise_for_status()
        return True
    except requests.HTTPError as e:
        print("Checking internet connection failed, status code {0}.".format(
        e.response.status_code))
    except requests.ConnectionError:
        print("No internet connection available.")
    return False

Taking unutbu's answer as a starting point, and having been burned in the past by a "static" IP address changing, I've made a simple class that checks once using a DNS lookup (i.e., using the URL "https://www.google.com"), and then stores the IP address of the responding server for use on subsequent checks. That way, the IP address is always up to date (assuming the class is re-initialized at least once every few years or so). I also give credit to gawry for this answer, which showed me how to get the server's IP address (after any redirection, etc.). Please disregard the apparent hackiness of this solution, I'm going for a minimal working example here. :)

Here is what I have:

import socket

try:
    from urllib2 import urlopen, URLError
    from urlparse import urlparse
except ImportError:  # Python 3
    from urllib.parse import urlparse
    from urllib.request import urlopen, URLError

class InternetChecker(object):
    conn_url = 'https://www.google.com/'

    def __init__(self):
        pass

    def test_internet(self):
        try:
            data = urlopen(self.conn_url, timeout=5)
        except URLError:
            return False

        try:
            host = data.fp._sock.fp._sock.getpeername()
        except AttributeError:  # Python 3
            host = data.fp.raw._sock.getpeername()

        # Ensure conn_url is an IPv4 address otherwise future queries will fail
        self.conn_url = 'http://' + (host[0] if len(host) == 2 else
                                     socket.gethostbyname(urlparse(data.geturl()).hostname))

        return True

# Usage example
checker = InternetChecker()
checker.test_internet()

My favorite one, whether running scripts on a cluster or not:

import subprocess

def online(timeout):
    try:
        return subprocess.run(
            ['wget', '-q', '--spider', 'google.com'],
            timeout=timeout
        ).returncode == 0
    except subprocess.TimeoutExpired:
        return False

This runs wget quietly, not downloading anything, just checking that the given remote file exists on the web.
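For example, a possible usage (this assumes wget is actually installed on the machine, which is the premise of the answer):

if online(timeout=10):
    print("online")
else:
    print("offline")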


Import requests and try this simple Python code.

import requests

def check_internet():
    url = 'http://www.google.com/'
    timeout = 5
    try:
        _ = requests.get(url, timeout=timeout)
        return True
    except requests.ConnectionError:
        return False

The best way to do this is to check against the IP address that the lookup returns when it can't find the website. In this case, this is my code:

import socket

print("website connection checker")
while True:
    website = input("please input website: ")
    print("")
    ip_address = socket.gethostbyname(website)
    print(ip_address)
    if ip_address == "92.242.140.2":
        print("Website could be experiencing an issue/Doesn't exist")
    else:
        print("Website is operational!")
        print("")

If we can connect to some Internet server, then we indeed have connectivity. However, for the fastest and most reliable approach, all solutions should comply with the following requirements, at the very least:

  • Avoid DNS resolution (we will need an IP that is well-known and guaranteed to be available for most of the time)
  • Avoid application layer connections (connecting to an HTTP/FTP/IMAP service)
  • Avoid calls to external utilities from Python or other language of choice (we need to come up with a language-agnostic solution that doesn't rely on third-party solutions)

To comply with these, one approach could be to check whether one of Google's public DNS servers is reachable. The IPv4 addresses for these servers are 8.8.8.8 and 8.8.4.4. We can try connecting to any of them.

A quick Nmap of the host 8.8.8.8 gave below result:

$ sudo nmap 8.8.8.8

Starting Nmap 6.40 ( http://nmap.org ) at 2015-10-14 10:17 IST
Nmap scan report for google-public-dns-a.google.com (8.8.8.8)
Host is up (0.0048s latency).
Not shown: 999 filtered ports
PORT   STATE SERVICE
53/tcp open  domain

Nmap done: 1 IP address (1 host up) scanned in 23.81 seconds

As we can see, 53/tcp is open and non-filtered. If you are a non-root user, remember to use sudo or the -Pn argument for Nmap to send crafted probe packets and determine if a host is up.

Before we try with Python, let's test connectivity using an external tool, Netcat:

$ nc 8.8.8.8 53 -zv
Connection to 8.8.8.8 53 port [tcp/domain] succeeded!

Netcat confirms that we can reach 8.8.8.8 over 53/tcp. Now we can set up a socket connection to 8.8.8.8:53/tcp in Python to check connection:

import socket

def internet(host="8.8.8.8", port=53, timeout=3):
    """
    Host: 8.8.8.8 (google-public-dns-a.google.com)
    OpenPort: 53/tcp
    Service: domain (DNS/TCP)
    """
    try:
        socket.setdefaulttimeout(timeout)
        socket.socket(socket.AF_INET, socket.SOCK_STREAM).connect((host, port))
        return True
    except socket.error as ex:
        print(ex)
        return False

internet()

Another approach could be to send a manually crafted DNS probe to one of these servers and wait for a response. But, I assume, it might prove slower in comparison due to packet drops, DNS resolution failure, etc. Please comment if you think otherwise.
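For reference, here is a rough sketch of that DNS-probe idea, assuming 8.8.8.8 answers UDP queries on port 53 (the query name example.com is arbitrary):

import socket
import struct

def dns_probe(server="8.8.8.8", port=53, timeout=3):
    """Send a minimal DNS A query over UDP and wait for any reply."""
    # Header: ID=0x1234, flags=0x0100 (recursion desired), QDCOUNT=1, rest zero
    header = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    # Question: QNAME as length-prefixed labels, then QTYPE=A (1), QCLASS=IN (1)
    qname = b"".join(
        bytes([len(label)]) + label.encode() for label in "example.com".split(".")
    ) + b"\x00"
    question = qname + struct.pack("!HH", 1, 1)
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(header + question, (server, port))
        sock.recvfrom(512)  # any response at all counts as connectivity
        return True
    except socket.error:
        return False
    finally:
        sock.close()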

UPDATE #1: Thanks to @theamk's comment, timeout is now an argument and initialized to 3s by default.

UPDATE #2: I did quick tests to identify the fastest and most generic implementation of all valid answers to this question. Here's the summary:

$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.487

iamaziz.py
True
00:00:00:00.335

ivelin.py
True
00:00:00:00.105

jaredb.py
True
00:00:00:00.533

kevinc.py
True
00:00:00:00.295

unutbu.py
True
00:00:00:00.546

7h3rAm.py
True
00:00:00:00.032

And once more:

$ ls *.py | sort -n | xargs -I % sh -c 'echo %; ./timeit.sh %; echo'
defos.py
True
00:00:00:00.450

iamaziz.py
True
00:00:00:00.358

ivelin.py
True
00:00:00:00.099

jaredb.py
True
00:00:00:00.585

kevinc.py
True
00:00:00:00.492

unutbu.py
True
00:00:00:00.485

7h3rAm.py
True
00:00:00:00.035

True in the above output signifies that all these implementations from respective authors correctly identify connectivity to the Internet. Time is shown with milliseconds resolution.

UPDATE #3: Tested again after the exception handling change:

defos.py
True
00:00:00:00.410

iamaziz.py
True
00:00:00:00.240

ivelin.py
True
00:00:00:00.109

jaredb.py
True
00:00:00:00.520

kevinc.py
True
00:00:00:00.317

unutbu.py
True
00:00:00:00.436

7h3rAm.py
True
00:00:00:00.030

Try the operation you were attempting to do anyway. If it fails, Python will throw an exception to let you know.

Trying some trivial operation first to detect a connection introduces a race condition. What if the internet connection is valid when you test, but goes down before you need to do the actual work?
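A minimal sketch of that approach, where the "actual work" is a hypothetical urllib call (the URL and retry count are placeholders):

import time
from urllib.request import urlopen

WORK_URL = "https://api.example.com/data"  # placeholder for the real operation

for attempt in range(3):
    try:
        payload = urlopen(WORK_URL, timeout=5).read()
        break  # success: go on and use payload
    except OSError as exc:  # URLError and socket timeouts are both OSError subclasses
        print("Attempt", attempt + 1, "failed:", exc)
        time.sleep(10)
else:
    print("Giving up: the operation itself never succeeded.")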


import urllib

def connected(host='http://google.com'):
    try:
        urllib.urlopen(host)
        return True
    except:
        return False

# test
print( 'connected' if connected() else 'no internet!' )

For Python 3, use urllib.request.urlopen(host).
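For example, a Python 3 version of the same idea might look like this (a sketch, with a timeout added so the call cannot hang indefinitely):

from urllib.request import urlopen

def connected(host='http://google.com'):
    try:
        urlopen(host, timeout=5)
        return True
    except Exception:
        return False

print('connected' if connected() else 'no internet!')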


This might not work if localhost has been changed from 127.0.0.1. Try:

import socket
ip_address = socket.gethostbyname(socket.gethostname())
if ip_address == "127.0.0.1":
    print("You are not connected to the internet!")
else:
    print("You are connected to the internet with the IP address of " + ip_address)

Unless edited, your computer's IP will be 127.0.0.1 when not connected to the internet. This code basically gets the IP address and then checks whether it is the localhost IP address. Hope that helps.


A modern portable solution with requests:

import requests

def internet():
    """Detect an internet connection."""

    connection = None
    try:
        r = requests.get("https://google.com")
        r.raise_for_status()
        print("Internet connection detected.")
        connection = True
    except:
        print("Internet connection not detected.")
        connection = False
    finally:
        return connection

Or, a version that raises an exception:

import requests
from requests.exceptions import ConnectionError

def internet():
    """Detect an internet connection."""

    try:
        r = requests.get("https://google.com")
        r.raise_for_status()
        print("Internet connection detected.")
    except ConnectionError as e:
        print("Internet connection not detected.")
        raise e

It will be faster to just make a HEAD request, so no HTML will be fetched. Also, I am sure Google would like it better this way. :)

try:
    import httplib
except:
    import http.client as httplib

def have_internet():
    conn = httplib.HTTPConnection("www.google.com", timeout=5)
    try:
        conn.request("HEAD", "/")
        conn.close()
        return True
    except:
        conn.close()
        return False

For my projects I use a script modified to ping the Google public DNS server 8.8.8.8, using a timeout of 1 second and core Python libraries with no external dependencies:

import struct
import socket
import select


def send_one_ping(to='8.8.8.8'):
    # Raw ICMP sockets generally require root/administrator privileges
    ping_socket = socket.socket(socket.AF_INET, socket.SOCK_RAW, socket.getprotobyname('icmp'))
    # Precomputed checksum valid for this exact header/data combination
    checksum = 49410
    header = struct.pack('!BBHHH', 8, 0, checksum, 0x123, 1)  # ICMP echo request
    data = b'BCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwx'
    packet = header + data
    ping_socket.sendto(packet, (to, 1))
    inputready, _, _ = select.select([ping_socket], [], [], 1.0)
    if inputready == []:
        raise Exception('No internet')  # or return False
    _, address = ping_socket.recvfrom(2048)
    print(address)  # or return True


send_one_ping()

The select timeout value is 1 second here, but it can be any floating-point number of your choice if you want the check to fail more quickly than in this example.


Make sure your pip is up to date by running

pip install --upgrade pip

Install the requests package using

pip install requests

Then try this:

    import requests
    import webbrowser

    url = "http://www.youtube.com"
    timeout = 6
    try:
        request = requests.get(url, timeout=timeout)
        print("Connected to the Internet")
        print("browser is loading url")
        webbrowser.open(url)
    except (requests.ConnectionError, requests.Timeout) as exception:
        print("poor or no internet connection.")

Taking Six's answer, I think we could simplify it somewhat, which matters because newcomers get lost in highly technical details.

Here is what I will finally use to wait for my connection (3G, slow) to be established once a day for my PV monitoring.

Works under Python 3 with Raspbian 3.4.2.

from urllib.request import urlopen
from time import sleep

urltotest = 'http://www.lsdx.eu'    # my own web page
nboftrials = 0
answer = 'NO'
while answer == 'NO' and nboftrials < 10:
    try:
        urlopen(urltotest)
        answer = 'YES'
    except:
        answer = 'NO'
        nboftrials += 1
        sleep(30)

Maximum running time: 5 minutes. If that is reached, I will try again in one hour's time, but that is another bit of script!