Several of these answers suggest that at the top of a module you do
import logging
logger = logging.getLogger(__name__)
It is my understanding that this is considered very bad practice. The reason is that logging.config.fileConfig will disable all existing loggers by default. E.g.
#my_module
import logging

logger = logging.getLogger(__name__)

def foo():
    logger.info('Hi, foo')

class Bar(object):
    def bar(self):
        logger.info('Hi, bar')
And in your main module:
#main
import logging.config

# load my_module - this creates the module-level logger at import time
import my_module

# fileConfig disables existing loggers by default (see the fileConfig docs),
# so the logger created in my_module is now silenced
logging.config.fileConfig('logging.ini')

my_module.foo()
bar = my_module.Bar()
bar.bar()
Now the log specified in logging.ini will be empty, as the existing logger was disabled by the fileConfig call.
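For reference, a minimal logging.ini along these lines is enough to reproduce the setup above (the app.log filename and the format string are just placeholders, not part of the original example):

#logging.ini - hypothetical minimal config for illustration
[loggers]
keys=root

[handlers]
keys=fileHandler

[formatters]
keys=simpleFormatter

[logger_root]
level=INFO
handlers=fileHandler

[handler_fileHandler]
class=FileHandler
level=INFO
formatter=simpleFormatter
args=('app.log', 'a')

[formatter_simpleFormatter]
format=%(asctime)s %(name)s %(levelname)s %(message)s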
While it is certainly possible to get around this (disable_existing_loggers=False), realistically many clients of your library will not know about this behavior and will not receive your logs. (Hat tip: I learned about this behavior from Victor Lin's website.)
So make it easy for your clients: good practice is instead to always call logging.getLogger locally. E.g.
#my_module
import logging

# no module-level logger here - the logger is only created when a function
# actually logs, i.e. after fileConfig has already run in the main module
def foo():
    logging.getLogger(__name__).info('Hi, foo')

class Bar(object):
    def bar(self):
        logging.getLogger(__name__).info('Hi, bar')
Also, if you use fileConfig in your main, set disable_existing_loggers=False, just in case your library designers use module-level logger instances.
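As a sketch, a defensive main would look something like this (reusing the hypothetical logging.ini from above):

#main - defensive against module-level loggers in libraries
import logging.config

import my_module

# disable_existing_loggers=False keeps loggers that were created at import time
logging.config.fileConfig('logging.ini', disable_existing_loggers=False)

my_module.foo()  # 'Hi, foo' now reaches the handlers configured in logging.ini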