[python] Understanding timedelta

Given the python code below, please help me understand what is happening there.

import datetime
import time

start_time = time.time()
time.sleep(42)
end_time = time.time()

uptime = end_time - start_time

human_uptime = str(datetime.timedelta(seconds=int(uptime)))

So I get the difference between the start time and the end time, then I truncate the duration to whole seconds by casting to int, and then what? What is the further explanation?

I know what delta means (difference), but why do I have to pass seconds=uptime to timedelta, and why does the string casting work so nicely that I get HH:MM:SS?

This question is related to python timedelta

The answer is


why do I have to pass seconds = uptime to timedelta

Because timedelta objects can be constructed from seconds, milliseconds, days, etc., you need to specify which unit you are passing in (which is why you use the explicit keyword argument). Typecasting to int is superfluous, as timedelta also accepts floats.
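As a small sketch (the values here are made up for illustration), different keyword arguments can be combined and floats are accepted just like ints:

import datetime

# Different units can be mixed; timedelta normalizes them internally.
d = datetime.timedelta(days=1, hours=2, seconds=30.5)
print(d)                  # 1 day, 2:00:30.500000
print(d.total_seconds())  # 93630.5

# A float number of seconds works just as well as an int.
print(datetime.timedelta(seconds=4242.0))  # 1:10:42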

and why does the string casting work so nicely that I get HH:MM:SS?

It's not the typecasting that does the formatting; it's the object's internal __str__ method. In fact, you will get the same result if you write:

print(datetime.timedelta(seconds=int(uptime)))
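As a quick sketch with literal values standing in for uptime, str() renders the delta as H:MM:SS, and adds a days part when the duration exceeds 24 hours:

import datetime

print(str(datetime.timedelta(seconds=4242)))    # 1:10:42
print(str(datetime.timedelta(seconds=100000)))  # 1 day, 3:46:40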