You could always use something like mktime to create a known time (midnight last night) and use difftime to get a double-precision time difference between the two. For a platform-independent solution, unless you go digging into the details of your libraries, you're not going to do much better than that. According to the C standard, time_t is an implementation-defined type, meaning each implementation of the library can define it however it likes, as long as the library functions that use it behave according to the spec.
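A minimal sketch of that approach (variable names are just illustrative, and it assumes you want local-time midnight):

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);

    /* Build a struct tm for midnight of the current day and convert
       it back into a time_t with mktime. */
    struct tm midnight = *localtime(&now);
    midnight.tm_hour = 0;
    midnight.tm_min = 0;
    midnight.tm_sec = 0;
    time_t start_of_day = mktime(&midnight);

    /* difftime returns a double, so %f is always the right specifier. */
    printf("Seconds since midnight: %f\n", difftime(now, start_of_day));
    return 0;
}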
That being said, the size of time_t on my Linux machine is 8 bytes, which suggests a long int or a double. So I did:
#include <stdio.h>
#include <time.h>
#include <unistd.h>   /* for sleep() */

int main(void)
{
    for (;;)
    {
        /* Print the same value twice: once as if time_t were a long,
           once as if it were a double, and see which one makes sense. */
        printf("%ld\n", time(NULL));
        printf("%f\n", time(NULL));
        sleep(1);
    }
    return 0;
}
The time given by the %ld increased by one each step, and the float printed 0.000 each time. If you're hell-bent on using printf to display time_t values, your best bet is to try your own such experiment and see how it works out on your platform and with your compiler.
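If you just want something that prints without guessing at the underlying type, casting to a wide known type sidesteps the question; a rough sketch (C99, since it uses %jd and intmax_t):

#include <inttypes.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);

    /* The casts convert whatever time_t actually is into types whose
       printf specifiers are fixed, so no guessing is required. */
    printf("%jd\n", (intmax_t)now);   /* fine for any integer time_t */
    printf("%f\n", (double)now);      /* fine even for a floating time_t */
    return 0;
}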