There are many explanations of the JMeter Summary Report out there; I have been using this tool for quite some time to generate performance testing reports with relevant data. The explanation at the link below comes straight from field experience:
JMeter: Understanding the Summary Report
This is one of the most useful reports generated by JMeter for understanding load test results.
# Label: The name of the HTTP sampler request sent to the server.
# Samples: The total number of samples sent to the server. Suppose you put this particular request under a Loop Controller set to run it 5 times, set 2 iterations (called Loop Count in the Thread Group), and run the load test with 100 users; the count displayed here will be 1 * 5 * 2 * 100 = 1000. Total = the total number of samples sent to the server during the entire run.
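As a quick sanity check, you can compute the expected sample count up front. A minimal sketch (the variable names here are illustrative, not JMeter APIs):

```java
public class ExpectedSamples {
    public static void main(String[] args) {
        int loopControllerCount = 5;  // Loop Controller wrapped around the request
        int threadGroupLoopCount = 2; // "Loop Count" in the Thread Group
        int users = 100;              // number of threads (users)

        // One request, repeated by both loops, for every user
        int expectedSamples = 1 * loopControllerCount * threadGroupLoopCount * users;
        System.out.println("Expected # Samples: " + expectedSamples); // 1000
    }
}
```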
# Average: The average response time for a particular HTTP request, in milliseconds. In the example above, it is averaged over 5 loops across 2 iterations for 100 users. Total = the average across all samples of all labels, i.e., add up the response times of all samples and divide by the total number of samples.
# Min: The minimum response time among the sample requests sent for this label. Total = the minimum time across all samples.
# Max: The maximum response time among the sample requests sent for this label. Total = the maximum time across all samples.
# Std. Dev.: The standard deviation of your data set tells you how densely the data points are clustered around the mean. The smaller the standard deviation, the more consistent the response times. As a rule of thumb, the standard deviation should be less than or equal to half of the average time for a label; if it is higher, something is wrong, and you need to find and fix the problem. See https://en.wikipedia.org/wiki/Standard_deviation. Total = the standard deviation computed across all samples.
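To make the Average, Min, Max, and Std. Dev. columns concrete, here is a small sketch that computes them from a list of response times and applies the half-of-average rule of thumb mentioned above. The sample data is made up for illustration, and a population standard deviation (dividing by n) is used here for simplicity:

```java
import java.util.List;

public class SummaryStats {
    public static void main(String[] args) {
        // Hypothetical response times in milliseconds; the outlier (900)
        // is included deliberately to trip the rule-of-thumb check below
        List<Long> responseTimesMs = List.of(120L, 150L, 130L, 900L, 140L);

        long min = responseTimesMs.stream().mapToLong(Long::longValue).min().orElse(0);
        long max = responseTimesMs.stream().mapToLong(Long::longValue).max().orElse(0);
        double avg = responseTimesMs.stream().mapToLong(Long::longValue).average().orElse(0);

        // Population variance: mean of squared deviations from the average
        double variance = responseTimesMs.stream()
                .mapToDouble(t -> (t - avg) * (t - avg))
                .average().orElse(0);
        double stdDev = Math.sqrt(variance);

        System.out.printf("Min=%d Max=%d Avg=%.1f StdDev=%.1f%n", min, max, avg, stdDev);

        // Rule of thumb from above: std dev should be <= half the average
        if (stdDev > avg / 2) {
            System.out.println("Warning: response times are inconsistent; investigate.");
        }
    }
}
```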
# Error %: The percentage of failed requests for a particular sample label. 0.00% shows that all requests completed successfully. Total = the percentage of failed samples out of all samples (Total Samples).
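The error percentage is simply the failed-sample count over the total, for example:

```java
public class ErrorPercent {
    public static void main(String[] args) {
        long totalSamples = 1000; // all samples for the label (or the whole run)
        long failedSamples = 12;  // samples that returned an error (assumed)

        double errorPercent = 100.0 * failedSamples / totalSamples;
        System.out.printf("Error %%: %.2f%%%n", errorPercent); // 1.20%
    }
}
```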
# Throughput: Hits/sec, i.e., the total number of requests per unit of time (seconds, minutes, or hours) sent to the server during the test. JMeter calculates it as:
endTime = lastSampleStartTime + lastSampleLoadTime
startTime = firstSampleStartTime
conversion = unit time conversion value (e.g., 0.001 for milliseconds to seconds)
Throughput = Numrequests / ((endTime - startTime) * conversion)
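Translating that formula into code, with timestamps in milliseconds and conversion = 0.001 so the result comes out in requests per second (the timestamp values are made up for illustration):

```java
public class Throughput {
    public static void main(String[] args) {
        long firstSampleStartTime = 1_700_000_000_000L; // ms epoch of first sample
        long lastSampleStartTime  = 1_700_000_050_000L; // ms epoch of last sample
        long lastSampleLoadTime   = 2_000;              // ms the last sample took
        long numRequests          = 1_000;

        long startTime = firstSampleStartTime;
        long endTime   = lastSampleStartTime + lastSampleLoadTime;
        double conversion = 0.001; // ms -> seconds, so the result is requests/sec

        double throughput = numRequests / ((endTime - startTime) * conversion);
        System.out.printf("Throughput: %.2f requests/sec%n", throughput); // ~19.23
    }
}
```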
# KB/sec: The throughput rate measured in kilobytes per second.
# Avg. Bytes: The average size, in bytes, of the responses downloaded from the server. Total = the average bytes across all samples.
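Putting the last two columns together: KB/sec is the byte throughput over the same test window used in the throughput formula, and Avg. Bytes is the mean response size. A sketch, assuming the total bytes received and the test duration are already known (both values below are made up):

```java
public class ByteStats {
    public static void main(String[] args) {
        long totalBytes = 5_242_880;   // bytes received across all samples (assumed)
        long numSamples = 1_000;
        double durationSeconds = 52.0; // (endTime - startTime) from the throughput formula

        double kbPerSec = (totalBytes / 1024.0) / durationSeconds;
        double avgBytes = (double) totalBytes / numSamples;

        System.out.printf("KB/sec: %.2f%n", kbPerSec);     // ~98.46
        System.out.printf("Avg. Bytes: %.1f%n", avgBytes); // 5242.9
    }
}
```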