[algorithm] Expand a random range from 1–5 to 1–7

import random

def rand5():
    return random.randint(1, 5)   # uniform random integer from 1 to 5

def rand7():
    rand = rand5() + rand5() - 1  # sum of two rolls, shifted: ranges over 1..9
    if rand > 7:                  # if the result is greater than 7, try again
        return rand7()
    return rand % 7 + 1

I guess this would be the easiest solution, but everywhere people suggest 5*rand5() + rand5() - 5 instead, like in http://www.geeksforgeeks.org/generate-integer-from-1-to-7-with-equal-probability/. Can someone explain what is wrong with rand5() + rand5() - 1?
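For reference, here is a quick sanity check one could run to compare how often each value comes up (a standalone sketch that redefines the functions above; the sample size of 220,000 is arbitrary):

```python
import random
from collections import Counter

def rand5():
    return random.randint(1, 5)   # uniform random integer from 1 to 5

def rand7():
    rand = rand5() + rand5() - 1  # sum of two rolls, shifted: ranges over 1..9
    if rand > 7:                  # if the result is greater than 7, try again
        return rand7()
    return rand % 7 + 1

# Tally many samples; if rand7 were uniform, every value 1..7
# would appear roughly 1/7 of the time.
counts = Counter(rand7() for _ in range(220_000))
for value in range(1, 8):
    print(value, counts[value])
```

Running this shows the tallies are visibly unequal across 1..7, which is what prompted the question.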