You mentioned that you are running your code interactively in spark-shell. If no explicit value is set for driver memory or executor memory, Spark falls back to defaults, which come from its properties file (conf/spark-defaults.conf) or, if nothing is set there, from its built-in default values.
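For reference, a minimal sketch of what those entries look like in conf/spark-defaults.conf (the 1g values shown are Spark's usual built-in defaults; adjust them for your cluster):

    spark.driver.memory    1g
    spark.executor.memory  1g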
I hope you are aware that there is one driver (on the master node) and there are worker nodes (where the executors are created and run), so a Spark program needs memory for both. To set the driver memory when you start spark-shell:

spark-shell --driver-memory "your value"

and to set the executor memory:

spark-shell --executor-memory "your value"

(Both flags can be passed in the same command.)
With those flags in place, you should be good to go with the memory values you want spark-shell to use.
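If you want to double-check which values were actually applied, here is a minimal sketch you can paste into the running spark-shell (sc is the SparkContext the shell already provides; the fallback strings are just for illustration):

    // Print the memory settings the running shell is actually using.
    // If a property was never set explicitly, the fallback string is returned instead.
    println(sc.getConf.get("spark.driver.memory", "not set (using default)"))
    println(sc.getConf.get("spark.executor.memory", "not set (using default)"))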