I ran into this scenario recently (well over 7 million rows) and ended up using sqlcmd via PowerShell, after parsing the raw data into SQL INSERT statements, in segments of 5,000 at a time. SQL Server couldn't handle 7 million lines in one lump job, or even 500,000 lines for that matter, unless the work was broken down into smaller 5K pieces; you can then run each 5K script one after the other. I needed to leverage the new SEQUENCE command in SQL Server 2012 Enterprise, and I couldn't find a programmatic way to insert seven million rows of data quickly and efficiently with said command. A sketch of the chunk-and-run approach is below.
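For illustration, here's a minimal PowerShell sketch of that approach. The server name, database, and file paths are placeholders, and it assumes your generated INSERT statements sit one per line in a single file:

```powershell
# Placeholders: point these at your own instance, database, and files.
$server    = "MYSERVER"             # assumption: your SQL Server instance
$database  = "MyDb"                 # assumption: target database
$batchSize = 5000                   # 5K statements per script worked for me

# Read the full set of generated INSERT statements (one per line).
# In my case each statement pulled its key from the sequence, e.g.:
#   INSERT INTO dbo.Target (Id, ...) VALUES (NEXT VALUE FOR dbo.MySeq, ...);
$inserts = Get-Content "C:\load\inserts.sql"

# Split into 5K-line chunks and run each chunk with sqlcmd, one after the other.
for ($i = 0; $i -lt $inserts.Count; $i += $batchSize) {
    $end       = [Math]::Min($i + $batchSize, $inserts.Count) - 1
    $chunkFile = "C:\load\chunk_$i.sql"
    $inserts[$i..$end] | Set-Content $chunkFile
    sqlcmd -S $server -d $database -E -i $chunkFile   # -E = trusted connection
}
```

Running the chunks sequentially like this also gives you a natural place to pause or resume if a batch fails partway through.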
Secondly, one of the things to look out for when inserting a million or more rows in one sitting is CPU and memory consumption (mostly memory) during the insert process. SQL Server will eat up memory/CPU with a job of this magnitude without releasing it. Needless to say, if you don't have enough processing power or memory on your server, you can crash it pretty easily in a short time (which I found out the hard way). If your memory consumption climbs over 70-75%, just reboot the server and the memory will be released back to normal.
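If you want to keep an eye on that while the load runs, a quick sketch like this (assuming Windows and PowerShell 3+, with a threshold matching the 70-75% mark above) will flag when memory is getting tight; you could call it between batches in the loop above:

```powershell
# Check overall memory pressure on the box. FreePhysicalMemory and
# TotalVisibleMemorySize are both reported in KB, so the ratio is unitless.
$os      = Get-CimInstance Win32_OperatingSystem
$usedPct = 100 * (1 - ($os.FreePhysicalMemory / $os.TotalVisibleMemorySize))
if ($usedPct -ge 70) {
    Write-Warning ("Memory at {0:N1}% - pause the load before continuing." -f $usedPct)
}
```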
I had to run a bunch of trial-and-error tests to see what the limits for my server were (given the limited CPU/memory resources I had to work with) before I could settle on a final execution plan. I would suggest you do the same in a test environment before rolling this out into production.
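As a rough illustration of the kind of testing I mean, something like this (server name, file names, and batch sizes are all hypothetical, pointed at a test box) lets you time each candidate batch size before committing to one:

```powershell
# Time a pre-built sample script at several batch sizes against a test database.
# Assumes you've generated sample_1000.sql, sample_5000.sql, etc. beforehand.
foreach ($size in 1000, 5000, 10000, 25000) {
    $elapsed = Measure-Command {
        sqlcmd -S "TESTSERVER" -d "MyDbTest" -E -i "C:\load\sample_$size.sql"
    }
    "{0,6} rows/batch : {1:N1} s" -f $size, $elapsed.TotalSeconds
}
```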