Here's how I solved the same issue in September 2020. There is a super-fast and easy way to do it from MongoDB Atlas (cloud and desktop). It probably wasn't that easy before, which is why I feel I should write this answer in 2020.
First of all, I read some suggestions above about changing the "unique" field on the Mongoose schema. If you ran into this error, I assume you already changed your schema, but despite that you still got a 500 as your response. And notice this: the error specifies a duplicate KEY! If the problem were caused by the schema configuration (and assuming you have configured decent middleware to log Mongo errors), the response would be a 400.
Why is that? In my case it was simple: that field on the schema used to accept only unique values, but I changed it to accept repeated values. MongoDB had created an index for that field in the past, so even after setting the "unique" property to "false" on the schema, MongoDB was still enforcing that index.
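To illustrate, here is a minimal sketch of the situation (the `email` field and `User` model are hypothetical names, not from the original question). The point is that editing the schema does not touch the index MongoDB already built:

```javascript
const mongoose = require('mongoose');

// Originally the schema declared the field as unique, which told
// Mongoose to ask MongoDB for a unique index on it:
//   email: { type: String, unique: true }

// Changing the schema later does NOT drop that existing index:
const userSchema = new mongoose.Schema({
  // the old unique index on "email" still exists in MongoDB
  email: { type: String, unique: false },
});

const User = mongoose.model('User', userSchema);

// Inserting a second document with the same email still fails with
// "E11000 duplicate key error" until the stale index is dropped.
```

This requires a live MongoDB connection to actually reproduce; the snippet only shows where the mismatch between schema and index comes from.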
The solution? Drop that index. You can do it in two seconds from MongoDB Atlas.
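If you prefer the shell to the Atlas UI, the same thing can be done with `dropIndex`. A sketch, assuming a `users` collection and an index on `email` (a single-field ascending index is conventionally named `email_1`, but check your actual index name first with `getIndexes()`):

```javascript
// In mongosh, list the indexes on the collection to find the stale one:
db.users.getIndexes()

// Drop the offending index by name:
db.users.dropIndex("email_1")
```

After the index is gone, inserts with repeated values on that field go through as the new schema expects.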
Go to your collection. By default you are on the "Find" tab. Select the next one to the right: "Indexes". You will see that there is still an index on the field that is causing you trouble. Just click the "Drop Index" button.
I believe this is a better option than dropping your entire collection. In fact, this is exactly why dropping the entire collection appears to fix the problem: Mongo will not create the index again if your first entry is inserted using your new schema with "unique: false".