When you want to change the encoding you always go from one into another. So you might go from Mac Roman to UTF-8, or from ASCII to UTF-8.
It's as important to know the desired output encoding as the current source encoding. For example, if you have Mac Roman data and you decode it from UTF-16 to UTF-8, you'll just end up with garbled text.
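The difference matters because the same bytes decode to different characters under different encodings. Here is a minimal sketch of that failure mode using only Node's built-in Buffer (Latin-1 stands in for any single-byte source encoding such as Mac Roman, which Buffer does not support natively):

var bytes = Buffer.from('Köln', 'latin1'); // the bytes 0x4B 0xF6 0x6C 0x6E
console.log(bytes.toString('latin1')); // "Köln" - decoded with the correct source encoding
console.log(bytes.toString('utf8'));   // "K�ln" - the same bytes misread as UTF-8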
If you want to know more about encodings, this article goes into a lot of detail:
The npm package encoding, which uses node-iconv or iconv-lite, should allow you to easily specify which source and output encoding you want:
var encoding = require('encoding');
var resultBuffer = encoding.convert(nameString, 'ASCII', 'UTF-8');
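As a slightly fuller sketch, assuming the package's convert(text, toCharset, fromCharset) argument order and that it returns a Buffer (check the package's README to confirm):

var encoding = require('encoding');

// "Köln" as Latin-1 bytes, standing in for any non-UTF-8 source
var latin1Buffer = Buffer.from([0x4b, 0xf6, 0x6c, 0x6e]);

// Convert to UTF-8: arguments are (text, toCharset, fromCharset)
var utf8Buffer = encoding.convert(latin1Buffer, 'UTF-8', 'ISO-8859-1');

console.log(utf8Buffer.toString('utf8')); // "Köln"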
I had the same problem when I loaded a text file via fs.readFile(). I tried to set the encoding to UTF-8, but it stayed the same. My solution now is this:
myString = JSON.parse( JSON.stringify( myString ) )
After this, an Ö is really interpreted as an Ö.
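A minimal sketch of that workflow (the file name is just a placeholder, and this assumes the file on disk really is UTF-8 encoded):

var fs = require('fs');

fs.readFile('input.txt', 'utf8', function (err, data) {
  if (err) throw err;

  // Round-trip through JSON, as described above
  var myString = JSON.parse(JSON.stringify(data));

  console.log(myString); // special characters such as Ö should now print correctly
});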
I'd recommend using the Buffer class:
var someEncodedString = Buffer.from('someString', 'utf-8');
This avoids any unnecessary dependencies that other answers require, since Buffer is included with node.js and is already defined in the global scope.
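If the source encoding is one that Buffer supports natively (for example latin1, utf16le, or ascii), a sketch of a full conversion without any third-party package might look like this:

// Decode the bytes with their actual source encoding, then re-encode as UTF-8
var latin1Bytes = Buffer.from([0x4b, 0xf6, 0x6c, 0x6e]); // "Köln" in Latin-1
var text = latin1Bytes.toString('latin1');               // bytes -> JavaScript string
var utf8Bytes = Buffer.from(text, 'utf-8');              // string -> UTF-8 bytes
console.log(utf8Bytes.toString('utf-8')); // "Köln"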
Source: Stackoverflow.com