I'm making a JavaScript app which retrieves .json files with jQuery and injects the data into the web page it is embedded in. The .json files are encoded with UTF-8 and contain accented chars like é, ö and å.
The problem is that I don't control the charset on the pages that are going to use the app. Some will use UTF-8, but others will use ISO-8859-1, which will of course garble the special chars from the .json files.
How do I convert special UTF-8 chars to their ISO-8859-1 equivalents using JavaScript?
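To illustrate the garbling: the UTF-8 bytes for é are 0xC3 0xA9, and read as ISO-8859-1 they render as Ã©. A quick sketch (assuming a browser with TextDecoder support):

var bytes = new Uint8Array([0xC3, 0xA9]);                 // UTF-8 encoding of 'é'
console.log(new TextDecoder('utf-8').decode(bytes));      // "é"
console.log(new TextDecoder('iso-8859-1').decode(bytes)); // "Ã©"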
Since the question on how to convert from ISO-8859-1 to UTF-8 is closed because of this one, I'm going to post my solution here.
The problem is that when you GET anything using XMLHttpRequest and the XMLHttpRequest.responseType is "text" or empty, the XMLHttpRequest.response is transformed into a DOMString, and that's where things break. Afterwards, it's almost impossible to work reliably with that string.
Now, if the content from the server is ISO-8859-1, you'll have to force the response to be of type "blob" and later convert the blob into a DOMString. For example:
var ajax = new XMLHttpRequest();
ajax.open('GET', url, true);
ajax.responseType = 'blob';
ajax.onreadystatechange = function () {
    if (ajax.readyState === 4 && ajax.status === 200) {
        // Convert the blob to a string
        var reader = new window.FileReader();
        reader.addEventListener('loadend', function () {
            // For ISO-8859-1 there's no further conversion required
            var text = reader.result;
            // ... use text here
        });
        reader.readAsBinaryString(ajax.response);
    }
};
The magic is happening in readAsBinaryString: it produces a string with one character per byte, and since ISO-8859-1 maps the byte values 0–255 one-to-one onto the Unicode code points U+0000–U+00FF, the result is already the correctly decoded text.
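As an aside, modern browsers can decode the bytes explicitly, which avoids the readAsBinaryString detour. A minimal sketch using fetch and TextDecoder (the URL is a placeholder):

fetch('data.json')  // placeholder URL
    .then(function (response) { return response.arrayBuffer(); })
    .then(function (buffer) {
        // Decode the raw bytes as ISO-8859-1 into a proper Unicode string
        var text = new TextDecoder('iso-8859-1').decode(buffer);
        console.log(text);
    });

With the FileReader approach above, reader.readAsText(ajax.response, 'ISO-8859-1') would also work, since readAsText accepts an encoding label as its second argument.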
You should add this line at the top of your page:
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
Internally, JavaScript strings are all Unicode (sequences of UTF-16 code units).
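For instance, code points up to U+00FF line up exactly with the ISO-8859-1 byte values:

console.log('é'.charCodeAt(0)); // 233 — the same value as the ISO-8859-1 byte for é
console.log('\u00E9' === 'é');  // true: the escape and the literal are the same code unit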
If you're retrieving the JSON files separately via AJAX, then you only need to make sure they are served with the correct Content-Type and charset (Content-Type: application/json; charset=utf-8). If you do that, jQuery should already have interpreted them properly by the time you access the deserialized objects.
Could you post an example of the code you’re using to retrieve the JSON objects?
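In the meantime, a typical retrieval would look something like this sketch (the URL, field name, and selector are placeholders):

$.getJSON('data.json', function (data) {
    // jQuery decodes the response using the charset from the HTTP header,
    // so any accented characters arrive as proper Unicode strings.
    $('#output').text(data.name);
});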
The problem is that once the page is served up, the content is going to be in the encoding described in the content-type meta tag. The content in "wrong" encoding is already garbled.
You're best to do this on the server before serving up the page. Or as I have been known to say: UTF-8 end-to-end or die.
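If the server happens to run Node, the conversion itself is short; a sketch (file names are placeholders):

const fs = require('fs');
// Read the raw ISO-8859-1 bytes, decode them, and write the file back out as UTF-8
const latin1Bytes = fs.readFileSync('page.html');
const text = latin1Bytes.toString('latin1'); // decode as ISO-8859-1
fs.writeFileSync('page.utf8.html', text, 'utf8');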
There are libraries that do charset conversion in JavaScript. But if you want something simple, this function does approximately what you want:

function stringToBytes(text) {
    const length = text.length;
    const result = new Uint8Array(length);
    for (let i = 0; i < length; i++) {
        const code = text.charCodeAt(i);
        // Code points 0–255 map directly to ISO-8859-1 bytes;
        // anything outside that range is replaced with a space (32).
        const byte = code > 255 ? 32 : code;
        result[i] = byte;
    }
    return result;
}
If you want to convert the resulting byte array into a Blob, you would do something like this:
const originalString = 'ååå';
const bytes = stringToBytes(originalString);
const blob = new Blob([bytes.buffer], { type: 'text/plain; charset=ISO-8859-1' });
Now, keep in mind that some apps do accept UTF-8 encoding, but they can't guess the encoding unless you prepend a BOM character.
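Prepending the BOM is straightforward, since the Blob constructor encodes string parts as UTF-8; a sketch building on originalString from above:

// The U+FEFF prefix becomes the bytes EF BB BF at the start of the blob.
const utf8Blob = new Blob(['\uFEFF' + originalString], { type: 'text/plain; charset=UTF-8' });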