I’m using juce::AttributedString to render Unicode text (UTF-8) containing fancy characters such as emojis. It works, but it stops short of rendering the full string.

Here’s some test code taken from a component paint() method:

```cpp
juce::AttributedString attrString;
attrString.setText( L"Abc🙂🙂🙂Def" ); // I know - don't embed actual Unicode chars in source code*.
attrString.setFont( juce::Font( 20.f ) );
attrString.setJustification( juce::Justification::centred );
attrString.setWordWrap( juce::AttributedString::WordWrap::none );
attrString.setColour( juce::Colours::white );
attrString.draw( g, getLocalBounds().toFloat() );
```

Note that it chopped the last 3 characters off on Windows, and the last 3 characters are drawn in the wrong font/color on Mac.

The first bug is in juce::AttributedString::setText (const String& newText):

```cpp
auto newLength = newText.length();
```

In this example, newLength is set to 9, which seems correct, right? There are 9 characters in total. It should actually be 18! Why? Because it needs the size in bytes, not characters. Each emoji is represented by 4 bytes in UTF-8, so the total for 6 normal characters (6 bytes) plus 3 emojis (3 * 4 = 12 bytes) is 18.
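As a sanity check on that arithmetic, here’s a minimal standalone sketch (my own snippet, not part of JUCE; checkLengths is just an illustrative name) contrasting the two counts, with the U+1F642 emoji written as escaped UTF-8 bytes to keep raw Unicode out of the source:

```cpp
#include <juce_core/juce_core.h>

// "Abc" + three U+1F642 emojis + "Def": 9 characters, but 18 UTF-8 bytes.
static void checkLengths()
{
    const auto text = juce::String::fromUTF8( "Abc"
        "\xF0\x9F\x99\x82" "\xF0\x9F\x99\x82" "\xF0\x9F\x99\x82" "Def" );

    jassert( text.length() == 9 );                    // counts characters (code points)
    jassert( (int) text.getNumBytesAsUTF8() == 18 );  // 6 + 3 * 4 = 18 bytes
}
```

Both assertions pass in a debug build, reproducing the 9-vs-18 mismatch that setText() trips over.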
Strictly speaking, I’m not 100% sure whether it wants bytes or something more ‘Unicodey’ like code units or code points; in UTF-8, bytes and code units are the same thing, though code points are not (this string is 9 code points but 18 bytes). Regardless, changing it to this gets it working on Mac only:

```cpp
const auto newLength = (int) newText.getNumBytesAsUTF8(); // Returns 18 in this example.
```

Unfortunately, an additional fix is required on Windows: