C++ / C#
Type Programmer, 2020-07-23 21:32:32

C++: how to display an ASCII emoticon without specifying the code?

A friend said that he chose C# because in C++
'☺' does not work correctly. Can someone explain why?
And how can you display this smiley without specifying its code directly?


2 answer(s)
Anton Zhilin, 2020-07-23
@MegaCraZy6

Under Linux, it seems, nothing needs to be done: the standard terminal already supports everything, and the OS interprets the text in char and std::string as UTF-8. If you want Unicode on Windows, then welcome to hell.
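For the Linux case, a minimal sketch (assuming the source file is saved as UTF-8 and the terminal uses a UTF-8 locale):

#include <iostream>

int main()
{
    // On a UTF-8 terminal the bytes of the literal pass straight through,
    // so the smiley shows up with no extra work.
    std::cout << "☺\n";
}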
1. Make sure the source code is saved in UTF-8. With Visual Studio this takes a bit of hoop-jumping.
2. Where are you outputting, to a file or to the console? The standard console will not do it properly; you need to install Windows Terminal and the latest PowerShell and wire them together (which again takes some hoop-jumping).
3. In the program, before handing a Unicode string to the OS (for example, before printing), make sure you have converted it from UTF-8 to the system encoding (which is usually UTF-16). The standard library has nothing for this, so you need a library. Here is what I use with Boost and fmt:

#include <iostream>
#include <string>
#include <utility>

#include <boost/filesystem.hpp>
#include <boost/locale.hpp>
#include <fmt/format.h>

namespace strings
{
    // The "native" character type: wchar_t on Windows, char elsewhere.
    using native_char = boost::filesystem::path::value_type;
    using native_string = std::basic_string<native_char>;

    namespace detail
    {
        inline void write(const std::string& string)
        {
            std::cout.write(string.c_str(), string.size());
        }

        inline void write(const std::wstring& string)
        {
            std::wcout.write(string.c_str(), string.size());
        }
    }

    using fmt::format;

    // Format a UTF-8 string with fmt, convert it to the native encoding
    // and write it to the standard output.
    template <typename S, typename... Args>
    void print(const S& format_str, Args&&... args)
    {
        const auto string = fmt::format(format_str, std::forward<Args>(args)...);
        detail::write(boost::locale::conv::utf_to_utf<native_char>(string));
    }

    using boost::locale::conv::utf_to_utf;
    using boost::locale::conv::from_utf;
    using boost::locale::conv::to_utf;
    using boost::locale::conv::between;

    // UTF-8 -> native string (UTF-16 on Windows, UTF-8 elsewhere).
    inline native_string to_native(const std::string& utf8)
    {
        return utf_to_utf<native_char>(utf8);
    }

    // Native string -> UTF-8.
    inline std::string from_native(const native_string& native)
    {
        return utf_to_utf<char>(native);
    }
}
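For illustration, a hypothetical use of this helper to print the smiley (it relies only on the strings namespace defined above and assumes the source is compiled as UTF-8, e.g. with MSVC's /utf-8 flag, with Boost.Locale and fmt linked in):

int main()
{
    // With a UTF-8 source/execution character set this literal is plain UTF-8.
    const std::string smiley = "☺";

    // print() converts UTF-8 to the native encoding before writing, so the
    // same call works on Linux and (given a capable terminal) on Windows.
    strings::print("Here it is: {}\n", smiley);
}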

anikavoi, 2020-07-24
@anikavoi

The question is really two questions:
1) How to get the code 0x01 into a char without writing the number.
2) How to make the system display it correctly.
The first one is trivial:
char c = ' '; c--; c--; ... and so on, 31 times (space is code 32, so 31 decrements leave 1).
The second one is also banal: set up the console so that it actually renders that character.
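A minimal sketch of that idea (assumptions: a Windows console, SetConsoleOutputCP(437) as one way to "set up the console", and a console font that actually has a glyph for code 0x01; whether the smiley really appears depends on the console and font):

#include <cstdio>
#ifdef _WIN32
#include <windows.h>
#endif

int main()
{
    // Start from the space character (code 32) and step down to 1
    // without ever writing 0x01 in the source.
    char c = ' ';
    for (int i = 0; i < 31; ++i)
        --c;                      // c now holds 0x01, the CP437 smiley slot

#ifdef _WIN32
    SetConsoleOutputCP(437);      // "set console": pick the legacy code page
#endif
    std::putchar(c);
    std::putchar('\n');
}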
