terminal/src/buffer/out/TextColor.h
James Holderness fa7c1abdf8
Fix SGR indexed colors to distinguish Indexed256 color (and more) (#5834)
This PR introduces a new `ColorType` to allow us to distinguish between
`SGR` indexed colors from the 16 color table, the lower half of which
can be brightened, and the ISO/ITU indexed colors from the 256 color
table, which have a fixed brightness. Retaining the distinction between
these two types will enable us to forward the correct `SGR` sequences to
conpty when addressing issue #2661. 

The other benefit of retaining the color index (which we didn't
previously do for ISO/ITU colors) is that it ensures that the colors are
updated correctly when the color scheme is changed.

## References

* This is another step towards fixing the conpty narrowing bugs in issue
  #2661.
* This is technically a fix for issue #5384, but that won't be apparent
  until #2661 is complete.

## PR Checklist
* [x] Closes #1223
* [x] CLA signed. 
* [x] Tests added/passed
* [ ] Requires documentation to be updated
* [x] I've discussed this with core contributors already.

## Detailed Description of the Pull Request / Additional comments

The first part of this PR was the introduction of a new `ColorType` in
the `TextColor` class. Instead of just the one `IsIndex` type, there is
now an `IsIndex16` and an `IsIndex256`. `IsIndex16` covers the eight
original ANSI colors set with `SGR 3x` and `SGR 4x`, as well as the
brighter aixterm variants set with `SGR 9x` and `SGR 10x`. `IsIndex256`
covers the 256 ISO/ITU indexed colors set with `SGR 38;5` and `SGR
48;5`.
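
As a rough illustration of the distinction (call sites vary across the codebase), the two-argument `TextColor` constructor shown in the header below takes a flag selecting which kind of index is being stored:

```cpp
// Illustrative only: the second constructor argument selects the index type.
TextColor fromSgr3x{ 3, false };     // e.g. SGR 33 -> a 16-color (ANSI) index, eligible for brightening
TextColor fromSgr38_5{ 123, true };  // e.g. SGR 38;5;123 -> a 256-color (ISO/ITU) index, fixed brightness
```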

There are two reasons for this distinction. The first is that the ANSI
colors have the potential to be brightened by the `SGR 1` bold
attribute, while the ISO/ITU colors do not. The second reason is that
when forwarding attributes through conpty, we want to try and
preserve the original SGR sequence that generated each color (to the
extent that that is possible). By having the two separate types, we can
map the `IsIndex16` colors back to ANSI/aixterm values, and `IsIndex256`
to the ISO/ITU sequences.
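
As a sketch of why this matters for conpty (this is not the actual renderer code, just a hypothetical routine built on the `TextColor` API in this header, assuming ANSI-ordered 16-color indexes):

```cpp
#include <string>

// Hypothetical sketch only: re-encode a foreground TextColor as SGR parameters.
std::wstring FgToSgrParams(const TextColor& color)
{
    if (color.IsIndex16())
    {
        const auto i = color.GetIndex();
        // 0-7 map back to SGR 30-37, 8-15 to the aixterm SGR 90-97 range.
        return std::to_wstring(i < 8 ? 30 + i : 90 + (i - 8));
    }
    if (color.IsIndex256())
    {
        return L"38;5;" + std::to_wstring(color.GetIndex()); // ISO/ITU indexed color
    }
    return L"39"; // default foreground; RGB handling omitted for brevity
}
```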

In addition to the VT colors, we also have to deal with the legacy
colors set by the Windows console APIs, but we don't really need a
separate type for those. It seemed most appropriate to me to store them
as `IsIndex256` colors, since it doesn't make sense to have them
brightened by the `SGR 1` attribute (which is what would happen if they
were stored as `IsIndex16`). If a console app wanted a bright color it
would have selected one, so we shouldn't be messing with that choice.
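
A minimal sketch of that choice (a hypothetical helper, not code from this PR, assuming the usual Windows `WORD`/`BYTE` typedefs and that the low nibble of the legacy attribute holds the foreground index):

```cpp
// Hypothetical helper: store the legacy foreground nibble as an IsIndex256 color,
// so SGR 1 can't brighten a color the console app already chose explicitly.
TextColor FromLegacyForeground(const WORD legacyAttrs)
{
    const BYTE index = legacyAttrs & 0x0F; // FOREGROUND_RED/GREEN/BLUE/INTENSITY bits
    return TextColor{ index, true };       // true -> IsIndex256
}
```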

The second part of the PR was the unification of the two color tables.
Originally we had a 16 color table for the legacy colors, and a separate
table for the 256 ISO/ITU colors. These have now been merged into one,
so color table lookups no longer need to decide which of the two tables
they should be referencing. I've also updated all the methods that took
a color table as a parameter to use a `basic_string_view` instead of
separate pointer and length variables, which I think makes them a lot
easier and safer to work with. 
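
For example, a lookup against the unified table now reads roughly like this (`gColorTable` and `ResolveForeground` are illustrative names, not identifiers from the PR):

```cpp
#include <array>
#include <string_view>

// Sketch: resolving a TextColor against the single unified 256-entry table.
std::array<COLORREF, 256> gColorTable{ /* populated from the active color scheme */ };

COLORREF ResolveForeground(const TextColor& fg, const COLORREF defaultFg, const bool bold)
{
    const std::basic_string_view<COLORREF> table{ gColorTable.data(), gColorTable.size() };
    return fg.GetColor(table, defaultFg, bold); // brighten flag derived from the bold attribute
}
```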

With this new architecture in place, I could now update the
`AdaptDispatch` SGR implementation to store the ISO/ITU indexed colors
as `IsIndex256` values, where before they were mapped to RGB values
(which prevented them reflecting any color scheme changes). I could also
update the `TerminalDispatch` implementation to differentiate between
the two index types, so that the `SGR 1` brightening would only be
applied to the ANSI colors.
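
Illustrative sketch only (the real logic lives in `AdaptDispatch`/`TerminalDispatch`) of how the two SGR forms end up as different index types via `SetIndex`:

```cpp
// Hypothetical SGR handling: SGR 30-37 produce IsIndex16 values (eligible for
// SGR 1 brightening), while SGR 38;5;n produces an IsIndex256 value with fixed brightness.
void ApplySgrForeground(TextColor& fg, const size_t param, const size_t extendedIndex)
{
    if (param >= 30 && param <= 37)
    {
        fg.SetIndex(static_cast<BYTE>(param - 30), false);
    }
    else if (param == 38)
    {
        fg.SetIndex(static_cast<BYTE>(extendedIndex), true);
    }
}
```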

I've also done a bit of code refactoring to try and minimise any direct
access to the color tables, getting rid of a lot of places that were
copying tables with `memmove` operations. I'm hoping this will make it
easier for us to update the code in the future if we want to reorder the
table entries (which is likely a requirement for unifying the
`AdaptDispatch` and `TerminalDispatch` implementations). 

## Validation Steps Performed

For testing, I've just updated the existing unit tests to account for
the API changes. The `TextColorTests` required an extra parameter
specifying the index type when setting an index. And the `AdapterTest`
and `ScreenBufferTests` required the use of the new `SetIndexedXXX`
methods in order to be explicit about the index type, instead of relying
on the `TextAttribute` constructor and the old `SetForeground` and
`SetBackground` methods which didn't have a way to differentiate index
types.
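
The shape of the updated checks looks roughly like this (TAEF-style macros from `WexTestClass.h`; the actual code in `TextColorTests` differs):

```cpp
// Sketch only: the index type is now stated explicitly when setting an index.
TextColor color;
color.SetIndex(7, false); // a 16-color table index
VERIFY_IS_TRUE(color.IsIndex16());

color.SetIndex(123, true); // a 256-color table index
VERIFY_IS_TRUE(color.IsIndex256());
```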

I've manually tested the various console APIs
(`SetConsoleTextAttribute`, `ReadConsoleOutputAttribute`, and
`ReadConsoleOutput`), to make sure they are still setting and reading
the attributes as well as they used to. And I've tested the
`SetConsoleScreenBufferInfoEx` and `GetConsoleScreenBufferInfoEx` APIs
to make sure they can read and write the color table correctly. I've
also tested the color table in the properties dialog, made sure it was
saved and restored from the registry correctly, and similarly saved and
restored from a shortcut link.

Note that there are still a bunch of issues with the color table APIs,
but no new problems have been introduced by the changes in this PR, as
far as I could tell.

I've also done a bunch of manual tests of `OSC 4` to make sure it's
updating all the colors correctly (at least in conhost), and confirmed
that the test case in issue #1223 now works as expected.
2020-05-27 22:34:45 +00:00

/*++
Copyright (c) Microsoft Corporation
Licensed under the MIT license.

Module Name:
- TextColor.h

Abstract:
- contains data for a single color of the text. Text Attributes are composed of
  two of these - one for the foreground and one for the background.

  The color can be in one of three states:
    * Default Colors - The terminal should use the terminal's notion of whatever
      the default color should be for this component.
      It's up to the terminal that's consuming this buffer to control the
      behavior of default attributes.
      Terminals typically have a pair of Default colors that are separate from
      their color table. This component should use that value.
      Consoles also can have a legacy table index as their default colors.
    * Indexed Color - The terminal should use our value as an index into the
      color table to retrieve the real value of the color.
      This is the type of color that "legacy" 16-color attributes have.
    * RGB color - We'll store a real color value in this attribute

Author(s):
- Mike Griese (migrie) Nov 2018

Revision History:
- From components of output.h/.c
  by Therese Stowell (ThereseS) 1990-1991
- Pulled into its own file from textBuffer.hpp/cpp (AustDi, 2017)
- Moved the colors into their own separate abstraction. (migrie Nov 2018)
--*/
#pragma once

#ifdef UNIT_TESTING
#include "WexTestClass.h"
#endif

#pragma pack(push, 1)

enum class ColorType : BYTE
{
    IsIndex256 = 0x0,
    IsIndex16 = 0x1,
    IsDefault = 0x2,
    IsRgb = 0x3
};
struct TextColor
{
public:
    constexpr TextColor() noexcept :
        _meta{ ColorType::IsDefault },
        _red{ 0 },
        _green{ 0 },
        _blue{ 0 }
    {
    }

    constexpr TextColor(const BYTE index, const bool isIndex256) noexcept :
        _meta{ isIndex256 ? ColorType::IsIndex256 : ColorType::IsIndex16 },
        _index{ index },
        _green{ 0 },
        _blue{ 0 }
    {
    }

    constexpr TextColor(const COLORREF rgb) noexcept :
        _meta{ ColorType::IsRgb },
        _red{ GetRValue(rgb) },
        _green{ GetGValue(rgb) },
        _blue{ GetBValue(rgb) }
    {
    }

    friend constexpr bool operator==(const TextColor& a, const TextColor& b) noexcept;
    friend constexpr bool operator!=(const TextColor& a, const TextColor& b) noexcept;

    bool IsLegacy() const noexcept;
    bool IsHighColor() const noexcept;
    bool IsIndex16() const noexcept;
    bool IsIndex256() const noexcept;
    bool IsDefault() const noexcept;
    bool IsRgb() const noexcept;

    void SetColor(const COLORREF rgbColor) noexcept;
    void SetIndex(const BYTE index, const bool isIndex256) noexcept;
    void SetDefault() noexcept;

    COLORREF GetColor(std::basic_string_view<COLORREF> colorTable,
                      const COLORREF defaultColor,
                      const bool brighten) const noexcept;

    constexpr BYTE GetIndex() const noexcept
    {
        return _index;
    }
private:
    ColorType _meta : 2;

    // _red and _index share storage: for indexed colors this byte holds the
    // table index, for RGB colors it holds the red component.
    union
    {
        BYTE _red, _index;
    };
    BYTE _green;
    BYTE _blue;

    COLORREF _GetRGB() const noexcept;

#ifdef UNIT_TESTING
    friend class TextBufferTests;
    template<typename TextColor>
    friend class WEX::TestExecution::VerifyOutputTraits;
#endif
};
#pragma pack(pop)

bool constexpr operator==(const TextColor& a, const TextColor& b) noexcept
{
    return a._meta == b._meta &&
           a._red == b._red &&
           a._green == b._green &&
           a._blue == b._blue;
}

bool constexpr operator!=(const TextColor& a, const TextColor& b) noexcept
{
    return !(a == b);
}
#ifdef UNIT_TESTING

namespace WEX
{
    namespace TestExecution
    {
        template<>
        class VerifyOutputTraits<TextColor>
        {
        public:
            static WEX::Common::NoThrowString ToString(const TextColor& color)
            {
                if (color.IsDefault())
                {
                    return L"{default}";
                }
                else if (color.IsRgb())
                {
                    return WEX::Common::NoThrowString().Format(L"{RGB:0x%06x}", color._GetRGB());
                }
                else
                {
                    return WEX::Common::NoThrowString().Format(L"{index:0x%04x}", color._red);
                }
            }
        };
    }
}
#endif
static_assert(sizeof(TextColor) <= 4 * sizeof(BYTE), "We should only need 4B for an entire TextColor. Any more than that is just waste");