terminal/src/inc/conattrs.hpp
James Holderness b604117421
Standardize the color table order (#11602)
## Summary of the Pull Request

In the original implementation, we used two different orderings for the color tables. The WT color table used ANSI order, while the conhost color table used a Windows-specific order. This PR standardizes on the ANSI color order everywhere, so the usage of indexed colors is consistent across both parts of the code base, which will hopefully allow more of the code to be shared one day.

## References

This is another small step towards de-duplicating `AdaptDispatch` and `TerminalDispatch` for issue #3849, and is essentially a followup to the SGR dispatch refactoring in PR #6728.

## PR Checklist
* [x] Closes #11461
* [x] CLA signed.
* [x] Tests added/passed
* [ ] Documentation updated.
* [ ] Schema updated.
* [x] I've discussed this with core contributors already. Issue number where discussion took place: #11461

## Detailed Description of the Pull Request / Additional comments

Conhost still needs to deal with legacy attributes using Windows color order, so those values now need to be transposed to ANSI color order when creating a `TextAttribute` object. This is done with a simple mapping table, which also handles the translation of the default color entries, so it's actually slightly faster than the original code.
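To illustrate the idea (a minimal sketch, not the actual helper in the codebase, and it omits the default-color entries mentioned above): Windows order and ANSI order differ only in where the red and blue bits sit, so a 16-entry lookup handles the transposition in one step.

```cpp
// Sketch only: transpose a 4-bit legacy Windows color index into ANSI order.
// Windows packs the nibble as blue=1, green=2, red=4 (+8 for intensity),
// while ANSI uses red=1, green=2, blue=4, so the red and blue bits swap.
#include <array>
#include <cstdint>

constexpr uint8_t TransposeLegacyIndex(const uint8_t legacyIndex) noexcept
{
    constexpr std::array<uint8_t, 16> map = {
        0, 4, 2, 6, 1, 5, 3, 7,      // black, blue, green, cyan, red, magenta, yellow, white
        8, 12, 10, 14, 9, 13, 11, 15 // the bright variants of the same colors
    };
    return map[legacyIndex & 0x0F];
}
```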

And when converting `TextAttribute` values back to legacy console attributes, we were already using a mapping table to handle the narrowing of 256-color values down to 16 colors, so we just needed to adjust that table to account for the translation from ANSI to Windows, and could then use the same table for both 256-color and 16-color values.
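The reverse direction can reuse the sketch above, because swapping two bits is its own inverse; the real table just extends this to 256 entries so 256-color values narrow to 16 colors in the same lookup. A hedged sketch of packing ANSI-order indices back into a legacy attribute WORD (helper name is illustrative):

```cpp
// Sketch only: pack ANSI-order fg/bg indices into a legacy attribute WORD.
// The red/blue swap is an involution, so TransposeLegacyIndex works both ways.
constexpr uint16_t ToLegacyAttribute(const uint8_t ansiFg, const uint8_t ansiBg) noexcept
{
    const uint16_t fg = TransposeLegacyIndex(ansiFg);      // FOREGROUND_* bits
    const uint16_t bg = TransposeLegacyIndex(ansiBg) << 4; // BACKGROUND_* bits
    return static_cast<uint16_t>(fg | bg);
}
```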

There are also a few places in conhost that read from or write to the color tables, and those now need to transpose the index values. I've addressed this by creating separate `SetLegacyColorTableEntry` and `GetLegacyColorTableEntry` methods in the `Settings` class which take care of the mapping, so it's now clearer which parts of the code are dealing with legacy values and which with ANSI values.
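The accessors might look something like the sketch below; only the method names come from this PR, the internals are assumptions (and the real `Settings` class holds far more than a color table). The point is that the transposition lives in exactly one place, and any caller using the "legacy" methods is visibly working with Windows-order indices.

```cpp
// Sketch only, continuing from the transpose helper above.
// COLORREF comes from the Windows headers; the table is stored in ANSI order.
class Settings
{
public:
    COLORREF GetLegacyColorTableEntry(const size_t index) const noexcept
    {
        return _colorTable[TransposeLegacyIndex(static_cast<uint8_t>(index))];
    }
    void SetLegacyColorTableEntry(const size_t index, const COLORREF color) noexcept
    {
        _colorTable[TransposeLegacyIndex(static_cast<uint8_t>(index))] = color;
    }

private:
    std::array<COLORREF, 16> _colorTable{};
};
```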

These methods are used in the `SetConsoleScreenBufferInfoEx` and `GetConsoleScreenBufferInfoEx` APIs, as well as a few places where color preferences are handled (the registry, shortcut links, and the properties dialog), none of which are particularly performance sensitive. However, we also use the legacy table when looking up the default colors for rendering (which happens a lot), so I've refactored that code so the default color calculations now only occur once per frame.

The plus side of all of this is that the VT code doesn't need to do the index translation anymore, so we can finally get rid of all the calls to `XTermToWindowsIndex`, and we no longer need a separate color table initialization method for conhost, so I was able to merge a number of color initialization methods into one. We also no longer need to translate from legacy values to ANSI when generating VT sequences for conpty.

The one exception to that is the 16-color VT renderer, which uses the `TextColor::GetLegacyIndex` method to approximate 16-color equivalents for RGB and 256-color values. Since that method returns a legacy index, it still needs to be translated to ANSI before it can be used in a VT sequence. But this should be no worse than it was before.
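As a sketch of that one remaining translation (the surrounding renderer plumbing is simplified away, and the function name here is invented): the legacy index is transposed to ANSI and then mapped onto the 30-37/90-97 SGR foreground parameters.

```cpp
// Sketch only: emit an SGR foreground sequence for a legacy 16-color index,
// such as the one returned by TextColor::GetLegacyIndex.
#include <string>

std::string SgrForegroundForLegacyIndex(const uint8_t legacyIndex)
{
    const auto ansi = TransposeLegacyIndex(legacyIndex);
    const auto base = ansi >= 8 ? 90 : 30; // 90-97 = bright, 30-37 = dark
    return "\x1b[" + std::to_string(base + (ansi & 7)) + "m";
}
```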

One more special case is conhost's secret _Color Selection_ feature. That uses `Ctrl`+Number and `Alt`+Number key sequences to highlight parts of the buffer, and the mapping from number to color is based on the Windows color order. So that mapping now needs to be transposed, but that's also not performance sensitive.

The only thing that I haven't bothered to update is the trace logging code in the `Telemetry` class, which logs the first 16 entries in the color table. Those entries are now going to be in a different order, but I didn't think that would be of great concern to anyone.

## Validation Steps Performed

A lot of unit tests needed to be updated to use ANSI color constants when setting indexed colors, where before they might have been expecting values in Windows order. But this replaced a wild mix of different constants (some requiring bit shifting), as well as values mapped with `XTermToWindowsIndex`, so I think the tests are a whole lot clearer now. Only a few cases have been left with literal numbers where that seemed more appropriate.

In addition to getting the unit tests working, I've also manually tested the behaviour of all the console APIs which I thought could be affected by these changes, and confirmed that they produced the same results in the new code as they did in the original implementation.

This includes:
- `WriteConsoleOutput`
- `ReadConsoleOutput`
- `SetConsoleTextAttribute` with `WriteConsoleOutputCharacter`
- `FillConsoleOutputAttribute` and `FillConsoleOutputCharacter` 
- `ScrollConsoleScreenBuffer`
- `GetConsoleScreenBufferInfo`
- `GetConsoleScreenBufferInfoEx`
- `SetConsoleScreenBufferInfoEx`

I've also manually tested changing colors via the console properties menu, the registry, and shortcut links, including setting default colors and popup colors. And I've tested that the "Quirks Mode" is still working as expected in PowerShell.

In terms of performance, I wrote a little test app that filled an 80x9999 buffer with random color combinations using `WriteConsoleOutput`, which I figured was likely to be the most performance-sensitive call, and I think it now actually performs slightly better than the original implementation.
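A hedged reconstruction of that kind of stress test (not the author's actual test app): it sizes the buffer to 80x9999, then writes random legacy attributes one row at a time via `WriteConsoleOutput`, which keeps each call well under the API's per-call buffer limits, and times the whole fill.

```cpp
// Sketch only: fill a tall screen buffer with random legacy attributes and time it.
// Assumes the console window is no wider than 80 columns, or resizing the buffer fails.
#include <windows.h>
#include <array>
#include <chrono>
#include <cstdio>
#include <random>

int main()
{
    const auto out = GetStdHandle(STD_OUTPUT_HANDLE);
    SetConsoleScreenBufferSize(out, { 80, 9999 });

    std::mt19937 rng{ std::random_device{}() };
    std::uniform_int_distribution<int> attr{ 0, 255 }; // random fg/bg nibbles

    std::array<CHAR_INFO, 80> row{};
    const auto start = std::chrono::steady_clock::now();
    for (SHORT y = 0; y < 9999; ++y)
    {
        for (auto& cell : row)
        {
            cell.Char.UnicodeChar = L'X';
            cell.Attributes = static_cast<WORD>(attr(rng));
        }
        SMALL_RECT region{ 0, y, 79, y };
        WriteConsoleOutputW(out, row.data(), { 80, 1 }, { 0, 0 }, &region);
    }
    const auto elapsed = std::chrono::steady_clock::now() - start;
    printf("%lld ms\n", static_cast<long long>(
               std::chrono::duration_cast<std::chrono::milliseconds>(elapsed).count()));
}
```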

I've also tested similar code - just filling the visible window - with SGR VT sequences of various types, and the performance seems about the same as it was before.
2021-11-04 22:13:22 +00:00

/*++
Copyright (c) Microsoft Corporation
Licensed under the MIT license.
--*/
#pragma once
#define FG_ATTRS (FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED | FOREGROUND_INTENSITY)
#define BG_ATTRS (BACKGROUND_BLUE | BACKGROUND_GREEN | BACKGROUND_RED | BACKGROUND_INTENSITY)
#define META_ATTRS (COMMON_LVB_LEADING_BYTE | COMMON_LVB_TRAILING_BYTE | COMMON_LVB_GRID_HORIZONTAL | COMMON_LVB_GRID_LVERTICAL | COMMON_LVB_GRID_RVERTICAL | COMMON_LVB_REVERSE_VIDEO | COMMON_LVB_UNDERSCORE)
enum class ExtendedAttributes : BYTE
{
    Normal = 0x00,
    Bold = 0x01,
    Italics = 0x02,
    Blinking = 0x04,
    Invisible = 0x08,
    CrossedOut = 0x10,
    Underlined = 0x20,
    DoublyUnderlined = 0x40,
    Faint = 0x80,
};
DEFINE_ENUM_FLAG_OPERATORS(ExtendedAttributes);
enum class CursorType : unsigned int
{
    Legacy = 0x0, // uses the cursor's height value to range from underscore-like to full box
    VerticalBar = 0x1, // A single vertical line, '|'
    Underscore = 0x2, // a single horizontal underscore, smaller than the min height legacy cursor.
    EmptyBox = 0x3, // Just the outline of a full box
    FullBox = 0x4, // a full box, similar to legacy with height=100%
    DoubleUnderscore = 0x5 // a double horizontal underscore
};
// Valid COLORREFs are of the pattern 0x00bbggrr. -1 works as an invalid color,
// as the highest byte of a valid color is always 0.
constexpr COLORREF INVALID_COLOR = 0xffffffff;
constexpr WORD COLOR_TABLE_SIZE = 16;
constexpr WORD XTERM_COLOR_TABLE_SIZE = 256;