terminal/src/buffer/out/TextColor.h

/*++
Copyright (c) Microsoft Corporation
Licensed under the MIT license.

Module Name:
- TextColor.h

Abstract:
- contains data for a single color of the text. Text Attributes are composed of
  two of these - one for the foreground and one for the background.
  The color can be in one of three states:
    * Default Colors - The terminal should use the terminal's notion of whatever
      the default color should be for this component.
      It's up to the terminal that's consuming this buffer to control the
      behavior of default attributes.
      Terminals typically have a pair of Default colors that are separate from
      their color table. This component should use that value.
      Consoles also can have a legacy table index as their default colors.
    * Indexed Color - The terminal should use our value as an index into the
      color table to retrieve the real value of the color.
      This is the type of color that "legacy" 16-color attributes have.
    * RGB color - We'll store a real color value in this attribute

Author(s):
- Mike Griese (migrie) Nov 2018

Revision History:
- From components of output.h/.c
  by Therese Stowell (ThereseS) 1990-1991
- Pulled into its own file from textBuffer.hpp/cpp (AustDi, 2017)
- Moved the colors into their own separate abstraction. (migrie Nov 2018)
--*/
#pragma once
#ifdef UNIT_TESTING
#include "WexTestClass.h"
#endif
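
// Background on the two indexed types (summarized from PR #5834, which
// introduced them): IsIndex16 covers the eight original ANSI colors (SGR
// 30-37/40-47) and their brighter aixterm variants (SGR 90-97/100-107), which
// may still be brightened by the SGR 1 bold attribute. IsIndex256 covers the
// ISO/ITU 256-color indexed values (SGR 38;5/48;5) as well as legacy console
// attributes, whose brightness is fixed. Keeping the two types distinct lets
// conpty reconstruct the original SGR sequence for a color and keeps indexed
// colors in sync when the color scheme changes.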
enum class ColorType : BYTE
{
    IsIndex256 = 0x0,
    IsIndex16 = 0x1,
    IsDefault = 0x2,
    IsRgb = 0x3
};
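
// Illustrative construction of each color state (the index and RGB values are
// arbitrary; the final COLORREF is resolved later via GetColor against the
// active color table and default color):
//
//   TextColor defaultColor;                 // IsDefault
//   TextColor ansiIndex{ 7, false };        // IsIndex16 - index into the 16-color range
//   TextColor extendedIndex{ 245, true };   // IsIndex256 - index into the 256-color table
//   TextColor trueColor{ RGB(32, 64, 96) }; // IsRgb - 24-bit value stored directly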
struct TextColor
{
public:
    constexpr TextColor() noexcept :
        _meta{ ColorType::IsDefault },
        _red{ 0 },
        _green{ 0 },
        _blue{ 0 }
    {
    }
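
    // isIndex256 selects between the two indexed flavors: true stores an
    // ISO/ITU 256-color (or legacy) index, false stores a 16-color
    // ANSI/aixterm index that SGR 1 is allowed to brighten.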
    constexpr TextColor(const BYTE index, const bool isIndex256) noexcept :
        _meta{ isIndex256 ? ColorType::IsIndex256 : ColorType::IsIndex16 },
        _index{ index },
        _green{ 0 },
        _blue{ 0 }
    {
    }
    constexpr TextColor(const COLORREF rgb) noexcept :
        _meta{ ColorType::IsRgb },
        _red{ GetRValue(rgb) },
        _green{ GetGValue(rgb) },
        _blue{ GetBValue(rgb) }
    {
    }

    friend constexpr bool operator==(const TextColor& a, const TextColor& b) noexcept;
    friend constexpr bool operator!=(const TextColor& a, const TextColor& b) noexcept;
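
    // Predicates for inspecting which state the color is currently in.
    // (Per PR #5834, SGR 1 brightening is only ever applied to the 16-color
    // ANSI/aixterm values, never to ISO/ITU 256-color or RGB values.)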
    bool CanBeBrightened() const noexcept;
    bool IsLegacy() const noexcept;
    bool IsIndex16() const noexcept;
    bool IsIndex256() const noexcept;
    bool IsDefault() const noexcept;
    bool IsRgb() const noexcept;

    void SetColor(const COLORREF rgbColor) noexcept;
    void SetIndex(const BYTE index, const bool isIndex256) noexcept;
    void SetDefault() noexcept;
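
    // Resolves this color to a concrete COLORREF: indexed values are looked up
    // in the supplied color table, default colors fall back to defaultColor,
    // and RGB values are returned as stored. When brighten is set, eligible
    // colors are mapped to their bright equivalents.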
    COLORREF GetColor(gsl::span<const COLORREF> colorTable,
                      const COLORREF defaultColor,
                      const bool brighten = false) const noexcept;
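
    // Approximates this color as a 4-bit legacy index (the representation used
    // by the classic console attribute APIs); defaultIndex is used for default
    // colors. The mapping for 8-bit and 24-bit colors is a best-effort nearest
    // match, as described in PR #6358.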
    BYTE GetLegacyIndex(const BYTE defaultIndex) const noexcept;

    constexpr BYTE GetIndex() const noexcept
    {
        return _index;
    }
    COLORREF GetRGB() const noexcept;

private:
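    // _red and _index share storage: an indexed color only needs the one byte,
    // while an RGB color uses _red/_green/_blue. Together with _meta this keeps
    // the whole structure within the 4 bytes checked by the static_assert below.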
    union
    {
        BYTE _red, _index;
    };
    BYTE _green;
    BYTE _blue;
    ColorType _meta;

#ifdef UNIT_TESTING
    friend class TextBufferTests;
    template<typename TextColor>
    friend class WEX::TestExecution::VerifyOutputTraits;
#endif
};
bool constexpr operator==(const TextColor& a, const TextColor& b) noexcept
{
    return a._meta == b._meta &&
           a._red == b._red &&
           a._green == b._green &&
           a._blue == b._blue;
}
bool constexpr operator!=(const TextColor& a, const TextColor& b) noexcept
{
    return !(a == b);
}
#ifdef UNIT_TESTING
namespace WEX
{
    namespace TestExecution
    {
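        // Teaches the WEX/TAEF verification framework how to format a
        // TextColor in test output, so failed VERIFY calls print a readable
        // value instead of raw bytes.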
        template<>
        class VerifyOutputTraits<TextColor>
        {
        public:
            static WEX::Common::NoThrowString ToString(const TextColor& color)
            {
                if (color.IsDefault())
                {
                    return L"{default}";
                }
                else if (color.IsRgb())
                {
                    return WEX::Common::NoThrowString().Format(L"{RGB:0x%06x}", color.GetRGB());
                }
                else
                {
                    return WEX::Common::NoThrowString().Format(L"{index:0x%04x}", color._red);
                }
            }
        };
    }
}
#endif
static_assert(sizeof(TextColor) <= 4 * sizeof(BYTE), "We should only need 4B for an entire TextColor. Any more than that is just waste");