// terminal/src/terminal/adapter/ut_adapter/adapterTest.cpp

// Copyright (c) Microsoft Corporation.
// Licensed under the MIT license.
#include "precomp.h"
#include <wextestclass.h>
#include "../../inc/consoletaeftemplates.hpp"
#include "adaptDispatch.hpp"
using namespace WEX::Common;
using namespace WEX::Logging;
using namespace WEX::TestExecution;
namespace Microsoft
{
namespace Console
{
namespace VirtualTerminal
{
class AdapterTest;
class ConAdapterTestGetSet;
};
};
};
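
// Helper enums describing the cursor start position, movement direction, and
// absolute-positioning operation that a given adapter test case exercises.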
enum class CursorY
{
    TOP,
    BOTTOM,
    YCENTER
};

enum class CursorX
{
    LEFT,
    RIGHT,
    XCENTER
};

enum class CursorDirection : size_t
{
    UP = 0,
    DOWN = 1,
    RIGHT = 2,
    LEFT = 3,
    NEXTLINE = 4,
    PREVLINE = 5
};

enum class AbsolutePosition : size_t
{
    CursorHorizontal = 0,
    VerticalLine = 1,
};
using namespace Microsoft::Console::VirtualTerminal;
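
// TestGetSet is the mock ConGetSet implementation used by the adapter tests.
// The pattern for each mocked API is the same: log the call, and if the
// corresponding _*Result flag is set, VERIFY the arguments against the
// _expected* fields configured by the test; the flag is then returned so a
// test can also simulate the underlying API failing.
//
// An illustrative (not verbatim) setup might look like the following, assuming
// the test fixture keeps the mock in a member such as _testGetSet:
//
//   _testGetSet->_setConsoleCursorPositionResult = true;
//   _testGetSet->_expectedCursorPos = { 0, 0 };
//   // ...then drive AdaptDispatch and expect SetConsoleCursorPosition to be
//   // called with { 0, 0 }.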
class TestGetSet final : public ConGetSet
{
public:
    bool GetConsoleScreenBufferInfoEx(CONSOLE_SCREEN_BUFFER_INFOEX& sbiex) const override
    {
        Log::Comment(L"GetConsoleScreenBufferInfoEx MOCK returning data...");
        if (_getConsoleScreenBufferInfoExResult)
        {
            sbiex.dwSize = _bufferSize;
            sbiex.srWindow = _viewport;
            sbiex.dwCursorPosition = _cursorPos;
            sbiex.wAttributes = _attribute.GetLegacyAttributes();
        }
        return _getConsoleScreenBufferInfoExResult;
    }
    bool SetConsoleScreenBufferInfoEx(const CONSOLE_SCREEN_BUFFER_INFOEX& sbiex) override
    {
        Log::Comment(L"SetConsoleScreenBufferInfoEx MOCK returning data...");
        if (_setConsoleScreenBufferInfoExResult)
        {
            VERIFY_ARE_EQUAL(_expectedCursorPos, sbiex.dwCursorPosition);
            VERIFY_ARE_EQUAL(_expectedScreenBufferSize, sbiex.dwSize);
            VERIFY_ARE_EQUAL(_expectedScreenBufferViewport, sbiex.srWindow);
            VERIFY_ARE_EQUAL(_expectedAttribute, TextAttribute{ sbiex.wAttributes });
        }
        return _setConsoleScreenBufferInfoExResult;
    }
    bool SetConsoleCursorPosition(const COORD position) override
    {
        Log::Comment(L"SetConsoleCursorPosition MOCK called...");
        if (_setConsoleCursorPositionResult)
        {
            VERIFY_ARE_EQUAL(_expectedCursorPos, position);
            _cursorPos = position;
        }
        return _setConsoleCursorPositionResult;
    }
    bool SetConsoleWindowInfo(const bool absolute, const SMALL_RECT& window) override
    {
        Log::Comment(L"SetConsoleWindowInfo MOCK called...");
        if (_setConsoleWindowInfoResult)
        {
            VERIFY_ARE_EQUAL(_expectedWindowAbsolute, absolute);
            VERIFY_ARE_EQUAL(_expectedConsoleWindow, window);
            _viewport = window;
        }
        return _setConsoleWindowInfoResult;
    }
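
    // The mocks below cover the VT input and parser mode interfaces; they
    // follow the same verify-and-return-flag pattern as the console API mocks
    // above.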
    bool SetInputMode(const TerminalInput::Mode mode, const bool enabled) override
    {
        Log::Comment(L"SetInputMode MOCK called...");
        if (_setInputModeResult)
        {
            VERIFY_ARE_EQUAL(_expectedInputMode, mode);
            VERIFY_ARE_EQUAL(_expectedInputModeEnabled, enabled);
        }
        return _setInputModeResult;
    }
    bool SetParserMode(const StateMachine::Mode mode, const bool enabled) override
    {
        Log::Comment(L"SetParserMode MOCK called...");
        if (_setParserModeResult)
        {
            VERIFY_ARE_EQUAL(_expectedParserMode, mode);
            VERIFY_ARE_EQUAL(_expectedParserModeEnabled, enabled);
        }
        return _setParserModeResult;
    }
    bool GetParserMode(const StateMachine::Mode /*mode*/) const override
    {
        Log::Comment(L"GetParserMode MOCK called...");
        return false;
    }
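
    // Mocks for the private mode setters (screen mode, auto wrap, cursor
    // visibility and blinking). Each logs the call, verifies the argument when
    // the test has configured an expectation, and returns its canned result.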
    bool PrivateSetScreenMode(const bool /*reverseMode*/) override
    {
        Log::Comment(L"PrivateSetScreenMode MOCK called...");
        return true;
    }
    bool PrivateSetAutoWrapMode(const bool /*wrapAtEOL*/) override
    {
        Log::Comment(L"PrivateSetAutoWrapMode MOCK called...");
        return false;
    }
    bool PrivateShowCursor(const bool show) override
    {
        Log::Comment(L"PrivateShowCursor MOCK called...");
        if (_privateShowCursorResult)
        {
            VERIFY_ARE_EQUAL(_expectedShowCursor, show);
        }
        return _privateShowCursorResult;
    }
    bool PrivateAllowCursorBlinking(const bool enable) override
    {
        Log::Comment(L"PrivateAllowCursorBlinking MOCK called...");
        if (_privateAllowCursorBlinkingResult)
        {
            VERIFY_ARE_EQUAL(_enable, enable);
        }
        return _privateAllowCursorBlinkingResult;
    }
    bool PrivateIsVtInputEnabled() const override
    {
        return false;
    }
bool PrivateGetTextAttributes(TextAttribute& attrs) const override
{
Log::Comment(L"PrivateGetTextAttributes MOCK called...");
if (_privateGetTextAttributesResult)
{
attrs = _attribute;
}
return _privateGetTextAttributesResult;
}
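// Mock of the attribute setter: verify the incoming attributes match the
// test's expectation, then store them so later calls can read them back.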
bool PrivateSetTextAttributes(const TextAttribute& attrs) override
{
Log::Comment(L"PrivateSetTextAttributes MOCK called...");
if (_privateSetTextAttributesResult)
{
VERIFY_ARE_EQUAL(_expectedAttribute, attrs);
_attribute = attrs;
}
return _privateSetTextAttributesResult;
}
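// Line rendition mocks: the setters are stubs that just log and report
// failure, and the width query simply returns the full buffer width.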
bool PrivateSetCurrentLineRendition(const LineRendition /*lineRendition*/) override
{
Log::Comment(L"PrivateSetCurrentLineRendition MOCK called...");
return false;
}
bool PrivateResetLineRenditionRange(const size_t /*startRow*/, const size_t /*endRow*/) override
{
Log::Comment(L"PrivateResetLineRenditionRange MOCK called...");
return false;
}
SHORT PrivateGetLineWidth(const size_t /*row*/) const override
{
Log::Comment(L"PrivateGetLineWidth MOCK called...");
return _bufferSize.X;
}
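// Mock of the input writer: captures the events handed to it so the tests
// can verify what the dispatcher would have written to the input buffer.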
bool PrivateWriteConsoleInputW(std::deque<std::unique_ptr<IInputEvent>>& events,
size_t& eventsWritten) override
{
Log::Comment(L"PrivateWriteConsoleInputW MOCK called...");
if (_privateWriteConsoleInputWResult)
{
// move all the input events we were given into local storage so we can test against them
Log::Comment(NoThrowString().Format(L"Moving %zu input events into local storage...", events.size()));
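// If the test asked to retain input, append to the previously captured
// events; otherwise replace the captured events with this batch.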
if (_retainInput)
{
std::move(events.begin(), events.end(), std::back_inserter(_events));
}
else
{
_events.clear();
_events.swap(events);
}
eventsWritten = _events.size();
}
return _privateWriteConsoleInputWResult;
}
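// Mock of the control input writer: when enabled, verify the event is a
// Ctrl+C key press (virtual key 'C', character 0x3, Ctrl held down).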
bool PrivateWriteConsoleControlInput(_In_ KeyEvent key) override
{
Log::Comment(L"PrivateWriteConsoleControlInput MOCK called...");
if (_privateWriteConsoleControlInputResult)
{
VERIFY_ARE_EQUAL('C', key.GetVirtualKeyCode());
VERIFY_ARE_EQUAL(0x3, key.GetCharData());
VERIFY_ARE_EQUAL(true, key.IsCtrlPressed());
}
return _privateWriteConsoleControlInputResult;
}
bool PrivateSetScrollingRegion(const SMALL_RECT& scrollMargins) override
{
Log::Comment(L"PrivateSetScrollingRegion MOCK called...");
if (_privateSetScrollingRegionResult)
{
VERIFY_ARE_EQUAL(_expectedScrollRegion, scrollMargins);
}
return _privateSetScrollingRegionResult;
}
bool PrivateWarningBell() override
{
Log::Comment(L"PrivateWarningBell MOCK called...");
// We made it through the adapter, woo! Return true.
return TRUE;
}
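// Line feed mocks: report the mode configured by the test, and verify that
// the dispatcher requests a carriage return only when expected.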
bool PrivateGetLineFeedMode() const override
{
Log::Comment(L"PrivateGetLineFeedMode MOCK called...");
return _privateGetLineFeedModeResult;
}
bool PrivateLineFeed(const bool withReturn) override
{
Log::Comment(L"PrivateLineFeed MOCK called...");
if (_privateLineFeedResult)
{
VERIFY_ARE_EQUAL(_expectedLineFeedWithReturn, withReturn);
}
return _privateLineFeedResult;
}
bool PrivateReverseLineFeed() override
{
Log::Comment(L"PrivateReverseLineFeed MOCK called...");
// We made it through the adapter, woo! Return true.
return TRUE;
}
bool SetConsoleTitleW(const std::wstring_view title) override
{
Log::Comment(L"SetConsoleTitleW MOCK called...");
if (_setConsoleTitleWResult)
{
// Put into WEX strings for rich logging when they don't compare.
VERIFY_ARE_EQUAL(String(_expectedWindowTitle.data(), gsl::narrow<int>(_expectedWindowTitle.size())),
String(title.data(), gsl::narrow<int>(title.size())));
}
return TRUE;
}
bool PrivateUseAlternateScreenBuffer() override
{
Log::Comment(L"PrivateUseAlternateScreenBuffer MOCK called...");
return true;
}
bool PrivateUseMainScreenBuffer() override
{
Log::Comment(L"PrivateUseMainScreenBuffer MOCK called...");
return true;
}
bool PrivateEraseAll() override
{
Log::Comment(L"PrivateEraseAll MOCK called...");
return TRUE;
}
bool PrivateClearBuffer() override
{
Log::Comment(L"PrivateClearBuffer MOCK called...");
return TRUE;
}
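// Cursor style mocks: the user default style is always reported as Legacy,
// and SetCursorStyle verifies the requested style against the expectation.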
bool GetUserDefaultCursorStyle(CursorType& style) override
{
style = CursorType::Legacy;
return true;
}
bool SetCursorStyle(const CursorType cursorType) override
{
Log::Comment(L"SetCursorStyle MOCK called...");
if (_setCursorStyleResult)
{
VERIFY_ARE_EQUAL(_expectedCursorStyle, cursorType);
}
return _setCursorStyleResult;
}
bool PrivateRefreshWindow() override
{
Log::Comment(L"PrivateRefreshWindow MOCK called...");
// We made it through the adapter, woo! Return true.
return TRUE;
}
bool PrivateSuppressResizeRepaint() override
{
Log::Comment(L"PrivateSuppressResizeRepaint MOCK called...");
VERIFY_IS_TRUE(false, L"AdaptDispatch should never be calling this function.");
return FALSE;
}
bool SetConsoleOutputCP(const unsigned int codepage) override
{
Log::Comment(L"SetConsoleOutputCP MOCK called...");
if (_setConsoleOutputCPResult)
{
VERIFY_ARE_EQUAL(_expectedOutputCP, codepage);
}
return _setConsoleOutputCPResult;
}
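// Reports the test's expected code page when the mock is configured to report success.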
bool GetConsoleOutputCP(unsigned int& codepage) override
{
Log::Comment(L"GetConsoleOutputCP MOCK called...");
if (_getConsoleOutputCPResult)
{
codepage = _expectedOutputCP;
}
return _getConsoleOutputCPResult;
}
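// Lets tests simulate whether the console is backed by a pty.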
bool IsConsolePty() const override
{
Log::Comment(L"IsConsolePty MOCK called...");
return _isPty;
}
bool DeleteLines(const size_t /*count*/) override
{
Log::Comment(L"DeleteLines MOCK called...");
return true;
}
bool InsertLines(const size_t /*count*/) override
{
Log::Comment(L"InsertLines MOCK called...");
return true;
}
bool MoveToBottom() const override
{
Log::Comment(L"MoveToBottom MOCK called...");
return _moveToBottomResult;
}
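// Verifies the palette index being queried; returns INVALID_COLOR when the
// mock is configured to fail.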
COLORREF GetColorTableEntry(const size_t tableIndex) const noexcept override
{
Log::Comment(L"GetColorTableEntry MOCK called...");
if (_getColorTableEntryResult)
{
VERIFY_ARE_EQUAL(_expectedColorTableIndex, tableIndex);
// Simply returning the index as the color value makes it easy for
// tests to confirm that they've received the color they expected.
return gsl::narrow_cast<COLORREF>(tableIndex);
}
return INVALID_COLOR;
}
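// Verifies that the dispatcher sets the expected palette index and color value.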
bool SetColorTableEntry(const size_t tableIndex, const COLORREF color) noexcept override
{
Log::Comment(L"SetColorTableEntry MOCK called...");
if (_setColorTableEntryResult)
{
VERIFY_ARE_EQUAL(_expectedColorTableIndex, tableIndex);
VERIFY_ARE_EQUAL(_expectedColorValue, color);
}
return _setColorTableEntryResult;
}
bool PrivateFillRegion(const COORD /*startPosition*/,
const size_t /*fillLength*/,
const wchar_t /*fillChar*/,
const bool /*standardFillAttrs*/) noexcept override
{
Log::Comment(L"PrivateFillRegion MOCK called...");
return TRUE;
}
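    // Mock of the private scroll API. Logs the call and reports success without
    // moving any buffer contents.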
bool PrivateScrollRegion(const SMALL_RECT /*scrollRect*/,
const std::optional<SMALL_RECT> /*clipRect*/,
const COORD /*destinationOrigin*/,
const bool /*standardFillAttrs*/) noexcept override
{
Log::Comment(L"PrivateScrollRegion MOCK called...");
return TRUE;
}
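    // Mock of the soft font update API. Verifies that the reported cell size
    // matches the size the test expects.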
bool PrivateUpdateSoftFont(const gsl::span<const uint16_t> /*bitPattern*/,
const SIZE cellSize,
const size_t /*centeringHint*/) noexcept override
{
Log::Comment(L"PrivateUpdateSoftFont MOCK called...");
Log::Comment(NoThrowString().Format(L"Cell size: %dx%d", cellSize.cx, cellSize.cy));
VERIFY_ARE_EQUAL(_expectedCellSize.cx, cellSize.cx);
VERIFY_ARE_EQUAL(_expectedCellSize.cy, cellSize.cy);
return TRUE;
}
void PrepData()
{
PrepData(CursorDirection::UP); // if called like this, the cursor direction doesn't matter.
}
void PrepData(CursorDirection dir)
{
switch (dir)
{
case CursorDirection::UP:
return PrepData(CursorX::LEFT, CursorY::TOP);
case CursorDirection::DOWN:
return PrepData(CursorX::LEFT, CursorY::BOTTOM);
case CursorDirection::LEFT:
return PrepData(CursorX::LEFT, CursorY::TOP);
case CursorDirection::RIGHT:
return PrepData(CursorX::RIGHT, CursorY::TOP);
case CursorDirection::NEXTLINE:
return PrepData(CursorX::LEFT, CursorY::BOTTOM);
case CursorDirection::PREVLINE:
return PrepData(CursorX::LEFT, CursorY::TOP);
}
}
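    // Resets the mock to a known-good state: all API results succeed, the buffer
    // and viewport get fixed sizes, and the cursor is placed as requested.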
void PrepData(CursorX xact, CursorY yact)
{
Log::Comment(L"Resetting mock data state.");
// APIs succeed by default
_setConsoleCursorPositionResult = TRUE;
_getConsoleScreenBufferInfoExResult = TRUE;
_privateGetTextAttributesResult = TRUE;
_privateSetTextAttributesResult = TRUE;
_privateWriteConsoleInputWResult = TRUE;
_privateWriteConsoleControlInputResult = TRUE;
_setConsoleWindowInfoResult = TRUE;
_moveToBottomResult = true;
_bufferSize.X = 100;
_bufferSize.Y = 600;
// Viewport sitting in the "middle" of the buffer somewhere (so all sides have excess buffer around them)
_viewport.Top = 20;
_viewport.Bottom = 49;
_viewport.Left = 30;
_viewport.Right = 59;
// Call cursor positions separately
PrepCursor(xact, yact);
_cursorVisible = TRUE;
// Attribute default is gray on black.
_attribute = TextAttribute{ FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED };
_expectedAttribute = _attribute;
_events.clear();
_retainInput = false;
}
void PrepCursor(CursorX xact, CursorY yact)
{
Log::Comment(L"Adjusting cursor within viewport... Expected will match actual when done.");
switch (xact)
{
case CursorX::LEFT:
Log::Comment(L"Cursor set to left edge of buffer.");
_cursorPos.X = 0;
break;
case CursorX::RIGHT:
Log::Comment(L"Cursor set to right edge of buffer.");
_cursorPos.X = _bufferSize.X - 1;
break;
case CursorX::XCENTER:
Log::Comment(L"Cursor set to centered X of buffer.");
_cursorPos.X = _bufferSize.X / 2;
break;
}
switch (yact)
{
case CursorY::TOP:
Log::Comment(L"Cursor set to top edge of viewport.");
_cursorPos.Y = _viewport.Top;
break;
case CursorY::BOTTOM:
Log::Comment(L"Cursor set to bottom edge of viewport.");
_cursorPos.Y = _viewport.Bottom - 1;
break;
case CursorY::YCENTER:
Log::Comment(L"Cursor set to centered Y of viewport.");
_cursorPos.Y = _viewport.Top + ((_viewport.Bottom - _viewport.Top) / 2);
break;
}
_expectedCursorPos = _cursorPos;
}
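    // Verifies that the input buffer contains the expected response string,
    // with one key-down and one key-up event per character.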
void ValidateInputEvent(_In_ PCWSTR pwszExpectedResponse)
{
size_t const cchResponse = wcslen(pwszExpectedResponse);
size_t const eventCount = _events.size();
VERIFY_ARE_EQUAL(cchResponse * 2, eventCount, L"We should receive TWO input records for every character in the expected string. Key down and key up.");
for (size_t iInput = 0; iInput < eventCount; iInput++)
{
wchar_t const wch = pwszExpectedResponse[iInput / 2]; // the same portion of the string will be used twice. 0/2 = 0. 1/2 = 0. 2/2 = 1. 3/2 = 1. and so on.
VERIFY_ARE_EQUAL(InputEventType::KeyEvent, _events[iInput]->EventType());
const KeyEvent* const keyEvent = static_cast<const KeyEvent* const>(_events[iInput].get());
// every even key is down. every odd key is up. DOWN = 0, UP = 1. DOWN = 2, UP = 3. and so on.
VERIFY_ARE_EQUAL((bool)!(iInput % 2), keyEvent->IsKeyDown());
VERIFY_ARE_EQUAL(0u, keyEvent->GetActiveModifierKeys());
Log::Comment(NoThrowString().Format(L"Comparing '%c' with '%c'...", wch, keyEvent->GetCharData()));
VERIFY_ARE_EQUAL(wch, keyEvent->GetCharData());
VERIFY_ARE_EQUAL(1u, keyEvent->GetRepeatCount());
VERIFY_ARE_EQUAL(0u, keyEvent->GetVirtualKeyCode());
VERIFY_ARE_EQUAL(0u, keyEvent->GetVirtualScanCode());
}
}
bool PrivateAddHyperlink(const std::wstring_view /*uri*/, const std::wstring_view /*params*/) const
{
Log::Comment(L"PrivateAddHyperlink MOCK called...");
return TRUE;
}
bool PrivateEndHyperlink() const
{
Log::Comment(L"PrivateEndHyperlink MOCK called...");
return TRUE;
}
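    // Fills in the top/bottom margins of a VT-space rectangle and records the
    // equivalent zero-based scroll region the mock expects to receive.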
void _SetMarginsHelper(SMALL_RECT* rect, SHORT top, SHORT bottom)
{
rect->Top = top;
rect->Bottom = bottom;
//The rectangle is going to get converted from VT space to conhost space
_expectedScrollRegion.Top = (top > 0) ? rect->Top - 1 : rect->Top;
_expectedScrollRegion.Bottom = (bottom > 0) ? rect->Bottom - 1 : rect->Bottom;
}
~TestGetSet()
{
}
static const WCHAR s_wchErase = (WCHAR)0x20;
static const WCHAR s_wchDefault = L'Z';
static const WORD s_wAttrErase = FOREGROUND_BLUE | FOREGROUND_GREEN | BACKGROUND_RED | BACKGROUND_INTENSITY;
static const WORD s_wDefaultAttribute = 0;
static const WORD s_defaultFill = FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED; // dark gray on black.
std::deque<std::unique_ptr<IInputEvent>> _events;
bool _retainInput{ false };
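    // Enables input retention for the lifetime of the returned scope guard;
    // the previous setting is restored when the guard goes out of scope.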
auto EnableInputRetentionInScope()
{
auto oldRetainValue{ _retainInput };
_retainInput = true;
return wil::scope_exit([oldRetainValue, this] {
_retainInput = oldRetainValue;
});
}
COORD _bufferSize = { 0, 0 };
SMALL_RECT _viewport = { 0, 0, 0, 0 };
SMALL_RECT _expectedConsoleWindow = { 0, 0, 0, 0 };
COORD _cursorPos = { 0, 0 };
SMALL_RECT _expectedScrollRegion = { 0, 0, 0, 0 };
bool _cursorVisible = false;
COORD _expectedCursorPos = { 0, 0 };
TextAttribute _attribute = {};
TextAttribute _expectedAttribute = {};
unsigned int _expectedOutputCP = 0;
bool _isPty = false;
bool _privateShowCursorResult = false;
bool _expectedShowCursor = false;
bool _getConsoleScreenBufferInfoExResult = false;
bool _setConsoleCursorPositionResult = false;
bool _privateGetTextAttributesResult = false;
bool _privateSetTextAttributesResult = false;
bool _privateWriteConsoleInputWResult = false;
bool _privateWriteConsoleControlInputResult = false;
bool _setConsoleWindowInfoResult = false;
bool _expectedWindowAbsolute = false;
bool _setConsoleScreenBufferInfoExResult = false;
COORD _expectedScreenBufferSize = { 0, 0 };
SMALL_RECT _expectedScreenBufferViewport{ 0, 0, 0, 0 };
bool _setInputModeResult = false;
TerminalInput::Mode _expectedInputMode;
bool _expectedInputModeEnabled = false;
bool _setParserModeResult = false;
StateMachine::Mode _expectedParserMode;
bool _expectedParserModeEnabled = false;
bool _privateAllowCursorBlinkingResult = false;
bool _enable = false; // for cursor blinking
bool _privateSetScrollingRegionResult = false;
bool _privateGetLineFeedModeResult = false;
bool _privateLineFeedResult = false;
bool _expectedLineFeedWithReturn = false;
bool _privateReverseLineFeedResult = false;
bool _setConsoleTitleWResult = false;
std::wstring_view _expectedWindowTitle{};
bool _setCursorStyleResult = false;
CursorType _expectedCursorStyle;
bool _setConsoleOutputCPResult = false;
bool _getConsoleOutputCPResult = false;
bool _moveToBottomResult = false;
bool _getColorTableEntryResult = false;
bool _setColorTableEntryResult = false;
size_t _expectedColorTableIndex = SIZE_MAX;
COLORREF _expectedColorValue = INVALID_COLOR;
SIZE _expectedCellSize = {};
private:
HANDLE _hCon;
};
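// A do-nothing implementation of AdaptDefaults: the adapter tests only exercise the dispatch
// methods, so any text that would be printed or executed can simply be discarded.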
class DummyAdapter : public AdaptDefaults
{
void Print(const wchar_t /*wch*/) override
{
}
void PrintString(const std::wstring_view /*string*/) override
{
}
void Execute(const wchar_t /*wch*/) override
{
}
};
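// Drives AdaptDispatch through the TestGetSet mock, checking each dispatch method against the
// expectations recorded in the mock.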
class AdapterTest
{
public:
TEST_CLASS(AdapterTest);
TEST_METHOD_SETUP(SetupMethods)
{
bool fSuccess = true;
auto api = std::make_unique<TestGetSet>();
fSuccess = api.get() != nullptr;
if (fSuccess)
{
auto adapter = std::make_unique<DummyAdapter>();
// give AdaptDispatch ownership of _testGetSet
_testGetSet = api.get(); // keep a copy for us but don't manage its lifetime anymore.
_pDispatch = std::make_unique<AdaptDispatch>(std::move(api), std::move(adapter));
fSuccess = _pDispatch != nullptr;
}
return fSuccess;
}
TEST_METHOD_CLEANUP(CleanupMethods)
{
_pDispatch.reset();
_testGetSet = nullptr;
return true;
}
TEST_METHOD(CursorMovementTest)
{
BEGIN_TEST_METHOD_PROPERTIES()
TEST_METHOD_PROPERTY(L"Data:uiDirection", L"{0, 1, 2, 3, 4, 5}") // These values align with the CursorDirection enum class to try all the directions.
END_TEST_METHOD_PROPERTIES()
Log::Comment(L"Starting test...");
// Used to switch between the various function options.
typedef bool (AdaptDispatch::*CursorMoveFunc)(size_t);
CursorMoveFunc moveFunc = nullptr;
// Modify variables based on directionality of this test
CursorDirection direction;
size_t dir;
VERIFY_SUCCEEDED_RETURN(TestData::TryGetValue(L"uiDirection", dir));
direction = (CursorDirection)dir;
switch (direction)
{
case CursorDirection::UP:
Log::Comment(L"Testing up direction.");
moveFunc = &AdaptDispatch::CursorUp;
break;
case CursorDirection::DOWN:
Log::Comment(L"Testing down direction.");
moveFunc = &AdaptDispatch::CursorDown;
break;
case CursorDirection::RIGHT:
Log::Comment(L"Testing right direction.");
moveFunc = &AdaptDispatch::CursorForward;
break;
case CursorDirection::LEFT:
Log::Comment(L"Testing left direction.");
moveFunc = &AdaptDispatch::CursorBackward;
break;
case CursorDirection::NEXTLINE:
Log::Comment(L"Testing next line direction.");
moveFunc = &AdaptDispatch::CursorNextLine;
break;
case CursorDirection::PREVLINE:
Log::Comment(L"Testing prev line direction.");
moveFunc = &AdaptDispatch::CursorPrevLine;
break;
}
if (moveFunc == nullptr)
{
VERIFY_FAIL();
return;
}
// success cases
// place cursor in top left. moving up is expected to go nowhere (it should get bounded by the viewport)
Log::Comment(L"Test 1: Cursor doesn't move when placed in corner of viewport.");
_testGetSet->PrepData(direction);
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(1));
Log::Comment(L"Test 1b: Cursor moves to left of line with next/prev line command when cursor can't move higher/lower.");
bool fDoTest1b = false;
switch (direction)
{
case CursorDirection::NEXTLINE:
_testGetSet->PrepData(CursorX::RIGHT, CursorY::BOTTOM);
fDoTest1b = true;
break;
case CursorDirection::PREVLINE:
_testGetSet->PrepData(CursorX::RIGHT, CursorY::TOP);
fDoTest1b = true;
break;
}
if (fDoTest1b)
{
_testGetSet->_expectedCursorPos.X = 0;
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(1));
}
else
{
Log::Comment(L"Test not applicable to direction selected. Skipping.");
}
// place the cursor in the center of the viewport, then move it 1 in the test direction.
Log::Comment(L"Test 2: Cursor moves 1 in the correct direction from viewport.");
_testGetSet->PrepData(CursorX::XCENTER, CursorY::YCENTER);
switch (direction)
{
case CursorDirection::UP:
_testGetSet->_expectedCursorPos.Y--;
break;
case CursorDirection::DOWN:
_testGetSet->_expectedCursorPos.Y++;
break;
case CursorDirection::RIGHT:
_testGetSet->_expectedCursorPos.X++;
break;
case CursorDirection::LEFT:
_testGetSet->_expectedCursorPos.X--;
break;
case CursorDirection::NEXTLINE:
_testGetSet->_expectedCursorPos.Y++;
_testGetSet->_expectedCursorPos.X = 0;
break;
case CursorDirection::PREVLINE:
_testGetSet->_expectedCursorPos.Y--;
_testGetSet->_expectedCursorPos.X = 0;
break;
}
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(1));
// place the cursor in the center and move it too far in the test direction. It should get bounded by the viewport (or buffer edge).
Log::Comment(L"Test 3: Cursor moves and gets stuck at viewport when started away from edges and moved beyond edges.");
_testGetSet->PrepData(CursorX::XCENTER, CursorY::YCENTER);
// The bottom and right viewport boundaries are exclusive (specified to be 1 outside the viewable area), so the expected positions on those sides are offset by -1.
switch (direction)
{
case CursorDirection::UP:
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Top;
break;
case CursorDirection::DOWN:
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Bottom - 1;
break;
case CursorDirection::RIGHT:
_testGetSet->_expectedCursorPos.X = _testGetSet->_bufferSize.X - 1;
break;
case CursorDirection::LEFT:
_testGetSet->_expectedCursorPos.X = 0;
break;
case CursorDirection::NEXTLINE:
_testGetSet->_expectedCursorPos.X = 0;
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Bottom - 1;
break;
case CursorDirection::PREVLINE:
_testGetSet->_expectedCursorPos.X = 0;
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Top;
break;
}
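// Moving by a large distance (100) should clamp the cursor at the boundary position expected above.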
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(100));
// error cases
// SetConsoleCursorPosition throws failure. Parameters are otherwise normal.
Log::Comment(L"Test 4: When SetConsoleCursorPosition throws a failure, call fails and cursor doesn't move.");
_testGetSet->PrepData(direction);
_testGetSet->_setConsoleCursorPositionResult = FALSE;
VERIFY_IS_FALSE((_pDispatch.get()->*(moveFunc))(0));
VERIFY_ARE_EQUAL(_testGetSet->_expectedCursorPos, _testGetSet->_cursorPos);
// GetConsoleScreenBufferInfo throws failure. Parameters are otherwise normal.
Log::Comment(L"Test 5: When GetConsoleScreenBufferInfo throws a failure, call fails and cursor doesn't move.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
_testGetSet->_getConsoleScreenBufferInfoExResult = FALSE;
VERIFY_IS_FALSE((_pDispatch.get()->*(moveFunc))(0));
VERIFY_ARE_EQUAL(_testGetSet->_expectedCursorPos, _testGetSet->_cursorPos);
}
TEST_METHOD(CursorPositionTest)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Place cursor within the viewport. Start from top left, move to middle.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
short sCol = (_testGetSet->_viewport.Right - _testGetSet->_viewport.Left) / 2;
short sRow = (_testGetSet->_viewport.Bottom - _testGetSet->_viewport.Top) / 2;
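// VT coordinates are 1-based, so the expected 0-based buffer coordinates below are offset by -1.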
// The X coordinate is unaffected by the viewport.
_testGetSet->_expectedCursorPos.X = sCol - 1;
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Top + (sRow - 1);
VERIFY_IS_TRUE(_pDispatch.get()->CursorPosition(sRow, sCol));
Log::Comment(L"Test 2: Move to 0, 0 (which is 1,1 in VT speak)");
_testGetSet->PrepData(CursorX::RIGHT, CursorY::BOTTOM);
// The X coordinate is unaffected by the viewport.
_testGetSet->_expectedCursorPos.X = 0;
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Top;
VERIFY_IS_TRUE(_pDispatch.get()->CursorPosition(1, 1));
Log::Comment(L"Test 3: Move beyond rectangle (down/right too far). Should be bounded back in.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
sCol = (_testGetSet->_bufferSize.X) * 2;
sRow = (_testGetSet->_viewport.Bottom - _testGetSet->_viewport.Top) * 2;
_testGetSet->_expectedCursorPos.X = _testGetSet->_bufferSize.X - 1;
_testGetSet->_expectedCursorPos.Y = _testGetSet->_viewport.Bottom - 1;
VERIFY_IS_TRUE(_pDispatch.get()->CursorPosition(sRow, sCol));
Log::Comment(L"Test 4: GetConsoleInfo API returns false. No move, return false.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
_testGetSet->_getConsoleScreenBufferInfoExResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->CursorPosition(1, 1));
Log::Comment(L"Test 5: SetCursor API returns false. No move, return false.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
_testGetSet->_setConsoleCursorPositionResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->CursorPosition(1, 1));
}
TEST_METHOD(CursorSingleDimensionMoveTest)
{
BEGIN_TEST_METHOD_PROPERTIES()
TEST_METHOD_PROPERTY(L"Data:uiDirection", L"{0, 1}") // These values align with the CursorDirection enum class to try all the directions.
END_TEST_METHOD_PROPERTIES()
Log::Comment(L"Starting test...");
// Used to switch between the various function options.
typedef bool (AdaptDispatch::*CursorMoveFunc)(size_t);
CursorMoveFunc moveFunc = nullptr;
SHORT sRangeEnd = 0;
SHORT sRangeStart = 0;
SHORT* psCursorExpected = nullptr;
// Modify variables based on directionality of this test
AbsolutePosition direction;
size_t dir;
VERIFY_SUCCEEDED_RETURN(TestData::TryGetValue(L"uiDirection", dir));
direction = (AbsolutePosition)dir;
_testGetSet->PrepData();
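        // The switch below configures each permutation: the valid coordinate range,
        // the expected-cursor field to check, and the dispatch method under test
        // (the CHA/HPA-style column move, or the VPA-style row move).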
switch (direction)
{
case AbsolutePosition::CursorHorizontal:
Log::Comment(L"Testing cursor horizontal movement.");
sRangeEnd = _testGetSet->_bufferSize.X;
sRangeStart = 0;
psCursorExpected = &_testGetSet->_expectedCursorPos.X;
moveFunc = &AdaptDispatch::CursorHorizontalPositionAbsolute;
break;
case AbsolutePosition::VerticalLine:
Log::Comment(L"Testing vertical line movement.");
sRangeEnd = _testGetSet->_viewport.Bottom;
sRangeStart = _testGetSet->_viewport.Top;
psCursorExpected = &_testGetSet->_expectedCursorPos.Y;
moveFunc = &AdaptDispatch::VerticalLinePositionAbsolute;
break;
}
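        // If the permutation value didn't map to a known direction, fail the test
        // rather than dereferencing a null member function pointer below.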
if (moveFunc == nullptr || psCursorExpected == nullptr)
{
VERIFY_FAIL();
return;
}
Log::Comment(L"Test 1: Place cursor within the viewport. Start from top left, move to middle.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
short sVal = (sRangeEnd - sRangeStart) / 2;
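        // VT positioning parameters are 1-based, so a parameter of sVal should land
        // the cursor at an offset of (sVal - 1) from the start of the valid range.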
*psCursorExpected = sRangeStart + (sVal - 1);
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(sVal));
Log::Comment(L"Test 2: Move to 0 (which is 1 in VT speak)");
_testGetSet->PrepData(CursorX::RIGHT, CursorY::BOTTOM);
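        // A parameter of 1 is the minimum value and maps to the very first position
        // in the valid range.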
*psCursorExpected = sRangeStart;
sVal = 1;
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(sVal));
Log::Comment(L"Test 3: Move beyond rectangle (down/right too far). Should be bounded back in.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
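        // A parameter well past the end of the range should be clamped to the last
        // valid position rather than rejected.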
sVal = (sRangeEnd - sRangeStart) * 2;
*psCursorExpected = sRangeEnd - 1;
VERIFY_IS_TRUE((_pDispatch.get()->*(moveFunc))(sVal));
Log::Comment(L"Test 4: GetConsoleInfo API returns false. No move, return false.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
_testGetSet->_getConsoleScreenBufferInfoExResult = FALSE;
sVal = 1;
VERIFY_IS_FALSE((_pDispatch.get()->*(moveFunc))(sVal));
Log::Comment(L"Test 5: SetCursor API returns false. No move, return false.");
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
_testGetSet->_setConsoleCursorPositionResult = FALSE;
sVal = 1;
VERIFY_IS_FALSE((_pDispatch.get()->*(moveFunc))(sVal));
}
TEST_METHOD(CursorSaveRestoreTest)
{
Log::Comment(L"Starting test...");
COORD coordExpected = { 0 };
Log::Comment(L"Test 1: Restore with no saved data should move to top-left corner, the null/default position.");
// Move cursor to top left and save off expected position.
_testGetSet->PrepData(CursorX::LEFT, CursorY::TOP);
coordExpected = _testGetSet->_expectedCursorPos;
// Then move cursor to the middle and reset the expected to the top left.
_testGetSet->PrepData(CursorX::XCENTER, CursorY::YCENTER);
_testGetSet->_expectedCursorPos = coordExpected;
// Attributes are restored to defaults.
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch.get()->CursorRestoreState(), L"By default, restore to top left corner (0,0 offset from viewport).");
Log::Comment(L"Test 2: Place cursor in center. Save. Move cursor to corner. Restore. Should come back to center.");
_testGetSet->PrepData(CursorX::XCENTER, CursorY::YCENTER);
VERIFY_IS_TRUE(_pDispatch.get()->CursorSaveState(), L"Succeed at saving position.");
Log::Comment(L"Backup expected cursor (in the middle). Move cursor to corner. Then re-set expected cursor to middle.");
// save expected cursor position
coordExpected = _testGetSet->_expectedCursorPos;
// adjust cursor to corner
_testGetSet->PrepData(CursorX::LEFT, CursorY::BOTTOM);
// restore expected cursor position to center.
_testGetSet->_expectedCursorPos = coordExpected;
VERIFY_IS_TRUE(_pDispatch.get()->CursorRestoreState(), L"Restoring to corner should succeed. API call inside will test that cursor matched expected position.");
}
TEST_METHOD(CursorHideShowTest)
{
BEGIN_TEST_METHOD_PROPERTIES()
TEST_METHOD_PROPERTY(L"Data:fStartingVis", L"{TRUE, FALSE}")
TEST_METHOD_PROPERTY(L"Data:fEndingVis", L"{TRUE, FALSE}")
END_TEST_METHOD_PROPERTIES()
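        // Runs all four start/end visibility permutations. CursorVisibility is the
        // dispatch handler behind DECTCEM (CSI ? 25 h to show, CSI ? 25 l to hide).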
Log::Comment(L"Starting test...");
// Modify variables based on permutations of this test.
bool fStart;
bool fEnd;
VERIFY_SUCCEEDED_RETURN(TestData::TryGetValue(L"fStartingVis", fStart));
VERIFY_SUCCEEDED_RETURN(TestData::TryGetValue(L"fEndingVis", fEnd));
Log::Comment(L"Test 1: Verify successful API call modifies visibility state.");
_testGetSet->PrepData();
_testGetSet->_cursorVisible = fStart;
_testGetSet->_privateShowCursorResult = true;
_testGetSet->_expectedShowCursor = fEnd;
VERIFY_IS_TRUE(_pDispatch.get()->CursorVisibility(fEnd));
Log::Comment(L"Test 3: When we fail to set updated cursor information, the dispatch should fail.");
_testGetSet->PrepData();
_testGetSet->_privateShowCursorResult = false;
VERIFY_IS_FALSE(_pDispatch.get()->CursorVisibility(fEnd));
}
TEST_METHOD(GraphicsBaseTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Send no options.");
_testGetSet->PrepData();
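        // An empty parameter list is the simplest case: SGR with no parameters is
        // conventionally treated as SGR 0, i.e. reset to the default attribute.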
VTParameter rgOptions[16];
size_t cOptions = 0;
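// Dispatching an empty option list should still succeed.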
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Test 2: Gracefully fail when getting attribute data fails.");
_testGetSet->PrepData();
_testGetSet->_privateGetTextAttributesResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Test 3: Gracefully fail when setting attribute data fails.");
_testGetSet->PrepData();
_testGetSet->_privateSetTextAttributesResult = FALSE;
// Need at least one option in order for the call to be able to fail.
rgOptions[0] = (DispatchTypes::GraphicsOptions)0;
cOptions = 1;
VERIFY_IS_FALSE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
}
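// Data-driven test: each uiGraphicsOptions value listed in the test properties below
// is dispatched as a single SGR option, and the resulting attribute is verified against
// the expectation set up in the switch statement for that option.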
TEST_METHOD(GraphicsSingleTests)
{
BEGIN_TEST_METHOD_PROPERTIES()
Add support for the "doubly underlined" graphic rendition attribute (#7223) This PR adds support for the ANSI _doubly underlined_ graphic rendition attribute, which is enabled by the `SGR 21` escape sequence. There was already an `ExtendedAttributes::DoublyUnderlined` flag in the `TextAttribute` class, but I needed to add `SetDoublyUnderlined` and `IsDoublyUnderlined` methods to access that flag, and update the `SetGraphicsRendition` methods of the two dispatchers to set the attribute on receipt of the `SGR 21` sequence. I also had to update the existing `SGR 24` handler to reset _DoublyUnderlined_ in addition to _Underlined_, since they share the same reset sequence. For the rendering, I've added a new grid line type, which essentially just draws an additional line with the same thickness as the regular underline, but slightly below it - I found a gap of around 0.05 "em" between the lines looked best. If there isn't enough space in the cell for that gap, the second line will be clamped to overlap the first, so you then just get a thicker line. If there isn't even enough space below for a thicker line, we move the offset _above_ the first line, but just enough to make it thicker. The only other complication was the update of the `Xterm256Engine` in the VT renderer. As mentioned above, the two underline attributes share the same reset sequence, so to forward that state over conpty we require a slightly more complicated process than with most other attributes (similar to _Bold_ and _Faint_). We first check whether either underline attribute needs to be turned off to send the reset sequence, and then check individually if each of them needs to be turned back on again. ## Validation Steps Performed For testing, I've extended the existing attribute tests in `AdapterTest`, `VTRendererTest`, and `ScreenBufferTests`, to make sure we're covering both the _Underlined_ and _DoublyUnderlined_ attributes. I've also manually tested the `SGR 21` sequence in conhost and Windows Terminal, with a variety of fonts and font sizes, to make sure the rendering was reasonably distinguishable from a single underline. Closes #2916
2020-08-10 19:06:16 +02:00
TEST_METHOD_PROPERTY(L"Data:uiGraphicsOptions", L"{0, 1, 2, 4, 7, 8, 9, 21, 22, 24, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 39, 40, 41, 42, 43, 44, 45, 46, 47, 49, 53, 55, 90, 91, 92, 93, 94, 95, 96, 97, 100, 101, 102, 103, 104, 105, 106, 107}") // corresponds to options in DispatchTypes::GraphicsOptions
END_TEST_METHOD_PROPERTIES()
Log::Comment(L"Starting test...");
_testGetSet->PrepData();
// Modify variables based on type of this test
DispatchTypes::GraphicsOptions graphicsOption;
size_t uiGraphicsOption;
VERIFY_SUCCEEDED_RETURN(TestData::TryGetValue(L"uiGraphicsOptions", uiGraphicsOption));
graphicsOption = (DispatchTypes::GraphicsOptions)uiGraphicsOption;
VTParameter rgOptions[16];
size_t cOptions = 1;
rgOptions[0] = graphicsOption;
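// Each case primes the mock with a starting attribute and with the attribute
// we expect to see after the single option above has been applied.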
switch (graphicsOption)
{
case DispatchTypes::GraphicsOptions::Off:
Log::Comment(L"Testing graphics 'Off/Reset'");
_testGetSet->_attribute = TextAttribute{ (WORD)~_testGetSet->s_defaultFill };
_testGetSet->_expectedAttribute = TextAttribute{};
break;
case DispatchTypes::GraphicsOptions::BoldBright:
Log::Comment(L"Testing graphics 'Bold/Bright'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute.SetBold(true);
break;
Add support for the "faint" graphic rendition attribute (#6873) ## Summary of the Pull Request This PR adds support for the `SGR 2` escape sequence, which enables the ANSI _faint_ graphic rendition attribute. When a character is output with this attribute set, it uses a dimmer version of the active foreground color. ## PR Checklist * [x] Closes #6703 * [x] CLA signed. * [x] Tests added/passed * [ ] Documentation updated. * [ ] Schema updated. * [x] I've discussed this with core contributors already. Issue number where discussion took place: #6703 ## Detailed Description of the Pull Request / Additional comments There was already an `ExtendedAttributes::Faint` flag in the `TextAttribute` class, but I needed to add `SetFaint` and `IsFaint` methods to access that flag, and update the `SetGraphicsRendition` methods of the two dispatchers to set the attribute on receipt of the `SGR 2` sequence. I also had to update the existing `SGR 22` handler to reset _Faint_ in addition to _Bold_, since they share the same reset sequence. For that reason, I thought it a good idea to change the name of the `SGR 22` enum to `NotBoldOrFaint`. For the purpose of rendering, I've updated the `TextAttribute::CalculateRgbColors` method to return a dimmer version of the foreground color when the _Faint_ attribute is set. This is simply achieved by dividing each color component by two, which produces a reasonable effect without being too complicated. Note that the _Faint_ effect is applied before _Reverse Video_, so if the output it reversed, it's the background that will be faint. The only other complication was the update of the `Xterm256Engine` in the VT renderer. As mentioned above, _Bold_ and _Faint_ share the same reset sequence, so to forward that state over conpty we have to go through a slightly more complicated process than with other attributes. We first check whether either attribute needs to be turned off to send the reset sequence, and then check if the individual attributes need to be turned on again. ## Validation I've extended the existing SGR unit tests to cover the new attribute in the `AdapterTest`, the `ScreenBufferTests`, and the `VtRendererTest`, and added a test to confirm the color calculations when _Faint_ is set in the `TextAttributeTests`. I've also done a bunch of manual testing with all the different VT color types and confirmed that our output is comparable to most other terminals.
2020-07-13 19:44:09 +02:00
case DispatchTypes::GraphicsOptions::RGBColorOrFaint:
Log::Comment(L"Testing graphics 'Faint'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute.SetFaint(true);
break;
case DispatchTypes::GraphicsOptions::Underline:
Log::Comment(L"Testing graphics 'Underline'");
_testGetSet->_attribute = TextAttribute{ 0 };
Render the SGR "underlined" attribute in the style of the font (#7148) This PR updates the rendering of the _underlined_ graphic rendition attribute, using the style specified in the active font, instead of just reusing the grid line at the bottom of the character cell. * Support for drawing the correct underline effect in the grid line renderer was added in #7107. There was already an `ExtendedAttributes` flag defined for the underlined state, but I needed to update the `SetUnderlined` and `IsUnderlined` methods in the `TextAttribute` class to use that flag now in place of the legacy `LVB_UNDERSCORE` attribute. This enables underlines set via a VT sequence to be tracked separately from `LVB_UNDERSCORE` grid lines set via the console API. I then needed to update the `Renderer::s_GetGridlines` method to activate the `GridLines::Underline` style when the `Underlined` attribute was set. The `GridLines::Bottom` style is still triggered by the `LVB_UNDERSCORE` attribute to produce the bottom grid line effect. Validation ---------- Because this is a change from the existing behaviour, certain unit tests that were expecting the `LVB_UNDERSCORE` to be toggled by `SGR 4` and `SGR 24` have now had to be updated to check the `Underlined` flag instead. There were also some UI Automation tests that were checking for `SGR 4` mapping to `LVB_UNDERSCORE` attribute, which I've now substituted with a test of the `SGR 53` overline attribute mapping to `LVB_GRID_HORIZONTAL`. These tests only work with legacy attributes, so they can't access the extended underline state, and I thought a replacement test that covered similar ground would be better than dropping the tests altogether. As far as the visual rendering is concerned, I've manually confirmed that the VT underline sequences now draw the underline in the correct position and style, while grid lines output via the console API are still displayed in their original form. Closes #2915
2020-08-03 14:49:25 +02:00
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute.SetUnderlined(true);
break;
Add support for the "doubly underlined" graphic rendition attribute (#7223) This PR adds support for the ANSI _doubly underlined_ graphic rendition attribute, which is enabled by the `SGR 21` escape sequence. There was already an `ExtendedAttributes::DoublyUnderlined` flag in the `TextAttribute` class, but I needed to add `SetDoublyUnderlined` and `IsDoublyUnderlined` methods to access that flag, and update the `SetGraphicsRendition` methods of the two dispatchers to set the attribute on receipt of the `SGR 21` sequence. I also had to update the existing `SGR 24` handler to reset _DoublyUnderlined_ in addition to _Underlined_, since they share the same reset sequence. For the rendering, I've added a new grid line type, which essentially just draws an additional line with the same thickness as the regular underline, but slightly below it - I found a gap of around 0.05 "em" between the lines looked best. If there isn't enough space in the cell for that gap, the second line will be clamped to overlap the first, so you then just get a thicker line. If there isn't even enough space below for a thicker line, we move the offset _above_ the first line, but just enough to make it thicker. The only other complication was the update of the `Xterm256Engine` in the VT renderer. As mentioned above, the two underline attributes share the same reset sequence, so to forward that state over conpty we require a slightly more complicated process than with most other attributes (similar to _Bold_ and _Faint_). We first check whether either underline attribute needs to be turned off to send the reset sequence, and then check individually if each of them needs to be turned back on again. ## Validation Steps Performed For testing, I've extended the existing attribute tests in `AdapterTest`, `VTRendererTest`, and `ScreenBufferTests`, to make sure we're covering both the _Underlined_ and _DoublyUnderlined_ attributes. I've also manually tested the `SGR 21` sequence in conhost and Windows Terminal, with a variety of fonts and font sizes, to make sure the rendering was reasonably distinguishable from a single underline. Closes #2916
2020-08-10 19:06:16 +02:00
case DispatchTypes::GraphicsOptions::DoublyUnderlined:
Log::Comment(L"Testing graphics 'Doubly Underlined'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute.SetDoublyUnderlined(true);
break;
Add support for the "overline" graphic rendition attribute (#6754) ## Summary of the Pull Request This PR adds support for the `SGR 53` and `SGR 55` escapes sequences, which enable and disable the ANSI _overline_ graphic rendition attribute, the equivalent of the console character attribute `COMMON_LVB_GRID_HORIZONTAL`. When a character is output with this attribute set, a horizontal line is rendered at the top of the character cell. ## PR Checklist * [x] Closes #6000 * [x] CLA signed. * [x] Tests added/passed * [ ] Documentation updated. * [ ] Schema updated. * [x] I've discussed this with core contributors already. ## Detailed Description of the Pull Request / Additional comments To start with, I added `SetOverline` and `IsOverlined` methods to the `TextAttribute` class, to set and get the legacy `COMMON_LVB_GRID_HORIZONTAL` attribute. Technically there was already an `IsTopHorizontalDisplayed` method, but I thought it more readable to add a separate `IsOverlined` as an alias for that. Then it was just a matter of adding calls to set and reset the attribute in response to the `SGR 53` and `SGR 55` sequences in the `SetGraphicsRendition` methods of the two dispatchers. The actual rendering was already taken care of by the `PaintBufferGridLines` method in the rendering engines. The only other change required was to update the `_UpdateExtendedAttrs` method in the `Xterm256Engine` of the VT renderer, to ensure the attribute state would be forwarded to the Windows Terminal over conpty. ## Validation Steps Performed I've extended the existing SGR unit tests to cover the new attribute in the `AdapterTest`, the `OutputEngineTest`, and the `VtRendererTest`. I've also manually tested the `SGR 53` and `SGR 55` sequences to confirm that they do actually render (or remove) an overline on the characters being output.
2020-07-06 16:11:17 +02:00
case DispatchTypes::GraphicsOptions::Overline:
Log::Comment(L"Testing graphics 'Overline'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ COMMON_LVB_GRID_HORIZONTAL };
break;
case DispatchTypes::GraphicsOptions::Negative:
Log::Comment(L"Testing graphics 'Negative'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ COMMON_LVB_REVERSE_VIDEO };
break;
Add support for the "concealed" graphic rendition attribute (#6907) ## Summary of the Pull Request This PR adds support for the `SGR 8` and `SGR 28` escape sequences, which enable and disable the _concealed/invisible_ graphic rendition attribute. When a character is output with this attribute set, it is rendered with the same foreground and background colors, so the text is essentially invisible. ## PR Checklist * [x] Closes #6876 * [x] CLA signed. * [x] Tests added/passed * [ ] Documentation updated. * [ ] Schema updated. * [x] I've discussed this with core contributors already. Issue number where discussion took place: #6876 ## Detailed Description of the Pull Request / Additional comments Most of the framework for this attribute was already implemented, so it was just a matter of updating the `TextAttribute::CalculateRgbColors` method to make the foreground the same as the background when the _Invisible_ flag was set. Note that this has to happen after the _Reverse Video_ attribute is applied, so if you have white-on-black text that is reversed and invisible, it should be all white, rather than all black. ## Validation Steps Performed There were already existing SGR unit tests covering this attribute in the `ScreenBufferTests`, and the `VtRendererTest`. But I've added to the `AdapterTest` which verifies the SGR sequences for setting and resetting the attribute, and I've extended the `TextAttributeTests` to verify that the color calculations return the correct values when the attribute is set. I've also manually confirmed that we now render the _concealed text_ values correctly in the _ISO 6429_ tests in Vttest. And I've manually tested the output of _concealed_ when combined with other attributes, and made sure that we're matching the behaviour of most other terminals.
2020-07-14 16:11:03 +02:00
case DispatchTypes::GraphicsOptions::Invisible:
Log::Comment(L"Testing graphics 'Invisible'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute.SetInvisible(true);
break;
case DispatchTypes::GraphicsOptions::CrossedOut:
Log::Comment(L"Testing graphics 'Crossed Out'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute.SetCrossedOut(true);
break;
Add support for the "faint" graphic rendition attribute (#6873) ## Summary of the Pull Request This PR adds support for the `SGR 2` escape sequence, which enables the ANSI _faint_ graphic rendition attribute. When a character is output with this attribute set, it uses a dimmer version of the active foreground color. ## PR Checklist * [x] Closes #6703 * [x] CLA signed. * [x] Tests added/passed * [ ] Documentation updated. * [ ] Schema updated. * [x] I've discussed this with core contributors already. Issue number where discussion took place: #6703 ## Detailed Description of the Pull Request / Additional comments There was already an `ExtendedAttributes::Faint` flag in the `TextAttribute` class, but I needed to add `SetFaint` and `IsFaint` methods to access that flag, and update the `SetGraphicsRendition` methods of the two dispatchers to set the attribute on receipt of the `SGR 2` sequence. I also had to update the existing `SGR 22` handler to reset _Faint_ in addition to _Bold_, since they share the same reset sequence. For that reason, I thought it a good idea to change the name of the `SGR 22` enum to `NotBoldOrFaint`. For the purpose of rendering, I've updated the `TextAttribute::CalculateRgbColors` method to return a dimmer version of the foreground color when the _Faint_ attribute is set. This is simply achieved by dividing each color component by two, which produces a reasonable effect without being too complicated. Note that the _Faint_ effect is applied before _Reverse Video_, so if the output it reversed, it's the background that will be faint. The only other complication was the update of the `Xterm256Engine` in the VT renderer. As mentioned above, _Bold_ and _Faint_ share the same reset sequence, so to forward that state over conpty we have to go through a slightly more complicated process than with other attributes. We first check whether either attribute needs to be turned off to send the reset sequence, and then check if the individual attributes need to be turned on again. ## Validation I've extended the existing SGR unit tests to cover the new attribute in the `AdapterTest`, the `ScreenBufferTests`, and the `VtRendererTest`, and added a test to confirm the color calculations when _Faint_ is set in the `TextAttributeTests`. I've also done a bunch of manual testing with all the different VT color types and confirmed that our output is comparable to most other terminals.
2020-07-13 19:44:09 +02:00
case DispatchTypes::GraphicsOptions::NotBoldOrFaint:
Log::Comment(L"Testing graphics 'No Bold or Faint'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_attribute.SetBold(true);
_testGetSet->_attribute.SetFaint(true);
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
break;
case DispatchTypes::GraphicsOptions::NoUnderline:
Log::Comment(L"Testing graphics 'No Underline'");
Render the SGR "underlined" attribute in the style of the font (#7148) This PR updates the rendering of the _underlined_ graphic rendition attribute, using the style specified in the active font, instead of just reusing the grid line at the bottom of the character cell. * Support for drawing the correct underline effect in the grid line renderer was added in #7107. There was already an `ExtendedAttributes` flag defined for the underlined state, but I needed to update the `SetUnderlined` and `IsUnderlined` methods in the `TextAttribute` class to use that flag now in place of the legacy `LVB_UNDERSCORE` attribute. This enables underlines set via a VT sequence to be tracked separately from `LVB_UNDERSCORE` grid lines set via the console API. I then needed to update the `Renderer::s_GetGridlines` method to activate the `GridLines::Underline` style when the `Underlined` attribute was set. The `GridLines::Bottom` style is still triggered by the `LVB_UNDERSCORE` attribute to produce the bottom grid line effect. Validation ---------- Because this is a change from the existing behaviour, certain unit tests that were expecting the `LVB_UNDERSCORE` to be toggled by `SGR 4` and `SGR 24` have now had to be updated to check the `Underlined` flag instead. There were also some UI Automation tests that were checking for `SGR 4` mapping to `LVB_UNDERSCORE` attribute, which I've now substituted with a test of the `SGR 53` overline attribute mapping to `LVB_GRID_HORIZONTAL`. These tests only work with legacy attributes, so they can't access the extended underline state, and I thought a replacement test that covered similar ground would be better than dropping the tests altogether. As far as the visual rendering is concerned, I've manually confirmed that the VT underline sequences now draw the underline in the correct position and style, while grid lines output via the console API are still displayed in their original form. Closes #2915
2020-08-03 14:49:25 +02:00
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_attribute.SetUnderlined(true);
Add support for the "doubly underlined" graphic rendition attribute (#7223) This PR adds support for the ANSI _doubly underlined_ graphic rendition attribute, which is enabled by the `SGR 21` escape sequence. There was already an `ExtendedAttributes::DoublyUnderlined` flag in the `TextAttribute` class, but I needed to add `SetDoublyUnderlined` and `IsDoublyUnderlined` methods to access that flag, and update the `SetGraphicsRendition` methods of the two dispatchers to set the attribute on receipt of the `SGR 21` sequence. I also had to update the existing `SGR 24` handler to reset _DoublyUnderlined_ in addition to _Underlined_, since they share the same reset sequence. For the rendering, I've added a new grid line type, which essentially just draws an additional line with the same thickness as the regular underline, but slightly below it - I found a gap of around 0.05 "em" between the lines looked best. If there isn't enough space in the cell for that gap, the second line will be clamped to overlap the first, so you then just get a thicker line. If there isn't even enough space below for a thicker line, we move the offset _above_ the first line, but just enough to make it thicker. The only other complication was the update of the `Xterm256Engine` in the VT renderer. As mentioned above, the two underline attributes share the same reset sequence, so to forward that state over conpty we require a slightly more complicated process than with most other attributes (similar to _Bold_ and _Faint_). We first check whether either underline attribute needs to be turned off to send the reset sequence, and then check individually if each of them needs to be turned back on again. ## Validation Steps Performed For testing, I've extended the existing attribute tests in `AdapterTest`, `VTRendererTest`, and `ScreenBufferTests`, to make sure we're covering both the _Underlined_ and _DoublyUnderlined_ attributes. I've also manually tested the `SGR 21` sequence in conhost and Windows Terminal, with a variety of fonts and font sizes, to make sure the rendering was reasonably distinguishable from a single underline. Closes #2916
2020-08-10 19:06:16 +02:00
_testGetSet->_attribute.SetDoublyUnderlined(true);
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
break;
Add support for the "overline" graphic rendition attribute (#6754) ## Summary of the Pull Request This PR adds support for the `SGR 53` and `SGR 55` escapes sequences, which enable and disable the ANSI _overline_ graphic rendition attribute, the equivalent of the console character attribute `COMMON_LVB_GRID_HORIZONTAL`. When a character is output with this attribute set, a horizontal line is rendered at the top of the character cell. ## PR Checklist * [x] Closes #6000 * [x] CLA signed. * [x] Tests added/passed * [ ] Documentation updated. * [ ] Schema updated. * [x] I've discussed this with core contributors already. ## Detailed Description of the Pull Request / Additional comments To start with, I added `SetOverline` and `IsOverlined` methods to the `TextAttribute` class, to set and get the legacy `COMMON_LVB_GRID_HORIZONTAL` attribute. Technically there was already an `IsTopHorizontalDisplayed` method, but I thought it more readable to add a separate `IsOverlined` as an alias for that. Then it was just a matter of adding calls to set and reset the attribute in response to the `SGR 53` and `SGR 55` sequences in the `SetGraphicsRendition` methods of the two dispatchers. The actual rendering was already taken care of by the `PaintBufferGridLines` method in the rendering engines. The only other change required was to update the `_UpdateExtendedAttrs` method in the `Xterm256Engine` of the VT renderer, to ensure the attribute state would be forwarded to the Windows Terminal over conpty. ## Validation Steps Performed I've extended the existing SGR unit tests to cover the new attribute in the `AdapterTest`, the `OutputEngineTest`, and the `VtRendererTest`. I've also manually tested the `SGR 53` and `SGR 55` sequences to confirm that they do actually render (or remove) an overline on the characters being output.
2020-07-06 16:11:17 +02:00
case DispatchTypes::GraphicsOptions::NoOverline:
Log::Comment(L"Testing graphics 'No Overline'");
_testGetSet->_attribute = TextAttribute{ COMMON_LVB_GRID_HORIZONTAL };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
break;
case DispatchTypes::GraphicsOptions::Positive:
Log::Comment(L"Testing graphics 'Positive'");
_testGetSet->_attribute = TextAttribute{ COMMON_LVB_REVERSE_VIDEO };
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
break;
Add support for the "concealed" graphic rendition attribute (#6907) ## Summary of the Pull Request This PR adds support for the `SGR 8` and `SGR 28` escape sequences, which enable and disable the _concealed/invisible_ graphic rendition attribute. When a character is output with this attribute set, it is rendered with the same foreground and background colors, so the text is essentially invisible. ## PR Checklist * [x] Closes #6876 * [x] CLA signed. * [x] Tests added/passed * [ ] Documentation updated. * [ ] Schema updated. * [x] I've discussed this with core contributors already. Issue number where discussion took place: #6876 ## Detailed Description of the Pull Request / Additional comments Most of the framework for this attribute was already implemented, so it was just a matter of updating the `TextAttribute::CalculateRgbColors` method to make the foreground the same as the background when the _Invisible_ flag was set. Note that this has to happen after the _Reverse Video_ attribute is applied, so if you have white-on-black text that is reversed and invisible, it should be all white, rather than all black. ## Validation Steps Performed There were already existing SGR unit tests covering this attribute in the `ScreenBufferTests`, and the `VtRendererTest`. But I've added to the `AdapterTest` which verifies the SGR sequences for setting and resetting the attribute, and I've extended the `TextAttributeTests` to verify that the color calculations return the correct values when the attribute is set. I've also manually confirmed that we now render the _concealed text_ values correctly in the _ISO 6429_ tests in Vttest. And I've manually tested the output of _concealed_ when combined with other attributes, and made sure that we're matching the behaviour of most other terminals.
2020-07-14 16:11:03 +02:00
case DispatchTypes::GraphicsOptions::Visible:
Log::Comment(L"Testing graphics 'Visible'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_attribute.SetInvisible(true);
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
break;
case DispatchTypes::GraphicsOptions::NotCrossedOut:
Log::Comment(L"Testing graphics 'Not Crossed Out'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_attribute.SetCrossedOut(true);
_testGetSet->_expectedAttribute = TextAttribute{ 0 };
break;
case DispatchTypes::GraphicsOptions::ForegroundBlack:
Log::Comment(L"Testing graphics 'Foreground Color Black'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_GREEN | FOREGROUND_BLUE | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_BLACK);
break;
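    // Note: each color case below starts from an attribute that does not include the
    // color under test, so the expected attribute (after the indexed foreground is
    // applied) is guaranteed to differ from the initial state.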
case DispatchTypes::GraphicsOptions::ForegroundBlue:
Log::Comment(L"Testing graphics 'Foreground Color Blue'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_GREEN | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_BLUE);
break;
case DispatchTypes::GraphicsOptions::ForegroundGreen:
Log::Comment(L"Testing graphics 'Foreground Color Green'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_BLUE | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
break;
case DispatchTypes::GraphicsOptions::ForegroundCyan:
Log::Comment(L"Testing graphics 'Foreground Color Cyan'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_CYAN);
break;
case DispatchTypes::GraphicsOptions::ForegroundRed:
Log::Comment(L"Testing graphics 'Foreground Color Red'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_RED);
break;
case DispatchTypes::GraphicsOptions::ForegroundMagenta:
Log::Comment(L"Testing graphics 'Foreground Color Magenta'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_GREEN | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_MAGENTA);
break;
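    // Note: each color case below starts from a contrasting legacy attribute and then
    // expects only the corresponding indexed foreground/background to change to the
    // ANSI color constant (TextColor::DARK_*), leaving the rest of the attribute intact.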
case DispatchTypes::GraphicsOptions::ForegroundYellow:
Log::Comment(L"Testing graphics 'Foreground Color Yellow'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_BLUE | FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_YELLOW);
break;
case DispatchTypes::GraphicsOptions::ForegroundWhite:
Log::Comment(L"Testing graphics 'Foreground Color White'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_WHITE);
break;
case DispatchTypes::GraphicsOptions::ForegroundDefault:
Log::Comment(L"Testing graphics 'Foreground Color Default'");
_testGetSet->_attribute = TextAttribute{ (WORD)~_testGetSet->s_wDefaultAttribute }; // set the current attribute to the opposite of default so we can ensure all relevant bits flip.
// To get expected value, take what we started with and change ONLY the foreground series of bits to what the Default says.
_testGetSet->_expectedAttribute = _testGetSet->_attribute; // expect = starting
_testGetSet->_expectedAttribute.SetDefaultForeground(); // set the foreground as default
break;
case DispatchTypes::GraphicsOptions::BackgroundBlack:
Log::Comment(L"Testing graphics 'Background Color Black'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_GREEN | BACKGROUND_BLUE | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_BLACK);
break;
case DispatchTypes::GraphicsOptions::BackgroundBlue:
Log::Comment(L"Testing graphics 'Background Color Blue'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_GREEN | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_BLUE);
break;
case DispatchTypes::GraphicsOptions::BackgroundGreen:
Log::Comment(L"Testing graphics 'Background Color Green'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_BLUE | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_GREEN);
break;
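        // Each of the following background color cases starts from a contrasting legacy
        // attribute, then expects only the background to change to the ANSI-indexed value.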
case DispatchTypes::GraphicsOptions::BackgroundCyan:
Log::Comment(L"Testing graphics 'Background Color Cyan'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_CYAN);
break;
case DispatchTypes::GraphicsOptions::BackgroundRed:
Log::Comment(L"Testing graphics 'Background Color Red'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_BLUE | BACKGROUND_GREEN | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_RED);
break;
case DispatchTypes::GraphicsOptions::BackgroundMagenta:
Log::Comment(L"Testing graphics 'Background Color Magenta'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_GREEN | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_MAGENTA);
break;
case DispatchTypes::GraphicsOptions::BackgroundYellow:
Log::Comment(L"Testing graphics 'Background Color Yellow'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_BLUE | BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_YELLOW);
break;
case DispatchTypes::GraphicsOptions::BackgroundWhite:
Log::Comment(L"Testing graphics 'Background Color White'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_INTENSITY };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_WHITE);
break;
case DispatchTypes::GraphicsOptions::BackgroundDefault:
Log::Comment(L"Testing graphics 'Background Color Default'");
_testGetSet->_attribute = TextAttribute{ (WORD)~_testGetSet->s_wDefaultAttribute }; // set the current attribute to the opposite of default so we can ensure all relevant bits flip.
// To get the expected value, take what we started with and change ONLY the background bits to the default.
_testGetSet->_expectedAttribute = _testGetSet->_attribute; // expect = starting
_testGetSet->_expectedAttribute.SetDefaultBackground(); // set the background as default
break;
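        // For each of the bright foreground cases below, the starting attribute is the bitwise
        // complement of the target color's RGB bits, so every foreground bit (including the
        // intensity bit) has to flip. The background is left alone, and the expected result is
        // the starting attribute with only the indexed foreground changed.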
case DispatchTypes::GraphicsOptions::BrightForegroundBlack:
Log::Comment(L"Testing graphics 'Bright Foreground Color Black'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_GREEN | FOREGROUND_BLUE };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_BLACK);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundBlue:
Log::Comment(L"Testing graphics 'Bright Foreground Color Blue'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_GREEN };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_BLUE);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundGreen:
Log::Comment(L"Testing graphics 'Bright Foreground Color Green'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED | FOREGROUND_BLUE };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_GREEN);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundCyan:
Log::Comment(L"Testing graphics 'Bright Foreground Color Cyan'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_RED };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_CYAN);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundRed:
Log::Comment(L"Testing graphics 'Bright Foreground Color Red'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_BLUE | FOREGROUND_GREEN };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_RED);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundMagenta:
Log::Comment(L"Testing graphics 'Bright Foreground Color Magenta'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_GREEN };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_MAGENTA);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundYellow:
Log::Comment(L"Testing graphics 'Bright Foreground Color Yellow'");
_testGetSet->_attribute = TextAttribute{ FOREGROUND_BLUE };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_YELLOW);
break;
case DispatchTypes::GraphicsOptions::BrightForegroundWhite:
Log::Comment(L"Testing graphics 'Bright Foreground Color White'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_WHITE);
break;
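        // The bright background options below are verified the same way as the
        // foreground ones: the expected attribute keeps the original flags, with
        // the background replaced by the corresponding BRIGHT_* indexed color.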
case DispatchTypes::GraphicsOptions::BrightBackgroundBlack:
Log::Comment(L"Testing graphics 'Bright Background Color Black'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_GREEN | BACKGROUND_BLUE };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_BLACK);
break;
case DispatchTypes::GraphicsOptions::BrightBackgroundBlue:
Log::Comment(L"Testing graphics 'Bright Background Color Blue'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_GREEN };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_BLUE);
break;
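// Each bright background case below follows the same pattern: start from a
// legacy attribute that differs from the expected result, copy it into the
// expected attribute, then apply the matching bright ANSI index via
// SetIndexedBackground. Only the background index is expected to change.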
case DispatchTypes::GraphicsOptions::BrightBackgroundGreen:
Log::Comment(L"Testing graphics 'Bright Background Color Green'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED | BACKGROUND_BLUE };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_GREEN);
break;
case DispatchTypes::GraphicsOptions::BrightBackgroundCyan:
Log::Comment(L"Testing graphics 'Bright Background Color Cyan'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_RED };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_CYAN);
break;
case DispatchTypes::GraphicsOptions::BrightBackgroundRed:
Log::Comment(L"Testing graphics 'Bright Background Color Red'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_BLUE | BACKGROUND_GREEN };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_RED);
break;
case DispatchTypes::GraphicsOptions::BrightBackgroundMagenta:
Log::Comment(L"Testing graphics 'Bright Background Color Magenta'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_GREEN };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_MAGENTA);
break;
case DispatchTypes::GraphicsOptions::BrightBackgroundYellow:
Log::Comment(L"Testing graphics 'Bright Background Color Yellow'");
_testGetSet->_attribute = TextAttribute{ BACKGROUND_BLUE };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_YELLOW);
break;
case DispatchTypes::GraphicsOptions::BrightBackgroundWhite:
Log::Comment(L"Testing graphics 'Bright Background Color White'");
_testGetSet->_attribute = TextAttribute{ 0 };
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::BRIGHT_WHITE);
break;
default:
VERIFY_FAIL(L"Test not implemented yet!");
break;
}
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
}
TEST_METHOD(GraphicsPushPopTests)
{
Log::Comment(L"Starting test...");
_testGetSet->PrepData(); // default color from here is gray on black, FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED
VTParameter rgOptions[16];
VTParameter rgStackOptions[16];
size_t cOptions = 1;
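// rgOptions carries the SGR parameters for SetGraphicsRendition; rgStackOptions carries the selection parameters for PushGraphicsRendition.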
Log::Comment(L"Test 1: Basic push and pop");
rgOptions[0] = DispatchTypes::GraphicsOptions::Off;
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
cOptions = 0;
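// A push with no parameters should save the complete attribute set.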
VERIFY_IS_TRUE(_pDispatch->PushGraphicsRendition({ rgStackOptions, cOptions }));
VERIFY_IS_TRUE(_pDispatch->PopGraphicsRendition());
Log::Comment(L"Test 2: Push, change color, pop");
VERIFY_IS_TRUE(_pDispatch->PushGraphicsRendition({ rgStackOptions, cOptions }));
cOptions = 1;
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundCyan;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_CYAN);
_testGetSet->_expectedAttribute.SetDefaultBackground();
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
cOptions = 0;
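// The pop should restore the default attributes that were in effect at the time of the push.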
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch->PopGraphicsRendition());
Log::Comment(L"Test 3: two pushes (nested) and pops");
// First push:
VERIFY_IS_TRUE(_pDispatch->PushGraphicsRendition({ rgStackOptions, cOptions }));
cOptions = 1;
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundRed;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_RED);
_testGetSet->_expectedAttribute.SetDefaultBackground();
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
// Second push:
cOptions = 0;
VERIFY_IS_TRUE(_pDispatch->PushGraphicsRendition({ rgStackOptions, cOptions }));
cOptions = 1;
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundGreen;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetDefaultBackground();
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
// First pop:
cOptions = 0;
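// The most recent push saved the red foreground, so that's what this pop should restore.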
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_RED);
_testGetSet->_expectedAttribute.SetDefaultBackground();
VERIFY_IS_TRUE(_pDispatch->PopGraphicsRendition());
// Second pop:
cOptions = 0;
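// The outer push saved the default attributes, so the final pop should restore those.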
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch->PopGraphicsRendition());
Log::Comment(L"Test 4: Save and restore partial attributes");
cOptions = 1;
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundGreen;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetDefaultBackground();
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
cOptions = 1;
rgOptions[0] = DispatchTypes::GraphicsOptions::BoldBright;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetBold(true);
_testGetSet->_expectedAttribute.SetDefaultBackground();
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
rgOptions[0] = DispatchTypes::GraphicsOptions::BackgroundBlue;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_BLUE);
_testGetSet->_expectedAttribute.SetBold(true);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
// Push, specifying that we only want to save the background, the boldness, and double-underline-ness:
cOptions = 3;
rgStackOptions[0] = (size_t)DispatchTypes::SgrSaveRestoreStackOptions::Boldness;
rgStackOptions[1] = (size_t)DispatchTypes::SgrSaveRestoreStackOptions::SaveBackgroundColor;
rgStackOptions[2] = (size_t)DispatchTypes::SgrSaveRestoreStackOptions::DoublyUnderlined;
VERIFY_IS_TRUE(_pDispatch->PushGraphicsRendition({ rgStackOptions, cOptions }));
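// On a later pop, only the saved aspects (boldness, background color, doubly-underlined) should be restored; any other attributes are expected to keep their current values.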
// Now change everything...
cOptions = 2;
rgOptions[0] = DispatchTypes::GraphicsOptions::BackgroundGreen;
rgOptions[1] = DispatchTypes::GraphicsOptions::DoublyUnderlined;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetBold(true);
_testGetSet->_expectedAttribute.SetDoublyUnderlined(true);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
cOptions = 1;
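// A foreground-only change to red should leave the green background, bold, and double underline intact.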
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundRed;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_RED);
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetBold(true);
_testGetSet->_expectedAttribute.SetDoublyUnderlined(true);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
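// SGR 22 (NotBoldOrFaint) should clear bold while leaving the double underline and colors untouched.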
rgOptions[0] = DispatchTypes::GraphicsOptions::NotBoldOrFaint;
_testGetSet->_expectedAttribute = {};
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_RED);
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_GREEN);
_testGetSet->_expectedAttribute.SetDoublyUnderlined(true);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
// And then restore...
cOptions = 0;
_testGetSet->_expectedAttribute = {};
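// Popping should restore the previously saved attributes: red foreground on a blue background, with bold set.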
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_RED);
_testGetSet->_expectedAttribute.SetIndexedBackground(TextColor::DARK_BLUE);
_testGetSet->_expectedAttribute.SetBold(true);
VERIFY_IS_TRUE(_pDispatch->PopGraphicsRendition());
}
TEST_METHOD(GraphicsPersistBrightnessTests)
{
Log::Comment(L"Starting test...");
_testGetSet->PrepData(); // default color from here is gray on black, FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED
VTParameter rgOptions[16];
size_t cOptions = 1;
Log::Comment(L"Test 1: Basic brightness test");
Log::Comment(L"Resetting graphics options");
rgOptions[0] = DispatchTypes::GraphicsOptions::Off;
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Testing graphics 'Foreground Color Blue'");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundBlue;
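// With bold off, SGR 34 (ForegroundBlue) selects the dark blue palette entry.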
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_BLUE);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Enabling brightness");
rgOptions[0] = DispatchTypes::GraphicsOptions::BoldBright;
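// SGR 1 (BoldBright) enables bold; the indexed foreground color itself is unchanged.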
_testGetSet->_expectedAttribute.SetBold(true);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(_testGetSet->_attribute.IsBold());
Log::Comment(L"Testing graphics 'Foreground Color Green, with brightness'");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundGreen;
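// Selecting a new foreground while bold is active should keep the bold flag set.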
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(WI_IsFlagSet(_testGetSet->_attribute.GetLegacyAttributes(), FOREGROUND_GREEN));
VERIFY_IS_TRUE(_testGetSet->_attribute.IsBold());
Log::Comment(L"Test 2: Disable brightness, use a bright color, next normal call remains not bright");
Log::Comment(L"Resetting graphics options");
rgOptions[0] = DispatchTypes::GraphicsOptions::Off;
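// SGR 0 resets everything, clearing both the bold flag and the legacy intensity bit.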
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(WI_IsFlagClear(_testGetSet->_attribute.GetLegacyAttributes(), FOREGROUND_INTENSITY));
VERIFY_IS_FALSE(_testGetSet->_attribute.IsBold());
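
// SGR 94 (bright foreground blue) selects the bright palette entry directly; it should not enable the bold attribute.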
Log::Comment(L"Testing graphics 'Foreground Color Bright Blue'");
rgOptions[0] = DispatchTypes::GraphicsOptions::BrightForegroundBlue;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_BLUE);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_FALSE(_testGetSet->_attribute.IsBold());
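
// Returning to SGR 34 (dark blue) should go back to the dark palette entry; the earlier 9x color leaves no lingering brightness.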
Log::Comment(L"Testing graphics 'Foreground Color Blue', brightness of 9x series doesn't persist");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundBlue;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_BLUE);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_FALSE(_testGetSet->_attribute.IsBold());
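
// Test 3 sequence: reset, set a dark color, enable SGR 1 (bold), then verify the bold state persists through subsequent bright and dark color changes.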
Log::Comment(L"Test 3: Enable brightness, use a bright color, brightness persists to next normal call");
Log::Comment(L"Resetting graphics options");
rgOptions[0] = DispatchTypes::GraphicsOptions::Off;
_testGetSet->_expectedAttribute = {};
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_FALSE(_testGetSet->_attribute.IsBold());
Log::Comment(L"Testing graphics 'Foreground Color Blue'");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundBlue;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_BLUE);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_FALSE(_testGetSet->_attribute.IsBold());
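
// SGR 1 (bold/bright) turns on the bold attribute without changing the color index.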
Log::Comment(L"Enabling brightness");
rgOptions[0] = DispatchTypes::GraphicsOptions::BoldBright;
_testGetSet->_expectedAttribute.SetBold(true);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(_testGetSet->_attribute.IsBold());
Log::Comment(L"Testing graphics 'Foreground Color Bright Blue'");
rgOptions[0] = DispatchTypes::GraphicsOptions::BrightForegroundBlue;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::BRIGHT_BLUE);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(_testGetSet->_attribute.IsBold());
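
// With bold still active, SGR 34 only switches the index back to dark blue; the bold attribute remains set.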
Log::Comment(L"Testing graphics 'Foreground Color Blue, with brightness', brightness of 9x series doesn't affect brightness");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundBlue;
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_BLUE);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(_testGetSet->_attribute.IsBold());
Log::Comment(L"Testing graphics 'Foreground Color Green, with brightness'");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundGreen;
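// The indexed foreground stays at the dark variant; the brightness is carried by the bold attribute, which should remain set.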
_testGetSet->_expectedAttribute.SetIndexedForeground(TextColor::DARK_GREEN);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
VERIFY_IS_TRUE(_testGetSet->_attribute.IsBold());
}
TEST_METHOD(DeviceStatusReportTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Verify failure when using bad status.");
_testGetSet->PrepData();
VERIFY_IS_FALSE(_pDispatch.get()->DeviceStatusReport((DispatchTypes::AnsiStatusType)-1));
}
TEST_METHOD(DeviceStatus_OperatingStatusTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Verify good operating condition.");
_testGetSet->PrepData();
VERIFY_IS_TRUE(_pDispatch.get()->DeviceStatusReport(DispatchTypes::AnsiStatusType::OS_OperatingStatus));
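// A DSR operating status reply of 0 (CSI 0 n) indicates the terminal is in good operating condition.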
_testGetSet->ValidateInputEvent(L"\x1b[0n");
}
TEST_METHOD(DeviceStatus_CursorPositionReportTests)
{
Log::Comment(L"Starting test...");
{
Log::Comment(L"Test 1: Verify normal cursor response position.");
_testGetSet->PrepData(CursorX::XCENTER, CursorY::YCENTER);
// start with the cursor position in the buffer.
COORD coordCursorExpected = _testGetSet->_cursorPos;
// to get to VT, we have to adjust it to its position relative to the viewport top.
coordCursorExpected.Y -= _testGetSet->_viewport.Top;
// Then note that VT is 1,1 based for the top left, so add 1. (The rest of the console uses 0,0 for array index bases.)
coordCursorExpected.X++;
coordCursorExpected.Y++;
VERIFY_IS_TRUE(_pDispatch.get()->DeviceStatusReport(DispatchTypes::AnsiStatusType::CPR_CursorPositionReport));
wchar_t pwszBuffer[50];
swprintf_s(pwszBuffer, ARRAYSIZE(pwszBuffer), L"\x1b[%d;%dR", coordCursorExpected.Y, coordCursorExpected.X);
_testGetSet->ValidateInputEvent(pwszBuffer);
}
{
Log::Comment(L"Test 2: Verify multiple CPRs with a cursor move between them");
_testGetSet->PrepData(CursorX::XCENTER, CursorY::YCENTER);
// enable retention so that the two DSR responses don't delete each other
auto retentionScope{ _testGetSet->EnableInputRetentionInScope() };
// start with the cursor position in the buffer.
til::point coordCursorExpectedFirst{ _testGetSet->_cursorPos };
// to get to VT, we have to adjust it to its position relative to the viewport top.
coordCursorExpectedFirst -= til::point{ 0, _testGetSet->_viewport.Top };
// Then note that VT is 1,1 based for the top left, so add 1. (The rest of the console uses 0,0 for array index bases.)
coordCursorExpectedFirst += til::point{ 1, 1 };
VERIFY_IS_TRUE(_pDispatch.get()->DeviceStatusReport(DispatchTypes::AnsiStatusType::CPR_CursorPositionReport));
_testGetSet->_cursorPos.X++;
_testGetSet->_cursorPos.Y++;
auto coordCursorExpectedSecond{ coordCursorExpectedFirst };
coordCursorExpectedSecond += til::point{ 1, 1 };
VERIFY_IS_TRUE(_pDispatch.get()->DeviceStatusReport(DispatchTypes::AnsiStatusType::CPR_CursorPositionReport));
wchar_t pwszBuffer[50];
swprintf_s(pwszBuffer, ARRAYSIZE(pwszBuffer), L"\x1b[%d;%dR\x1b[%d;%dR", coordCursorExpectedFirst.y<int>(), coordCursorExpectedFirst.x<int>(), coordCursorExpectedSecond.y<int>(), coordCursorExpectedSecond.x<int>());
_testGetSet->ValidateInputEvent(pwszBuffer);
}
}
TEST_METHOD(DeviceAttributesTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Verify normal response.");
_testGetSet->PrepData();
VERIFY_IS_TRUE(_pDispatch.get()->DeviceAttributes());
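// The primary DA response (CSI ? 1 ; 0 c) identifies the terminal as a VT101 with no options.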
PCWSTR pwszExpectedResponse = L"\x1b[?1;0c";
_testGetSet->ValidateInputEvent(pwszExpectedResponse);
Log::Comment(L"Test 2: Verify failure when WriteConsoleInput doesn't work.");
_testGetSet->PrepData();
_testGetSet->_privateWriteConsoleInputWResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->DeviceAttributes());
}
TEST_METHOD(SecondaryDeviceAttributesTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Verify normal response.");
_testGetSet->PrepData();
VERIFY_IS_TRUE(_pDispatch.get()->SecondaryDeviceAttributes());
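// The secondary DA response reports identification code 0 (VT100-level capabilities), firmware revision 10 (i.e. 1.0), and hardware option 1 (PC keyboard).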
PCWSTR pwszExpectedResponse = L"\x1b[>0;10;1c";
_testGetSet->ValidateInputEvent(pwszExpectedResponse);
Log::Comment(L"Test 2: Verify failure when WriteConsoleInput doesn't work.");
_testGetSet->PrepData();
_testGetSet->_privateWriteConsoleInputWResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->SecondaryDeviceAttributes());
}
TEST_METHOD(TertiaryDeviceAttributesTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Verify normal response.");
_testGetSet->PrepData();
VERIFY_IS_TRUE(_pDispatch.get()->TertiaryDeviceAttributes());
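// The tertiary DA response is a DECRPTUI report containing an all-zero unit ID.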
PCWSTR pwszExpectedResponse = L"\x1bP!|00000000\x1b\\";
_testGetSet->ValidateInputEvent(pwszExpectedResponse);
Log::Comment(L"Test 2: Verify failure when WriteConsoleInput doesn't work.");
_testGetSet->PrepData();
_testGetSet->_privateWriteConsoleInputWResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->TertiaryDeviceAttributes());
}
TEST_METHOD(RequestTerminalParametersTests)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Verify response for unsolicited permission.");
_testGetSet->PrepData();
VERIFY_IS_TRUE(_pDispatch.get()->RequestTerminalParameters(DispatchTypes::ReportingPermission::Unsolicited));
_testGetSet->ValidateInputEvent(L"\x1b[2;1;1;128;128;1;0x");
Log::Comment(L"Test 2: Verify response for solicited permission.");
_testGetSet->PrepData();
VERIFY_IS_TRUE(_pDispatch.get()->RequestTerminalParameters(DispatchTypes::ReportingPermission::Solicited));
_testGetSet->ValidateInputEvent(L"\x1b[3;1;1;128;128;1;0x");
Log::Comment(L"Test 3: Verify failure with invalid parameter.");
_testGetSet->PrepData();
VERIFY_IS_FALSE(_pDispatch.get()->RequestTerminalParameters((DispatchTypes::ReportingPermission)2));
Log::Comment(L"Test 4: Verify failure when WriteConsoleInput doesn't work.");
_testGetSet->PrepData();
_testGetSet->_privateWriteConsoleInputWResult = FALSE;
VERIFY_IS_FALSE(_pDispatch.get()->RequestTerminalParameters(DispatchTypes::ReportingPermission::Unsolicited));
}
TEST_METHOD(RequestSettingsTests)
{
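// Helper that simulates a DECRQSS request: the setting ID is fed character by character into
// the string handler returned by RequestSetting, then terminated with ESC. A recognized setting
// is answered with DCS 1 $ r <value> ST; an unrecognized one with DCS 0 $ r ST.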
const auto requestSetting = [=](const std::wstring_view settingId = {}) {
const auto stringHandler = _pDispatch.get()->RequestSetting();
for (auto ch : settingId)
{
stringHandler(ch);
}
stringHandler(L'\033'); // String terminator
};
Log::Comment(L"Requesting DECSTBM margins (5 to 10).");
_testGetSet->PrepData();
_pDispatch.get()->SetTopBottomScrollingMargins(5, 10);
requestSetting(L"r");
_testGetSet->ValidateInputEvent(L"\033P1$r5;10r\033\\");
Log::Comment(L"Requesting DECSTBM margins (full screen).");
_testGetSet->PrepData();
// Set screen height to 25 - this will be the expected margin range.
_testGetSet->_viewport.Bottom = _testGetSet->_viewport.Top + 25;
_pDispatch.get()->SetTopBottomScrollingMargins(0, 0);
requestSetting(L"r");
_testGetSet->ValidateInputEvent(L"\033P1$r1;25r\033\\");
Log::Comment(L"Requesting SGR attributes (default).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0m\033\\");
Log::Comment(L"Requesting SGR attributes (bold, underlined, reversed).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetBold(true);
_testGetSet->_attribute.SetUnderlined(true);
_testGetSet->_attribute.SetReverseVideo(true);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;1;4;7m\033\\");
Log::Comment(L"Requesting SGR attributes (faint, blinking, invisible).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetFaint(true);
_testGetSet->_attribute.SetBlinking(true);
_testGetSet->_attribute.SetInvisible(true);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;2;5;8m\033\\");
Log::Comment(L"Requesting SGR attributes (italic, crossed-out).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetItalic(true);
_testGetSet->_attribute.SetCrossedOut(true);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;3;9m\033\\");
Log::Comment(L"Requesting SGR attributes (doubly underlined, overlined).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetDoublyUnderlined(true);
_testGetSet->_attribute.SetOverlined(true);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;21;53m\033\\");
Log::Comment(L"Requesting SGR attributes (standard colors).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetIndexedForeground(TextColor::DARK_YELLOW);
_testGetSet->_attribute.SetIndexedBackground(TextColor::DARK_CYAN);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;33;46m\033\\");
Log::Comment(L"Requesting SGR attributes (AIX colors).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetIndexedForeground(TextColor::BRIGHT_CYAN);
_testGetSet->_attribute.SetIndexedBackground(TextColor::BRIGHT_YELLOW);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;96;103m\033\\");
Log::Comment(L"Requesting SGR attributes (ITU indexed colors).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetIndexedForeground256(123);
_testGetSet->_attribute.SetIndexedBackground256(45);
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;38;5;123;48;5;45m\033\\");
Log::Comment(L"Requesting SGR attributes (ITU RGB colors).");
_testGetSet->PrepData();
_testGetSet->_attribute = {};
_testGetSet->_attribute.SetForeground(RGB(12, 34, 56));
_testGetSet->_attribute.SetBackground(RGB(65, 43, 21));
requestSetting(L"m");
_testGetSet->ValidateInputEvent(L"\033P1$r0;38;2;12;34;56;48;2;65;43;21m\033\\");
Log::Comment(L"Requesting an unsupported setting.");
_testGetSet->PrepData();
requestSetting(L"x");
_testGetSet->ValidateInputEvent(L"\033P0$r\033\\");
}
TEST_METHOD(CursorKeysModeTest)
{
Log::Comment(L"Starting test...");
// success cases
// set numeric mode = true
Log::Comment(L"Test 1: application mode = false");
_testGetSet->_setInputModeResult = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::CursorKey;
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->SetCursorKeysMode(false));
// set application cursor keys mode
Log::Comment(L"Test 2: application mode = true");
_testGetSet->_setInputModeResult = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::CursorKey;
_testGetSet->_expectedInputModeEnabled = true;
VERIFY_IS_TRUE(_pDispatch.get()->SetCursorKeysMode(true));
}
TEST_METHOD(KeypadModeTest)
{
Log::Comment(L"Starting test...");
// success cases
// set numeric mode = true
Log::Comment(L"Test 1: application mode = false");
_testGetSet->_setInputModeResult = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::Keypad;
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->SetKeypadMode(false));
// set numeric mode = false
Log::Comment(L"Test 2: application mode = true");
_testGetSet->_setInputModeResult = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::Keypad;
_testGetSet->_expectedInputModeEnabled = true;
VERIFY_IS_TRUE(_pDispatch.get()->SetKeypadMode(true));
}
TEST_METHOD(AnsiModeTest)
{
Log::Comment(L"Starting test...");
// success cases
// set ansi mode = true
Log::Comment(L"Test 1: ansi mode = true");
_testGetSet->_setParserModeResult = true;
_testGetSet->_expectedParserMode = StateMachine::Mode::Ansi;
_testGetSet->_expectedParserModeEnabled = true;
VERIFY_IS_TRUE(_pDispatch.get()->SetAnsiMode(true));
// set ansi mode = false
Log::Comment(L"Test 2: ansi mode = false.");
_testGetSet->_setParserModeResult = true;
_testGetSet->_expectedParserMode = StateMachine::Mode::Ansi;
_testGetSet->_expectedParserModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->SetAnsiMode(false));
}
TEST_METHOD(AllowBlinkingTest)
{
Log::Comment(L"Starting test...");
// success cases
// enable cursor blinking
Log::Comment(L"Test 1: enable blinking = true");
_testGetSet->_privateAllowCursorBlinkingResult = TRUE;
_testGetSet->_enable = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableCursorBlinking(true));
// disable cursor blinking
Log::Comment(L"Test 2: enable blinking = false");
_testGetSet->_privateAllowCursorBlinkingResult = TRUE;
_testGetSet->_enable = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableCursorBlinking(false));
}
TEST_METHOD(ScrollMarginsTest)
{
Log::Comment(L"Starting test...");
SMALL_RECT srTestMargins = { 0 };
_testGetSet->_bufferSize = { 100, 600 };
_testGetSet->_viewport.Right = 8;
_testGetSet->_viewport.Bottom = 8;
_testGetSet->_getConsoleScreenBufferInfoExResult = TRUE;
SHORT sScreenHeight = _testGetSet->_viewport.Bottom - _testGetSet->_viewport.Top;
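// DECSTBM margins are 1-based; omitted values or a region spanning the full screen height should clear the margins.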
Log::Comment(L"Test 1: Verify having both values is valid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 2, 6);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
_testGetSet->_setConsoleCursorPositionResult = true;
_testGetSet->_moveToBottomResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 2: Verify having only top is valid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 7, 0);
_testGetSet->_expectedScrollRegion.Bottom = _testGetSet->_viewport.Bottom - 1; // We expect the bottom to be the bottom of the viewport, exclusive.
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 3: Verify having only bottom is valid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 0, 7);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 4: Verify having no values is valid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 0, 0);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 5: Verify having both values, but bad bounds is invalid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 7, 3);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_FALSE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 6: Verify setting margins to (0, height) clears them");
// First set,
_testGetSet->_privateSetScrollingRegionResult = TRUE;
_testGetSet->_SetMarginsHelper(&srTestMargins, 2, 6);
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
// Then clear
_testGetSet->_SetMarginsHelper(&srTestMargins, 0, sScreenHeight);
_testGetSet->_expectedScrollRegion.Top = 0;
_testGetSet->_expectedScrollRegion.Bottom = 0;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 7: Verify setting margins to (1, height) clears them");
// First set,
_testGetSet->_privateSetScrollingRegionResult = TRUE;
_testGetSet->_SetMarginsHelper(&srTestMargins, 2, 6);
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
// Then clear
_testGetSet->_SetMarginsHelper(&srTestMargins, 1, sScreenHeight);
_testGetSet->_expectedScrollRegion.Top = 0;
_testGetSet->_expectedScrollRegion.Bottom = 0;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 8: Verify setting margins to (1, 0) clears them");
// First set,
_testGetSet->_privateSetScrollingRegionResult = TRUE;
_testGetSet->_SetMarginsHelper(&srTestMargins, 2, 6);
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
// Then clear
_testGetSet->_SetMarginsHelper(&srTestMargins, 1, 0);
_testGetSet->_expectedScrollRegion.Top = 0;
_testGetSet->_expectedScrollRegion.Bottom = 0;
VERIFY_IS_TRUE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 9: Verify having top and bottom margin the same is invalid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 4, 4);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_FALSE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 10: Verify having top margin out of bounds is invalid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, sScreenHeight + 1, sScreenHeight + 10);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_FALSE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
Log::Comment(L"Test 11: Verify having bottom margin out of bounds is invalid.");
_testGetSet->_SetMarginsHelper(&srTestMargins, 1, sScreenHeight + 1);
_testGetSet->_privateSetScrollingRegionResult = TRUE;
VERIFY_IS_FALSE(_pDispatch.get()->SetTopBottomScrollingMargins(srTestMargins.Top, srTestMargins.Bottom));
}
TEST_METHOD(LineFeedTest)
{
Log::Comment(L"Starting test...");
// All test cases need the LineFeed call to succeed.
_testGetSet->_privateLineFeedResult = TRUE;
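// LineFeedType::DependsOnMode defers to the automatic-return setting reported by PrivateGetLineFeedMode (tests 3 and 4 below).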
Log::Comment(L"Test 1: Line feed without carriage return.");
_testGetSet->_expectedLineFeedWithReturn = false;
VERIFY_IS_TRUE(_pDispatch.get()->LineFeed(DispatchTypes::LineFeedType::WithoutReturn));
Log::Comment(L"Test 2: Line feed with carriage return.");
_testGetSet->_expectedLineFeedWithReturn = true;
VERIFY_IS_TRUE(_pDispatch.get()->LineFeed(DispatchTypes::LineFeedType::WithReturn));
Log::Comment(L"Test 3: Line feed depends on mode, and mode reset.");
_testGetSet->_privateGetLineFeedModeResult = false;
_testGetSet->_expectedLineFeedWithReturn = false;
VERIFY_IS_TRUE(_pDispatch.get()->LineFeed(DispatchTypes::LineFeedType::DependsOnMode));
Log::Comment(L"Test 4: Line feed depends on mode, and mode set.");
_testGetSet->_privateGetLineFeedModeResult = true;
_testGetSet->_expectedLineFeedWithReturn = true;
VERIFY_IS_TRUE(_pDispatch.get()->LineFeed(DispatchTypes::LineFeedType::DependsOnMode));
}
TEST_METHOD(SetConsoleTitleTest)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: set title to be non-null");
_testGetSet->_setConsoleTitleWResult = TRUE;
_testGetSet->_expectedWindowTitle = L"Foo bar";
VERIFY_IS_TRUE(_pDispatch.get()->SetWindowTitle(_testGetSet->_expectedWindowTitle));
Log::Comment(L"Test 2: set title to be null");
_testGetSet->_setConsoleTitleWResult = FALSE;
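// The mock result is FALSE here, but an empty title is still expected to succeed, presumably because the title API never needs to be called.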
_testGetSet->_expectedWindowTitle = {};
VERIFY_IS_TRUE(_pDispatch.get()->SetWindowTitle({}));
}
TEST_METHOD(TestMouseModes)
{
Log::Comment(L"Starting test...");
Log::Comment(L"Test 1: Test Default Mouse Mode");
_testGetSet->_expectedInputModeEnabled = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::DefaultMouseTracking;
_testGetSet->_setInputModeResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableVT200MouseMode(true));
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableVT200MouseMode(false));
Log::Comment(L"Test 2: Test UTF-8 Extended Mouse Mode");
_testGetSet->_expectedInputModeEnabled = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::Utf8MouseEncoding;
_testGetSet->_setInputModeResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableUTF8ExtendedMouseMode(true));
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableUTF8ExtendedMouseMode(false));
Log::Comment(L"Test 3: Test SGR Extended Mouse Mode");
_testGetSet->_expectedInputModeEnabled = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::SgrMouseEncoding;
_testGetSet->_setInputModeResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableSGRExtendedMouseMode(true));
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableSGRExtendedMouseMode(false));
Log::Comment(L"Test 4: Test Button-Event Mouse Mode");
_testGetSet->_expectedInputModeEnabled = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::ButtonEventMouseTracking;
_testGetSet->_setInputModeResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableButtonEventMouseMode(true));
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableButtonEventMouseMode(false));
Log::Comment(L"Test 5: Test Any-Event Mouse Mode");
_testGetSet->_expectedInputModeEnabled = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::AnyEventMouseTracking;
_testGetSet->_setInputModeResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableAnyEventMouseMode(true));
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableAnyEventMouseMode(false));
Log::Comment(L"Test 6: Test Alt Scroll Mouse Mode");
_testGetSet->_expectedInputModeEnabled = true;
_testGetSet->_expectedInputMode = TerminalInput::Mode::AlternateScroll;
_testGetSet->_setInputModeResult = true;
VERIFY_IS_TRUE(_pDispatch.get()->EnableAlternateScroll(true));
_testGetSet->_expectedInputModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->EnableAlternateScroll(false));
}
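// Verifies that SGR 38;5;<n> and 48;5;<n> sequences (the extended foreground/background
// introducers followed by the 256-color index selector) are applied as indexed
// foreground and background attributes.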
TEST_METHOD(Xterm256ColorTest)
{
Log::Comment(L"Starting test...");
_testGetSet->PrepData(); // default color from here is gray on black, FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED
VTParameter rgOptions[16];
size_t cOptions = 3;
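// Each case below models an extended color sequence of the form SGR 38;5;<index>
// (foreground) or SGR 48;5;<index> (background): option 0 selects the target,
// option 1 is the 256-color introducer, and option 2 carries the color index.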
_testGetSet->_getColorTableEntryResult = true;
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
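// The expected attribute starts from the current attribute and is updated
// incrementally, so each case only changes the component under test.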
Log::Comment(L"Test 1: Change Foreground");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
rgOptions[2] = (DispatchTypes::GraphicsOptions)2; // Green
_testGetSet->_expectedAttribute.SetIndexedForeground256(TextColor::DARK_GREEN);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Test 2: Change Background");
rgOptions[0] = DispatchTypes::GraphicsOptions::BackgroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
rgOptions[2] = (DispatchTypes::GraphicsOptions)9; // Bright Red
_testGetSet->_expectedAttribute.SetIndexedBackground256(TextColor::BRIGHT_RED);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Test 3: Change Foreground to RGB color");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
rgOptions[2] = (DispatchTypes::GraphicsOptions)42; // Arbitrary Color
_testGetSet->_expectedAttribute.SetIndexedForeground256(42);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Test 4: Change Background to RGB color");
rgOptions[0] = DispatchTypes::GraphicsOptions::BackgroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
rgOptions[2] = (DispatchTypes::GraphicsOptions)142; // Arbitrary Color
_testGetSet->_expectedAttribute.SetIndexedBackground256(142);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
Log::Comment(L"Test 5: Change Foreground to Legacy Attr while BG is RGB color");
// Unfortunately this test isn't all that good, because the adapterTest adapter isn't smart enough
// to have its own color table and translate the pre-existing RGB BG into a legacy BG.
// Fortunately, the ft_api:RgbColorTests IS smart enough to test that.
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
rgOptions[2] = (DispatchTypes::GraphicsOptions)9; // Bright Red
_testGetSet->_expectedAttribute.SetIndexedForeground256(TextColor::BRIGHT_RED);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, cOptions }));
}
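// Verifies that an extended color sequence with a missing or truncated index
// parameter is treated as a default value rather than being rejected.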
TEST_METHOD(XtermExtendedColorDefaultParameterTest)
{
Log::Comment(L"Starting test...");
_testGetSet->PrepData(); // default color from here is gray on black, FOREGROUND_BLUE | FOREGROUND_GREEN | FOREGROUND_RED
VTParameter rgOptions[16];
_testGetSet->_getColorTableEntryResult = true;
_testGetSet->_expectedAttribute = _testGetSet->_attribute;
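// The option arrays below model SGR 38/48 extended-color sequences with omitted
// or defaulted sub-parameters (ForegroundExtended = 38, BackgroundExtended = 48,
// BlinkOrXterm256Index = 5, RGBColorOrFaint = 2). As a rough illustration, the
// wire form for Test 1 would be `ESC[38;5m`: the index is missing entirely, so
// the dispatcher is expected to fall back to index 0 (TextColor::DARK_BLACK).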
Log::Comment(L"Test 1: Change Indexed Foreground with missing index parameter");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
_testGetSet->_expectedAttribute.SetIndexedForeground256(TextColor::DARK_BLACK);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, 2 }));
Log::Comment(L"Test 2: Change Indexed Background with default index parameter");
rgOptions[0] = DispatchTypes::GraphicsOptions::BackgroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::BlinkOrXterm256Index;
rgOptions[2] = {};
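// Here the index is present but empty ({} is an omitted VTParameter), roughly
// the wire form `ESC[48;5;m`, and should likewise default to index 0.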
_testGetSet->_expectedAttribute.SetIndexedBackground256(TextColor::DARK_BLACK);
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, 3 }));
Log::Comment(L"Test 3: Change RGB Foreground with all RGB parameters missing");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::RGBColorOrFaint;
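// Only the RGB introducer is supplied (roughly `ESC[38;2m`), so all three
// color components should default to 0.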
_testGetSet->_expectedAttribute.SetForeground(RGB(0, 0, 0));
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, 2 }));
Log::Comment(L"Test 4: Change RGB Background with some missing RGB parameters");
rgOptions[0] = DispatchTypes::GraphicsOptions::BackgroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::RGBColorOrFaint;
rgOptions[2] = 123;
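// Roughly `ESC[48;2;123m`: only the red component is given, so green and blue
// should default to 0.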
_testGetSet->_expectedAttribute.SetBackground(RGB(123, 0, 0));
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, 3 }));
Log::Comment(L"Test 5: Change RGB Foreground with some default RGB parameters");
rgOptions[0] = DispatchTypes::GraphicsOptions::ForegroundExtended;
rgOptions[1] = DispatchTypes::GraphicsOptions::RGBColorOrFaint;
rgOptions[2] = {};
rgOptions[3] = {};
rgOptions[4] = 123;
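// Roughly `ESC[38;2;;;123m`: red and green are explicitly defaulted, and only
// the blue component is given.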
_testGetSet->_expectedAttribute.SetForeground(RGB(0, 0, 123));
VERIFY_IS_TRUE(_pDispatch.get()->SetGraphicsRendition({ rgOptions, 5 }));
}
TEST_METHOD(SetColorTableValue)
{
_testGetSet->PrepData();
_testGetSet->_setColorTableEntryResult = true;
const auto testColor = RGB(1, 2, 3);
_testGetSet->_expectedColorValue = testColor;
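// SetColorTableEntry is the dispatch target for OSC 4 palette updates (an
// `OSC 4;i;rgb:01/02/03 ST` sequence would be a rough equivalent of this test),
// so the loop below checks that every ANSI-order index from 0 to 255 is
// forwarded unchanged.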
for (size_t i = 0; i < 256; i++)
{
_testGetSet->_expectedColorTableIndex = i;
VERIFY_IS_TRUE(_pDispatch.get()->SetColorTableEntry(i, testColor));
}
// Test in pty mode - the dispatch should fail (so the sequence can be passed
// through to the connected terminal), but SetColorTableEntry should still be called
_testGetSet->_isPty = true;
_testGetSet->_expectedColorTableIndex = 15; // Windows BRIGHT_WHITE
VERIFY_IS_FALSE(_pDispatch.get()->SetColorTableEntry(15, testColor));
}
TEST_METHOD(SoftFontSizeDetection)
{
using CellMatrix = DispatchTypes::DrcsCellMatrix;
using FontSet = DispatchTypes::DrcsFontSet;
using FontUsage = DispatchTypes::DrcsFontUsage;
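// Helper that issues a DECDLD (DownloadDRCS) with a fixed font number, starting
// character, erase control, and 94-character set size, while varying the cell
// matrix width (cmw), matrix height (cmh), font set size (ss), and usage (u).
// If a string handler is returned, it is fed a charset identifier ('B'), any
// sixel bitmap data, and a string terminator, mirroring a real DECDLD payload.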
const auto decdld = [=](const auto cmw, const auto cmh, const auto ss, const auto u, const std::wstring_view data = {}) {
const auto ec = DispatchTypes::DrcsEraseControl::AllChars;
const auto css = DispatchTypes::DrcsCharsetSize::Size94;
const auto cellMatrix = static_cast<DispatchTypes::DrcsCellMatrix>(cmw);
const auto stringHandler = _pDispatch.get()->DownloadDRCS(0, 0, ec, cellMatrix, ss, u, cmh, css);
if (stringHandler)
{
stringHandler(L'B'); // Charset identifier
for (auto ch : data)
{
stringHandler(ch);
}
stringHandler(L'\033'); // String terminator
}
return stringHandler != nullptr;
};
// Matrix sizes at 80x24 should always use a 10x10 cell size (VT2xx).
Log::Comment(L"Matrix 5x10 for 80x24 font set with text usage");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size5x10, 0, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Matrix 6x10 for 80x24 font set with text usage");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size6x10, 0, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Matrix 7x10 for 80x24 font set with text usage");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size7x10, 0, FontSet::Size80x24, FontUsage::Text));
// At 132x24 the cell size is typically 6x10 (VT240), but could be 10x10 (VT220)
Log::Comment(L"Matrix 5x10 for 132x24 font set with text usage");
_testGetSet->_expectedCellSize = { 6, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size5x10, 0, FontSet::Size132x24, FontUsage::Text));
Log::Comment(L"Matrix 6x10 for 132x24 font set with text usage");
_testGetSet->_expectedCellSize = { 6, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size6x10, 0, FontSet::Size132x24, FontUsage::Text));
Log::Comment(L"Matrix 7x10 for 132x24 font set with text usage (VT220 only)");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size7x10, 0, FontSet::Size132x24, FontUsage::Text));
// Full cell usage is invalid for all matrix sizes except 6x10 at 132x24.
Log::Comment(L"Matrix 5x10 for 80x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Size5x10, 0, FontSet::Size80x24, FontUsage::FullCell));
Log::Comment(L"Matrix 6x10 for 80x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Size6x10, 0, FontSet::Size80x24, FontUsage::FullCell));
Log::Comment(L"Matrix 7x10 for 80x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Size7x10, 0, FontSet::Size80x24, FontUsage::FullCell));
Log::Comment(L"Matrix 5x10 for 132x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Size5x10, 0, FontSet::Size132x24, FontUsage::FullCell));
Log::Comment(L"Matrix 6x10 for 132x24 font set with full cell usage");
_testGetSet->_expectedCellSize = { 6, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size6x10, 0, FontSet::Size132x24, FontUsage::FullCell));
Log::Comment(L"Matrix 7x10 for 132x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Size7x10, 0, FontSet::Size132x24, FontUsage::FullCell));
// Matrix size 1 is always invalid.
Log::Comment(L"Matrix 1 for 80x24 font set with text usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Invalid, 0, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Matrix 1 for 132x24 font set with text usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Invalid, 0, FontSet::Size132x24, FontUsage::Text));
Log::Comment(L"Matrix 1 for 80x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Invalid, 0, FontSet::Size80x24, FontUsage::FullCell));
Log::Comment(L"Matrix 1 for 132x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(CellMatrix::Invalid, 0, FontSet::Size132x24, FontUsage::FullCell));
// The height parameter has no effect when a matrix size is used.
Log::Comment(L"Matrix 7x10 with unused height parameter");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Size7x10, 20, FontSet::Size80x24, FontUsage::Text));
// Full cell fonts with explicit dimensions are accepted as their given cell size.
Log::Comment(L"Explicit 13x17 for 80x24 font set with full cell usage");
_testGetSet->_expectedCellSize = { 13, 17 };
VERIFY_IS_TRUE(decdld(13, 17, FontSet::Size80x24, FontUsage::FullCell));
Log::Comment(L"Explicit 9x25 for 132x24 font set with full cell usage");
_testGetSet->_expectedCellSize = { 9, 25 };
VERIFY_IS_TRUE(decdld(9, 25, FontSet::Size132x24, FontUsage::FullCell));
// Cell sizes outside the maximum supported range (16x32) are invalid.
Log::Comment(L"Explicit 18x38 for 80x24 font set with full cell usage (invalid)");
VERIFY_IS_FALSE(decdld(18, 38, FontSet::Size80x24, FontUsage::FullCell));
// Text fonts with explicit dimensions are interpreted as their closest matching device.
Log::Comment(L"Explicit 12x12 for 80x24 font set with text usage (VT320)");
_testGetSet->_expectedCellSize = { 15, 12 };
VERIFY_IS_TRUE(decdld(12, 12, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Explicit 9x20 for 80x24 font set with text usage (VT340)");
_testGetSet->_expectedCellSize = { 10, 20 };
VERIFY_IS_TRUE(decdld(9, 20, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Explicit 10x30 for 80x24 font set with text usage (VT382)");
_testGetSet->_expectedCellSize = { 12, 30 };
VERIFY_IS_TRUE(decdld(10, 30, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Explicit 8x16 for 80x24 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 16 };
VERIFY_IS_TRUE(decdld(8, 16, FontSet::Size80x24, FontUsage::Text));
Log::Comment(L"Explicit 7x12 for 132x24 font set with text usage (VT320)");
_testGetSet->_expectedCellSize = { 9, 12 };
VERIFY_IS_TRUE(decdld(7, 12, FontSet::Size132x24, FontUsage::Text));
Log::Comment(L"Explicit 5x20 for 132x24 font set with text usage (VT340)");
_testGetSet->_expectedCellSize = { 6, 20 };
VERIFY_IS_TRUE(decdld(5, 20, FontSet::Size132x24, FontUsage::Text));
Log::Comment(L"Explicit 6x30 for 132x24 font set with text usage (VT382)");
_testGetSet->_expectedCellSize = { 7, 30 };
VERIFY_IS_TRUE(decdld(6, 30, FontSet::Size132x24, FontUsage::Text));
Log::Comment(L"Explicit 5x16 for 132x24 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 16 };
VERIFY_IS_TRUE(decdld(5, 16, FontSet::Size132x24, FontUsage::Text));
// Font sets with more than 24 lines must be VT420/VT5xx.
Log::Comment(L"80x36 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x36, FontUsage::Text));
Log::Comment(L"80x48 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 8 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x48, FontUsage::Text));
Log::Comment(L"132x36 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x36, FontUsage::Text));
Log::Comment(L"132x48 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 8 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x48, FontUsage::Text));
Log::Comment(L"80x36 font set with full cell usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x36, FontUsage::FullCell));
Log::Comment(L"80x48 font set with full cell usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 8 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x48, FontUsage::FullCell));
Log::Comment(L"132x36 font set with full cell usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 10 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x36, FontUsage::FullCell));
Log::Comment(L"132x48 font set with full cell usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 8 };
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x48, FontUsage::FullCell));
// Without an explicit size, the cell size is estimated from the number of sixels
// used in the character bitmaps. But note that sixel heights are always a multiple
// of 6, so will often be larger than the cell size for which they were intended.
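// As a worked example, bitmapOf8x12 below ("????????/????????") has 8 sixel
// columns per row and 2 rows; each sixel character encodes 6 vertical pixels
// ('?' is value 0, i.e. all blank, and '/' starts a new sixel row), so it
// describes an 8x12 bitmap even though the target cell is only 10x10.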
Log::Comment(L"8x12 bitmap for 80x24 font set with text usage (VT2xx)");
_testGetSet->_expectedCellSize = { 10, 10 };
const auto bitmapOf8x12 = L"????????/????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::Text, bitmapOf8x12));
Log::Comment(L"12x12 bitmap for 80x24 font set with text usage (VT320)");
_testGetSet->_expectedCellSize = { 15, 12 };
const auto bitmapOf12x12 = L"????????????/????????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::Text, bitmapOf12x12));
Log::Comment(L"9x24 bitmap for 80x24 font set with text usage (VT340)");
_testGetSet->_expectedCellSize = { 10, 20 };
const auto bitmapOf9x24 = L"?????????/?????????/?????????/?????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::Text, bitmapOf9x24));
Log::Comment(L"10x30 bitmap for 80x24 font set with text usage (VT382)");
_testGetSet->_expectedCellSize = { 12, 30 };
const auto bitmapOf10x30 = L"??????????/??????????/??????????/??????????/??????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::Text, bitmapOf10x30));
Log::Comment(L"8x18 bitmap for 80x24 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 16 };
const auto bitmapOf8x18 = L"????????/????????/????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::Text, bitmapOf8x18));
Log::Comment(L"5x12 bitmap for 132x24 font set with text usage (VT240)");
_testGetSet->_expectedCellSize = { 6, 10 };
const auto bitmapOf5x12 = L"?????/?????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::Text, bitmapOf5x12));
Log::Comment(L"7x12 bitmap for 132x24 font set with text usage (VT320)");
_testGetSet->_expectedCellSize = { 9, 12 };
const auto bitmapOf7x12 = L"???????/???????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::Text, bitmapOf7x12));
Log::Comment(L"5x24 bitmap for 132x24 font set with text usage (VT340)");
_testGetSet->_expectedCellSize = { 6, 20 };
const auto bitmapOf5x24 = L"?????/?????/?????/?????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::Text, bitmapOf5x24));
Log::Comment(L"6x30 bitmap for 132x24 font set with text usage (VT382)");
_testGetSet->_expectedCellSize = { 7, 30 };
const auto bitmapOf6x30 = L"??????/??????/??????/??????/??????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::Text, bitmapOf6x30));
Log::Comment(L"5x18 bitmap for 132x24 font set with text usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 16 };
const auto bitmapOf5x18 = L"?????/?????/?????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::Text, bitmapOf5x18));
Log::Comment(L"15x12 bitmap for 80x24 font set with full cell usage (VT320)");
_testGetSet->_expectedCellSize = { 15, 12 };
const auto bitmapOf15x12 = L"???????????????/???????????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::FullCell, bitmapOf15x12));
Log::Comment(L"10x24 bitmap for 80x24 font set with full cell usage (VT340)");
_testGetSet->_expectedCellSize = { 10, 20 };
const auto bitmapOf10x24 = L"??????????/??????????/??????????/??????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::FullCell, bitmapOf10x24));
Log::Comment(L"12x30 bitmap for 80x24 font set with full cell usage (VT382)");
_testGetSet->_expectedCellSize = { 12, 30 };
const auto bitmapOf12x30 = L"????????????/????????????/????????????/????????????/????????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::FullCell, bitmapOf12x30));
Log::Comment(L"10x18 bitmap for 80x24 font set with full cell usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 10, 16 };
const auto bitmapOf10x18 = L"??????????/??????????/??????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size80x24, FontUsage::FullCell, bitmapOf10x18));
Log::Comment(L"6x12 bitmap for 132x24 font set with full cell usage (VT240)");
_testGetSet->_expectedCellSize = { 6, 10 };
const auto bitmapOf6x12 = L"??????/??????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::FullCell, bitmapOf6x12));
Log::Comment(L"9x12 bitmap for 132x24 font set with full cell usage (VT320)");
_testGetSet->_expectedCellSize = { 9, 12 };
const auto bitmapOf9x12 = L"?????????/?????????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::FullCell, bitmapOf9x12));
Log::Comment(L"6x24 bitmap for 132x24 font set with full cell usage (VT340)");
_testGetSet->_expectedCellSize = { 6, 20 };
const auto bitmapOf6x24 = L"??????/??????/??????/??????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::FullCell, bitmapOf6x24));
Log::Comment(L"7x30 bitmap for 132x24 font set with full cell usage (VT382)");
_testGetSet->_expectedCellSize = { 7, 30 };
const auto bitmapOf7x30 = L"???????/???????/???????/???????/???????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::FullCell, bitmapOf7x30));
Log::Comment(L"6x18 bitmap for 132x24 font set with full cell usage (VT420/VT5xx)");
_testGetSet->_expectedCellSize = { 6, 16 };
const auto bitmapOf6x18 = L"??????/??????/??????";
VERIFY_IS_TRUE(decdld(CellMatrix::Default, 0, FontSet::Size132x24, FontUsage::FullCell, bitmapOf6x18));
}
TEST_METHOD(TogglingC1ParserMode)
{
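// These cases drive the parser's AcceptC1 mode both directly (steps 1 and 2,
// roughly the DECAC1 control) and indirectly via DOCS coding-system
// designations (steps 3 and 4, roughly `ESC % @` for ISO-2022 and `ESC % G`
// for UTF-8), which also switch the output code page.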
Log::Comment(L"1. Accept C1 controls");
_testGetSet->_setParserModeResult = true;
_testGetSet->_expectedParserMode = StateMachine::Mode::AcceptC1;
_testGetSet->_expectedParserModeEnabled = true;
VERIFY_IS_TRUE(_pDispatch.get()->AcceptC1Controls(true));
Log::Comment(L"2. Don't accept C1 controls");
_testGetSet->_setParserModeResult = true;
_testGetSet->_expectedParserMode = StateMachine::Mode::AcceptC1;
_testGetSet->_expectedParserModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->AcceptC1Controls(false));
Log::Comment(L"3. Designate ISO-2022 coding system");
// Code page should be set to ISO-8859-1 and C1 parsing enabled
_testGetSet->_setConsoleOutputCPResult = true;
_testGetSet->_expectedOutputCP = 28591;
_testGetSet->_setParserModeResult = true;
_testGetSet->_expectedParserMode = StateMachine::Mode::AcceptC1;
_testGetSet->_expectedParserModeEnabled = true;
VERIFY_IS_TRUE(_pDispatch.get()->DesignateCodingSystem(DispatchTypes::CodingSystem::ISO2022));
Log::Comment(L"4. Designate UTF-8 coding system");
// Code page should be set to UTF-8 and C1 parsing disabled
_testGetSet->_setConsoleOutputCPResult = true;
_testGetSet->_expectedOutputCP = CP_UTF8;
_testGetSet->_setParserModeResult = true;
_testGetSet->_expectedParserMode = StateMachine::Mode::AcceptC1;
_testGetSet->_expectedParserModeEnabled = false;
VERIFY_IS_TRUE(_pDispatch.get()->DesignateCodingSystem(DispatchTypes::CodingSystem::UTF8));
}
private:
TestGetSet* _testGetSet; // non-owning pointer
std::unique_ptr<AdaptDispatch> _pDispatch;
};