Compile-check LESS/CSS classnames using TypeScript and Webpack

As I make myself familiar with modern frontend development based on React, TypeScript, Webpack and friends, I learned something really cool. I'd like to write this down not only for you – dear reader – but also for my own reference.

The problem

Let's say you have a trivial React component like this, where you specify a className to tell it which CSS class to use:

const MyComponent = (props: MyComponentProps) => (
  <MySubComponent className='myclass'>
    ....
  </MySubComponent>
);

export default MyComponent;

The problem with this is that we don't have any compiler check to ensure that the class myclass really exists in our LESS file. So if we have a typo, or we later change the LESS file, we cannot be sure all classes/selectors are still valid. Not even the browser will show that: it silently breaks. Bad thing!

A solution

Using Webpack and the LESS loader, one can check this at compile time. To do so, you define the style and its classname in the LESS file and import it into your .tsx files. The LESS loader for webpack exposes the following LESS variables to the build process, where the TypeScript loader (used for the .tsx files) can pick them up.

MyComponent.less:

@my-class: ~':local(.myClass)';
 
@{my-class}{
  width: 100%;
  background-color: green;
}
...

Note the :local() function supported by the css-loader (see the webpack config at the end), which scopes the class name locally.

The above LESS file can then be typed and imported into the .tsx file like this:

MyComponent.tsx:

type TStyles = {
  myClass: string;
};
 
const styles: TStyles = require('./MyComponent.less');
 
const MyComponent = (props: MyComponentProps) => (
  <MySubComponent className={styles.myClass}>
    ....
  </MySubComponent>
);
 
export default MyComponent;

Then, when you fire up your build, the .less file gets picked up via the require() call and checked against the TypeScript type TStyles. The property myClass will contain the generated LESS/CSS classname as defined in the .less file.
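
Depending on your TypeScript setup, the compiler may not know about the require() function used above. A minimal ambient declaration covers this; the following is only a sketch (the file name typings.d.ts and the wildcard-module variant are my assumptions, not part of the original setup):

// typings.d.ts (hypothetical file name)
// Teach TypeScript about the CommonJS-style require() used above.
declare function require(path: string): any;

// Alternatively, declare all .less modules so an ES-style import also type-checks:
//   import * as styles from './MyComponent.less';
declare module '*.less' {
  const styles: { [className: string]: string };
  export = styles;
}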

I can then use styles.myClass instead of the string literal from the original code.
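
With this in place, a typo no longer breaks silently – the compiler rejects it. Roughly (the exact message may vary by TypeScript version):

const broken = styles.myClas;
// error TS2339: Property 'myClas' does not exist on type 'TStyles'.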

To get this working, ensure you have the LESS loader included in your webpack configuration (you probably already have it if you are already using LESS):

webpack.config.js:

module: {
  rules: [
    {
      test: /\.tsx?$/,
      loader: "ts-loader"
    },
    {
      test: /\.less$/,
      use: ExtractTextPlugin.extract({
        use: [
          {
            loader: "css-loader",
            options: {
              localIdentName: '[local]--[hash:5]',
              sourceMap: true
            }
          },
          {
            loader: "less-loader",
            options: {
              sourceMap: true
            }
          }
        ],
        fallback: "style-loader",
        ...
      }),
      ...
    }
  ]
},
...

Note: The samples use LESS stylesheets, but one can do the same with SCSS/Sass – I guess. You just have to use another loader for webpack and adapt to the syntax supported by that loader.
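
As a rough illustration, the LESS rule from the configuration above could be adapted for SCSS along these lines. This is only a sketch – it assumes sass-loader is installed and I haven't verified this variant myself:

{
  test: /\.scss$/,
  use: ExtractTextPlugin.extract({
    use: [
      {
        loader: "css-loader",
        options: {
          localIdentName: '[local]--[hash:5]',
          sourceMap: true
        }
      },
      {
        loader: "sass-loader",
        options: {
          sourceMap: true
        }
      }
    ],
    fallback: "style-loader"
  })
}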

No broken CSS classnames anymore – isn’t this cool? Let me know your feedback.

This is a cross-post from Marc's personal blog at https://marcduerst.com/2018/03/08/compile-check-less-css-classnames-using-typescript-and-webpack/

Finding deep assembly dependencies

Quino contains a Sandbox in the main solution that lets us test a lot of the Quino subsystems in real-world conditions. The Sandbox has several application targets:

  • WPF
  • Winform
  • Remote Data Server
  • WebAPI Server
  • Console

The targets that connect directly to a database (e.g. WPF, Winform) were using the PostgreSql driver by default. I wanted all Sandbox applications to be easily configurable to run with SqlServer as well.

Just add the driver, right?

This is pretty straightforward for a Quino application. The driver can be selected directly in the application (directly linking the corresponding assembly) or it can be configured externally.

Naturally, if the Sandbox loads the driver from configuration, some mechanism still has to make sure that the required data-driver assemblies are available.

The PostgreSql driver was in the output folder. This was expected, since that driver works. The SqlServer driver was not in the output folder. This was also expected, since that driver had never been used.

I checked the direct dependencies of the Sandbox Winform application, but they didn't include the PostgreSql driver. That's not really good, as I would like both SqlServer and PostgreSql to be configured in the same way. As it stood, though, I would be referencing SqlServer directly while PostgreSql continued to show up by magic.

Before doing anything else, I was going to have to find out why PostgreSql was included in the output folder.

I needed to figure out assembly dependencies.

Visual Studio?

My natural inclination was to reach for NDepend, but I thought maybe I'd see what the other tools have to offer first.

Does Visual Studio include anything that might help? The "Project Dependencies" dialog shows only the assemblies on which a project depends; I wanted to find the assemblies that depend on PostgreSql. I have the Enterprise version of Visual Studio and I seem to recall an "Architecture" menu, but I discovered that these tools are no longer installed by default.

According to the VS support team in that link, you have to install the "Visual Studio extension development" workload in the Visual Studio installer. In this package, the "Architecture and analysis tools" feature is available, but not included by default.

Hovering this feature shows a tooltip indicating that it contains "Code Map, Live Dependency Validation and Code Clone detection". The "Live Dependency Validation" sounds like it might do what I want, but it also sounds quite heavyweight and somewhat intrusive, as described in this blog from the end of 2016. Instead of further modifying my VS installation (and possibly slowing it down), I decided to try another tool.

ReSharper?

What about ReSharper? For a while now, it's included project-dependency graphs and hierarchies. Try as I might, I couldn't get the tools to show me the transitive dependency on PostgreSql that Sandbox Winform was pulling in from somewhere. The hierarchy view is live and quick, but it doesn't show all transitive usages.

The graph view is nicely rendered, but shows dependencies by default instead of dependencies and usages. At any rate, the Sandbox wasn't showing up as a transitive user of PostgreSql.

I didn't believe ReSharper at this point because something was causing the data driver to be copied to the output folder.

NDepend to the rescue

So, as expected, I turned to NDepend. I took a few seconds to run an analysis and then right-clicked the PostgreSql data-driver project to select NDepend => Select Assemblies... => That are Using Me (Directly or Indirectly) to show the following query and results.

Bingo. Sandbox.Model is indirectly referencing the PostgreSql data driver, via a transitive-dependency chain of 4 assemblies. Can I see which assemblies they are? Of course I can: this kind of information is best shown as a graph, and you can graph any query's results by clicking "Export to Graph" to produce the graph below.

Now I can finally see that Sandbox.Model pulls in Quino.Testing.Models.Generated (to use the BaseTypes module), which in turn has a reference to Quino.Tests.Base, which of course includes the PostgreSql driver because that's the default testing driver for Quino tests.

Now that I know how the reference is coming in, I can fix the problem. Here I'm on my own: I have to solve this problem without NDepend. But at least NDepend was able to show me exactly what I have to fix (unlike VS or ReSharper).

I ended up moving the test-fixture base classes from Quino.Testing.Models.Generated into a new assembly called Quino.Testing.Models.Fixtures. The latter assembly still depends on Quino.Tests.Base and thus the PostgreSql data driver, but it's now possible to reference the Quino testing models without transitively referencing the PostgreSql data driver.

A quick re-analysis with NDepend and I can see that the same query now shows a clean view: only testing code and testing assemblies reference the PostgreSql driver.

Finishing up

And now to finish my original task! I ran the Winform Sandbox application with the PostgreSql driver configured and was greeted with an error message that the driver could not be loaded. I now had parity between PostgreSql and SqlServer: neither driver could be loaded.

The fix? Obviously, make sure that the drivers are available by referencing them directly from any Sandbox application that needs to connect to a database. This was the obvious solution from the beginning, but we had to quickly fix a problem with dependencies first. Why? Because we hate hacking. :-)

Two quick references added, a build and I was able to connect to both SQL Server and PostgreSql.

Tools for maintaining Quino

The Quino roadmap shows you where we're headed. How do we plan to get there?

A few years back, we made a big leap in Quino 2.0 to split up dependencies in anticipation of the initial release of .NET Core. Three tools were indispensable: ReSharper, NDepend and, of course, Visual Studio. Almost all .NET developers use Visual Studio, many use ReSharper and most should have at least heard of NDepend.

At the time, I wrote a series of articles on the migration from two monolithic assemblies (Encodo and Quino) to dozens of layered and task-specific assemblies that allow applications to include our software in a much more fine-grained manner. As you can see from the articles, NDepend was the main tool I used for finding and tracking dependencies.1 I used ReSharper to disentangle them.

Since then, I've not taken advantage of NDepend's features for maintaining architecture as much as I'd like. I recently fired it up again to see where Quino stands now, with 5.0 in beta.

But, first, let's think about why we're using yet another tool for examining our code. Since I started using NDepend, other tools have improved their support for helping a developer maintain code quality.

  • ReSharper itself has introduced tools for visualizing project and type dependencies with very nice graphs. However, there is currently no support for establishing boundaries and getting ReSharper to tell me when I've inadvertently introduced new dependencies. In fact, ReSharper's only improved its support for quickly pulling in a dependency with its excellent Nuget-Package integration. ReSharper is excellent for finding lower-level code smells, like formatting, style and null-reference issues, as well as language usage, missing documentation and code-complexity (with an extension). DotCover provides test-coverage data but I haven't used it for real-time analysis yet (I don't use continuous testing with ReSharper on Quino because I feel it would destroy my desktop).
  • Visual Studio has also been playing catch-up with ReSharper and has done an excellent job in the last couple of years. VS 2017 is much, much faster than its predecessors; without it, we would be foundering badly with a Quino solution with almost 150 projects.2 Visual Studio provides Code Analysis and Portability Analysis and can calculate Code Metrics. Code Analysis is mostly covered by ReSharper, although it has a few extra inspections related to proper application and usage of the IDisposable pattern. The Portability Analysis is essential for moving libraries to .NET Standard but doesn't offer any insight into architectural violations like NDepend does.
  • We've recently started working with SonarQube on our TeamCity build server because a customer wanted to use it. It has a very nice UI and very nice reports, but doesn't go much farther than VS/R# inspections. Also, the report isn't in the UI, so it's not as quick to jump into the code. I don't want to review it here, since we only recently started working with it. It looks promising and is a welcome addition to that project. Hopefully more will reveal itself in time.
  • TeamCity provides a lot of the services that ReSharper also provides: inspections and code-coverage for builds. This takes quite a while, though, so we only run inspections and coverage for the Quino nightly build. The reports are nice but, as with SonarQube, of limited use because of the tenuous integration with Visual Studio. The integration works, but it's balky and we don't use it very much. Instead, we analyze inspections in real-time in Visual Studio with ReSharper and don't use real-time code-coverage 3
  • NDepend integrates right into Visual Studio and has a super-fast analysis with a very nice dashboard overview, from which you can drill down into myriad issues and reports and analyses, from technical debt (with very daunting but probably accurate estimates for repair) to type- and assembly-interdependency problems. NDepend can also integrate code-coverage results from DotCover to show how you're doing on that front on the dashboard as well. As with TeamCity and SonarQube, the analyses are retained as snapshots. With NDepend, you can quickly compare them (and comparing against a baseline is even included by default in the dashboard), which is essential to see if you're making progress or regressing. 4 NDepend also integrates with TeamCity, but we haven't set that up (yet).

With a concrete .NET Core/Standard project in the wings/under development, we're finally ready to finish our push to make Quino Core ready for cross-platform development. For that, we're going to need NDepend's help, I think. Let's take a look at where we stand today.

The first step is to choose what you want to cover. In the past, I've selected specific assemblies that corresponded to the "Core". I usually do the same when building code-coverage results, because the UI assemblies tend to skew the results heavily. As noted in a footnote below, we're starting an effort to separate Quino into high-level components (roughly, a core with satellites like Winform, WPF and Web). Once we've done that, the health of the core itself should be more apparent (I hope).

For starters, though, I've thrown all assemblies in for both NDepend analysis as well as code coverage. Let's see how things stand overall.

The amount of information can be quite daunting but the latest incarnation of the dashboard is quite easy to read. All data is presented with a current number and a delta from the analysis against which you're comparing. Since I haven't run an analysis in a while, there's no previous data against which to compare, but that's OK.

  • Lines of Code
  • Code Elements (Types, Methods, etc.)
  • Comments (documentation)
  • Technical Debt
  • Code Coverage 5
  • Quality Gates / Rules / Issues

Let's start with the positive.

  • The Quino sources contain almost 50% documentation. That's not unexpected. The XML documentation from which we generate our developer documentation 6 is usually as long as or longer than the method itself.
  • We have a solid B rating for technical debt, which is really not bad, all things considered. I take that to mean that, even without looking, we instinctively produce code with a reasonable level of quality.

Now to the cool part: you can click anything in the NDepend dashboard to see a full list of all of the data in the panel.

Click the "B" on technical debt and you'll see an itemized and further-drillable list of the grades for all code elements. From there, you can see what led to the grade. By clicking the "Explore Debt" button, you get a drop-down list of pre-selected reports like "Types Hot Spots".

Click "Lines of Code" and you get a breakdown of which projects/files/types/methods have the most lines of code.

Click the failed quality gates to see where you've got the most major problems (Quino currently has 3 categories).

Click "Critical" or "Violated" rules to see architectural rules that you're violating. As with everything in NDepend, you can pick and choose which rules should apply. I use the default set of rules in Quino.

Most of our critical issues are for mutually-dependent namespaces. This is most likely not root namespaces crossing each other (though we'd like to get rid of those ASAP) but sub-namespaces that refer back to the root and vice-versa. This isn't necessarily a no-go, but it's definitely something to watch out for.

There are so many interesting things in these reports:

  • Don't create threads explicitly (this is something we've been trying to reduce; I already knew about the one remaining, but it's great to see it in a report as a tracked metric)
  • Methods with too many parameters (you can adjust the threshold, of course)
  • Types too big: we'd have to check these because some of them are probably generated code, in which case we'd remove them from analysis.
  • Abstract constructors should be protected: ReSharper also indicates this one, but we have it as a suggestion, not a warning, so it doesn't get regularly cleaned up. It's not critical, but a code-style thing. I find the NDepend report much easier to browse than the inspection report in TeamCity.

Click the "Low" issues (Quino has over 46,000!) and you can see that NDepend analyzes your code at an incredibly fine-grained level:

  • There are almost 10,000 cases where methods could have a lower visibility. This is good to know, but definitely low-priority.
  • Namespace does not correspond to file location: I'm surprised to see 4,400 violations because I thought that ReSharper managed that for us quite well. This one bears investigating – maybe NDepend found something ReSharper didn't or maybe I need to tweak NDepend's settings.

Finally, there's absolutely everything, which includes boxing/unboxing issues 7, method names that are too long, large interfaces and large instances (which could also be generated classes).

These are already marked as low, so don't worry that NDepend just rains information down on you. Stick to the critical/high violations and you'll have real issues to deal with (i.e. code that might actually lead to bugs rather than code that leads to maintenance issues or incurs technical debt, both of which are longer-term concerns).

What you'll also notice in the screenshots is that NDepend doesn't just provide pre-baked reports: everything is based on its query language. NDepend's analysis is lightning fast (it takes only a few seconds for all of Quino), during which it builds up a huge database of information about your code that it then queries in real time. NDepend provides a ton of pre-built queries linked from all over the UI, and you can adjust any of those queries in the pane at the top to tweak the results. The syntax is LINQ, and there are a ton of comments in each query to help you figure out what else you can do with it.

As noted above, the amount of information can be overwhelming, but just hang in there and figure out what NDepend is trying to tell you. You can pin or hide a lot of the floating windows if it's all just a bit too much at first.

In our case, the test assemblies have more technical debt than the code they test. This isn't optimal, but it's better than the other way around. You might be tempted to exclude test assemblies from the analysis to boost your grade, but I think that's a bad idea. Testing code is production code. Make it just as good as the code it tests to ensure overall quality.

I did a quick comparison between Quino 4 and Quino 5 and we're moving in the right direction: the estimation of work required to get to grade A was already cut in half, so we've made good progress even without NDepend. I'm quite looking forward to using NDepend more regularly in the coming months. I've got my work cut out for me.

--


  1. Many thanks to Patrick Smacchia of NDepend for generously providing an evaluator's license to me over the years.

  2. We came up with a plan for reducing the size of the core solution in a recent architecture meeting. More on that in a subsequent blog post.

  3. Quino has 10,000 tests, many of which are integration tests, so a change to a highly shared component would trigger thousands of tests to run, possibly for minutes. I can't see how it would be efficient to run tests continuously as I type in Quino. I've used continuous testing in smaller projects and it's really wonderful (both with ReSharper and also Wallaby for TypeScript), but it doesn't work so well with Quino because of its size and highly generalized nature.

  4. I ran the analysis on both Quino 4 and Quino 5, but wasn't able to directly compare results because I think I inadvertently threw them away with our nant clean command. I'd moved the ndepend out folder to the common folder and our command wiped out the previous results. I'll work on persisting those better in the future.

  5. I generated coverage data using DotCover, but realized only later that I should have configured it to generate NDepend-compatible coverage data (as detailed in NDepend Coverage Data). I'll have to do that and run it again. For now, no coverage data in NDepend. This is what it looks like in DotCover, though. Not too shabby:

  6. Getting that documentation out to our developers is also a work-in-progress. Until recently, we've been stymied by the lack of a good tool and ugly templates. But recently we added DocFX support to Quino and the generated documentation is gorgeous. There'll be a post hopefully soon announcing the public availability of Quino documentation.

  7. There's probably a lot of low-hanging fruit of inadvertent allocations here. On the other hand, if they're not code hot paths, then they're mostly harmless. It's more a matter of coding consistently. There's also an extension for ReSharper (the "Heap Allocations Viewer") that indicates allocations directly in the IDE, in real-time. I have it installed, and it's nice to see where I'm incurring allocations.

v4.1.7: Winform bug fixes and resource captions for modules

The summary below describes major new features, items of note and breaking changes. The full list of issues is in the release notes and is available to those with access to the Encodo issue tracker.

Highlights

  • Fixed Custom Controls in Winform Navigation (QNO-5889)
  • Use Resource Captions for all standard modules (QNO-5883, QNO-5884)

Note

Unless we find a blocking issue that can't be fixed with a patch to the product, this will be the last release on the 4.x branch.

Breaking changes

  • IExternalLoggerFactory has been renamed to IExternalLoggerProvider
  • ExternalLoggerFactory has been renamed to ExternalLoggerProvider
  • NullExternalLoggerFactory has been renamed to NullExternalLoggerProvider
  • IUserCredentials.AuthenticationToken is now an IToken instead of a string

v4.1.6: Winform / DevExpress improvements

The summary below describes major new features, items of note and breaking changes. The full list of issues is in the release notes and is available to those with access to the Encodo issue tracker.

Highlights

Breaking changes

  • The property ReportDefinitionParameter.Hidden now has the default value false. Integrating this release will trigger a schema migration to adjust that value in the database.

On project maintenance

Consider the following scenarios:

  • You maintain a legacy project or your once-greenfield project has now turned a year (or two) old
  • You’ve been busy programming and have been pummelled by project admin

Under the stresses that come with the combination of these two scenarios, software developers often overlook one critical aspect of a successful, future-proof project: external package maintenance.

I recently sat down and wrote an email explaining how I go about package-maintenance and thought it would be useful to write up those notes and share them with others.

The tech world moves quickly; new code styles, frameworks and best practices evolve in the blink of an eye. Before you know it, the packages you'd installed the previous year are no longer documented and there aren't any blog posts describing how to upgrade them to their latest versions. Nightmare.

My general rule of thumb to avoid this ill-fated destiny is to set aside some time each sprint to upgrade packages. The process isn't really involved, but it can be time-consuming if you upgrade a handful of packages at once and find that one of them breaks your code. You then have to go through the packages one by one, downgrading each to figure out which is the culprit.

My upgrade procedure (in this case using the yarn package manager) is:

  • Check which packages are due for upgrade - yarn outdated
  • Look through the READMEs for each of the outdated packages and check if any of the changes are likely to impact your codebase
  • Upgrade the packages that don't appear to have changed significantly - yarn add clean-webpack-plugin@latest or yarn add clean-webpack-plugin@VERSION_NUMBER to install a specific version
  • Run the project’s test suite and check if the application still works. Fix any issues as required
  • Repeat for packages that have significantly changed

Tom Szpytman is a Software Developer at Encodo and works primarily on the React/TypeScript stack.

v4.1.5: Calculated Properties and Delegate Expressions

The summary below describes major new features, items of note and breaking changes. The full list of issues is in the release notes and is available to those with access to the Encodo issue tracker.

Highlights

Breaking changes

Tuple support for delegates

tl;dr: Applications might have to include the System.Tuple NuGet package in some assemblies.

This release adds an overload for creating delegate expressions that returns a tuple of (object, bool). This improvement allows applications to more easily specify a lambda that returns a value and a flag indicating whether that value is valid.

There are several overloads available for creating a DelegateExpression. The simplest of these assumes that a value can be calculated and is appropriate for constant values or values whose calculation does not depend on the IExpressionContext passed to the expression.

However, many (if not most) delegates should indicate whether a value can be calculated by returning true or false and setting an out object value parameter instead. This is still the standard API, but 4.1.5 introduces an overload that supports tuples, which makes it easier to call.

In 4.1.4, an application had the following two choices for using the "tryGetValue" variant of the API.

Elements.Classes.A.AddCalculatedProperty("FullText", f => f.CreateDelegate(GetFullText));

private object GetFullText(IExpressionContext context, out object value)
{
  var obj = context.GetInstance<IMetaObject>();
  if (obj != null)
  {
    value = obj.ToString();

    return true;
  }

  value = null;

  return false;
}

If the application wanted to inline the lambda, the types have to be explicitly specified:

Elements.Classes.A.AddCalculatedProperty("FullText", f => f.CreateDelegate((IExpressionContext context, out object value) => {
  var obj = context.GetInstance<IMetaObject>();
  if (obj != null)
  {
    value = obj.ToString();

    return true;
  }

  value = null;

  return false;
}));

The overload that expects a tuple makes this a bit simpler:

Elements.Classes.A.AddCalculatedProperty("FullText", f => f.CreateDelegate(context => {
  var obj = context.GetInstance<IMetaObject>();

  return obj != null ? (obj.ToString(), true) : (null, false);
}));

Empty Expression Contexts

Previously, a DelegateExpression would always return false for a call to TryGetValue() made with an empty context. This has been changed so that the DelegateExpression no longer has any logic of its own. This means, though, that an application must be a little more careful to properly return false when it is missing the information that it needs to calculate the value of an expression.

All but the lowest-level overloads and helper methods are unaffected by this change. An application that uses factory.CreateDelegate<T>() will not be affected. Only calls to new DelegateExpression() need to be examined on upgrade. It is strongly recommended to convert direct calls to the constructor to use the IMetaExpressionFactory instead.

Imagine if an application used a delegate for a calculated expression as shown below.

Elements.Classes.Person.AddCalculatedProperty("ActiveUserCount", MetaType.Boolean, new DelegateExpression(GetActiveUserCount));

// ...

private object GetActiveUserCount(IExpression context)
{
  return context.GetInstance<Person>().Company.ActiveUsers;
}

When Quino processes the model during startup, expressions are evaluated in order to determine whether they can be executed without a context (caching, optimization, etc.). This application was safe in 4.1.4 because Quino automatically ignored an empty context and never called this code.

However, as of 4.1.5, an application that calls this low-level version will have to handle the case of an empty or insufficient context on its own.

It is highly recommended to move to using the expression factory and typed arguments instead, as shown below:

Elements.Classes.Person.AddCalculatedProperty<Person>("ActiveUserCount", MetaType.Boolean, f => f.CreateDelegate(GetActiveUserCount));

// ...

private object GetActiveUserCount(Person person)
{
  return person.Company.ActiveUsers;
}

If you just want to add your own handling for empty contexts, you can do the following (note that you have to change the signature of GetActiveUserCount):

Elements.Classes.Person.AddCalculatedProperty("ActiveUserCount", MetaType.Boolean, new DelegateExpression(GetActiveUserCount));

// ...

private bool GetActiveUserCount(IExpression context, out object value)
{
  var person = context.GetInstanceOrDefault<Person>();
  if (person == null)
  {
    value = null;
    return false;
  }

  value = person.Company.ActiveUsers;

  return true;
}

v4.1.4: Search-detail fixes and calculated properties

The summary below describes major new features, items of note and breaking changes. The full list of issues is in the release notes and is available to those with access to the Encodo issue tracker.

Highlights

Breaking changes

  • None

v4.1: Layouts, captions, multiple requests-per-connection

The summary below describes major new features, items of note and breaking changes. The full list of issues is in the release notes and is available to those with access to the Encodo issue tracker.

Highlights

Breaking changes

  • Usages of {{RunWinform}} should be updated to {{RunMetaWinform}}. The method {{RunWinform}} is now defined with a function parameter that expects an {{IApplication}} parameter instead of an {{IDataSession}} parameter. The change was made to allow applications more flexibility in configuring startup for applications with multiple main forms (QNO-4922) while still re-using as much shared code in Quino as possible. {{RunMetaWinform}} extends the {{RunWinform}} support to create and configure an {{IDataSession}} per form.
v4.1.3: Fixes for search layouts, languages, reconnects, descendant relations

The summary below describes major new features, items of note and breaking changes. The full list of issues is in the release notes and is available to those with access to the Encodo issue tracker.

Highlights

  • Fixed reconnect for ADO database connections (QNO-5776)
  • Fixed occasional startup crash when generating data (QNO-5768)
  • Improved search-layout / search-class generation (QNO-5767)
  • Restored support / example for toggling data languages in the UI (QNO-5764)
  • Fixed save for relations with descendant endpoints (QNO-5753)

Breaking changes

  • None