v2.2: Winform fixes and Query Improvements

The summary below describes major new features, items of note and breaking changes. The full list of issues is also available for those with access to the Encodo issue tracker.

Highlights

  • Lots of bug fixes and improvements to the Winform UI and to the German translations, made for the release of Punchclock on this version. (QNO-5162, QNO-5159, QNO-5158, QNO-5157, QNO-5156, QNO-5140, QNO-5155, QNO-5145, QNO-5111, QNO-5107, QNO-5106, QNO-5104, QNO-5015)
  • Fixed a leap-day bug in DateTimeExtensions.GetDayOfWeek() (QNO-5051)
  • Fixed how the hash code for GenericObjects is calculated, which fixes sorting issues in grids, specifically for non-persisted or transient objects (QNO-5137)
  • Improvements to the IAccessControl API for getting groups and users and testing membership (QNO-5133)
  • Add support for query aliases (e.g. for joining the same table multiple times) (QNO-531). This changes the API surface only minimally: applications can pass an alias when calling the Join method, as shown below.
query.Join(Metadata.Project.Deputy, alias: "deputy")

You can find more examples of aliased queries in TestAliasedQuery(), TestJoinAliasedTables() and TestJoinChildTwice(), defined in the QueryTests testing fixture.
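
For instance, joining the same relation twice under two different aliases might look like the following sketch (Manager is a hypothetical second path to the same table; the alias parameter is the only new API):

query.Join(Metadata.Project.Deputy, alias: "deputy");
query.Join(Metadata.Project.Manager, alias: "manager");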

  • Add a standalone IQueryAnalyzer for optimizations and in-memory mini-drivers (QNO-4830)

Breaking changes

  • ISchemaManager has been removed. Instead, you should retrieve the interface you were looking for from the IOC. The possible interfaces you might need are IImportHandler, IMappingBuilder, IPlanBuilder or ISchemaCommandFactory.

  • ISchemaManagerSettings.GetAuthorized() has been moved to ISchemaManagerAuthorizer.

  • The hash-code fix for GenericObjects may have an effect on the way your application sorts objects.

  • The IParticipantManager (base interface of IAccessControl) no longer has a single method called GetGroups(IParticipant). This method was previously used to get the groups to which a user belongs and the child groups of a given group. This confusing double duty for the API led to an incorrect implementation for both usages. Instead, there are now two methods:

    • IEnumerable<IGroup> GetGroups(IUser user): Gets the groups for the given user
    • IEnumerable<IGroup> GetChildGroups(IGroup group): Gets the child groups for the given group

The old method has been removed from the interface because (A) it never worked correctly anyway and (B) it conflicts with the new API.
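
A minimal usage sketch of the new methods, assuming an IAccessControl obtained from the IOC and hypothetical currentUser and administrators references:

var accessControl = application.GetInstance<IAccessControl>();
var groupsForUser = accessControl.GetGroups(currentUser);       // groups to which the user belongs
var childGroups = accessControl.GetChildGroups(administrators); // direct child groups of a group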

Mini-applications and utilities with Quino

In several articles last year1, I went into a lot of detail about the configuration and startup for Quino applications. Those posts discuss a lot about what led to the architecture Quino has for loading up an application.

Some of you might be wondering: what if I want to start up and run an application that doesn't use Quino? Can I build applications that don't use any fancy metadata because they're super-simple and don't even need to store any data? Those are the kind of utility applications I make all the time; do you have anything for me, you continue to wonder?

As you probably suspected from the leading question: You're in luck. Any functionality that doesn't need metadata is available to you without using any of Quino. We call this the "Encodo" libraries, which are the base on which Quino is built. Thanks to the fundamental changes made in Quino 2, you have a wealth of functionality available in just the granularity you're looking for.

Why use a Common Library?

We know we could write such small applications from scratch -- so why would we want to leverage existing code instead? What are the advantages of doing this?

  • Writing code that is out of scope takes time away from writing code that is in scope.
  • Code you never write has no bugs.
  • It also doesn't require maintenance or documentation.
  • While library code is not guaranteed to be bug-free, it's probably much better off than the code you just wrote seconds ago.
  • Using a library increases the likelihood of having robust, reliable and extendible code for out-of-scope application components.
  • One-off applications tend to be maintainable only by the originator. Applications using a common library can be maintained by anyone familiar with that library.
  • Without a library, common mistakes must be fixed in all copies, once for each one-off application.
  • The application can benefit from bug fixes and improvements made to the library.
  • Good practices and patterns are encouraged/enforced by the library.

What are potential disadvantages?

  • The library might compel a level of complexity that makes the application take longer to create than it would to write it from scratch.
  • The library might force you to use components that you don't want.
  • The library might hamstring you, preventing innovation.

A developer unfamiliar with a library -- or one who is too impatient to read up on it -- will feel these disadvantages more acutely and earlier.

Two Sample Applications

Let's take a look at some examples below to see how the Encodo/Quino libraries stack up. Are we able to profit from the advantages without suffering from the disadvantages?

We're going to take a look at two simple applications:

  1. An application that loads settings for Windows service-registration. We built this for a customer product.
  2. The Quino Code Generator that we use to generate metadata and ORM classes from the model.

Windows Service Installer

The actual service-registration part is boilerplate generated by Microsoft Visual Studio2, but we'd like to replace the hard-coded strings with customized data obtained from a configuration file. So how do we get that data?

  • The main requirement is that the user should be able to indicate which settings to use when registering the Windows service.
  • The utility could read them in from the command line, but it would be nicer to read them from a configuration file.

That doesn't sound that hard, right? I'm sure you could just whip something together with an XmlDocument and some hard-coded paths and filenames that would do the trick.3 It might even work on the first try, too. But do you really want to bother with all of that? Wouldn't you rather just get the scaffolding for free and focus on the part where you load your settings?
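
For comparison, the hand-rolled version might look something like the sketch below (it assumes the service-settings.xml format shown in the footnotes); it works, but locating the file, error-handling and logging are all up to you:

using System.Xml;

var document = new XmlDocument();
document.Load("service-settings.xml");
var serviceName = document.SelectSingleNode("/config/service/name")?.InnerText ?? "Quino.Services";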

Getting the Settings

The following listing shows the main application method, using the Encodo/Quino framework libraries to do the heavy lifting.

[NotNull]
public static ServiceSettings LoadServiceSettings()
{
  ServiceSettings result = null;
  var transcript = new ApplicationManager().Run(
    CreateServiceConfigurationApplication,
    app => result = app.GetInstance<ServiceSettings>()
  );

  if (transcript.ExitCode != ExitCodes.Ok)
  {
    throw new InvalidOperationException(
      "Could not read the service settings from the configuration file." + 
      new SimpleMessageFormatter().GetErrorDetails(transcript.Messages)
    );
  }

  return result;
}

If you've been following along in the other articles (see first footnote below), then this structure should be very familiar. We use an ApplicationManager() to execute the application logic, creating the application with CreateServiceConfigurationApplication and returning the settings configured by the application in the second parameter (the "run" action). If anything went wrong, we get the details and throw an exception.

You can't see it, but the library provides debug/file logging (if you enable it), debug/release mode support (exception-handling, etc.) and everything is customizable/replaceable by registering with an IOC.

Configuring the Settings Loader

Soooo...I can see where we're returning the ServiceSettings, but where are they configured? Let's take a look at the second method, the one that creates the application.

private static IApplication CreateServiceConfigurationApplication()
{
  var application = new Application();
  application
    .UseSimpleInjector()
    .UseStandard()
    .UseConfigurationFile("service-settings.xml")
    .Configure<ServiceSettings>(
      "service", 
      (settings, node) =>
      {
        settings.ServiceName = node.GetValue("name", settings.ServiceName);
        settings.DisplayName = node.GetValue("displayName", settings.DisplayName);
        settings.Description = node.GetValue("description", settings.Description);
        settings.Types = node.GetValue("types", settings.Types);
      }
    ).RegisterSingle<ServiceSettings>();

  return application;
}
  1. First, we create a standard Application, defined in the Encodo.Application assembly. What does this class do? It does very little other than manage the main IOC (see articles linked in the first footnote for details).
  2. The next step is to choose an IOC, which we do by calling UseSimpleInjector(). Quino includes support for the SimpleInjector IOC out of the box. As you can see, you must include this support explicitly, so you're also free to assign your own IOC (e.g. one using Microsoft's Unity). SimpleInjector is very lightweight and super-fast, so there's no downside to using it.
  3. Now we have an application with an IOC that doesn't have any registrations on it. How do we get more functionality? By calling methods like UseStandard(), defined in the Encodo.Application.Standard assembly. Since I know that UseStandard() pulls in what I'm likely to need, I'll just use that.4
  4. The next line tells the application the name of the configuration file to use.5
  5. The very next line is already application-specific code, where we configure the ServiceSettings object that we want to return. For that, there's a Configure method that returns an object from the IOC along with a specific node from the configuration data. This method is called only if everything started up OK.
  6. The final call to RegisterSingle makes sure that the ServiceSettings object created by the IOC is a singleton (it would be silly to configure one instance and return another, unconfigured one).

Basically, because this application is so simple, it has already accomplished its goal by the time the standard startup completes. At the point that we would "run" this application, the ServiceSettings object is already configured and ready for use. That's why, in LoadServiceSettings(), we can just get the settings from the application with GetInstance() and exit immediately.

Code Generator

The code generator has a bit more code, but follows the same pattern as the simple application above. In this case, we use the command line rather than the configuration file to get user input.

Execution

The main method defers all functionality to the ApplicationManager, passing along two methods, one to create the application, the other to run it.

internal static void Main()
{
  new ApplicationManager().Run(CreateApplication, GenerateCode);
}

Configuration

As before, we first create an Application, then choose the SimpleInjector and some standard configuration and registrations with UseStandard(), UseMetaStandardServices() and UseMetaTools().6

We set the application title to "Quino Code Generator" and then include objects with UseSingle() that will be configured from the command line and used later in the application.7 And, finally, we add our own ICommandSet to the command-line processor that will configure the input and output settings. We'll take a look at that part next.

private static IApplication CreateApplication(
  IApplicationCreationSettings applicationCreationSettings)
{
  var application = new Application();

  return
    application
    .UseSimpleInjector()
    .UseStandard()
    .UseMetaStandardServices()
    .UseMetaTools()
    .UseTitle("Quino Code Generator")
    .UseSingle(new CodeGeneratorInputSettings())
    .UseSingle(new CodeGeneratorOutputSettings())
    .UseUnattendedCommand()
    .UseCommandSet(CreateGenerateCodeCommandSet(application))
    .UseConsole();
}

Command-line Processing

The final bit of the application configuration is to see how to add items to the command-line processor.

Basically, each command set consists of required values, optional values and zero or more switches that are considered part of a set.

The one for i simply sets the value of inputSettings.AssemblyFilename to whatever was passed on the command line after that parameter. Note that it pulls the inputSettings from the application to make sure that it sets the values on the same singleton reference as will be used in the rest of the application.

The code below shows only one of the code-generator--specific command-line options.8

private static ICommandSet CreateGenerateCodeCommandSet(
  IApplication application)
{
  var inputSettings = application.GetSingle<CodeGeneratorInputSettings>();
  var outputSettings = application.GetSingle<CodeGeneratorOutputSettings>();

  return new CommandSet("Generate Code")
  {
    Required =
    {
      new OptionCommandDefinition<string>
      {
        ShortName = "i",
        LongName = "in",
        Description = Resources.Program_ParseCommandLineArgumentIn,
        Action = value => inputSettings.AssemblyFilename = value
      },
      // And others...
    },
  };
}

Code-generation

Finally, let's take a look at the main program execution for the code generator. It shouldn't surprise you too much to see that the logic consists mostly of getting objects from the IOC and telling them to do stuff with each other.9

I've highlighted the code-generator--specific objects in the code below. All other objects are standard library tools and interfaces.

private static void GenerateCode(IApplication application)
{
  var logger = application.GetLogger();
  var inputSettings = application.GetInstance<CodeGeneratorInputSettings>();

  if (!inputSettings.TypeNames.Any())
  {
    logger.Log(Levels.Warning, "No types to generate.");
  }
  else
  {
    var modelLoader = application.GetInstance<IMetaModelLoader>();
    var metaCodeGenerator = application.GetInstance<IMetaCodeGenerator>();
    var outputSettings = application.GetInstance<CodeGeneratorOutputSettings>();
    var modelAssembly = AssemblyTools.LoadAssembly(
      inputSettings.AssemblyFilename, logger
    );

    outputSettings.AssemblyDetails = modelAssembly.GetDetails();

    foreach (var typeName in inputSettings.TypeNames)
    {
      metaCodeGenerator.GenerateCode(
        modelLoader.LoadModel(modelAssembly, typeName), 
        outputSettings,
        logger
      );
    }
  }
}

So that's basically it: no matter how simple or complex your application, you configure it by indicating what stuff you want to use, then use all of that stuff once the application has successfully started. The Encodo/Quino framework provides a large amount of standard functionality. It's yours to use as you like and you don't have to worry about building it yourself. Even your tiniest application can benefit from sophisticated error-handling, command-line support, configuration and logging without lifting a finger.


That said, UseStandard() is just a composition of over a dozen other methods. If, for whatever reason (perhaps dependencies), you don't want all of that functionality, you can just call the subset of methods that you do want. For example, you could call UseApplication() from the Encodo.Application assembly instead. That method includes only the support for:

    * Processing the command line (`ICommandSetManager`)
    * Locating external files (`ILocationManager`)
    * Loading configuration data from file (`IConfigurationDataLoader`)
    * Debug- and file-based logging (`IExternalLoggerFactory`)
    * and interacting with the `IApplicationManager`.

If you want to go even lower than that, you can try UseCore(), defined in the Encodo.Core assembly and then pick and choose the individual components yourself. Methods like UseApplication() and UseStandard() are tried and tested defaults, but you're free to configure your application however you want, pulling from the rich trove of features that Quino offers.
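
As a hedged sketch, the service-settings example from earlier could be rebuilt on UseApplication() instead of UseStandard(), assuming none of the extra standard features (BCrypt, enhanced CSV support and so on) are needed:

var application = new Application();
application
  .UseSimpleInjector()
  .UseApplication() // command line, locations, configuration file and logging only
  .UseConfigurationFile("service-settings.xml")
  .Configure<ServiceSettings>(
    "service",
    (settings, node) => settings.ServiceName = node.GetValue("name", settings.ServiceName)
  )
  .RegisterSingle<ServiceSettings>();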

A note on configuration-file locations: you might expect to use Configure<ILocationManager>() to change where the application looks for its configuration files, since Configure() is ordinarily the way to go if you want to make changes to a singleton before it is used. However, the location manager has to be changed before any other configuration takes place. It's a special object that is available before the IOC has been fully configured. To reiterate from other articles (because it's important), the order of operations we're interested in here is:

     1. Create application (this is where you call `Use*()` to build the application)
     2. Get the location manager to figure out the path for `LocationNames.Configuration`
     3. Load the configuration file
     4. Execute all remaining actions, including those scheduled with calls to `Configure()`

If you want to change the configuration-file location, then you have to get in there before the startup starts running -- and that's basically during application construction. Alternatively, you could also call UseConfigurationDataLoader() to register your own object to actually load configuration data and do whatever the heck you like in there, including returning constant data. :-)
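
As a sketch of that timing: GetLocationManager() is the documented accessor, while SetFolder() below stands in for whatever setter ILocationManager actually exposes in your version of Quino. The important part is that the call happens during application construction, before startup runs.

var application = CreateServiceConfigurationApplication();
var locationManager = application.GetLocationManager();
// Hypothetical setter: repoint the "Configuration" alias before the
// configuration file is located and loaded.
locationManager.SetFolder(LocationNames.Configuration, @"D:\Config\MyService");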


  1. See Encodo's configuration library for Quino Part 1, Part 2 and Part 3 as well as API Design: Running an Application Part 1 and Part 2 and, finally, Starting up an application, in detail.

  2. That boilerplate looks like this:

var fileService = new ServiceInstaller();
fileService.StartType = ServiceStartMode.Automatic;
fileService.DisplayName = "Quino Sandbox";
fileService.Description = "Demonstrates a Quino-based service.";
fileService.ServiceName = "Sandbox.Services";

See the ServiceInstaller.cs file in the Sandbox.Server project in Quino 2.1.2 and higher for the full listing.

  3. The standard implementation of Quino's ITextKeyValueNodeReader supports XML, but it would be trivial to create and register a version that supports JSON (QNO-4993) or YAML. The configuration file for the utility looks like this:

<?xml version="1.0" encoding="utf-8" ?>
<config>
  <service>
    <name>Quino.Services</name>
    <displayName>Quino Utility</displayName>
    <description>The application to run all Quino backend services.</description>
    <types>All</types>
  </service>
</config>

  4. If you look at the implementation of the UseStandard method10, it pulls in a lot of stuff, like support for BCrypt, enhanced CSV and enum-value parsing and standard configuration for various components (e.g. the file log and command line). It's called "Standard" because it's the stuff we tend to use in a lot of applications.

  5. By default, the application will look for this file next to the executable. You can configure this as well, by getting the location manager with GetLocationManager() and setting values on it.

  6. The metadata-specific analog to UseStandard() is UseMetaStandard(), but we don't call that. Instead, we call UseMetaStandardServices(). Why? The answer is that we want the code generator to be able to use some objects defined in Quino, but the code generator itself isn't a metadata-based application. We want to include the IOC registrations required by metadata-based applications without adding any of the startup or shutdown actions. Many of the standard Use*() methods included in the base libraries have analogs like this. The Use*Services() analogs are also very useful in automated tests, where you want to be able to create objects but don't want to add anything to the startup.

  7. Wait, why didn't we call RegisterSingle()? For almost any object, we could totally do that. But objects used during the first stage of application startup -- before the IOC is available -- must go in the other IOC, accessed with SetSingle() and GetSingle().

  8. The full listing is in Program.cs in the Quino.CodeGenerator project in any 2.x version of Quino.

  9. Note that, once the application is started, you can use GetInstance() instead of GetSingle() because the IOC is now available and all singletons are mirrored from the startup IOC to the main IOC. In fact, once the application is started, it's recommended to use GetInstance() everywhere, for consistency and to prevent the subtle distinction between IOCs -- present only in the first stage of startup -- from bleeding into your main program logic.

  10. If you have the Quino source code handy, you can look it up there, but if you have ReSharper installed, you can just F12 on UseStandard() to decompile the method. In the latest DotPeek, the extension methods are even displayed much more nicely in decompiled form.

v2.1.1 & v2.1.2: Bug fixes for web authentication, logging and services

The summary below describes major new features, items of note and breaking changes. The full list of issues for 2.1.1 and full list of issues for 2.1.2 are available for those with access to the Encodo issue tracker.

Highlights

  • Improved configuration, logging and error-handling for Windows services. (QNO-4992, QNO-5043, QNO-5057, QNO-5076, QNO-5077, QNO-5109)
  • Schema-based validation is once again applied. Without these validators, it was possible to make a model without the required meta-ids. During migration, this caused odd behavior. (QNO-5118)
  • Use TPL and async/await for services (QNO-5113)
  • Added new GetList(IEnumerable<IMetaRelation>) method to help products avoid lazy-loading (QNO-5113)
  • Reduce traffic for the EventLogger and MailLogger (QNO-5080)
  • Improve usability and error-reporting in the Quino Migrator

Breaking changes

  • The ConfigureDataProviderActionBase has been replaced with ConfigureDataProviderAction.
  • The standard implementations for IFeedback and IStatusFeedback as well as the other special-purpose feedbacks (e.g. IIncidentReporterSubmitterFeedback, ISchemaMigratorFeedback) have all been updated to require an IFeedbackLogger or IStatusLogger in the constructors. This was done to ensure that messages sent to feedbacks are logged, as noted in the highlights above. If you've implemented your own feedbacks, you'll have to accommodate the new constructors.
v2.1: API-smoothing and performance

The summary below describes major new features, items of note and breaking changes. The full list of issues is also available for those with access to the Encodo issue tracker.

Highlights

Quino 2 is finally ready and will go out the door with a 2.1 rather than a 2.0 version number, because we released 2.0 internally and tested the hell out of it. 2.1 is the result of that testing. It includes a lot of bug fixes as well as API tweaks to make things easier for developers.

On top of that, I've gone through the backlog and found many issues that had either been fixed already, were obsolete or had been inadequately specified. The Quino backlog dropped from 682 to 542 issues.

Breaking changes

The following changes are marked with Obsolete attributes, so you'll get a hint as to how to fix the problem. Since these are changes from an unreleased version of Quino, the Obsolete attributes are configured to cause a compile error rather than just a warning.

  • UseMetaSchemaWinformDxFeedback() has been renamed to UseMetaschemaWinformDx()
  • UseSchemaMigrationSupport() has been renamed to UseIntegratedSchemaMigration()
  • MetaHttpApplicationBase.MetaApplication has been renamed to BaseApplication
  • The IServer.Run() extension method is no longer supported.
  • GetStandardFilters(), GetStandardFiltersForFormsAuthentication() and GetStandardFiltersForUnrestrictedAuthentication() are no longer supported. Instead, register filters in the IOC and use IWebFilterAttributeFactory.CreateFilters() to get the list of supported filters.
  • The ToolRequirementAttribute is no longer supported or used.
  • AssemblyExtensions.GetLoadableTypesWithInterface() is no longer supported
  • AssemblyTools.GetValidAssembly() has been replaced with AssemblyTools.GetApplicationAssembly(); GetExecutableName() and GetExecutablePath() have been removed.
  • All of the constant expressions on the MetaBuilderBase (e.g. EndOfTimeExpression) are obsolete; use MetaBuilderBase.ExpressionFactory.Constants.EndOfTime instead.
  • All of the global values on MetaObjectDescriptionExtensions are obsolete; instead, use the IMetaObjectFormatterSettings from the IOC to change settings on startup.
  • Similarly, the set of extension methods that included GetShortDescription() has been moved to the IMetaObjectFormatter. Obtain an instance from the IOC, as usual.
v2.0: Logging, Dependencies, New Assemblies & Nuget

The summary below describes major new features, items of note and breaking changes. The full list of issues is also available for those with access to the Encodo issue tracker.

Highlights

In the beta1 and beta2 release notes, we read about changes to configuration, dependency reduction, the data driver architecture, DDL commands, security and access control in web applications and a new code-generation format.

In 2.0 final -- which was actually released internally on November 13th, 2015 (a Friday) -- we made the following additional improvements:

These notes are being published for completeness and documentation. The first publicly available release of Quino 2.x will be 2.1 or higher (release notes coming soon).

Breaking changes

As we've mentioned before, this release is absolutely merciless in regard to backwards compatibility. Old code is not retained as Obsolete. Instead, a project upgrading to 2.0 will encounter compile errors.

The following notes serve as an incomplete guide that will help you upgrade a Quino-based product.

As I wrote in the release notes for beta1 and beta2, if you arm yourself with a bit of time, ReSharper and the release notes (and possibly keep an Encodo employee on speed-dial), the upgrade is not difficult. It consists mainly of letting ReSharper update namespace references for you.

Global Search/Replace

Instead of going through the errors (example shown to the right) one by one, you can take care of a lot of errors with the following search/replace pairs.

  • Encodo.Quino.Data.Persistence => Encodo.Quino.Data
  • IMetaApplication => IApplication
  • ICoreApplication => IApplication
  • GetServiceLocator() => GetServices()
  • MetaMethodTools.GetInstance => DataMetaMethodExtensions.GetInstance
  • application.ServiceLocator.GetInstance => application.GetInstance
  • Application.ServiceLocator.GetInstance => Application.GetInstance
  • application.ServiceLocator => application.GetServices()
  • Application.ServiceLocator => Application.GetServices()
  • application.Recorder => application.GetLogger()
  • Application.Recorder => Application.GetLogger()
  • session.GetRecorder() => session.GetLogger()
  • Session.GetRecorder() => Session.GetLogger()
  • Session.Application.Recorder => Session.GetLogger()
  • FileTools.Canonicalize() => PathTools.Normalize()
  • application.Messages => application.GetMessageList()
  • Application.Messages => Application.GetMessageList()
  • ServiceLocator.GetInstance => Application.GetInstance
  • MetaLayoutTools => LayoutConstants
  • GlobalContext.Instance.Application.Configuration.Model => GlobalContext.Instance.Application.GetModel()
  • IMessageRecorder => ILogger
  • GetUseReleaseSettings() => IsInReleaseMode()
  • ReportToolsDX => ReportDxExtensions

Although you can't just search/replace everything, it gets you a long way.

Model-Building Fixes

These replacement pairs, while not recommended for global search/replace, are a handy guide for how the API has generally changed.

  • *Generator => *Builder
  • SetUpForModule => CreateModule
  • Builder.SetElementVisibility(prop, true) => prop.Show()
  • Builder.SetElementVisibility(prop, false) => prop.Hide()
  • Builder.SetElementControlIdentifier(prop, ControlIdentifiers...) => prop.SetInputControl(ControlIdentifiers...)
  • Builder.SetPropertyHeightInPixels(prop, 200); => prop.SetHeightInPixels(200);

Constructing a module has also changed. Instead of using the following syntax,

var module = Builder.SetUpForModule<AuditModule>(Name, "ApexClearing.Alps.Core", Name, true);

Replace it with the following direct replacement,

var module = Builder.CreateModule(Name, "ApexClearing.Alps.Core", Name);

Or use this replacement, with the recommended style for the v2 format (no more class prefix for generated classes and a standard namespace):

var module = Builder.CreateModule(Name, typeof(AuditModuleBuilder).GetParentNamespace());

Standard Modules (e.g. Reporting, Security, etc.)

Because of how the module class-names have changed, the standard module ORM classes all have different names. The formula is that the ORM class-name is no longer prefixed with its module name.

  • ReportsReportDefinition => ReportDefinition
  • SecurityUser => User
  • And so on...

Furthermore, all modules have been converted to use the v2 code-generation format, which has the metadata separate from the ORM object. Therefore, instead of referencing metadata using the ORM class-name as the base, you use the module name as the base.

  • ReportReportDefinition.Fields.Name => ReportModule.ReportDefinition.Name.Identifier
  • ReportReportDefinition.MetaProperties.Name => ReportModule.ReportDefinition.Name
  • ReportReportDefinition.Metadata => ReportModule.ReportDefinition.Metadata
  • And so on...

There's an upcoming article that will show more examples of the improved flexibility and capabilities that come with the v2-metadata.

Action names

The standard action names have moved as well.

  • ActionNames => ApplicationActionNames
  • MetaActionNames => MetaApplicationActionNames

Any other, more rarely used action names have been moved back to the actions themselves, for example:

SaveApplicationSettingsAction.ActionName

If you created any actions of your own, then the API there has changed as well. As previously documented in API Design: To Generic or not Generic? (Part II), instead of overriding the following method,

protected override int DoExecute(IApplication application, ConfigurationOptions options, int currentResult)
{
  return base.DoExecute(application, options, currentResult);
}

you instead override in the following way,

public override void Execute()
{
  base.Execute();
}

Using NuGet

If you're already using Visual Studio 2015, then the NuGet UI is a good choice for managing packages. If you're still on Visual Studio 2013, then the UI there is pretty flaky and we recommend using the console.

The examples below assume that you have configured a source called "Local Quino" (e.g. a local folder that holds the nupkg files for Quino).

install-package Quino.Data.PostgreSql.Testing -ProjectName Punchclock.Core.Tests -Source "Local Quino"
install-package Quino.Server -ProjectName Punchclock.Server -Source "Local Quino"
install-package Quino.Console -ProjectName Punchclock.Server -Source "Local Quino"
install-package Quino.Web -ProjectName Punchclock.Web.API -Source "Local Quino"

Debugging Support

We recommend using Visual Studio 2015 if at all possible. Visual Studio 2013 is also supported, but we have all migrated to 2015 and our knowhow about 2013 and its debugging idiosyncrasies will deteriorate with time.

These are just brief points of interest to get you set up. As with the NuGet support, these instructions are subject to change as we gain more experience with debugging with packages as well.

  • Hook up to a working symbol-source server (e.g. TeamCity)
  • Get the local sources for your version
  • If you don't have a source server or it's flaky, then get the PDBs for the Quino version you're using (provided in Quino.zip as part of the package release)
  • Add the path to the PDBs to your list of symbol sources in the VS debugging options
  • Tell Visual Studio where the sources are when it asks during debugging
  • Tell R# how to map from the source folder (c:\BuildAgent\work\9a1bb0adebb73b1f for Quino 2.0.0-1765) to the location of your sources

Quino packages are no different than any other NuGet packages. We provide both standard packages as well as packages with symbols and sources. Any complications you encounter with them are due to the whole NuGet experience still being a bit in-flux in the .NET world.

An upcoming post will provide more detail and examples.

Creating Nuget Packages

We generally use our continuous integration server to create packages, but you can also create packages locally (it's up to you to make sure the version number makes sense, so be careful). These instructions are approximate and are subject to change. I provide them here to give you an idea of how packages are created. If they don't work, please contact Encodo for help.

  • Open PowerShell
  • Change to the %QUINO_ROOT%\src directory
  • Run nant build pack to build Quino and packages
  • Set up a local NuGet source named "Local Quino" pointing to %QUINO_ROOT%\nuget (one-time only)
  • Change to the directory where your Quino packages are installed for your solution.
  • Delete all of the Encodo/Quino packages
  • Execute nant nuget from your project directory to get the latest Quino build from your local folder
Quino 2: Starting up an application, in detail

As part of the final release process for Quino 2, we've upgraded 5 solutions1 from Quino 1.13 to the latest API in order to shake out any remaining API inconsistencies or even just inelegant or clumsy calls or constructs. A lot of questions came up during these conversions, so I wrote the following blog to provide detail on the exact workings and execution order of a Quino application.

I've discussed the design of Quino's configuration before, most recently in API Design: Running an Application (Part I) and API Design: To Generic or not Generic? (Part II) as well as the three-part series that starts with Encodo's configuration library for Quino: part I.

Quino Execution Stages

The life-cycle of a Quino 2.0 application breaks down into roughly the following stages:

  1. Build Application: Register services with the IOC, add objects needed during configuration and add actions to the startup and shutdown lists
  2. Load User Configuration: Use non-IOC objects to bootstrap configuration from the command line and configuration files; IOC is initialized and can no longer be modified after action ServicesInitialized
  3. Apply Application Configuration: Apply code-based configuration to IOC objects; ends with the ServicesConfigured action
  4. Execute: execute the loop, event-handler, etc.
  5. Shut Down: dispose of the application, shutting down services in the IOC, setting the exit code, etc.

Stage 1

The first stage is all about putting the application together with calls to Use various services and features. This stage is covered in detail in three parts, starting with Encodo's configuration library for Quino: part I.

Stage 2

Let's tackle this one last because it requires a bit more explanation.

Stage 3

Technically, an application can add code to this stage by adding an IApplicationAction before the ServicesConfigured action. Use the Configure<TService>() extension method in stage 1 to configure individual services, as shown below.

application.Configure<IFileLogSettings>(
  s => s.Behavior = FileLogBehavior.MultipleFiles
);

Stage 4

The execution stage is application-specific. This stage can be short or long, depending on what your application does.

For desktop applications or single-user utilities, stage 4 is executed in application code, as shown below, in the Run method, which is called by the ApplicationManager after the application has started.

var transcript = new ApplicationManager().Run(CreateApplication, Run);

IApplication CreateApplication() { ... }
void Run(IApplication application) { ... }

If your application is a service, like a daemon or a web server or whatever, then you'll want to execute stages 1--3 and then let the framework send requests to your application's running services. When the framework sends the termination signal, execute stage 5 by disposing of the application. Instead of calling Run, you'll call CreateAndStartUp.

var application = new ApplicationManager().CreateAndStartUp(CreateApplication);

IApplication CreateApplication() { ... }

Stage 5

Every application has certain tasks to execute during shutdown. For example, an application will want to close down any open connections to external resources, close files (especially log files) and perhaps inform the user of the shutdown.

Instead of exposing a specific "shutdown" method, a Quino 2.0 application can simply be disposed to shut it down.

If you use ApplicationManager.Run() as shown above, then you're already sorted -- the application will be disposed and the user will be informed in case of catastrophic failure; otherwise, you can shut down and get the final application transcript from the disposed object.

application.Dispose();
var transcript = application.GetTranscript();
// Do something with the transcript...

Stage 2 Redux

We're finally ready to discuss stage 2 in detail.

An IOC has two phases: in the first phase, the application registers services with the IOC; in the second phase, the application uses services from the IOC.

An application should use the IOC as much as possible, so Quino keeps stage 2 as short as possible. Because it can't use the IOC during the registration phase, code that runs in this stage shares objects via a poor-man's IOC built into the IApplication that allows modification and only supports singletons. Luckily, very little end-developer application code will ever need to run in this stage. It's nevertheless interesting to know how it works.

Obviously, any code in this stage that uses the IOC will cause it to switch from phase one to phase two and subsequent attempts to register services will fail. Therefore, while application code in stage 2 has to be careful, you don't have to worry about not knowing you've screwed up.

Why would we have this stage? Some advocates of using an IOC claim that everything should be configured in code. However, it's not uncommon for applications to want to run very differently based on command-line or other configuration parameters. The Quino startup handles this by placing the following actions in stage 2:

  • Parse and apply command-line
  • Import and apply external configuration (e.g. from file)

An application is free to insert more actions before the ServicesInitialized action, but they have to play by the rules outlined above.

"Single" objects

Code in stage 2 shares objects by calling SetSingle() and GetSingle(). There are only a few objects that fall into this category.

The calls UseCore() and UseApplication() register most of the standard objects used in stage 2. Actually, while they're mostly used during stage 2, some of them are also added to the poor man's IOC in case of catastrophic failure, in which case the IOC cannot be assumed to be available. A good example is the IApplicationCrashReporter.
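
As a sketch, a custom action that runs before ServicesInitialized could share an object like this (the settings type is hypothetical, and the exact SetSingle() overload may differ slightly):

// Register a singleton in the poor-man's IOC during stage 2...
application.SetSingle<IBootstrapSettings>(new BootstrapSettings());
// ...and retrieve it from another stage-2 action.
var bootstrapSettings = application.GetSingle<IBootstrapSettings>();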

Executing Stages

Before listing all of the objects, let's take a rough look at how a standard application is started. The following steps outline what we consider to be a good minimum level of support for any application. Of course, the Quino configuration is modular, so you can take as much or as little as you like: you can use a naked Application -- which has absolutely nothing registered -- or call UseCore() to get a bit more (it registers a handful of low-level services but no actions), but we recommend calling at least UseApplication() to add most of the functionality outlined below.

  1. Create application: This involves creating the IOC and most of the IOC registration as well as adding most of the application startup actions (stage 1)
  2. Set debug mode: Get the final value of RunMode from the IRunSettings to determine if the application should catch all exceptions or let them go to the debugger. This involves getting the IRunSettings from the application and getting the final value using the IApplicationManagerPreRunFinalizer, which is commonly an implementation that allows setting the value of RunMode from the command line in debug builds. This further depends on the ICommandSetManager (which depends on the IValueTools) and possibly the ICommandLineSettings (to set the CommandLineConfigurationFilename if it was set by the user).
  3. Process command line: Set the ICommandProcessingResult, possibly setting other values and adding other configuration steps to the list of startup actions (e.g. many command-line options are switches that are handled by calling Configure<TSettings>() where TSettings is the configuration object in the IOC to modify).
  4. Read configuration file: Load the configuration data into the IConfigurationDataSettings, involving the ILocationManager to find configuration files and the ITextValueNodeReader to read them.
  5. The ILogger is used throughout by various actions to log application behavior
  6. If there is an unhandled error, the IApplicationCrashReporter uses the IFeedback or the ILogger to notify the user and log the error
  7. The IInMemoryLogger is used to include all in-memory messages in the IApplicationTranscript

The next section provides detail to each of the individual objects referenced in the workflow above.

Available Objects

You can get any one of these objects from the IApplication in several ways: by calling GetSingle<TService>() (safe in all situations), by calling GetInstance<TService>() (safe only in stage 3 or later) or, in almost all cases, via a dedicated method that starts with "Get" and ends in the service name.

The example below shows how to get the ICommandSetManager2 if you need it.

application.GetCommandSetManager();
application.GetSingle<ICommandSetManager>(); // Prefer the one above
application.GetInstance<ICommandSetManager>();

All three calls return the exact same object, though. The first two from the poor-man's IOC; the last from the real IOC.

Only applications that need access to low-level objects or need to mess around in stage 2 need to know which objects are available where and when. Most applications don't care and will just always use GetInstance().

The objects in the poor-man's IOC are listed below.

Core

  • IValueTools: converts values; used by the command-line parser, mostly to translate enumerate values and flags
  • ILocationManager: an object that manages aliases for file-system locations, like "Configuration", from which configuration files should be loaded or "UserConfiguration" where user-specific overlay configuration files are stored; used by the configuration loader
  • ILogger: a reference to the main logger for the application
  • IInMemoryLogger: a reference to an in-memory message store for the logger (used by the ApplicationManager to retrieve the message log from a crashed application)
  • IMessageFormatter: a reference to the object that formats messages for the logger

Command line

  • ICommandSetManager: sets the schema for a command line; used by the command-line parser
  • ICommandProcessingResult: contains the result of having processed the command line
  • ICommandLineSettings: defines the properties needed to process the command line (e.g. the Arguments and CommandLineConfigurationFilename, which indicates the optional filename to use for configuration in addition to the standard ones)

Configuration

  • IConfigurationDataSettings: defines the ConfigurationData which is the hierarchical representation of all configuration data for the application as well as the MainConfigurationFilename from which this data is read; used by the configuration-loader
  • ITextValueNodeReader: the object that knows how to read ConfigurationData from the file formats supported by the application3; used by the configuration-loader

Run

  • IRunSettings: an object that manages the RunMode ("release" or "debug"), which can be set from the command line and is used by the ApplicationManager to determine whether to use global exception-handling
  • IApplicationManagerPreRunFinalizer: a reference to an object that applies any options from the command line before the decision of whether to execute in release or debug mode is taken.
  • IApplicationCrashReporter: used by the ApplicationManager in the code surrounding the entire application execution and therefore not guaranteed to have a usable IOC available
  • IApplicationDescription: used together with the ILocationManager to set application-specific aliases to user-configuration folders (e.g. AppData\{CompanyTitle}\{ApplicationTitle})
  • IApplicationTranscript: an object that records the last result of having run the application; returned by the ApplicationManager after Run() has completed, but also available through the application object returned by CreateAndStartUp() to indicate the state of the application after startup.

Each of these objects has a very compact interface and has a single responsibility. An application can easily replace any of these objects by calling UseSingle() during stage 1 or 2. This call sets the object in both the poor-man's IOC as well as the real one. For those rare cases where a non-IOC singleton needs to be set after the IOC has been finalized, the application can call SetSingle(), which does not touch the IOC. This feature is currently used only to set the IApplicationTranscript, which needs to happen even after the IOC registration is complete.



  1. Two large customer solutions, two medium-sized internal solutions (Punchclock and JobVortex) as well as the Demo/Sandbox solution. These solutions include the gamut of application types:

    * 3 ASP.NET MVC applications
    * 2 ASP.NET WebAPI applications
    * 2 Windows services
    * 3 Winform/DevExpress applications
    * 2 Winform/DevExpress utilities
    * 4 Console applications and utilities
    

  2. I originally used ITextValueNodeReader as an example, but that's one case where the recommended call doesn't match 1-to-1 with the interface name.

application.GetSingle<ITextValueNodeReader>();
application.GetInstance<ITextValueNodeReader>();
application.GetConfigurationDataReader(); // Recommended

  3. Currently only XML, but JSON is on the way when someone gets a free afternoon.

`IServer`: converting hierarchy to composition

Quino has long included support for connecting to an application server instead of connecting directly to databases or other sources. The application server uses the same model as the client and provides modeled services (application-specific) as well as CRUD for non-modeled data interactions.

We wrote the first version of the server in 2008. Since then, it's acquired better authentication and authorization capabilities as well as routing and state-handling. We've always based it on the .NET HttpListener.

Old and Busted

As late as Quino 2.0-beta2 (which we had deployed in production environments already), the server hierarchy looked like screenshot below, pulled from issue QNO-4927:

[Image: the old server class hierarchy]

This screenshot was captured after a few unneeded interfaces had already been removed. As you can see by the class names, we'd struggled heroically to deal with the complexity that arises when you use inheritance rather than composition.

The state-handling was welded onto an authentication-enabled server, and the base machinery for supporting authentication was spread across three hierarchy layers. The hierarchy only hints at composition in its naming: the "Stateful" part of the class name CoreStatefulHttpServerBase<TState> had already been moved to a state provider and a state creator in previous versions. That support is unchanged in the 2.0 version.

Implementation Layers

We mentioned above that implementation was "spread across three hierarchy layers". There's nothing wrong with that, in principle. In fact, it's a good idea to encapsulate higher-level patterns in a layer that doesn't introduce too many dependencies and to introduce dependencies only in the layers that need them. This allows applications not only to use a common implementation without pulling in unwanted dependencies, but also to profit from the common tests that ensure the components work as advertised.

In Quino, the following three layers are present in many components:

  1. Abstract: a basic encapsulation of a pattern with almost no dependencies (generally just Encodo.Core).
  2. Standard: a functional implementation of the abstract pattern with dependencies on non-metadata assemblies (e.g. Encodo.Application, Encodo.Connections and so on)
  3. Quino: an enhancement of the standard implementation that makes use of metadata to fill in implementation left abstract in the previous layer. Dependencies can include any of the Quino framework assemblies (e.g. Quino.Meta, Quino.Application and so on).

The New Hotness1

The diagram below shows the new hotness in Quino 2.2

[Image: the new, flattened server type hierarchy]

The hierarchy is now extremely flat. There is an IServer interface and a Server implementation, both generic in TListener, of type IServerListener. The server manages a single instance of an IServerListener.

The listener, in turn, has an IHttpServerRequestHandler, the main implementation of which uses an IHttpServerAuthenticator.

As mentioned above, the IServerStateProvider is included in this diagram, but is unchanged from Quino 2.0-beta3, except that it is now used by the request handler rather than directly by the server.

You can see how the abstract layer is enhanced by an HTTP-specific layer (the Encodo.Server.Http namespace) and how the metadata-specific layer is nicely encapsulated in three classes in the Quino.Server assembly.

Server Components and Flow

This type hierarchy has decoupled the main elements of the workflow of handling requests for a server:

  • The server manages listeners (currently a single listener), created by a listener factory
  • The listener, in turn, dispatches requests to the request handler
  • The request handler uses the route handler to figure out where to direct the request
  • The route handler uses a registry to map requests to response items
  • The request handler asks the state provider for the state for the given request
  • The state provider checks its cache for the state (the default support uses persistent states to cache sessions for a limited time); if not found, it creates a new one
  • Finally, the request handler checks whether the user for the request is authenticated and/or authorized to execute the action and, if so, executes the response items

It is important to note that this behavior is unchanged from the previous version -- it's just that now each step is encapsulated in its own component. The components are small and easily replaced, with clear and concise interfaces.

Note also that the current implementation of the request handler is for HTTP servers only. Should the need arise, however, it would be relatively easy to abstract away the HttpListener dependency and generalize most of the logic in the request handler for any kind of server, regardless of protocol and networking implementation. Only the request handler is affected by the HTTP dependency, though: authentication, state-provision and listener-management can all be re-used as-is.

Also of note is that the only full-fledged implementation is for metadata-based applications. At the bottom of the diagram, you can see the metadata-specific implementations for the route registry, state provider and authenticator. This is reflected in the standard registration in the IOC.

These are the service registrations from Encodo.Server:

return handler
  .RegisterSingle<IServerSettings, ServerSettings>()
  .RegisterSingle<IServerListenerFactory<HttpServerListener>, HttpServerListenerFactory>()
  .Register<IServer, Server<HttpServerListener>>();

And these are the service registrations from Quino.Server:

handler
  .RegisterSingle<IServerRouteRegistry<IMetaServerState>, StandardMetaServerRouteRegistry>()
  .RegisterSingle<IServerStateProvider<IMetaServerState>, MetaPersistentServerStateProvider>()
  .RegisterSingle<IServerStateCreator<IMetaServerState>, MetaServerStateCreator>()
  .RegisterSingle<IHttpServerAuthenticator<IMetaServerState>, MetaHttpServerAuthenticator>()
  .RegisterSingle<IHttpServerRequestHandler, HttpServerRequestHandler<IMetaServerState>>();

As you can see, the registration is extremely fine-grained and allows very precise customization as well as easy mocking and testing.
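
For example, a test fixture could swap out just the authenticator while leaving the rest of the pipeline untouched (FakeAuthenticator is a hypothetical test double):

handler.RegisterSingle<IHttpServerAuthenticator<IMetaServerState>, FakeAuthenticator>();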



  1. Any Men in Black fans out there? Tommy Lee Jones was "old and busted" while Will Smith was "the new hotness"? No? Just me? All righty then...

  2. This diagram brought to you by the diagramming and architecture tools in ReSharper 9.2. Just select the files or assemblies you want to diagram in the Solution Explorer and choose the option to show them in a diagram. You can right-click any type or assembly to show dependent or referenced modules or types. For type diagrams, you can easily control which relationships are to be shown (e.g. I hide aggregations to avoid clutter) and how the elements are to be grouped (e.g. I grouped by namespace to include the boxes in my diagram).

Iterating with NDepend to remove cyclic dependencies (Part II)

In the previous article, we discussed the task of Splitting up assemblies in Quino using NDepend. In this article, I'll discuss both the high-level and low-level workflows I used with NDepend to efficiently clear up these cycles.

Please note that what follows is a description of how I have used the tool -- so far -- to get my very specific tasks accomplished. If you're looking to solve other problems or want to solve the same problems more efficiently, you should take a look at the official NDepend documentation.

What were we doing?

To recap briefly: we are reducing dependencies among top-level namespaces in two large assemblies, in order to be able to split them up into multiple assemblies. The resulting assemblies will have dependencies on each other, but the idea is to make at least some parts of the Encodo/Quino libraries opt-in.

The plan of attack

On a high-level, I tackled the task in the following loosely defined phases.

Remove direct, root-level dependencies

This is the big first step -- to get rid of the little black boxes. I made NDepend show only direct dependencies at first, to reduce clutter. More on specific techniques below.

Remove indirect dependencies

Crank up the magnification to show indirect dependencies as well. This will help you root out the remaining cycles, which can be trickier if you're not showing enough detail. On the other hand, if you turn on indirect dependencies too soon, you'll be overwhelmed by darkness (see the depressing initial state of the Encodo assembly to the right).

Examine dependencies between root-level namespaces

Even once you've gotten rid of all cycles, you may still have unwanted dependencies that hinder splitting namespaces into the desired constellation of assemblies.

For example, the plan is to split all logging and message-recording into an assembly called Encodo.Logging. However, the IRecorder interface (with a single method, Log()) is used practically everywhere. It quickly becomes necessary to split interfaces and implementation -- with many more potential dependencies -- into two assemblies for some very central interfaces and support classes. In this specific case, I moved IRecorder to Encodo.Core.

Even after you've conquered the black hole, you might still have quite a bit of work to do. Never fear, though: NDepend is there to help root out those dependencies as well.

Examine cycles in non-root namespaces

Because we can split off smaller assemblies regardless, these dependencies are less important to clean up for our current purposes. However, once this code is packed into its own assembly, its namespaces become root namespaces of their own and -- voila! you have more potentially nasty dependencies to deal with. Granted, the problem is less severe because you're dealing with a logically smaller component.

In Quino, we use non-root namespaces more for organization and less for defining components. Still, cycles are cycles and they're worth examining and at least plucking the low-hanging fruit.

Removing root-level namespace cycles

With the high-level plan described above in hand, I repeated the following steps for the many dependencies I had to untangle. Don't despair if it looks like your library has a ton of unwanted dependencies. If you're smart about the ones you untangle first, you can make excellent -- and, most importantly, rewarding -- progress relatively quickly.1

  1. Show the dependency matrix
  2. Choose the same assembly in the row and column
  3. Choose a square that's black
  4. Click the name of the namespace in the column to show sub-namespaces
  5. Do the same in a row
  6. Keep zooming until you can see where there are dependencies that you don't want
  7. Refactor/compile/run NDepend analysis to show changes
  8. GOTO 1

Once again, with pictures!

The high-level plan of attack sounded interesting, but might have left you cold with its abstraction. Then there was the promise of detail with a focus on root-level namespaces, but alas, you might still be left wondering how, exactly, you're supposed to reduce these much-hated cycles.

I took some screenshots as I worked on Quino, to document my process and point out parts of NDepend I thought were eminently helpful.

Show only namespaces

I mentioned above that you should "[k]eep zooming in", but how do you do that? A good first step is to zoom all the way out and show only direct namespace dependencies. This focuses only on using references instead of the much more frequent member accesses. In addition, I changed the default setting to show dependencies in only one direction -- when a column references a row (blue), but not vice versa (green).

As you can see, the diagrams are considerably less busy than the one shown above. Here, we can see a few black spots that indicate cycles, but it's not so many as to be overwhelming.2 You can hover over the offending squares to show more detail in a popup.

Show members

If you don't see any more cycles between namespaces, switch the detail level to "Members". Another very useful feature is "Bind Matrix", which forces the columns and rows to be shown in the same order and concentrates the cycles in a smaller area of the matrix.

As you can see in the diagram, NDepend then highlights the offending area and you can even click the upper-left corner to focus the matrix only on that particular cycle.

Drill down to classes

Once you're looking at members, it isn't enough to know just the namespaces involved -- you need to know which types are referencing which types. The powerful matrix view lets you drill down through namespaces to show classes as well.

If your classes are large -- another no-no, but one thing at a time -- then you can drill down to show which method is calling which method to create the cycle. In the screenshot to the right, you can see where I had to do just that in order to finally figure out what was going on.

In that screenshot, you can also see something that I only discovered after using the tool for a while: the direction of usage is indicated with an arrow. You can turn off the tooltips -- which are informative, but can be distracting for this task -- and you don't have to remember which color (blue or green) corresponds to which direction of usage.

Indirect dependencies

Once you've drilled your way down from namespaces-only to showing member dependencies, to focusing on classes, and even members, your diagram should be shaping up quite well.

On the right, you'll see a diagram of all direct dependencies for the remaining area with a problem. You don't see any black boxes, which means that all direct dependencies are gone. So we have to turn up the power of our microscope further to show indirect dependencies.

On the left, you can see that the scary, scary black hole from the start of our journey has been whittled down to a small, black spot. And that's with all direct and indirect dependencies as well as both directions of usage turned on (i.e. the green boxes are back). This picture is much more pleasing, no?

Queries and graphs

For the last cluster of indirect dependencies shown above, I had to unpack another feature: NDepend queries. You can select any element and run a query to show using/used-by assemblies/namespaces.3 The results are shown in a panel, where you can edit the query and see live updates immediately.
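
The queries are written in CQLinq, which is just LINQ over NDepend's code model. As a rough, from-memory sketch -- the exact property names may differ in the current CQLinq API -- a query that lists what a given namespace uses and what uses it looks something like this:

from n in Application.Namespaces
where n.Name == "Encodo.Expressions"
select new { n, Using = n.NamespacesUsed, UsedBy = n.NamespacesUsingMe }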

Even with a highly zoomed-in view on the cycle, I still couldn't see the problem, so I took NDepend's suggestion and generated a graph of the final indirect dependency between Culture and Enums (through Expression). At this zoom level, the graph becomes more useful (for me) and illuminates problems that remain muddy in the matrix (see right).

Crossing the finish line

In order to finish the job efficiently, here are a handful of miscellaneous tips that are useful, but didn't fit into the guide above.


  • I set NDepend to automatically re-run an analysis on a successful build. The matrix updates automatically to reflect changes from the last analysis and won't lose your place.
  • If you have ReSharper, you'll generally be able to tell whether you've fixed the dependencies because the usings will be grayed out in the offending file. You can make several fixes at once before rebuilding and rerunning the analysis.
  • At higher zoom levels (e.g. having drilled down to methods), it is useful to toggle display of row dependencies back on because the dependency issue is only clear when you see the one green box in a sea of blue.
  • Though Matrix Binding is useful for localizing, remember to toggle it off when you want to drill down in the row independently of the namespace selected in the column.

And BOOM! just like that4, phase 1 (root namespaces) for Encodo was complete! Now, on to Quino.dll...

Conclusion

Depending on what shape your library is in, do not underestimate the work involved. Even with NDepend riding shotgun and barking out the course like a rally navigator, you still have to actually make the changes. That means lots of refactoring, lots of building, lots of analysis, lots of running tests and lots of reviews of at-times quite-sweeping changes to your code base. The destination is worth the journey, but do not embark on it lightly -- and don't forget to bring the right tools.5



  1. This can be a bit distracting: you might get stuck trying to figure out which of all these offenders to fix first.

  2. I'm also happy to report that my initial forays into maintaining a relatively clean library -- as opposed to cleaning it -- with NDepend have been quite efficient.

  3. And much more: I don't think I've even scratched the surface of the analysis and reporting capabilities offered by this ability to directly query the dependency data.

  4. I'm just kidding. It was a lot of time-consuming work.

  5. In this case, in case it's not clear: NDepend for analysis and good ol' ReSharper for refactoring. And ReSharper's new(ish) architecture view is also quite good, though not even close to detailed enough to replace NDepend: it shows assembly-level dependencies only.

Splitting up assemblies in Quino using NDepend (Part I)

A lot of work has been put into Quino 2.01, with almost no stone left unturned. Almost every subsystem has been refactored and simplified, including but not limited to the data driver, the schema migration, generated code and metadata, model-building, security and authentication, service-application support and, of course, configuration and execution.

Two of the finishing touches before releasing 2.0 are to reorganize all of the code into a more coherent namespace structure and to reduce the size of the two monolithic assemblies: Encodo and Quino.

A Step Back

The first thing to establish is: why are we doing this? Why do we want to reduce dependencies and reduce the size of our assemblies? There are several reasons, but a major reason is to improve the discoverability of patterns and types in Quino. Two giant assemblies are not inviting -- they are, in fact, daunting. Replace these assemblies with dozens of smaller ones and users of your framework will be more likely to (A) find what they're looking for on their own and (B) build their own extensions with the correct dependencies and patterns. Neither of these is guaranteed, but smaller modules are a great start.

Another big reason is portability. .NET Core was released as open-source software some time ago, and more and more .NET source code is added to it each day. There are portable targets, non-Windows targets, Universal-build targets and much more. It makes sense to split code up into highly portable units with as few dependencies as possible. That is, the dependencies should be explicit and intended.

Not only that, but NuGet packaging has come to the fore more than ever. Quino was originally designed to keep third-party boundaries clear, but we wanted to make it as easy as possible to use Quino. Just include Encodo and Quino and off you went. However, with NuGet, you can now say you want to use Quino.Standard and you'll get Quino.Core, Encodo.Core, Encodo.Services.SimpleInjector, Quino.Services.SimpleInjector and other packages.

With so much interesting code in the Quino framework, we want to make it available as much as possible not only for our internal projects but also for customer projects where appropriate and, also, possibly for open-source distribution.

NDepend

I've used NDepend before2 to clean up dependencies. However, the last analysis I did about a year ago showed quite deep problems3 that needed to be addressed before any further dependency analysis could bear fruit at all. With that work finally out of the way, I'm ready to re-engage with NDepend and see where we stand with Quino.

As luck would have it, NDepend is in version 6, released at the start of summer 2015. As was the case last year, NDepend has generously provided me with an upgrade license to allow me to test and evaluate the new version with a sizable and real-world project.

Here is some of the feedback I sent to NDepend:

I really, really like the depth of insight NDepend gives me into my code. I find myself thinking "SOLID" much more often when I have NDepend shaking its head sadly at me, tsk-tsking at all of the dependency snarls I've managed to build.

  • It's fast and super-reliable. I can work these checks into my workflow relatively easily.
  • I'm using the matrix view a lot more than the graphs because even NDepend recommends I don't use a graph for the number of namespaces/classes I'm usually looking at
  • Where the graph view is super-useful is for examining indirect dependencies, which are harder to decipher with the matrix
  • I've found so many silly mistakes/lazy decisions that would lead to confusion for developers new to my framework
  • I'm spending so much time with it and documenting my experiences because I want more people at my company to use it
  • I haven't even scratched the surface of the warnings/errors but want to get to that, as well (the Dashboard tells me of 71 rules violated; 9 critical; I'm afraid to look :-)

Use Cases

Before I get more in-depth with NDepend, please note that there are at least two main use cases for this tool4:

  1. Clean up a project or solution that has never had a professional dependency checkup
  2. Analyze and maintain separation and architectural layers in a project or solution

These two use cases are vastly different. The first is like cleaning a gas-station bathroom for the first time in years; the second is more like the weekly once-over you give your bathroom at home. The tools you'll need for the two jobs are similar, but quite different in scope and power. The same goes for NDepend: how you'll use it to claw your way back to architectural purity is different than how you'll use it to occasionally clean up an already mostly-clean project.

Quino is much better than it was the last time we peeked under the covers with NDepend, but we're still going to need a bucket of industrial cleaner before we're done.5

The first step is to make sure that you're analyzing the correct assemblies. Show the project properties to see which assemblies are included. You should remove all assemblies from consideration that don't currently interest you (especially if your library is not quite up to snuff, dependency-wise; afterwards, you can leave as many clean assemblies in the list as you like).6

Industrial-strength cleaner for Quino

Running an analysis with NDepend 6 generates a nice report, which includes the following initial dependency graph for the assemblies.


As you can see, Encodo and Quino depend only on system assemblies, but there are components that pull in other references where they might not be needed. The initial dependency matrices for Encodo and Quino both look much better than they did when I last generated one. The images below show what we have to work with in the Encodo and Quino assemblies.


It's not as terrible as I've made out, right? There is far less namespace-nesting, so it's much easier to see where the bidirectional dependencies are. There are only a handful of cyclic dependencies in each library, with Encodo edging out Quino because of (A) the nature of the code and (B) the extra effort I'd already put into Encodo.

I'm not particularly surprised to see that this is relatively clean because we've put effort into keeping the external dependencies low. It's the internal dependencies in Encodo and Quino that we want to reduce.

Small and Focused Assemblies


The goal, as stated in the title of this article, is to split Encodo and Quino into separate assemblies. While removing cyclic dependencies is required for such an operation, it's not sufficient. Even without cycles, a given assembly may still be too dependent on other assemblies.

Before going any farther, I'm going to list the assemblies we'd like to have. By "like to have", I mean the list that we'd originally planned plus a few more that we added while doing the actual splitting.7 The images on the right show the assemblies in Encodo, Quino and a partial overview of the dependency graph (calculated with the ReSharper Architecture overview rather than with NDepend, just for variety).

Of these, the following assemblies and their dependencies are of particular interest8:

  • Encodo.Core: System dependencies only
  • Encodo.Application: basic application support9
  • Encodo.Application.Standard: configuration methods for non-metadata applications that don't want to pick and choose packages/assemblies
  • Encodo.Expressions: depends only on Encodo.Core
  • Quino.Meta: depends only on Encodo.Core and Encodo.Expressions
  • Quino.Meta.Standard: Optional, but useful metadata extensions
  • Quino.Application: depends only on Encodo.Application and Quino.Meta
  • Quino.Application.Standard: configuration methods for metadata applications that don't want to pick and choose packages/assemblies
  • Quino.Data: depends on Quino.Application and some Encodo.* assemblies
  • Quino.Schema: depends on Quino.Data

This seems like a good spot to stop, before getting into the nitty-gritty detail of how we used NDepend in practice. In the next article, I'll discuss both the high-level and low-level workflows I used with NDepend to efficiently clear up these cycles. Stay tuned!


Articles about design:

    * [Encodos configuration library for Quino: part I](/blogs/developer-blogs/encodos-configuration-library-for-quino-part-i/)
    * [Encodos configuration library for Quino: part II](/blogs/developer-blogs/encodos-configuration-library-for-quino-part-ii/)
    * [Encodos configuration library for Quino: part III](/blogs/developer-blogs/encodos-configuration-library-for-quino-part-iii/)
    * [API Design: Running an Application (Part I)](/blogs/developer-blogs/api-design-running-an-application-part-i/)
    * [API Design: To Generic or not Generic? (Part II)](/blogs/developer-blogs/api-design-to-generic-or-not-generic-part-ii/)


  1. Release notes for 2.0 betas:

    * [v2.0-beta1: Configuration, services and web](/blogs/developer-blogs/v20-beta1-configuration-services-and-web/)
    * [v2.0-beta2: Code generation, IOC and configuration](/blogs/developer-blogs/v20-beta2-code-generation-ioc-and-configuration/)
    

  2. I published a two-parter in August and November of 2014.

    * [The Road to Quino 2.0: Maintaining architecture with NDepend (part I)](/blogs/developer-blogs/the-road-to-quino-20-maintaining-architecture-with-ndepend-part-i/)
    * [The Road to Quino 2.0: Maintaining architecture with NDepend (part II)](/blogs/developer-blogs/the-road-to-quino-20-maintaining-architecture-with-ndepend-part-ii/)
    

  3. You can see a lot of the issues associated with these changes in the release notes for Quino 2.0-beta1 (mostly the first point in the "Highlights" section) and Quino 2.0-beta2 (pretty much all of the points in the "Highlights" section).

  4. I'm sure there are more, but those are the ones I can think of that would apply to my project (for now).

  5. ...to stretch the gas-station metaphor even further.

  6. Here I'm going to give you a tip that confused me for a while, but that I think was due to particularly bad luck and is actually quite a rare occurrence. If you already see the correct assemblies in the list, you should still check that NDepend picked up the right paths. That is, if you haven't followed the advice in NDepend's white paper and still have a different `bin` folder for each assembly, you may see something like the following in the tooltip when you hover over the assembly name: "Several valid .NET assemblies with the name have been found. They all have the same version. The one with the biggest file has been chosen." If NDepend has accidentally found an older copy of your assembly, you must delete that assembly. Even if you add an assembly directly, NDepend will not honor the path from which you added it. This isn't as bad as it sounds, since it's a very strange constellation of circumstances that led to this assembly hanging around anyway:

    * The project is no longer included in the latest Quino but lingers in my workspace
    * The version number is unfortunately the same, even though the assembly is wildly out of date

    I only noticed because I knew I didn't have that many dependency cycles left in the Encodo assembly.

  7. Especially for larger libraries like Quino, you'll find that your expectations about dependencies between modules will be largely correct, but there will still be gossamer filaments connecting them that prevent a clean split. In those cases, we just created new assemblies to hold these common dependencies. Once an initial split is complete, we'll iterate and refactor to reduce some of these ad-hoc assemblies.

  8. Screenshots, names and dependencies are based on a pre-release version of Quino, so while the likelihood is small, everything is subject to change.

  9. Stay tuned for an upcoming post on the details of starting up an application, which is the support provided in Encodo.Application.

API Design: To Generic or not Generic? (Part II)

In this article, I'm going to continue the discussion started in Part I, where we laid some groundwork about the state machine that is the startup/execution/shutdown feature of Quino. As we discussed, this part of the API still suffers from "several places where generic TApplication parameters [are] cluttering the API". In this article, we'll take a closer look at different design approaches to this concrete example -- and see how we decided whether to use generic type parameters.

Consistency through Patterns and API

Any decision you take with a non-trivial API is going to involve several stakeholders and aspects. It's often not easy to decide which path is best for your stakeholders and your product.


For any API you design, consider how others are likely to extend it -- and whether your pattern is likely to deteriorate from neglect. Even a very clever solution has to be balanced with simplicity and elegance if it is to have a hope in hell of being used and standing the test of time.

In Quino 2.0, the focus has been on ruthlessly eradicating properties on the IApplication interface as well as getting rid of the descendant interfaces, ICoreApplication and IMetaApplication. Because Quino now uses a pattern of placing sub-objects in the IOC associated with an IApplication, there is far less need for a generic TApplication parameter in the rest of the framework. See Encodos configuration library for Quino: part I for more information and examples.

This focus raised an API-design question: if we no longer want descendant interfaces, should we eliminate parameters generic in that interface? Or should we continue to support generic parameters for applications so that the caller will always get back the type of application that was passed in?

Before getting too far into the weeds1, let's look at a few concrete examples to illustrate the issue.

Do Fluent APIs require generic return-parameters?

As discussed in Encodos configuration library for Quino: part III in detail, Quino applications are configured with the "Use*" pattern, where the caller includes functionality in an application by calling methods like UseRemoteServer() or UseCommandLine(). The latest version of this API pattern in Quino recommends returning the application that was passed in to allow chaining and fluent configuration.

For example, the following code chains the aforementioned methods together without creating a local variable or other clutter.

return new CodeGeneratorApplication().UseRemoteServer().UseCommandLine();

What should the return type of such standard configuration operations be? Taking a method above as an example, it could be defined as follows:

public static IApplication UseCommandLine(this IApplication application, string[] args) { ... }

This seems like it would work fine, but the original type of the application that was passed in is lost, which is not exactly in keeping with the fluent style. In order to maintain the type, we could define the method as follows:

public static TApplication UseCommandLine<TApplication>(this TApplication application, string[] args)
  where TApplication : IApplication
{ ... }

This style is not as succinct but has the advantage that the caller loses no type information. On the other hand, it's more work to define methods in this way, and there is a strong likelihood that many such methods will simply be written in the style of the first example.


Why would other coders do that? Because it's easier to write code without generics, and because the stronger result type is not needed in 99% of the cases. If every configuration method expects and returns an IApplication, then the stronger type will never come into play. If the compiler isn't going to complain, you can expect a higher rate of entropy in your API right out of the gate.

One way the more-derived type would come in handy is if the caller wanted to define the application-creation method with their own type as a result, as shown below:

private static CodeGeneratorApplication CreateApplication()
{
  return new CodeGeneratorApplication().UseRemoteServer().UseCommandLine();
}

If the library methods expect and return IApplication values, the result of UseCommandLine() will be IApplication and requires a cast to be used as defined above. If the library methods are defined generic in TApplication, then everything works as written above.
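
To make the difference concrete, this is roughly what the caller is forced to write when the extension methods are typed against IApplication (a sketch based on the methods above):

private static CodeGeneratorApplication CreateApplication()
{
  // The IApplication-typed chain loses the original type, so the caller casts it back.
  return (CodeGeneratorApplication)new CodeGeneratorApplication().UseRemoteServer().UseCommandLine();
}

With the generic overloads, the cast disappears and the method compiles as originally written.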

This is definitely an advantage, in that the user gets the exact type back that they created. Generics definitely offer advantages, but it remains to be seen how much those advantages are worth.2

Another example: The IApplicationManager

Before we examine the pros and cons further, let's look at another example.

In Quino 1.x, applications were created directly by the client program and passed into the framework. In Quino 2.x, the IApplicationManager is responsible for creating and executing applications. A caller passes in two functions: one to create an application and another to execute an application.

A standard application startup looks like this:

new ApplicationManager().Run(CreateApplication, RunApplication);3

Generic types can trigger an avalanche of generic parameters(tm) throughout your code.

The question is: what should the types of the two function parameters be? Does CreateApplication return an IApplication or a caller-specific derived type? What is the type of the application parameter passed to RunApplication? Also IApplication? Or the more derived type returned by CreateApplication?

As with the previous example, if the IApplicationManager is to return a derived type, then it must be generic in TApplication and both function parameters will be generically typed as well. These generic types will trigger an avalanche of generic parameters(tm) throughout the other extension methods, interfaces and classes involved in initializing and executing applications.

That sounds horrible. This sounds like a pretty easy decision. Why are we even considering the alternative? Well, because it can be very advantageous if the application can declare RunApplication with a strictly typed signature, as shown below.

private static void RunApplication(CodeGeneratorApplication application) { ... }

Neat, right? I've got my very own type back.

Where Generics Go off the Rails

However, if the IApplicationManager is to call this function, then the signatures of CreateAndStartUp() and Run() have to be generic, as shown below.

TApplication CreateAndStartUp<TApplication>(
  Func<IApplicationCreationSettings, TApplication> createApplication
)
 where TApplication : IApplication;

IApplicationExecutionTranscript Run<TApplication>(
  Func<IApplicationCreationSettings, TApplication> createApplication,
  Action<TApplication> run
)
  where TApplication : IApplication;

These are quite messy -- and kinda scary -- signatures.3 If these core methods are already so complex, any other methods involved in startup and execution would have to be equally complex -- including helper methods created by calling applications.4
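
To see how that plays out, here is a hedged sketch of the kind of helper a calling application ends up writing; RunWithLogging is invented, and the generic Run() is assumed to live on IApplicationManager as in the signatures above:

private static IApplicationExecutionTranscript RunWithLogging<TApplication>(
  IApplicationManager manager,
  Func<IApplicationCreationSettings, TApplication> createApplication,
  Action<TApplication> run
)
  where TApplication : IApplication
{
  // Even a trivial wrapper has to repeat the generic parameter and its constraint.
  return manager.Run(createApplication, run);
}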

The advantage here is that the caller will always get back the type of application that was created. The compiler guarantees it. The caller is not obliged to cast an IApplication back up to the original type. The disadvantage is that all of the library code is infected by a generic parameter with its attendant IApplication generic constraint.5

Don't add Support for Conflicting Patterns

The title of this section seems pretty self-explanatory, but we as designers must remain vigilant against the siren call of what seems like a really elegant and strictly typed solution.


The generics above establish a pattern that must be adhered to by subsequent extenders and implementors. And to what end? So that a caller can attach properties to an application and access those in a statically typed manner, i.e. without casting?

But aren't properties on an application exactly what we just worked so hard to eliminate? Isn't the recommended pattern to create a "settings" object and add it to the IOC instead? That is, as of Quino 2.0, you get an IApplication and obtain the desired settings from its IOC. Technically, the cast is still taking place in the IOC somewhere, but that seems somehow less bad than a direct cast.
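
As a hedged illustration of that recommendation -- the method and type names below are invented stand-ins, not the actual Quino API -- the caller resolves a settings object from the application's IOC instead of reading a property on a derived application class:

// CodeGeneratorSettings and GetInstance<T>() are hypothetical; only the pattern
// (resolve a settings object from the application's IOC) comes from the text.
public class CodeGeneratorSettings
{
  public string OutputFolder { get; set; }
}

private static string GetOutputFolder(IApplication application)
{
  var settings = application.GetInstance<CodeGeneratorSettings>();

  return settings.OutputFolder;
}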

If the framework recommends that users don't add properties to an application -- and ruthlessly eliminated all standard properties and descendants -- then why would the framework turn around and add support -- at considerable cost in maintenance, readability and extensibility -- for callers that expect a certain type of application?

Wrapping up

Let's take a look at the non-generic implementation and see what we lose or gain. The final version of the IApplicationManager API is shown below, which properly balances the concerns of all stakeholders and hopefully will stand the test of time (or at least last until the next major revision).

IApplication CreateAndStartUp(
  Func<IApplicationCreationSettings, IApplication> createApplication
);

IApplicationExecutionTranscript Run(
  Func<IApplicationCreationSettings, IApplication> createApplication,
  Action<IApplication> run
);
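
With this API, a caller that really does need its own application type simply casts inside its run callback -- a sketch, reusing the CodeGeneratorApplication example from earlier (GenerateCode() is invented):

private static void RunApplication(IApplication application)
{
  // The framework hands back IApplication; the caller takes on the cast itself
  // instead of the framework carrying TApplication through every signature.
  var codeGeneratorApplication = (CodeGeneratorApplication)application;

  codeGeneratorApplication.GenerateCode();
}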

These are the hard questions of API design: ensuring consistency, enforcing intent and balancing simplicity and cleanliness of code with expressiveness.



  1. A predilection of mine, I'll admit, especially when writing about a topic about which I've thought quite a lot. In those cases, the instinct to just skip "the object" and move on to the esoteric details that stand in the way of an elegant, perfect solution, is very, very strong.

  2. This more-realized typing was so attractive that we used it in many places in Quino without properly weighing the consequences. This article is the result of reconsidering that decision.

  3. Yes, the C# compiler will allow you to elide generics for most method calls (so long as the compiler can determine the types of the parameters without it). However, generics cannot be removed from constructor calls. These must always specify all generic parameters, which makes for messier-looking, lengthy code in the caller, e.g. when creating the ApplicationManager were it to have been defined with generic parameters. Yet another thing to consider when choosing how to define your API.

  4. As already mentioned elsewhere (but it bears repeating): callers can, of course, eschew the generic types and use IApplication everywhere -- and most probably will, because the advantage offered by making everything generic is vanishingly small. If your API looks this scary, entropy will eat it alive before the end of the week, to say nothing of its surviving to the next major version.

  5. A more subtle issue that arises is if you do end up -- even accidentally -- mixing generic and non-generic calls (i.e. using IApplication as the extended parameter in some cases and TApplication in others). This issue is in how the application object is registered in the IOC. During development, when the framework was still using generics everywhere (or almost everywhere), some parts of the code were retrieving a reference to the application using the most-derived type whereas the application had been registered in the container as a singleton using IApplication. The call to retrieve the most derived type returned a new instance of the application rather than the pre-registered singleton, which was a subtle and difficult bug to track down.