Quick Links

Recent Changes

    17.10.2019 - Azure Linked Accounts and SSH Keys

    Azure DevOps allows you to link multiple accounts.

    Our concrete use case was:

    • User U1 was registered with an Azure DevOps organization O1
    • Microsoft did some internal management and gave our partner account a new organization O2, complete with new accounts for all users. Now I have user U2 as well, registered with O2.
    • U2 was unable to take tests to qualify for partner benefits, so I had to use U1 but link the accounts so that those test results accrued to O2 as well as O1.
    • We want to start phasing out our users from O1, so we wanted to remove U1 from O1 and add U2

    Are we clear so far? U1 and U2 are linked because reasons. U1 is old and busted; U2 is the new hotness.

    The linking has an unexpected side-effect when managing SSH keys. If you have an SSH key registered with one of the linked accounts, you cannot register an SSH key with the same signature with any of the other accounts.

    This is somewhat understandable (I guess), but while the error message indicates that you have a duplicate, it doesn't tell you that the duplicate is in another account. When you check the account that you're using and see no other SSH keys registered, it's more than a little confusing.

    Not only that, but if the user to which you've added the SSH key has been removed from the organization, it isn't at all obvious how you're supposed to access your SSH key settings for an account that no longer has access to Azure DevOps (in order to remove the SSH key).

    Instead, you're left with an orphan account that's sitting on an SSH key that you'd like to use with a different account.

    So, you could create a new SSH key or you could do the following:

    • Re-add U1 to O1
    • Remove SSH key SSH1 from U1
    • Register SSH key SSH1 with U2
    • Profit

    If you can't add U1 to O1 anymore, then you'll just have to generate and use a new SSH1 key for Azure. It's not an earth-shatteringly bad user experience, but interesting to see how several logical UX decisions led to a place where a couple of IT guys were confused for long minutes.
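    For the record, generating a replacement key is a one-liner. The following sketch assumes OpenSSH on the client; the file name and comment are placeholders, not anything Azure requires:

```shell
# Generate a new 4096-bit RSA key for use with Azure DevOps
# (file name and comment are examples -- use your own)
ssh-keygen -t rsa -b 4096 -C "u2@example.com" -f ~/.ssh/id_rsa_azure -N ""

# Print the public key so you can paste it into
# "SSH public keys" in your Azure DevOps user settings
cat ~/.ssh/id_rsa_azure.pub
```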

    17.10.2019 - Using Git efficiently: SmartGit + BeyondCompare

    I've written about using SmartGit (SG) before1 2 and I still strongly recommend that developers who manage projects use a UI for Git.

    If you're just developing a single issue at a time and can branch, commit changes and make pull requests with your IDE tools, then more power to you. For this kind of limited workflow, you can get away with a limited tool-set without too big of a safety or efficiency penalty.

    However, if you need an overview or need to do more management, then you're going to sacrifice efficiency and possibly correctness if you use only the command line or IDE tools.

    I tend to manage Git repositories, which means I'm in charge of pruning merged or obsolete branches and making sure that everything is merged. A well-rendered log view and overview of branches is indispensable for this kind of work.


    I have been and continue to be a proponent of SmartGit for all Git-related work. It not only has a powerful and intuitive UI, it also supports pull requests, including code comments that integrate with BitBucket, GitLab and GitHub, among others.

    It has a wonderful log view that I now regularly use as my standard view. It's fast and accurate (I almost never have to refresh explicitly to see changes) and I have a quick overview of the workspace, the index and recent commits. I can search for files and easily get individual logs and blame.

    The file-differ has gotten a lot better and has almost achieved parity with my favorite diffing/merging tool Beyond Compare. Almost, but not quite. The difference is still significant enough to justify Beyond Compare's purchase price of $60.

    What's better in Beyond Compare3?


    • While both differs have syntax-highlighting (and the supported file-types seem to be about the same), Beyond Compare distinguishes between significant and insignificant (e.g. comments) changes. It makes it much easier to see whether code or documentation has changed.
    • The intra-line diffing in Beyond Compare is more fine-grained and tends to highlight changes better. SmartGit is catching up in this regard.
    • You can re-align a diff manually using F7. This is helpful if you moved code and want to compare two chunks that the standard diff no longer sees as being comparable


    I could live without the Beyond Compare differ, but not without the merger.

    • The 4-pane view shows left, base and right above as well as the target below, with the target window being editable. Each change has its own color, so you can see afterwards whether you took left, right or made manual changes.
    • The merge view includes a line-by-line differ that shows left, base, right and target lines directly above one another, with a scrollbar for longer lines.
    • SmartGit has two separate windows for base vs. left/right and right/left vs. target. Long lines are really hard to decipher/merge in SmartGit

    Integrate Beyond Compare into SmartGit

    To set up SmartGit to use Beyond Compare:

    1. Select Tools > Diff Tools
    2. Click the "Add..." button
    3. Set File Pattern to *
    4. Select "External diff tool"
    5. Set the command to C:\Program Files (x86)\Beyond Compare 4\BCompare.exe
    6. Set the arguments to "${leftFile}" "${rightFile}"
    7. Select Tools > Conflict Solvers
    8. Select "External Conflict Solver"
    9. Set File Pattern to *
    10. Set the command to C:\Program Files (x86)\Beyond Compare 4\BCompare.exe
    11. Set the arguments to "${leftFile}" "${rightFile}" "${baseFile}" "${mergedFile}"
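    SmartGit aside, the same pairing works from plain command-line Git, which knows Beyond Compare under the built-in tool name bc. The path below assumes the default Windows install location:

```shell
# Use Beyond Compare as the diff tool ("bc" is a tool name Git knows natively)
git config --global diff.tool bc
git config --global difftool.bc.path "C:/Program Files (x86)/Beyond Compare 4/BCompare.exe"

# ...and as the merge tool for conflict resolution
git config --global merge.tool bc
git config --global mergetool.bc.path "C:/Program Files (x86)/Beyond Compare 4/BCompare.exe"

# Then, from any repository:
#   git difftool HEAD~1 -- SomeFile.cs
#   git mergetool
```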

    1. In Git: Managing local commits and branches (December 2016) and Programming in the modern/current age (February 2013)

    2. I am in no way affiliated with SmartGit.

    3. I am in no way affiliated with BeyondCompare.

    17.10.2019 - Visual Studio 2019 Survey

    Visual Studio 2019 (VS) asked me this morning if I was interested in taking a survey to convey my level of satisfaction with the IDE.

    VS displays the survey in an embedded window using IE11.1 I captured the screen of the first thing I saw when I agreed to take the survey.

    I know it's the SurveyMonkey script that's failing, but it's still not an auspicious start.

    1. I'd just upgraded to Windows 10 build 1903, which includes IE 11.418.18362.0. I can't imagine that they didn't test this combination.

    17.10.2019 - Multi-language web sites

    Why are multi-language web sites so hard to make? Even large companies like Microsoft, Google and Apple regularly serve pages with mixed-language content.

    This is probably due to several factors:

    1. Large web sites pull data from myriad sources, including CDNs and caching services. Each source needs to respect the requested language. If a source doesn't have support for a requested language, then just that piece of content will be delivered in the fallback language.
    2. Any proxies have to pass the requested language (and other headers) on to the backing server. If the backing server doesn't get the language request, then it can't respect the requested language, obviously.
    3. Any proxy that caches content has to respect the language header (as well as any other data-relevant headers) instead of just caching one copy per URL. While this is standard for commercial proxies and CDNs, it might not be the case for bespoke software.
    4. Some services might operate in a different context (e.g. a logged-in user, detected via a token in the request) with different language settings than the requesting browser. This would mean that, while the main page content is pulled from the server in one language (e.g. en-US), the content for an embedded block might be requested as a logged-in user who has a different preferred language (e.g. de-CH). The server will likely honor the preferred language of the user account rather than the language included in the request, assuming it even gets the language from the original request.
    5. Finally, some companies1 are notoriously bad at multi-language software because they generally only acknowledge English, consider supporting other languages to be a nice-to-have, and assume that delivering English instead is an acceptable fallback because everyone can read English, right?
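    Points (2) and (3) hinge on two standard HTTP headers: the client's Accept-Language request header and the server's Vary response header, which tells any cache in between to keep a separate copy per requested language rather than one copy per URL. A sketch of a well-behaved exchange (the URL and values are illustrative):

```
GET /tips HTTP/1.1
Host: example.com
Accept-Language: de-CH, de;q=0.9, en;q=0.8

HTTP/1.1 200 OK
Content-Language: de-CH
Vary: Accept-Language
```

    A proxy that drops the Accept-Language header on the way in, or ignores Vary on the way out, produces exactly the mixed-language pages described above.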

    The move to cloud-based and highly cached content has increased complexity considerably. Even if a company does everything right in (1), (2), and (3) above, the realities of (4) may still lead to a page that contains content in multiple languages.

    That is, each piece of software is functioning as designed, but combining the output from those pieces leads to content in multiple languages. At that point, you can either throw your hands in the air and give up...or you can start to redesign services to respect the requested language even if the user context's preferred language is different. This is not a decision you can make lightly, and you run the risk of breaking the service's content in other places. Sometimes there is no right answer.

    Since I live in Switzerland, which has 4 official languages, I've seen EULAs from Apple written in a combination of French, English, German and even a word or two of Italian.

    The example below comes from Microsoft Edge's Tips page that they show when you start using the browser. Edge thinks that my default language is German despite the fact that my Windows is English. Microsoft tends to use the language of the region you're in (Switzerland) rather than the display language that you've expressly set, but...that's another discussion.

    At any rate, Edge thinks I want German content2 but Microsoft can't even reliably deliver German content for this main page, defaulting to English content in several places.

    1. I'm looking at you, US companies.

    2. I quickly checked the settings and could not find out how to change the list of languages I'd like to include in my browser requests. Other browsers do provide a list of accepted languages, but Edge's settings are quite limited.

    26.8.2019 - Reinforcements! (2)

    We are happy to welcome Joel Widmer and Matteo Bossi to our team!

    Joel will be reinforcing our Software-Developer section, and Matteo joins us as our new Software-Developer apprentice.

    17.7.2019 - How to use Authenticated NuGet Feeds

    Much of Encodo's infrastructure is now housed in Azure. Each employee has an account in Azure.

    From Visual Studio

    Because users are already authenticated in Visual Studio (to register it), they will be able to access Azure NuGet feeds through Visual Studio without any further intervention. You can restore/install/update without providing any additional credentials.

    From the Command Line

    As of today, access to Azure Feeds from the command line is granted only if you provide credentials with the source.

    Sources created in the Visual Studio UI do not include credentials.

    Solutions that include sources in a NuGet.config do not have credentials either (because the file is stored in the repository, it must not contain credentials).

    Therefore, you must register a NuGet source with authentication for Azure for your user.

    Personal NuGet.Config

    You can find your NuGet.config in your roaming profile on Windows, at C:\Users\<username>\AppData\Roaming\NuGet\NuGet.Config.

    Instead of editing the file directly, use the NuGet command line to add an authenticated source.

    Create a Personal Access Token (PAT)

    You cannot just use your username/password to create an authenticated source. Instead, you have to use a PAT.

    Follow the instructions below to create a PAT for your Azure account.

    • Log in to Azure
    • From the user settings (top-right), select Security
    • Select "Personal access tokens" in the list on the left
    • Press the "New Token" button at the top-left of the list
    • Name it NuGet Feed Access
    • Leave the organization at the default (encodo for employees)
    • Set the expiration to something reasonable.
      • 90 days is probably OK.
      • You can choose up to a year.
      • You can update the expiration date at a later time.
    • Select Custom defined for Scopes
    • Click the "Show all scopes" link at the bottom of the dialog (above the "Create" button)
    • Scroll down to "Packaging" and select "Read"
    • Press "Create" to add the token
    • Copy the token immediately. It will never be shown again.
    • Store the token somewhere safe (a password manager is a good idea). If you forget it, you'll have to regenerate the token.

    Some extra tips:

    • Set up a reminder in your calendar for when your PAT is about to expire
    • You can change the expiration date for the token even after you've created it

    Add an Authenticated NuGet Source

    Now you can set up a NuGet source for your user.

    Execute the following command, replacing the bracketed arguments as follows:

    • <username>: your own user name (e.g. <bob@encodo.ch>)
    • <PAT>: the PAT you generated above
    • Change the URL for a feed other than Quino

    nuget sources add -Name "Azure (Authenticated)" -Source https://encodo.pkgs.visualstudio.com/_packaging/Quino/nuget/v3/index.json -UserName <username> -Password <PAT>

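    Behind the scenes, this command writes the source and its credentials into your personal NuGet.Config. The result looks roughly like the sketch below. A source name without spaces is used here because names with spaces are XML-encoded in the credentials element name; all values are placeholders, and by default NuGet stores an encrypted Password entry rather than ClearTextPassword:

```xml
<configuration>
  <packageSources>
    <!-- the feed registered by "nuget sources add" -->
    <add key="AzureAuthenticated"
         value="https://encodo.pkgs.visualstudio.com/_packaging/Quino/nuget/v3/index.json" />
  </packageSources>
  <packageSourceCredentials>
    <!-- element name matches the source name -->
    <AzureAuthenticated>
      <add key="Username" value="bob@encodo.ch" />
      <add key="ClearTextPassword" value="YOUR-PAT-HERE" />
    </AzureAuthenticated>
  </packageSourceCredentials>
</configuration>
```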
    10.7.2019 - Source Link Flakiness in Visual Studio 2017 and 2019

    tl;dr: If MSBuild/Visual Studio tells you that "the value of SourceRoot.RepositoryUrl is invalid..." and you have no idea what it's talking about, it might help to add a couple of properties to the offending project so that the error becomes a warning (see "The Workaround" below).


    Microsoft introduced this fancy new feature called Source Link that integrates with NuGet servers to deliver symbols and source code for packages.

    This feature is opt-in and library and package providers are encouraged to enable it and host packages on a server that supports Source Link.

    Debugging Experience

    The debugging experience is seamless. You can debug into Source-Linked code with barely a pause in debugging.

    The only drawback is that you don't have local sources, so it's trickier to set breakpoints in sources that haven't been downloaded yet. When you had local sources, you could open the source file you wanted and set a breakpoint, knowing that the debugger would look for the file in that path and be able to stop on the breakpoint.

    Also, Visual Studio's default behavior is to show all debugging sources in a single tab, so you don't even have all of the files open that you looked at when your debug session ends. If you hover the tab, you can figure out the storage location, but it's a long and not very intuitive path. Also, it only contains the sources that you've already requested.

    Still, it's a neat feature.

    Getting Pushy

    However, Microsoft is doing some things that suggest that the feature is no longer 100% opt-in. The following error message cropped up in a project with absolutely no Source Link settings or packages. It doesn't even directly use packages that have Source Link enabled (not that that should make a difference).

    There are actually three problems here:

    1. The compiler is complaining about Source Link settings on a project that hasn't opted in to Source Link.
    2. The compiler is breaking the build when Source Link cannot be enabled as expected.
    3. The error/warning messages are extremely oblique and give no indication how one should address them. (Another example is the warning message shown below.)

    It's the second one that makes this issue so evil. The issue crops up literally out of nowhere and then prevents you from working. The project builds. Even if I wanted Source Link on my project but it wasn't set up correctly, this is no reason to prevent me from running/debugging my product.

    And, honestly, because of reason #3, I'm still not sure what the actual problem is or how I can address it with anything but a workaround.

    Because, yes, I found a workaround. Else, I wouldn't be writing this article.

    Things that Didn't Work

    The first time I encountered this and lost hours of precious time, I "fixed" it by removing Source Link support for some packages that my product imports. At the time, I thought I was getting the error message because TeamCity was producing corrupted packages when Source Link was included. It was not a quick fix to open up a different solution, remove Source Link support and re-build all packages on CI, but it seemed to work.

    Upon reflection and further reading, this is unlikely to have been the real reason I was seeing the message or why it magically went away. Source Link support in a NuGet server involves having access to source control in order to be able to retrieve the requested sources.

    It's honestly still unclear to me why Visual Studio/MSBuild is complaining about this at build-time in a local environment.

    The Workaround

    Today, I got the error again, in a different project. The packages I'd suspected yesterday were not included in this product. Another, very similar product used the exact same set of packages without a problem.

    Even though the issue Using SourceLink without .git directory isn't really the issue I'm having, I eventually started copying stuff from the answers into the project in my solution that failed to build.

    Add the following to any of the offending projects and the error becomes a warning.
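    The snippet is a pair of MSBuild properties that switch off the source-control queries that produce the SourceRoot items in the first place. The property names below are the ones discussed in the dotnet/sourcelink issue tracker; treat this as a workaround, not a fix:

```xml
<PropertyGroup>
  <!-- Don't ask the source-control provider for SourceRoot/RepositoryUrl information -->
  <EnableSourceControlManagerQueries>false</EnableSourceControlManagerQueries>
  <!-- Belt and braces: disable Source Link entirely for this project -->
  <EnableSourceLink>false</EnableSourceLink>
</PropertyGroup>
```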


    The ensuing warning? I can't help you there. I threw in a few other directives into the project file, but to no avail. I'm not happy to have a compile warning for a feature I never enabled and cannot disable, but I'm hoping that Microsoft will fix this sooner rather than later.

    19.6.2019 - Reclaiming Disk Space in Windows 10

    About five years ago, I wrote Who's using up my entire SSD?. Much of the advice given in that article is still applicable, but I have an update for more modern applications and packages.


    I use the following tactics to manage free space on my Windows machine:

    • Use the "Disk Cleanup" tool included with Windows
    • Use the "PatchCleaner" tool to remove unneeded packages from the Windows/Installer folder
    • Use "TreeSizeFree" to find out where else you're losing space (usually AppData)
    • Move settings/caches to another drive (e.g. "ReSharper")
    • Clear "NuGet" caches
    • Uninstall other unused software (especially Windows and .NET SDKs)

    Disk Cleanup

    This tool is available from the Windows Menu and works pretty well for basic cleanup.

    • Use "Clean up System Files" to clean not only user files, but also system files
    • Use it after installing larger Windows Updates because it will often offer to clean up multiple gigabytes from the Windows Update cache after such an update
    • For some reason, it rarely manages to empty my recycle bin or temp folder reliably, so check afterwards to see if you still have GBs lying around there


    PatchCleaner

    The PatchCleaner utility determines which patches in the Windows/Installer folder correspond to installed software. Windows does not clean this up, so it's possible that patches are still lying around that correspond to very old software that has either already been uninstalled or that can no longer be uninstalled using that patch.

    The software offers to move the patches to another drive/folder or simply delete them. I've been using the utility for over half a year and have never had a single problem with Windows (i.e. I've never had to restore any of the packages that PatchCleaner removed). At first, I moved the patches, now I just delete them.


    TreeSizeFree

    I've been using TreeSizeFree by Jam Software for a long time. It's fast and easy to use. I almost always find that, other than the Windows folder, the largest folder is my user's AppData/Local folder.

    In order to avoid UAC in Windows, many applications now install to this user folder by default. This is a good thing, generally, but some applications also keep copies of their installations—and then never delete them. This practice can eat a lot of space for applications that are frequently updated.

    On my machine, the main culprits are:

    • JetBrains ReSharper
    • Syntevo SmartGit

    These applications have since improved their cleanup practices, but it pays to check whether you've still got installers/caches for older versions.

    See the Uninstall section below for how to best remove the old versions.


    NuGet Caches

    If you're using Package References and more-recent versions of NuGet, then you'll have local caches of packages. While this practice saves a lot of hard-drive space by consolidating caches for all solutions, the default location is in the user's AppData/Local/.nuget folder.

    You can either clear everything with the following command:

    nuget locals all -clear

    or you can change the location with an environment variable NuGetCachePath (see Can the NuGet 3.2 package cache location be changed for more information).

    My cache is 2.2GB right now, but I haven't moved it yet.

    ReSharper Caches

    By default, ReSharper stores its caches in the user's AppData folder. From the ReSharper Options/General/Caches, you can change that location to another drive. That folder is currently 1.6GB on my machine, which is not insubstantial.


    Uninstall Unused Software

    Most manuals about saving disk space start with this step. I've assumed that you're a developer who has already checked this list, but it doesn't hurt to mention it.

    • Open "Apps and Features"
    • Sort by size

    Here you might see the older versions of ReSharper or SmartGit that I mentioned above. If so, go ahead and remove them using the uninstallers.

    If the uninstallers don't work and you still see them using a lot of space in your AppData folder, then do the following:

    • Note the version that you have installed (or just use the latest)
    • Uninstall all versions
    • Clear the local cache/uninstaller folders in your AppData folder manually
    • Reinstall the latest version
    • You should only see a single installation now, in both the "Apps and Features" list as well as the AppData folder.

    You can also gape in awe that "Microsoft SQL Server Management Studio" takes a breathtaking 2.8GB. Shake your head ruefully that you unfortunately need it and can't uninstall it. Or maybe you can? If you have JetBrains Rider, then you also have JetBrains DataGrip, which is an excellent SQL Server client.


    SDKs

    I mention SDKs explicitly because they can take a lot of space and most of them are completely superfluous—a newer version generally completely replaces an older version, even if you're targeting the older version from a solution.

    For example, I had five Windows SDKs on my machine, each of which weighed in at ~2.5GB. These SDKs were for targeting versions of Windows that I'd long since upgraded. Several of them seemed only to be useful for C++ development (which I have occasionally done, but which happens rarely and doesn't target the Windows API very heavily). I was able to discard all of these packages without any drawbacks.

    Next up were the dozens of .NET Core and Framework SDKs for older—and sometimes exquisitely specific (e.g. .NET Core 1.0.4 rc4 preview 2)—versions, each weighing in at between 350MB and 500MB. I was able to remove all but the most recent versions, .NET Core 2.2 and .NET Framework 4.7.2. I have projects that target .NET Core 2.0 and 2.1 explicitly, and they are unaffected.


    Those are the most up-to-date tips and tricks I've got for managing hard-drive space. I don't try to optimize my main application installations, like Visual Studio or Office. They seem to spread their data over the Program* folders, but I'm not going to touch those, as long as I've got other places to optimize.

    I've been using and upgrading my Windows image heavily for .NET (and other) software development for years without re-imaging. Currently, I've got a total of 161.6GB, divvied up as shown below.

    Folder               Size    Description
    Windows              55GB
    Users/marco          25.5GB  installers, caches, etc.
    Users/public         6.5GB   Hyper-V/Linux-subsystem disk image
    Program Files (x86)  23.4GB
    Program Files        20.8GB
    Files                17.4GB  hibernate file, paging file
    ProgramData          9GB

    2.6.2019 - v7.0: Rename/move projects, ASP.Net Core

    The summary below describes major new features, items of note and breaking changes.

    The links above require a login.


    • NuGet Feed: Users can obtain packages via the NuGet feed link given above. Simply add the link as a source, either in Visual Studio or in the solution's NuGet.config file.
    • Source Link: Packages obtained via the NuGet feed include "Source Link", which is integrated with Visual Studio. When you debug into Quino sources, Visual Studio will ask for permission to download symbols and sources and automatically provide seamless debugging.
    • Migration trigger: An application can now control how and when a database change will trigger a schema migration. It is still highly recommended to use the default behavior, but it is now possible to ignore certain changes on the database side to allow hybrid code-controlled/database-controlled metadata strategies. See QNO-6170 for more information.
    • Multi-platform: Includes several bug-fixes for the support for Linux and MacOS that was added in 6.x. The standard CI pipeline is now a Linux image in a Docker container for both Quino-Standard and Quino-WebApi. Quino-Windows uses a Windows container. One of our developers is using JetBrains Rider on a Mac to develop Quino-WebApi.
    • Model-registration: The process for registering a model has been better codified and documented. Applications will still generally call UseModelBuilder<T>, but support for other configurations (ad-hoc/faked models in tests) derives more clearly from there. See Metadata Architecture in the conceptual documentation for more information.
    • Object Graphs: All graph-traversal, formatting and cloning support can now be replaced/configured/extended by products. The GetAllObjects() and GetFormattedGraph() methods for all hierarchical types were hard-coded in previous versions. Now, products inject the appropriate type (e.g. IExpressionGraphTraversalAlgorithm or IExpressionGraphFormattingAlgorithm) and call methods on these objects instead. Additionally, we've created documentation for how a product can implement support for its own data hierarchies. See Object Graphs in the conceptual documentation for more information.
    • Web Configuration: We've standardized and documented how to extend ASP.Net applications with Quino. See Web Platform in the conceptual documentation. This configuration will once again change in Quino-WebApi 8.0, where we move to ASP.Net Core and are able to leverage even more of their standard configuration.
    • Command-line Tools: The quino command-line utility that replaced the Quino.Utils package in 6.x has also been extended to support both TeamCity and Azure DevOps. We're using this tool both locally (to update versions and standardize projects) as well as in CI (to set version, update nuspec files for .NET Framework projects and to enable documentation). Both SDK and Framework-style project types are fully supported. We plan to extend the tool further to provide more of the functionality that Quino.Utils used to provide (e.g. updating source headers and fixing myriad project properties). See Tools and, in particular, quino fix in the conceptual documentation for more information on where we are headed.
    • Login Behavior: We've improved the naming in this area to align better with the authentication system. A value of None is no longer supported (there is always a user context) and the default is now UseOperator, which uses (but does not authenticate) the OS user that executed the software. Single sign-on products would use AuthenticateOperator. See QNO-6147 for more information.
    • Data Cursors and Object Lifetimes: We've improved the event-handling in the data pipeline to avoid inadvertently keeping objects in memory. This was particularly a problem for queries that retrieved a large number of objects. In those cases, even using CreateCursor rather than GetList didn't avoid allocating a ton of memory by the end of the iteration. We detected this when indexing data for Lucene support in a custom product. See QNO-6125 and QNO-5425 for more information.
    • Nullability Annotations: All public APIs in Quino-Standard, Quino-Windows and Quino-WebApi now include [NotNull] and [CanBeNull] annotations for parameters and return types. Many APIs also include [Pure] where appropriate. The annotations are retained and delivered with the NuGet packages, where ReSharper makes use of them in dependent products. See QNO-6092 for more information.
    • Data-driver Debugging: Exceptions in the data driver when using RunMode.Debug now stop in the debugger at the point that they are thrown. Previous versions included a global catch/re-throw handler used to maintain statistics. Errors are now added to statistics in debug mode only if a specific option is set, so that proper debugging behavior has priority, rather than the other way around. See QNO-5723 for more information.
    • Legacy Generated-Code Format: The "V1" generated-code format has been removed. As of Quino 6, all known products using Quino have upgraded to the "V2" format.
    • Web Application Shutdown: We fixed a bug whereby Quino applications weren't being disposed in OWIN applications. This led to dead instances retaining open file handles on log files that the ensuing instance would be unable to open. See QNOWEB-71 for more information.
    • Generic and Metadata Controllers: Both of these controllers include incremental improvements to provide more robust information to generic clients (e.g. the Quino Web Client). We made many improvements to validation, captions and data-retrieval when ramping up to production with several major products based on these technologies.
    • DevExpress Component Upgrade: Quino-Windows now references DevExpress 18.2.5 instead of 15.1.7. DevExpress packages are now available from their own NuGet feed, greatly easing distribution. For backward-compatibility for products that do not wish to pay for a license upgrade, the NuGet feed for Quino-Windows includes pre-release versions of all packages with a version of 7.0.0-devex*. See QNOWIN-243 for more information.

    Breaking Changes

    The recommended upgrade path is as follows:

    • Use the NuGet Package Manager to update to the released version of Quino 7.
    • Use the NuGet Package Manager to uninstall any direct references to Encodo.* packages. Make note of which packages you've uninstalled.
    • Install the corresponding Quino.*.Core package for the Encodo.* packages you uninstalled in the step above.
    • If necessary, install the remote-data packages, as described in "Package names" below.
    • Install Quino.Processes if you were using the IProcessManager anywhere.
    • Use Visual Studio's Ctrl + R, G to clean up invalid namespaces.
    • Use Visual Studio's Ctrl + . or ReSharper's Alt + Enter to include the updated namespaces. This may take a while, but is reliable and not complicated.

    Runtime targets

    All Quino.WebApi and Quino.Windows packages now target .NET Framework 4.7.2. We made this change to improve interoperability with .NET Standard and .NET Core packages, on a recommendation from Microsoft. See QNOWIN-241 for more information.

    Package names

    We renamed all Encodo.* packages to Quino.*.Core. Since this change does not affect high-level packages, most solutions should be largely unaffected. However, if a solution had included one of the Encodo.* assemblies directly, then you need to manually remove that reference and include the Quino.*.Core package instead.
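    In a project file, the rename looks something like the following (the version numbers are placeholders, not the actual release versions):

```xml
<!-- before: direct reference to a low-level Encodo.* package -->
<PackageReference Include="Encodo.Application" Version="6.2.0" />

<!-- after: the renamed Quino.*.Core package -->
<PackageReference Include="Quino.Application.Core" Version="7.0.0" />
```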

    Additionally, we reduced the surface area of Quino.Application.Core (previously named Encodo.Application) by moving significant parts into sub-packages:

    • Quino.Configuration: contains all support for IKeyValueNode<T> nodes and for loading/managing configuration
    • Quino.Feedback: IFeedback and supporting types and methods
    • Quino.CommandLine: all command-line support

    Quino.Application.Core still depends on these three packages, but they can now also be used independently of pulling in the full application support.

    We also replaced the following packages:

    • Quino.Data.Remoting
    • Quino.Data.Remoting.Json
    • Quino.Server

    with the following packages:

    • Quino.Data.Remote.Client
    • Quino.Data.Remote.Server
    • Quino.Protocol.Json
    • Quino.Protocol.Binary

    Clients and servers should instead include the client or server package, respectively, as well as the desired protocol packages. Configure the server with


    and the client with



    All namespaces now begin with Encodo.Quino. Types that used to begin with just Encodo (no Quino) are now in the Encodo.Quino namespace. For example, Encodo.Core.Bit is known as Encodo.Quino.Core.Bit.


    • IProcessManager is no longer in the Encodo.Application package. It is now in the Quino.Processes package.
    • IEventAggregator is no longer in the Encodo.Core package. It is now in the Quino.Processes package.
    • IPayloadFactory and its associated types are no longer in the Encodo.Connections package (nor in the renamed Quino.Connections.Core package). Instead, you can find these base types in the Quino.Protocol.Core package.


    • IMetaPropertyPath no longer extends IList<IMetaProperty> and is now immutable. The implementation MetaPropertyPath is also now immutable. Use the GetFirst(), GetFullPath() and ToList() extension methods to get information about the path.

    14.4.2019 - Tempodrom Racing 2018

    Encodo went go-kart racing at the Tempodrom in Winterthur! The cars are electric and race at almost 40kph over a 400m track. We raced three eight-minute rounds with lots of action: passing, bumping and blocking (I'm looking at you, Tom).