Source Link Flakiness in Visual Studio 2017 and 2019

tl;dr: If MSBuild/Visual Studio tells you that "the value of SourceRoot.RepositoryUrl is invalid..." and you have no idea what it's talking about, it might help to add the workaround shown below to the offending project, which demotes the error to a warning.


Microsoft introduced this fancy new feature called Source Link that integrates with NuGet servers to deliver symbols and source code for packages.

This feature is opt-in and library and package providers are encouraged to enable it and host packages on a server that supports Source Link.

Debugging Experience

The debugging experience is seamless. You can debug into Source-Linked code with barely a pause in debugging.

The only drawback is that you don't have local sources, so it's trickier to set breakpoints in sources that haven't been downloaded yet. When you had local sources, you could open the source file you wanted and set a breakpoint, knowing that the debugger would look for the file in that path and be able to stop on the breakpoint.

Also, Visual Studio's default behavior is to show all debugging sources in a single tab, so the files you looked at are no longer open once your debug session ends. If you hover over the tab, you can figure out the storage location, but it's a long and unintuitive path, and it only contains the sources that you've already requested.

Still, it's a neat feature.

Getting Pushy

However, Microsoft is doing some things that suggest that the feature is no longer 100% opt-in. The following error message cropped up in a project with absolutely no Source Link settings or packages. It doesn't even directly use packages that have Source Link enabled (not that that should make a difference).

There are actually three problems here:

  1. The compiler is complaining about Source Link settings on a project that hasn't opted in to Source Link.
  2. The compiler is breaking the build when Source Link cannot be enabled as expected.
  3. The error/warning messages are extremely oblique and give no indication how one should address them. (Another example is the warning message shown below.)

It's the second one that makes this issue so evil. The issue crops up literally out of nowhere and then prevents you from working. The project itself builds fine. Even if I wanted Source Link on my project but it wasn't set up correctly, this is no reason to prevent me from running/debugging my product.

And, honestly, because of reason #3, I'm still not sure what the actual problem is or how I can address it with anything but a workaround.

Because, yes, I found a workaround. Otherwise, I wouldn't be writing this article.

Things that Didn't Work

The first time I encountered this and lost hours of precious time, I "fixed" it by removing Source Link support for some packages that my product imports. At the time, I thought I was getting the error message because TeamCity was producing corrupted packages when Source Link was included. It was not a quick fix to open up a different solution, remove Source Link support and re-build all packages on CI, but it seemed to work.

Upon reflection and further reading, this is unlikely to have been the real reason I was seeing the message or why it magically went away. Source Link support in a NuGet server involves having access to source control in order to be able to retrieve the requested sources.

It's honestly still unclear to me why Visual Studio/MSBuild is complaining about this at build-time in a local environment.

The Workaround

Today, I got the error again, in a different project. The packages I'd suspected yesterday were not included in this product. Another, very similar product used the exact same set of packages without a problem.

Even though the issue Using SourceLink without .git directory isn't really the issue I'm having, I eventually started copying stuff from the answers into the project in my solution that failed to build.

Add the following to any of the offending projects and the error becomes a warning.
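The addition in question looks something like this (reconstructed from the answers in that issue; the exact property name is an assumption on my part, so verify it against your SDK version):

```xml
<!-- Assumption: taken from answers to the "Using SourceLink without
     .git directory" issue. Disabling source-control queries stops
     MSBuild from validating SourceRoot, demoting the error. -->
<PropertyGroup>
  <EnableSourceControlManagerQueries>false</EnableSourceControlManagerQueries>
</PropertyGroup>
```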


The ensuing warning? I can't help you there. I threw a few other directives into the project file, but to no avail. I'm not happy to have a compile warning for a feature I never enabled and cannot disable, but I'm hoping that Microsoft will fix this sooner rather than later.

Reclaiming Disk Space in Windows 10

About five years ago, I wrote Who's using up my entire SSD?. Much of the advice given in that article is still applicable, but I have an update for more modern applications and packages.


I use the following tactics to manage free space on my Windows machine:

  • Use the "Disk Cleanup" tool included with Windows
  • Use the "PatchCleaner" tool to remove unneeded packages from the Windows/Installer folder
  • Use "TreeSizeFree" to find out where else you're losing space (usually AppData)
  • Move settings/caches to another drive (e.g. "ReSharper")
  • Clear "NuGet" caches
  • Uninstall other unused software (especially Windows and .NET SDKs)

Disk Cleanup

This tool is available from the Windows Menu and works pretty well for basic cleanup.

  • Use "Clean up System Files" to clean not only user files, but also system files
  • Use it after installing larger Windows Updates because it will often offer to clean up multiple gigabytes from the Windows Update cache after such an update
  • For some reason, it doesn't reliably empty my recycle bin or temp folder, so check afterward to see whether you still have GBs lying around there


The PatchCleaner utility determines which patches in the Windows/Installer folder correspond to installed software. Windows does not clean this up, so it's possible that patches are still lying around that correspond to very old software that has either already been uninstalled or that can no longer be uninstalled using that patch.

The software offers to move the patches to another drive/folder or simply delete them. I've been using the utility for over half a year and have never had a single problem with Windows (i.e. I've never had to restore any of the packages that PatchCleaner removed). At first, I moved the patches, now I just delete them.


I've been using TreeSizeFree by Jam Software for a long time. It's fast and easy to use. I almost always find that, other than the Windows folder, the largest folder is my user's AppData/Local folder.

In order to avoid UAC in Windows, many applications now install to this user folder by default. This is a good thing, generally, but some applications also keep copies of their installations—and then never delete them. This practice can eat a lot of space for applications that are frequently updated.

On my machine, the main culprits are:

  • JetBrains ReSharper
  • Syntevo SmartGit

These applications have since improved their cleanup practices, but it pays to check whether you've still got installers/caches for older versions.

See the Uninstall section below for how to best remove the old versions.


If you're using Package References and more-recent versions of NuGet, then you'll have local caches of packages. While this practice saves a lot of hard-drive space by consolidating caches for all solutions, the default location is in the user's AppData/Local/.nuget folder.

You can either clear everything with the following command:

nuget locals all -clear

or you can change the location with an environment variable NuGetCachePath (see Can the NuGet 3.2 package cache location be changed for more information).

My cache is 2.2GB right now, but I haven't moved it yet.
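If you do decide to move the cache, the environment-variable approach from the linked answer looks roughly like this on Windows (the target path is purely hypothetical):

```
rem Hypothetical target path; NuGetCachePath is the variable named in
rem the linked answer for NuGet 3.x-era caches.
setx NuGetCachePath "D:\Caches\NuGet"
```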

ReSharper Caches

By default, ReSharper stores its caches in the user's AppData folder. In ReSharper's options, under General/Caches, you can change that location to another drive. That folder is currently 1.6GB on my machine, which is not insubstantial.


Most manuals about saving disk space start with this step. I've assumed that you're a developer who has already checked this list, but it doesn't hurt to mention it.

  • Open "Apps and Features"
  • Sort by size

Here you might see the older versions of ReSharper or SmartGit that I mentioned above. If so, go ahead and remove them using the uninstallers.

If the uninstallers don't work and you still see them using a lot of space in your AppData folder, then do the following:

  • Note the version that you have installed (or just use the latest)
  • Uninstall all versions
  • Clear the local cache/uninstaller folders in your AppData folder manually
  • Reinstall the latest version
  • You should only see a single installation now, in both the "Apps and Features" list as well as the AppData folder.

You can also gape in awe that "Microsoft SQL Server Management Studio" takes a breathtaking 2.8GB. Shake your head ruefully that you unfortunately need it and can't uninstall it. Or maybe you can? If you have JetBrains Rider, then you also have JetBrains DataGrip, which is an excellent SQL Server client.


I mention SDKs explicitly because they can take a lot of space and most of them are completely superfluous—a newer version generally completely replaces an older version, even if you're targeting the older version from a solution.

For example, I had five Windows SDKs on my machine, each of which weighed in at ~2.5GB. These SDKs were for targeting versions of Windows that I'd long since upgraded. Several of them seemed only to be useful if I was doing C++ development (which I have occasionally done, but which happens rarely and doesn't target the Windows API very heavily). I was able to discard all of these packages without any drawbacks.

Next up were the dozens of .NET Core and Framework SDKs for older—and sometimes exquisitely specific (e.g. .NET Core 1.0.4 rc4 preview 2)—versions, each weighing in at between 350MB and 500MB. I was able to remove all but the most recent versions, .NET Core 2.2 and .NET Framework 4.7.2. I have projects that target .NET Core 2.0 and 2.1 explicitly, and they are unaffected.


Those are the most up-to-date tips and tricks I've got for managing hard-drive space. I don't try to optimize my main application installations, like Visual Studio or Office. They seem to spread their data over the Program* folders, but I'm not going to touch those, as long as I've got other places to optimize.

I've been using and upgrading my Windows image heavily for .NET (and other) software development for years without re-imaging, and currently I've got a total of 161.6GB, divvied up as shown below.

Folder                 Size     Description
Windows                55GB
Users/marco            25.5GB   installers, caches, etc.
Users/public           6.5GB    Hyper-V/Linux subsystem disk image
Program Files (x86)    23.4GB
Program Files          20.8GB
Files                  17.4GB   hibernate file, paging file
ProgramData            9GB
v7.0: Rename/move projects, update namespaces

The summary below describes major new features, items of note and breaking changes.

The links above require a login.


  • NuGet Feed: Users can obtain packages via the NuGet feed link given above. Simply add the link as a source, either in Visual Studio or in the solution's NuGet.config file.
  • Source Link: Packages obtained via the NuGet feed include "Source Link", which is integrated with Visual Studio. When you debug into Quino sources, Visual Studio will ask for permission to download symbols and sources and automatically provide seamless debugging.
  • Migration trigger: An application can now control how and when a database change will trigger a schema migration. It is still highly recommended to use the default behavior, but it is now possible to ignore certain changes on the database side to allow hybrid code-controlled/database-controlled metadata strategies. (QNO-6170)
  • Multi-platform: Includes several bug-fixes for the support for Linux and MacOS that was added in 6.x. The standard CI pipeline is now a Linux image in a Docker container for both Quino-Standard and Quino-WebApi. Quino-Windows uses a Windows container. One of our developers is using JetBrains Rider on a Mac to develop Quino-WebApi.
  • Model-registration: The process for registering a model has been better codified and documented. Applications will still generally call UseModelBuilder<T>, but support for other configurations (ad-hoc/faked models in tests) derives more clearly from there. See Metadata Architecture in the conceptual documentation for more information.
  • Object Graphs: All graph-traversal, formatting and cloning support can now be replaced/configured/extended by products. The GetAllObjects() and GetFormattedGraph() methods for all hierarchical types were hard-coded in previous versions. Now, products inject the appropriate type (e.g. IExpressionGraphTraversalAlgorithm or IExpressionGraphFormattingAlgorithm) and call methods on these objects instead. Additionally, we've created documentation for how a product can implement support for its own data hierarchies. See Object Graphs in the conceptual documentation for more information.
  • Web Configuration: We've standardized and documented how to extend ASP.Net applications with Quino. See Web Platform in the conceptual documentation. This configuration will once again change in Quino-WebApi 8.0, where we move to ASP.Net Core and are able to leverage even more of their standard configuration.
  • Command-line Tools: The quino command-line utility that replaced the Quino.Utils package in 6.x has also been extended to support both TeamCity and Azure DevOps. We're using this tool both locally (to update versions and standardize projects) as well as in CI (to set version, update nuspec files for .NET Framework projects and to enable documentation). Both SDK and Framework-style project types are fully supported. We plan to extend the tool further to provide more of the functionality that Quino.Utils used to provide (e.g. updating source headers and fixing myriad project properties). See Tools and, in particular, quino fix in the conceptual documentation for more information on where we are headed.
  • Login Behavior: We've improved the naming in this area to align better with the authentication system. A value of None is no longer supported (there is always a user context) and the default is now UseOperator, which uses the OS user that executed the software (but does not authenticate it). Single sign-on products would use AuthenticateOperator. See QNO-6147 for more information.
  • Data Cursors and Object Lifetimes: We've improved the event-handling in the data pipeline to avoid inadvertently keeping objects in memory. This was particularly a problem for queries that retrieved a large number of objects. In those cases, even using CreateCursor rather than GetList didn't avoid allocating a ton of memory by the end of the iteration. We detected this when indexing data for Lucene support in a custom product. See QNO-6125 and QNO-5425 for more information.
  • Nullability Annotations: All public APIs in Quino-Standard, Quino-Windows and Quino-WebApi now include [NotNull] and [CanBeNull] annotations for parameters and return types. Many APIs also include [Pure] where appropriate. The annotations are retained and delivered with the NuGet packages, where ReSharper makes use of them in dependent products. See QNO-6092 for more information.
  • Data-driver Debugging: Exceptions in the data driver when using RunMode.Debug now stop in the debugger at the point that they are thrown. Previous versions included a global catch/re-throw handler used to maintain statistics. Errors are now added to statistics in debug mode only if a specific option is set, so that proper debugging behavior has priority, rather than the other way around. See QNO-5723 for more information.
  • Legacy Generated-Code Format: The "V1" generated-code format has been removed. As of Quino 6, all known products using Quino have upgraded to the "V2" format.
  • Web Application Shutdown: We fixed a bug whereby Quino applications weren't being disposed in OWIN applications. This led to dead instances retaining open file handles on log files that the ensuing instance would be unable to open. See QNOWEB-71 for more information.
  • Generic and Metadata Controllers: Both of these controllers include incremental improvements that provide more robust information to generic clients (e.g. the Quino Web Client). We made many improvements to validation, captions and data-retrieval while ramping up to production with several major products based on these technologies.
  • DevExpress Component Upgrade: Quino-Windows now references DevExpress 18.2.5 instead of 15.1.7. DevExpress packages are now available from their own NuGet feed, greatly easing distribution. For backward-compatibility for products that do not wish to pay for a license upgrade, the NuGet feed for Quino-Windows includes pre-release versions of all packages with a version of 7.0.0-devex*. See QNOWIN-243 for more information.

Breaking Changes

The recommended upgrade path is as follows:

  • Use the NuGet Package Manager to update to the released version of Quino 7.
  • Use the NuGet Package Manager to uninstall any direct references to Encodo.* packages. Make note of which packages you've uninstalled.
  • Install the corresponding Quino.*.Core package for the Encodo.* packages you uninstalled in the step above.
  • If necessary, install the remote-data packages, as described in "Package names" below.
  • Install Quino.Processes if you were using the IProcessManager anywhere.
  • Use Visual Studio's Ctrl + R, G to clean up invalid namespaces.
  • Use Visual Studio's Ctrl + . or ReSharper's Alt + Enter to include the updated namespaces. This may take a while, but is reliable and not complicated.

Runtime targets

All Quino.WebApi and Quino.Windows packages now target .NET Framework 4.7.2. We made this change to improve interoperability with .NET Standard and .NET Core packages, on a recommendation from Microsoft. See QNOWIN-241 for more information.

Package names

We renamed all Encodo.* packages to Quino.*.Core. Since this change does not affect high-level packages, most solutions should be largely unaffected. However, if a solution had included one of the Encodo.* assemblies directly, then you need to manually remove that reference and include the corresponding Quino.*.Core package instead.

Additionally, we reduced the surface area of Quino.Application.Core (previously named Encodo.Application) by moving significant parts into sub-packages:

  • Quino.Configuration: contains all support for IKeyValueNode<T> nodes and for loading/managing configuration
  • Quino.Feedback: IFeedback and supporting types and methods
  • Quino.CommandLine: all command-line support

Quino.Application.Core still depends on these three packages, but they can now also be used independently of pulling in the full application support.

We also replaced the following packages:

  • Quino.Data.Remoting
  • Quino.Data.Remoting.Json
  • Quino.Server

with the following packages:

  • Quino.Data.Remote.Client
  • Quino.Data.Remote.Server
  • Quino.Protocol.Json
  • Quino.Protocol.Binary

Clients and servers should include the client or server package, respectively, along with the desired protocol package(s). Configure the server with


and the client with



All namespaces now begin with Encodo.Quino. Types that used to begin with just Encodo (no Quino) are now in the Encodo.Quino namespace. For example, Encodo.Core.Bit is known as Encodo.Quino.Core.Bit.


  • IProcessManager is no longer in the Encodo.Application package. It is now in the Quino.Processes package.
  • IEventAggregator is no longer in the Encodo.Core package. It is now in the Quino.Processes package.
  • IPayloadFactory and its associated types are no longer in the Encodo.Connections package (nor in the renamed Quino.Connections.Core package). Instead, you can find these base types in the Quino.Protocol.Core package.


  • IMetaPropertyPath no longer extends IList<IMetaProperty> and is now immutable. The implementation MetaPropertyPath is also now immutable. Use the GetFirst(), GetFullPath() and ToList() extension methods to get information about the path.
v7.0.x: Code-generation, Tooling and Performance Improvements

The summary below describes major new features, items of note and breaking changes.

The links above require a login.


  • We fixed an old performance issue in Quino-Windows Winform support in the MetaEditPanel where the PropertyChanged listener was attached twice on initialization and again each time the layout was updated. This led to increasingly slow UIs as the same property-change was propagated n times. (QNOWIN-278)
  • We fixed a bug where properties with the same name but different paths were generated with the same name in search details. This had the effect that Company.Id and Id both bound to the same control. (QNOWIN-276) It also had the effect that some bindings failed for nested fields. (QNOWIN-273)
  • In the web, we improved the file-uploading controller (QNOWEB-110 and QNOWEB-91)
  • All web controllers, return types and responses are now 100% documented and available via Swagger. (QNOWEB-100)
  • GenericObject.ValuesChanged is no longer reset to false whenever a value is loaded from the data handler (it is now set to false only when no values are marked as changed). (QNO-6182)
  • Improved the quino tool to support code-generation and -migration. (QNO-6161)
  • Improved language-specific property-handling (QNO-6241, QNO-6225)
  • Improved logging configuration (QNO-6186, QNO-6220, QNO-6222)

Breaking Changes

  • Moved IToken and Token from Quino.Connections to Quino.Security.
  • Split Quino.Cryptography out of Quino.Security.Core
  • Moved IExtendedApplicationDescription to Quino.Application.Core
  • The interface IPersistentObjectContext now implements IScope rather than IScopedContext. The Push() and Pop() methods are no longer available.
  • The EnableExternalLog() and DisableExternalLog() methods have been moved from the Encodo.Quino.App to the Encodo.Quino.Logging namespace.
  • The functionality formerly declared as extension methods for IMessageFormatter has been split into several interfaces: IMessageSequenceFormatter, IMessageDetailsFormatter, IMessageErrorDetailsFormatter.
    • GetMostAppropriateDetails(IEnumerable<IMessage>) is available as IMessageSequenceFormatter.GetText()
    • GetDetails(IEnumerable<IMessage>, IMessage) is available as IMessageDetailsFormatter.GetText(IEnumerable<IMessage>, IMessage)
  • IRemoteMethodsInstanceFactory has been renamed to IMetaMethodsInstanceFactory
  • IMetaMethodImplementationContainer is now in the Encodo.Quino.Methods namespace
v8.0.0: ASP.NET Core, Web Client 2, Culture/Language improvements

The summary below describes major new features, items of note and breaking changes.

The links above require a login.


Breaking Changes

Before upgrading, products should make sure that they do not depend on any obsolete members in the current version (7.x).


Quino-Web 8.0 is a rewrite and is therefore mostly incompatible with 7.x.

  • The controller returns data in a completely different format
  • The Quino Client has been completely rewritten to accommodate it
  • The startup and pipeline have been completely rewritten to integrate with ASP.NET Core
  • Testing support has been considerably extended to accommodate end-to-end integration testing and in-process hosts

See the Quino-Web/Sandbox.Web project for a working example. This integrates the standard SandboxApplication into a web site using the standard GenericController and MetadataController to provide data and UI to the generic Quino Client.

Namespace Changes

Some internal types in Quino-Standard have been moved to more appropriate namespaces and assemblies, but the impact on products should be non-existent or very limited.

The following types were moved from Encodo.Quino.Core to Encodo.Quino.Culture:

  • LanguageTextAttribute
  • IValueParser
  • CaptionAttribute
  • LanguageDescriptionAttribute

The following types were moved from Encodo.Quino.Core to Encodo.Quino.TextFormatting:

  • IFileSizeFormatter

Culture- and Language-Handling

Quino's default culture-handling has been overhauled. Instead of tracking its own language, Quino now uses the standard .NET CultureInfo.CurrentUICulture for the default language and CultureInfo.CurrentCulture for default formatting (e.g. times, dates, and currencies). Many fields have been marked as obsolete and are no longer used by Quino.

Default Languages

The default languages in Quino have changed from "en-US" and "de-CH" to "en" and "de", respectively.

The reasoning behind this is that, while a requested language should be as specific as possible, a supported language should be as general as possible. The standard culture mechanisms and behavior (e.g. .NET Resources) "fall back" to a parent language when a more-specific language cannot be found. If an application claims to only support "en-US", then a request for "en-GB" fails. If the supported language is "en", then any request to a language in the "en" family (e.g. "en-US", "en-GB", "en-AU") will use "en".

An application that supports "en-US" and "de-CH" has, therefore, a more limited palette of languages that it can support.
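The fallback behavior described above can be sketched in a few lines of Python. This is a toy illustration of the standard parent-culture fallback, not Quino's actual implementation; resolve_language is a name I made up:

```python
def resolve_language(requested, supported):
    """Return the best supported language for a list of requested
    languages (in decreasing order of preference), falling back from
    a specific culture ("en-US") to its parent ("en")."""
    for culture in requested:
        candidate = culture
        while candidate:
            if candidate in supported:
                return candidate
            # Fall back to the parent culture: "en-US" -> "en" -> ""
            candidate, _, _ = candidate.rpartition("-")
    return None  # no supported language matches any request

# An application supporting the general "en" serves the whole family:
resolve_language(["en-GB"], ["en", "de"])        # -> "en"
# ...whereas one supporting only "en-US" fails a request for "en-GB":
resolve_language(["en-GB"], ["en-US", "de-CH"])  # -> None
```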


Quino code runs in the context of a user, who has a list of preferred languages, in decreasing order of preference. This context can last the entire duration of an application (e.g. a standalone application like a console or desktop application) or last as long as a web request.

The application itself has a list of languages that it supports, as well as resources and metadata that defines text in these languages. The resources are standard .NET Resources with the standard fallback mechanism (i.e. a request for "en-US" can be satisfied by "en"). The metadata uses DynamicString objects, which encapsulate a map from language codes (e.g. "en" or "de") to strings.

During application startup or at the beginning of a web request, the ILanguageResolver determines the language to use for a given set of requested languages. In ASP.NET Core, the requested languages come from the HTTP headers provided by the browser. In standalone applications, the IRequestedLanguageCalculator provides the requested languages. The ILanguageInitializer is responsible for coordinating this during application startup.

The rest of Quino uses the following singletons to work with languages.

  • IDynamicStringFallbackCalculator: Comes into play when a request is made for a language that is not directly supported. For example, if the application supports "en" and "de", then a request for "en-US" will ask this singleton how to resolve the request.
  • IDynamicStringFactory: Creates a dynamic string to describe a given object. The default implementation uses .NET Attributes.
  • ILanguageResolver: Determines the culture to use from a list of available cultures and a list of requested/preferred cultures.
  • IRequestedLanguageCalculator: Provides the sequence of languages from which to choose during initial resolution (web requests do not use this).
  • ILanguageInitializer: Integrates language-selection into the application startup.
  • ICaptionCalculator: Extracts a single caption for a culture from a given object. Applications should use the IDynamicStringFactory in most cases, instead.

An application can control fallback by registering custom IDynamicStringFallbackCalculator and ILanguageResolver implementations (though this is almost certainly not necessary).

Opting in or out

Any product that calls AddEnglishAndGerman() will automatically be upgraded as well. A product can avoid this change by calling AddAmericanEnglishAndSwissGerman() instead.


A product that uses the new languages will have to replace all fields in reports targeted at "en-US" and "de-CH" to target "en" and "de" instead.

Database Fields

A product that does use the new default languages will have to determine how to migrate database fields created for languages that are no longer explicitly supported. If the model includes value-lists (enums) or multi-language properties, the application will have to migrate the database schema to update multi-language fields (e.g. "caption_en_us" => "caption_en").
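The renaming follows a simple scheme, sketched below in Python. The scheme is inferred from the caption_en_us => caption_en example; language_column is a hypothetical helper, not a Quino API:

```python
def language_column(base, culture):
    """Derive a per-language column name from a base name and a
    culture code, e.g. ("caption", "en-US") -> "caption_en_us"."""
    return f"{base}_{culture.lower().replace('-', '_')}"

# Moving from specific to general cultures changes the column names:
language_column("caption", "en-US")  # -> "caption_en_us"
language_column("caption", "en")     # -> "caption_en"
```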

Manual MetaIds

A product that sets MetaIds manually will migrate without modification (Quino will rename the property in the database).

Automatic MetaIds

A product that does not set MetaIds (this has been the default in Quino since version 2) will have a MetaID mismatch because the name has changed.

By default, Quino will migrate by attempting to drop, then re-create multi-language properties. In the case of value-list captions, this is harmless (since the data stored in these tables is generated wholly from the metadata). For actual multi-language properties with user data in them, this is a problem.

The simple solution is to call UseLegacyLanguageMappingFinalizerBuilder() during application configuration to ensure a smooth migration (Quino will rename the property in the database).

Regenerating Code

A product that updates its languages should regenerate code to update any generated language-specific properties. Properties that had previously been generated as, e.g. Caption_en_us will now be Caption_en.

v8.2.1.4675: Default Culture, Metadata Caption, and Validation improvements

The summary below describes major new features, items of note and breaking changes.

  • Resources
  • Issues/Changelog

The links above require a login.


  • Validation failures now reference the metadata (e.g. property) that caused them (QNO-2844)
  • Improve EnsureLoaded() to support more relations (no complement and multiple complements) (QNO-6409)
  • Improve configuration and handling for the default culture (QNO-6416)
  • Improve API, usage, and configurability of DynamicStrings (QNO-6330, QNO-6389, QNO-6399, QNO-6415)

Breaking Changes

DynamicString no longer declares a virtual method named GetValueWithoutFallback(). It has been redefined as an extension method and replaced with TryGetValueWithoutFallback(), which accepts an IExpressionContext. This change will only affect products that have inherited from DynamicString.

v6.1/6.2: Cross-platform, SourceLink and Docker

The summary below describes major new features, items of note and breaking changes.

The links above require a login.

Few changes so far, other than that Quino-Windows targets .NET Framework 4.7.2 and DevExpress 18.2.5.


  • Inline Documentation: Quino packages now consistently include inline/developer documentation, in the form of *.xml files. IDEs use these files to provide real-time documentation in tooltips and code-completion.
  • Nullability Attributes: Attributes like [NotNull], [CanBeNull], [Pure], etc. are now included in Quino assemblies. Tools like R# use these attributes during code analysis to find possible bugs and improve warnings and suggestions.
  • Online Documentation: The online documentation is once again up-to-date. See the release/6 documentation or default documentation (master branch).
  • Debugging: We've laid a bunch of groundwork for SourceLink. If the NuGet server supports this protocol, then Visual Studio automatically offers to download source code for debugging. This feature will be fully enabled in subsequent releases, after we've upgraded our package server.
  • Cross-platform: We've made more improvements to how Quino-Standard compiles and runs under Linux or MacOS. All tests now run on Linux or MacOS as well as Windows.
  • Containers: Improved integration and usage of Docker for local development and on build servers
  • Roslyn: Encodo.Compilers now uses Roslyn to provide compiling services. Quino-Standard uses these from tests to verify generated code. As of this release, C#6 and C#7 features are supported. Also, the compiler support is available on all platforms.

Breaking Changes

  • UseRunModeCommand() is no longer available by default. Applications have to opt in to the rm -debug setting. Please see the Debugging documentation for assistance in converting to the new configuration methods.
  • KeyValueNode.GetValue() no longer returns null; use TryGetValue() instead.
  • KeyValueNode.GetValue() no longer accepts a parameter of type logger. All logging is now done to the central logger. If you still need to collect messages from that operation, then see ConfigurableLoggerExtensions.UseLogger() or ConfigurableLoggerExtensions.UseInMemoryLogger().
  • IDatabaseProperties.Collation is now of type string rather than Collation. This change was made to allow an application to specify exactly the desired collation without having Quino reinterpret it or do matching.
  • Similarly, ISqlServerCollationTools.GetEncodingAndCollation() now returns a tuple of (Encoding, string) rather than a tuple of (Encoding, Collation).
  • The constructor of NamedValueNode has changed. Instead, you should use the abstraction INamedValueNodeTools.CreateNode() or INamedValueNodeTools.CreateRootNode().
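For teams migrating, the GetValue() change follows the standard .NET Try-pattern. The sketch below uses a hypothetical stand-in node type; the actual Quino signatures may differ, and TryGetValue's exact parameter shape here is an assumption.

```csharp
using System;

// Hypothetical stand-in for a Quino key/value node; the real API may differ.
class KeyValueNode
{
    readonly string _value;
    public KeyValueNode(string value) { _value = value; }

    // Assumed Try-pattern shape: returns false instead of yielding null.
    public bool TryGetValue(out string value)
    {
        value = _value;
        return _value != null;
    }
}

static class Migration
{
    static void Main()
    {
        var node = new KeyValueNode("production");

        // Before (6.0): var value = node.GetValue(); if (value != null) { ... }
        // After (6.1):
        if (node.TryGetValue(out var value))
        {
            Console.WriteLine(value);
        }
    }
}
```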
Quino Release Notes

The following is a complete list of all Quino release notes, from newest to oldest. See the roadmap for future releases.

Using Unity, Collab and Git

If you're familiar with the topic, you might be recoiling in horror. It would be unclear, though, whether you're recoiling from the "using Collab" part or the "using Collab with Git" part.

Neither is as straightforward as I'd hoped.

tl;dr: If you have to use Collab with Unity, but want to back it up with Git, disable core.autocrlf1 and add * -text to the repository's .gitattributes file.

Collab's Drawbacks

Collab is the source-control system integrated into the Unity IDE.

It was built for designers to be able to do some version control, but not much more. Even with its limited scope, it's a poor tool.

The functionality horror

  • The system never shows you any differences, neither in the web UI nor the local UI, neither for uncommitted nor committed files.
  • Some changes cannot be reverted. No reason is given.
  • New files can only be deleted via the file system; the UI offers no way to remove them.
  • There is no support for renaming.
  • Reverting to a known commit has worked for me exactly once in about 10 tries. The operation fails with an Error occurred message and no further information. If you really get stuck, your only choice is to restore the entire workspace by re-cloning/re-downloading it.
  • Conflict resolution is poorly supported, although it works better than expected (it integrates with BeyondCompare, thank goodness).

The usability horror

  • The UI only lets you commit all changed files at once.
      • There is no notion of "commits".
      • You can’t commit individual files or chunks.
      • There is no staging area.
      • You can't exclude files.
      • You can ignore them completely, but that doesn't help.
  • The UI is only accessible via mouse from the menu bar.
  • You can sometimes revert folders (sometimes you can't, again with an Error occurred message), but you can't revert arbitrary groups of files.
  • The UI is almost entirely in that custom drop-down menu.
  • You can scroll through your changed files, but you can't expand the menu to show more files at once.
  • You can show a commit history, but there are no diffs. None.
  • There aren't even any diffs in the web version of the UI, which is marginally better, but read-only.

Pair Git with Collab

This is really dangerous, especially with Unity projects. So much in a Unity project lacks a proper "Undo" that you very often want to return to a known-good version.

So what can we do to improve this situation? We would like to use Git instead of Collab.

However, we have to respect the capabilities and know-how of the designers on our team, who don't know how to use Git.

On our current project, there's no time to train everyone on Git—and they already know how to use Collab and don't feel tremendously limited by it.

Remember, any source control is better than no source control. The designers are regularly backing up their work now. In its defense, Collab is definitely better than nothing (or using a file-share or some other weak form of code-sharing).

Instead, those of us who know Git are using Git alongside Collab.

It kind of works...

We started naively, with all of our default settings in Git. Our workflow was:

  1. Pull in Unity/Collab
  2. Fetch from Git/Rebase to head (we actually just use "pull with rebase")

Unfortunately, we would often end up with a ton of files marked as changed in Collab. These were always line-ending differences. As mentioned above, Collab is not a good tool for reverting changes.

The project has time constraints—it's a prototype for a conference, with a hard deadline—so, despite its limitations, we reverted in Collab and updated Git with the line-endings that Collab expected.

We limped along like this for a bit, but with two developers on Git/Collab on Windows and one designer on Collab on Mac, we were spending too much time "fixing up" files. The benefit of having Git was outweighed by the problems it caused with Collab.

Know Your Enemy

So we investigated what was really going on. The following screenshots show that Collab doesn't seem to care about line-endings. They're all over the map.

JSON file with mixed line-endings

CS file with CRLF line-endings

.unity file with LF line-endings

Configuring Git

Git, on the other hand, really cares about line-endings. By default, Git will transform the line-endings in files that it considers to be text files (this part is important later) to the line-ending of the local platform.

In the repository, all text files are LF-only. If you work on MacOS or Linux, line-endings in the workspace are unchanged; if you work on Windows, Git changes all of these line-endings to CRLF on checkout—and back to LF on commit.

Our first "fix" was to turn off the core.autocrlf option in the local Git repository.

git config --local core.autocrlf false

We thought this would fix everything since now Git was no longer transforming our line-endings on commit and checkout.
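For reference, core.autocrlf has three modes. A quick sketch against a throwaway repository, so no global settings are touched:

```shell
# Create a scratch repository so the settings stay local.
git init -q autocrlf-demo && cd autocrlf-demo

# The three modes:
git config --local core.autocrlf true   # checkout converts LF -> CRLF, commit converts CRLF -> LF
git config --local core.autocrlf input  # checkout leaves files alone, commit converts CRLF -> LF
git config --local core.autocrlf false  # no conversion in either direction

# Read back the last value set:
git config --local core.autocrlf
# false
```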

This turned out to be only part of the problem, though. As you can see above, the text files in the repository have an arbitrary mix of line-endings already. Even with the feature turned off, Git was still normalizing line-endings to LF on Windows.

The only change so far was that Git no longer converted LF to CRLF on checkout. Any time we ran git reset, for example, the line-endings in our workspace would still end up different from what was in Git or Collab.

Git: Stop doing stuff

What we really want is for Git to stop changing any line-endings at all.

This isn't part of the command-line configuration, though. Instead, you have to set up .gitattributes. Git has default settings that determine which files it treats as which types. We wanted to adjust these default settings by telling Git that, in this repository, it should treat no files as text.

Once we knew this, it was quite easy to configure. Simply add a .gitattributes file to the root of the repository with the following contents:

* -text

This translates to "do not treat any file as text" (i.e. match all files; disable text-handling).
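Putting both pieces together, a minimal sketch in a scratch repository, verified with git check-attr, which reports the effective attribute for a path:

```shell
git init -q collab-demo && cd collab-demo
git config --local core.autocrlf false

# "* -text": match every file, disable the text attribute.
printf '* -text\n' > .gitattributes

# With -text in effect, Git reports the attribute as "unset" and will
# never convert line-endings for the file.
git check-attr text -- Assets/Scene.unity
# Assets/Scene.unity: text: unset
```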


With these settings, the two developers were able to reset their workspaces and both Git and Collab were happy. Collab is still a sub-par tool, but we can now work with designers and still have Git to allow the developers to use a better workflow.

The designers using only Collab were completely unaffected by our changes.

  1. Technically, I don't think you have to change the autocrlf setting. Turning off text-handling in Git should suffice. However, I haven't tested with this feature left on and, due to time constraints, am not going to risk it.

C# 6 Features and C# 7 Design Notes

Microsoft has recently made a lot of their .NET code open-source. Not only is the code for many of the base libraries open-source, but so is the code for the runtime itself. On top of that, basic .NET development is now much more open to community involvement.

In that spirit, even endeavors like designing the features to be included in the next version of C# are online and open to all: C# Design Meeting Notes for Jan 21, 2015 by Mads Torgerson.

C# 6 Recap

You may be surprised at the version number "7" -- aren't we still waiting for C# 6 to be officially released? Yes, we are.

If you'll recall, the primary feature added to C# 5 was support for asynchronous operations through the async/await keywords. Most .NET programmers are only now getting around to using this rather far- and deep-reaching feature, to say nothing of the new C# 6 features that are almost officially available.

C# 6 brings the following features with it; it can be tried in the CTP versions of Visual Studio 2015 or by building the Roslyn project.

Some of the more interesting features of C# 6 are:

  • Auto-Property Initializers: initialize a property in the declaration rather than in the constructor or on an otherwise unnecessary local variable.
  • Out Parameter Declaration: An out parameter can now be declared inline with var or a specific type. This avoids the ugly variable declaration outside of a call to a Try* method.
  • Using Static Class: using can now be used with a static class as well as a namespace. Direct access to methods and properties of a static class should clean up some code considerably.
  • String Interpolation: Instead of using string.Format() and numbered parameters for formatting, C# 6 allows expressions to be embedded directly in a string (à la PHP): e.g. "{Name} logged in at {Time}"
  • nameof(): This language feature gets the name of the element passed to it; useful for data-binding, logging or anything that refers to variables or properties.
  • Null-conditional operator: This feature reduces conditional, null-checking cruft by returning null when the target of a call is null. E.g. company.People?[0]?.ContactInfo?.BusinessAddress.Street includes three null-checks.
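Several of these features fit in one small program. Note that, as shipped, interpolated strings take a $ prefix (the syntax changed after the early previews this list was written against):

```csharp
using System;

class Person
{
    // Auto-property initializer: the default lives in the declaration.
    public string Name { get; set; } = "Anonymous";
    public Person Manager { get; set; }
}

static class Program
{
    static void Main()
    {
        var person = new Person { Name = "Alice" };

        // String interpolation: expressions embedded directly in the string.
        Console.WriteLine($"{person.Name} logged in at {DateTime.Now:HH:mm}");

        // nameof(): the identifier as a string, refactoring-safe.
        Console.WriteLine(nameof(person.Name)); // prints "Name"

        // Null-conditional operator: evaluates to null instead of throwing
        // when Manager is null.
        var managerName = person.Manager?.Name;
        Console.WriteLine(managerName ?? "(no manager)");
    }
}
```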

Looking ahead to C# 7

If the idea of using await correctly or wrapping your head around the C# 6 features outlined above doesn't already make your poor head spin, then let's move on to language features that aren't even close to being implemented yet.

That said, the first set of design notes for C# 7 by Mads Torgerson include several interesting ideas as well.

  • Pattern-matching: C# has been ogling its similarly named colleague F# for a while. One of the major ideas on the table for C# is improving the ability to represent as well as match against various types of pure data, with an emphasis on immutable data.
  • Metaprogramming: Another focus for C# is reducing boilerplate and capturing common code-generation patterns. They're thinking of delegation of interfaces through composition. Also welcome would be an improvement in the expressiveness of generic constraints.

Related User Voice issues:

  • Expand Generic Constraints for constructors

  • [p]roper (generic) type ali[a]sing

  • Controlling Nullability: Another idea is to be able to declare reference types that can never be null at compile-time (where reasonable -- they do acknowledge that they may end up with a "less ambitious approach").

  • Readonly parameters and locals: Being able to express when change is allowed is a powerful form of expressiveness. C# 7 may include the ability to make local variables and parameters readonly. This will help avoid accidental side-effects.

  • Lambda capture lists: One of the issues with closures is that they currently just close over any referenced variables. The compiler just makes this happen and for the most part works as expected. When it doesn't work as expected, it creates subtle bugs that lead to leaks, race conditions and all sorts of hairy situations that are difficult to debug.

If you throw in the increased use and nesting of lambda calls, you end up with subtle bugs buried in frameworks and libraries that are nearly impossible to tease out.

The idea of this feature is to allow a lambda to explicitly capture variables and perhaps even indicate whether the capture is read-only. Any additional capture would be flagged by the compiler or tools as an error.

  • Contracts(!): And, finally, this is the feature I'm most excited about, because I've been waiting for integrated language support for Design by Contract for literally decades1, ever since I read Object-Oriented Software Construction 2 (OOSC2) for the first time. The design document doesn't say much about it, but mentions that ".NET already has a contract system", the weaknesses of which I've written about before. Torgersen writes:

When you think about how much code is currently occupied with arguments and result checking, this certainly seems like an attractive way to reduce code bloat and improve readability.

...and expressiveness and provability!

There are a bunch of User Voice issues that I can't encourage you enough to vote for so we can finally get this feature:

With some or all of these improvements, C# 7 would move much closer to a provable language at compile-time, an improvement over being a safe language at run-time.

We can already indicate that instance data or properties are readonly. We can already mark methods as static to prevent the use of this. We can use ReSharper [NotNull] attributes to (kinda) enforce non-null references without using structs and incurring the debt of value-passing and -copying semantics.

I'm already quite happy with C# 5, but if you throw in some or all of the stuff outlined above, I'll be even happier. I'll still have stuff I can think of to increase expressiveness -- covariant return types for polymorphic methods or anchored types or relaxed contravariant type-conformance -- but this next set of features being discussed sounds really, really good.

  1. I love the features of the language Eiffel, but haven't ever been able to use it for work. The tools and IDE are a bit stuck in the past (very dated on Windows; X11 required on OS X). The language is super-strong, with native support for contracts, anchored types, null-safe programming, contravariant type-conformance, covariant return types and probably much more that C# is slowly but surely including with each version. Unfair? I've been writing about this progress for years (from newest to oldest):