Source Link Flakiness in Visual Studio 2017 and 2019

tl;dr: If MSBuild/Visual Studio tells you that "the value of SourceRoot.RepositoryUrl is invalid..." and you have no idea what it's talking about, adding the following to the offending project turns the error into a warning.

<PropertyGroup>
   <EnableSourceControlManagerQueries>false</EnableSourceControlManagerQueries>
</PropertyGroup>

Microsoft introduced this fancy new feature called Source Link that integrates with NuGet servers to deliver symbols and source code for packages.

This feature is opt-in and library and package providers are encouraged to enable it and host packages on a server that supports Source Link.

Debugging Experience

The debugging experience is seamless. You can debug into Source-Linked code with barely a pause in debugging.

The only drawback is that you don't have local sources, so it's trickier to set breakpoints in sources that haven't been downloaded yet. When you had local sources, you could open the source file you wanted and set a breakpoint, knowing that the debugger would look for the file in that path and be able to stop on the breakpoint.

Also, Visual Studio's default behavior is to show all debugging sources in a single tab, so when your debug session ends, you no longer have open all of the files you'd looked at. If you hover over the tab, you can figure out the storage location, but it's a long and unintuitive path. Also, it only contains the sources that you've already requested.

Still, it's a neat feature.

Getting Pushy

However, Microsoft is doing some things that suggest that the feature is no longer 100% opt-in. The following error message cropped up in a project with absolutely no Source Link settings or packages. It doesn't even directly use packages that have Source Link enabled (not that that should make a difference).

There are actually three problems here:

  1. The compiler is complaining about Source Link settings on a project that hasn't opted in to Source Link.
  2. The compiler is breaking the build when Source Link cannot be enabled as expected.
  3. The error/warning messages are extremely oblique and give no indication how one should address them. (Another example is the warning message shown below.)

It's the second one that makes this issue so evil. The issue crops up literally out of nowhere, in a project that had been building fine, and then prevents you from working. Even if I wanted Source Link on my project but it wasn't set up correctly, this is no reason to prevent me from running/debugging my product.

And, honestly, because of reason #3, I'm still not sure what the actual problem is or how I can address it with anything but a workaround.

Because, yes, I found a workaround. Otherwise, I wouldn't be writing this article.

Things that Didn't Work

The first time I encountered this and lost hours of precious time, I "fixed" it by removing Source Link support for some packages that my product imports. At the time, I thought I was getting the error message because TeamCity was producing corrupted packages when Source Link was included. It was not a quick fix to open up a different solution, remove Source Link support and re-build all packages on CI, but it seemed to work.

Upon reflection and further reading, this is unlikely to have been the real reason I was seeing the message or why it magically went away. Source Link support in a NuGet server involves having access to source control in order to be able to retrieve the requested sources.

It's honestly still unclear to me why Visual Studio/MSBuild is complaining about this at build-time in a local environment.

The Workaround

Today, I got the error again, in a different project. The packages I'd suspected yesterday were not included in this product. Another, very similar product used the exact same set of packages without a problem.

Even though the issue "Using SourceLink without .git directory" isn't really the issue I'm having, I eventually started copying stuff from the answers into the project in my solution that failed to build.

Add the following to any of the offending projects and the error becomes a warning.

<PropertyGroup>
   <EnableSourceControlManagerQueries>false</EnableSourceControlManagerQueries>
</PropertyGroup>

The ensuing warning? I can't help you there. I threw a few other directives into the project file, but to no avail. I'm not happy to have a compile warning for a feature I never enabled and cannot disable, but I'm hoping that Microsoft will fix this sooner rather than later.

Reclaiming Disk Space in Windows 10

About five years ago, I wrote Who's using up my entire SSD?. Much of the advice given in that article is still applicable, but I have an update for more modern applications and packages.

Overview

I use the following tactics to manage free space on my Windows machine:

  • Use the "Disk Cleanup" tool included with Windows
  • Use the "PatchCleaner" tool to remove unneeded packages from the Windows/Installer folder
  • Use "TreeSizeFree" to find out where else you're losing space (usually AppData)
  • Move settings/caches to another drive (e.g. "ReSharper")
  • Clear "NuGet" caches
  • Uninstall other unused software (especially Windows and .NET SDKs)

Disk Cleanup

This tool is available from the Windows Menu and works pretty well for basic cleanup.

  • Use "Clean up System Files" to clean not only user files, but also system files
  • Use it after installing larger Windows Updates because it will often offer to clean up multiple gigabytes from the Windows Update cache after such an update
  • For some reason, it doesn't reliably empty my recycle bin or temp folder, so check afterward to see whether you still have GBs lying around in there

PatchCleaner

The PatchCleaner utility determines which patches in the Windows/Installer folder correspond to installed software. Windows does not clean this up, so it's possible that patches are still lying around that correspond to very old software that has either already been uninstalled or that can no longer be uninstalled using that patch.

The software offers to move the patches to another drive/folder or simply delete them. I've been using the utility for over half a year and have never had a single problem with Windows (i.e. I've never had to restore any of the packages that PatchCleaner removed). At first, I moved the patches, now I just delete them.

TreeSizeFree/AppData

I've been using TreeSizeFree by Jam Software for a long time. It's fast and easy to use. I almost always find that, other than the Windows folder, the largest folder is my user's AppData/Local folder.

In order to avoid UAC in Windows, many applications now install to this user folder by default. This is a good thing, generally, but some applications also keep copies of their installations—and then never delete them. This practice can eat a lot of space for applications that are frequently updated.

On my machine, the main culprits are:

  • JetBrains ReSharper
  • Syntevo SmartGit

These applications have since improved their cleanup practices, but it pays to check whether you've still got installers/caches for older versions.

See the Uninstall section below for how to best remove the old versions.

NuGet

If you're using Package References and more-recent versions of NuGet, then you'll have local caches of packages. While this practice saves a lot of hard-drive space by consolidating caches for all solutions, the default location is in the user's AppData/Local/.nuget folder.

You can either clear everything with the following command:

nuget locals all -clear

or you can change the location with the NuGetCachePath environment variable (see "Can the NuGet 3.2 package cache location be changed" for more information).
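
If you'd rather relocate the cache than clear it, the following is a minimal C# sketch that persists the variable at user level (the variable name comes from the answer linked above; the target path is just an example):

using System;

class MoveNuGetCache
{
  static void Main()
  {
    // Persist the variable for the current user so that future NuGet
    // invocations pick it up. Only newly started processes see the change,
    // so restart Visual Studio afterwards.
    Environment.SetEnvironmentVariable(
      "NuGetCachePath",
      @"D:\Caches\NuGet",
      EnvironmentVariableTarget.User);
  }
}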

My cache is 2.2GB right now, but I haven't moved it yet.

ReSharper Caches

By default, ReSharper stores its caches in the user's AppData folder. In the ReSharper options (under General/Caches), you can move that location to another drive. That folder is currently 1.6GB on my machine, which is not insubstantial.

Uninstall

Most manuals about saving disk space start with this step. I've assumed that you're a developer who has already checked this list, but it doesn't hurt to mention it.

  • Open "Apps and Features"
  • Sort by size

Here you might see the older versions of ReSharper or SmartGit that I mentioned above. If so, go ahead and remove them using the uninstallers.

If the uninstallers don't work and you still see them using a lot of space in your AppData folder, then do the following:

  • Note the version that you have installed (or just use the latest)
  • Uninstall all versions
  • Clear the local cache/uninstaller folders in your AppData folder manually
  • Reinstall the latest version
  • You should only see a single installation now, in both the "Apps and Features" list as well as the AppData folder.

You can also gape in awe that "Microsoft SQL Server Management Studio" takes a breathtaking 2.8GB. Shake your head ruefully that you unfortunately need it and can't uninstall it. Or maybe you can? If you have JetBrains Rider, then you also have JetBrains DataGrip, which is an excellent SQL Server client.

SDKs

I mention SDKs explicitly because they can take a lot of space and most of them are completely superfluous—a newer version generally completely replaces an older version, even if you're targeting the older version from a solution.

For example, I had five Windows SDKs on my machine, each of which weighed in at ~2.5GB. These SDKs were for targeting versions of Windows that I'd long since upgraded. Several of them seemed only to be useful for C++ development (which I have occasionally done, but which happens rarely and doesn't target the Windows API very heavily). I was able to discard all of these packages without any drawbacks.

Next up were the dozens of .NET Core and Framework SDKs for older—and sometimes exquisitely specific (e.g. .NET Core 1.0.4 rc4 preview 2)—versions, each weighing in at between 350MB and 500MB. I was able to remove all but the most recent versions, .NET Core 2.2 and .NET Framework 4.7.2. I have projects that target .NET Core 2.0 and 2.1 explicitly, and they are unaffected.

Conclusion

Those are the most up-to-date tips and tricks I've got for managing hard-drive space. I don't try to optimize my main application installations, like Visual Studio or Office. They seem to spread their data over the Program* folders, but I'm not going to touch those as long as I've got other places to optimize.

I've been using and upgrading my Windows image heavily for .NET (and other) software development for years without re-imaging, and currently I've got a total of 161.6GB in use, divvied up as shown below.

Folder               Size    Description
Windows              55GB
Users/marco          25.5GB  Installers, caches, etc.
Users/public         6.5GB   Hyper-V/Linux-subsystem disk image
Program Files (x86)  23.4GB
Program Files        20.8GB
Files                17.4GB  Hibernate file, paging file
ProgramData          9GB

v7.0: Rename/move projects, update namespaces

The summary below describes major new features, items of note and breaking changes.


Highlights

  • NuGet Feed: Users can obtain packages via the NuGet feed link given above. Simply add the link as a source, either in Visual Studio or in the solution's NuGet.config file.
  • Source Link: Packages obtained via the NuGet feed include "Source Link", which is integrated with Visual Studio. When you debug into Quino sources, Visual Studio will ask for permission to download symbols and sources and automatically provide seamless debugging.
  • Migration trigger: An application can now control how and when a database change will trigger a schema migration. It is still highly recommended to use the default behavior, but it is now possible to ignore certain changes on the database side to allow hybrid code-controlled/database-controlled metadata strategies. See QNO-6170 for more information.
  • Multi-platform: Includes several bug-fixes for the support for Linux and MacOS that was added in 6.x. The standard CI pipeline is now a Linux image in a Docker container for both Quino-Standard and Quino-WebApi. Quino-Windows uses a Windows container. One of our developers is using JetBrains Rider on a Mac to develop Quino-WebApi.
  • Model-registration: The process for registering a model has been better codified and documented. Applications will still generally call UseModelBuilder<T>, but support for other configurations (ad-hoc/faked models in tests) derives more clearly from there. See Metadata Architecture in the conceptual documentation for more information.
  • Object Graphs: All graph-traversal, formatting and cloning support can now be replaced/configured/extended by products. The GetAllObjects() and GetFormattedGraph() methods for all hierarchical types were hard-coded in previous versions. Now, products inject the appropriate type (e.g. IExpressionGraphTraversalAlgorithm or IExpressionGraphFormattingAlgorithm) and call methods on these objects instead. Additionally, we've created documentation for how a product can implement support for its own data hierarchies. See Object Graphs in the conceptual documentation for more information.
  • Web Configuration: We've standardized and documented how to extend ASP.Net applications with Quino. See Web Platform in the conceptual documentation. This configuration will once again change in Quino-WebApi 8.0, where we move to ASP.Net Core and are able to leverage even more of their standard configuration.
  • Command-line Tools: The quino command-line utility that replaced the Quino.Utils package in 6.x has also been extended to support both TeamCity and Azure DevOps. We're using this tool both locally (to update versions and standardize projects) as well as in CI (to set version, update nuspec files for .NET Framework projects and to enable documentation). Both SDK and Framework-style project types are fully supported. We plan to extend the tool further to provide more of the functionality that Quino.Utils used to provide (e.g. updating source headers and fixing myriad project properties). See Tools and, in particular, quino fix in the conceptual documentation for more information on where we are headed.
  • Login Behavior: We've improved the naming in this area to align better with the authentication system. A value of None is no longer supported (there is always a user context) and the default is now UseOperator, which runs as (but does not authenticate) the OS user that executed the software. Single sign-on products would use AuthenticateOperator. See QNO-6147 for more information.
  • Data Cursors and Object Lifetimes: We've improved the event-handling in the data pipeline to avoid inadvertently keeping objects in memory. This was particularly a problem for queries that retrieved a large number of objects. In those cases, even using CreateCursor rather than GetList didn't avoid allocating a ton of memory by the end of the iteration. We detected this when indexing data for Lucene support in a custom product. See QNO-6125 and QNO-5425 for more information.
  • Nullability Annotations: All public APIs in Quino-Standard, Quino-Windows and Quino-WebApi now include [NotNull] and [CanBeNull] annotations for parameters and return types. Many APIs also include [Pure] where appropriate. The annotations are retained and delivered with the NuGet packages, where ReSharper makes use of them in dependent products. See QNO-6092 for more information.
  • Data-driver Debugging: Exceptions in the data driver when using RunMode.Debug now stop in the debugger at the point that they are thrown. Previous versions included a global catch/re-throw handler used to maintain statistics. Errors are now added to statistics in debug mode only if a specific option is set, so that proper debugging behavior has priority, rather than the other way around. See QNO-5723 for more information.
  • Legacy Generated-Code Format: The "V1" generated-code format has been removed. As of Quino 6, all known products using Quino have upgraded to the "V2" format.
  • Web Application Shutdown: We fixed a bug whereby Quino applications weren't being disposed in OWIN applications. This led to dead instances retaining open file handles on log files that the ensuing instance would be unable to open. See QNOWEB-71 for more information.
  • Generic and Metadata Controllers: Both of these controllers include incremental improvements that provide more robust information to generic clients (e.g. the Quino Web Client). We made many improvements to validation, captions and data retrieval when ramping up to production with several major products based on these technologies.
  • DevExpress Component Upgrade: Quino-Windows now references DevExpress 18.2.5 instead of 15.1.7. DevExpress packages are now available from their own NuGet feed, greatly easing distribution. For backward-compatibility for products that do not wish to pay for a license upgrade, the NuGet feed for Quino-Windows includes pre-release versions of all packages with a version of 7.0.0-devex*. See QNOWIN-243 for more information.

Breaking Changes

The recommended upgrade path is as follows:

  • Use the NuGet Package Manager to update to the released version of Quino 7 (7.0.1.1115)
  • Use the NuGet Package Manager to uninstall any direct references to Encodo.* packages. Make note of which packages you've uninstalled.
  • Install the corresponding Quino.*.Core package for the Encodo.* packages you uninstalled in the step above.
  • If necessary, install the remote-data packages, as described in "Package names" below.
  • Install Quino.Processes if you were using the IProcessManager anywhere.
  • Use Visual Studio's Ctrl + R, G to clean up invalid namespaces.
  • Use Visual Studio's Ctrl + . or ReSharper's Alt + Enter to include the updated namespaces. This may take a while, but is reliable and not complicated.

Runtime targets

All Quino.WebApi and Quino.Windows packages now target .NET Framework 4.7.2. We made this change to improve interoperability with .NET Standard and .NET Core packages, on a recommendation from Microsoft. See QNOWIN-241 for more information.

Package names

We renamed all Encodo.* packages to Quino.*.Core. Since this change does not affect high-level packages, most solutions should be largely unaffected. However, if a solution had included one of the Encodo.* assemblies directly, then you need to manually remove that reference and include the corresponding Quino.*.Core package instead.

Additionally, we reduced the surface area of Quino.Application.Core (previously named Encodo.Application) by moving significant parts into sub-packages:

  • Quino.Configuration: contains all support for IKeyValueNode<T> nodes and for loading/managing configuration
  • Quino.Feedback: IFeedback and supporting types and methods
  • Quino.CommandLine: all command-line support

Quino.Application.Core still depends on these three packages, but they can now also be used independently of pulling in the full application support.

We also replaced the following packages:

  • Quino.Data.Remoting
  • Quino.Data.Remoting.Json
  • Quino.Server

with the following packages:

  • Quino.Data.Remote.Client
  • Quino.Data.Remote.Server
  • Quino.Protocol.Json
  • Quino.Protocol.Binary

Clients and servers should instead include the client or server package, respectively, plus the desired protocol package(s). Configure the server with

application.UseRemoteServer().UseJsonProtocol()

and the client with

application.UseRemoteClient().UseJsonProtocol()

Namespaces

All namespaces now begin with Encodo.Quino. Types that used to begin with just Encodo (no Quino) are now in the Encodo.Quino namespace. For example, Encodo.Core.Bit is now Encodo.Quino.Core.Bit.
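
In practice, that means updating using directives. For example:

// Quino 6 and earlier:
//   using Encodo.Core;

// Quino 7: the Quino segment is now part of the namespace.
using Encodo.Quino.Core;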

Types

  • IProcessManager is no longer in the Encodo.Application package. It is now in the Quino.Processes package.
  • IEventAggregator is no longer in the Encodo.Core package. It is now in the Quino.Processes package.
  • IPayloadFactory and its associated types are no longer in the Encodo.Connections package (nor is it the renamed Quino.Connections.Core package). Instead, you can find these base types in the Quino.Protocol.Core package.

Metadata

  • IMetaPropertyPath no longer extends IList<IMetaProperty> and is now immutable. The implementation MetaPropertyPath is also now immutable. Use the GetFirst(), GetFullPath() and ToList() extension methods to get information about the path.
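
A hedged usage sketch follows; only the member names come from these notes, while the exact signatures and return types are assumptions:

// IMetaPropertyPath is now immutable; read it via the extension methods.
void Describe(IMetaPropertyPath path)
{
  var first = path.GetFirst();        // the first property in the path
  var fullPath = path.GetFullPath();  // e.g. for display or logging
  var properties = path.ToList();     // materialize as a list to iterate/index
}
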
v6.1/6.2: Cross-platform, SourceLink and Docker

The summary below describes major new features, items of note and breaking changes.


Few changes so far, other than that Quino-Windows targets .NET Framework 4.7.2 and DevExpress 18.2.5.

Highlights

  • Inline Documentation: Quino packages now consistently include inline/developer documentation, in the form of *.xml files. IDEs use these files to provide real-time documentation in tooltips and code-completion.
  • Nullability Attributes: Attributes like [NotNull], [CanBeNull], [Pure], etc. are now included in Quino assemblies. Tools like R# use these attributes during code-analysis to find possible bugs and improve warnings and suggestions.
  • Online Documentation: The online documentation is once again up-to-date. See the release/6 documentation or default documentation (master branch).
  • Debugging: We've laid a bunch of groundwork for SourceLink. If the NuGet server supports this protocol, then Visual Studio automatically offers to download source code for debugging. This feature will be fully enabled in subsequent releases, after we've upgraded our package server.
  • Cross-platform: We've made more improvements to how Quino-Standard compiles and runs under Linux or MacOS. All tests now run on Linux or MacOS as well as Windows.
  • Containers: Improved integration and usage of Docker for local development and on build servers
  • Roslyn: Encodo.Compilers now uses Roslyn to provide compiling services. Quino-Standard uses these from tests to verify generated code. As of this release, C# 6 and C# 7 features are supported. Also, the compiler support is available on all platforms.

Breaking Changes

  • UseRunModeCommand() is no longer available by default. Applications have to opt in to the rm -debug setting. Please see the Debugging documentation for assistance in converting to the new configuration methods.
  • KeyValueNode.GetValue() no longer returns null; use TryGetValue() instead (see the sketch after this list).
  • KeyValueNode.GetValue() no longer accepts an ILogger parameter. All logging is now done to the central logger. If you still need to collect messages from that operation, then see ConfigurableLoggerExtensions.UseLogger() or ConfigurableLoggerExtensions.UseInMemoryLogger().
  • IDatabaseProperties.Collation is now of type string rather than Collation. This change was made to allow an application to specify exactly the desired collation without having Quino reinterpret it or do matching.
  • Similarly, ISqlServerCollationTools.GetEncodingAndCollation() now returns a tuple of (Encoding, string) rather than a tuple of (Encoding, Collation).
  • The constructor of NamedValueNode has changed. Instead, you should use the abstraction INamedValueNodeTools.CreateNode() or INamedValueNodeTools.CreateRootNode().
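
The following is a minimal migration sketch for the GetValue() change; the method names come from the notes above, but the exact TryGetValue() signature is an assumption:

void ReadSetting(IKeyValueNode<string> node)
{
  // Quino 6.0: var value = node.GetValue(); // could return null
  // Quino 6.1/6.2: the Try-pattern makes the missing-value case explicit.
  if (node.TryGetValue(out var value))
  {
    Console.WriteLine(value);
  }
}
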
Quino Release Notes

The following is a complete list of all Quino release notes, from newest to oldest. See the roadmap for future releases.

Using Unity, Collab and Git

If you're familiar with the topic, you might be recoiling in horror. It would be unclear, though, whether you're recoiling from the "using Collab" part or the "using Collab with Git" part.

Neither is as straightforward as I'd hoped.

tl;dr: If you have to use Collab with Unity, but want to back it up with Git, disable core.autocrlf[1] and add * -text to the .gitattributes.

Collab's Drawbacks

Collab is the source-control system integrated into the Unity IDE.

It was built for designers to be able to do some version control, but not much more. Even with its limited scope, it's a poor tool.

The functionality horror

  • The system does not ever show you any differences, neither in the web UI nor the local UI, neither for uncommitted nor committed files
  • Some changes cannot be reverted. No reason is given.
  • You can only delete new files from the file system.
  • There is no support for renaming
  • Reverting to a known commit has worked for me exactly once out of about 10 tries. The operation fails with an Error occurred and no further information. If you really get stuck, your only choice is to restore the entire workspace by re-cloning/re-downloading it.
  • Conflict resolution is poorly supported, although it works better than expected (it integrates with BeyondCompare, thank goodness).

The usability horror

  • The UI only lets you commit all changed files at once.
      • There is no notion of "commits".
      • You can’t commit individual files or chunks.
      • There is no staging area.
      • You can't exclude files.
      • You can ignore them completely, but that doesn't help.
  • The UI is only accessible via mouse from the menu bar.
  • You can sometimes revert folders (sometimes you can't, again with an Error occurred message), but you can't revert arbitrary groups of files.
  • The UI is almost entirely in that custom drop-down menu.
  • You can scroll through your changed files, but you can't expand the menu to show more files at once.
  • You can show a commit history, but there are no diffs. None.
  • There aren't even any diffs in the web version of the UI, which is marginally better, but read-only.

Pair Git with Collab

This is really dangerous, especially with Unity projects. There is so much in a Unity project without a proper "Undo" that you very often want to return to a known good version.

So what can we do to improve this situation? We would like to use Git instead of Collab.

However, we have to respect the capabilities and know-how of the designers on our team, who don't know how to use Git.

On our current project, there's no time to train everyone on Git—and they already know how to use Collab and don't feel tremendously limited by it.

Remember, any source control is better than no source control. The designers are regularly backing up their work now. In its defense, Collab is definitely better than nothing (or using a file-share or some other weak form of code-sharing).

Instead, those of us who know Git are using Git alongside Collab.

It kind of works...

We started naively, with all of our default settings in Git. Our workflow was:

  1. Pull in Unity/Collab
  2. Fetch from Git/Rebase to head (we actually just use "pull with rebase")

Unfortunately, we would often end up with a ton of files marked as changed in Collab. These were always line-ending differences. As mentioned above, Collab is not a good tool for reverting changes.

The project has time constraints—it's a prototype for a conference, with a hard deadline—so, despite its limitations, we reverted in Collab and updated Git with the line-endings that Collab expected.

We limped along like this for a bit, but with two developers on Git/Collab on Windows and one designer on Collab on Mac, we were spending too much time "fixing up" files. The benefit of having Git was outweighed by the problems it caused with Collab.

Know Your Enemy

So we investigated what was really going on. The following screenshots show that Collab doesn't seem to care about line-endings. They're all over the map.

JSON file with mixed line-endings

CS file with CRLF line-endings

.unity file with LF line-endings

Configuring Git

Git, on the other hand, really cares about line-endings. By default, Git will transform the line-endings in files that it considers to be text files (this part is important later) to the line-ending of the local platform.

In the repository, all text files are LF-only. If you work on MacOS or Linux, line-endings in the workspace are unchanged; if you work on Windows, Git changes all of these line-endings to CRLF on checkout—and back to LF on commit.

Our first "fix" was to turn off the core.autocrlf option in the local Git repository.

git config --local core.autocrlf false

We thought this would fix everything since now Git was no longer transforming our line-endings on commit and checkout.

This turned out to be only part of the problem, though. As you can see above, the text files in the repository have an arbitrary mix of line-endings already. Even with the feature turned off, Git was still normalizing line-endings to LF on Windows.

The only thing we'd changed so far was that Git no longer converted LF to CRLF in our workspaces. Any time we ran git reset, for example, the line-endings in our workspace would still end up different from what was in Git or Collab.

Git: Stop doing stuff

What we really want is for Git to stop changing any line-endings at all.

This isn't part of the command-line configuration, though. Instead, you have to set up .gitattributes. Git has default settings that determine which files it treats as which types. We wanted to adjust these default settings by telling Git that, in this repository, it should treat no files as text.

Once we knew this, it's quite easy to configure. Simply add a .gitattributes file to the root of the repository, with the following contents:

* -text

This translates to "do not treat any file as text" (i.e. match all files; disable text-handling).

Conclusion

With these settings, the two developers were able to reset their workspaces and both Git and Collab were happy. Collab is still a sub-par tool, but we can now work with designers and still have Git to allow the developers to use a better workflow.

The designers using only Collab were completely unaffected by our changes.



  1. Technically, I don't think you have to change the autocrlf setting. Turning off text-handling in Git should suffice. However, I haven't tested with this feature left on and, due to time-constraints, am not going to risk it.

C# 6 Features and C# 7 Design Notes

Microsoft has recently made a lot of their .NET code open-source. Not only is the code for many of the base libraries open-source but also the code for the runtime itself. On top of that, basic .NET development is now much more open to community involvement.

In that spirit, even endeavors like designing the features to be included in the next version of C# are online and open to all: C# Design Meeting Notes for Jan 21, 2015 by Mads Torgerson.

C# 6 Recap

You may be surprised at the version number "7" -- aren't we still waiting for C# 6 to be officially released? Yes, we are.

If you'll recall, the primary feature added to C# 5 was support for asynchronous operations through the async/await keywords. Most .NET programmers are only getting around to using this rather far- and deep-reaching feature, to say nothing of the new C# 6 features that are almost officially available.

C# 6 brings the following features with it and can be used in the CTP versions of Visual Studio 2015 or downloaded from the Roslyn project.

Some of the more interesting features of C# 6 are:

  • Auto-Property Initializers: initialize a property in the declaration rather than in the constructor or on an otherwise unnecessary local variable.
  • Out Parameter Declaration: An out parameter can now be declared inline with var or a specific type. This avoids the ugly variable declaration outside of a call to a Try* method. (This feature was ultimately cut from C# 6 and shipped in C# 7.)
  • Using Static Class: using can now be used with a static class as well as a namespace. Direct access to methods and properties of a static class should clean up some code considerably.
  • String Interpolation: Instead of using string.Format() and numbered parameters for formatting, C# 6 allows expressions to be embedded directly in a string (à la PHP): e.g. $"{Name} logged in at {Time}"
  • nameof(): This language feature gets the name of the element passed to it; useful for data-binding, logging or anything that refers to variables or properties.
  • Null-conditional operator: This feature reduces conditional, null-checking cruft by returning null when the target of a call is null. E.g. company.People?[0]?.ContactInfo?.BusinessAddress.Street includes three null-checks
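
As a quick illustration, here's a hedged sketch (using the final C# 6 syntax, which differs slightly from the CTP builds) that combines several of these features:

using System;
using static System.Console; // using static: call WriteLine directly

public class Person
{
  // Auto-property initializers
  public string Name { get; set; } = "anonymous";
  public DateTime LoggedInAt { get; set; } = DateTime.UtcNow;
  public Person Manager { get; set; }

  // String interpolation and nameof()
  public override string ToString() => $"{Name} logged in at {LoggedInAt} ({nameof(Person)})";
}

public static class Program
{
  public static void Main()
  {
    var person = new Person();

    // Null-conditional operator: evaluates to null instead of throwing
    // when Manager is null.
    WriteLine(person.Manager?.Name ?? "no manager");
    WriteLine(person);
  }
}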

Looking ahead to C# 7

If the idea of using await correctly or wrapping your head around the C# 6 features outlined above doesn't already make your poor head spin, then let's move on to language features that aren't even close to being implemented yet.

That said, the first set of design notes for C# 7 by Mads Torgerson include several interesting ideas as well.

  • Pattern-matching: C# has been ogling its similarly named colleague F# for a while. One of the major ideas on the table for C# is improving the ability to represent as well as match against various types of pure data, with an emphasis on immutable data.
  • Metaprogramming: Another focus for C# is reducing boilerplate and capturing common code-generation patterns. They're thinking of delegation of interfaces through composition. Also welcome would be an improvement in the expressiveness of generic constraints.

Related User Voice issues:

  • Expand Generic Constraints for constructors

  • [p]roper (generic) type ali[a]sing

  • Controlling Nullability: Another idea is to be able to declare reference types that can never be null at compile-time (where reasonable -- they do acknowledge that they may end up with a "less ambitious approach").

  • Readonly parameters and locals: Being able to express when change is allowed is a powerful form of expressiveness. C# 7 may include the ability to make local variables and parameters readonly. This will help avoid accidental side-effects.

  • Lambda capture lists: One of the issues with closures is that they currently just close over any referenced variables. The compiler just makes this happen and for the most part works as expected. When it doesn't work as expected, it creates subtle bugs that lead to leaks, race conditions and all sorts of hairy situations that are difficult to debug.

If you throw in the increased use of and nesting of lambda calls, you end up with subtle bugs buried in frameworks and libraries that are nearly impossible to tease out.

The idea of this feature is to allow a lambda to explicitly capture variables and perhaps even indicate whether the capture is read-only. Any additional capture would be flagged by the compiler or tools as an error.

  • Contracts(!): And, finally, this is the feature I'm most excited about because I've been waiting for integrated language support for Design by Contract for literally decades[1], ever since I read Object-Oriented Software Construction 2 (OOSC2) for the first time. The design document doesn't say much about it, but mentions that ".NET already has a contract system", the weaknesses of which I've written about before. Torgersen writes:

When you think about how much code is currently occupied with arguments and result checking, this certainly seems like an attractive way to reduce code bloat and improve readability.

...and expressiveness and provability!

There are a bunch of User Voice issues that I can't encourage you enough to vote for so we can finally get this feature:

With some or all of these improvements, C# 7 would move much closer to a provable language at compile-time, an improvement over being a safe language at run-time.

We can already indicate that instance data or properties are readonly. We can already mark methods as static to prevent the use of this. We can use ReSharper [NotNull] attributes to (kinda) enforce non-null references without using structs and incurring the debt of value-passing and -copying semantics.

I'm already quite happy with C# 5, but if you throw in some or all of the stuff outlined above, I'll be even happier. I'll still have stuff I can think of to increase expressiveness -- covariant return types for polymorphic methods or anchored types or relaxed contravariant type-conformance -- but this next set of features being discussed sounds really, really good.



  1. I love the features of the language Eiffel, but haven't ever been able to use it for work. The tools and IDE are a bit stuck in the past (very dated on Windows; X11 required on OS X). The language is super-strong, with native support for contracts, anchored types, null-safe programming, contravariant type-conformance, covariant return types and probably much more that C# is slowly but surely including with each version. Unfair? I've been writing about this progress for years (from newest to oldest):

Breaking Changes in C#

Due to the nature of the language, there are some API changes that almost inevitably lead to breaking changes in C#.

Change constructor parameters

While you can easily make another constructor, marking the old one(s) as obsolete, if you use an IOC that allows only a single public constructor, you're forced to either

  • remove the obsolete constructor or
  • mark the obsolete constructor as protected.

In either case, the user has a compile error.
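
Here's a hedged sketch of the dilemma; Importer, IDataSource and ILogger are illustrative names, not from any particular API:

public interface IDataSource { }
public interface ILogger { }

public class Importer
{
  // The new, preferred constructor.
  public Importer(IDataSource source) { }

  // The old constructor. An IOC container that allows only a single public
  // constructor forces us to remove this or make it protected; either
  // option is a compile error for existing callers.
  [System.Obsolete("Use Importer(IDataSource) instead.")]
  protected Importer(IDataSource source, ILogger logger)
    : this(source) { }
}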

Virtual methods/Interfaces

There are several known issues with introducing new methods or changing existing methods on an existing interface. For many of these situations, there are relatively smooth upgrade paths.

I encountered a situation recently that I thought worth mentioning. I wanted to introduce a new overload on an existing type.

Suppose you have the following method:

bool TryGetValue<T>(
  out T value,
  TKey key = default(TKey), 
  [CanBeNull] ILogger logger = null
);

We would like to remove the logger parameter. So we deprecate the method above and declare the new method.

bool TryGetValue<T>(
  out T value, 
  TKey key = default(TKey)
);
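
At this point, a hypothetical caller that omits the logger matches both overloads:

// Ambiguous: the old overload (logger defaults to null) and the new
// overload both match this call. ("settings" is a hypothetical instance
// of the type that declares these methods.)
string value;
if (settings.TryGetValue(out value))
{
  // ...
}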

Now the compiler/ReSharper notifies you that there will be an ambiguity if a caller does not pass a logger. How to resolve this? Well, we can just remove the default value for that parameter in the obsolete method.

bool TryGetValue<T>(
  out T value,
  TKey key = default(TKey),
  [CanBeNull] ILogger logger
);

But now you've got another problem: The parameter logger cannot come after the key parameter because it doesn't have a default value.

So, now you'd have to move the logger parameter in front of the key parameter. This will cause a compile error in clients, which is what we were trying to avoid in the first place.

In this case, we have a couple of sub-optimal options.

Multiple Releases

Use a different name for the new API (e.g. TryGetValueEx à la Windows) in the next major version, then switch the name back in the version after that and finally remove the obsolete member in yet another version.

That is,

  • in version n, TryGetValue (with logger) is obsolete and users are told to use TryGetValueEx (no logger)
  • in version n+1, TryGetValueEx (no logger) is obsolete and users are told to use TryGetValue (no logger)
  • in version n+2, we finally remove TryGetValueEx.
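
Sketched in code, the progression looks something like this (ISettings and ILogger are illustrative names):

public interface ILogger { }

// Version n: the Ex variant carries the new signature.
public interface ISettings<TKey>
{
  [System.Obsolete("Use TryGetValueEx instead; the logger is no longer needed.")]
  bool TryGetValue<T>(out T value, TKey key = default(TKey), ILogger logger = null);

  bool TryGetValueEx<T>(out T value, TKey key = default(TKey));
}

// Version n+1: the names swap, with TryGetValueEx now the obsolete alias.
// Version n+2: TryGetValueEx disappears entirely.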

This is a lot of work and requires three upgrades to accomplish. You really need to stay on the ball in order to get this kind of change integrated and it takes a non-trivial amount of time and effort.

We generally don't use this method, as our customers are developers and can deal with a compile error or two, especially when it's noted in the release notes and the workaround is fairly obvious (e.g. the logger parameter is just no longer required).

Remove instead of deprecating

Accept that there will be a compile error and soften the landing as much as possible for customers by noting it in the release notes.

QQL: A Query Language for Quino

In late 2011 and early 2012, Encodo designed a querying language for Quino. Quino has an ORM that, combined with .NET Linq, provides a powerful querying interface for developers. QQL is a DSL that brings this power to non-developers.

QQL never made it to implementation---only specification. In the meantime, the world moved on and we have common, generic querying APIs like OData. The time for QQL is past, but the specification is still an interesting artifact, in its own right.

Who knows? Maybe we'll get around to implementing some of it, at some point.

At any rate, you can download the specification from the downloads section.

The following excerpts should give you an idea of what you're in for, should you download and read the 80-page document.

Details

The TOC lists the following top-level chapters:

  1. Introduction
  2. Examples
  3. Context & Scopes
  4. Standard Queries
  5. Grouping Queries
  6. Evaluation
  7. Syntax
  8. Data Types and Operators
  9. Libraries
  10. Best Practices
  11. Implementation Details
  12. Future Enhancements

From the abstract in the document:

The Quino Query Language (QQL) defines a syntax and semantics for formulating data requests against hierarchical data structures. It is easy to read and learn both for those familiar with SQL and non-programmers with a certain capacity for abstract thinking (i.e. power users). Learning only a few basic rules is enough to allow a user to quickly determine which data will be returned by all but the more complex queries. As with any other language, more complex concepts result in more complex texts, but the syntax of QQL limits these cases.

From the overview:

QQL defines a syntax and semantics for writing queries against hierarchical data structures. A query describes a set of data by choosing an initial context in the data and specifying which data are to be returned and how the results are to be organized. An execution engine generates this result by applying the query to the data.

Examples

Standard Projections

The following is from chapter 2.1, "Simple Standard Query":

The following query returns the first and last name of all active people as well as their 10 most recent time entries, reverse-sorted first by last name, then by first name.

Person
{
  select
  {
    FirstName; LastName;
    Sample:= TimeEntries { orderby Date desc; limit 10 }
  }
  where Active
  orderby
  {
    LastName desc;
    FirstName desc;
  }
}

In chapter 2, there are also "2.2 Intermediate Standard Query" and "2.3 Complex Standard Query" examples.

Grouping Projections

The following is from chapter 2.4, "Simple Grouping Query":

The following query groups active people by last name and returns the age of the youngest person and the maximum contracts for each last name. Results are ordered by the maximum contracts for each group and then by last name.

group Person
{
  groupby LastName;
  select
  {
    default;
    Age:= (Now - BirthDate.Min).Year;
    MaxContracts:= Contracts.Count.Max
  }
  where Active;
  orderby
  {
    MaxContracts desc;
    LastName desc;
  }
}

In chapter 2, there are also "2.5 Complex Grouping Query", "2.6 Standard Query with Grouping Query" and "2.7 Nested Grouping Queries" examples.

Version numbers in .NET Projects

Any software product should have a version number. This article will answer the following questions about how Encodo works with them.

  • How do we choose a version number?
  • What parts does a version number have?
  • What do these parts mean?
  • How do different stakeholders interpret the number?
  • What conventions exist for choosing numbers?
  • Who chooses and sets these parts?

Stakeholders

In decreasing order of expected expertise,

  • Developers: write the software; may change version numbers
  • Testers: test the software; highly interested in version numbers that make sense
  • End users: use the software as a black box

The intended audience of this document is developers.

Definitions and Assumptions

  • Build servers, not developer desktops, produce artifacts
  • The source-control system is Git
  • The quino command-line tool is installed on all machines. This tool can read and write version numbers for any .NET solution, regardless of which of the many version-numbering methods a given solution actually uses.
  • A software library is a package or product that has a developer as an end user
  • A breaking change in a software library causes one of the following
    • a build error
    • an API to behave differently in a way that cannot be justified as a bug fix

Semantic versions

Encodo uses semantic versions. This scheme has a strict ordering that allows you to determine which version is "newer". It indicates pre-releases (e.g. alphas, betas, rcs) with a "minus", as shown below.

Version numbers come in two flavors:

  • Official releases: [Major].[Minor].[Patch].[Build]
  • Pre-releases: [Major].[Minor].[Patch]-[Label][Build]

See Microsoft's NuGet Package Version Reference for more information.

Examples

  • 0.9.0-alpha34: A pre-release of 0.9.0
  • 0.9.0-beta48: A pre-release of 0.9.0
  • 0.9.0.67: An official release of 0.9.0
  • 1.0.0-rc512: A pre-release of 1.0.0
  • 1.0.0.523: An official release of 1.0.0

The numbers are strictly ordered. The first three parts indicate the "main" version. The final part counts strictly upward.
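
A quick check of these rules, sketched with the NuGet.Versioning package (class and member names assumed from that library):

using System;
using NuGet.Versioning;

public static class VersionOrdering
{
  public static void Main()
  {
    var alpha = NuGetVersion.Parse("0.9.0-alpha34");
    var beta = NuGetVersion.Parse("0.9.0-beta48");
    var release = NuGetVersion.Parse("0.9.0.67");

    // Labels compare alphabetically; any pre-release of a given version
    // precedes the official release.
    Console.WriteLine(alpha < beta);   // True
    Console.WriteLine(beta < release); // True
  }
}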

Parts

The following list describes each of the parts and explains what to expect when it changes.

Build

  • Identifies the build task that produced the artifact
  • Strictly increasing

Label

  • An arbitrary designation for the "type" of pre-release

Patch

  • Introduces bug fixes but no features or API changes
  • May introduce obsolete members
  • May not introduce breaking changes

This part is also known as "Maintenance" (see Software versioning on Wikipedia).

Minor

  • Introduces new features that extend existing functionality
  • May include bug fixes
  • May cause minor breaking changes
  • May introduce obsolete members that cause compile errors
  • Workarounds must be documented in release notes or obsolete messages

Major

  • Introduces major new features
  • Introduces breaking changes that require considerable effort to integrate
  • Introduces a new data or protocol format that requires migration

Conventions

Uniqueness for official releases

There will only ever be one artifact of an official release corresponding to a given "main" version number.

That is, if 1.0.0.523 exists, then there will never be a 1.0.0.524. This is due to the fact that the build number (e.g. 524) is purely for auditing.

For example, suppose your software uses a NuGet package with version 1.0.0.523. NuGet will not offer to upgrade to 1.0.0.524.

Pre-release Labels

There are no restrictions on the labels for pre-releases. However, it's recommended to use one of the following:

  • alpha
  • beta
  • rc

Be aware that if you choose a different label, then it is ordered alphabetically relative to the other pre-releases.

For example, if you were to use the label pre-alpha to produce the version 0.9.0-prealpha21, then that version is considered to be higher than 0.9.0-alpha34. A tool like NuGet will not see the latter version as an upgrade.

Release branches

The name of a release branch should be the major version of that release. E.g. release/1 for version 1.x.x.x.

Pre-release branches

The name of a pre-release branch should be of the form feature/[label] where [label] is one of the labels recommended above. It's also OK to use a personal branch to create a pre-release build, as in mvb/[label].

Setting the base version

A developer uses the quino tool to set the version.

For example, to set the version to 1.0.1, execute the following:

quino fix -v 1.0.1.0

The tool updates the version number in all relevant files.

Calculating final version

The build server calculates a release's version number as follows,

  • major: Taken from solution
  • minor: Taken from solution
  • maintenance: Read from solution
  • label: Taken from the Git branch (see below for details)
  • build: Provided by the build server

Git Branches

The name of the Git branch determines which kind of release to produce.

  • If the name of the branch matches the glob **/release/*, then it's an official release
  • Everything else is a pre-release

For example,

  • origin/release/1
  • origin/production/release/new
  • origin/release/
  • release/1
  • production/release/new
  • release/

The name of the branch doesn't influence the version number since an official release doesn't have a label.

Pre-release labels

The label is taken from the last part of the branch name.

For example,

  • origin/feature/beta yields beta
  • origin/feature/rc yields rc
  • origin/mvb/rc yields rc

The following algorithm ensures that the label can be part of a valid semantic version.

  • Remove invalid characters
  • Append an X after a trailing digit
  • Use X if the label is empty (or becomes empty after having removed invalid characters)

For example,

  • origin/feature/rc1 yields rc1X
  • origin/feature/linux_compat yields linuxcompat
  • origin/feature/12 yields X
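
A minimal C# sketch of that algorithm, assuming letters and digits are the only valid characters and (to match the last example) treating an all-numeric label as empty:

using System;
using System.Linq;

public static class PrereleaseLabels
{
  public static string FromBranch(string branchName)
  {
    // The label is the last part of the branch name.
    string label = branchName.Split('/').Last();

    // Remove invalid characters.
    label = new string(label.Where(char.IsLetterOrDigit).ToArray());

    // Assumption: an all-numeric label counts as empty, which matches the
    // "origin/feature/12 yields X" example above.
    if (label.All(char.IsDigit))
    {
      label = string.Empty;
    }

    // Use X if the label is empty.
    if (label.Length == 0)
    {
      return "X";
    }

    // Append an X after a trailing digit.
    if (char.IsDigit(label[label.Length - 1]))
    {
      label += "X";
    }

    return label;
  }

  public static void Main()
  {
    Console.WriteLine(FromBranch("origin/feature/rc1"));          // rc1X
    Console.WriteLine(FromBranch("origin/feature/linux_compat")); // linuxcompat
    Console.WriteLine(FromBranch("origin/feature/12"));           // X
  }
}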

Examples

Assume that,

  • the version number in the solution is 0.9.0.0
  • the build counter on the build server is at 522

Then,

  • Deploying from branch origin/release/1 produces artifacts with version number 0.9.0.522
  • Deploying from branch origin/feature/rc produces artifacts with version number 0.9.0-rc522

Release Workflow

The following are very concise guides for how to produce artifacts.

Pre-release

  • Ensure you are on a non-release branch (e.g. feature/rc, master)
  • Verify or set the base version (e.g. quino fix -v 1.0.2.0)
  • Push any changes to Git
  • Execute the "deploy" task against your branch on the build server

Release

  • Ensure you are on a release branch (e.g. release/1)
  • Verify or set the base version (e.g. quino fix -v 1.0.2.0)
  • Push any changes to Git
  • Execute the "deploy" task against your branch on the build server