Quick Links

Recent Changes

    22.1.2019 - Using Unity, Collab and Git

    If you're familiar with the topic, you might be recoiling in horror. It would be unclear, though, whether you're recoiling from the "using Collab" part or the "using Collab with Git" part.

    Neither is as straightforward as I'd hoped.

    tl;dr: If you have to use Collab with Unity, but want to back it up with Git, disable core.autocrlf[^1] and add * -text to the .gitattributes file.

    Collab's Drawbacks

    Collab is the source-control system integrated into the Unity IDE.

    It was built for designers to be able to do some version control, but not much more. Even with its limited scope, it's a poor tool.

    The functionality horror

    • The system never shows you any differences: not in the web UI, not in the local UI, not for uncommitted files, not for committed ones.
    • Some changes cannot be reverted. No reason is given.
    • You can only delete new files from the file system.
    • There is no support for renaming.
    • Reverting to a known commit has worked for me exactly once out of about 10 tries. The operation fails with an "Error occurred" and no further information. If you really get stuck, your only choice is to restore the entire workspace by re-cloning/re-downloading it.
    • Conflict resolution is poorly supported, although it works better than expected (it integrates with BeyondCompare, thank goodness).

    The usability horror

    • The UI only lets you commit all changed files at once.
        • There is no notion of "commits".
        • You can’t commit individual files or chunks.
        • There is no staging area.
        • You can't exclude files.
        • You can ignore them completely, but that doesn't help.
    • The UI is only accessible via mouse from the menu bar.
    • You can sometimes revert folders (sometimes you can't, again with an "Error occurred" message), but you can't revert arbitrary groups of files.
    • The UI is almost entirely in that custom drop-down menu.
    • You can scroll through your changed files, but you can't expand the menu to show more files at once.
    • You can show a commit history, but there are no diffs. None.
    • There aren't even any diffs in the web version of the UI, which is marginally better, but read-only.

    Pair Git with Collab

    Using Collab alone is really dangerous, especially with Unity projects. There is so much in a Unity project without a proper "Undo" that you very often want to return to a known good version.

    So what can we do to improve this situation? We would like to use Git instead of Collab.

    However, we have to respect the capabilities and know-how of the designers on our team, who don't know how to use Git.

    On our current project, there's no time to train everyone on Git—and they already know how to use Collab and don't feel tremendously limited by it.

    Remember, any source control is better than no source control. The designers are regularly backing up their work now. In its defense, Collab is definitely better than nothing (or using a file-share or some other weak form of code-sharing).

    Instead, those of us who know Git are using Git alongside Collab.

    It kind of works...

    We started naively, with all of our default settings in Git. Our workflow was:

    1. Pull in Unity/Collab
    2. Fetch from Git/Rebase to head (we actually just use "pull with rebase")

    Unfortunately, we would often end up with a ton of files marked as changed in Collab. These were always line-ending differences. As mentioned above, Collab is not a good tool for reverting changes.

    The project has time constraints—it's a prototype for a conference, with a hard deadline—so, despite its limitations, we reverted in Collab and updated Git with the line-endings that Collab expected.

    We limped along like this for a bit, but with two developers on Git/Collab on Windows and one designer on Collab on Mac, we were spending too much time "fixing up" files. The benefit of having Git was outweighed by the problems it caused with Collab.

    Know Your Enemy

    So we investigated what was really going on. The following screenshots show that Collab doesn't seem to care about line-endings. They're all over the map.

    JSON file with mixed line-endings

    CS file with CRLF line-endings

    .unity file with LF line-endings

    Configuring Git

    Git, on the other hand, really cares about line-endings. By default, Git will transform the line-endings in files that it considers to be text files (this part is important later) to the line-ending of the local platform.

    In the repository, all text files are LF-only. If you work on macOS or Linux, line-endings in the workspace are unchanged; if you work on Windows, Git changes all of these line-endings to CRLF on checkout—and back to LF on commit.

    Our first "fix" was to turn off the core.autocrlf option in the local Git repository.

    git config --local core.autocrlf false
    

    We thought this would fix everything since now Git was no longer transforming our line-endings on commit and checkout.

    This turned out to be only part of the problem, though. As you can see above, the text files in the repository have an arbitrary mix of line-endings already. Even with the feature turned off, Git was still normalizing line-endings to LF on Windows.

    The only thing we'd changed so far was to stop converting LF to CRLF on checkout. Any time we ran git reset, for example, the line-endings in our workspace would still end up different from what was in Git or Collab.
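
    A useful diagnostic here is git ls-files --eol (available in Git 2.8+), which shows which line-endings Git has stored in the index versus what's actually on disk. The Unity asset path below is just a hypothetical example:

```shell
# Show, for each file: the line-endings in the index (i/), in the
# working tree (w/), and the text attribute in effect (attr/).
git ls-files --eol

# Or inspect a single (hypothetical) file:
git ls-files --eol Assets/Scenes/Main.unity
```

    If the i/ column (what's committed) disagrees with the w/ column (what's on disk), Git is converting line-endings on checkout or commit.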

    Git: Stop doing stuff

    What we really want is for Git to stop changing any line-endings at all.

    This isn't part of the command-line configuration, though. Instead, you have to set up .gitattributes. Git has default settings that determine which files it treats as which types. We wanted to adjust these default settings by telling Git that, in this repository, it should treat no files as text.

    Once we knew this, it was quite easy to configure. Simply add a .gitattributes file to the root of the repository with the following contents:

    * -text
    

    This translates to "do not treat any file as text" (i.e. match all files; disable text-handling).
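
    You can verify that the attribute is actually in effect with git check-attr (the paths below are hypothetical examples):

```shell
# Ask Git which 'text' attribute applies to the given paths.
# With '* -text' in .gitattributes, each path reports 'text: unset',
# meaning Git performs no line-ending conversion for it.
git check-attr text -- Assets/Scenes/Main.unity Scripts/Player.cs
```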

    Conclusion

    With these settings, the two developers were able to reset their workspaces and both Git and Collab were happy. Collab is still a sub-par tool, but we can now work with designers and still have Git to allow the developers to use a better workflow.

    The designers using only Collab were completely unaffected by our changes.



    [^1] Technically, I don't think you have to change the autocrlf setting. Turning off text-handling in Git should suffice. However, I haven't tested with this feature left on and, due to time constraints, am not going to risk it.

    18.1.2019 - Unavoidable Breaking Changes

    Due to the nature of the language, there are some API changes that almost inevitably lead to breaking changes in C#.

    Change constructor parameters

    While you can easily add another constructor and mark the old one(s) as obsolete, if you use an IoC container that allows only a single public constructor, you're forced to either

    • remove the obsolete constructor or
    • mark the obsolete constructor as protected.

    In either case, the user gets a compile error.

    Virtual methods/Interfaces

    There are several known issues with introducing new methods or changing existing methods on an existing interface. For many of these situations, there are relatively smooth upgrade paths.

    I encountered a situation recently that I thought worth mentioning. I wanted to introduce a new overload on an existing type.

    Suppose you have the following method:

    bool TryGetValue<T>(
      out T value,
      TKey key = default(TKey), 
      [CanBeNull] ILogger logger = null
    );
    

    We would like to remove the logger parameter. So we deprecate the method above and declare the new method.

    bool TryGetValue<T>(
      out T value, 
      TKey key = default(TKey)
    );
    

    Now the compiler/ReSharper notifies you that there will be an ambiguity if a caller does not pass a logger. How to resolve this? Well, we can just remove the default value for that parameter in the obsolete method.

    bool TryGetValue<T>(
      out T value,
      TKey key = default(TKey),
      [CanBeNull] ILogger logger
    );
    

    But now you've got another problem: the logger parameter can no longer come after the key parameter, because parameters without default values cannot follow parameters that have them.

    So, now you'd have to move the logger parameter in front of the key parameter. This will cause a compile error in clients, which is what we were trying to avoid in the first place.

    In this case, we have a couple of sub-optimal options.

    Multiple Releases

    Use a different name for the new API (e.g. TryGetValueEx à la Windows) in the next major version, then switch the name back in the version after that and finally remove the obsolete member in yet another version.

    That is,

    • in version n, TryGetValue (with logger) is obsolete and users are told to use TryGetValueEx (no logger)
    • in version n+1, TryGetValueEx (no logger) is obsolete and users are told to use TryGetValue (no logger)
    • in version n+2, we finally remove TryGetValueEx.

    This is a lot of work and requires three upgrades to accomplish. You really need to stay on the ball in order to get this kind of change integrated and it takes a non-trivial amount of time and effort.

    We generally don't use this method, as our customers are developers and can deal with a compile error or two, especially when it's noted in the release notes and the workaround is fairly obvious (e.g. the logger parameter is just no longer required).

    Remove instead of deprecating

    Accept that there will be a compile error and soften the landing as much as possible for customers by noting it in the release notes.

    11.1.2019 - Version numbers in .NET Projects

    Any software product should have a version number. This article will answer the following questions about how Encodo works with them.

    • How do we choose a version number?
    • What parts does a version number have?
    • What do these parts mean?
    • How do different stakeholders interpret the number?
    • What conventions exist for choosing numbers?
    • Who chooses and sets these parts?

    Stakeholders

    In decreasing order of expected expertise,

    • Developers: write the software; may change version numbers
    • Testers: test the software; highly interested in version numbers that make sense
    • End users: use the software as a black box

    The intended audience of this document is developers.

    Definitions and Assumptions

    • Build servers, not developer desktops, produce artifacts
    • The source-control system is Git
    • The quino command-line tool is installed on all machines. This tool can read and write version numbers for any .NET solution, regardless of which of the many version-numbering methods a given solution actually uses.
    • A software library is a package or product that has a developer as an end user
    • A breaking change in a software library causes one of the following
      • a build error
      • an API to behave differently in a way that cannot be justified as a bug fix

    Semantic versions

    Encodo uses semantic versions. This scheme has a strict ordering that allows you to determine which version is "newer". It indicates pre-releases (e.g. alphas, betas, RCs) with a hyphen ("-"), as shown below.

    Version numbers come in two flavors:

    • Official releases: [Major].[Minor].[Patch].[Build]
    • Pre-releases: [Major].[Minor].[Patch]-[Label][Build]

    See Microsoft's NuGet Package Version Reference for more information.

    Examples

    • 0.9.0-alpha34: A pre-release of 0.9.0
    • 0.9.0-beta48: A pre-release of 0.9.0
    • 0.9.0.67: An official release of 0.9.0
    • 1.0.0-rc512: A pre-release of 1.0.0
    • 1.0.0.523: An official release of 1.0.0

    The numbers are strictly ordered. The first three parts indicate the "main" version. The final part counts strictly upward.

    Parts

    The following list describes each of the parts and explains what to expect when it changes.

    Build

    • Identifies the build task that produced the artifact
    • Strictly increasing

    Label

    • An arbitrary designation for the "type" of pre-release

    Patch

    • Introduces bug fixes but no features or API changes
    • May introduce obsolete members
    • May not introduce breaking changes

    This part is also known as "Maintenance" (see Software versioning on Wikipedia).

    Minor

    • Introduces new features that extend existing functionality
    • May include bug fixes
    • May cause minor breaking changes
    • May introduce obsolete members that cause compile errors
    • Workarounds must be documented in release notes or obsolete messages

    Major

    • Introduces major new features
    • Introduces breaking changes that require considerable effort to integrate
    • Introduces a new data or protocol format that requires migration

    Conventions

    Uniqueness for official releases

    There will only ever be one artifact of an official release corresponding to a given "main" version number.

    That is, if 1.0.0.523 exists, then there will never be a 1.0.0.524. This is due to the fact that the build number (e.g. 524) is purely for auditing.

    For example, suppose your software uses a NuGet package with version 1.0.0.523. NuGet will not offer to upgrade to 1.0.0.524.

    Pre-release Labels

    There are no restrictions on the labels for pre-releases. However, it's recommended to use one of the following:

    • alpha
    • beta
    • rc

    Be aware that if you choose a different label, then it is ordered alphabetically relative to the other pre-releases.

    For example, if you were to use the label prealpha to produce the version 0.9.0-prealpha21, then that version is considered to be higher than 0.9.0-alpha34. A tool like NuGet will not see the latter version as an upgrade.

    Release branches

    The name of a release branch should be the major version of that release. E.g. release/1 for version 1.x.x.x.

    Pre-release branches

    The name of a pre-release branch should be of the form feature/[label] where [label] is one of the labels recommended above. It's also OK to use a personal branch to create a pre-release build, as in mvb/[label].

    Setting the base version

    A developer uses the quino tool to set the version.

    For example, to set the version to 1.0.1, execute the following:

    quino fix -v 1.0.1.0
    

    The tool updates the version number in all relevant files.

    Calculating final version

    The build server calculates a release's version number as follows:

    • major: Taken from the solution
    • minor: Taken from the solution
    • maintenance: Taken from the solution
    • label: Taken from the Git branch (see below for details)
    • build: Provided by the build server

    Git Branches

    The name of the Git branch determines which kind of release to produce.

    • If the name of the branch matches the glob **/release/*, then it's an official release
    • Everything else is a pre-release

    For example, all of the following produce an official release:

    • origin/release/1
    • origin/production/release/new
    • origin/release/
    • release/1
    • production/release/new
    • release/

    The name of the branch doesn't influence the version number since an official release doesn't have a label.

    Pre-release labels

    The label is taken from the last part of the branch name.

    For example,

    • origin/feature/beta yields beta
    • origin/feature/rc yields rc
    • origin/mvb/rc yields rc

    The following algorithm ensures that the label can be part of a valid semantic version.

    • Remove invalid characters
    • Append an X after a trailing digit
    • Use X if the label is empty (or becomes empty after having removed invalid characters)

    For example,

    • origin/feature/rc1 yields rc1X
    • origin/feature/linux_compat yields linuxcompat
    • origin/feature/12 yields X
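
    The rules above can be sketched in shell. This is not the actual quino/build-server implementation, just an illustration; in particular, the third example above suggests that a purely numeric label is also treated as empty, so that rule is included here explicitly:

```shell
#!/bin/sh
# Sanitize a branch label so it can be part of a valid semantic version.
sanitize_label() {
  # Remove invalid characters (keep only letters and digits).
  label=$(printf '%s' "$1" | tr -cd 'A-Za-z0-9')
  # Treat a purely numeric label as empty (matches the '12' example above).
  case "$label" in
    *[!0-9]*) ;;        # contains a letter: keep it
    *) label='' ;;      # empty or all digits: discard
  esac
  # Append an X after a trailing digit.
  case "$label" in
    *[0-9]) label="${label}X" ;;
  esac
  # Use X if the label is (or has become) empty.
  [ -n "$label" ] || label='X'
  printf '%s\n' "$label"
}

sanitize_label 'rc1'          # rc1X
sanitize_label 'linux_compat' # linuxcompat
sanitize_label '12'           # X
```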

    Examples

    Assume that,

    • the version number in the solution is 0.9.0.0
    • the build counter on the build server is at 522

    Then,

    • Deploying from branch origin/release/1 produces artifacts with version number 0.9.0.522
    • Deploying from branch origin/feature/rc produces artifacts with version number 0.9.0-rc522

    Release Workflow

    The following are very concise guides for how to produce artifacts.

    Pre-release

    • Ensure you are on a non-release branch (e.g. feature/rc, master)
    • Verify or set the base version (e.g. quino fix -v 1.0.2.0)
    • Push any changes to Git
    • Execute the "deploy" task against your branch on the build server

    Release

    • Ensure you are on a release branch (e.g. release/1)
    • Verify or set the base version (e.g. quino fix -v 1.0.2.0)
    • Push any changes to Git
    • Execute the "deploy" task against your branch on the build server

    21.12.2018 - v6.1/6.2: Cross-platform, SourceLink and Docker

    The summary below describes major new features, items of note and breaking changes.

    The links above require a login.

    Few changes so far, other than that Quino-Windows now targets .NET Framework 4.7.2 and DevExpress 18.2.5.

    Highlights

    • Inline Documentation: Quino packages now consistently include inline/developer documentation, in the form of *.xml files. IDEs use these files to provide real-time documentation in tooltips and code-completion.
    • Nullability Attributes: Attributes like [NotNull], [CanBeNull], [Pure], etc. are now included in Quino assemblies. Tools like R# use these attributes during code analysis to find possible bugs and improve warnings and suggestions.
    • Online Documentation: The online documentation is once again up-to-date. See the release/6 documentation or default documentation (master branch).
    • Debugging: We've laid a bunch of groundwork for SourceLink. If the NuGet server supports this protocol, then Visual Studio automatically offers to download source code for debugging. This feature will be fully enabled in subsequent releases, after we've upgraded our package server.
    • Cross-platform: We've made more improvements to how Quino-Standard compiles and runs under Linux or macOS. All tests now run on Linux and macOS as well as Windows.
    • Containers: Improved integration and usage of Docker for local development and on build servers
    • Roslyn: Encodo.Compilers now uses Roslyn to provide compiling services. Quino-Standard uses these from tests to verify generated code. As of this release, C#6 and C#7 features are supported. Also, the compiler support is available on all platforms.

    Breaking Changes

    • UseRunModeCommand() is no longer available by default. Applications have to opt in to the rm -debug setting. Please see the Debugging documentation for assistance in converting to the new configuration methods.
    • KeyValueNode.GetValue() no longer returns null; use TryGetValue() instead.
    • KeyValueNode.GetValue() no longer accepts a parameter of type logger. All logging is now done to the central logger. If you still need to collect messages from that operation, then see ConfigurableLoggerExtensions.UseLogger() or ConfigurableLoggerExtensions.UseInMemoryLogger().
    • IDatabaseProperties.Collation is now of type string rather than Collation. This change was made to allow an application to specify exactly the desired collation without having Quino reinterpret it or do matching.
    • Similarly, ISqlServerCollationTools.GetEncodingAndCollation() now returns a tuple of (Encoding, string) rather than a tuple of (Encoding, Collation).
    • The constructor of NamedValueNode has changed. Instead, you should use the abstraction INamedValueNodeTools.CreateNode() or INamedValueNodeTools.CreateRootNode().

    21.12.2018 - QQL: A Query Language for Quino

    In late 2011 and early 2012, Encodo designed a querying language for Quino. Quino has an ORM that, combined with .NET's LINQ, provides a powerful querying interface for developers. QQL is a DSL that brings this power to non-developers.

    QQL never made it to implementation; it exists only as a specification. In the meantime, the world has moved on and we have common, generic querying APIs like OData. The time for QQL is past, but the specification is still an interesting artifact in its own right.

    Who knows? Maybe we'll get around to implementing some of it, at some point.

    At any rate, you can download the specification from the downloads section.

    The following excerpts should give you an idea of what you're in for, should you download and read the 80-page document.

    Details

    The TOC lists the following top-level chapters:

    1. Introduction
    2. Examples
    3. Context & Scopes
    4. Standard Queries
    5. Grouping Queries
    6. Evaluation
    7. Syntax
    8. Data Types and Operators
    9. Libraries
    10. Best Practices
    11. Implementation Details
    12. Future Enhancements

    From the abstract in the document:

    The Quino Query Language (QQL) defines a syntax and semantics for formulating data requests against hierarchical data structures. It is easy to read and learn both for those familiar with SQL and non-programmers with a certain capacity for abstract thinking (i.e. power users). Learning only a few basic rules is enough to allow a user to quickly determine which data will be returned by all but the more complex queries. As with any other language, more complex concepts result in more complex texts, but the syntax of QQL limits these cases.

    From the overview:

    QQL defines a syntax and semantics for writing queries against hierarchical data structures. A query describes a set of data by choosing an initial context in the data and specifying which data are to be returned and how the results are to be organized. An execution engine generates this result by applying the query to the data.

    Examples

    Standard Projections

    The following is from chapter 2.1, "Simple Standard Query":

    The following query returns the first and last name of all active people as well as their 10 most recent time entries, reverse-sorted first by last name, then by first name.

    Person
    {
      select
      {
        FirstName; LastName;
        Sample:= TimeEntries { orderby Date desc; limit 10 }
      }
      where Active
      orderby
      {
        LastName desc;
        FirstName desc;
      }
    }
    

    In chapter 2, there are also "2.2 Intermediate Standard Query" and "2.3 Complex Standard Query" examples.

    Grouping Projections

    The following is from chapter 2.4, "Simple Grouping Query":

    The following query groups active people by last name and returns the age of the youngest person and the maximum contracts for each last name. Results are ordered by the maximum contracts for each group and then by last name.

    group Person
    {
      groupby LastName;
      select
      {
        default;
        Age:= (Now - BirthDate.Min).Year;
        MaxContracts:= Contracts.Count.Max
      }
      where Active;
      orderby
      {
        MaxContracts desc;
        LastName desc;
      }
    }
    

    In chapter 2, there are also "2.5 Complex Grouping Query", "2.6 Standard Query with Grouping Query" and "2.7 Nested Grouping Queries" examples.

    20.8.2018 - Seilpark Rheinfall 2018

    Encodo took a trip to the north to the beautiful Rhine Falls (Rheinfall), which served as a backdrop for a team event at a rope/adventure park. There was something for everyone and many of us were surprised at how much we were willing to take on as we swung around in the trees.

    We started off with breakfast, then played Tarzan for a few hours and finished up with a long lunch in the afternoon. A few went to the river to go swimming afterwards, while a couple of others (Marco/Tom) rode their bikes back home.

    There are also a couple of videos:

    19.7.2018 - Training #1: Programming with .NET, C# & Quino

    This is the first in a series of three 4-hour trainings designed to introduce a team of software developers with little to no .NET experience to the concepts, tools and world of .NET and Quino.

    19.6.2018 - WintiWebDev Meetup June 2018

    Encodo was pleased to be able to host and participate in the June 2018 meetup yesterday evening.

    We heard about usability and user experience from Luca Honegger of Kleinfach GmbH. He emphasized that projects would be well-served by getting users involved in the process early on. He cautioned that user feedback must be taken with a grain of salt, but that it's always essential—if the target users can't use your application, then it doesn't matter how much you can "prove" that it's good design. Early feedback sessions can also be a great way of refining requirements that users can't formulate without "seeing" something.

    Next, we learned about optimizing web-page loading speed from David Gunziger of smoca. He discussed the advantages of HTTP2 as well as aggressive caching, inlining of resources, pipelining requests and tweaking the content to let the browser display content as quickly as possible. He managed to reduce the initial-loading time of his company's web page by over 50%.

    Finally, our own Richard Bützer (just started last month!) presented a fun-but-somewhat-sobering quiz about JavaScript conversions. Is an array equal to an array? Is it equal to true? What about 1? It was a wild ride through some very non-intuitive type conversions that led to some rousing discussions about how to prevent these JavaScript weaknesses from getting in the way of producing quality software. Everybody was able to try their hand at answering questions with Kahoot.

    Afterwards, there were drinks and snacks and lively conversation. All in all, 25 people attended. Thanks to everybody for coming!

    15.3.2018 - Quino Roadmap

    This document is about the future of Quino. See the release notes for the past.

    The following releases reflect current planning. Changing needs and requirements may affect the order and scheduling of various components. We will update the schedule here if it changes.

    7.0 — Spring 2019

    • Migrate Quino-WebAPI to .NET Core/.NET Standard
    • Rename projects from Encodo.[Name] => Quino.[Name].Core
    • Consolidate some Winform assemblies
    • Improve alignment of some namespaces (especially in the data driver)
    • Continue to improve on and add conceptual documentation
    • Extend the "quino" dotnet tool with more fixers (e.g. reintroduce project and source-file fixers from the deprecated NAnt version)
    • Refactor complex extension methods to singletons

    8.0 — Fall 2019

    • Introduce final replacement for metadata wrappers (i.e. replace extension classes with proper TPH solution)

    26.2.2018 - The Road to React

    This talk discusses the motive behind our move to React, including requirements, candidates and a high-level description of React with learning resources

    19.2.2018 - How do I DI?

    This talk discusses the terms and concepts of DI, IOC and containers. It includes code smells, recommended patterns and tips for refactoring.

    6.2.2018 - Cookies 2017

    Every year, Encodo gets together and makes something for our customers. Several years ago, we baked cookies ... and those were really, really popular.

    For 2017, we bowed to customer pressure and baked cookies again, digging up our hand-made Encodo-logo cookie forms and baking traditional Swiss Christmas cookies.

    6.2.2018 - Saturn 5 Adventskalender 2017

    Every year, Encodo hangs an advent calendar in the office, usually filled with chocolate and trinkets. A few of the years, we've had puzzles and larger toys that were split into several pieces.

    This year, we split up the Lego Saturn 5 rocket over the workdays leading up to Christmas. Beforehand, though, Karin, Kath, Remo and Marco put the rocket together—just to see what it looks like.

    26.1.2018 - Encodo's new web site

    As you can see, Encodo finally has a new web site!

    Motive

    We'd had the same design for many years and it was time for a refresh.

    What did we want to change?

    • Improve the design and navigation
    • Make it mobile-friendly
    • Make it easier for Encodo employees to add/update content
    • Improve integration of comments
    • Move to a platform more familiar for more developers at Encodo
    • Make our site representative of our work

    Design

    We didn't do the design ourselves (because we're not really designers). Instead, we contracted our partners at Ergosign to come up with a design for us and we think they did a great job (as usual).

    Requirements

    The previous web site[^1] had the following features:

    • A full-featured album and picture manager
    • A full-featured blog with comments/email-publication/RSS
    • A full-featured text-formatting language for all text
    • Attachments for blogs/external content
    • Security features to restrict access
    • Integration with a separate ASP.NET site for collecting job applications.

    On top of that, we wanted:

    • LDAP/AD integration
    • All content editable/managed by the same back-end
    • Improved search

    Candidates

    On the server side, we evaluated a bunch of approaches:

    • Stick with the existing PHP web site but move static content into the back-end
    • Build the entire site from scratch with Quino
    • Use another blogging framework. Candidates:
      • Umbraco
      • WordPress
      • MovableType, Nucleus, GreyMatter, etc.
      • Static Site Generators (e.g. Jekyll)

    Our Approach

    After much deliberation and some POCs, we went with Umbraco, a CMS framework written in C# for .NET.

    This approach entailed:

    • Customizing the Umbraco look-and-feel to use the new design
    • Integrating a job-application web API server (written with Quino)
    • Writing an exporter in PHP that exposes a JSON API to return blogs, articles, journals, pictures and albums as Markdown content with Base64-encoded attachments and pictures.
    • Writing an importer in C# that marshals the data returned by the PHP JSON API to Umbraco objects and data

    What's Next?

    We've come a long way toward our goal, but a web site is an ongoing project.

    We've incurred a bit of technical debt on our way to release, so a first step will be to convert some inline JavaScript and CSS to shared TypeScript and LESS. We're also improving our support for mobile devices as we test more.


    [^1] We were using the earthli WebCore, a PHP CMS written by Marco.

    13.11.2017 - Team Event: Summer's End in Ticino!

    In late September, Encodo closed out the summer with a four-day weekend at a rented house in Tenero-Contra. We were 300 meters above the valley floor, clinging to the slopes of the Verzasca Valley just under Mergoscia. We had a commanding view of Lago Maggiore, Monte Ceneri and Monte Tamaro.

    In addition to all of the pictures and the calendar, there are four short journal entries that describe how we passed the time:

    Click here to see the pictures!

    13.11.2017 - Encodo Networking Event 2017.2

    Encodo held its second (and final) networking event of 2017 on November 2nd with Marco presenting Cross-Platform Development for Mobile Apps (slides in English/presentation in Swiss-German).

    Thanks to everyone who attended!

    The abstract is included below.

    This talk starts off by discussing a laundry list of requirements and project details that impact mobile development and can affect which tools, frameworks and libraries you choose to support your work. The second part covers the available frameworks, with extra detail where Encodo has experience (many of them). They include: Cordova, PhoneGap, Ionic, Xamarin, native development and Flutter.


    > Spoiler Alert: Encodo's most recent experience with Xamarin has been extremely positive - it's our framework of choice now.

    8.11.2017 - Contra-Tenero 2017

    Encodo had a 4-day team weekend in Tenero-Contra, Ticino in the south of Switzerland. The pictures document the journey to Contra (by car, bike and train), the long weekend and the journey back north.

    2.11.2017 - Cross-platform Mobile Development

    This talk discusses tools, techniques and experiences in mobile development.

    14.6.2017 - Networking Event 2017.1 and upcoming WWDT Talk

    Encodo held its first networking event of 2017 on June 1st with Marco presenting A Checklist for new Projects (slides in English/presentation in Swiss-German).

    Thanks to everyone who attended!

    If you missed it and would like to see the talk, Marco will be presenting again at winti web dev/talks on June 19th, 2017. The talk will be in English this time.

    The abstract is included below.

    This talk discusses a checklist of concepts for writing software. Which ones apply? If so, how will you address them? A must for new projects, but also very useful for ongoing, legacy or inherited projects. We'll quickly skim several groups of concepts, and then focus mainly on "core software components".

    If you can't make it, the updated slides are available online.

    1.6.2017 - A Checklist for new Projects

    This talk discusses a checklist of concepts for writing software. Which ones apply? If so, how will you address them? A must for new projects, but also very useful for ongoing, legacy or inherited projects. We'll quickly skim several groups of concepts, and then focus mainly on "core software components".

    20.4.2017 - Segway Tour 2017

    On a warm spring day, we took an afternoon off for lunch and a Segway Tour of Winterthur with Segway City Tours. After an introduction on a flat plaza by the Zeughaus, we cruised all over the city for hours, stopping on the Goldenberg and at the Oskar Reinhart Museum.