Channel: Scrum Bug

Fixing my Western Digital Sentinel DX4000 server


A couple of years ago I bought a Western Digital Sentinel DX4000 server, a Network Attached Storage device that runs Windows Server. It served me well for five years, but then it disappeared from my network from one day to the next.

It turns out that the Western Digital Sentinel DX4000 has a common hardware failure: the network ports stop working completely. Of course, I only found out about this issue after I lost access to the server.

When it happens, the lights on the network ports won't blink and the LCD display shows:

Network Disconnected

This wouldn't have been a problem if the machine had a video port, but because it's a headless server, once you lose network access you're in big trouble. The server runs Windows Server 2008 R2, which doesn't ship with many network drivers built in, and I hadn't installed any myself, so plugging in a USB network adapter didn't work :(.

This left me in a pickle. I do take regular backups to a set of external drives, but I hadn't taken one with all the latest bits. While I didn't care about a lot of the data on the server, the photo archive is important.

I contacted Western Digital, but they no longer provide warranty on the device and don't ship replacement parts (even if you're willing to pay for them). I contacted a data recovery company, and they offered a data extraction for $900, quite a lot of money for one month's worth of photographs.

So I kept looking, and it paid off. I had previously found a blog series on upgrading the Sentinel DX4000 to Windows Server 2012, and it had been updated with a download link for a Windows Server 2016 image. I decided to bite the bullet and ordered a USB network adapter that Windows 10 and Server 2012 support out of the box.

In the meantime I prepared a recovery USB key. The recovery key only wipes the OS drive, so if it didn't work out, I'd still have other options.

While waiting for the USB adapter to arrive I researched other options and found that the DX4000's drives can be swapped into another unit without being wiped. I found a couple of second-hand units on eBay and saved them for later.

You can find replacement devices for about $150-$200 on eBay.

The Ethernet adapter arrived... I crossed my fingers, plugged it and the recovery key into the server, and waited. After about an hour the server briefly showed up and then disappeared again... I had almost started to think this was going to be too easy. But after another restart the server came back up and stayed up. It turned out it had pulled in 2GB of Windows updates and had been busy applying them.

I quickly finished my offline backup and switched the server back off.

Now I'm in the market for a new NAS. I haven't decided yet; the Synology DS918+ is looking very nice: fast CPU, 16GB RAM support, optional SSD cache, and it can run Docker containers... Should I? Or should I just spend the money on a couple of years' worth of cloud storage?


Hard lessons in asynchronous JavaScript code


Last Friday I received a bug report for one of the extensions I maintain. The reporter mentioned that a specific configuration led to the task corrupting the JSON files it patched. It did so in a very peculiar way, by adding an extra } at the end of every file, basically making them invalid JSON.

It took a while to reproduce the case and then to figure out what was happening. Quite early in the debugging process I found that my assumption about the execution order didn't match reality and that I was fighting a race condition.

I'm not sure about you, but I hate fighting race conditions. They're often caused by interesting interactions between different units of code, and since they're less likely to occur in a debugger, they're notoriously hard to debug.

Basically the code was supposed to be doing the following:

  • Read a JSON file
  • Patch a few fields
  • Write the JSON file
  • Read the same JSON file
  • Patch a few different fields
  • Write the JSON file again
  • Zip everything up
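
In (simplified, hypothetical) code, that intended sequential flow looks something like this; the patched field names and helper names are stand-ins, not the extension's actual code:

```typescript
import { promises as fs } from 'fs';

// Stand-in patch steps; the real task patches different manifest fields.
function patchFirstFields(manifest: any): any {
  return { ...manifest, version: '1.2.3' };
}

function patchSecondFields(manifest: any): any {
  return { ...manifest, id: 'generated-id' };
}

// The intended, strictly sequential flow: read, patch, write, read, patch, write.
async function patchManifest(path: string): Promise<void> {
  const first = patchFirstFields(JSON.parse(await fs.readFile(path, 'utf8')));
  await fs.writeFile(path, JSON.stringify(first, null, 2));

  const second = patchSecondFields(JSON.parse(await fs.readFile(path, 'utf8')));
  await fs.writeFile(path, JSON.stringify(second, null, 2));

  // Only after both writes have completed should the zip step run.
}
```

Every await matters here: drop one and the second read can start before the first write has finished, which is exactly the kind of interleaving shown below.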

As you can imagine there are a few interesting interactions that can happen in this code when things run out of order. Most executions went according to my expectations, but I did find a few cases where the following happened:

  • Read a JSON file
  • Read the same JSON file
  • Patch a few different fields
  • Write the same JSON file
  • Patch a few...
  • Zip everything up
  • .. fields
  • Write the JSON file

I still can't fathom how this would result in every file having the same consistent issue: }} at the end instead of }. But while that was the thing that triggered this whole investigation, there was a much bigger issue.

How did the code end up in this state?

This project grew out of a collaboration between multiple people, has been updated in the past 2 years to support new features, has been partially upgraded to support new language features and frameworks and has received a few Pull Requests that added valuable new features using a mix of the above.

My lessons from fixing this mess are the following.

It took more time to debug than to rewrite the code

You've probably experienced this before: you spend 6 hours trying to debug and refactor the existing code in order to weed out the issue, but end up rewriting it anyway. Worst of all, this was my first instinctive fear when I received the bug report: I'd end up rewriting this feature. I already knew this part of the code was a hodgepodge of patches, fixes and frameworks.

I do have to admit that the rewrite was made a whole lot easier by having spent 6 hours re-familiarizing myself with the code, but I could (and should) have made this decision earlier, saving myself a few valuable hours.

LESSON: Don't throw good money after bad. Be prepared to cut your losses early.

TypeScript is less strict in its warnings than the C# compiler

My background isn't in TypeScript/JavaScript; I've learned quite a bit in the past few years, but I'm still much stronger in C#. I suffer from biases created by my years of working with the C# compiler. To be frank:

The C# compiler and its static analysis detect problems and warn the developer much earlier and with greater accuracy than TypeScript and TSLint.

In C# you'll have to be more explicit if you want to do bad stuff. In JavaScript/TypeScript, bad stuff is allowed to happen without being explicit about it.

LESSON: I must stay vigilant and not let the absence of warnings and errors lull me into a false sense of security.

Node 6, 8 and 10 are different

This may be blindingly obvious to you, but part of the reason the behavior was hard to reproduce was that my local development workstation was running Node 10. Node 10 has a number of very useful features, including promisified versions of fs. The VSTS build agent runs its tasks in Node 6, and while there are numerous libraries available to shim the new features into the old runtime, they're not 100% the same. Some shims on npm don't have all the latest fixes and may not be fully compatible.

Installing Node 10 on the build agent doesn't change the Node version the agent uses to execute its tasks; it only changes the version of Node used to execute the npm tasks in your workflow.

Only after switching back to Node 6 on my development workstation did my problems become easy to reproduce. It turns out that the shims that add promise support for fs are not working exactly as documented.

LESSON: Be sure to test your code at least on the same runtime used "in production".

Don't mix Promise libraries, async/await and old-style delegates

Over time the code had become a mix of:

  • Callbacks
  • Q Promise library methods
  • Q wrapped callbacks
  • ES6 standard promise methods
  • async/await

The bugged part of the code interacted with little bits of everything from this list.

The first thing I did to clean it up was remove every mention of Q. This made the problem deterministic. I'm not exactly sure what Q's impact was, but removing it helped tremendously.

The second thing I did was replace the fs.writeFile and fs.readFile calls with fs-extra's promisified versions. This further reduced unexpected behaviors. I'm sure my partial knowledge of how callbacks, promises and await interact was causing problems here that could have been solved without introducing fs-extra, but the alignment helped me wrap my head around the problems without having to fully understand and juggle all of these different implementation details.

LESSON: Stick to a single paradigm to allow your brain to focus on what's important: what your code is trying to achieve.
LESSON: Don't mix and match all kinds of different async libraries. Each has particular implementation details and they may not work together the way you'd expect them to.

Understand Promises when using async/await

Daniel Brain explains this really well in his blog post with the same title, so I'll stick to a few examples where I messed up:

// THIS IS BAD
declare const files: string[];
files.map(async () => { doStuff(); } );

zip();

Neither the TypeScript compiler nor TSLint will complain about this code. doStuff is an async method that returns a Promise; we're not awaiting it, nor returning the promise, so it will happily run whenever it pleases and finish whenever it does.

So let's fix that:

// THIS IS BAD
declare const files: string[];
files.map(async () => { await doStuff(); } );

// OR THIS IS STILL BAD
files.map(async () => { return doStuff(); } );

zip();

That solves that problem. But wait... when you run this you'll find that it's still not actually waiting... As it turns out, files.map returns a Promise[], and again, neither the TypeScript compiler nor TSLint will complain that you're not waiting for that:

// THIS IS STILL BAD
declare const files: string[];
await files.map(async () => { return doStuff(); } );
await zip();

Done, right? Unfortunately not. When you await a Promise[], the code will immediately continue. I had expected this to result in waiting for all promises to resolve, or at least a warning (my C# bias fooling me). Instead you'll need to add a little bit of Promise fairy dust to solve this one:

// THIS ACTUALLY WORKS
declare const files: string[];
await Promise.all(files.map(async () => { return doStuff(); } ));
await zip();

With all of these fixes combined, the code runs in the expected order.

LESSON: Be aware of your personal biases. They may fool you into a false sense of security. C# is not TypeScript.
LESSON: Testing these kinds of async patterns is hard. You'll need to change the way you write your code if you want to validate execution order dependencies.
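
One way to make execution order testable is to record it explicitly. A sketch, where doStuff and zip are hypothetical stand-ins that only log their position:

```typescript
// Record the order in which the (stand-in) steps run, so a test can
// assert that zip() only happens after every file has been processed.
async function run(files: string[]): Promise<string[]> {
  const order: string[] = [];

  const doStuff = async (file: string): Promise<void> => {
    order.push(`patch:${file}`);
  };
  const zip = async (): Promise<void> => {
    order.push('zip');
  };

  await Promise.all(files.map(f => doStuff(f)));
  await zip();
  return order;
}
```

A test can now assert that 'zip' is the last entry in the recorded order, something that is much harder to verify from the side effects alone.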


Unit testing is hard

I teach this all the time: in order to test your code, the best way is to write the tests first. When faced with old, unfamiliar, untested code, the first step is to apply a "software vise", as Michael Feathers calls it:

Clamp down known good behavior, detect any unintended changes to it – gives us control over the work
Source: Ink Stone

But it's so easy to ignore this sage advice and try to get a quick fix out. The same applies to spending just enough time to fix the naming and the other objective issues that hide in your code base.

LESSON: Be professional.

Use sync code sparingly

In the past I asked a friend to review some of my build tasks. The comments included: "Why are you calling fs.existsSync?" My reply: "Because it's easy, because it works and because it makes my code a lot easier to read." His retort: "But that's not the Node way!"

And I'm still torn about this one. The VSTS agent runs a single job at a time. Calling things synchronously may make the code slightly slower, and it may cause the logs to stream to the server after, instead of during, execution... Are those strong enough reasons to make the code harder to maintain... for me? For a person who knows Node?

At the same time, most APIs in Node use the async/await pattern and Promises. One can't ignore these altogether, and is mixing both paradigms really easier to read?
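
For comparison, here are both styles side by side. This is a sketch; the promise-based fs API assumes Node 10 or later:

```typescript
import * as fs from 'fs';

// The synchronous version: short and easy to read.
function configExistsSync(path: string): boolean {
  return fs.existsSync(path);
}

// The "Node way": the promise-based API, available since Node 10.
async function configExists(path: string): Promise<boolean> {
  try {
    await fs.promises.access(path);
    return true;
  } catch {
    return false;
  }
}
```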

Am I trying to kid myself? Am I procrastinating by postponing the required learning and working around it? I suppose I am. As a result I've decided to study, to ask more people for reviews, and to try to close this gap in my skills.

LESSON: Know your weaknesses; don't work around them. Dare to ask questions, involve others and be open to feedback. It's what Agile is built on.

Conclusion

None of these lessons were a surprise to me. Being Professional is hard and balancing open-source, work and personal time just makes it harder. I'm glad I rewrote most of the code, I've reached out to figure out how I can better test these types of problems and I learned a lot about JavaScript, Promises and asynchronous behavior. I know I still have lots to learn.

PS: I still don't understand why this issue presented itself by corrupting the JSON files. But I'll leave that for someone else to figure out.

PS: I'd love constructive feedback, pull requests or suggestions to further improve.

Photo credit: Frans Berkelaar.

CI/CD Tools for VSTS Extensions 1.1.75


I'm happy to announce that version 1.1.75 of the CI/CD tools for VSTS Extensions is rolling out to the Visual Studio Marketplace. The last few releases have added a couple of significant features that I'd like to quickly highlight in this post.

Support for extensions with more than one major task version of the same task.

It's possible to pack multiple versions of a single build task in the same extension. This enables the following scenarios:

  • Ship an older version of your tasks and allow users to upgrade at their leisure.
  • Ship a preview version of your tasks and allow users to provide early feedback.
  • Ship Long Term Support (LTS) versions of your build tasks.

Multi-Version-Tasks have the following requirements:

  • Task ID must be the same across versions.
  • Task Name must be the same across versions.
  • Each task must have a different Major version.
  • Each version is stored in its own subfolder under the contribution's main folder.

The CI/CD Tools for VSTS Extensions will now correctly update the versions when you use the following settings:

Pass in the desired Extension version, check the Override tasks version and select Replace Minor, Patch or Replace Patch as Override Type

In your sources ensure that the task.json for each task has the appropriate major (and optionally minor) version set.

vss-extension.json { "version": "3.2.1" }
- MultiVersionBuildTask
  - MultiVersionBuildTaskV1
    - task.json { "version": "1.0.0" }
  - MultiVersionBuildTaskV2
    - task.json { "version": "2.0.0" }
  - MultiVersionBuildTaskV3
    - task.json { "version": "3.0.0", "preview": true }

- SingleVersionBuildTask
  - task.json { "version": "1.0.0" }

Note: Remember, each version requires its own set of PowerShell/Node modules.

It's possible to mix multi-version tasks and single-version tasks in the same extension; however, they must follow the same version override scheme.

Support for generating Task IDs based on the Publisher, Extension ID and Task Name.

  • Added: Ability to generate consistent GUIDs for Build tasks during build.

The recommended practice for build task development is to have a secondary VSTS account to test the non-production versions of your tasks. In my case that means I have:

  • jessehouwing.visualstudio.com - my primary VSTS account, which contains all the builds, releases and source repositories for my projects
  • jessehouwing-dev.visualstudio.com - my test environment for VSTS extensions that are built, released and shared from the primary account.

The biggest advantage of this setup is that I can re-use the same Task IDs and Task Names in my dev/test packages as in my production packages. The only things I override between dev and production are the Extension ID and the Visibility:

When publishing your Development/Test versions to a separate test account, you only need to override the Extension ID and set the Visibility to Private.

Multiple private extensions can share the same Task ID and Task Names.

However, if you need to publish and activate more than one version of the same Extension in a single VSTS account, you'll need to ensure that each extension is using unique GUIDs for the Task ID, or things will break.

This gets us into trouble with GUIDs. They're meant to be globally unique, and while it's easy to generate random values on every build, doing so would break any existing Build or Release that references these tasks.

Fortunately, there is a UUID standard (v5) which derives the UUID from a set of input parameters. The CI/CD tasks for VSTS Extensions can now use this standard to derive the Task ID from the following inputs:

  • Publisher ID (taken from vss-extension.json, unless overridden)
  • Extension ID (taken from vss-extension.json, unless overridden)
  • Extension Tag (appended to Extension ID when supplied)
  • Task Name (taken from task.json)
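
A sketch of how such a name-based derivation works (RFC 4122 version 5 UUIDs, built on SHA-1). The namespace UUID and the way the inputs are combined here are illustrative assumptions, not necessarily what the tasks do internally:

```typescript
import { createHash } from 'crypto';

// RFC 4122 version 5 UUID: hash the namespace UUID plus the name with SHA-1,
// then stamp the version and variant bits into the result.
function uuidV5(name: string, namespaceUuid: string): string {
  const ns = Buffer.from(namespaceUuid.replace(/-/g, ''), 'hex');
  const bytes = Buffer.from(
    createHash('sha1').update(ns).update(name, 'utf8').digest().subarray(0, 16)
  );
  bytes[6] = (bytes[6] & 0x0f) | 0x50; // version 5
  bytes[8] = (bytes[8] & 0x3f) | 0x80; // RFC 4122 variant
  const hex = bytes.toString('hex');
  return [hex.slice(0, 8), hex.slice(8, 12), hex.slice(12, 16), hex.slice(16, 20), hex.slice(20)].join('-');
}

// Hypothetical combination of the inputs; the same inputs always yield the same id.
const taskId = uuidV5('publisher.extension-id.TaskName', '6ba7b811-9dad-11d1-80b4-00c04fd430c8');
```

The key property is determinism: as long as the publisher, extension id and task name don't change, every build produces the same Task ID, so existing Build and Release definitions keep working.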

This option is enabled through the "Override task id" option:

Override the task id only when you need to deploy multiple versions of the same extension to the same VSTS account.

Support for marketplace validation checks

The Visual Studio Marketplace now actively scans extensions for unwanted content (mostly viruses). This validation can be instant, but it sometimes takes longer. In that case you'll receive an error when trying to publish the extension, which will fail your build/release. Or at least, that was the case.

When this feature was introduced in the Marketplace, tfx was also updated to support the --no-wait-validation flag, which skips waiting for validation to complete once the extension has been uploaded to the marketplace. Great, so your pipeline won't break when validation takes a while, but it also means that the build will succeed even if the extension doesn't pass validation:

Set the Don't wait for validation flag to continue if validation is taking longer.

We've added two new tasks to help handle this last case:

  • The Is Valid Extension task which runs on the agent - This version is the easiest to use: just drop it at the end of the build or release pipeline and let it poll the marketplace until validation either succeeds or fails. When validation fails, the task fails immediately. The downside is that this task runs on the agent, and thus consumes build minutes and occupies a concurrent pipeline.
  • The Is Valid Extension task which runs on the Server - This version requires you to link a Release Pipeline to your build and to configure release gates. It runs on the server and doesn't consume build minutes or a concurrent pipeline. You may not have access to all the variables in the server context to run this task. When validation fails, the task will not fail until the timeout expires.

During development of the Release Gate task, we've provided extensive feedback to the team building Release Gates. We'll address some of the downsides of using the Release Gate task in a future release as soon as the required features become available.

We're also working with the team behind tfx to see if some of the features in these tasks can be merged into tfx. Please vote/comment on the tracking issues if you'd like them to be implemented.

Are you using the CI/CD Tasks for VSTS Extensions? Are there features you'd like to see? Are you using another technique? What benefits do you get from that? I'd love to hear your thoughts in the comments below.

Adding Google Custom Search to Casper


You may have noticed the little search box on this blog. This is not a standard feature of Ghost, it's Google Custom Search embedded in Ghost. In this post I'll walk you through the process to set this up.

This post is part of a series:

Step 1: Activate Google Custom Search for your domain

Go to the Google Custom Search console and add your domain. You can specify which pages to index and which ones to ignore; I simply indexed everything:

Create your Google Custom Search account

Then use the Control Panel to customize it. I went for a search box on every page combined with a results page; not a standard Google-styled search box, but a custom one.

Click Control panel to customize

Pick the "Results only" option, that's what we'll be embedding in the Search page later:

Use "Results Only" and then "Save & Get Code"

This should give you a piece of code like this:

<!-- DO NOT USE THE CODE BELOW, YOU NEED TO GENERATE YOUR OWN -->
<script>
  (function() {
    var cx = 'magic-identifier-from-google';
    var gcse = document.createElement('script');
    gcse.type = 'text/javascript';
    gcse.async = true;
    gcse.src = 'https://cse.google.com/cse.js?cx=' + cx;
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(gcse, s);
  })();
</script>
<gcse:searchresults-only></gcse:searchresults-only>

Save this for later.

You can customize your Google Custom Search a bit further, I've linked my Google Analytics, set the Query Parameter name and a few other things.

Step 2: Create a Search page on your Ghost blog

Now go over to Ghost and create a new page:

Set the Post Url and turn the post into a page.

Now create a new HTML card on your page and paste in the following code, be sure to replace the Script block at the bottom with the code you received from Google:

<style>
    .search-wrapper .gsc-control-searchresults-only {
        box-sizing: initial;
        width: 100%;
        line-height: 1.5rem;
    }
    .search-wrapper .gsc-control-cse table {
        background: none;
        overflow-x: initial;
        line-height: 1em;
        margin: 0;
        white-space: normal;
    }
    .search-wrapper .gsc-result .gs-title {
        height: 2rem!important;
        line-height: 1.5rem;
    }
    .gcsc-branding, .gsc-branding-text, .gsc-branding-img {
        display: none !important;
    }
    .search-wrapper .gsc-control-cse table td {
        border: none;
        padding: 0;
    }
    .search-wrapper .gsc-results img {
        margin: 0;
    }
</style>
<div class="search-wrapper">
<script>
  (function() {
    var cx = 'magic-identifier-from-google';
    var gcse = document.createElement('script');
    gcse.type = 'text/javascript';
    gcse.async = true;
    gcse.src = 'https://cse.google.com/cse.js?cx=' + cx;
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(gcse, s);
  })();
</script>
<gcse:searchresults-only></gcse:searchresults-only>
</div>

The code above overrides a number of styles to make the search results look "pretty"; Casper's default stylesheet would otherwise make them look hideous. These overrides alleviate some of those issues.

It's possible to add your Search page directly to the Casper theme. I've chosen to keep the code in a post, because that allows me to tweak it more easily.

Step 3: Put the search box in your Casper theme

To add the search box to your page you'll need to tweak the Casper theme. You may have already seen how I've set up a small CI/CD pipeline to keep my Casper theme updated.

To add search I've had to add the following elements:

After updating your Ghost theme with these changes, the search box should appear.

That's it. Just 3 simple steps to add Google Custom Search to your Ghost blog.


What domains are used by your Azure DevOps account?


Almost every corporate client asks us this question at some point: which domains and IP addresses are used by Azure DevOps (formerly Visual Studio Team Services)? And given that it's a cloud service, it's not an easy question to answer.

The nature of the cloud, its ability to scale and fail over, and the regular update cadence cause these services to change on a regular basis, much more often than many corporate firewalls (or at least their change processes) can handle.

Of course there is the master list of "IP addresses owned by Azure", but let's face it, it's way too easy to put anything in Azure. It may give you a false sense of security, but it doesn't offer real protection.

Then there is the option of using reverse DNS to whitelist domains on a proxy server. This is already a better alternative, but getting a complete list of domains to whitelist is still not easy. Many IT departments use tools like Fiddler, or log blocked traffic at the proxy, to figure out which domains to allow and which to block. It's easy to miss specific domains that may only be used to register agents or to receive service hooks, etc.

Today I learned there is actually a service which returns a pretty long list of domains associated with your account. Just request the following URL and you'll be rewarded with a JSON response containing all the services associated with your account:

https://dev.azure.com/{account}/_apis/resourceareas/

The response looks like this:

{
    "count": 181,
    "value": [
        {
            "id": "fb13a388-40dd-4a04-b530-013a739c72ef",
            "name": "policy",
            "locationUrl": "https://jessehouwing.visualstudio.com/"
        },
        {
            "id": "c73a23a1-59bb-458c-8ce3-02c83215e015",
            "name": "Licensing",
            "locationUrl": "https://jessehouwing.vssps.visualstudio.com/"
        },
        {
            "id": "01e4817c-857e-485c-9401-0334a33200da",
            "name": "dedup",
            "locationUrl": "https://jessehouwing.vsblob.visualstudio.com/"
        },
        {
            "id": "79134c72-4a58-4b42-976c-04e7115f32bf",
            "name": "core",
            "locationUrl": "https://jessehouwing.visualstudio.com/"
        },
        {
            "id": "67349c8b-6425-42f2-97b6-0843cb037473",
            "name": "Favorite",
            "locationUrl": "https://jessehouwing.visualstudio.com/"
        },
        {
            "id": "5264459e-e5e0-4bd8-b118-0985e68a4ec5",
            "name": "wit",
            "locationUrl": "https://jessehouwing.visualstudio.com/"
        },
        ...
    ]
}
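
To turn that response into a whitelist you only need the unique hosts from the locationUrl fields. A quick sketch (uniqueDomains and printDomains are my names; the built-in fetch assumes Node 18+):

```typescript
interface ResourceArea {
  id: string;
  name: string;
  locationUrl: string;
}

// Reduce the resource areas to a sorted list of unique host names.
function uniqueDomains(areas: ResourceArea[]): string[] {
  const hosts = areas.map(a => new URL(a.locationUrl).host);
  return [...new Set(hosts)].sort();
}

// Fetch the list for an account and print every distinct domain.
async function printDomains(account: string): Promise<void> {
  const res = await fetch(`https://dev.azure.com/${account}/_apis/resourceareas/`);
  const body = (await res.json()) as { count: number; value: ResourceArea[] };
  for (const domain of uniqueDomains(body.value)) {
    console.log(domain);
  }
}
```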

The list isn't complete, but it's much more extensive than any Stack Overflow or MSDN forum post I've seen out there.

Add to that the whitelisted domains from the Azure DevOps content security policy and you're golden:

default-src ;
font-src *.visualstudio.com *.dev.azure.com dev.azure.com *.vsassets.io vsassetscdn.azure.cn ms.gallerycdn.vsassets.io ms.gallerycdn.azure.cn *.microsoft.com *.sharepointonline.com ;
style-src 'unsafe-inline' *.visualstudio.com *.dev.azure.com dev.azure.com cdn.vsassets.io vsassetscdn.azure.cn ms.gallerycdn.vsassets.io ms.gallerycdn.azure.cn ;
connect-src *.visualstudio.com wss://*.visualstudio.com *.dev.azure.com dev.azure.com wss://*.dev.azure.com wss://dev.azure.com *.vsassets.io vsassetscdn.azure.cn ms.gallerycdn.vsassets.io ms.gallerycdn.azure.cn *.blob.core.windows.net ;
img-src http: https: blob: data: ;
script-src 'unsafe-inline' *.visualstudio.com *.dev.azure.com dev.azure.com https://cdn.vsassets.io https://vsassetscdn.azure.cn https://ms.gallerycdn.vsassets.io https://ms.gallerycdn.azure.cn *.ensighten.com *.microsoft.com *.google-analytics.com 'nonce-rl90+lCEC+5X3/aWhviaLg==' ;
child-src blob: tfs: * ;
media-src http: https:

And the beauty of combining these two methods? They're updated automagically by Microsoft every time something changes, so you can simply monitor them and apply the diffs after a quick review.

Fail your builds when tests are skipped in Azure DevOps Pipelines


When the Visual Studio Test task in Azure DevOps Pipelines fails to find any tests, it logs a warning and happily succeeds. It has been a regular request in the MVP community to do something about that and to ensure that builds fail when no tests have been executed.

Since test results are published to the build results, it turned out to be quite easy to create a Server Task which handles this for you. This post will both introduce the task and explain how I built it.

Ensure Tests Task

You can find the Ensure Tests task in the Visual Studio Marketplace. Once you have it installed you can add it to your build pipeline. To do so you first have to add an "Agentless job":

Add an agentless job

Then configure it to depend on all of the other phases in your build pipeline that run tests:

Select all the phases that run tests to depend on

Add the "Ensure Tests" task to your Agentless job and you're all set!

Add "Ensure tests have executed" task to your Agentless phase.

How does it work?

The "Ensure tests have executed" task relies on the information that's automatically captured by Azure DevOps Pipelines when tests are executed. Test results and coverage data are automatically uploaded, and this data is continuously available, even while the build is running.

The task queries the test result API and uses the same endpoint that's used by the Build Result page. I used the Chrome Developer Tools to look up the endpoint used by the build summary page to display the Total Tests number:

Use Chrome Developer Tools to find the correct request

The result JSON contains quite a few useful metrics. The one I used was totalTests, but as you can see it would also be easy to check the number of test runs, or increases and decreases in the number of tests, etc.

{
  "aggregatedResultsAnalysis": {
    "previousContext": {
      "contextType": 0,
      "build": null,
      "release": null
    },
    "resultsDifference": {
      "increaseInTotalTests": 16,
      "increaseInFailures": 0,
      "increaseInPassedTests": 16,
      "increaseInOtherTests": 0,
      "increaseInDuration": "00:00:02.5170000"
    },
    "totalTests": 16,                             // Number of tests
    "duration": "00:00:02.5170000",
    "resultsByOutcome": {
      "Passed": {
        "outcome": "passed",
        "count": 16,
        "duration": "00:00:02.1570000"
      }
    },
    "runSummaryByState": {
      "Completed": {
        "state": "completed",
        "runsCount": 1,                           // Number of test runs
        "resultsByOutcome": {
          "Passed": {
            "outcome": "passed",
            "count": 16,
            "duration": "00:00:02.1570000"
          }
        }
      }
    },
    "runSummaryByOutcome": {
      "Passed": {
        "runsCount": 1
      }
    }
  },
  "testFailures": {
    "previousContext": null,
    "newFailures": {
      "count": 0,
      "testResults": []
    },
    "existingFailures": {
      "count": 0,
      "testResults": []
    },
    "fixedTests": {
      "count": 0,
      "testResults": []
    }
  },
  "testResultsContext": {
    "contextType": "build",
    "build": {
      "id": 120,
      "definitionId": 0,
      "uri": "vstfs:///Build/Build/120"
    },
    "release": null
  },
  "teamProject": {
    "id": "068e8d2d-878b-405c-9d71-e653b8412284",
    "name": "PSD-001",
    "state": "unchanged",
    "visibility": "unchanged"
  }
}
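Before wiring the condition into a task, it helps to sanity-check the logic against a captured response. Here's a minimal TypeScript sketch of the check; the `summary` literal is a trimmed copy of the response above, and the real task evaluates `jsonpath('$.aggregatedResultsAnalysis.totalTests')[0]` server-side instead:

```typescript
// Trimmed copy of the ResultSummaryByBuild response shown above.
const summary = {
  aggregatedResultsAnalysis: {
    totalTests: 16,
    runSummaryByState: { Completed: { runsCount: 1 } }
  }
};

// Local equivalent of the server expression le(1, totalTests):
// the condition passes only when at least one test was executed.
const testsExecuted = 1 <= summary.aggregatedResultsAnalysis.totalTests;
console.log(testsExecuted ? "condition passed" : "condition failed");
```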

I then used my Server Expression Tester to create the correct condition to pass or fail the build using the above information.

You can find more details on the Server Expression Tester in a previous post and download the latest version from GitHub.
Use the Server Expression Tester to validate the Condition Expression for the task using the captured json response.

And then I proceeded to wrap that into a task:

{
  "id": "25d3d29e-5ea1-4453-9ce6-02e1b34ab30c",
  "name": "Ensure tests have executed.",
  "friendlyName": "Ensure tests have executed.",
  "description": "Ensure tests have executed.",
  "author": "Jesse Houwing",
  "helpMarkDown": "",
  "category": "Test",
  "version": {
    "Major": 0,
    "Minor": 0,
    "Patch": 3
  },
  "visibility": [
    "Build",
    "Release"
  ],
  "runsOn": [
    "Server"
  ],
  "preview": true,
  "instanceNameFormat": "Ensure tests have executed",
  "inputs": [],
  "execution": {
    "HttpRequest": {
      "Execute": {
        "EndpointId": "",
        "EndpointUrl": "$(system.teamFoundationCollectionUri)$(System.TeamProject)/_apis/test/ResultSummaryByBuild?buildId=$(Build.BuildId)",
        "Method": "GET",
        "Body": "",
        "Headers":"{\n\"Content-Type\":\"application/json\"\n, \"Authorization\":\"Bearer $(system.accesstoken)\"\n}",
        "WaitForCompletion": "false",
        "Expression": "le(1, jsonpath('$.aggregatedResultsAnalysis.totalTests')[0])"
      }
    }
  }
}
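If you'd also like to fail the build when no test run completed at all, a stricter variant of the condition could combine both checks. This is a sketch based on the `runSummaryByState` path visible in the captured response earlier; I haven't verified it against every server version:

```json
"Expression": "and(le(1, jsonpath('$.aggregatedResultsAnalysis.totalTests')[0]), le(1, jsonpath('$.aggregatedResultsAnalysis.runSummaryByState.Completed.runsCount')[0]))"
```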

I grabbed the EndpointUrl from the QueryWorkItems task in Microsoft's vsts-tasks repository on GitHub.

Tip: Many of Microsoft's tasks do fancy things. Some of these tricks aren't well documented, but most of them happen out in the open. If you want to replicate a certain behavior, it pays to browse through one of these repositories:

- https://github.com/Microsoft/vsts-tasks/
- https://github.com/Microsoft/vsts-task-lib
- https://github.com/Microsoft/vsts-agent

Not only will you find examples and test approaches, but also preliminary documentation.

I packaged the task up in an extension and published it using the CI/CD Tools for Azure DevOps Extensions like I do with all of my extensions.

It looks like people are finding this a useful extension: there have been 21 installs, nothing but 5-star reviews, and I've already merged the first pull request. What other test metrics would you like to Ensure as part of your build pipeline? Leave a comment below or file an issue on GitHub!

Photo by: Jon Ardern used under Creative Commons

Pipeline extension scoped to a specific TFS or Agent version

In The Netherlands more and more cities are creating a virtual wall around their city centers to keep out older cars, certain fuel types or lorries. This is happening elsewhere in Europe as well. A similar thing needs to happen in the Azure DevOps marketplace now that the number of supported target server versions is increasing.

As an extension developer it is becoming harder and harder to retain compatibility with all the different TFS versions out there, as well as with Azure DevOps and its high release frequency.

While Microsoft officially only supports the RTM and the latest Update pack of each major TFS version (2015, 2015.4.1, 2017, 2017.3, 2018.0.1 and 2018.3), many clients are lingering on some version in between 2015 and 2018.3 and wish for your extensions to work with their version.

Up until now I've simply had my extensions target Azure DevOps and TFS. No version limitations. Yet with the introduction of YAML, Output Variables and more modern versions of the underlying Node runtime, I'm considering adding features which may break the 2015 and 2017 versions of my build tasks.

There are multiple places where a task developer can scope their extensions or tasks. Let's have a quick look at our options:

Ways to target your extension to Azure DevOps and/or TFS

The simplest option you have is to scope the vss-extension.json manifest file to Azure DevOps or TFS using the Installation targets:

Target                                  Available on
Microsoft.VisualStudio.Services.Cloud   Azure DevOps
Microsoft.TeamFoundation.Server         Team Foundation Server
Microsoft.VisualStudio.Services         Azure DevOps and Team Foundation Server

You can either list support for Azure DevOps and Team Foundation Server explicitly or use the older Visual Studio Services option that still stems from the time Azure DevOps was called Visual Studio Online.

{
    "targets": [
        {
            "id": "Microsoft.VisualStudio.Services.Cloud"
        },
        {
            "id": "Microsoft.TeamFoundation.Server",
        }
    ]
}

Is the explicit equivalent of:

{
    "targets": [
        {
            "id": "Microsoft.VisualStudio.Services"
        }
    ]
}

Ways to target specific versions of Team Foundation Server

In my case I want to support TFS 2018 and Azure DevOps, since TFS 2018 has support for the new Output Variables. In order to exclude older versions of TFS you'll need to add a version to your target. The exact syntax is clearly defined in the docs.

{
    "targets": [
        {
            "id": "Microsoft.VisualStudio.Services.Cloud"
        },
        {
            "id": "Microsoft.TeamFoundation.Server",
            "version": "[16.0,)"
        }
    ]
}

It's better to target API versions instead of Servers

If you're dependent on the availability of a specific API version, then you can add demands to your extension manifest instead of a version range. These demands ensure that the server you're installing into has the required server side APIs available:

{
    "demands": [
        "api-version/3.0"
    ]
}

You can demand the availability of an extension point as well:

{
    "demands": [
        "contribution/ms.vss-dashboards-web.widget-catalog"
    ]
}

The marketplace will compare the demands against the known APIs in Team Foundation Server and will show the correct list of supported servers, even if a demand is added to a later update pack without your knowledge. Nifty eh? The same should go for Preview features.

Unfortunately there is no way to demand specific Build and Release features. You can't demand "Release Gates available" as an option, though you may be able to work around this by looking up some of the UI extension points for these features.
Similarly, you can't demand a minimum agent version for your extension. You can for tasks, but that won't prevent people from installing the extension to a TFS server version that will never be able to run your tasks.

So what about build and release tasks?

Your build task contribution comes with its task.json manifest and you can tweak a few settings there to show support for different versions.

minimumAgentVersion

If you're dependent on a feature in the VSTS Task API that depends on a specific version of the agent, you can set the minimumAgentVersion property in the Task Manifest. A comprehensive list of agent versions and API features can be found in the vsts-task-lib docs.

The same page also lists which versions of the Agent have shipped with which versions of TFS. This will allow you to set the supported target version in the vss-extension.json accordingly.

{
  "minimumAgentVersion": "1.83.0"
}

demands

Another way to handle compatibility checks is through Demands. Demands are matched against the capabilities of the available Agents and against the capabilities provided by other tasks (like Tool Installers).

As a task author you can also specify custom demands that Agent Administrators need to configure on each Agent. This signals that an agent is compatible and will at least force an Azure DevOps Pipelines administrator to look into the compatibility requirements of your task.

A long list of demands is available by default, but unfortunately, you can't perform any logic on them. You can only detect the presence of a demand from your task manifest.
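For example, a task manifest could demand one of the default capabilities from the table below, or a custom one. The `supports-my-task` name here is hypothetical; agent administrators would have to add it as a user capability on each agent:

```json
{
  "demands": [
    "npm",
    "supports-my-task"
  ]
}
```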

Capability name Capability value
Agent.ComputerName SNAPPIE
Agent.HomeDirectory C:\TfsData\jessehouwing
Agent.Name SNAPPIE
Agent.OS Windows_NT
Agent.OSVersion 10.0.17763
Agent.Version 2.136.1
ALLUSERSPROFILE C:\ProgramData
APPDATA C:\WINDOWS\ServiceProfiles\NetworkService\AppData\Roaming
AzurePS 5.7.0
ChocolateyInstall C:\ProgramData\chocolatey
Cmd C:\WINDOWS\system32\cmd.exe
CommonProgramFiles C:\Program Files\Common Files
CommonProgramFiles(x86) C:\Program Files (x86)\Common Files
CommonProgramW6432 C:\Program Files\Common Files
COMPUTERNAME SNAPPIE
ComSpec C:\WINDOWS\system32\cmd.exe
docker C:\Program Files\Docker\Docker\Resources\bin\docker.exe
DotNetFramework C:\Windows\Microsoft.NET\Framework64\v4.0.30319
DotNetFramework_2.0 C:\Windows\Microsoft.NET\Framework\v2.0.50727
DotNetFramework_2.0_x64 C:\Windows\Microsoft.NET\Framework64\v2.0.50727
DotNetFramework_3.0 C:\Windows\Microsoft.NET\Framework\v3.0
DotNetFramework_3.0_x64 C:\Windows\Microsoft.NET\Framework64\v3.0
DotNetFramework_3.5 C:\Windows\Microsoft.NET\Framework\v3.5
DotNetFramework_3.5_x64 C:\Windows\Microsoft.NET\Framework64\v3.5
DotNetFramework_4.7.0 C:\Windows\Microsoft.NET\Framework\v4.0.30319
DotNetFramework_4.7.0_x64 C:\Windows\Microsoft.NET\Framework64\v4.0.30319
DriverData C:\Windows\System32\Drivers\DriverData
ES_HEAP_SIZE 9600m
FSHARPINSTALLDIR C:\Program Files (x86)\Microsoft SDKs\F#\10.1\Framework\v4.0\
InteractiveSession FALSE
java C:\Program Files (x86)\Java\jre1.8.0_181
java_8 C:\Program Files (x86)\Java\jre1.8.0_181
JAVA_HOME C:\Program Files\Microsoft Team Foundation Server 15.0\Search\Java\jre1.8.0_141
LOCALAPPDATA C:\WINDOWS\ServiceProfiles\NetworkService\AppData\Local
MSBuild C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\MSBuild\15.0\Bin\
MSBuild_12.0 C:\Program Files (x86)\MSBuild\12.0\bin\
MSBuild_14.0 C:\Program Files (x86)\MSBuild\14.0\bin\
MSBuild_14.0_x64 C:\Program Files (x86)\MSBuild\14.0\bin\amd64\
MSBuild_15.0 C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\MSBuild\15.0\Bin\
MSBuild_15.0_x64 C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\MSBuild\15.0\Bin\amd64\
MSBuild_2.0 C:\Windows\Microsoft.NET\Framework\v2.0.50727\
MSBuild_2.0_x64 C:\Windows\Microsoft.NET\Framework64\v2.0.50727\
MSBuild_3.5 C:\Windows\Microsoft.NET\Framework\v3.5\
MSBuild_3.5_x64 C:\Windows\Microsoft.NET\Framework64\v3.5\
MSBuild_4.0 C:\Windows\Microsoft.NET\Framework\v4.0.30319\
MSBuild_4.0_x64 C:\Windows\Microsoft.NET\Framework64\v4.0.30319\
MSBuild_x64 C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\MSBuild\15.0\Bin\amd64\
MSMPI_BIN C:\Program Files\Microsoft MPI\Bin\
node.js C:\Program Files\nodejs\node.exe
npm C:\Program Files\nodejs\npm.cmd
NUMBER_OF_PROCESSORS 8
OS Windows_NT
PATHEXT .COM,.EXE,.BAT,.CMD,.VBS,.VBE,.JS,.JSE,.WSF,.WSH,.MSC,.RB,.RBW
PowerShell 5.1.17763.1
PROCESSOR_ARCHITECTURE AMD64
ProgramData C:\ProgramData
ProgramFiles C:\Program Files
ProgramFiles(x86) C:\Program Files (x86)
ProgramW6432 C:\Program Files
PSModulePath %ProgramFiles%\WindowsPowerShell\Modules,C:\WINDOWS\system32\WindowsPowerShell\v1.0\Modules,C:\Program Files (x86)\Microsoft SQL Server\130\Tools\PowerShell\Modules,C:\Program Files\Microsoft Azure Recovery Services Agent\bin\Modules\
PUBLIC C:\Users\Public
PYTHON C:\Users\JesseHouwing\AppData\Local\Programs\Python\Python36\
SEARCH_ES_INDEX_PATH C:\TfsData\Search\IndexStore
SqlPackage C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe
SystemDrive C:
SystemRoot C:\WINDOWS
TEAMCITY_SERVER_OPTS -Djava.library.path=C:\ProgramData\JetBrains\TeamCity\lib\jdbc\sqljdbc_auth.dll
TEMP C:\WINDOWS\ServiceProfiles\NetworkService\AppData\Local\Temp
TMP C:\WINDOWS\ServiceProfiles\NetworkService\AppData\Local\Temp
USERPROFILE C:\WINDOWS\ServiceProfiles\NetworkService
VisualStudio C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\
VisualStudio_15.0 C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\
VisualStudio_IDE C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\Common7\IDE\
VisualStudio_IDE_15.0 C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\Common7\IDE\
VS140COMNTOOLS C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\Tools\
VSTest C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\Common7\IDE\CommonExtensions\Microsoft\TestWindow
VSTest_15.0 C:\Program Files (x86)\Microsoft Visual Studio\Preview\Enterprise\Common7\IDE\CommonExtensions\Microsoft\TestWindow
windir C:\WINDOWS
WindowsKit C:\Program Files (x86)\Windows Kits\8.1\
WindowsKit_8.1 C:\Program Files (x86)\Windows Kits\8.1\
WindowsSdk C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A
WindowsSdk_8.1 C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A
WindowsSdk_8.1_NetFx40Tools C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools
WindowsSdk_8.1_NetFx40Tools_x64 C:\Program Files (x86)\Microsoft SDKs\Windows\v8.1A\bin\NETFX 4.5.1 Tools\x64
WIX C:\Program Files (x86)\WiX Toolset v3.11\

And you'd almost expect to be able to read these agent capabilities from your task, but that's not the case either. Some of these are system level and user level environment variables, so you can get to their values that way; others are supplied by the agent itself and aren't easy to resolve.

Querying the agent version

The agent sets a variable you can read from your tasks, and the task library provides a function to ensure you're running on the right agent:

/**
 * Asserts the agent version is at least the specified minimum.
 *
 * @param    minimum    minimum version - must be 2.104.1 or higher
 */
export function assertAgent(minimum: string): void {
    if (semver.lt(minimum, '2.104.1')) {
        throw new Error('assertAgent() requires the parameter to be 2.104.1 or higher');
    }

    let agent = getVariable('Agent.Version');
    if (agent && semver.lt(agent, minimum)) {
        throw new Error(`Agent version ${minimum} or higher is required`);
    }
}

As you can see in the snippet above, this variable was introduced with agent 2.104.1; detecting its absence tells you you're on an older agent, but not which one.
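A self-contained sketch of the same comparison, assuming simple dotted `x.y.z` version strings and skipping the semver dependency the task library uses:

```typescript
// Compare dotted version strings numerically, e.g. "2.104.1" vs "2.136.1".
function versionAtLeast(actual: string, minimum: string): boolean {
  const a = actual.split(".").map(Number);
  const m = minimum.split(".").map(Number);
  for (let i = 0; i < Math.max(a.length, m.length); i++) {
    // Missing segments count as zero, so "2.104" equals "2.104.0".
    const av = i < a.length ? a[i] : 0;
    const mv = i < m.length ? m[i] : 0;
    if (av !== mv) return av > mv;
  }
  return true; // equal versions satisfy the minimum
}

// Example: the capability table above reports Agent.Version 2.136.1.
console.log(versionAtLeast("2.136.1", "2.104.1")); // true
```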

If you'd ask me, this isn't ideal yet.

As a task author there is no way to package multiple tasks in the same extension with different supported target platforms. There's no way to ship a version which presents a different UI in 2015 than in 2018, and the gap between the features available across all the versions of TFS in the field is slowly growing.

I hope the Azure DevOps or Visual Studio Marketplace team will be able to provide us with ways to target specific extension versions to specific TFS versions. That may be the only backwards compatible way that will allow us to keep using the same Extension ID and Task ID in our extensions, which is the only way to not break your users on upgrades.

Physical and Digital tools for Scrum Masters and their teams

About 40 people gathered at the nlScrum meetup dedicated to physical and digital tools for the agile workspace. Tools that help developers, coaches and trainers to survive in the toughest environments.

We broke the evening up into a hybrid between an Open Space and your local farmers' market, with people displaying their wares and the tools they use. Rosenboom showcased their office solutions, including interactive smart screens and high quality whiteboards, among the usual assortment of markers and post-its. And Plantronics brought their collection of noise cancelling headphones for much-needed concentration or focused remote collaboration, as well as a couple of their conference kits that link up to a phone or laptop to turn any room into a reasonably good conference room.

I had taken possession of one such table to show the little things I carry around or have in the back of my car in case I might need them.

My own bag

In my bag I carry a broad assortment of personal gadgets as well as things I use for teaching and facilitation. Especially when teaching in-company classes (which often take place in the offices of the client), I bring some back-up materials as you never really know what kind of facilities the room will have.

Physical tools

Roll of static whiteboard film

I have a roll of Legamaster Magic Chart in the back of my car. It's a roll of static film you can throw against almost any surface to act as an impromptu whiteboard. You can use as many sheets side-by-side as you want to create a drawing surface of just about any size.

While I prefer a proper high quality whiteboard, it's better than having none, or having to use a whiteboard that lost its shine a long, long time ago.

There are many different brands available.

Adhesive magnetic whiteboard

If it's your own whiteboard that could use some proper renovation, or you'd like to turn an existing wall or pane of glass into a (magnetic) whiteboard, then it's good to know that there's an option to just stick a new writing surface on any flat object.

And it's not just to repair old whiteboards, you can turn any door, glass panel, table or even a concrete support pillar into a high quality writable magnetic surface. Rosenboom brought samples of Legamaster Wrap-Up.

3M offers similar products, their magnetic whiteboard film is somewhat of a hybrid between the static film and the adhesive whiteboard. Their solution can be applied to metallic surfaces and doesn't require any adhesives.

3M table stand flip-charts

In many of the classes we teach, the group is divided into teams. Each team is asked to discuss and visualize things, and often we run out of flip-charts quickly. This is why I have a couple of 3M table stand flip-charts in my car. Think of these as mini flip-charts as well as maxi post-its. They're super sticky, and teams can quickly put their results up on a wall or window.

Another advantage of these is that they are easy to hide under the tables during presentations or demonstrations and they fold flat for transportation.

Embrava BlyncLight

When you're working in a busy office and are dealing with many interruptions, sometimes you need a little time for yourself to concentrate and really get some important things done. Especially teams working in an open floor plan office (don't get me started) can suffer greatly from the high number of interruptions. For those setups I sometimes use my Embrava BlyncLight: a small USB controlled LED that can be synced to your Skype status (it supports many more communication tools) or set manually through a keyboard shortcut.

I happen to use their USB-connected device, but Embrava also offers wireless versions. I would forget to charge these I'm afraid. Their most recent addition in the series is an e-paper nameplate for offices that don't provide people with a fixed desk to work on. I don't really agree with the term "Agile Workspaces" which Embrava uses to describe these.

Embrava even ships an SDK which allows you to integrate their light into any application. In the past I've used it as a Build Status Monitor and it's probably quite easy to set the color based on the number of bugs in your Azure DevOps project.

Portable speakers

I happen to own a JBL Charge 2 and I like it very much, but it probably won't matter a lot which brand you end up using. I bought my Charge 2 for the following reasons:

  • Built-in battery
  • Ability to charge my phone on-the-go
  • Bluetooth as well as USB connection
  • Built-in microphones for conference call support
  • Enough sound to fill a small training room

Many training rooms are equipped with some form of Audio/Video equipment, but unless you've brought the right converter cables, and unless you're very lucky, it's often not easy to get everything to work. Even if you can get the screen/projector to work, the audio is often the worst part.

For those places I bring these portable speakers. The Charge 2 can produce enough noise to show short YouTube clips to participants without causing issues.

It also doubles as a Bluetooth conference call set so you can put it on a table and quickly have a conversation with people on the other end of the world. Either from your phone or from your laptop. It works with Skype, Teams, Slack you name it.

The Charge 2 has been superseded by the Charge 4, which offers a very similar feature set; maybe you'll be able to find a 2 or 3 at a discount.

Plantronics Blackwire 725

Even though I hate open floor plans, it's almost impossible to do consulting work and not encounter them on a regular basis. The added noise doesn't just make it hard to concentrate, but the lack of small break-out rooms often also makes it hard to have a good phone conversation or join a conference call.

After my previous headset suffered a fatal cable problem, I switched to a USB-powered active noise cancelling (ANC) headset with a long cable. One less device to charge, but still some room to move. While I love the freedom of a Bluetooth headset on a charging stand, that's not something I want to carry around all day in my backpack.

After trying a few headsets, I ended up with these Plantronics. The crisp sound allows me to hear the other participants of a conference call, and the high quality mic ensures they can hear me. I love the dedicated mute button on the cable: it will mute/un-mute the mic regardless of which application has focus. The noise cancellation works two ways: it works in the ear pieces to cancel noise, and it works on the microphone to prevent noise from your end spilling over into a conference call.

Google USB-A to USB-C adapter

I use the Plantronics Blackwire 725 with the USB-A to USB-C adapter that came with my Pixel 2 XL. Though officially meant for transferring your data from one device to another, it can also be used to connect (and power) small USB devices such as storage devices and USB headsets.

This will allow you to use the Plantronics headset to receive phone calls, join Skype calls from your phone etc. It's also great for listening to a couple of podcasts all the while enjoying the same Active Noise Cancelling features. The only disadvantage is that the ANC is powered from your phone's battery, so it will drain a little faster.

Google also sells a similar adapter separately and Plantronics offers a similar adapter as well.

Microsoft Wireless Display Adapter

When delivering a presentation, it's common to really stand in front of the audience while presenting. When delivering training or when facilitating you may want to be mobile or be part of the group.

Or when you're in the unlucky situation that it's not just the audio that's bad, but even the video cables are letting you down.

For those occasions I bring my Microsoft Wireless Display Adapter. These clever devices allow you to wirelessly connect to any screen or projector that has an HDMI port. It uses a protocol called Miracast, making it compatible with every Windows 10 and modern Android device. It adds a true second screen, so your PowerPoint presenter mode will work without issues.

I have protected mine with a pass code; that way people won't be able to take over my presentation and project unwanted content in the middle of it.

If you're an Apple user, you may need to bring an AppleTV instead, Apple has decided not to support Miracast in favor of their own proprietary AirPlay protocol.

ClickShare

Despite a lot of recent negative tweets, I like the ClickShare solution for allowing people to present on the big screen. When delivering a Professional Scrum Developer class we ask each team to present their work to the other teams. Switching cables, finding the right adapters and setting the right resolutions seems to be harder and harder these days. It's not uncommon to have people from the same company in a single class, yet need:

  • Lightning to HDMI
  • Micro HDMI to HDMI
  • Displayport to HDMI
  • Mini Displayport to HDMI
  • USB-C to HDMI

... adapters to allow 5 different teams to present (yes, our preferred connector is HDMI). To add to that, long cables are fragile, expensive and never long enough.

So instead, our offices in Amsterdam are equipped with ClickShare and we have a couple of small USB-dongles that are synced to the screen. They're also equipped with HDMI, but we primarily use that for the main presenter/trainer.

With ClickShare each team is just the click of a button away from sharing their desktop directly to the big screen.

By installing the ClickShare Extension Pack you can use your connected dongle as a Wireless Display Adapter, adding true second screen support and enabling Presenter mode in PowerPoint. But the bandwidth isn't good enough to stream HD video or very fancy animations.

Digitalinx

While the Microsoft Display Adapter and ClickShare are great solutions, I always love it when I get to a client that has a Digitalinx multi-adapter bolted to their main HDMI cable.

The Digitalinx adapter ring allows you to literally bolt the most common HDMI adapters directly to the HDMI cable. That way they won't get lost, won't end up in the bags of your guest presenters by accident, and are always available where and when you need them.

This will allow just about every modern device to connect, though you may need to order USB-C and Apple adapters separately and bolt them to the cable using the adapter ring.

Neuland markers

My colleague Laurens Bonnema took up sketch-noting a couple of years ago, and while I'm nowhere near his level, I have found that proper use of colors, borders and highlights really makes the difference between a standard flip-chart drawing and one that stands out.

Some of Simon Reindl's use of Neuland markers at the last PSM Train-the-Trainer

In the past I used the standard red-green-blue-black markers you can find on every whiteboard, but after seeing Laurens, Ralph Jocham and Simon Reindl in action with their Neulands, I decided to upgrade. In my trainer bag there are now two boxes and two bags of:

I started out with the BigOnes, but if you'd ask me now I'd probably recommend the No.One Art over them. These are smaller and have a brush-like tip, offering more flexibility. The default color set is pretty good; I added dark grey, Xpirit orange and signal red.

Neuland is having a year of celebrations: every month you get another free product. So far I've received a free bag for my markers, fine-liners and pastels.

I use the finer ones for note taking. I found that adding small drawings in between my written notes allows me to remember more details of the meetings and events I attend.

Apart from a wide selection of colors and types, the fact that you can refill them as well as replace the tips will save you money and the environment in the long run.

Digital tools

Visual Studio Live Share

In the past, when I wanted to work together on the same piece of code with a remote co-worker, or even a colleague working in the same office but a couple of floors away, a quick screen share over Zoom or Skype for Business was the best we could do. While not bad, it has a pretty high bandwidth demand, especially when you also add live video and audio.

We all know that communication works better when two people can at least see each other and spoken word works better than written text in many cases.

The problem with the traditional Screen Sharing solutions is that you're forced to look at the same code and usually only a single person can use the keyboard.

Visual Studio Live Share takes away these limitations. It's best described as Office Online or Google Docs for your code and it works in Visual Studio and Visual Studio Code on Windows, Mac and Linux.

Visual Studio Live Share

I foresee a whole range of cool scenarios with this new tool:

  • Remote Pair and Mob Programming
  • Enabling live 4-eyes principle for analyzing and solving issues on environments that may currently be off-limits to at least 2 of the 4 eyes.
  • Collaborative merge-conflict resolving

It's something I've long wanted to see more of. I remember talking to David Starr, when Code Lens first came out, about some of the scenarios that Visual Studio Live Share now enables by integrating presence and ownership into Visual Studio. These are exciting times indeed!

Spartez Agile Cards for Azure DevOps and Jira

The topic of physical vs digital boards came up surprisingly often during the evening. And a common thread was that while most people preferred a physical board, digital had a couple of advantages as well.

Having both adds the overhead of having to sync between them; this is where Spartez Agile Cards comes in. It annotates your physical board with QR codes and requires you to add a QR code to each of your stickies. Using those codes it can detect which item is in which state, and it can either offer to move the digital cards to be in sync with your physical ones, or spit out an annotated photograph that helps you quickly move the physical cards to be in sync with the digital ones.

Auto-update your digital board after a day in the office.

Spartez Agile Cards is available for:

I'd love to hear from you

What tools do you use, what things do you carry around?


Tasks and Release Gates on RadioTFS

In this episode of RadioTFS we spoke about the Global DevOps Bootcamp, Azure Pipelines and more specifically Release Gates and setting up a pipeline for your own Azure DevOps Extensions and some other recent blog posts.

Global DevOps Bootcamp:

Release Gates:

CI/CD Pipelines for Extensions:

Other:

And you can listen to the Podcast over at RadioTFS.

Tasks and Release Gates on RadioTFS
Thursday 18 October 2018
In this episode Greg is joined by returning guest, Jesse Houwing. The two geek out for about an hour, chatting about Scrum Bug, Global DevOps Bootcamp, Release Gates (and a must have utility when creating your own), CI/CD Tasks for Azure Pipelining your Azure DevOps extension, TFS Aggregator and much more...

VSTS Build & Release to Azure DevOps Pipelines tasks

Out with the old, in with the new! With the rename of Visual Studio Team Services Build and Release Management to Azure DevOps Pipelines all of the underlying libraries are being renamed too.

In order to stay up to date and receive the latest updates you'll need to make a few small changes to your existing build tasks:

Update your vss-extension.json

The categories in the Marketplace have changed to reflect the new names of the hubs in Azure DevOps. Your extensions must be updated to reflect this change:

  "description": "Snyk continuously finds and fixes vulnerabilities in your dependencies.",
  "categories": [
-    "Build and release"
+    "Azure Pipelines"
  ],

Change the category of your extension in your extension manifest from its old name to one of the new names: Azure Repos, Azure Boards, Azure Pipelines, Azure Test Plans or Azure Artifacts.

Remove your dependency on vsts-task-lib

The vsts-task-lib has been renamed to azure-pipelines-task-lib, and you'll need to update your package.json to pick up this new library:

npm uninstall vsts-task-lib --save
npm install azure-pipelines-task-lib@latest --save

Or manually update your package.json:

  "dependencies": {
-    "vsts-task-lib": "^2.3.0"
+    "azure-pipelines-task-lib": "^2.7.5"
  },

Update your typescript files to import the new library

And now that you have a new module you depend on, you'll need to import it in your typescript files:

- import * as tl from "vsts-task-lib/task";
- import * as tr from "vsts-task-lib/toolrunner";
+ import * as tl from "azure-pipelines-task-lib/task";
+ import * as tr from "azure-pipelines-task-lib/toolrunner";

After changing your import / require statements, all the existing functions should remain operational, unless you're upgrading from a very old version of the vsts-task-lib.

Update to the latest version of tfx-cli

If you're using the old version of tfx-cli it will balk at the new extension categories. The easiest way to fix that is to update to the latest version of tfx-cli, but if you're stuck on an older version for whatever reason, suppress local validation to get your extension published:

tfx extension publish --bypass-validation

And if you're using the Azure DevOps Tasks for Extensions, make sure you're on 1.2.8 or later; otherwise you'll have to enable this feature:

Use Bypass local validation if you're on an older version of the Azure DevOps Extension Tasks.

I've just upgraded my first task, the Snyk Task, and you can see all the changes here. I've done a bit of refactoring in this release as well, and the old name has mostly been eradicated. The only thing I'll be stuck with is the Extension ID: all of my extensions have vsts in their ID, and there is no way to get rid of that without breaking all my users.

Launch WSL bash prompt from Tower


When you launch a terminal from Tower, it launches an included MingW bash shell. Now that Windows 10 ships with the Windows Subsystem for Linux it would be nice to use that bash shell instead.

My first attempts at launching WSL failed miserably. It looks like Windows File System Redirection prevents Tower from seeing wsl.exe or bash.exe, because Tower is a 32-bit program.

After a number of tries, I found that you can use the sysnative file system redirection point to escape the 32-bit world:

Title: Bash (WSL)
Path: C:\WINDOWS\sysnative\wsl.exe
Arguments:

Launch Windows Subsystem for Linux from Tower
Launching the Windows Subsystem for Linux bash prompt from Tower

Uninstall Visual Studio extension from the command line


After installing the Visual Studio 2019 preview and all my favorite extensions, Visual Studio wasn't able to start. The primary candidates for the freeze were ReSharper and OzCode. ReSharper has an installer which allows me to uninstall the extension, but OzCode is a vsix extension and, if you don't know how, can only be uninstalled from the Extensions window in Visual Studio itself. With Visual Studio freezing on start-up, that's not an easy thing to do.

There is an alternative: you can use vsixinstaller to uninstall extensions from the command line. The steps are as follows:

  1. Find the vsix file you used to install the extension
  2. Open it in your favorite archiver (mine is 7-zip)
  3. Grab the extension's GUID from the extension.vsixmanifest.
  4. Run vsixinstaller /u:GUID to remove the extension from a Developer Command Prompt.
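Steps 1 through 3 can also be scripted. Here is a minimal sketch in Python that reads the identifier straight out of the .vsix file (a zip archive), assuming the manifest follows the VSIX 1.x or 2.x schema:

```python
import zipfile
import xml.etree.ElementTree as ET

def vsix_extension_id(vsix_path):
    """Return the extension identifier stored in a .vsix file.

    A .vsix is just a zip archive; the identifier lives in the Id
    attribute of the Identity (VSIX 2.x) or Identifier (VSIX 1.x)
    element of extension.vsixmanifest.
    """
    with zipfile.ZipFile(vsix_path) as vsix:
        manifest = vsix.read("extension.vsixmanifest")
    root = ET.fromstring(manifest)
    for elem in root.iter():
        # Element tags are namespace-qualified, so compare the local name only.
        local_name = elem.tag.rsplit("}", 1)[-1]
        if local_name in ("Identity", "Identifier") and "Id" in elem.attrib:
            return elem.attrib["Id"]
    raise ValueError("no Identity element found in extension.vsixmanifest")
```

Pass the returned identifier to vsixinstaller, e.g. vsixinstaller /u:the-returned-id.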
Find the Visual Studio Extension identifier for an extension
Remove the extension

Photo credit: Lore.

Staged execution of tests in Azure DevOps Pipelines


When running unit tests in your build system, you may want to run the most important suite first (the one that should never fail), then the tests that are currently being changed, and finally the regression suite and integration tests that may be slower to execute.

The Visual Studio Test task supports this, and the feature has been in the product since the first release of the workflow-based Team Build 2010, though many probably never knew it was there.

In your build definition you can easily add multiple test runs and, for each run, define which tests should execute. Not just a fixed list of tests you have to curate, but a dynamic list based on attributes of the tests.

The following steps will take you through the basic gist to get this working in your Azure Pipeline.

1. Identify the tests in different categories

Go through your test base and apply categories to your automated tests. Use the [TestCategory] attribute for MsTest, the [Category] attribute for NUnit and the [Trait] attribute for xUnit.

// MsTest
[TestCategory("Critical")]
[TestMethod]
public void MyCriticalTest {}

// NUnit
[Category("Critical")]
[Test]
public void MyCriticalTest {}

// xUnit
[Trait("Category", "Critical")]
[Fact]
public void MyCriticalTest {}

The Test Filter options for Visual Studio Test can also filter on other characteristics of your test, such as the name of the test, its parent class, namespace or assembly. You can combine these elements:

// MsTest
[TestCategory("Critical")]
[TestPriority(1)]
[TestMethod]
public void MyCriticalTest {}

2. Create multiple test runs in Team Build

Once you've categorized your tests, create the test runs in Azure Pipelines. Each stage or set of tests will map to its own test run. In this example we're going to distinguish critical tests from non-critical tests using a simple attribute.

In your Azure Pipeline add multiple Visual Studio Test tasks, one for each set of tests you want to run:

Add a Visual Studio Test task for each set of tests to run.

We're going to configure the first run to execute the critical tests (and fail the build when any of these tests fail). Only when these tests succeed will we run the non-critical tests.

In the VsTest - Critical Tests configure the test filter to only run Critical tests:

Set the Test filter Criteria to TestCategory=Critical
Name the test run so you can easily distinguish the results.

In the second Visual Studio Test tasks configure the inverse filter:

Filter the second Visual Studio Test task to run the inverse set of tests.
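If you prefer YAML pipelines over the classic editor, the same two-run setup can be sketched roughly like this. The input names below are assumptions based on the VSTest@2 task schema and may need adjusting for your setup:

```yaml
steps:
# First run: critical tests; a failure here fails the build.
- task: VSTest@2
  displayName: 'VsTest - Critical Tests'
  inputs:
    testAssemblyVer2: '**\*Tests.dll'
    testFiltercriteria: 'TestCategory=Critical'
    testRunTitle: 'Critical Tests'

# Second run: the inverse set, only reached when the critical run passed.
- task: VSTest@2
  displayName: 'VsTest - Non-Critical Tests'
  inputs:
    testAssemblyVer2: '**\*Tests.dll'
    testFiltercriteria: 'TestCategory!=Critical'
    testRunTitle: 'Non-Critical Tests'
```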

Alternative filters

Test Categories are a very common way to filter tests, but you may have a large set of tests that already follow a different naming pattern. A common distinction is to put all Integration tests in their own assembly or in a different namespace.

This can be accomplished quite easily as well. Here is an overview of the filtering options you have available.

Assembly name

The Test files option follows the standard Azure Pipelines glob patterns:

  • **/ recursive folders
  • * wildcard search
  • ? single character placeholder

When your Integration tests are in their own projects you can use the project and assembly name to select these tests:

  • **/*IntegrationTests.dll for integration tests
  • **/*UnitTests.dll for unit tests

Namespace or class

The Test filter criteria option can filter on a number of properties; FullyQualifiedName and ClassName (not all test frameworks support the class name filter) can be used to filter. You can either use an exact match (=) or a contains filter (~), for example:

  • FullyQualifiedName~Integration
  • ClassName~Important

The operators and properties are explained here quite well.

Category or Priority

As mentioned above you can use Test Categories, or you can filter on Priority.

  • Priority=1

You can also combine multiple items using & (and) and | (or):

  • Priority=1&TestCategory=Critical
  • Priority=2&TestCategory=Critical
  • Priority=3&TestCategory=Critical

This would queue 3 runs; each runs progressively less important tests that are still marked Critical, before running the rest of the tests.
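You can experiment with these filter expressions locally before wiring them into a pipeline; for example, the same syntax works with the cross-platform dotnet test runner (the project name here is hypothetical):

```shell
# Run only the critical priority-1 tests in the hypothetical MyApp.Tests project
dotnet test MyApp.Tests --filter "Priority=1&TestCategory=Critical"

# Inverse: everything that is not marked Critical
dotnet test MyApp.Tests --filter "TestCategory!=Critical"
```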

Continue on test failure

Each of the test tasks can be configured to not fail the executing pipeline. This is a standard feature that's found on every task on Azure DevOps and Team Foundation Server 2017 or later.

I sometimes use this to filter out flaky tests:

Run your flaky tests in a separate run
Don't fail the pipeline

This way you do get statistics on these tests and can refactor them over time to be stable, yet still get releases out of the door. It's useful on code bases with technical debt and it can help while you're working on cleaning up the mess.

Source: StackOverflow

More Physical and Digital tools for Scrum Masters and their teams


A couple of months ago I blogged about some of the tools and toys that live in the trunk of my car. I take these along everywhere I teach and coach. Since posting, people have suggested additional items that just must be in my toolbox.

Time Timer Plus

Time-boxing is an important component of Scrum. It provides focus towards a goal and prevents you from over-analyzing things. We use time boxes extensively in each Professional Scrum class.

This lightweight, large timer helps visualize the time-box clearly to the class without having to juggle tools on screen.

Thanks: Evelien Roos and Just Meddens for this tip.

Logitech R800/700 and Spotlight

I'm using a trusty old Logitech Wireless presenter R800/700. It's easy to use, has a laser pointer that works no matter what you point it at and has a built-in timer that can be used to warn you when you've been talking for far too long.

The R800 is the US version which is equipped with a green laser pointer. Green lasers are illegal in Europe, so Logitech sells the R700 with a red laser.

It works on two AAA batteries which last for months.

The Spotlight is Logitech's most recent wireless presenter. It requires additional software on your machine and uses sensors to "shine a spotlight" on the part of the screen you want to highlight. It can even act as a magnifying glass.

It has a built-in battery and fast-charges over USB-C. Nowadays that means you always have a full charge at hand and won't have to juggle any batteries.

Thanks: Laurens Bonnema

Super sticky post-its

Most training rooms - and let's face it, also most team rooms - have only a few spaces where normal post-it notes will stick, let alone stay up for a substantial amount of time.

On top of that, many companies try to save money by buying white-label sticky notes. In some cases these have so little sticking power that they will fall off even a decent whiteboard within hours.

Using the proper technique to pull stickies off a stack helps, but what really helps are Extra Sticky post-its from 3M.

Many scrum trainers and agile coaches will teach you how to pull stickies from a stack. I teach this very simple rule-of-thumb: not up, not side, but down: up-side-down.

You can find these post-it notes in all kinds of shapes, sizes and colors.

Tingsha Bells

When I'm working with larger groups it can be hard to quickly return the attention to me to give further instructions, provide a hint or simply signal the end of a time-box.

In experienced groups it can often be enough to simply raise your hand and wait for all participants to respond to the cue. In less experienced groups a pair of Tingsha bells also does wonders. Their sound pierces even the loudest crowds.

Elmo

In class ELMO stands for "Enough, Let's Move On" and participants can hold up an Elmo doll to do a subtle intervention and get the group back on topic or track. I've started using these small vibrating Elmo heads. They're small, lightweight and have a high fun-factor:


When discussions are taking too long, even after subtle signalling, you can sometimes see these little buggers fly through the room to intervene a little less subtly :).

This was part 2 in a series of tools for Scrum Masters, Trainers and Coaches. You can find part 1 here. Are there any tools or toys you'd want to add in a future post? Reply in the comments or send me a tweet!

Even more Physical tools for Scrum Masters and their teams


This is the 3rd post in a series. You can find the older posts here:

Based on LinkedIn and Twitter feedback on previous posts some additions from the field!

Using perforation reinforcement to stick things to the wall

This tip comes all the way from Japan. I love how it uses something for a purpose it wasn't intended for. These little circular stickers are normally used to reinforce perforation holes, and with the small dispenser they can also be used as strong, small pieces of sticky tape!

Make things stick with these small circular pieces of sticky tape in a simple dispenser.

Scotch Restickable Glue Stick

If you're not up to abusing things in ways they weren't intended, then this Scotch glue stick is a great alternative. It turns any piece of paper into a post-it note! Ideal for trainings where we pre-print PBIs, and for when you're combining a physical and a digital board.

Vinyl tape for whiteboards

Some people can't stand warped lines... And you shouldn't write on whiteboards with permanent markers (if you intend to use that whiteboard for other purposes in the future)...

So I use vinyl tape to quickly create semi-permanent, straight lines. Just attach to one end, unroll to the other end of the board and apply the whole line at once.

Instant camera

Add a little retro flavor to your board! Get a mini instant photo camera for your tool bag. Start a training by taking a picture of each participant and put their names on it with a sharpie. Stick them on a magnet to turn them into signal-cues on your Impediment Board...

There are a number of options, some purely old-school (shake it like a Polaroid picture):

Some pack a mini digicam and printer, giving you a digital capture of the photo as well.

As John says: It's basic but fun. The best thing is meeting a team for the first time and getting a photo is a good icebreaker. By adding their name and key skill it can form a lasting reference for stakeholders and people coming into the team space. This has been great on some of the large scaled work I've delivered.

In a training class you can leave these cameras in the room for anyone to capture good vibes, interesting observations etc.; a picture is more powerful than a sentence captured on a post-it.

Thanks: John Erikson

Diversity Markers

I recently attended a Scrum Trainer Face to Face in Melbourne, Australia and had a great conversation with Wai Ling Ko about drawing and diversity. The default set of colours I carry has limited options for skin colours... Dark brown, skin-pink and... yellow? It looks like Neuland has been listening in, because they just released their set of Diversity markers. Combine them with the standard colour set to also include more colourful hairstyles and your drawings will become a lot more inclusive!

Source: Neuland

These are now also in my standard collection!

So what's in your bag?

I've found a couple of new gems since I started writing these posts. What's in your bag? Why should it be in mine? Leave a comment below!


Decrypt BitLocker OS drive of corrupted windows installation


Long story short: I had an issue in Windows which prevented me from booting from the NVMe SSD drive in my laptop. In order to install a fresh copy of Windows to that drive without losing any data stored on it, I needed to decrypt the drive; it turns out you can't install onto an encrypted disk. And to decrypt a disk, you need to be logged on... I ended up using Windows-to-go to solve my problems.

Mount the drive in a different system

The easiest way to solve this problem is by taking the drive and adding it to another system that already runs Windows, boot into that system, unlock the data partition using the BitLocker recovery key and then decrypt it from the BitLocker control panel:

Turn off BitLocker

However, this requires access to a system with a spare NVME slot you can install the drive into. Turns out that most laptops in our company only have a single slot which is already taken by the primary OS disk.

Start Windows in Recovery mode

If there is a local administrator available on the system, you can possibly use that to launch into recovery mode and decrypt BitLocker from there. My system is Azure Active Directory joined and doesn't have a local administrator. That rules out this option as well.

Windows-to-go to the rescue

After going through many different options that all turned out to be fruitless, I remembered Windows-to-go. Windows-to-go is basically a way to launch Windows from a USB key. I downloaded Windows 10 1703 Enterprise Edition (there is a bug in 1809 that will cause a blue screen) from my Visual Studio subscription and used Rufus to create a Windows-to-go USB key.

Use Rufus to create a Windows-to-go key from any Windows ISO file

Insert the USB key into the troublesome system, select it as boot device (optionally turn off Secure Boot) and let it reboot a couple of times until you are presented with a Windows Desktop.

Find the affected drive in Windows Explorer (it will show a lock icon on the drive) and enter the BitLocker Recovery Key to unlock the drive. Now you can open the BitLocker Control Panel and decrypt the drive.

Turn off BitLocker
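If you prefer the command line over the control panel, the unlock and decrypt steps can also be done from an elevated command prompt on the Windows-to-go system using the built-in manage-bde tool. The drive letter and recovery password below are placeholders:

```shell
REM Check the encryption status of the stuck drive
manage-bde -status D:

REM Unlock it with the 48-digit BitLocker recovery password
manage-bde -unlock D: -RecoveryPassword 111111-222222-333333-444444-555555-666666-777777-888888

REM Start decrypting the drive; progress can be followed with -status
manage-bde -off D:
```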

Use the Windows Image Download tool to create a setup drive

Now use the Windows Media Creation Tool to build a Windows 10 setup image and install a fresh Windows 10 over the broken one. You'll have to reinstall all your applications after this, but you won't lose any documents when done correctly.

Use the Media Creation Tool to create a bootable windows installation image
I had used Rufus before to create a bootable Windows image, but somehow those can't install Windows onto a GPT-partitioned hard drive; the USB key provisioned by this tool can.

Lessons learned

So as always when working with BitLocker:

  • Make sure you have a backup of your BitLocker Recovery Key. In my case it's uploaded to Azure Active Directory and stored in 1Password.
  • Create a local admin account with a very complex password in case of emergency. That way you can boot into a Windows Recovery Console and get your data out.
  • Have a Windows-to-go compatible USB key. I've used this one:

While not certified, it worked like a charm:


Yay!

While I have much of my data synced to a cloud storage account, Azure Backup configured, my projects in Azure DevOps or GitHub, it's still much easier to just have all my files where "they are supposed to be" on a drive where I can access them at 300 MB/s.

I'm up and running again.

Connect any version of Visual Studio to Azure DevOps or Team Foundation Server


Team Foundation Server and Azure DevOps, formerly known as Visual Studio Team Services (VSTS), Visual Studio Online (VSO) and Team Foundation Service.


Visual Studio has been around for a long time and there are still people developing in Visual Basic 6 or Visual Studio 2008. I sincerely hope these people store their sources securely, because these old IDEs and codebases will be causing them enough headaches already. Even if you're using a more recent IDE, you may need one or more hotfixes and/or service packs.

To connect your IDE to Azure DevOps you usually need to make sure you have Team Explorer or an extension to your IDE installed. Team Explorer ships with Visual Studio in recent versions, but in older versions it was a separate installation.

The following variables make up what you need to install in order to be able to connect:

  • Your IDE and version
  • Your operating system and version
  • Whether connecting to Azure DevOps or TFS (and which version)

Note that for many of the items listed below, the order of installation is important. If you've previously installed any of the packages you'll need to uninstall them first or repair all packages in the order listed.

Microsoft has an official Client Compatibility matrix. In addition to it, this post also lists the required hotfixes to make everything work.

If you also want to install the Team Foundation Server Power tools to match your Visual Studio/TFS version, check out this separate post.

Client compatibility matrix

See download instructions in the Installation section.

Visual Studio 2019

Supported operating system: Windows 10 1703+, Windows 8.1, Windows 7 SP1

Connects Git TFVC Install instructions
Azure DevOps yes yes yes Team Explorer 2019
Azure DevOps Server 2019 yes yes yes Team Explorer 2019
TFS 2018 yes yes yes Team Explorer 2019
TFS 2017 yes yes yes Team Explorer 2019
TFS 2015 yes yes yes Team Explorer 2019
TFS 2013 yes no yes Team Explorer 2019
TFS 2012 yes no yes Team Explorer 2019
TFS 2010 yes no yes Team Explorer 2019
TFS 2008 no no no
TFS 2005 no no no

Visual Studio 2017

Supported operating system: Windows 10, Windows 8.1, Windows 7 SP1

Connects Git TFVC Install instructions
Azure DevOps yes yes yes Team Explorer 2017
Azure DevOps Server 2019 yes yes yes Team Explorer 2017
TFS 2018 yes yes yes Team Explorer 2017
TFS 2017 yes yes yes Team Explorer 2017
TFS 2015 yes yes yes Team Explorer 2017
TFS 2013 yes no yes Team Explorer 2017
TFS 2012 yes no yes Team Explorer 2017
TFS 2010 yes no yes Team Explorer 2017
TFS 2008 no no no
TFS 2005 no no no

Visual Studio 2017 for Mac

Supported operating system: Mac

Connects Git TFVC Install instructions
Azure DevOps yes yes yes* Team Foundation Version Control for TFS and VSTS
Azure DevOps Server 2019 yes yes yes* Team Foundation Version Control for TFS and VSTS
TFS 2018 yes yes yes* Team Foundation Version Control for TFS and VSTS
TFS 2017 yes yes yes* Team Foundation Version Control for TFS and VSTS
TFS 2015 yes yes yes* Team Foundation Version Control for TFS and VSTS
TFS 2013 no no yes* Team Foundation Version Control for TFS and VSTS
TFS 2012 no no yes* Team Foundation Version Control for TFS and VSTS
TFS 2010 no no yes* Team Foundation Version Control for TFS and VSTS
TFS 2008 no no no
TFS 2005 no no no

Visual Studio Code

Supported operating system: Windows 10, Windows 8.1, Linux, Mac

Connects Git TFVC Install instructions
Azure DevOps yes yes yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
Azure DevOps Server 2019 yes yes yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2018 yes yes yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2017 yes yes yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2015 yes yes yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2013 yes no yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2012 yes no yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2010 yes no yes* Visual Studio Team Services extension
Team Explorer 2017 or Team Explorer Everywhere
TFS 2008 no no no
TFS 2005 no no no

Jetbrains / IntelliJ

Supported operating system: Windows 10, Windows 8.1, Linux, Mac

Connects Git TFVC Install instructions
Azure DevOps yes yes yes* Visual Studio IntelliJ Plugin
Azure DevOps Server 2019 yes yes yes* Visual Studio IntelliJ Plugin
TFS 2018 yes yes yes* Visual Studio IntelliJ Plugin
TFS 2017 yes yes yes* Visual Studio IntelliJ Plugin
TFS 2015 yes yes yes* Visual Studio IntelliJ Plugin
TFS 2013 yes no yes* Visual Studio IntelliJ Plugin
TFS 2012 yes no yes* Visual Studio IntelliJ Plugin
TFS 2010 yes no yes* Visual Studio IntelliJ Plugin
TFS 2008 no no no
TFS 2005 no no no

Eclipse

Supported operating system: Windows 10, Windows 8.1, Linux, Mac

Connects Git TFVC Install instructions
Azure DevOps yes yes* yes* Team Explorer Everywhere
egit
Azure DevOps Server 2019 yes yes* yes* Team Explorer Everywhere
egit
TFS 2018 yes yes* yes* Team Explorer Everywhere
egit
TFS 2017 yes yes* yes* Team Explorer Everywhere
egit
TFS 2015 yes yes* yes* Team Explorer Everywhere
egit
TFS 2013 yes no yes* Team Explorer Everywhere
TFS 2012 yes no yes* Team Explorer Everywhere
TFS 2010 yes no yes* Team Explorer Everywhere
TFS 2008 no no no
TFS 2005 no no no

Visual Studio 2015

Supported operating system: Windows 10, Windows 8.1, Windows 8

Connects Git TFVC Install instructions
Azure DevOps yes yes yes Team Explorer 2015
Azure DevOps Server 2019 yes yes yes Team Explorer 2015
TFS 2018 yes yes yes Team Explorer 2015
TFS 2017 yes yes yes Team Explorer 2015
TFS 2015 yes yes yes Team Explorer 2015
TFS 2013 yes no yes Team Explorer 2015
TFS 2012 yes no yes Team Explorer 2015
TFS 2010 yes no yes Team Explorer 2015
TFS 2008 no no no
TFS 2005 no no no

Visual Studio 2013

Supported operating system: Windows 10, Windows 8.1, Windows 8, Windows 7, Windows Vista, Windows XP

Connects Git TFVC Install instructions
Azure DevOps yes yes
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2013
Azure DevOps Server 2019 yes yes
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2013

TFS 2018 yes yes yes Team Explorer 2013
TFS 2017 yes yes yes Team Explorer 2013
TFS 2015 yes yes yes Team Explorer 2013
TFS 2013 yes no yes Team Explorer 2013
TFS 2012 yes no yes Team Explorer 2013
TFS 2010 yes no yes Team Explorer 2013
TFS 2008 no no no
TFS 2005 no no no

Visual Studio 2012

Supported operating system: Windows 10, Windows 8.1, Windows 8, Windows 7, Windows Vista, Windows XP

Connects Git TFVC Install instructions
Azure DevOps yes* yes
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2012
Visual Studio Tools for Git
Azure DevOps Server 2019 yes* yes
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2012
Visual Studio Tools for Git
TFS 2018 yes yes* yes Team Explorer 2012
Visual Studio Tools for Git
TFS 2017 yes yes* yes Team Explorer 2012
Visual Studio Tools for Git
TFS 2015 yes yes* yes Team Explorer 2012
Visual Studio Tools for Git
TFS 2013 yes no yes Team Explorer 2012
TFS 2012 yes no yes Team Explorer 2012
TFS 2010 yes no yes Team Explorer 2012
TFS 2008 no no no
TFS 2005 no no no

Visual Studio 2010

Supported operating system: Windows 10, Windows 8.1, Windows 8, Windows 7, Windows Vista, Windows XP

Connects Git TFVC Install instructions
Azure DevOps no yes
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2010
Azure DevOps Server 2019 no yes
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2010
TFS 2018 yes no yes Team Explorer 2010
TFS 2017 yes no yes Team Explorer 2010
TFS 2015 yes no yes Team Explorer 2010
TFS 2013 yes no yes Team Explorer 2010
TFS 2012 yes no yes Team Explorer 2010
TFS 2010 yes no yes Team Explorer 2010
TFS 2008 yes no yes Team Explorer 2010
TFS 2005 yes no yes Team Explorer 2010

Visual Studio 2008

Supported operating system: Windows 10, Windows 8.1, Windows 8, Windows 7, Windows Vista, Windows XP

Connects Git TFVC Additional requirements
Azure DevOps no yes*
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2013
+ MSSCCI
Azure DevOps Server 2019 no yes*
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2013
+ MSSCCI
TFS 2018 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2013
+ MSSCCI
TFS 2017 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2013
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2015 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2013
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2013 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2013
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2012 yes no yes Team Explorer 2008
TFS 2010 yes no yes Team Explorer 2008
TFS 2008 yes no yes Team Explorer 2008
TFS 2005 yes no yes Team Explorer 2008

Visual Studio 2005

Supported operating system: Windows 10, Windows 8.1, Windows 8, Windows 7, Windows Vista, Windows XP

Connects Git TFVC Additional requirements
Azure DevOps no yes*
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2012
+ MSSCCI
Azure DevOps Server 2019 no yes*
- Windows 10, 8.1, 8, 7 yes IE11
Team Explorer 2012
+ MSSCCI
TFS 2018 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2012
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2017 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2012
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2015 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2012
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2013 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2012
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2012 yes no yes*
- Windows 10, 8.1, 8, 7, Vista yes Team Explorer 2012
+ MSSCCI
- Windows XP yes Team Explorer 2010
+ MSSCCI
TFS 2010 yes no yes Team Explorer 2005
TFS 2008 yes no yes Team Explorer 2005
TFS 2005 yes no yes Team Explorer 2005

Visual Studio 2003, .NET, 6

Supported operating system: Windows XP

Connects Git TFVC Additional requirements
Azure DevOps no no no
Azure DevOps Server 2019 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2018 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2017 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2015 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2013 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2012 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2010 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2008 yes no yes* Team Explorer 2010
+ MSSCCI
TFS 2005 yes no yes* Team Explorer 2010
+ MSSCCI

Installation

Code

Mac

Eclipse

JetBrains / IntelliJ

Team Explorer 2019

Team Explorer 2017

Team Explorer 2015

Team Explorer 2013

Install (or repair) in the following order:

Team Explorer 2012 + Visual Studio Tools for Git

Install (or repair) in the following order:

Team Explorer 2010

Install (or repair) in the following order:

Team Explorer 2013 + MSSCCI 2013

Install (or repair) in the following order:

Notes:

  • Do not use the Team Explorer tab or the Team sub menu to connect to TFS, instead use File, Source Control.
  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Team Explorer 2012 + MSSCCI 2012

Install (or repair) in the following order:

Notes:

  • Do not use the Team Explorer tab or the Team sub menu to connect to TFS, instead use File, Source Control.
  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Team Explorer 2010 + MSSCCI 2010

Install (or repair) in the following order:

Notes:

  • Do not use the Team Explorer tab or the Team sub menu to connect to TFS, instead use File, Source Control.
  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Team Explorer 2008

Install (or repair) in the following order:

Notes:

  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Team Explorer 2005

Install (or repair) in the following order:

Notes:

  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Team Explorer 2008 + MSSCCI

Install (or repair) in the following order:

Notes:

  • Do not use the Team Explorer tab or the Team sub menu to connect to TFS, instead use File, Source Control.
  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Team Explorer 2005 + MSSCCI 2005

Install (or repair) in the following order:

Notes:

  • Do not use the Team Explorer tab or the Team sub menu to connect to TFS, instead use File, Source Control.
  • Connect using the http://server:port/tfs/ProjectCollection url format for Team Foundation Server.
  • Connect using the https://[account].visualstudio.com/ or https://dev.azure.com/[account]/ url format for Azure DevOps.

Configuration

Configuring the MSSCCI provider for Visual Studio

To use the MSSCCI provider in Visual Studio you must make sure you have the correct
Source Control provider selected. In your version of Visual Studio go to Tools,
Options, Source Control and select the MSSCCI provider:

Connect any version of Visual Studio to Azure DevOps or Team Foundation Server
Select the Team Foundation Server MSSCCI Provider

Then go to File, Source Control to open a project from source control.

You won't be able to trigger builds or access work items using the version of Visual
Studio you are now using. Instead you must start Team Explorer 2010 or higher to
interact with these features from Visual Studio.

When you use Visual Studio 2012 or higher to configure your Version Control mappings, you need to make sure you select a "Server Workspace".

Connect any version of Visual Studio to Azure DevOps or Team Foundation Server
Change workspace location to Server.

Picture used under Creative Commons. Thanks to XKCD.

Configure Visual Studio to use a different Git Credential Manager for Windows


Visual Studio ships with the Git Credential Manager for Windows (GCMW) as part of its Team Explorer feature. This nifty little helper allows you to authenticate to Azure Repos, among other Git providers, using your normal username and password with optional 2FA, and it will handle Personal Access Token creation and renewal for you. There are cases when you need a specific (usually newer) version of the GCMW.

One of those scenarios is when you want to access a Microsoft Account-backed Azure DevOps organisation using an Azure Active Directory account:

Configure Visual Studio to use a different Git Credential Manager for Windows
You may receive "Git failed with a fatal error. Authentication failed for ..."

Similar issues have occurred in the past while trying to access BitBucket. The error was slightly different in that case, but the root cause was the same:

Git failed with a fatal error.
HttpRequestException encountered.
cannot spawn /C/Program Files (x86)/Microsoft Visual Studio/2017/Community/Common7/IDE/CommonExtensions/Microsoft/TeamFoundation/Team Explorer/Git/mingw32/libexec/git-core/git-askpass.exe: No such file or directory
could not read Username for 'https://github.com': terminal prompts disabled.

Solution

In all of these cases the recommendation is to upgrade the Git Credential Manager for Windows, and many posts on Stack Overflow will tell you to overwrite the files in your Visual Studio installation with the latest files from the GCMW repository. While this usually works (I've done it myself in the past), it can cause issues when installing a Visual Studio update in the future.

It's better to point Git to a specific version of the Git Credential Manager for Windows. First install the latest release of the Git Credential Manager for Windows. Then update your global git config:

c:\>git config --global --edit

Find the [credential] section and overwrite it with the following (update the path to the location where git-credential-manager.exe is installed on your system):

[credential]
    helper = C:\\\\Program\\ Files\\\\Git\\\\mingw64\\\\libexec\\\\git-core\\\\git-credential-manager.exe

This will ensure that all Git installations on your system will use this specific installation of GCMW.
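If you'd rather not hand-edit the file, the same setting can be written from the command line. A minimal sketch, assuming the default Git for Windows install location (adjust the path to wherever git-credential-manager.exe lives on your system); note that the forward-slash form avoids the double-escaped backslashes the hand-edited file needs:

```shell
# Point Git's credential helper at one specific GCMW binary.
# Forward slashes avoid the double-escaped backslashes needed when
# editing .gitconfig by hand; the backslash before the space keeps
# "Program Files" intact when Git spawns the helper via its shell.
GCM_PATH='C:/Program\ Files/Git/mingw64/libexec/git-core/git-credential-manager.exe'

# Equivalent to editing the [credential] section manually:
git config --global credential.helper "$GCM_PATH"

# Read the value back to verify what Git will invoke:
git config --global credential.helper
```

git config takes care of escaping the value when it writes it to your .gitconfig, so reading it back should return exactly the path you passed in.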

Restart any open instances of Visual Studio and optionally clear your existing credentials from the Windows Credential Manager before trying again:

Configure Visual Studio to use a different Git Credential Manager for Windows
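Clearing the cached entries can also be scripted with the built-in cmdkey tool; GCMW stores its tokens in the Windows Credential Manager under target names starting with git:. The sketch below only prints the commands to run from a Windows command prompt, and the dev.azure.com target name is a hypothetical example (use cmdkey /list to find the real names on your machine):

```shell
# GCMW credentials live under target names like "git:https://...".
# The target below is a hypothetical example; run "cmdkey /list"
# on your own machine to see the real names.
TARGET='git:https://dev.azure.com/yourorg'

# Printed rather than executed, since cmdkey is a Windows-only tool:
echo "cmdkey /list"
echo "cmdkey /delete:$TARGET"
```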

Azure DevOps Extension Tasks 1.2.19


It's been a while since I last blogged about the Azure DevOps Extension Tasks. Here are the things that have changed since the last update:

  • Added support for publishing Visual Studio extensions.
  • Support for versioning localized Azure Pipeline tasks
  • Support for executing under Node 10
  • Fixed more promise issues
  • Renamed Visual Studio Team Services to Azure DevOps

Added support for publishing Visual Studio extensions

Utkarsh Shigihally added support for publishing Visual Studio extensions. The task is currently in preview and is missing some of the features of the other tasks, but it's a great starting point.

The task acts as a wrapper for vsixpublisher.exe and depends on the Visual Studio SDK being present on the agent.

Azure DevOps Extension Tasks 1.2.19
Visual Studio Extension publishing
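Under the hood the publish step boils down to a VsixPublisher.exe invocation along these lines. This is a sketch, not the task's exact command line: the .vsix name, manifest file name, and PAT are placeholders, and the command is printed rather than executed since VsixPublisher.exe is a Windows-only tool:

```shell
# Sketch of the kind of command the task wraps. VsixPublisher.exe
# ships with the Visual Studio SDK; file names and the PAT below
# are placeholders, not values from the task itself.
VSIX='MyExtension.vsix'
MANIFEST='publishManifest.json'

echo "VsixPublisher.exe publish -payload $VSIX -publishManifest $MANIFEST -personalAccessToken <pat>"
```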

Support for versioning localized Azure Pipeline tasks

Another contribution from the community adds support for overwriting task versions of localized build tasks. Thanks Ethan Dennis!

If you're using the "Extension Version" and "Override Tasks Version" options, the extension will now successfully update both the task.json and the task.loc.json.

Azure DevOps Extension Tasks 1.2.19

Support for executing under Node 10

Microsoft is adding support for Node 10 to the Azure Pipelines agent, which will replace the Node 6 handler over time. It looks like some of our users have started to force this newer task handler on their agents.

So far we've been notified of one issue, and further testing hasn't revealed any others. I'm extending my test scripts to ensure we test against Node 10 going forward.

Fixed more promise issues

The last big update was triggered by a number of issues that were the result of unexpected async behavior in Node. We found a few more issues since that update and these have now been resolved.

We've also started using Typestrict to get early notification of these kinds of issues.

Renamed Visual Studio Team Services to Azure DevOps

It's unlikely you haven't noticed all the recent naming changes in Azure DevOps. Visual Studio Team Services is now called Azure DevOps, Build is now called Azure Pipelines, and so on.

This change isn't just superficial. All the libraries we depend on have also been renamed. I've previously blogged about the steps required to fix your dependencies and now all of these tasks are also completely up to date.

Is there anything you'd like to see in the Azure DevOps Extension Tasks? Do you have a few powerful scripts tucked away in your own pipeline that could be valuable to the community? Please do submit an issue or, even better, a pull request.

Use Visual Studio 2017 or 2019 as merge tool in Tower

Updated with Visual Studio 2019 support.

I recently started using Tower as my Git client on Windows. It's great in many respects, but it doesn't ship with any diff/merge capabilities; it relies on 3rd party tools to supply that feature. It comes with a long list of supported tools, but my two default editors, Visual Studio and Visual Studio Code, aren't part of that list.

To enable Visual Studio 2017 or 2019 in Tower all you need to do is put the following file here: %LOCALAPPDATA%\fournova\Tower\Settings\CompareTools\vs2019.json:

{
  "DisplayName":           "Visual Studio 2019",
  "MinimumVersion":        "",
  "SupportsDiffChangeset": true,
  "SupportsDirectoryDiff": false,
  "DiffToolArguments":     "$LOCAL $REMOTE //t",
  "MergeToolArguments":    "$REMOTE $LOCAL $BASE $MERGED //m",
  "ApplicationRegistryIdentifiers": [
  ],
  "ApplicationPaths": [
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Preview\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Preview\\Enterprise\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Preview\\Professional\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Preview\\Community\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",  
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Preview\\TeamExplorer\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",  
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Enterprise\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Professional\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\Community\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2019\\TeamExplorer\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\Preview\\Enterprise\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\Preview\\Professional\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\Preview\\Community\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",  
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\Preview\\TeamExplorer\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",  
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2017\\Enterprise\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2017\\Professional\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2017\\Community\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe",
      "%ProgramFiles(x86)%\\Microsoft Visual Studio\\2017\\TeamExplorer\\Common7\\IDE\\CommonExtensions\\Microsoft\\TeamFoundation\\Team Explorer\\vsDiffMerge.exe"
  ]
}

It should automatically detect the Enterprise, Professional, or Community edition of Visual Studio 2017 or 2019 (including the Preview channels), or the stand-alone installation of Team Explorer.

Make sure you update your preferences in Tower to select it (you need to restart Tower for it to detect your custom merge tool changes):

Use Visual Studio 2017 or 2019 as merge tool in Tower
Select Visual Studio 2017 as your Diff and Merge tool.

Due to changes in how Microsoft stores the installation path, it's not easy to grab it from the registry, so if you're not using the default installation path you may need to update the search paths above.
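One workaround for non-standard installs is vswhere.exe, which the Visual Studio 2017+ installer drops in a fixed location and which can search inside an installation with its -find flag. A sketch that only prints the lookup command (run it yourself from a Windows command prompt; the glob pattern is an assumption about where vsDiffMerge.exe sits in your edition):

```shell
# vswhere.exe is installed to a fixed path by the VS 2017+ installer
# and can locate files inside the newest installation via -find.
# Printed rather than executed, since it's a Windows-only tool:
VSWHERE='%ProgramFiles(x86)%\Microsoft Visual Studio\Installer\vswhere.exe'

printf '%s\n' "\"$VSWHERE\" -latest -find **\vsDiffMerge.exe"
```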

Visual Studio not your tool of choice? It's pretty straightforward to add your own!

Photo Credits: Phil Dolby.
