How to read Azure DevOps logs from Node.js using the REST API


Turns out the logs response is a zip file which contains multiple log file entries. We'll get a PAT for authorization, fetch and unzip them...

8 min read
Cover photo by Monika Grabkowska on Unsplash.


Step by step guide

Getting some logs can't be that hard, can it? I can surely do that in a few lines of code!

Well, that's what I thought... initially. It turns out it's more than just calling a GET endpoint.

Part of it is that the response from the logs endpoint is actually a zip file. And inside it there are multiple file entries - one for each pipeline task. Plus there's the authorization part. So...

This article will take you step by step from a blank file to having the logs from your Azure DevOps Release pipeline. As you may know, they are available online, but getting to them takes a couple of steps/clicks. You may want to get the logs and process them programmatically. For example, I had to check whether a particular string was part of the Release pipeline logs.


What we'll do

  • Start with a blank Node.js project and include the dependencies - axios and yauzl.
  • Get a personal access token (PAT) from Azure DevOps and store it in an environment variable. Use that for authorization.
  • Get the zipped logs via the Azure DevOps REST API.
  • Unzip them in memory and read the text contents.
  • We'll read the logs of a Release Pipeline run, but at the end there is a section on how to convert the script to read a Build Pipeline's logs.

If you only want the finished script, here's the gist for reading logs from a Release Pipeline and a Build Pipeline. I've left // TODO Replace with your own reminders for the variables.

My setup

I'll be using ts-node because I prefer TypeScript's safety and don't want to deal with the transpilation step. So instead of node index.js I'll do ts-node index.ts. The script should work as plain JS after the types are removed, if you so prefer.

My shell is bash running inside Windows Subsystem for Linux (WSL).

1. Start

In a folder azdo-logs initialize a node package:

```sh
mkdir azdo-logs
cd azdo-logs
npm init -y
```

Expect to see an output similar to:
[Screenshot: npm init output]

Create an index.ts file and include these lines:

```typescript
/// <reference types="node" />

const accessToken = process.env.AZURE_ACCESS_TOKEN;
if (accessToken == null || accessToken === '') {
  throw new Error('Please provide an access token');
} else {
  console.log('token is present!');
}
```

We'd like to be sure the token is there, safely hidden in your private environment variable and NOT checked in with the code!

The reference on top gives us access to the Node.js types. You might need to install them as a dev dependency:

```sh
npm i @types/node -D
```

2. Add dependencies

Globally install ts-node and typescript to execute our script.

```sh
npm i -g ts-node typescript
```

Install axios and yauzl in our package. The -s flag will save them to our package.json, and @types/yauzl will give us typings, added to devDependencies with the -D flag:

```sh
npm i axios yauzl -s
npm i @types/yauzl -D
```

[Screenshot: installing the dependencies]

This is how package.json looks now:

```json
{
  "name": "azdo-logs",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "axios": "^0.19.2",
    "yauzl": "^2.10.0"
  },
  "devDependencies": {
    "@types/yauzl": "^2.9.1"
  }
}
```

3. Get an Azure DevOps token

It can be acquired via the profile menu:

  • open the personal access tokens page
    [Screenshot: the personal access tokens menu]
  • create a new access token with the Release (Read) permission

    [Screenshot: creating the access token]

  • store it, because you will not be able to see it again (you can recreate it if you lose it)
    [Screenshot: access token created]

Finally, place it in an environment variable on your local machine or in safe storage (e.g. a secret environment variable):

```sh
# replace token-placeholder-not-actual-thing with your token
export AZURE_ACCESS_TOKEN="token-placeholder-not-actual-thing"
```

or in the Windows command line:

```bat
set AZURE_ACCESS_TOKEN=token-placeholder-not-actual-thing
```

I've added this line to my .bashrc file so the PAT is available on bash start and I don't have to remember to export it every time I start a terminal.

[Screenshot: exporting the PAT in .bashrc]

On Windows you can add it to your Environment Variables. Keep in mind you'll need to restart your session (logout/login) for the env variables to take effect.
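Either way, you can check that the variable is visible in a fresh shell without printing the secret itself. A minimal sketch (the placeholder value stands in for a real PAT):

```shell
# placeholder value - in practice the real PAT comes from ~/.bashrc or the Windows env vars
export AZURE_ACCESS_TOKEN="token-placeholder-not-actual-thing"

# report whether the variable is set without echoing its value
if [ -n "$AZURE_ACCESS_TOKEN" ]; then
  echo "AZURE_ACCESS_TOKEN is set"
else
  echo "AZURE_ACCESS_TOKEN is missing"
fi
```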

Now run ts-node index.ts and you should see:

[Screenshot: "token is present!" output]

For more details on how to get a personal access token see this link.

Ok - we now have the token and dependencies!

4. Get the Azure DevOps organization and project

To get the logs, we'll need the organization and project names, as well as the id of the release we'd like to read the logs of. The latter increments for each subsequent run - release 1, 2, 3 - so we'd need to provide it per call. For this example, I'll target the release pipeline for a package I maintain.

We can get the project name and organization from the Azure DevOps UI:

[Screenshot: organization and project names in the URL]

In my case, it's organization 'gparlakov' and project 'Scuri'. Add those lines in index.ts and replace with your org and project names:

```typescript
const project = 'Scuri';
const organization = 'gparlakov';
```

5. Authorization

To get authorization for the API endpoint using a personal access token (PAT), we need to send a header with the token encoded in base64 format, adhering to a specific contract. Add the following at the end of index.ts:

```typescript
const headers = {
  Authorization: `Basic ${Buffer.from(`PAT:${accessToken}`).toString('base64')}`,
  'X-TFS-FedAuthRedirect': 'Suppress', // we can't handle an auth redirect so - suppress it
};

export const axiosInstance = axios.create({
  baseURL: `https://vsrm.dev.azure.com/${organization}/${project}/_apis/`,
  headers: headers,
});
```

We need to import the axios module at the top of index.ts:

```typescript
import axios from 'axios';
```
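The Authorization value above is nothing magic - it's just the string PAT:&lt;token&gt; base64-encoded and prefixed with Basic. A small sketch in isolation (basicAuthHeader and the token value are made up for illustration):

```typescript
// Sketch: how the Basic auth header value is built from a PAT
// (the token here is a placeholder, never a real PAT)
function basicAuthHeader(token: string): string {
  return `Basic ${Buffer.from(`PAT:${token}`).toString('base64')}`;
}

const header = basicAuthHeader('my-secret-pat');
console.log(header); // Basic UEFUOm15LXNlY3JldC1wYXQ=
```

Note the username part before the colon can be any string for Azure DevOps PATs - the token in the password position is what matters.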

6. Get the logs for your release

For this example, I'll use an actual release with the id of 58 (replace with your own). Appending to index.ts:

```typescript
const releaseId = 58;

axiosInstance
  .get(`release/releases/${releaseId}/logs`, {
    responseType: 'stream',
  })
  .then((logs) => {
    if (logs.status != 200) {
      throw new Error('logs missing');
    }
    console.log('Received bytes:', logs.data.read().length);
  });
```

Running ts-node index.ts should yield something similar to:

[Screenshot: received bytes output]

That proves we are authorized to use this REST API endpoint!
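Since the baseURL ends with a slash and the request path is relative, axios's simple concatenation of the two is equivalent to standard URL resolution. A sketch of the full URL this call ends up hitting (values are the ones from this article's example):

```typescript
// Sketch: the full URL behind the axios call above
// organization, project and releaseId are this article's example values
const organization = 'gparlakov';
const project = 'Scuri';
const releaseId = 58;

const baseURL = `https://vsrm.dev.azure.com/${organization}/${project}/_apis/`;
const fullUrl = new URL(`release/releases/${releaseId}/logs`, baseURL).toString();

console.log(fullUrl);
// https://vsrm.dev.azure.com/gparlakov/Scuri/_apis/release/releases/58/logs
```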

7. Unzip the logs

Delete or comment out the console.log line - we won't need it anymore - and change the axiosInstance call so it looks like this:

```typescript
axiosInstance
  .get(`release/releases/${releaseId}/logs`, {
    responseType: 'stream',
  })
  .then((logs) => {
    if (logs.status != 200) {
      throw new Error('logs missing');
    }
    return readLogs(logs.data);
  })
  .then(({ logs }) => {
    console.log(logs);
  });
```

and finally the readLogs function:

```typescript
// don't forget to import yauzl at the top of index.ts:
// import * as yauzl from 'yauzl';

function readLogs(
  zipBuffer: NodeJS.ReadableStream
): Promise<{ logs: string; skippedFor: Error[] }> {
  // we'll reject the promise when we can't read anything from the zip
  // and resolve it when we could read (some), plus add the errors for the skipped parts
  // in the end we'd like to say - yes, the logs contain the proof OR no, they don't, but there were skipped parts
  return new Promise((res, rej) => {
    const es: Error[] = [];
    const zipChunks: any[] = [];
    zipBuffer.on('data', (d) => zipChunks.push(d));
    zipBuffer.on('end', () => {
      yauzl.fromBuffer(Buffer.concat(zipChunks), { lazyEntries: true }, function (err, zipfile) {
        // can not even open the archive - just reject the promise
        if (err) {
          rej(err);
        }
        if (zipfile != null) {
          const chunks: any[] = [];
          zipfile.on('entry', function (entry) {
            if (/\/$/.test(entry.fileName)) {
              // Directory file names end with '/'.
              // Note that entries for directories themselves are optional.
              // An entry's fileName implicitly requires its parent directories to exist.
              zipfile.readEntry();
            } else {
              // file entry
              zipfile.openReadStream(entry, function (err, readStream) {
                if (err) {
                  es.push(err); // skip this one - could not read it from the zip
                  zipfile.readEntry();
                  return;
                }
                if (readStream == null) {
                  // just skip - could not get a read stream out of it
                  es.push(
                    new Error(
                      'Could not create a readable stream for the log ' +
                        ((entry || {}).fileName || '<missing file name>')
                    )
                  );
                  zipfile.readEntry();
                } else {
                  readStream.on('data', (c) => chunks.push(c));
                  readStream.on('error', (e) => {
                    es.push(e); // skip this one - could not read it from the zip
                    zipfile.readEntry();
                  });
                  readStream.on('end', function () {
                    zipfile.readEntry();
                  });
                }
              });
            }
          });

          zipfile.once('end', function () {
            zipfile.close();
            res({ logs: Buffer.concat(chunks).toString('utf8'), skippedFor: es });
          });

          zipfile.readEntry();
        } else {
          // can't read the archive - reject the promise
          rej(new Error('Could not read the zipfile contents'));
        }
      });
    });
  });
}
```

There seems to be a lot going on here. It boils down to working with 3 streams.

  • First, we read the zip file response, push the chunks into zipChunks, and concat those into a Buffer.
  • Then, we use that Buffer in the yauzl.fromBuffer() call, which gives us an object with a readEntry() method. I think of it as a next, because it reads the next entry in the archive.
  • We get a readStream for each zip file entry. That is a ReadableStream whose chunks we push into chunks.
  • Finally, we concat all files' chunks into a buffer and read a string out of it:

    ```typescript
    Buffer.concat(chunks).toString('utf8');
    ```
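The chunk-collecting pattern used throughout readLogs boils down to this, in isolation (the chunk values here are made up for illustration):

```typescript
// Sketch: pieces gathered from a stream become one utf8 string
// (in readLogs the chunks come from the zip entries' read streams)
const chunks: Buffer[] = [Buffer.from('Hello, '), Buffer.from('logs!')];
const text = Buffer.concat(chunks).toString('utf8');

console.log(text); // Hello, logs!
```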

Done!

We now have a string variable containing all our logs!

Here's a gist of the final index.ts. I've left // TODO Replace with your own reminders for the variables.

Reading build pipeline logs

To read the logs from a build pipeline, we need to:

  1. Add the "Build: Read" permission to our token or issue a new one with that permission:
    [Screenshot: build read permission]
  2. Change the auth logic a bit (just remove one piece):

    ```typescript
    const headers = {
      Authorization: `Basic ${Buffer.from(`:${accessToken}`).toString('base64')}`,
      'X-TFS-FedAuthRedirect': 'Suppress', // we can't handle an auth redirect so - suppress it
    };
    ```
  3. Change the base URL:

    ```typescript
    export const axiosInstance = axios.create({
      baseURL: `https://dev.azure.com/${organization}/${project}/_apis/`,
      headers: headers,
    });
    ```
  4. Change the endpoint address and provide a build number (in my case I'll use this build):

    ```typescript
    const buildId = 200;

    axiosInstance.get(`build/builds/${buildId}/logs`, {
      responseType: 'stream',
      headers: {
        accept: 'application/zip',
      },
    });
    ```
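To make the host difference concrete - releases are served from vsrm.dev.azure.com while builds come from plain dev.azure.com. A sketch of the two resulting URLs (values are this article's examples):

```typescript
// Sketch: release vs build APIs live on different hosts
// organization/project and the ids are this article's example values
const organization = 'gparlakov';
const project = 'Scuri';

const releaseBase = `https://vsrm.dev.azure.com/${organization}/${project}/_apis/`;
const buildBase = `https://dev.azure.com/${organization}/${project}/_apis/`;

const releaseLogsUrl = new URL('release/releases/58/logs', releaseBase).toString();
const buildLogsUrl = new URL('build/builds/200/logs', buildBase).toString();

console.log(releaseLogsUrl);
// https://vsrm.dev.azure.com/gparlakov/Scuri/_apis/release/releases/58/logs
console.log(buildLogsUrl);
// https://dev.azure.com/gparlakov/Scuri/_apis/build/builds/200/logs
```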

Here's the final script.

Memory consumption note

This whole approach keeps a few buffers in memory, essentially copying the zip file a few times* in memory. Considering that we are reading pipeline logs, this should not be a problem - I expect they won't be too large. If that is a problem for you, store the archive locally (though that may be a security concern, as Samuel Attard @marshallofsound pointed out) and then use yauzl's other method, yauzl.open:

```typescript
// import * as fs from 'fs'; at the top of index.ts
const file = fs.createWriteStream('my-temp-zip-file.zip');
logs.data.pipe(file);
// wait for the write to finish before opening the archive
file.on('finish', () => {
  yauzl.open('my-temp-zip-file.zip', { lazyEntries: true }, function (err, zipfile) {
    // ... same code from here on down
  });
});
```

*the response stream, the chunks, the buffer, the zip content chunks, their buffer and finally the string

Resources

  • REST API docs - really helpful
  • nodejs client for the API (but it's around 116k minified+GZipped according to bundlephobia - ~830k worth of script for your runtime to parse, for each request)
  • Docs for axios
  • Docs for yauzl
About the author

Georgi Parlakov

Author of Angular libs like SCuri (Angular unit test automation!) and ngx-forms-typed (type your Angular forms!) that make developers' work easier. https://gparlakov.github.io/