Categories
development programming security

Litterboxing

Sandboxing is a great idea. Isolate your processes. Isolate your data. Chinese Walls everywhere to ensure everything is kept separate, independent and secure, whether that’s containers or SELinux.

Litterbox – it’s like a sandbox but when you look closer it’s full of shit

“There’s the inkling of a good idea in there. He’s almost describing a sandbox (without, of course, any of the isolation required) and front-end microservices (without a pipeline, or components). Can we call this pattern litterboxing? It’s a sandbox until you look closer and see it’s full of catnip.”

https://thedailywtf.com/articles/comments/classic-wtf-i-am-right-and-the-entire-industry-is-wrong/1#comment-506416

Categories
cloud development programming

Cloud thinking : Executable documentation.

Documentation is just comments in a separate file. At least developers can see the comments when they change code. Tests are better comments. Tests know when they don’t match the code.

Infrastructure is the same. I can write a checklist to set up an environment, and write an architecture diagram to define it, but as soon as I change something in production, it’s out of date, unless it’s very high level, and therefore only useful to provide an outline, not detail.

Unit tests document code, acceptance tests document requirements, code analysis documents style and readability, and desired-state-configuration documents infrastructure. All of them can be checked automatically every time you commit.
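
For example, a unit test can act as documentation that fails the moment it drifts from reality. A minimal sketch, assuming a jest setup; the config file and the documented values are invented for illustration:

import { readFileSync } from 'fs';
import { expect, test } from '@jest/globals';

// "The API listens on port 8080 and retries three times" -- stated as a test,
// this claim is re-checked on every commit instead of going stale in a wiki.
test('documented defaults match the shipped configuration', () => {
  const config = JSON.parse(readFileSync('./appsettings.json', 'utf8'));
  expect(config.api.port).toBe(8080);
  expect(config.api.retryCount).toBe(3);
});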

Documentation as code means the documentation is executable. It doesn’t always mean it’s human readable; ARM templates in particular can be impenetrable at times. If machines understand it, the documentation can be tested continuously, repeated endlessly across multiple environments, reconfigured and redeployed at the stroke of a keyboard.

The more human involvement you have in a process, the more opportunities there are for human error. Automation doesn’t remove mistakes, but it makes it much easier to stop the same mistake happening twice.

Categories
development programming

Modular doesn’t mean interchangeable

Microservices are great. Containers are great. Packages are great. Objects are great. Functions are great. At each level they encapsulate functionality into re-usable modules. One place to make changes. One place to optimise. One abstraction to help you move to the best layer.

For the sake of argument though, I’m going to focus on microservices, specifically those accessed via an API, as that’s what I know best.

Interfaces and APIs are good for test harnesses, but isolating systems for your own benefit (to make modules open for extension but closed for change) bears no relation to making that module useful to other systems.

Sure, I could rip out the modules that contain business logic and package them as a framework, but that doesn’t mean they’re useful for anyone else. It could well be an inner system. It may still be coupled to domain or architectural knowledge that makes it too valuable to swap out. There could be an implicit dependency on just that one version of that one tool.

It is of course possible to make modules re-usable, but they have to be designed that way, whether up front or evolved. And anything interchangeable has to be tested in at least two configurations to make sure it’s suitably robust.

Design an interface. Evolve the design. But don’t mistake solving one problem in one place for a general solution. And don’t think that adding an interface is all it takes to turn a local SQLite data store into a cloud file system repository.
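
As a hedged sketch of that trap (the names below are invented for illustration): an interface extracted from a local SQLite store tends to carry its synchronous, single-process assumptions with it, which is exactly what makes a cloud-backed implementation awkward.

// Extracted from the local implementation, so it quietly assumes local, synchronous access.
interface Document {
  id: string;
  body: string;
}

interface LocalDocumentStore {
  get(id: string): Document;        // synchronous: fine for SQLite, impossible to honour for a remote store
  saveAll(docs: Document[]): void;  // assumes one transaction, one process, no partial failure
}

// Before the module is genuinely interchangeable, the contract itself has to be redesigned:
// asynchronous, failure-aware, and explicit about what can and can't happen in one call.
interface PortableDocumentStore {
  get(id: string): Promise<Document | undefined>;
  save(doc: Document): Promise<void>;
}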

Categories
code development programming

Software Development as a Christmas Story

Gingerbread Christmas Tree
Oh Tanenbaum

We decorated the house for Christmas. A smaller project than most of the software I’ve worked on, but a useful reflection.

The Cost of context switching

Sometimes, in the middle of one task, you need to do another. Either because you were interrupted, or because you weren’t prepared. Consider the tree, a tall tree, one you need a ladder to decorate. But you left the 🌟 in the attic.

You don’t just have the cost of picking up the star: you need to get down the ladder, fold it up, carry it upstairs, open the attic, unfold the ladder, climb to the attic, find the star, then pick it up and unwind all the other steps just to get back to where you started before you can continue the task of putting the star on the tree.

It’s just a minute to get the star…

Yak shaving

Sometimes it’s more than just one thing. Sometimes to get to where you need to go, there’s a cascade of other tasks.

You want to put the tree up, but it’s the first time you’ve had a real tree, so you need a new base. It’s bigger than last year’s tree, so your lights don’t fit. So you need to go to the shops. But you need to fill up your car. And the tyre pressure warning light comes on so you need to top them up. And you need to check the tread depth whilst you’re there, and so on.

Programming in a nutshell

Performance at scale

Our tree stood up fine when it was delivered. But as it scaled out and the branches widened, it pushed against the wall, making it unstable in the confined space. It fell over.

Luckily no lights or baubles were using it at the time, but it’s an interesting challenge holding up a heavy tree in one hand and trying to adjust its position with the other, while avoiding the puddles of water on the floor as my wife mopped them up. If you’ve ever worked support, this may sound familiar.

Turns out that it was harder to stabilise than I anticipated.

The branches were unevenly partitioned, putting more load on one side, so I had to stabilise it against the wall. And the tree was almost a foot taller than expected, which turned out to be two feet taller than the base was rated for.

We upgraded the base to handle the load. It’s bigger than we need.

Technical debt

I have some old pre-LED tree lights that slowly fail, so each year I replace the unlit bulbs from the packet of spares, but I haven’t been replacing the spares. Eventually they will run out, and replacements will no longer be available. I’ll have to throw them all out and start again.

The new LED ones don’t let me replace bulbs individually. But they last longer and are cheaper to run.

Which debt is easier to live with?

With a big tree, those old lights aren’t long enough. So, do I buy another set for the rest of the tree that doesn’t match them, or throw out the existing ones and buy a longer set? The latter looks better, but throws out the sunk cost along with the lights.

You know computers, can you look at this for me?

No, I can’t fix your tree. I can navigate a garden centre. I know enough physics to keep a tree upright, eventually, and safely put the angel on, but I’ve no idea how to grow one, or what that weird stand with 17 gears you bought does, or how to assemble your plastic one. And I’ve no idea what that bug in the tree is.

Merry Christmas. Catch you all in the New Year.

Categories
code development programming

How I approach coding challenges

This time last year I was doing a lot of tests for job interviews, and I was practising with https://adventofcode.com

I know there are various data structures and algorithms you can learn to make these easier (and I hope to come back to some of my favourites in a later post), but there’s a general approach I take that helps me, and might help some of you too.

Use TDD

I’ve always been a fan of Test-Driven Development, and these types of puzzles lend themselves well to TDD thinking. Often examples are provided that will form the initial test suite. The algorithm is often broken into steps (sometimes explicitly: “On every clock tick…”).

Get into the habit of checking as you go to make the most of the information provided to you. Think about the problem first, and the mapping between the input and the output, before you write any code.

Check everything on simple solutions.
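
As a sketch of what that looks like in practice (assuming a jest setup, with a hypothetical day01 module), the worked examples from the puzzle text become the first tests, written before any solution code exists. These values are the fuel examples from Advent of Code 2019, day 1:

import { expect, test } from '@jest/globals';
import { fuelRequired } from './day01'; // hypothetical solution module

// Examples lifted straight from the problem statement.
test('fuel examples from the problem statement', () => {
  expect(fuelRequired(12)).toBe(2);
  expect(fuelRequired(14)).toBe(2);
  expect(fuelRequired(1969)).toBe(654);
  expect(fuelRequired(100756)).toBe(33583);
});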

Have templates

Most challenges involve a few common tasks. Read input from a file (ignoring blank lines), create a list of…, output as CSV, etc.

Some of these will have standardised patterns in your chosen language, and you should use them. Some will likely require some error handling. Some will require loading from standard or 3rd party libraries. And all will require a test framework.

As you go, build up a utilities list of functions, classes and packages that you know how to use, are robust enough and have minimal clutter. They’re not meant to be usable by anyone but you, but you will use them a lot. Collect and cherish them.
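
For example, a couple of small helpers along these lines (a sketch, not production code) cover the most common first step of almost every puzzle:

import { readFileSync } from 'fs';

// Read a puzzle input, trim trailing whitespace, and drop blank lines.
export function readLines(path: string): string[] {
  return readFileSync(path, 'utf8')
    .split(/\r?\n/)
    .map(line => line.trimEnd())
    .filter(line => line.length > 0);
}

// Parse a line of comma-separated integers, e.g. "1,2,3" -> [1, 2, 3].
export function readCsvNumbers(line: string): number[] {
  return line.split(',').map(Number);
}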

Think simple

Technical tests and coding challenges, in general, are designed to have short, easy solutions. If you need to solve P == NP to make your code fast, it’s likely you’re using an algorithm that’s a poor fit. Find 2 other ways to solve the problem, and pick the simpler one.

Simpler than that

Think about the simplest solution that could possibly work and write it simply.
Make the code readable. This is not the time for clever solutions. When it fails on an input, make it easy to change.

But not that simple

You still need to think about edge cases, and memory usage, and a myriad of other resource constraints. Think about what could go wrong, within the confines of the puzzle. Advent of Code has a particular knack for detecting edge cases in your code for Part 2 of each day’s challenge.

Think fast

Beware of premature optimisation, but expect long inputs and many loops if you use the naive implementation. Learn new algorithms and data structures.
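
Caching is one of the simplest tools in that kit. A small memoiser like the sketch below (the names are illustrative) often turns an exponential naive solution into one that finishes:

// Memoise a single-argument function by caching results against a string key.
function memoize<A, R>(fn: (arg: A) => R, key: (arg: A) => string = String): (arg: A) => R {
  const cache = new Map<string, R>();
  return (arg: A) => {
    const k = key(arg);
    if (!cache.has(k)) {
      cache.set(k, fn(arg));
    }
    return cache.get(k)!;
  };
}

// Classic example: naive Fibonacci is exponential; memoised, each value is computed once.
const fib = memoize((n: number): number => (n < 2 ? n : fib(n - 1) + fib(n - 2)));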

Categories
development programming

Somebody else’s problem

Sometimes dividing tasks creates inefficiency. When one person both cooks and cleans, they tend to create less washing up than if the tasks are divided, because washing becomes “somebody else’s problem”. However, when one person washes and the other dries, the load is equivalent. This setup can be prone to stoppages when the drainer is empty and the washer is trying to get that stubborn burnt bit off the pan, at which point any good Theory of Constraints practitioner will ask the drier-upper to help with the washing to maintain throughput.

When teams are divided by layer instead of features, the interface between them becomes a fracture, because each team is missing context that the other has. Data has the wrong shape; validation requires fields that can’t be captured. If you build both sides, the transition is smooth.

When developers test, they can focus on smaller, independent chunks, which means fewer tests with greater coverage. Writing integration tests for complex calculations is painful and slow. Testing those same calculations as an xUnit Theory is cleaner and faster.
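
A rough sketch of the difference, assuming a jest setup and an invented calculateInterest function (a parameterised jest test being the rough equivalent of an xUnit Theory in this stack): the calculation is exercised directly, with no web stack in the way.

import { expect, test } from '@jest/globals';
import { calculateInterest } from './interest'; // hypothetical module: yearly compounded balance

// Each row is one case: principal, annual rate, years, expected balance.
test.each([
  [1000, 0.05, 1, 1050],
  [1000, 0.05, 2, 1102.5],
  [0, 0.05, 10, 0],
])('principal %d at rate %d over %d years gives %d', (principal, rate, years, expected) => {
  expect(calculateInterest(principal, rate, years)).toBeCloseTo(expected);
});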

Specialists are great, but where is specialisation causing fractures and more work?

Categories
code development programming security

This is raw

This is raw chicken: 🐔

If you eat it like that, you may get hurt immediately, by its beak, or its claws. It may grab your money and run off with it.

If you want to eat it, better to kill it first. 💀

If you eat it like that, you may get hurt or die, in a few hours, or days. Washing it won’t help.

Cook it. Cook it well. If there’s any sign of pink, cook it some more. 🔥

It might still kill you, but at least you’re a lot safer than when you started.


This is raw data: 🐔

If you display it, users will get hurt immediately, whether by cross-site scripting, cookie sniffing, cryptocurrency mining, or something else. If you’re lucky, it will be something your users see immediately, and they’ll leave your site never to return. Otherwise they may get infected.

If you want to use it, better to validate it first. 💀

If you save it like that, your users are still vulnerable. It might appear on the front end in a different form. It might be a string of Unicode characters that crashes your phone. It might be a link to somewhere they can’t trust.

Encapsulate it. Sandbox it. Never trust it, in or out. HTML-encode it, and whitelist the output as well as the input.
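
A minimal sketch of that in practice (the helper names are invented, and a real project should lean on a maintained encoding library rather than hand-rolled escaping): whitelist on the way in, encode on the way out.

// Escape the characters HTML cares about before the value reaches the page.
function encodeHtml(untrusted: string): string {
  return untrusted
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

// Whitelist: only accept what you positively expect, rather than trying to block known-bad input.
function parseUsername(untrusted: string): string | undefined {
  return /^[a-z0-9_-]{3,20}$/i.test(untrusted) ? untrusted : undefined;
}

// Validate on input, encode on output -- never trust it, in or out.
const raw = '<script>alert(1)</script>';
const name = parseUsername(raw);
const safeHtml = `<td>${encodeHtml(name ?? 'unknown user')}</td>`;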

And if you need to avoid spam, or incitement, or solicitation, maybe you need editors. Computers can’t fix all the social problems. 🔥

Categories
code development programming

Translations with React, Redux and Asp.Net

In a recent project, which was the first time I’d used React and Redux in anger, we had a requirement to support two different languages, both in the .NET backend (for emails and PDFs) and in the React frontend.

Since the translators we worked with were used to resx files, we wanted to use those as the master source. In other projects we’ve done the JavaScript translations via a pre-compilation step into static JavaScript files, but since Redux has its own store, I decided to see if we could use that to store and process the translations.

This approach has the advantage that we can use the Redux state to translate the site automatically when a new language is selected, without having to reload any pages. For an application that depends on up-to-date data, and has to operate on poor data connections, avoiding reloads is essential.

The Redux examples shown here use TypeScript, which certainly helped in our development process, but there are a lot of gotchas getting the dotnet new react-redux template and many 3rd party React and Redux libraries working nicely with TypeScript. You can make it work, but it’s definitely not an out-of-the-box solution in most cases.

The solution consists of 3 parts:

  1. The ASP.NET Core 2.0 controller to translate the resx data into JSON (in this example, the resource file is called UserFacingStrings.resx and the default culture is en-US);
  2. The Redux configuration to retrieve the data and populate it into the store; and
  3. A React function that reads the translation from the store and presents it to the user.

The solution below attempts to get the localisation from the user’s headers, so you’ll need to enable localisation in your Startup.cs, but depending on your use case, you may want to save the user’s language settings in their account data, or in a cookie or other storage in their browser.

ASP.NET Core API controller

[Produces("application/json")]
[Route("api/Localisation")]
public class LocalisationController : Controller
{
    private static Dictionary<string, Dictionary<string, string>> _cultureCache = new Dictionary<string, Dictionary<string, string>>();

    /// <summary>Get all known translations.</summary>
    /// <param name="culture">Choose a culture (e.g. `en-US`);
    /// will default to browser culture if not specified.</param>
    [HttpGet("[action]/{culture?}")]
    public IActionResult Translations(string culture)
    {
        try
        {
            var cultureInfo = CultureInfo.CurrentUICulture;
            try
            {
                string preferredLanguage = HttpContext.GetPreferredLanguage();
                if (!string.IsNullOrWhiteSpace(culture))
                {
                    cultureInfo = CultureInfo.GetCultureInfo(culture);
                }
                else if (!string.IsNullOrWhiteSpace(preferredLanguage))
                {
                    cultureInfo = CultureInfo.GetCultureInfo(preferredLanguage);
                }
            }
            catch (CultureNotFoundException)
            {
                // Fallback to en-US
            }

            if (_cultureCache.Keys.Contains(cultureInfo.ToString()))
            {
                return new OkObjectResult(new { t = _cultureCache[cultureInfo.ToString()] });
            }

            var translations = new Dictionary<string, string>();

            // Insert all resources, to then be overwritten
            if (cultureInfo.TwoLetterISOLanguageName != "en")
            {
                ExtractResources(translations, UserFacingStrings.ResourceManager.GetResourceSet(CultureInfo.GetCultureInfo("en-US"), true, true));
            }

            ExtractResources(translations, UserFacingStrings.ResourceManager.GetResourceSet(cultureInfo, true, true));

            _cultureCache[cultureInfo.ToString()] = translations;

            return new OkObjectResult(new { t = translations });
        }
        catch (Exception)
        {
            return NotFound();
        }
    }

    private static void ExtractResources(Dictionary<string, string> translations, ResourceSet resourceSet)
    {
        foreach (DictionaryEntry res in resourceSet)
        {
            translations[res.Key.ToString()] = res.Value.ToString();
        }
    }
}

Browser Extension

public static class BrowserDetailsExtensions
{
    public static string GetPreferredLanguage(this HttpContext http)
    {
        return http?.Request
            ?.Headers["Accept-Language"].ToString()
            ?.Split(',').FirstOrDefault()
            ?.Split(';').FirstOrDefault();
    }
}

Redux plumbing

CultureActions.ts

import { AppThunkAction } from '../store/index';
import { fetch } from 'domain-task';
import { CALL_API } from 'redux-api-middleware';
import { actionCreatorFactory } from 'typescript-fsa';
import { Translations } from '../reducers/CultureState';
import { Dictionary } from 'lodash';

const API_ADDRESS: string = "/api/Localisation/";
const API_GET_TRANSLATIONS: string = "Translations/";
const API_LANGUAGE_CODE: string = ""; // "pt-BR", "en-GB"
const API_REQUEST_TYPE_GET: string = "GET";

export interface ApiError {
    name: 'ApiError',
    status: number,
    statusText: string,
    response: string
}

export interface GetTranslationsAction {
    type: 'RECEIVE_TRANSLATIONS';
    t: Dictionary<string>;
};

export interface TranslationsApiPosted {
    type: 'Translations_REQUEST_POSTED';
    payload: never;
};

export interface TranslationsApiFailed {
    type: 'Translations_FAILURE';
    payload: ApiError;
    error: true;
};

export interface TranslationsApiSuccess {
    type: 'Translations_SUCCESS';
    t: Dictionary<string>;
}

// Declare a 'discriminated union' type. This guarantees that all references to 'type' properties contain one of the
// declared type strings (and not any other arbitrary string).
export type CultureAction = GetTranslationsAction;

const actionCreator = actionCreatorFactory();
export const translationsAwaitingResponse = actionCreator<void>('Translations_REQUEST_POSTED');
export const translationsSuccess = actionCreator<Translations>('Translations_SUCCESS');
export const translationsFailure = actionCreator<ApiError>('Translations_FAILURE');

// ACTION CREATORS don't directly mutate state, but they can have external side-effects (such as loading data).
export const cultureActions = {
    getTranslations: (): AppThunkAction<any> => (dispatch, getState) => {
        console.log("REQUEST TRANSLATIONS ACTION")
        const rsaaRequestAuctions = {
            [CALL_API]: {
                credentials: 'same-origin',
                endpoint: API_ADDRESS + API_GET_TRANSLATIONS + API_LANGUAGE_CODE,
                method: API_REQUEST_TYPE_GET,
                types: ['Translations_REQUEST_POSTED', 'Translations_SUCCESS', 'Translations_FAILURE']
            }
        }
        dispatch(rsaaRequestAuctions);
    },
};

CultureState.ts and the tr() method

import { Dictionary } from "lodash";
import * as He from 'he';
import { Decimal } from "decimal.js";

export type CultureState = {
    strings: Translations
};

export interface Translations {
    t: Dictionary<string>
};

export const emptyCulture: CultureState = {
    strings: {
        t: {}
    }
};

export function tr(culture: CultureState, key: string): string {
    try {
        return He.decode(culture.strings.t[key]) || key;
    } catch (error) {
        console.log("translation for '" + key + "' not found");
        if (key == undefined || key == null) {
            return "- -";
        }
        return "-".repeat(key.length);
    }
}

// Returns the key value itself if no match is found in the resx
export function trFallback(culture: CultureState, key: string): string {
    try {
        return He.decode(culture.strings.t[key]) || key;
    } catch (error) {
        return key;
    }
}

export function trFormat(culture: CultureState, key: string, args: string[]): string {
    return formatString(tr(culture, key), args);
}

export function formatDecimalAmount(subject: Decimal): string {
    return formatAmount(subject.toNumber());
};

export function formatAmount(subject: number): string {
    return subject.toLocaleString(navigator.language, { maximumFractionDigits: 2 });
};

function formatString(subject: string, args: string[]): string {
    if (subject === undefined) {
        return "";
    }
    return subject.replace(/{(\d+)}/g, function (match, number) {
        return typeof args[number] != 'undefined'
            ? args[number]
            : match
        ;
     });
};

CultureReducer.ts

import { Action, Reducer } from 'redux';
import { isType } from 'typescript-fsa';
import { Translations, emptyCulture, CultureState } from './CultureState';
import { ApiError, CultureAction, translationsAwaitingResponse, translationsFailure, translationsSuccess } from '../actions/CultureActions';
import { Dictionary } from 'lodash';

export interface CultureStateR extends CultureState { }

const unloadedState: CultureStateR = emptyCulture;

export const reducer: Reducer<CultureStateR> = (state: CultureStateR, incomingAction: Action) => {
    const action = incomingAction as CultureAction;

    if (state === undefined) {
        return unloadedState;
    }

    if (isType(incomingAction, translationsAwaitingResponse)) {
        return {
            strings: state.strings
        };
    }
    if (isType(incomingAction, translationsFailure)) {
        return {
            strings: state.strings
        };
    }
    if (isType(incomingAction, translationsSuccess)) {
        const rawTranslations: Dictionary<string> = incomingAction.payload.t;
        return {
            strings: { t: rawTranslations }
        };
    }

    switch (action.type) {
        case 'RECEIVE_TRANSLATIONS':
            return state;

        default:
            break;
    }

    return state || unloadedState;
}

Use in the Account component

import { CultureState, tr } from '../reducers/CultureState';
/// Some imports removed for clarity

/// Add CultureState into props for this component
type AccountProps = AccountState & CultureState & typeof accountActions & RouteComponentProps;

export class Account extends React.Component<AccountProps, {}> {
}

Use tr() in view tsx file

<h2>{tr(this.props, "PersonalDetails")}</h2>
<table>
<tbody>
<tr>
<td>{tr(this.props, "Name")}:</td>
<td>{this.props.user.userName}</td>
<td><a href="/account/profile">{tr(this.props, "Edit")}</a></td>
</tr>
<tr>
<td>{tr(this.props, "Company")}:</td>
<td>{this.props.user.companyName}</td>
</tr>
<tr>
<td>{tr(this.props, "Email")}:</td>
<td>{this.props.user.emailAddress}</td>
</tr>
</tbody>
</table>
Categories
development programming security

NMandelbrot : running arbitrary code on client

As part of my grand plan for map-reduce in JavaScript and zero-install distributed computing, I had to think about how to gain user trust in a security context where we don’t trust the server. I couldn’t come up with a good answer.

Since then, we’ve seen stories of malicious JavaScript installed to mine cryptocurrencies, we know that JavaScript can be exploited to read kernel memory, including passwords, on the client, and I suspect we’ll see a lot more restrictions on what JavaScript is allowed to do – although as the Spectre exploit is fundamentally an array read, it’s going to be a complex fix at multiple levels.

I had ideas about how to sandbox the client JavaScript (I was looking at Python’s virtualenv and Docker containers to isolate code, as well as locking them into service workers which already have a vastly more limited API), but that relies on the browser and OS maintaining separation, and if VMs can’t maintain separation with their levels of isolation, it’s not an easy job for browser developers, or anyone running on their platform.
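
One cheap approximation in the browser is to push the untrusted work into a Worker and treat it as a message-passing black box with a deadline. It’s a throttle rather than real isolation (a Worker still has network access, and none of this survives a CPU side-channel), so treat the sketch below as an assumption-laden outline:

// Run untrusted code in a dedicated Worker: no DOM access, communication only via messages,
// and a hard timeout so a hostile or broken payload can't hang the page.
function runSandboxed(untrustedSource: string, input: unknown, timeoutMs = 1000): Promise<unknown> {
  const blob = new Blob([untrustedSource], { type: 'application/javascript' });
  const worker = new Worker(URL.createObjectURL(blob));

  return new Promise((resolve, reject) => {
    const timer = setTimeout(() => {
      worker.terminate();
      reject(new Error('sandboxed work timed out'));
    }, timeoutMs);

    worker.onmessage = (event) => {
      clearTimeout(timer);
      worker.terminate();
      resolve(event.data);
    };

    worker.postMessage(input);
  });
}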

I also wondered if the clients should be written in a functional language that transpiled to JavaScript, to have language level enforcement of immutability and safety checks. And of course, because a functional style and API provides a simpler context to reason about map-reduce, by avoiding any implicit shared context.

Do you allow someone else’s JavaScript on your site, whether a library, or a tracking script, or random ads from Russia, Korea, botnets and script kiddies? How do you keep your customers safe? And how do you isolate processes you trust from processes that deal with the outside world and users? JavaScript will be more secure in the future, and the research is fascinating (JavaScript Zero: real JavaScript, and zero side-channel attacks) but can you afford to wait?

Meltdown and Spectre shouldn’t change any of this. But now is a good time to think about it. Make 2018 the year you become paranoid about users, 3rd parties and other threats. The year is still young, but the exploits are piling up.

 

Categories
development programming

Cloud thinking : storage as data structures

We’ve all experienced the performance implications of saving files. We use buffers, and we use them asynchronously.

Azure storage has Blobs. They look like files, and they fail like files if you write a line at a time rather than buffering. But they’re not files, they’re data structures, and you need to understand them as you need to understand O(N) when looking at Arrays and Linked Lists. Are you optimising for inserts, appends or reads, or cost?

I know it’s tempting to ignore the performance and just blame “the network” but you own your dependencies and you can do better. Understand your tools, and embrace data structures as persistent storage. After all, in the new serverless world, that’s all you’ve got.
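
A hedged sketch of what that means in practice (the appendBlock callback stands in for whichever storage SDK call you use, and the block size is illustrative): buffer locally and flush in large chunks, rather than paying a round trip, and committing a block, per line.

// Hypothetical uploader: collect lines locally and flush them as one block,
// instead of one network round trip per line.
class BufferedBlobWriter {
  private buffer: string[] = [];
  private bufferedBytes = 0;

  constructor(
    private appendBlock: (data: string) => Promise<void>, // injected SDK call, e.g. an append-blob client
    private flushThresholdBytes = 4 * 1024 * 1024,        // flush in ~4 MB blocks, not per line
  ) {}

  async writeLine(line: string): Promise<void> {
    this.buffer.push(line + '\n');
    this.bufferedBytes += Buffer.byteLength(line, 'utf8') + 1;
    if (this.bufferedBytes >= this.flushThresholdBytes) {
      await this.flush();
    }
  }

  async flush(): Promise<void> {
    if (this.bufferedBytes === 0) return;
    await this.appendBlock(this.buffer.join(''));
    this.buffer = [];
    this.bufferedBytes = 0;
  }
}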


Understanding Block Blobs, Append Blobs, and Page Blobs | Microsoft Docs