Categories
.net code development programming

return Task vs return await Task


The asynchronous pattern in C# is great for simplifying the boilerplate of dealing with threads. It makes things look simple, but as with all abstractions, there are leaks.

In most cases, when one async method simply delegates to another with no further work afterwards (e.g. a public method calling a private method directly), returning the Task is fine, and it avoids the overhead of the await state machine for that method.

BUT never do this inside a using, or any other scoped construct, where an object required by the inner call needs that context in order to complete successfully.

In the following example, the first variation, without the await, returns a Task that escapes the using scope of the database connection at the end of the method, so by the time the query actually runs, the database connection has been closed. You need the await in the second variation to postpone the disposal of the database connection until the GetUserAsync call completes.

In this example it’s fairly obvious, but the same pattern can cause threads to unravel far from where the fray was introduced.

class UserDetailsRepository
{
    private readonly IDbConnectionFactory _dbConnectionFactory;

    public UserDetailsRepository(IDbConnectionFactory dbConnectionFactory)
    {
        _dbConnectionFactory = dbConnectionFactory;
    }

    // BROKEN: the Task escapes the using block, so the connection is
    // disposed before the query runs
    public Task<UserDetailsResult> GetUserDetails(string userId, CancellationToken cancellationToken = default)
    {
        using (var dbConnection = _dbConnectionFactory.CreateConnection())
        {
            return GetUserAsync(userId, dbConnection, cancellationToken);
        }
    }

    // CORRECT: the await keeps the connection alive until the call completes
    public async Task<UserDetailsResult> GetUserDetailsAsync(string userId, CancellationToken cancellationToken = default)
    {
        using (var dbConnection = _dbConnectionFactory.CreateConnection())
        {
            return await GetUserAsync(userId, dbConnection, cancellationToken);
        }
    }
}

Categories
.net code development

Thoughts on Global Azure Bootcamp 2019

I’ve got a collaborative post coming up on the talks themselves on my employer’s blog but as a speaker and tech enthusiast, I wanted to share a few thoughts on the bootcamp as a whole.

Firstly, I’d like to thank Gregor Suttie who organised the Glasgow chapter under the Glasgow Azure User Group banner.

It’s an impressive feat getting so many speakers on so many topics around the world. Each city is going to be limited in the talks they can offer, but I was impressed by the distances some of the speakers at the Glasgow event traveled.

I would have liked more of a global feel. I know the challenges of live video, especially interactive video, but this is an event that would benefit from some of it (or indeed, from more speakers participating in the online bootcamp via pre-recorded YouTube videos). I realize an event like Google I/O or Microsoft Build is different in focus, being company rather than community driven, but this felt like a set of parallel events rather than one, so some of the content is going to be lost; there were a lot of interesting-looking talks in other cities that turned up on the Twitter hashtags.

There’s obviously a lot to cover under the Azure umbrella, so it’s going to be hard to find talks that interest all of the audience all of the time, and it was hard to know where to pitch the content. I aimed for a beginner-level overview, which I think was the right pitch for CosmosDb with the Glasgow audience, but I was helped by the serendipity of coming after an event sourcing talk, so I could stand on the shoulders of that talk for some of my content.

I would maybe have liked to see more “virtual tracks” so that it was easier to follow themes within the hashtags, whether general themes like “data” or “serverless”, or technology/tool focused ones like “CosmosDB”, “Azure DevOps” or “Office 365”, to help me connect with the other channels and see what content is most interesting to follow up. Although Twitter was a good basis for it, I think there’s scope to build a conversational overview on top, into which YouTube videos, Twitter content, GitHub links, blog posts, official documentation and slide content could be fed.

As a speaker, the biggest challenge was keeping my knowledge up to date with all the updates that keep landing, and events like this do help. But as I’ve been on a project that’s Docker and SQL focused recently, it’s a lot of work on top of my day job to keep in touch with the latest updates to CosmosDb, especially as a few tickets on its roadmap were picked up and moved to “In Progress” between submitting and delivering my talk, and a new C# SDK was released.

Categories
code development programming

Translations with React, Redux and Asp.Net

In a recent project, which was the first time I’ve used React and Redux in anger, we had a requirement to support 2 different languages, in the .Net backend for emails and PDFs, and in the React frontend.

As our translators were used to working with resx files, we wanted to use those as the master source. In other projects we’ve done the javascript translations via a pre-compilation step into static javascript files, but since redux has its own store, I decided to see if we could use that to store and process the translations.

This approach has the advantage that we can use the redux state to translate the site automatically when a new language is selected, without having to reload any pages. For an application that depends on up-to-date data, and has to operate over poor data connections, avoiding reloads is essential.

The redux examples shown here use TypeScript, which certainly helped in our development process, but there are a lot of gotchas in getting the dotnet new react-redux template and many 3rd-party react and redux libraries working nicely with TypeScript. You can make it work, but it’s definitely not an out-of-the-box solution in most cases.

The solution consists of 3 parts:

  1. The Asp.Net Core 2.0 controller to translate the resx data into JSON (in this example, the resource file is called UserFacingStrings.resx and the default culture is en-US);
  2. The Redux configuration to retrieve the data and populate it into the store; and
  3. A React function that reads the translation from the store and presents it to the user.

The solution below attempts to get the localisation from the user’s request headers, so you’ll need to enable localisation in your Startup.cs; but depending on your use case, you may want to save the user’s language settings in their account data, or in a cookie or other storage in their browser.
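For reference, enabling request localisation looks something like this in Startup.cs. This is a minimal sketch, assuming en-US as the default culture and pt-BR as the one extra language; adjust the supported cultures to match your resx files.

using System.Globalization;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Localization;

public void Configure(IApplicationBuilder app)
{
    // Sets CultureInfo.CurrentUICulture per request from the Accept-Language header,
    // which the controller below relies on as its fallback
    var supportedCultures = new[] { new CultureInfo("en-US"), new CultureInfo("pt-BR") };
    app.UseRequestLocalization(new RequestLocalizationOptions
    {
        DefaultRequestCulture = new RequestCulture("en-US"),
        SupportedCultures = supportedCultures,
        SupportedUICultures = supportedCultures
    });

    app.UseMvc();
}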

ASP.Net Core API controller

[Produces("application/json")]
[Route("api/Localisation")]
public class LocalisationController : Controller
{
    private static readonly Dictionary<string, Dictionary<string, string>> _cultureCache = new Dictionary<string, Dictionary<string, string>>();

    /// <summary>Get all known translations.</summary>
    /// <param name="culture">Choose a culture (e.g. `en-US`);
    /// will default to the browser culture if not specified.</param>
    [HttpGet("[action]/{culture?}")]
    public IActionResult Translations(string culture)
    {
        try
        {
            var cultureInfo = CultureInfo.CurrentUICulture;
            try
            {
                string preferredLanguage = HttpContext.GetPreferredLanguage();
                if (!string.IsNullOrWhiteSpace(culture))
                {
                    cultureInfo = CultureInfo.GetCultureInfo(culture);
                }
                else if (!string.IsNullOrWhiteSpace(preferredLanguage))
                {
                    cultureInfo = CultureInfo.GetCultureInfo(preferredLanguage);
                }
            }
            catch (CultureNotFoundException)
            {
                // Unknown culture requested - fall back to the default UI culture
            }

            if (_cultureCache.ContainsKey(cultureInfo.ToString()))
            {
                return new OkObjectResult(new { t = _cultureCache[cultureInfo.ToString()] });
            }

            var translations = new Dictionary<string, string>();

            // Insert all default-culture resources first, to be overwritten by the translations
            if (cultureInfo.TwoLetterISOLanguageName != "en")
            {
                ExtractResources(translations, UserFacingStrings.ResourceManager.GetResourceSet(CultureInfo.GetCultureInfo("en-US"), true, true));
            }

            ExtractResources(translations, UserFacingStrings.ResourceManager.GetResourceSet(cultureInfo, true, true));

            _cultureCache[cultureInfo.ToString()] = translations;

            return new OkObjectResult(new { t = translations });
        }
        catch (Exception)
        {
            return NotFound();
        }
    }

    private static void ExtractResources(Dictionary<string, string> translations, ResourceSet resourceSet)
    {
        foreach (DictionaryEntry res in resourceSet)
        {
            translations[res.Key.ToString()] = res.Value.ToString();
        }
    }
}
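The endpoint wraps the translations in a t property, which is the shape the Redux code below expects. A request might return something like this (the keys and values here are hypothetical):

GET /api/Localisation/Translations/pt-BR

{ "t": { "Name": "Nome", "Company": "Empresa", "Edit": "Editar" } }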

Browser language extension

public static class BrowserDetailsExtensions
{
    public static string GetPreferredLanguage(this HttpContext http)
    {
        return http?.Request
            ?.Headers["Accept-Language"].ToString()
            ?.Split(',').FirstOrDefault()
            ?.Split(';').FirstOrDefault();
    }
}

Redux plumbing

CultureActions.ts

import { AppThunkAction } from '../store/index';
import { CALL_API } from 'redux-api-middleware';
import { actionCreatorFactory } from 'typescript-fsa';
import { Translations } from '../reducers/CultureState';
import { Dictionary } from 'lodash';

const API_ADDRESS: string = "/api/Localisation/";
const API_GET_TRANSLATIONS: string = "Translations/";
const API_LANGUAGE_CODE: string = ""; // e.g. "pt-BR", "en-GB"; empty falls back to the browser culture
const API_REQUEST_TYPE_GET: string = "GET";

export interface ApiError {
    name: 'ApiError',
    status: number,
    statusText: string,
    response: string
}

export interface GetTranslationsAction {
    type: 'RECEIVE_TRANSLATIONS';
    t: Dictionary<string>;
};

export interface TranslationsApiPosted {
    type: 'Translations_REQUEST_POSTED';
    payload: never;
};

export interface TranslationsApiFailed {
    type: 'Translations_FAILURE';
    payload: ApiError;
    error: true;
};

export interface TranslationsApiSuccess {
    type: 'Translations_SUCCESS';
    payload: Translations;
}

// Declare a 'discriminated union' type. This guarantees that all references to 'type' properties contain one of the
// declared type strings (and not any other arbitrary string).
export type CultureAction = GetTranslationsAction;

const actionCreator = actionCreatorFactory();
export const translationsAwaitingResponse = actionCreator<never>('Translations_REQUEST_POSTED');
export const translationsSuccess = actionCreator<Translations>('Translations_SUCCESS');
export const translationsFailure = actionCreator<ApiError>('Translations_FAILURE');

// ACTION CREATORS don't directly mutate state, but they can have external side-effects (such as loading data).
export const cultureActions = {
    getTranslations: (): AppThunkAction => (dispatch, getState) => {
        const rsaaRequestTranslations = {
            [CALL_API]: {
                credentials: 'same-origin',
                endpoint: API_ADDRESS + API_GET_TRANSLATIONS + API_LANGUAGE_CODE,
                method: API_REQUEST_TYPE_GET,
                types: ['Translations_REQUEST_POSTED', 'Translations_SUCCESS', 'Translations_FAILURE']
            }
        };
        dispatch(rsaaRequestTranslations);
    },
};

CultureState.ts and the tr() method

import { Dictionary } from "lodash";
import * as He from 'he';
import { Decimal } from "decimal.js";

export type CultureState = {
    strings: Translations
};

export interface Translations {
    t: Dictionary<string>
};

export const emptyCulture: CultureState = {
    strings: {
        t: {}
    }
};

export function tr(culture: CultureState, key: string): string {
    try {
        // He.decode throws if the key is missing, falling through to the catch below
        return He.decode(culture.strings.t[key]) || key;
    } catch (error) {
        console.log("translation for '" + key + "' not found");
        if (key == null) {
            return "- -";
        }
        return "-".repeat(key.length);
    }
}

// Returns the key value itself if no match is found in the resx
export function trFallback(culture: CultureState, key: string): string {
    try {
        return He.decode(culture.strings.t[key]) || key;
    } catch (error) {
        return key;
    }
}

export function trFormat(culture: CultureState, key: string, args: string[]): string {
    return formatString(tr(culture, key), args);
}

export function formatDecimalAmount(subject: Decimal): string {
    return formatAmount(subject.toNumber());
};

export function formatAmount(subject: number): string {
    return subject.toLocaleString(navigator.language, { maximumFractionDigits: 2 });
};

function formatString(subject: string, args: string[]): string {
    if (subject === undefined) {
        return "";
    }
    return subject.replace(/{(\d+)}/g, function (match, number) {
        return typeof args[number] != 'undefined'
            ? args[number]
            : match
        ;
     });
};
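As a quick usage example (the resource key and value here are hypothetical), trFormat substitutes indexed placeholders in the translated string:

// Assumes a resx entry: WelcomeMessage = "Welcome, {0}! You have {1} new messages."
const greeting = trFormat(culture, "WelcomeMessage", [userName, messageCount.toString()]);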

CultureReducer.ts

import { Action, Reducer } from 'redux';
import { isType } from 'typescript-fsa';
import { Translations, emptyCulture, CultureState } from './CultureState';
import { ApiError, CultureAction, translationsAwaitingResponse, translationsFailure, translationsSuccess } from '../actions/CultureActions';
import { Dictionary } from 'lodash';

export interface CultureStateR extends CultureState { }

const unloadedState: CultureStateR = emptyCulture;

export const reducer: Reducer<CultureStateR> = (state: CultureStateR, incomingAction: Action) => {
    const action = incomingAction as CultureAction;

    if (typeof state === 'undefined') {
        return unloadedState;
    }

    if (isType(incomingAction, translationsAwaitingResponse)) {
        return {
            strings: state.strings
        };
    }
    if (isType(incomingAction, translationsFailure)) {
        return {
            strings: state.strings
        };
    }
    if (isType(incomingAction, translationsSuccess)) {
        const rawTranslations: Dictionary<string> = incomingAction.payload.t;
        return {
            strings: { t: rawTranslations }
        };
    }

    switch (action.type) {
        case 'RECEIVE_TRANSLATIONS':
            return state;

        default:
            break;
    }

    return state || unloadedState;
}
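The reducer then needs to be registered with the store, and redux-api-middleware’s apiMiddleware needs to be in the middleware chain for the CALL_API action above to be handled. A sketch, assuming the store layout from the dotnet new react-redux template (ApplicationState and the reducers map are that template’s conventions, not shown above):

// store/index.ts - a sketch, assuming the react-redux template's store setup
import { CultureState } from '../reducers/CultureState';
import { reducer as cultureReducer } from '../reducers/CultureReducer';

export interface ApplicationState {
    culture: CultureState;
    // ...other state slices
}

export const reducers = {
    culture: cultureReducer,
    // ...other reducers
};

// In configureStore.ts, register redux-api-middleware alongside thunk, e.g.:
//   import { apiMiddleware } from 'redux-api-middleware';
//   applyMiddleware(thunk, apiMiddleware, routerMiddleware(history))

// And dispatch the initial fetch once the store exists (e.g. in boot.tsx):
//   store.dispatch(cultureActions.getTranslations() as any);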

Use in account controller

import { CultureState, tr } from '../reducers/CultureState';
/// Some imports removed for clarity

/// Add CultureState into props for this component
type AccountProps = AccountState & CultureState & typeof accountActions & RouteComponentProps;

export class Account extends React.Component<AccountProps> {
    // Component implementation elided
}
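And this is roughly how the component is connected so those props get populated. A sketch, assuming the culture and account slices sit under culture and account keys in ApplicationState; your store layout may differ:

import { connect } from 'react-redux';
import { ApplicationState } from '../store/index';

export default connect(
    (state: ApplicationState) => ({ ...state.account, ...state.culture }), // state slices into props
    { ...accountActions }                                                  // action creators into props
)(Account);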

Use tr() in view tsx file

<h2>{tr(this.props, "PersonalDetails")}</h2>
<table>
<tbody>
<tr>
<td>{tr(this.props, "Name")}:</td>
<td>{this.props.user.userName}</td>
<td><a href="/account/profile">{tr(this.props, "Edit")}</a></td>
</tr>
<tr>
<td>{tr(this.props, "Company")}:</td>
<td>{this.props.user.companyName}</td>
</tr>
<tr>
<td>{tr(this.props, "Email")}:</td>
<td>{this.props.user.emailAddress}</td>
</tr>
</tbody>
</table>
Categories
.net development

Debugging Asp.Net Core 2 apps in Azure

I’ve been getting under the skin of Asp.Net Core 2 following a failed experiment with Asp.Net Core v1. I certainly found .Net standard and the associated framework support to be very welcome, and unlike my previous project we didn’t need to support GDI or SignalR libraries, so it’s much more in the Core sweet spot.

I’ll talk about the project and the technologies involved in some follow-up posts, so if you’ve got any real-world questions on Asp.Net Core, React/Redux with TypeScript, or CosmosDb in C#, let me know and I’ll try to address them as I get to those technologies.

For now, though, I want to start with the basics. Asp.Net Core on Azure works great, mostly, but if something goes wrong in Startup.cs, there’s no Application Insights and no web logs to help you, just a generic 502.5 IIS error, which only tells you that IIS can’t talk to Kestrel. So before you deploy to Azure, do yourself a favour and add the following to web.config so you’ve got logs to help you.

<system.webServer>
  <handlers>
    <add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
  </handlers>
  <aspNetCore processPath="dotnet" arguments=".\web.api.dll" stdoutLogEnabled="true" stdoutLogFile=".\logs\stdout" forwardWindowsAuthToken="true" />
</system.webServer>

That way, if you do see a 502.5 error on your site, you can jump into Kudu and start reading the logs. They can grow quite quickly, depending on your web app lifecycle settings, so you may benefit from a regular cleanout of your logs folder.

If the logs still aren’t helping, or you don’t understand what you’re seeing, there’s a nice Asp.Net core 2 troubleshooting guide over at MSDN, but not all of it applies to Azure.

Categories
.net development programming

My .net journey

With the release of Visual Studio 2017 and .net core, I’ve seen a few folk talking about their story with the platform. This is mine.

I’ve never been the biggest Microsoft fan, ever since I grabbed a copy of Mandrake Linux and figured out how much more tinkering was available, and how much more logical certain operations were, than on Windows 95. But it was definitely still a tinkerer’s platform.

But I got an internship at Edinburgh University whilst I was a student there, funded by Microsoft. I got a laptop for the summer and an iPaq (remember those?) to keep. I also got a trip to Amsterdam to meet the other interns and some folk from Microsoft, back before they had much more than sales people in the UK. And they told me: no matter how much anyone hates Microsoft, they always hate Oracle more.

It meant that I was among the first to get the .net 1.0 CD, so I could legitimately claim later that yes, I did have 2 years of .net experience.

But from there, I stayed in Linux, learning the joys of Java threading on Solaris (top tip: Sun really should have known what they were doing there; that they didn’t helps me see some of why they failed – it was far easier working with threads on Red Hat or Windows).

And then I did my PhD, digging into device drivers, DirectX and MFC. I hated Microsoft’s Win32 GUI stuff, but the rest, in C++, was quite nice. I enjoyed being closer to the metal, and shedding the Java ceremony. I trained on templates and started to understand their power. And Java just wasn’t good enough.

I wrote research projects in C++ and data analysis engines in Python. It was good.

But Java came back, and I wrote some media playback for MacOS, and fought iTunes to listen to music. And I vowed never to buy from Apple because both were a right pain.

And I needed a new job. And I’d written bots in IronPython against C#, so I got a .Net job. And I missed the Java and Python communities, the open source chatter. And I wanted to write code in C# that was as beautiful and testable as C++. And I wanted to feel that Ballmer’s Developers! chant was a rallying call, not a lunch order from a corporate monster.

So I found alt.net and it was in Scotland, and I wrote a lot of code, and I learned that open source did exist in c#, and that there was a conference named after that chant and I met more like minded developers. I fought my nervousness and my stumbling voice and I found some confidence to present. And blog. And help write a package manager. And then everyone else learned Ruby.

And then the Scotts joined Microsoft and alt.net became .net. And then LINQ came and I remembered how clean functional programming is, and I started feeling like I was writing Python if I squinted hard, and ignored types. And then core came, and Microsoft had some growing pains. But it’s a sign that the company has completely shifted in the right direction, learning from the guys who left for Ruby. And Node.

I’m proud of what I’ve built in C#, and it’s a much better language than Java, now. It’s definitely the right choice for what I’ve built. The documentation is definitely better than anything Apple or Sun/Oracle produce, although MSDN and docs.microsoft.com are having some migration pains of their own.

And alt.net is making a comeback.

And I still use Python on hobby projects.

Categories
Blogroll code development search

Microsoft Edge, ungooglability and a new class of bugs

Microsoft definitely has a naming problem. .net core was one thing, but calling a browser Edge was just trolling developers. Try searching for “Edge CSS” or “JavaScript Edge”. It’s a lesson in frustration, which means the bugs in the new browser are extra painful to debug, because it’s that much harder to find the blog posts and Q&A from the last person who fixed the problem.

And Edge doesn’t behave like IE, or Firefox, or Chrome. I’m sure Microsoft, like the other vendors, are updating OSS frameworks to help them target Edge, but there’s still a lot of Javascript and CSS that breaks silently, so no Console logs to help, no odd numbers in the calculated CSS, and no hacks to persuade Edge that it can render just like the big browsers.

I want to like the browser, I really do. Anything that brings the end of IE closer has to be welcomed, but even after the Anniversary update of Windows 10, it’s far from ready. If I try to open IIS failure logs in Windows 10, it opens up IE, and displays with the correct CSS, and then tells me I should use Edge, where the CSS is broken. It’s frustrating as a user, and as a developer. It’s an alpha product, and it should have been treated as such. Give it to devs, allow power users to opt in, and iterate it. Microsoft still needs to learn what it means to develop in the open.

Documentation

The problem is then compounded by Microsoft’s documentation problem. For all the faults of IE, at least Microsoft had a good reputation for documentation at the height of MSDN. Unfortunately, MSDN is starting to decay, and there are a number of conflicting alternatives springing up. For us developers, the seemingly preferred route for the latest information is blog posts (or the comments thereon, which were the only source of information for a knotty Docker problem we had), but there’s also GitHub, docs.microsoft.com and the occasional update to the existing MSDN documentation suite.

Microsoft seem to be trying to frustrate developers. Especially when they have evolving, and conflicting APIs (I’m looking at you Azure, and the Python vs PowerShell vs Node APIs, and the Portal experience). The documentation experience at Microsoft feels like the Google UI experience before Material Design. And it needs a similar overhaul.

I love seeing Microsoft trying to be more open and I see it working, to a certain extent, in the C# and .Net space, aside from the .Net Core RC release cycle chaos. They’ve come a long way from the days of alt.Net (although I agree that we need to recapture that passion, both for the sake of new developers, and for the sake of keeping Microsoft in check), but they’re in danger of alienating developers once more with the confusion, and the inconsistencies within certain platforms.

In that context, removing project.json and keeping .csproj was the right decision. One clear and consistent path. Now go and apply it across the board.

Categories
.net code development

Naming things is hard – Microsoft’s Core problem

There are 2 hard problems in computer science: cache invalidation, naming things, and off-by-1 errors.

Microsoft are changing the name of the next iteration of .Net – what was .Net 5.0 is now .Net Core 1.0 – and, like Rick Strahl, I like the idea, but I’m not fond of the timing. Although, as it’s pre-release, anyone writing production code on it was playing with fire anyway. I’m not a big fan of the naming scheme though.

The timing issue is an easy one to solve. I’ve had projects running under placeholder codewords for months whilst the business agreed a proper name. Microsoft has been doing this for years. No-one expected to see Windows Chicago, Windows Threshold or SQL Server Denali ship under those names. No-one would have expected to ship .Net Sapphire (hey, they’ve admitted they were going after the Ruby developers 😉 ). Everyone knew they were codewords, to be replaced pre-release once marketing figured out what year it was being released – or the regression testers discovered how much old code thought 9 < 7.

The problem I have with it is that ASP.Net Core sounds like part of the family that includes ASP.Net MVC, ASP.Net SignalR or ASP.Net AJAX, whereas it actually replaces the ASP.Net part with its own .Net Core stack. I can’t at the moment think of a better way to name ASP.Net Core though, without losing the ASP.Net brand.

The naming works when you compare .Net Framework to .Net Core, and to me it’s as natural as when Netscape open sourced Navigator to give us Firefox (once they managed to choose a unique name). It was from some of the same team, and it did the same job, but the new one was leaner, faster and more focused.

Naming things is hard, but starting a new roadmap and resetting expectations is definitely the right thing to do. .Net and the ASP.Net frameworks have needed a decent overhaul for a while to fix the untestable, interdependent ball of mud that the .Net framework install had become. I like the new Core world, with NuGet and Gulp and Grunt, and Github at the heart of development. This is not a Microsoft I thought I’d see when I was at the ScotAlt.Net meetings all those years ago.

Alt.Net is dead. Long live .Net Core.

Categories
code development krugle programming

Adding HTTP Headers to WaTiN tests

EDIT (2010-08-17 21:26 GMT) : I found a bug in the COM interop that prevented the code from compiling on certain machines. This has now been fixed so if you copied this code before, please try it again.

It’s taken me a while to figure this one out, so I’m putting up a post in case anyone else has a problem. In our system, we need to inject specific headers into each request to simulate our live environment, which means either injecting those expected headers into our tests or having different controller behaviour for dev and live environments.

WaTiN is great for all the other tests we do, but it doesn’t currently support adding HTTP headers to a request, and after a look through the source code, I can see the hoops they’d have to jump through to make it cross-browser. For our purposes, we just needed IE support, so we have the option of intercepting the events on the IE instance directly.

MyHeaders is a property of type object for COM interoperability. It uses the SHDocVw dll to interact with IE. If you allow MyHeaders to be null, you will get an infinite loop and a StackOverflow. I will not guarantee anything about this code, but I hope it’s one small step towards native WaTiN cross-browser HTTP headers support. I am happy to hear about any success you have using or extending this. I will leave setting of test-specific values and compatibility for other browsers as an exercise for the reader.

Snippet

// Fields assumed from the surrounding test fixture
private SHDocVw.InternetExplorer _ieInstance;
private IE _browser;

// Object-typed for COM interoperability. Must never be null, or the
// handlers below will cancel and re-navigate forever.
private object MyHeaders { get; set; }

private void beforenavigate2replaceheaders(object pDisp, ref object URL, ref object Flags, ref object TargetFrameName, ref object PostData, ref object Headers, ref bool Cancel)
{
    if (Headers == null)
    {
        // Cancel the original navigation and re-issue it with our headers
        Cancel = true;
        object objTestHeaders = MyHeaders;
        _ieInstance.Navigate2(ref URL, ref Flags, ref TargetFrameName, ref PostData, ref objTestHeaders);
    }
}

private void beforenavigatereplaceheaders(string URL, int Flags, string TargetFrameName, ref object PostData, string Headers, ref bool Cancel)
{
    if (Headers == null)
    {
        Cancel = true;
        object objFlags = Flags;
        object objTargetFrameName = TargetFrameName;
        object objTestHeaders = MyHeaders;
        _ieInstance.Navigate(URL, ref objFlags, ref objTargetFrameName, ref PostData, ref objTestHeaders);
    }
}

private void SetUpBrowser()
{
    CloseBrowser();
    Settings.AutoMoveMousePointerToTopLeft = false;
    Settings.MakeNewIeInstanceVisible = true;
    Settings.WaitForCompleteTimeOut = 120;

    _ieInstance = new SHDocVw.InternetExplorerClass();
    MyHeaders = "Content-Type: application/json\r\n";
    ((SHDocVw.InternetExplorerClass)_ieInstance).BeforeNavigate += beforenavigatereplaceheaders;
    ((SHDocVw.InternetExplorerClass)_ieInstance).BeforeNavigate2 += beforenavigate2replaceheaders;
    _browser = new IE(_ieInstance) { AutoClose = true };
}