HTTP Header Propagation in ASP.NET 6 with HttpClientFactory


Today I show you how to add header propagation in ASP.NET.

“Header what?” I hear you say.


Essentially it's a mechanism whereby, when you make HTTP requests via an HttpClient, you can automatically 'forward' headers that were sent to your endpoint.

e.g. a HeaderX that is passed to an ASP.NET endpoint gets added to outgoing HTTP requests made via an HttpClient.


How

Manually:


If we were to do this manually, we would interrogate the incoming HTTP request, find HeaderX and add it to the outgoing HttpClient request headers.
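For illustration, a hand-rolled version might look roughly like this. The service shape, the downstream URL and the use of IHttpContextAccessor are my assumptions (IHttpContextAccessor would need to be registered via AddHttpContextAccessor()):

public class ManualForwardingService
{
    private readonly HttpClient _httpClient;
    private readonly IHttpContextAccessor _httpContextAccessor;

    public ManualForwardingService(HttpClient httpClient, IHttpContextAccessor httpContextAccessor)
    {
        _httpClient = httpClient;
        _httpContextAccessor = httpContextAccessor;
    }

    public async Task<string> GetStuffAsync()
    {
        var request = new HttpRequestMessage(HttpMethod.Get, "https://downstream.example/stuff");

        // Interrogate the incoming request, find HeaderX and copy it onto the outgoing request
        var incoming = _httpContextAccessor.HttpContext?.Request;
        if (incoming != null && incoming.Headers.TryGetValue("HeaderX", out var values))
        {
            request.Headers.TryAddWithoutValidation("HeaderX", values.ToArray());
        }

        var response = await _httpClient.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}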


Propagation:

Let's get ASP.NET to do the heavy lifting.

1) Add the Microsoft.AspNetCore.HeaderPropagation NuGet package


2) Add header propagation (in Startup.cs, or Program.cs for a minimal API)
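In a minimal-API Program.cs this looks something like the following (Header1 and Header2 are the example header names used later in this post):

var builder = WebApplication.CreateBuilder(args);

// Register header propagation and list the headers we want forwarded by default
builder.Services.AddHeaderPropagation(options =>
{
    options.Headers.Add("Header1");
    options.Headers.Add("Header2");
});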

3) Use header propagation (in Startup.cs, or Program.cs for a minimal API)
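And the matching middleware registration, which captures the configured headers from each incoming request:

var app = builder.Build();

// Capture the incoming headers so they can be attached to outgoing HttpClient calls
app.UseHeaderPropagation();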


4) Create a service that takes an HttpClient as a constructor argument
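A sketch of such a service (the name AwesomeService and the downstream URL are just placeholders):

public interface IAwesomeService
{
    Task<string> GetStuffAsync();
}

public class AwesomeService : IAwesomeService
{
    private readonly HttpClient _httpClient;

    // The typed HttpClient is supplied by HttpClientFactory
    public AwesomeService(HttpClient httpClient) => _httpClient = httpClient;

    public Task<string> GetStuffAsync() =>
        _httpClient.GetStringAsync("https://downstream.example/stuff");
}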

5) Add this AwesomeService to your DI config and set the propagated headers

builder.Services.AddTransient<IAwesomeService, AwesomeService>();
builder.Services.AddHttpClient<IAwesomeService, AwesomeService>(o => o.Timeout = TimeSpan.FromMinutes(1))
.AddHeaderPropagation(o => o.Headers.Add("Header1"));

Note: this step is only needed with HttpClientFactory. You can see that, for IAwesomeService, only "Header1" will be propagated, even though we're configuring Header1 and Header2 globally.

6) Testing
One way of testing this is to use Fiddler:
a) Enable the proxy so that it is visible to .NET Core


b) Open Fiddler
I find it’s easier to filter on certain hosts


c) Enter the composer and call (execute) your service with some headers

d) Inspect the outgoing request and verify that Header1 has been propagated


Enjoy!

Parallel Batch Request Handling

 

Picture this:

You find yourself with a big list of resource identifiers

You want to fetch information for each id via an HTTP request

You want to batch the resource identifiers rather than making individual HTTP calls

You want to make multiple parallel requests for these batches

You don't want to manage multi-threaded access to a single response collection

 

You have come to the right place!

 

There are multiple approaches I can think of, but here’s one that may fit your use case:

 

Get Method:

The code in the method itself is not that important!
What you should take from it is that it's an ordinary async/await method that takes a list of ids and returns the results for those ids.
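Something along these lines, assuming a batch endpoint that accepts a comma-separated list of ids (the URL, the Item type, the _httpClient field and the GetItemsAsync name are all illustrative):

// Illustrative only: fetches the items for one batch of ids in a single HTTP request
private async Task<List<Item>> GetItemsAsync(IEnumerable<int> ids)
{
    var url = $"https://api.example.com/items?ids={string.Join(',', ids)}";

    var response = await _httpClient.GetAsync(url);
    response.EnsureSuccessStatusCode();

    return await response.Content.ReadFromJsonAsync<List<Item>>() ?? new List<Item>();
}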

 

To Parallelism and beyond
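The original code isn't reproduced here, but the shape of it is roughly this (reusing the GetItemsAsync sketch above; batchSize is illustrative):

public async Task<List<Item>> GetAllItemsAsync(List<int> ids, int batchSize = 10)
{
    // .NET 6's Chunk splits the ids into batches of at most batchSize items
    var batchTasks = ids.Chunk(batchSize)
                        .Select(batch => GetItemsAsync(batch));

    // Run the batch requests in parallel and wait for them all to complete
    var batchResults = await Task.WhenAll(batchTasks);

    // Flatten the per-batch results into a single list
    return batchResults.SelectMany(items => items).ToList();
}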

 

Let’s unravel what’s happening above.

 

Firstly, we are using .NET 6, where we have the Chunk method from the framework (gone are the days of writing this by hand, thankfully!).
The Chunk method takes the ids list and breaks it into a list of smaller lists of size 'batchSize' or smaller.

e.g.

If we chunked [1,2,3,4,5,6,7,8,9,10,11] with a batch size of 2, then we’d end up with

[[1,2],[3,4],[5,6],[7,8],[9,10],[11]]

 

 

Secondly, we pass these smaller arrays to the GetIds call using a LINQ Select expression.

We await the results of all these selected tasks via Task.WhenAll.

Lastly, to combine all the individual batched responses, we use the LINQ SelectMany expression.

 

I like this approach as it is terse and concise, and it propagates exceptions without wrapping them in an AggregateException.

Await forever – deadlocked so easily

Async/await simplifies async code: use it everywhere and life becomes so simple, right?
While this is true, I've seen situations where developers either chose to, or had to, mix async and non-async code and got themselves into a world of problems.

 

One problem I've seen time and again is with Windows desktop applications, where a simple blocking call on a Task can deadlock the application entirely. Here I demonstrate the problem with a contrived Windows Forms example.

The application simply downloads some HTML asynchronously and displays it in a web browser.

The implementation of the async function, and the click handler that blocks on it, looks roughly like this:

(Let's ignore the urge to make the click handler async; imagine the blocking call was in the form constructor if you must.)
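A contrived sketch of the scenario (control and method names are illustrative):

private void loadButton_Click(object sender, EventArgs e)
{
    // Blocking on the task's Result from the UI thread - this is what deadlocks
    webBrowser1.DocumentText = DownloadHtmlAsync("https://example.com").Result;
}

private async Task<string> DownloadHtmlAsync(string url)
{
    using var client = new HttpClient();

    // The continuation after this await wants to resume on the UI thread
    var html = await client.GetStringAsync(url);
    return html;
}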


How can such a simple bit of code deadlock a Windows application?

Well, the problem occurs because of how async/await state machines work.
I'm really going to simplify this explanation as I want people to grasp it (so grit your teeth if you already know the detail).

 

The async keyword is simply a compiler instruction that doesn't do much, so let's ignore that and focus on the await call.

The await calls an async function and then waits on a callback; when the callback occurs, the code resumes at the next step…

OK, so far so good, this is what we expect… simple, right? Wrong!

 

A quick recap of Windows UI threads and messages

Before we continue, let's have a quick recap of Windows UI threads and message loops.
A message loop is an obligatory section of code in every program that uses a graphical user interface under Microsoft Windows. Windows programs that have a GUI are event-driven. Windows maintains an individual message queue for each thread that has created a window.

Now, as anyone working on a Windows application knows, you must always call any code that updates the UI on the GUI thread; try from any other thread and you'll be presented with a cross-thread exception.

Windows Forms application code can call Invoke/BeginInvoke on a control to execute the code back on the GUI thread; in WPF we would use a Dispatcher, and in UWP/WinRT it's something else again.

Another approach is to use the SynchronizationContext construct, which knows whether it should call Invoke, a Dispatcher, or something else on our behalf.
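As a rough sketch of that idea (the work and the label are placeholders):

// Capture the UI SynchronizationContext while we're still on the UI thread
var uiContext = SynchronizationContext.Current;

Task.Run(() =>
{
    var status = DoSomeSlowWork(); // illustrative background work

    // Post marshals the UI update back for us - on Windows Forms this
    // ends up going through Control.BeginInvoke under the covers
    uiContext?.Post(_ => statusLabel.Text = status, null);
});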

 

Back to await

The callback I mentioned above is smart in that it tries to use the existing SynchronizationContext if one exists, so when that await finally returns we're back on the GUI thread and can update the UI without those pesky errors.
In our calling code above, we never left the GUI thread, as we were blocked on the Result of the task.

The crux of the problem is that the await callback posts a message to the Windows message queue to tell it to continue, but the message loop is blocked in that call to .Result on the task, so we're well and truly deadlocked.

 

Solutions

I'll avoid just telling you to embrace async/await everywhere, and offer some alternative solutions…

1) ConfigureAwait(false)


This works because it tells the awaited task not to continue on the current synchronization context (which, let's remember, is the GUI thread); another context is used to complete the async/await state machine callback, allowing the task to complete.
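Applied to the earlier sketch, that would look like this:

private async Task<string> DownloadHtmlAsync(string url)
{
    using var client = new HttpClient();

    // Don't capture the UI SynchronizationContext: the continuation runs on a
    // thread pool thread instead, so the blocked UI thread isn't needed to finish
    var html = await client.GetStringAsync(url).ConfigureAwait(false);
    return html;
}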

2) Run on another synchronization context you create
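The original snippet isn't reproduced here; one common variant of this idea is to push the call onto the thread pool (where there is no UI SynchronizationContext to capture) before blocking:

private void loadButton_Click(object sender, EventArgs e)
{
    // Task.Run starts the async method on a thread pool thread, so the await inside
    // it doesn't capture the UI context, and blocking on Result no longer deadlocks
    webBrowser1.DocumentText = Task.Run(() => DownloadHtmlAsync("https://example.com")).Result;
}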

 

3) Use async/await everywhere
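For completeness, the 'async all the way' version of the earlier click handler:

// The handler itself becomes async, so nothing ever blocks the UI thread
private async void loadButton_Click(object sender, EventArgs e)
{
    webBrowser1.DocumentText = await DownloadHtmlAsync("https://example.com");
}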

Summary

This is a very dumbed-down explanation of how you might encounter a deadlock with async/await.

If it happens, don't panic; it can easily be fixed once you know what's happening. Best practice is almost always to avoid working around the problem and to use async/await throughout.

The standard documentation is great, and there are lots of really good articles floating about, e.g. https://devblogs.microsoft.com/dotnet/configureawait-faq/

Azure Arc Server Registration Error


Today while adding a new server to an Azure subscription I encountered the following error:

The subscription is not registered to use namespace 'Microsoft.HybridCompute'

In this video I show you how to register the Hybrid Compute provider in your subscription to overcome this obstacle (typically done with: az provider register --namespace Microsoft.HybridCompute).

Bitbucket - Pipelines - Terraform - Private Modules

So you've started using Terraform

You've progressed to creating Terraform modules

You've put your module in a private Bitbucket repo

Now you want to access it from a Bitbucket build pipeline and you see the following error:

Solution

In my case I reached out to one of my friendly DevOps colleagues, @BlnaryMlke, who showed me how SSH keys and Git hang together. I don't know if I should be ashamed to say I'd never used Git with SSH keys until today.

Armed with this new knowledge, I set off to do the same in my Bitbucket pipeline, only to discover that Bitbucket has built-in support for this scenario!

What follows are the steps required in a Bitbucket pipeline in order to use a private Bitbucket Git repo that contains a Terraform module.

Show me

1) First, create a new SSH key in the project that contains your pipeline (i.e. the project that is consuming the Terraform module); you'll find this option under project settings, Pipelines / SSH keys.