In this video I show you how to leverage Azure Managed Identities to allow access between Azure resources.
(Excuse the audio quality; I need to improve on this.)
I believe I've previously covered C# generics covariance and contravariance; now it's Java's turn.
As you may or may not know, the term PECS stands for "Producer Extends, Consumer Super," an odd acronym coined by Joshua Bloch in his book Effective Java, but it provides a mnemonic for what to do. It means that if a parameterized type represents a producer, use extends. If it represents a consumer, use super. If the parameter is both, don't use wildcards at all: the only type that satisfies both requirements is the explicit type itself.
Covariance in Java uses the extends keyword (yes, even with interfaces); e.g. List<? extends Number> accommodates all types that derive from Number.
Contravariance, on the other hand, uses the super keyword; e.g. List<? super Number> accommodates all the types that Number derives from and, of course, Number itself.
So what exactly is PECS recommending we do?
- Use extends when you only get values out of a data structure
- Use super when you only put values into a data structure
- Use the exact type when you plan on doing both
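The rules above can be sketched in Java (the class and method names here are my own):

```java
import java.util.ArrayList;
import java.util.List;

public class Pecs {

    // Producer: we only read Numbers out of src, so "? extends Number" is enough.
    static double sum(List<? extends Number> src) {
        double total = 0;
        for (Number n : src) {
            total += n.doubleValue();
        }
        return total;
    }

    // Consumer: we only put Integers into dst, so "? super Integer" is enough.
    static void fillWithOnes(List<? super Integer> dst, int count) {
        for (int i = 0; i < count; i++) {
            dst.add(1);
        }
    }

    public static void main(String[] args) {
        List<Integer> ints = new ArrayList<>();
        fillWithOnes(ints, 3);     // List<Integer> is a valid consumer of Integer
        double total = sum(ints);  // List<Integer> is a valid producer of Number

        List<Object> anything = new ArrayList<>();
        fillWithOnes(anything, 2); // List<Object> also satisfies "? super Integer"
    }
}
```

Note that trying to `add` to the `? extends Number` list, or to read a `Number` out of the `? super Integer` list, would not compile; that is exactly the safety PECS buys you.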
If you are debugging with VS2017/9 and want to pass environment variables to your container, then read this post. If you are looking for pictures of cats then sorry, but do leave a comment on how you got here.
Create a new text file; the name doesn't matter, but I called mine Dockerfile.env.
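The file is just KEY=VALUE pairs, one per line; these variable names are made up for illustration:

```
SAMPLE_SETTING=hello-from-docker
ANOTHER_SETTING=42
```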
Add this file to your .csproj file.
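To the best of my recollection, the Visual Studio Container Tools MSBuild property that wires the file up is DockerfileRunEnvironmentFiles; a sketch:

```xml
<PropertyGroup>
  <DockerfileRunEnvironmentFiles>Dockerfile.env</DockerfileRunEnvironmentFiles>
</PropertyGroup>
```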
Not really a step, but you can simply query your environment variable in the usual fashion (Environment.GetEnvironmentVariable()).
Needless to say, when you run in production you'll need to pass the environment variable according to the Docker documentation, which I don't cover here.
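For example, with the Docker CLI (the image name is a placeholder):

```shell
# Pass the same file at run time, outside of Visual Studio
docker run --env-file Dockerfile.env my-registry/my-api:latest
```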
I thought it worth sharing how to configure Azure Active Directory to work with a .net core 2.2 webapi backend and an angular7 front end that uses ADAL (i.e. v1 of Azure AD)
As you may or may not be aware, Azure AD has two implementations of its security protocols; v1 is the common one, but v2 is becoming more popular. From an Angular point of view you will pull in either the ADAL library for v1 or the MSAL library for v2. I'm not going to dwell on what the differences are or why to use either; in a recent project I was working on we found that there was no Java Spring Boot support for v2 at the time, so we went with the v1 endpoints to get our POC up and running quickly.
ASP.NET Web API
To configure ASP.NET Core 2.2 for use with v1 you'll need to validate the JWT bearer token issued by Azure AD.
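A sketch of the Startup wiring, assuming the Microsoft.AspNetCore.Authentication.JwtBearer package; the tenant and audience values are placeholders for your own registration:

```
// Startup.ConfigureServices: validate bearer tokens against the v1 endpoint.
services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        options.Authority = "https://login.microsoftonline.com/<TODO-tenant-id>";
        options.Audience = "<TODO-app-id-uri-or-client-id>";
    });

// Startup.Configure: call before UseMvc() so [Authorize] endpoints are protected.
app.UseAuthentication();
```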
For Angular 7 I used the adal-angular4 library (an unfortunate name, as it is not limited to Angular 4).
The application settings are configured in the environment
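The shape below is my assumption of a typical adal-angular4 config (tenant, clientId and an endpoints map), with placeholder values:

```typescript
// environment.ts (all values are placeholders for your own AD app registrations)
export const environment = {
  production: false,
  adalConfig: {
    tenant: '<TODO-tenant-id>',
    clientId: '<TODO-spa-client-id>',
    // Maps API base URLs to the AD resource/app id the token should be issued for
    endpoints: {
      'https://localhost:5001': '<TODO-webapi-app-id>'
    }
  }
};
```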
The module then adds adal and interceptors via the providers statement
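From memory of the adal-angular4 API (it exposes AdalService, AdalGuard and AdalInterceptor), the module wiring might look like this; the config shape and import paths are assumptions:

```typescript
import { NgModule } from '@angular/core';
import { HttpClientModule, HTTP_INTERCEPTORS } from '@angular/common/http';
import { AdalService, AdalGuard, AdalInterceptor } from 'adal-angular4';
import { environment } from '../environments/environment';

@NgModule({
  imports: [HttpClientModule],
  providers: [
    AdalService,
    AdalGuard,
    // Attaches the bearer token to outgoing requests for configured endpoints
    { provide: HTTP_INTERCEPTORS, useClass: AdalInterceptor, multi: true }
  ]
})
export class AppModule {
  constructor(adal: AdalService) {
    adal.init(environment.adalConfig); // hand the service your AD settings
  }
}
```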
Now when you make an HTTP request, the bearer token will be added by the Angular interceptor and recognised by the webapi.
The C# compiler defaults to the latest major version of the language that has been released. You may choose to compile any project using a new point release of the language. Choosing a newer version of the language enables your project to make use of the latest language features. In other scenarios, you may need to validate that a project compiles cleanly when using an older version of the language.
This capability decouples the decision to install new versions of the SDK and tools in your development environment from the decision to incorporate new language features in a project. You can install the latest SDK and tools on your build machine. Each project can be configured to use a specific version of the language for its build.
The screenshot shows me selecting C# 7.2 for a .net core 2.1 application by changing the advanced options in the project properties Build pane.
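The same setting can be made by hand in the project file via the LangVersion MSBuild property; a sketch:

```xml
<PropertyGroup>
  <LangVersion>7.2</LangVersion>
</PropertyGroup>
```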
Imagine the scenario: you are in a team race; there are a number of stages along the route, however only once all your teammates have gotten to the end of a stage can anyone proceed to the next stage.
Now imagine the competitors are threads/tasks and that you had to write this code…. with the .net Barrier class this is quite trivial.
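A minimal sketch of the idea (the team size, stage count and names are made up):

```
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class TeamRace
{
    static void Main()
    {
        const int teamSize = 3;
        const int stages = 2;

        // The post-phase action fires once per stage, after the whole team arrives.
        var barrier = new Barrier(teamSize,
            b => Console.WriteLine("--- team completed a stage ---"));

        var runners = Enumerable.Range(1, teamSize)
            .Select(id => Task.Run(() =>
            {
                for (int stage = 1; stage <= stages; stage++)
                {
                    Thread.Sleep(new Random(id * stage).Next(50, 200)); // run the stage
                    Console.WriteLine($"runner {id} finished stage {stage}");
                    barrier.SignalAndWait(); // wait for teammates before moving on
                }
            }))
            .ToArray();

        Task.WaitAll(runners);
    }
}
```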
The result looks like this:
In windows we have two types of semaphores, local and named system semaphores.
You can think of a semaphore as a bouncer in a nightclub, whose responsibility is to only allow a certain number of people into the club at any one time.
.net has a lightweight semaphore, 'SemaphoreSlim', that can be used for local communication (that is to say, cross-process system synchronization is not supported).
If you run the code above (e.g. in .net core 2.1 project) you will be presented with the following result
What is happening is that all the tasks try to get access to the semaphore; they are all initially blocked until the semaphore's Release method is called, which allows 3 tasks to enter at any one time.
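A minimal sketch of the pattern (the counts and names are made up):

```
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class Nightclub
{
    static void Main()
    {
        // Initial count 0, max 3: everyone queues at the door until Release is called.
        var bouncer = new SemaphoreSlim(0, 3);

        var clubbers = Enumerable.Range(1, 6)
            .Select(id => Task.Run(async () =>
            {
                await bouncer.WaitAsync();                 // wait to be let in
                Console.WriteLine($"clubber {id} is in the club");
                await Task.Delay(100);                     // enjoy the club
                bouncer.Release();                         // leave, freeing one slot
            }))
            .ToArray();

        bouncer.Release(3); // let 3 in; from now on at most 3 are inside at once
        Task.WaitAll(clubbers);
    }
}
```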
Recently I found myself using Azure managed Kubernetes (AKS); however, the images I wanted to pull were in AWS ECR. If my k8s cluster were in AWS this would be transparent to me, provided the IAM user had permission, but in order to pull such an image from Azure one can create a secret to pull the image. Sadly (or maybe thankfully) this secret expires after 12 hours, so we need to keep refreshing it.
Below I present an approach which could be used. It creates a service account for pulling the image, with RBAC permissions (note I apply the permissions to the default account also, as some of my deployments don't reference this service account yet); there is a Kubernetes job that will execute immediately, and a cronjob that will execute every 8 hours thereafter.
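For reference, the one-off manual version of creating that secret looks roughly like this, with a recent AWS CLI; the account id and region are placeholders, and the cronjob described above just automates this refresh:

```shell
# Create (or refresh) the ECR pull secret by hand
kubectl create secret docker-registry dg-ecr-pull \
  --docker-server="<TODO-account-id>.dkr.ecr.<TODO-region>.amazonaws.com" \
  --docker-username=AWS \
  --docker-password="$(aws ecr get-login-password --region <TODO-region>)"
```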
Just use this secret in your deployments:
- name: dg-ecr-pull
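In context, that line sits under imagePullSecrets in the pod spec; the container name and image below are placeholders:

```
spec:
  template:
    spec:
      imagePullSecrets:
        - name: dg-ecr-pull
      containers:
        - name: my-app   # placeholder
          image: <TODO-account-id>.dkr.ecr.<TODO-region>.amazonaws.com/my-app:latest
```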
Hope this is of benefit to others! Remember to update those <TODO> sections!
If you see this post it means that BlogEngine.NET is running and the hard part of creating your own blog is done. There are only a few things left to do.
To be able to log in, write posts and customize the blog, you need to enable write permissions on the App_Data and Custom folders. If your blog is hosted at a hosting provider, you can either log into your account's admin page or call support.
If you wish to use a database to store your blog data, we still encourage you to enable this write access for any images you may wish to store for your blog posts. If you are interested in using Microsoft SQL Server, MySQL, SQL CE, or other databases, please see the BlogEngine docs to get started.
When you've got write permissions set, you need to change the username and password. Find the sign-in link located either at the bottom or top of the page depending on your current theme and click it. Now enter "admin" in both the username and password fields and click the button. You will now see an admin menu appear. It has a link to the "Users" admin page. From there you can change your password, create new users and set roles and permissions. Passwords are hashed by default, so you'd better configure email in the settings for password recovery to work, or learn how to do it manually.
Configuration and Profile
Now that you have your blog secured, take a look through the settings and give your new blog a title. BlogEngine.NET is set up to take full advantage of many semantic formats and technologies such as FOAF, SIOC and APML. It means that the content stored in your BlogEngine.NET installation will be fully portable and auto-discoverable. Be sure to fill in your author profile to take better advantage of this.
Themes and Plugins
One last thing to consider is customizing the look and behavior of your blog. We have themes and plugins available right out of the box. You can install more right from the admin panel under Custom. You can also check out our high quality themes.
On the web
You can find news, tutorials, documentation, tips and tricks about BlogEngine.NET on the official website. The ongoing development of BlogEngine.NET can be followed at Github.
Good luck and happy writing.
The BlogEngine.NET team
So I've been looking for a serverless framework that can run on-prem and in the cloud. I've been leaning towards OpenFaaS as it appears to be gaining more traction; however, I love Azure Functions and thought, let's see if this is a viable solution.
I downloaded what is a preview, so I wasn't expecting miracles; I'm sharing below the reasons why I can't use it for my own requirements.
It might save some of you guys the effort, I must reiterate that this is still a preview so some of the stuff I say here will be out of date really quickly!
I have decided against Azure Function On Prem in March 2017 because:
- It needs SQL Server; I can't rely on having this, at least not for some brownfield projects I want to use serverless for.
- It needs IIS, and I have to run on Linux (this might be a solved problem… especially as it's using the new .net core runtime).
- The packaging was a Windows installer; I was hoping for some Docker images. I expect this will be solved, and for now the MSI is a quick win for the developers.
Next it's down the rabbit's burrow with OpenFaaS on Kubernetes; cross your fingers for me!
Aside from the above, which are mostly external limitations, it's nice to see Azure Functions running locally.