AWS examples in C# – run the solution


Post summary: Explanation of how to install and use the solution in AWS examples in C# blog post series.

This post is part of AWS examples in C# – working with SQS, DynamoDB, Lambda, ECS series. The code used for this series of blog posts is located in aws.examples.csharp GitHub repository. In the current post, I give information on how to install and run the project.


The solution has commands to deploy to the cloud as well as to clean up resources. Note that not all resources are cleaned up, read more in the Cleanup section. In order to run it in AWS, a valid account is needed. I am not going to describe how to create an account. If an account is present, then there is also knowledge and awareness of how to use it.

Important: current examples generate costs on AWS account. Use cautiously at your own risk!


The project was tested to work on Linux and Windows. On Windows, it works only with Git Bash. The project requires a valid AWS account.

Required installations

In order to fully run and enjoy the project, the following needs to be installed:


AWS CLI has to be configured in order to run properly. This happens with aws configure. If there is no AWS account, this is not an issue, put some values for the access and secret key and put a correct region, like us-east-1.

Import the Postman collection in order to be able to try the examples. The Postman collection is in the aws.examples.csharp.postman_collection.json file in the code. This is an optional step, below there are cURL examples as well.

Run the project in AWS

Running on AWS requires setting the following environment variables:

export AwsAccessKey=KIA57FV4.....
export AwsSecretKey=mSgsxOWVh...
export AwsRegion=us-east-1

Then the solution is deployed to AWS with the ./ script. Note that the output of the command gives the API Gateway URL and API key, as well as the SqsWriter and SqsReader endpoints. See the image below:

Run the project partly in AWS

The most expensive part of the current setup is running the docker containers in EC2 (Elastic Compute Cloud). I have prepared a script called ./, which runs the containers locally, while everything else is in AWS. Still, the environment variables from the previous section are mandatory to be set. I believe this is the optimal variant if you want to test and run the code in a “production-like” environment.

Run the project in LocalStack

LocalStack provides an easy-to-use test/mocking framework for developing cloud applications. It spins up a testing environment on the local machine that provides the same functionality and APIs as the real AWS cloud environment. I have experimented with LocalStack; there is a localstack branch in GitHub. The solution can be run with the command. I cannot really estimate whether this is a good alternative, because I am running the free tier in AWS and the most expensive part is ECS, which I skip by running the containers locally instead of in AWS. See the previous section on how to run a hybrid.


There is a Postman collection which allows easy firing of the requests. Another option is to use cURL; examples of all requests with their Postman names are given below.


SqsWriter is a .NET Core 3.0 application that is dockerized and runs in AWS ECS (Elastic Container Service). It exposes an API that can be used to publish Actor or Movie objects. There is also a health check request. After AWS deployment, the proper endpoint is needed. The endpoint can be found in the output of the deployment scripts. See the image above.


curl --location --request POST 'http://localhost:5100/api/publish/actor' \
--header 'Content-Type: application/json' \
--data-raw '{
	"FirstName": "Bruce",
	"LastName": "Willis"
}'


curl --location --request POST 'http://localhost:5100/api/publish/movie' \
--header 'Content-Type: application/json' \
--data-raw '{
	"Title": "Die Hard",
	"Genre": "Action Movie"
}'

When an Actor or Movie is published, it goes to the SQS queue; SqsReader picks it up from there and processes it. What is visible in the logs is that both LogEntryMessageProcessor and ActorMessageProcessor are invoked. See the screenshot:


curl --location --request GET 'http://localhost:5100/health'


SqsReader is a .NET Core 3.0 application that is dockerized and runs in AWS ECS. It has a consumer that listens to the SQS queue and processes the messages by writing them into the appropriate AWS DynamoDB tables. It also exposes an API to stop or start processing, as well as to reprocess the dead-letter queue or simply get the queue status. After AWS deployment, the proper endpoint is needed. The endpoint can be found in the output of the deployment scripts. See the image above.


curl --location --request POST 'http://localhost:5200/api/consumer/start' \
--header 'Content-Type: application/json' \
--data-raw ''


curl --location --request POST 'http://localhost:5200/api/consumer/stop' \
--header 'Content-Type: application/json' \
--data-raw ''


curl --location --request GET 'http://localhost:5200/api/consumer/status'

If this one is invoked with no messages in the dead-letter queue, then it takes 20 seconds to finish, because it actually waits for the long polling timeout.

curl --location --request POST 'http://localhost:5200/api/consumer/reprocess' \
--header 'Content-Type: application/json' \
--data-raw ''


curl --location --request GET 'http://localhost:5200/health'


This lambda is managed by the Serverless framework. It is exposed as a REST API via AWS API Gateway. It also has a custom authorizer as well as an API key attached. Those are described in a further post.


In the case of AWS, the API key and URL are needed; those can be obtained from the deployment command logs. See the screenshot above. Put correct values into the CHANGE_ME and REGION placeholders. The request is:

curl --location --request POST '' \
--header 'Content-Type: application/json' \
--header 'x-api-key: CHANGE_ME' \
--header 'Authorization: Bearer validToken' \
--data-raw '{
    "FirstName": "Bruce",
    "LastName": "Willis"
}'



Put correct values into the CHANGE_ME and REGION placeholders. The request is:

curl --location --request GET ' Hard'


Nota bene: This is a very important step, as leaving the solution running in AWS will accumulate costs.

In order to stop and clean up all AWS resources, run the ./ script.

Nota bene: There is a resource that is not automatically deleted by the scripts. This is a Route53 resource created by AWS Cloud Map. It has to be deleted with the following commands. Note that the id in the delete command comes from the result of the list-namespaces command.

aws servicediscovery list-namespaces
aws servicediscovery delete-namespace --id ns-kneie4niu6pwwela

Verify cleanup

In order to be sure there are no leftovers from the examples, the following AWS services have to be checked:

  • SQS
  • DynamoDB
  • IAM -> Roles
  • EC2 -> Security Groups
  • ECS -> Clusters
  • ECS -> Task Definitions
  • ECR -> Repositories
  • Lambda -> Functions
  • Lambda -> Applications
  • CloudFormation -> Stacks
  • S3
  • CloudWatch -> Log Groups
  • Route 53
  • AWS Cloud Map

On top of it, Billing should be regularly monitored to ensure no costs are applied.


This post describes how to run and try the solution described in the AWS examples in C# – working with SQS, DynamoDB, Lambda, ECS series.

Related Posts


Introduction to Postman with examples


Post summary: This post is demonstrating different Postman features with examples.

All examples shown in this post are available at the Postman Examples link and can be imported into Postman. Environments are also used in the attached examples and are available in Admin environment and User environment. In order to run all the examples, you need to download and run the Dropwizard stub described in the Build a RESTful stub server with Dropwizard post and available in the sample-dropwizard-rest-stub GitHub repo; otherwise, you can just see the Postman code and screenshots.


Postman is a Chrome add-on and Mac application which is used to fire requests to an API. It is very lightweight and fast. Requests can be organized in groups, and tests can be created with verifications for certain conditions on the response. With its features, it is a very good and convenient API tool. It is possible to make different kinds of HTTP requests – GET, POST, PUT, PATCH and DELETE. It is possible to add headers to the requests. In the current post, I will write about its more interesting features: Variables, Pre-Request Script, Environments, and Tests.



There are two types of variables – global and environment. Global variables are for all requests; environment variables are defined per specific environment, which can be selected from a drop-down, or no environment can be selected. Environments are discussed in detail later in the current post. Global variables are editable via a small eye-shaped icon in the top right corner. Once defined, variables can be used in a request in a format surrounded by double curly brackets: {{VARIABLE_NAME}}.
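As a minimal illustration of what this substitution amounts to, here is a plain JavaScript sketch (not Postman's actual resolver, which also handles variable scopes and dynamic variables) that replaces {{VARIABLE_NAME}} placeholders from a simple key/value store:

```javascript
// Minimal sketch of {{VARIABLE_NAME}} resolution against a key/value
// store. Placeholders without a matching variable are left untouched.
function resolveVariables(template, variables) {
  return template.replace(/\{\{(\w+)\}\}/g, function (match, name) {
    return Object.prototype.hasOwnProperty.call(variables, name)
      ? variables[name]
      : match;
  });
}

var url = resolveVariables("http://{{host}}:{{port}}/person/get/{{userId}}", {
  host: "localhost",
  port: "9000",
  userId: "3"
});
console.log(url); // http://localhost:9000/person/get/3
```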


Pre-Request Script

Postman allows users to do some JavaScript coding with which to manipulate the data being sent with the request. One such example is when testing an API with security, as explained in the How to implement secure REST API authentication over HTTP post – a SHA256 hash (built from apiKey + secretKey + timestamp in seconds) is sent as a request parameter with the request. Calculating the SHA256 hash is done with the following pre-request script, and the result is then stored as a global variable token.

var timeInSeconds = parseInt((new Date()).getTime() / 1000);
var sigString = postman.getGlobalVariable("apiKey") +
	postman.getGlobalVariable("secretKey") + timeInSeconds;
var token = CryptoJS.SHA256(sigString);
postman.setGlobalVariable("token", token);

Here the CryptoJS library is used to create a SHA256 hash. All libraries available in Postman are described in the Postman Sandbox page. The global variable {{token}} is then sent as the token request parameter.



The code shown above works fine with just one set of credentials, because they are stored as global variables. If you need to switch between different credentials, this is where environments come into play. By switching the environment, and with no change in the request, you can send different parameters to the API. Environments are managed from the Settings icon in the top right corner, which opens a menu with a “Manage Environments” link.


Postman supports so-called shared environments, which means the whole team can use one and the same credentials, managed centrally. It requires signing in and some paid plan though, but might be a good investment in the long run.

In order to use environments, the pre-request script has to be changed to:

var timeInSeconds = parseInt((new Date()).getTime() / 1000);
var sigString = environment.apiKey + environment.secretKey + timeInSeconds;
var token = CryptoJS.SHA256(sigString);
postman.setEnvironmentVariable("token", token);

Both apiKey and secretKey are read from the environment. The environment can be changed from the top right corner.

Nota bene: There is a specific behavior (I would not call it a bug, as it makes sense) in Postman. If you select “No Environment” and fire the request above, Postman will automatically create an environment with the name “No Environment”. This is because it actually needs an environment to store a variable into. This might be very confusing the first time.

Post-Request Script

There is no such term defined in Postman. The idea is that in many cases you will need to do something with the response and extract a variable from it in order to use it at a later stage. This can be done in the “Tests” tab. The example given below fetches all persons with an API call, then processes the response and selects one id at random, which is stored as a global variable and then used in the next request. You can put whatever JavaScript code you like in order to fulfill your logic.

var jsonData = JSON.parse(responseBody);
var size = jsonData.length;
var index = parseInt(Math.random() * size);
postman.setGlobalVariable("userId", jsonData[index].id);


Then, in the subsequent request, you can use a GET call to URL: http://localhost:9000/person/get/{{userId}}
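Outside Postman, the random-selection logic above can be exercised with plain JavaScript. This sketch uses a hard-coded response body with hypothetical person records instead of a live API call:

```javascript
// Sketch of the random-id selection with a hard-coded response body;
// in Postman, responseBody would come from the actual API response.
var responseBody = JSON.stringify([
  { id: 1, name: "FN1 LN1" },
  { id: 2, name: "FN2 LN2" },
  { id: 3, name: "FN3 LN3" }
]);

var jsonData = JSON.parse(responseBody);
// Math.floor does the same job as parseInt in the script above
var index = Math.floor(Math.random() * jsonData.length);
var userId = jsonData[index].id;
// In Postman: postman.setGlobalVariable("userId", userId);
console.log("Selected userId: " + userId);
```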


After a response is received, Postman has functionality to make verifications on it. This is done in the “Tests” tab. Below is an example of different verifications. The most interesting part is that, in the case of a JSON response, it can be parsed to an array and then elements can be accessed by index and value, e.g. jsonData[0].id, or even iterated as shown below. The format is: tests["TEST_NAME"] = BOOLEAN_CONDITION.

tests["Status code is 200"] = responseCode.code === 200;

tests["Response time is less than 200ms"] = responseTime < 200;

var expected = ""
tests["Body contains string: " + expected] = responseBody.has(expected);

var jsonData = JSON.parse(responseBody);
var expectedCount = 4
tests["Response count is: " + expectedCount] = jsonData.length === expectedCount;

for(var i=1; i<=expectedCount; i++) {
	tests["Verify id is: " + i] = jsonData[i-1].id === i;
}



Nota bene: if you use the responseTime verification, you have to know that it measures just the TTFB (time to first byte); it does not measure the time needed to transfer the data. If you have an API with big responses or the network is slow, you may fire the request, wait a lot, and then Postman shows a very small response time, which might be confusing.

Run from command line

In order to run Postman tests in the command line as part of some CI process, there is a separate tool called Newman. It requires NodeJS to be installed and runs in a NodeJS environment. It is very well described in How to write powerful automated API tests with Postman, Newman and Jenkins.

Code reuse between requests

It is very convenient for some piece of code to be re-used between requests, to prevent copy/pasting it. Postman does not yet support code re-use between requests. The good thing is that there is a workaround. It is possible to do it by defining a helper function with verifications, which is saved as a global variable in the first request of your test scenario:

postman.setGlobalVariable("loadHelpers", function loadHelpers() {
	let helpers = {};

	helpers.verifyCount = function verifyCount(expectedCount) {
		var jsonData = JSON.parse(responseBody);
		tests["Response count is: " + expectedCount]
			= jsonData.length === expectedCount;
	};

	// ...additional helpers

	return helpers;
} + '; loadHelpers();');

Then, from other requests, the helpers are taken from the global variables and the verification functions can be used:

var helpers = eval(globals.loadHelpers);

See more in Reusing pre-request scripts across requests in a collection issue thread.
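Stripped of the Postman specifics, this trick is just serializing a function into a string and eval-ing it back later. A self-contained sketch, where a plain globals object stands in for Postman's global variable store and the helper takes the parsed data as an argument:

```javascript
// Self-contained sketch of the code-reuse workaround: the first
// "request" stores a helper factory as a string, a later "request"
// evals it back into live functions. The globals object stands in
// for Postman's global variable store.
var globals = {};

// First request: a function concatenated with a string converts the
// function to its source code, with a trailing call appended.
globals.loadHelpers = function loadHelpers() {
  var helpers = {};
  helpers.verifyCount = function (jsonData, expectedCount) {
    return jsonData.length === expectedCount;
  };
  return helpers;
} + '; loadHelpers();';

// Later request: rebuild the helpers and use them.
var helpers = eval(globals.loadHelpers);
console.log(helpers.verifyCount([1, 2, 3], 3)); // true
```

The eval call defines loadHelpers and immediately invokes it, so its return value (the helpers object) becomes the result of the eval expression.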


Postman is a very nice tool to use when developing your API or testing it manually. I would definitely recommend Postman; I use it on a daily basis for probing the API. For serious API functional test automation, I would say Postman is not ready yet and you’d better go for another approach. The good thing is that there is a big community around it, which is growing, and new features are being added.

Related Posts