Web API security using JSON web tokens


Today, data security during financial transactions is critically important. Protecting sensitive user data should be a major priority for developers working on applications that handle clients' financial or personal information.

These days, many apps are accessed through multiple devices, including desktops, laptops, mobile phones and tablets. Both web apps and native apps can use web APIs to access data and provide services. This article addresses how to secure a web API during the development phase. I will share my experience of using JSON Web Tokens (JWT) to secure a representational state transfer (REST) web API.

There are two simpler alternatives to JWT that I will briefly mention first:

  1. Basic authentication:

    This method is very easy to implement. A username and password are passed and validated against a database to identify legitimate users. Since the username and password are sent as plain text, every request exposes the credentials to interception. Security can be improved somewhat by passing the details in the headers section of the web API request instead of the URL; nevertheless, this method is not very secure, as it does not involve any encryption.

  2. API keys:

    This technique is used to overcome the drawbacks of basic authentication. In this method, a unique key is assigned every time the user signs in, indicating that the user is known, and the user can use the same key to re-enter the system. The security issue with this method is that the key can easily be picked up during network transmission. Often, the key is passed as a query string in the URL, making it easier for someone to compromise the security of the web API.
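For context on why basic authentication offers no real secrecy: the credentials are merely base64-encoded, which anyone can reverse. A quick illustration (the username and password here are made up):

```javascript
// Basic authentication merely base64-encodes "username:password".
// Base64 is an encoding, not encryption -- it is trivially reversible.
const credentials = Buffer.from("alice:s3cret").toString("base64");
console.log("Authorization: Basic " + credentials);

// Anyone who intercepts the header can decode it straight back:
console.log(Buffer.from(credentials, "base64").toString()); // alice:s3cret
```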

JWT avoids the security flaws of the two simpler methods by providing bearer-token authentication for the web API. With this method, the username and password are first validated to confirm that the user exists in the system. Information about the validated user, such as name, email address and user ID, can then be fetched. These items are included in the 'claims'. Claims are pieces of information about a user that have been packaged and signed into a security token.

A JWT consists of three parts: the header, the payload and the signature.

Header – Contains the type of token and signing algorithm used

Payload – Contains the issuer of the claim, the subject of the claim and the audience, which refers to the intended recipient of the claim. Other information can also be included, such as an expiry time of the token, or additional user information.

Signature – A cryptographic signature computed over the encoded header and encoded payload using a secret key


To give you more details about JWT implementation, I'll be going through the steps I took to implement JWT in my web API. First I created a web API project in .NET Core 2.2. Next I installed two packages via the NuGet Package Manager Console in Visual Studio, using the following commands:

  • Install-Package System.IdentityModel.Tokens.Jwt -Version 5.6.0
  • Install-Package Microsoft.AspNetCore.Authentication.JwtBearer -Version 3.1.0

In the appsettings.json file, I added my JWT keys, including the secret key, issuer, subject and audience, as follows:

[Screenshot: JWT keys in appsettings.json]
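For reference, that appsettings.json section might look like the following (the key names and values here are illustrative, not the originals):

```json
{
  "Jwt": {
    "Key": "ThisIsASampleSecretKeyDoNotUseInProduction",
    "Issuer": "https://example.com",
    "Subject": "JWTServiceAccessToken",
    "Audience": "https://example.com"
  }
}
```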

Next, I registered a JWT authentication scheme by using the AddAuthentication method and specifying JwtBearerDefaults.AuthenticationScheme in the ConfigureServices method of the Startup class.

[Screenshot: JWT authentication scheme registration in ConfigureServices]

I also added app.UseAuthentication() in the Configure method of the Startup class.
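Putting those two Startup pieces together, the registration might look roughly like this. This is a sketch, not the original code; the "Jwt:*" configuration key names are assumptions.

```csharp
// Startup.cs -- sketch of JWT bearer registration (ASP.NET Core 3.1 style)
public void ConfigureServices(IServiceCollection services)
{
    services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
        .AddJwtBearer(options =>
        {
            options.TokenValidationParameters = new TokenValidationParameters
            {
                ValidateIssuer = true,
                ValidateAudience = true,
                ValidateLifetime = true,
                ValidateIssuerSigningKey = true,
                ValidIssuer = Configuration["Jwt:Issuer"],
                ValidAudience = Configuration["Jwt:Audience"],
                IssuerSigningKey = new SymmetricSecurityKey(
                    Encoding.UTF8.GetBytes(Configuration["Jwt:Key"]))
            };
        });
    services.AddControllers();
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseRouting();
    app.UseAuthentication();   // must come before UseAuthorization
    app.UseAuthorization();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}
```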


Next, I created a token controller in the web API. The token controller action GetApiToken took two input parameters, Username and Password, and validated these details against the database. Once the user was validated, I generated a token using the secret key, the claims information and the signing credentials.
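The token-generation step might look roughly like this (a sketch using System.IdentityModel.Tokens.Jwt; the "Jwt:*" configuration keys and the particular claim set are assumptions, not the original code):

```csharp
// Inside GetApiToken, after the username/password check against the database.
var claims = new[]
{
    new Claim(JwtRegisteredClaimNames.Sub, _config["Jwt:Subject"]),
    new Claim(JwtRegisteredClaimNames.Jti, Guid.NewGuid().ToString()),
    new Claim("UserId", user.Id.ToString()),   // hypothetical user fields
    new Claim("Email", user.Email)
};

var key = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(_config["Jwt:Key"]));
var credentials = new SigningCredentials(key, SecurityAlgorithms.HmacSha256);

var token = new JwtSecurityToken(
    issuer: _config["Jwt:Issuer"],
    audience: _config["Jwt:Audience"],
    claims: claims,
    expires: DateTime.UtcNow.AddMinutes(20),   // token lifetime
    signingCredentials: credentials);

return Ok(new JwtSecurityTokenHandler().WriteToken(token));
```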


The generated token was then stored as an item in sessionStorage.

For all my web API requests, I added the following key to the header section of each Ajax call.
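On the client side, the pattern looks roughly like this (a jQuery-style sketch; the endpoint URL and the "apiToken" storage key name are assumptions):

```javascript
// After a successful call to the token controller, keep the token for the session.
sessionStorage.setItem("apiToken", response.token);

// Attach the bearer token to every subsequent Ajax web API request.
$.ajax({
  url: "/api/orders",                // hypothetical endpoint
  type: "GET",
  headers: {
    "Authorization": "Bearer " + sessionStorage.getItem("apiToken")
  },
  success: function (data) { /* render the data */ }
});

// On logout, discard the token so it cannot be reused.
sessionStorage.removeItem("apiToken");
```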


Finally, I applied the [Authorize] attribute to the web API controller that the client was calling.


These were all the steps required to implement JWT authentication in my web API. The tokens are cryptographically signed, so they are difficult to tamper with, and they expire after a set interval.


The final implementation step is to remove the generated token from sessionStorage when a user logs out of the system.


MetaSys has extensive expertise in building secure web APIs for web applications. Our team has experience in building custom software solutions for clients across different industry verticals. Please feel free to contact us if you are in need of a partner to build a secure web API.  For more info, visit our website: https://www.metasyssoftware.com/dot-net.

Improve Performance of Web Applications

We all know how frustrating it is to see the progress spinner going on and on while navigating through a web app. It is due to these performance issues that users lose interest in a web application, which can hinder the success of the app. Improving performance is an important task for any app developer, and there are many commercial tools available that can be useful. In this article I will share my experience and opinions on two commercially available tools: ANTS Performance Profiler and New Relic.

ANTS Profiler

ANTS Profiler can be used on any .Net application, whether web-based or Windows-based, to identify slow processes. We have found that it is useful both at the development stage and the QA stage. Using the tool involves starting up ANTS Profiler and navigating through the app to view the pages with slow performance. Once the profiling is complete, we can dive deeper into the processes that the profiler identifies as slow.

To give you an idea, here are some examples of performance issues we were able to identify and address:

  1. The first step we took when analyzing a slow app using the ANTS tool, was to check the database calls. The profiling showed that certain stored procedures were taking a lot of time. The problem was addressed by rearranging joins used in the stored procedures, and selecting only those columns necessary for the output. This significantly improved the rendering time for the page.
  2. The profiling also showed that in a single page, there were multiple stored procedure calls which were increasing the database hits and slowing down the app. To overcome this problem, we combined multiple stored procedures into one, which improved the page performance.
  3. It was further identified that whilst rendering the page, multiple JavaScript and CSS files were getting loaded, making rendering very slow. ANTS profiling helped identify the slowest web requests. This allowed us to use bundling to group files together and reduce the number of web requests, thereby improving the speed of rendering.
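As an aside on the bundling step: one common approach in ASP.NET Core is a bundleconfig.json consumed by the BuildBundlerMinifier package. The file names below are illustrative, not from the project described above:

```json
[
  {
    "outputFileName": "wwwroot/js/site.min.js",
    "inputFiles": [
      "wwwroot/js/plugins.js",
      "wwwroot/js/site.js"
    ],
    "minify": { "enabled": true }
  },
  {
    "outputFileName": "wwwroot/css/site.min.css",
    "inputFiles": [ "wwwroot/css/site.css" ]
  }
]
```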

New Relic

New Relic is another commercial tool which can be used to analyze performance once an app has already been launched. It provides real-time data of user experience with the application in the production environment, which is extremely useful to optimise the process of improving application performance.

To give you an idea of how New Relic can be used, below are some insights we gained from New Relic when trying to improve a customized web application.

[Screenshot: New Relic dashboard]

  1. The data showed us that as the number of users increased, page rendering time increased as well. The tool also gave us a lot of insight into how the application used CPU and memory.
  2. The data also showed us which pages are most viewed, allowing us to focus on improving the performance of these pages. It also showed us on which pages errors were most frequently encountered, as well as the time taken for web requests and database calls on these pages. These were then fine-tuned to significantly minimize the frequency of errors.
  3. The tool also gave us analytics about the browsers and devices most frequently used to access the web application. This information helped us focus implementation efforts on improving user-friendliness on those browsers and devices.

Improving app performance in terms of speed and user-friendliness improves the user experience and can thereby significantly increase web traffic. Although working on application performance can be a pain, ignoring it is not advisable for any developer. These tools are helpful at different stages: ANTS Profiler is most useful in the development and QA environments, whereas New Relic is most useful in the production environment for analysing real user data.

MetaSys has extensive expertise in improving application performance, including the use of different tools, some of which have been described in this article. Feel free to contact us for help with improving your application performance. For more info, visit https://www.metasyssoftware.com/dot-net


Integrating InBody biometric and blood pressure data into a web application

People today are more health-conscious than ever before, and digital technology is playing an important role in this development. Thanks to modern technology, there are many tools and devices to measure and record physical characteristics related to personal health. Tracking exercise routines and nutrition has become a popular way for individuals to maintain a healthy lifestyle. Many health-related online platforms, applications and tools are available to individuals, and integrating such tracking devices can improve their service.

This article details such an integration project. Our Dot Net application team helped a client integrate InBody technology into their application. 

InBody devices offer detailed measurement of body composition, including key health factors such as protein, minerals, BMI and body fat percentage. Our project revolved around integrating the InBody 570 device with a .Net application. The solution we implemented involved the .Net application listening on the serial ports attached to the device for the duration of the test. After receiving a data stream from the InBody device, the relevant data is extracted using the indices of specific data factors given in the InBody 570 technical documentation. Key settings required by the .Net application include baud rate, parity, data bits and stop bits; the InBody technical specification provides the values for these settings. Once the data stream is received and the data factors are extracted, the data is saved in a SQL Server database. The .Net web application then reads this data from the database and displays it to the user.
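The listening step can be sketched with System.IO.Ports. The port name and setting values below are placeholders; the real values come from the InBody 570 technical documentation:

```csharp
using System.IO.Ports;

var port = new SerialPort("COM3")      // port name varies per machine
{
    BaudRate = 9600,                   // per the device's technical specification
    Parity   = Parity.None,
    DataBits = 8,
    StopBits = StopBits.One
};

port.DataReceived += (sender, e) =>
{
    string stream = port.ReadExisting();
    // Extract the relevant data factors by the indices given in the
    // InBody 570 documentation, then save them to the SQL Server database.
};

port.Open();    // opening the port locks it for this process
// ... listen for the duration of the test ...
port.Close();   // release the lock so other processes can use the port
```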

Key Challenges

The application is intended to be used by fitness facilities that subscribe to it and have an InBody machine. It runs on a laptop or desktop computer connected to the InBody machine. One challenge, however, was that there is no guarantee the computer is always connected through the same port. Our solution was to program the app to lock and keep listening on all available open ports, and as soon as either a data stream or an exception is received, to unlock all ports locked by the .Net application. It is important to note that only the process that locked a serial port can unlock it; no other process can forcibly remove the lock. If the process that locked the port is terminated, the only way to unlock that port is a system reboot.
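The all-ports workaround described above might be sketched like this (hypothetical handler name, error handling trimmed):

```csharp
// Lock and listen on every serial port present, since the device's port
// is not known in advance; release them all once data or an exception arrives.
var openPorts = new List<SerialPort>();
try
{
    foreach (var name in SerialPort.GetPortNames())
    {
        var p = new SerialPort(name) { BaudRate = 9600 };  // settings per device spec
        p.DataReceived += OnDeviceData;                    // hypothetical handler
        p.Open();                                          // locks this port
        openPorts.Add(p);
    }
    // ... wait until OnDeviceData fires or an exception is raised ...
}
finally
{
    // Only the locking process can release these ports, so always clean up.
    foreach (var p in openPorts)
        if (p.IsOpen) p.Close();
}
```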

Another requirement was that the same app integrate with a second measurement device manufactured by InBody, the BPBIO 320, which reads systolic blood pressure, diastolic blood pressure and pulse. We modified the .Net app to work with both InBody devices, using condition-based code that looks at the application settings and accordingly saves the data in either the production or the staging environment. The BPBIO 320 requires a different baud rate from the InBody 570, so we adapted the application so that the user selects the test type and the software sets the baud rate automatically. We also process status information contained in the data stream, such as "Measurement started", "Measurement interrupted due to use of the stop button" and "Measurement interrupted due to error". These cases are handled in the application, and clear information is passed on to the user.

We used PuTTY to simulate the device stream in the development environment, as the physical device was not always available. Device testing was performed after development was complete, and we made the necessary changes to the application before launch.

We will be glad to help anyone interested in similar integrations in their software. For more info, visit https://www.metasyssoftware.com/case-study-dotnet