Azure Space in its Entirety

The space community is expanding quickly, and innovation is lowering the barriers to entry for both public and private-sector organisations. Microsoft’s goal is to make space networking and computing more accessible to businesses, especially in industries such as agriculture, energy, telecommunications, and government.

For this purpose, Microsoft is working with SpaceX to provide satellite-powered Internet connectivity on Azure. This collaboration will help Microsoft deliver cloud services to even the remotest regions of the planet. With Azure Space, Microsoft offers one of its most ambitious cloud services for businesses.

What is Azure Space?

Azure Space extends Azure capabilities anywhere in the world with Space infrastructure.

It is a cloud computing service from Microsoft for the space community. Its products and services are built on key partnerships and designed to meet that community’s needs. This technology connects people with Azure services through space satellite technology.

Azure Space will link Azure’s existing cloud platform with new data centres and a network of privately owned satellites to supply high-speed Internet across the globe. It is particularly aimed at businesses that are located in remote corners of the world. This will help billions of people worldwide to stay connected.

Benefits of Azure Space:

  • Connectivity to and from remote locations
  • Availability and flexibility of Hyperscale Services
  • Improvement in the quality of satellite imagery using AI (Artificial Intelligence) and the cloud
  • High-speed Internet connections
  • Ability to see through clouds using SAR (synthetic aperture radar) and SpaceEye
  • Management of large amounts of data generated by satellites

Azure Space provides:

  • Global Connectivity: It can connect data from anywhere in the world.
  • Artificial Intelligence for Space: It can collect huge amounts of raw space data and turn them into usable information using artificial intelligence.
  • Digital Engineering: It will help the space community innovate with speed and mission assurance.
  • Azure in Space: It will also help create virtually unlimited cloud endpoints on the planet to connect people from anywhere around the world.

Azure Orbital

Businesses can make use of the Azure Orbital Emulator, an impressive feature of Azure Space. This service provides Artificial Intelligence-driven simulation environments for preparing space missions, helping teams rehearse operations and avoid costly mistakes.

Applications of Azure Orbital

  • Application development
  • AI data and analytics
  • Hybrid cloud and infrastructure
  • Internet of Things: connect, monitor, and control devices with secure, scalable, and open edge-to-cloud solutions
  • Security and governance

How does Azure Orbital work with other Azure products and services?

Azure Orbital complements other Azure services to make it easy to process and store data from satellites either in the local region of each Azure Orbital antenna or in another Azure region (using the Azure Global backbone).

Azure Modular Datacenter

An Azure Modular Datacenter, or MDC, is a portable, field-deployable Azure data centre unit that can be used from any location. It is intended for customers who need cloud computing capabilities in hybrid environments, including remote locations.

The Azure Modular Datacenter provides reliability and security, even in executing the most critical applications. It provides Azure processing power and storage.

Azure Modular Datacenter gives customers the ability to migrate their apps to Azure while running these workloads on-premises with low-latency connections to their own data centre.

Azure Services from MetaSys

MetaSys Software, a software services firm headquartered in India, offers solutions built with DotNet, FileMaker, iOS, PHP, and React. Our Microsoft Azure-certified experts provide Azure consulting services. They can assist with Azure services in areas such as new product/solution development, application modernization, application migration, Windows integration, and Azure web application development. For more details, click here.

Guide to Avoid Navigation Rework in FileMaker

Have you ever faced the turmoil of adding a new module to all layouts after creating a standard menu navigation?

Well, we all know how bothersome that can be! Here is an easy solution, read along…

Undoubtedly, navigation is an integral part of every application. It plays a pivotal role when a user needs to navigate to different modules and sections.

Static navigation is rigid: it demands rework across all modules even when a single module is updated or added. Planning the end-to-end navigation before application development would be the perfect solution; however, in practice it is impossible to predict all modules and build the navigation that far ahead of time. The second-best solution is to set up a smart, extensible set of menu items for the future.

Let’s take a look at two elaborate approaches to implement a sidebar navigation!

Approach 1: Using Fixed Left Navigation

Contacts

In this approach, a button bar is added on the left-hand side of the layout in advance. Each button label is a calculation that refers to a dynamic variable, and a common script defines every button’s action. By default, this navigation bar is visible on all layouts.

The downside of this approach is that it works for Form views but not for List views. Additionally, it demands more real estate in the application layout, which is another drawback and leads us to Approach 2.

Approach 2: Hamburger Menu Using Popover

Hamburger menu with popover

 

In this approach, a hamburger menu icon is placed in the header of the layout. Clicking it expands or collapses the popover. The popover contains a button bar whose implementation is similar to Approach 1, the only difference being that the buttons are placed on the popover. This approach works well for Form view, List view, and Table view.

Contact list

Let us take a look at the Application Implementation Steps in detail.

 Step 1 – Opening Script

Create an Opening script or a Startup script to set the global variable values that are used as calculations to define the labels for each button segment in the button bar.

To add navigation for a new module, simply update the same global variables.

Opening Script

 

Contacts 2

 

Step 2 – Specify Active button

The Sidebar_Values variable defined in the Opening script stores the list of table names, i.e. the table names of the main navigation modules. It is then used to define the active button bar segment of the selected module, as shown below.

Specify Active Button

 

Step 3 – Main Navigation Script:

Create another script, which is the main script used for navigation between modules. This script is attached to every segment of the button bar, with a script parameter specifying the module to navigate to.

If the user is currently on the Details view, the application redirects to the Details view of the destination module. Similarly, if the user is in List view, the application redirects to the List view of the destination module.

Each layout has dedicated icons to help navigate specifically to the Details view or the List view. For instance, if the user is in Details view, clicking the relevant icon navigates to List view, and vice versa.

Main Navigation Script

 

This script checks whether the user is on the Details or List view and stores the result in a variable. This variable is appended to the script parameter to derive the layout the user wants to navigate to.

Step 4 – Naming convention:

Each layout requires a specific naming convention. In this case, it is <<TableName>>_<<ViewType>>.

Here the ViewType can be DetailsView, ListView or TableView. Let’s take a look at some of these examples.

  1. Contact details layout will be ‘Contacts_DetailsView’
  2. Contact list layout will be ‘Contacts_ListView’
  3. Contact table view layout will be ‘Contacts_TableView’

By following Approach 2, adding a new module is a matter of a few simple steps, and the application is as good as new.

Hope you find this article insightful. Feel free to leave a comment or post any challenges that you may have.

Our FileMaker development experience includes handling complex project management systems, e-commerce shopping sites, cruise booking systems and more.

A Developer’s Tale: Using Microsoft technologies to integrate the Myzone device with the Genavix application

It is no secret that digital technology is slowly shaping the future of the healthcare industry. The use of fitness tracking devices has grown rapidly, the philosophy being ‘that which gets measured can be improved’. The MetaSys team has worked on integrating tracking devices with fitness and nutrition applications. This article describes one such achievement, namely the integration of Myzone into our client’s application.

Our client: Genavix is a venture in the preventive health care services industry. It offers members comprehensive health and fitness solutions to reduce health risks and improve wellness. It provides various fitness and nutrition plans that help associates follow a healthy lifestyle.

About Myzone: Myzone is an innovative wearable heart rate based system. It uses wireless and cloud technology to monitor physical activity. Myzone allows the user to view their fitness efforts live on their phone, such as how many calories they are burning. Users are also able to make connections and compete with other individuals in the network. Based on their fitness efforts, users are awarded MEPs (Myzone Effort Points), which they can use to challenge others on the platform.

We at MetaSys were given the task of integrating the Myzone system with our client’s application. Our solution used the fact that each Myzone device has a unique belt number, which is registered with a particular wellness facility. Users are able to submit their Myzone belt number, after which a unique ID is assigned to their profile. The client system can then track all Myzone activities.

One of the challenges we faced in the integration was that the data obtained from the Myzone API is only available for the last three months. As a long-term health app, our client tracks fitness for a much longer duration. To resolve this issue, we decided to import the data every night via a scheduled task, which imports data from the Myzone API into the application’s SQL tables.

Each wellness facility with registered Myzone belts has a unique API key. Using this API key and the Myzone Cloudbase API 2.0.3, a web request is made to retrieve each user’s Myzone data for the previous three months. The data is received in JSON (JavaScript Object Notation), interpreted in ASP.NET, and imported into the application’s SQL tables. The imported data is used to build a user fitness diary that combines the Myzone data with data entered manually in the application by the user. The detailed fitness tracking, such as calorie tracking, helps every user achieve their daily health and fitness goals.
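To make the flow concrete, below is a minimal C# sketch of what such a nightly import job could look like. The endpoint URL, JSON field names, and table/column names are illustrative assumptions only; they do not reflect the actual Myzone Cloudbase API contract or the client’s schema.

```csharp
// Hedged sketch of a nightly Myzone import job (run via a scheduled task).
// The endpoint URL, JSON field names and table/column names are assumptions.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class MyzoneActivity
{
    public string Belt { get; set; }      // assumed field: belt number
    public DateTime Date { get; set; }    // assumed field: activity date
    public int Meps { get; set; }         // assumed field: Myzone Effort Points
    public int Calories { get; set; }     // assumed field: calories burned
}

public static class MyzoneImportJob
{
    public static async Task RunAsync(string apiKey, string connectionString)
    {
        using (var http = new HttpClient())
        using (var conn = new SqlConnection(connectionString))
        {
            // Hypothetical request; the real API URL and parameters differ.
            var json = await http.GetStringAsync(
                $"https://api.example-myzone.com/v2/activities?apiKey={apiKey}&months=3");
            var activities = JsonConvert.DeserializeObject<List<MyzoneActivity>>(json);

            await conn.OpenAsync();
            foreach (var a in activities)
            {
                // Insert only new rows so history is kept beyond the three-month API window.
                var cmd = new SqlCommand(
                    @"IF NOT EXISTS (SELECT 1 FROM MyzoneActivity WHERE Belt = @Belt AND Date = @Date)
                      INSERT INTO MyzoneActivity (Belt, Date, Meps, Calories)
                      VALUES (@Belt, @Date, @Meps, @Calories)", conn);
                cmd.Parameters.AddWithValue("@Belt", a.Belt);
                cmd.Parameters.AddWithValue("@Date", a.Date);
                cmd.Parameters.AddWithValue("@Meps", a.Meps);
                cmd.Parameters.AddWithValue("@Calories", a.Calories);
                await cmd.ExecuteNonQueryAsync();
            }
        }
    }
}
```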

If you wish to collaborate on similar projects, feel free to contact us. We have extensive experience in integrating devices into web applications using technologies such as the ASP.NET Framework, JavaScript, jQuery, and Microsoft SQL Server. For more details, please visit – https://www.metasyssoftware.com/dot-net

 

Barcode Scanning for a web based application

In this article I will share some information about a recent barcode scanning implementation we did for a web based application for one of our clients.

Barcodes are nothing more than a machine-readable form of data represented as lines. Nowadays, barcodes are an essential part of inventory management for a number of reasons. Firstly, they save time, both in data entry and in the automatic processing of entries. Secondly, entry errors are reduced, as the barcode scanning process has a very low error rate. Finally, barcodes help companies track a product across the entire production pipeline. Even after the product is shipped out, the company can track it throughout its entire lifecycle.

Recently, we worked on a project for a client who wished to include barcode scanning capability in a personal health tracking software application. The required functionality was that the end user could scan various food items and store the data in the application’s web portal. This would allow users to record their daily food intake conveniently, without spending much time entering the data.

The first step in the implementation was a data import of standard food item barcodes, which we imported from an available data library. This gave us over 200,000 records of day-to-day food items of popular brands.

Since the users don’t typically own barcode readers, we required a solution that allowed the users to scan the barcodes using their personal electronic devices. Since most people carry mobile phones with a camera, we started looking into the option of using phone cameras as barcode readers.

Since we had a web-based application, it was preferable for us to use a client-side code library or plug-in. After evaluating a few possible options, we decided to use ‘QuaggaJS’, which is an advanced JavaScript-based barcode reader. ‘QuaggaJS’ can read various types of barcodes such as EAN, CODE 128, CODE 39, EAN 8, UPC-A, UPC-C, I2of5, 2of5, CODE 93 and CODABAR.

‘QuaggaJS’ implements the following steps:

  1. Read the image and convert it into a binary representation
  2. Find the location and rotation of barcode
  3. Decode the barcode

We wanted to allow users to scan barcodes using their laptops as well as their mobile phones. We set specific benchmarks for camera resolution; if a user’s laptop or mobile camera met those benchmarks, they could scan the barcode. We also needed an alternative solution for users whose older mobile phones did not have cameras that met the benchmark. We decided to let the user choose any of three options to enter a food item on the portal:

  1. Live scan: using the mobile camera to scan the barcode
  2. File upload: upload an image of the barcode on the portal
  3. Manual entry: enter the barcode numerically

After entering the barcode, the user can look up information about the item if it is in the library, including valuable details such as calories, portion sizes, and nutritional content. Our goal was to make food tracking in the application very user-friendly, and using barcode scanning we managed to provide the user with a very quick and easy way to track packaged foods.

Feel free to contact us if you are interested in a similar implementation for your application.

Device and Browser Testing Strategies

Testing without proper planning can cause major problems for an app release, as it can result in compromised software quality and increased total cost. Defining and following a suitable and thorough testing procedure is a very important part of the development process and should be considered from the very beginning. Time should be specifically allocated to manual testing on devices and browsers, as this is a low-cost strategy to significantly improve the quality of the app release. In this article, I will share some of the strategies we follow at MetaSys for real device and browser testing.

There are four points that we consider when defining our testing strategy.

  1. The first point is determining which devices and browsers will be used for testing. This is entirely dependent on the project requirements, and the development team analyses the application use cases to make the selection based on the following principles:
  • For web applications, we usually test on the three most commonly used browsers (Chrome, Firefox and Safari). If time allows for more extensive testing, we will also test on other browsers like Internet Explorer and Microsoft Edge.
  • For device testing of web applications, we choose the devices based on the functional requirements and priorities of the application. In other words, if a web application is expected to run especially well on a particular device, we focus the testing on the corresponding commonly used browsers at the appropriate resolution. For instance, for the Android platform we focus on Chrome and Firefox, whereas for the iOS platform we focus on Safari and Chrome.
  • For Native applications we directly test the application on the devices themselves, rather than using an emulator. This provides the most accurate feedback in terms of functionality and application performance.
  2. There are instances where the project timeline and/or budget limits the amount of testing that we can do. It is very important to identify these situations and to develop strategies that still deliver high-quality software to the client. At MetaSys we handle these cases by focusing on high-level general testing, which covers most of the UI and the functional parts of the application.
  3. For functional testing of web applications, we utilise automation as much as possible. For repetitive browser testing, we usually design automated test cases (a short sketch follows this list). Using automation not only saves the testers’ time, it is also very useful for retesting resolved issues. We use the Selenium WebDriver tool for automation testing, and Microsoft Team Foundation Server 2019 and the Microsoft Test Management tools for bug reporting and test case management.
  4. For web applications, we put a strong emphasis on performance in addition to the ‘look and feel’. The speed of the app is one of the most important factors determining the user experience. For performance testing we use the Apache JMeter and New Relic tools, which give very accurate results regarding application performance. The New Relic tool also provides analysis of database query-level problems, along with many more reports and real-time graphs. This helps significantly with troubleshooting and improving performance.
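As a hedged illustration of such an automated case, the C# sketch below drives a browser with Selenium WebDriver. The URL, element IDs and assertion are placeholders, not an actual MetaSys test case.

```csharp
// Minimal Selenium WebDriver sketch (C#, NUnit-style) for a repetitive browser check.
// The URL and element IDs below are placeholders, not from a real project.
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestFixture]
public class LoginPageTests
{
    private IWebDriver _driver;

    [SetUp]
    public void SetUp() => _driver = new ChromeDriver();

    [Test]
    public void LoginPage_ShowsError_ForInvalidCredentials()
    {
        _driver.Navigate().GoToUrl("https://example.com/login");   // placeholder URL
        _driver.FindElement(By.Id("username")).SendKeys("unknown@user.test");
        _driver.FindElement(By.Id("password")).SendKeys("wrong-password");
        _driver.FindElement(By.Id("loginButton")).Click();

        // The same check can be rerun unchanged against Chrome, Firefox or Edge drivers.
        Assert.IsTrue(_driver.FindElement(By.CssSelector(".validation-error")).Displayed);
    }

    [TearDown]
    public void TearDown() => _driver.Quit();
}
```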

At MetaSys, we have a team of experienced Dot Net developers who build solutions using Microsoft technologies. We have done web application development using ASP.NET Core, .NET and the ASP.NET Framework, Visual Studio, Microsoft SQL Server, MVC, Team Foundation Server, JavaScript and jQuery. For more info, visit https://www.metasyssoftware.com/dot-net

Web API security using JSON web tokens

 

Today, data security during financial transactions is critically important. The protection of sensitive user data should be a major priority for developers working on applications that use clients’ financial or personal information.

These days, many apps are accessed through multiple devices including desktops, laptops, mobile phones and tablets. Both web apps, and native apps can use web APIs for accessing data and providing services. This article addresses the topic of ensuring client security of a web API during the development phase. I will share my experience with using JSON web tokens (JWT) to ensure security of a representational state transfer (REST) web API.

There are two simpler alternatives to JWT that I will briefly mention first:

  1. Basic authentication:

    This method is very easy to implement. A username and password are passed and validated against a database to identify legitimate users. Since the username and password are sent as plain text, every request is vulnerable to interception and attacks such as cross-site request forgery (CSRF). The security can be improved somewhat by passing the details in the headers section of the web API instead of the URL; nevertheless, this method is not very secure as it does not involve any encryption.

  2. API keys:

    This technique is used to overcome the drawbacks of basic authentication. In this method, a unique key is assigned every time the user signs in, indicating that the user is known. The user can use the same key to re-enter the system. The security issue with this method is that the key can easily be picked up during network transmission. Often, the key is passed as a query string in the URL, making it easier for someone to compromise the security of the web API.

JWT avoids the security flaws of these two simpler methods by providing bearer token authentication for the Web API. With this method, the username and password are validated to confirm that the user exists in the system. Information about the validated user, such as name, email address and UserID, can then be fetched. These items are included in the ‘claims’. Claims are pieces of information about a user that have been packaged and signed into security tokens.

A JWT token consists of three parts, the header, the payload and the signature.

Header – Contains the type of token and signing algorithm used

Payload – Contains the issuer of the claim, the subject of the claim and the audience, which refers to the intended recipient of the claim. Other information can also be included, such as an expiry time of the token, or additional user information.

Signature – Created by signing the encoded header and encoded payload with a secret key

Implementation

To give you more details about JWT implementation, I’ll go through the steps I took to implement JWT in my web API. First, I created a web API project in .NET Core 2.2. Next, I installed two NuGet packages via the Package Manager Console in Visual Studio, using the following commands:

  • Install-Package System.IdentityModel.Tokens.Jwt -Version 5.6.0
  • Install-Package Microsoft.AspNetCore.Authentication.JwtBearer -Version 3.1.0

In the appsettings.json file, I added my JWT keys, including the secret key, issuer, subject and audience, as follows:

JWT keys

Next, I registered a JWT authentication schema by using the “AddAuthentication” method and specifying JwtBearerDefaults.AuthenticationScheme in the ConfigureServices method of the startup class.

JWT Authentication Schema

I also added app.UseAuthentication() in the Configure method of the startup class.

UseAuthenticationConfiguration
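As a rough illustration of these two steps (the screenshots above show the project’s actual configuration), a minimal startup sketch might look like the following. The “Jwt:Key”, “Jwt:Issuer” and “Jwt:Audience” configuration key names are assumptions for this example.

```csharp
// Startup.cs excerpt: a minimal sketch of registering the JWT bearer schema.
// The "Jwt:*" configuration key names are assumptions for this illustration.
using System.Text;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.IdentityModel.Tokens;

public partial class Startup
{
    public IConfiguration Configuration { get; set; }

    public void ConfigureServices(IServiceCollection services)
    {
        services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
            .AddJwtBearer(options =>
            {
                options.TokenValidationParameters = new TokenValidationParameters
                {
                    ValidateIssuer = true,
                    ValidateAudience = true,
                    ValidateIssuerSigningKey = true,
                    ValidIssuer = Configuration["Jwt:Issuer"],
                    ValidAudience = Configuration["Jwt:Audience"],
                    IssuerSigningKey = new SymmetricSecurityKey(
                        Encoding.UTF8.GetBytes(Configuration["Jwt:Key"]))
                };
            });
    }

    public void Configure(IApplicationBuilder app)
    {
        // UseAuthentication must run before the MVC/endpoint middleware.
        app.UseAuthentication();
    }
}
```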

Next, I created a token controller in the web API. The token controller action GetApiToken took two input parameters, Username and Password, and validated these details against the database. Once the user was validated, I generated a token using the secret key, claims information and signing credentials.

TokenControlerInfo
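For reference, here is a stripped-down sketch of such a token controller. The ValidateUser call, the AppUser class and the “Jwt:*” keys are placeholders standing in for the real implementation shown in the screenshot.

```csharp
// Sketch of a token controller issuing a JWT after validating the user.
// ValidateUser, AppUser and the "Jwt:*" keys are placeholders only.
using System;
using System.IdentityModel.Tokens.Jwt;
using System.Security.Claims;
using System.Text;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Configuration;
using Microsoft.IdentityModel.Tokens;

public class AppUser { public int Id { get; set; } public string Email { get; set; } }

[Route("api/[controller]")]
[ApiController]
public class TokenController : ControllerBase
{
    private readonly IConfiguration _config;
    public TokenController(IConfiguration config) => _config = config;

    [HttpGet("GetApiToken")]
    public IActionResult GetApiToken(string username, string password)
    {
        var user = ValidateUser(username, password);   // placeholder database check
        if (user == null) return Unauthorized();

        var claims = new[]
        {
            new Claim(JwtRegisteredClaimNames.Sub, _config["Jwt:Subject"]),
            new Claim(JwtRegisteredClaimNames.Jti, Guid.NewGuid().ToString()),
            new Claim("UserId", user.Id.ToString()),
            new Claim(ClaimTypes.Email, user.Email)
        };

        var key = new SymmetricSecurityKey(Encoding.UTF8.GetBytes(_config["Jwt:Key"]));
        var creds = new SigningCredentials(key, SecurityAlgorithms.HmacSha256);

        var token = new JwtSecurityToken(
            issuer: _config["Jwt:Issuer"],
            audience: _config["Jwt:Audience"],
            claims: claims,
            expires: DateTime.UtcNow.AddMinutes(60),
            signingCredentials: creds);

        return Ok(new { token = new JwtSecurityTokenHandler().WriteToken(token) });
    }

    // Placeholder; the real project validates credentials against the SQL database.
    private AppUser ValidateUser(string username, string password) => null;
}
```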

The generated token was then stored as an item in sessionStorage.

For all my web API requests, I used the following key in the header section of each Ajax web API call request.

AjaxCallWithBearerToken

Finally, I applied the [Authorize] attribute to the controller that I was calling through the web API.

AuthorizeAttribute
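In sketch form, the attribute simply decorates the protected controller (or individual actions); the controller name below is just an example.

```csharp
// Any request without a valid bearer token now receives a 401 response.
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;

[Authorize]
[Route("api/[controller]")]
[ApiController]
public class DiaryController : ControllerBase   // example controller name
{
    [HttpGet]
    public IActionResult Get() => Ok("Only reachable with a valid JWT.");
}
```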

These were all the steps required to implement JWT authentication in my Web API. The tokens are cryptographically signed, so they are difficult to tamper with, and they expire after a specified interval.

AjaxCallRequestHeaders

The final implementation step is to remove the generated token from sessionStorage when a user logs out of the system.

LogoutInfo

MetaSys has extensive expertise in building secure web APIs for web applications. Our team has experience in building custom software solutions for clients across different industry verticals. Please feel free to contact us if you need a partner to build a secure web API. For more info, visit our website: https://www.metasyssoftware.com/dot-net.

Improve Performance of Web Applications

We all know how frustrating it is to watch the progress spinner going on and on while navigating through a web app. It is due to these performance issues that users lose interest in a web application, which can hinder the success of the app. Improving performance is an important task for any app developer, and there are many commercial tools available that can be useful. In this article I will share my experience and opinions on two commercially available tools: ANTS Performance Profiler and New Relic.

ANTS Profiler

ANTS Profiler can be used on any .NET application, whether web-based or Windows-based, to identify slow processes. We have found it useful both at the development stage and at the QA stage. Using the tool involves starting up the ANTS Profiler and navigating through the app to view the pages with slow performance. Once the profiling is complete, we can dive deeper into the processes that the profiler identifies as slow.

To give you an idea, here are some examples of performance issues we were able to identify and address:

  1. The first step we took when analyzing a slow app using the ANTS tool was to check the database calls. The profiling showed that certain stored procedures were taking a lot of time. The problem was addressed by rearranging the joins used in the stored procedures and selecting only those columns necessary for the output. This significantly improved the rendering time for the page.
  2. The profiling also showed that a single page made multiple stored procedure calls, which increased the database hits and slowed down the app. To overcome this problem, we combined multiple stored procedures into one, which improved the page performance.
  3. It was further identified that, while rendering the page, multiple JavaScript and CSS files were being loaded, making rendering very slow. The ANTS profiling helped identify the slowest web requests. This allowed us to use bundling to group files together and reduce the number of web requests, thereby improving the speed of rendering (a sketch of a bundle configuration follows this list).
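As an illustration of the bundling mentioned in point 3, here is a hedged sketch of an ASP.NET MVC BundleConfig. The bundle names and file paths are examples only, not the client project’s actual files.

```csharp
// Sketch of ASP.NET MVC bundling: grouping scripts and styles reduces web requests.
// Bundle names and file paths are illustrative only.
using System.Web.Optimization;

public class BundleConfig
{
    public static void RegisterBundles(BundleCollection bundles)
    {
        bundles.Add(new ScriptBundle("~/bundles/site").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/site.js",
            "~/Scripts/validation.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/bootstrap.css",
            "~/Content/site.css"));

        // Forces bundling and minification even when debug="true" in web.config.
        BundleTable.EnableOptimizations = true;
    }
}
```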

New Relic

New Relic is another commercial tool, which can be used to analyze performance once an app has been launched. It provides real-time data on the user experience with the application in the production environment, which is extremely useful for optimising application performance.

To give you an idea of how New Relic can be used, below are some insights we gained from New Relic when trying to improve a customized web application.

New Relic

  1. The data showed us that as the number of users increases, the rendering time for pages increases as well. It also gave us a lot of insight into how the application uses CPU and memory.
  2. The data also showed us which pages are viewed most, allowing us to focus on improving the performance of those pages. It also showed us on which pages errors were encountered most frequently, as well as the time taken for web requests and database calls on those pages. These were then fine-tuned to significantly reduce the frequency of errors.
  3. The tool gives analytic information about the browsers and devices most frequently used to access the web application. This information helped us focus the implementation on improving user friendliness on those browsers and devices.

Improving app performance in terms of speed and user friendliness improves the user experience and thereby significantly increases web traffic. Although working on application performance can be a pain, ignoring it is not advisable for any developer. These tools can be very helpful at different stages: ANTS Profiler is most useful in the development environment and for QA, whereas New Relic is most useful in the production environment for analysing user data.

MetaSys has extensive expertise in improving application performance, including the use of the different tools described in this article. Feel free to contact us for help with improving your application’s performance. For more info: https://www.metasyssoftware.com/dot-net

 

How MetaSys handled performance issues related to Entity Framework

In building web applications for clients, two important factors we at MetaSys focus on are performance, and speed of development. Good performance is crucial for the success of any web application, as users expect pages and screens to load instantly. Users will quickly stop using slow programs in favour of other web or mobile applications. Development speed is also important, as clients currently expect rapid application development.

We have experienced difficulties in both these areas using the Entity Framework, and in this article, I will be describing the cause of the issues, and the solutions that we at MetaSys came up with. If you are having performance issues with Entity Framework, this article might provide some useful insight and suggestions.

Our issues with the Entity Framework:

Let me quickly brief you on how we started using Entity Framework. We started development of a .NET web application roughly 10 years ago. At the time, Entity Framework was a new concept introduced by Microsoft. Its purpose was to let developers write SQL queries, including calculations, more easily; it also simplified CRUD operations and mapped results into objects. We used Entity Framework for our web application, and during initial testing everything worked very well.

The performance issues arose after the client started using the application, particularly as the amount of data in the database grew. We used the ANTS Profiler tool to identify the root cause of the poor performance. It showed us that stored procedures executed quickly, without any significant delay, but that the Entity Framework code was taking a long time to render data on a page.

Another issue was that the SQL database for the application had more than 300 tables. Updating the Entity model after a change to any of the tables would take a very long time. It was also difficult to merge the changes of only one developer, or only one module, as doing so would update the entire Entity model. This made it a challenge to release the application module-wise.

MetaSys Approach:

To overcome the performance issues, we first tried changing some of the EDMX settings, and then updated Entity Framework to the latest version. Neither made much difference to the performance. In the meantime, the application’s database size and complexity kept growing as the application grew.

Eventually, we replaced Entity Framework with ADO.NET, and we immediately saw a significant improvement in performance. The difficulty we faced with the conversion was how to map the ADO.NET results into objects. We resolved this using the open-source Dapper ORM, a simple object-relational mapper. Like Entity Framework, it eases the handling of data as objects and supports multiple query results. This solution not only improved page loading times, but, since there was no longer any need to update the entity model, developers’ time was also reduced significantly.
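To give an idea of why this simplified the code, here is a minimal sketch of querying with ADO.NET plus Dapper. The connection string, table name and Contact class are hypothetical examples, not the application’s actual schema.

```csharp
// Minimal sketch of ADO.NET + Dapper replacing an Entity Framework query.
// Connection string, table name and Contact class are hypothetical examples.
using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class Contact
{
    public int ContactId { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class ContactRepository
{
    private const string ConnectionString = "Server=.;Database=AppDb;Trusted_Connection=True;";

    public IEnumerable<Contact> GetContactsByLastName(string lastName)
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            // Dapper maps each row of the result set onto a Contact object,
            // with no EDMX model of the 300+ tables to maintain.
            return connection.Query<Contact>(
                "SELECT ContactId, FirstName, LastName FROM Contacts WHERE LastName = @LastName",
                new { LastName = lastName });
        }
    }
}
```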

So far, we have found that using ADO.NET with the Dapper ORM solved all the problems we experienced with Entity Framework.

About Us:

Our team of Dot Net developers has more than two decades of experience building solutions with Microsoft technologies, from VB to the latest .NET Core applications. For more info: https://www.metasyssoftware.com/dot-net

 

 

How I cracked my MCSA Web Applications certification Exam

It has been a couple of months since I passed the MCSA certification exams in Web Applications, and in this article I will share my experience of preparing for and taking the exams. If you’re interested in getting MCSA certification, this article might give you an idea of how long you might need to prepare, and what the process of taking the exams is like. Keep in mind that Microsoft is continually updating the requirements for its certifications; to avoid disappointment, I would recommend checking their website frequently.

The story started almost a year ago when the CEO of our company and my project manager insisted that I get Microsoft certification. Two exams are required: Exam 70-483: Programming in C#, and Exam 70-486: Developing ASP.NET MVC Web Applications. I chose to tackle the C# exam (70-483) first, as I thought it might be the easier of the two. Nevertheless, the exam syllabus for C# was substantial, and I planned to take the exam after four months of preparation.

Preparing for the C# Exam

Even though I knew C# quite well, I started the exam preparation from the beginning by watching C# tutorial videos from https://www.youtube.com/watch?v=SXmVym6L8dw&list=PLAC325451207E3105

I watched all the videos to brush up my knowledge on C# basics, and found it very helpful.

Next, I read the exam guide that covers all the topics from the syllabus. This allowed me to get a grasp of the scope of the syllabus, and helped me identify other helpful resources. Two additional books that I found useful were ‘C# Fundamentals for Absolute Beginners’ and ‘Programming in C# Jump Start’. One of the best ways to thoroughly prepare for this exam is by going through the Microsoft documentation for C#.

https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/

I was lucky enough to have a lot of support at the office. During the time that I was preparing for the exam, I was working on relevant live projects at the office. These were MVC based projects with WEB API and Azure. Working on the project implementation allowed me to consolidate the theoretical concepts from the videos and books. Furthermore, the seniors in the team helped me by sharing their extensive practical experience. 

Taking the exam:

You can schedule the C# exam from this link: https://www.microsoft.com/en-us/learning/exam-70-483.aspx. I scheduled my exam at 11:30 AM on a weekday, so I could reach the exam center easily and then go straight to the office. You also have the option of taking the exam from home, but I preferred to go to the Microsoft training center.

The C# exam was MCQ-based, and the questions ranged from basic to advanced C#. The majority of the questions were scenario-based and could be flagged and returned to. The exceptions were four problem-based questions which could not be revisited. The exam was not easy, and I believe that some practical experience in C# programming is necessary to pass. I was very pleased to pass the exam on my first attempt.

Preparing for the MVC Exam

I started preparing for the second exam, on MVC Web Applications, the day after passing the first one. I gave myself five months to study for the exam. The resources I used for the Web Applications exam were similar to those I used for the C# exam. I relied heavily on the exam reference book ‘Exam Ref 70-486: Developing ASP.NET MVC 4 Web Applications’ to understand the scope of the exam. Another book I found useful was ‘Pro ASP.NET MVC 5’. I also used YouTube videos, the Microsoft documentation, and various other websites linked below.

Links-
https://youtube.com/watch?v=-pzwRwYlXMw
https://examref.com/MCSD/70-486/1
https://docs.microsoft.com/en-us/aspnet/mvc/overview/getting-started/introduction/getting-started
https://docs.microsoft.com/en-us/aspnet/core/?view=aspnetcore-3.1
https://www.guru99.com/microsoft-azure-tutorial.html


While I was preparing, training was conducted in the office on ASP.NET Core. This was helpful, as the MVC exam syllabus had newly added ASP.NET Core and I had no prior experience with it. Following this training, the project manager encouraged the team to convert a live MVC project to .NET Core. This allowed me to gain valuable hands-on experience with the new platform.

You can schedule the MVC exam using the link: https://www.microsoft.com/en-us/learning/exam-70-486.aspx

The MVC exam questionnaire starts with a number of questions about a case study based on a real-world example. The rest of the questions are scenario-based.

Although I found the exam very difficult, I was really pleased to pass with a good score on my first attempt and to finally get the MCSA certification. After I finished the exam I returned to the office, where there was quite a celebration. It made my day!

MetaSys has a team of developers working with Microsoft technologies and building dynamic web applications using ASP.NET, ASP.NET Core and SQL Server. For more info on the kind of projects we handle, visit https://www.metasyssoftware.com/case-study-dotnet