How to use a Database Project in Visual Studio?

Introduction

The database plays a vital role in almost any web or Windows application. A database project can be used to maintain a repository of schema changes.

Why do we need a Database project?

While developing an application, the developer has to work on many database tasks, such as creating tables, triggers, stored procedures and functions. It is difficult to manage the resulting scripts at deployment time, and it is tedious to review the changes made during development. The database project in Visual Studio addresses these issues.

How to create a database project?

1. Create a new project in Visual Studio and select 'SQL Server Database Project', as shown below.

New project

2. Right-click on the project and select Import → Database.

Importing database

3. A popup appears, as shown below. Click the 'Select Connection…' button to import the database.

Import database

4. Another popup appears, asking for the connection properties of your database. After entering the credentials (and authenticating successfully), the dropdown is populated with the list of databases on the SQL Server instance. Select the database for which you are creating the project.

Connection properties

5. Once you click 'Connect', all the tables, stored procedures, functions, etc. are imported.

Import database summary

6. Check the Solution Explorer for your database-related artefacts.

Solution explorer

7. After making modifications to the database, you can compare your project schema with the database schema. Right-click on the project and select 'Schema Compare', as shown below.

Database demo

As you can see in the above image, there are two sections: Source and Target.

Source: if you have made changes in the database that you want to compare against the project, select the database. If, however, you have made changes in the database project that you want to compare with the database, select your project.

Target: the schema against which the Source is compared. If you want, you can swap Source and Target. Once both are selected, click 'Compare' to compare the two schemas.

8. For example, I have added an Enquiry table to the database. Clicking 'Compare' shows the list of modifications.
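For reference, a change like this ultimately lives in the project as a plain T-SQL file (e.g. under dbo\Tables). The definition below is only a sketch of what such an Enquiry table might look like; the column names and types are hypothetical, not taken from the project in the screenshots.

```sql
-- Hypothetical Enquiry table definition, as it might appear in the
-- database project file dbo\Tables\Enquiry.sql (columns are illustrative).
CREATE TABLE [dbo].[Enquiry]
(
    [EnquiryId]    INT            IDENTITY (1, 1) NOT NULL,
    [CustomerName] NVARCHAR (100) NOT NULL,
    [CreatedOn]    DATETIME2 (7)  NOT NULL DEFAULT (SYSUTCDATETIME()),
    CONSTRAINT [PK_Enquiry] PRIMARY KEY CLUSTERED ([EnquiryId] ASC)
);
```

Because each object is a file, adding this table shows up in Schema Compare (and in version control) as a single, reviewable change.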

Database demo 8

The Action column in the above image indicates whether each change is an addition, an update or a deletion in the schema.

When you check a change's checkbox and click 'Update', all the selected changes are applied to the Target.

9. Similarly, if you have made changes in the database project and want to execute those on the Production or UAT database, follow the same steps as shown above.

10. To generate scripts for the changes, compare the database project with the target database and then click 'Generate Script'.
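The same deployment script can also be generated outside Visual Studio with the SqlPackage command-line tool that ships with SSDT/DacFx, which is handy for automated deployments. The sketch below assumes the project has already been built into a .dacpac; the file names and server name are placeholders.

```
SqlPackage.exe /Action:Script ^
    /SourceFile:"MyDatabaseProject.dacpac" ^
    /TargetServerName:"MyServer" ^
    /TargetDatabaseName:"MyDatabase" ^
    /OutputPath:"Upgrade.sql"
```

Running this writes the upgrade script to Upgrade.sql instead of applying the changes directly, so the script can be reviewed before it is executed on Production or UAT.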

Merits and Demerits of Database Project:

Merits:

  1. DDL (Data Definition Language) is readily available for all the objects.
  2. Deployment scripts can be easily generated.
  3. A database project also includes designers for working with these files.
  4. Version control can be done for the database through a series of check-in and check-outs for various operations performed on scripts and schema changes.
  5. Reviewing database changes is easier, allowing .NET developers to be more effective in making changes and reviewing previous modifications done by team members.

Demerits:

  1. Schema designing is not available.
  2. References to external database sources cannot be created in the database project.
  3. The Visual Studio database project only supports SQL Server.

Power BI – a visualization tool that is easy to understand and develop

Power BI is a tool for generating business intelligence reports, charts and graphs that incorporate easy-to-understand visuals. It is a self-service BI tool, particularly useful for data analysts who create and distribute BI reports throughout an organization. With moderate knowledge of SQL, one can develop simple Power BI visuals after only very basic training.

The first important point: it is essential to determine the storage mode before starting to develop any Power BI report. The storage modes available for accessing data include:

  1. Import mode
  2. DirectQuery mode
  3. LiveConnect mode
  4. Push mode

My experience in developing Power BI reports grew quickly; I started by developing a very simple report and ended up learning a huge amount about more advanced elements of Power BI as the project requirements changed.

In the beginning, it was exciting to work on the first simple Power BI report, which I developed against a SQL Server database using the Import storage mode.

PowerBI 1

As shown above, the visualisation was for a count of orders over the time period indicated by the date slider. The stacked column chart on the right shows the orders by service type on a yearly basis. The visualisation can be done for any date hierarchy, including daily, weekly, monthly, quarterly and yearly. In the pie chart on the left, the total orders are shown as both a total count and a percentage breakdown. As specifically requested by the client, we also added a reset button to restore the initially selected filter values.

Later on, during the deployment phase, we needed an on-premises data gateway to maintain a continuous connection to the data residing on the SQL Server. We installed the on-premises data gateway on our database server and connected it to the datasets created on the Power BI portal, as shown below:

Datasets 2

We maintained a daily refresh schedule, as shown below:

Scheduled Cache refreshing 3

After deploying this report to production, the client requested a live report showing current statistics without the need for a page refresh. After some research, I found that automatic page refresh can be achieved using the DirectQuery storage mode; unfortunately, my report had been developed using Import mode. This is when I learned the hard way how important it is to choose the right storage mode from the beginning, as I had to recreate the whole report. Meeting the client's needs required converting the storage mode to DirectQuery, recreating the report and setting the automatic page refresh interval to 5 seconds.

Orders 4

We can develop a variety of reports using the Visualization options as shown in the image below.

Visualization option 5

We can even use visuals other than those available in the Visualization pane, such as the Search tool shown below:

Search tool 6

At MetaSys, we are focused on investing time into new and innovative projects like Power BI to meet our clients' needs.

For more information, refer to https://www.metasyssoftware.com/contact

InBody Integration for biometric and blood pressure data into a web application

InBody integration with web app

People today are more health-conscious than ever before, and digital technology plays an important role in this development. Thanks to modern technology, there are many tools and devices to measure and record physical characteristics relating to personal health. Tracking exercise routines and nutrition has become a popular way for individuals to maintain a healthy lifestyle. Many health-related online platforms, applications and tools are available, and integration with such tracking devices can improve their service.

This article details one such integration project, in which our .NET application team helped a client integrate InBody technology into their application.

InBody devices offer detailed measurement of body composition, including key health factors such as protein, minerals, BMI and body fat percentage. Our project revolved around integrating the InBody 570 device with a .NET application. The solution we implemented had the .NET application listen on the serial ports attached to the device for the duration of the test. After receiving a data stream from the InBody device, the relevant data was extracted using the indices of specific data factors given in the InBody 570 technical documentation. Key settings required by the .NET application include baud rate, parity, data bits and stop bits; the InBody technical specification details the values of these settings. Once the data stream was received and the data factors extracted, the data was saved in a SQL Server database. The .NET web application reads this data from the SQL Server database and displays it to the user.
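In outline, the listening side can be sketched with the System.IO.Ports.SerialPort class as below. This is a minimal sketch only: the port settings shown are placeholders, and the real values must be taken from the InBody technical documentation.

```csharp
using System;
using System.IO.Ports;

class InBodyListener
{
    static void Main()
    {
        // The device may be attached to any serial port, so open and
        // listen on every available port.
        foreach (string name in SerialPort.GetPortNames())
        {
            var port = new SerialPort(name)
            {
                BaudRate = 9600,          // placeholder; see device spec
                Parity   = Parity.None,   // placeholder
                DataBits = 8,             // placeholder
                StopBits = StopBits.One   // placeholder
            };
            port.DataReceived += (sender, args) =>
            {
                // Read the raw stream; the individual data factors are
                // extracted at the indices given in the InBody 570
                // documentation and then saved to SQL Server.
                string raw = ((SerialPort)sender).ReadExisting();
                Console.WriteLine(raw);
            };
            port.Open();
        }
        Console.ReadLine(); // keep listening until the test completes
    }
}
```

Opening a port also locks it to this process, which matters for the port-locking challenges described next.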

Key Challenges

The idea is that the application is used by a fitness facility that subscribes to the application and has an InBody machine. The application runs on a laptop or desktop computer connected to the InBody machine. One challenge was that there is no guarantee that the device is always connected to the same port. Our solution was to have the app lock and keep listening on all available open ports; as soon as either a data stream or an exception is received, the app unlocks every port it has locked. It is important to note that only the process which locked a serial port can unlock it; no other process can forcibly remove that lock. If the process which locked the port is terminated, the only remaining option for unlocking that port is a system reboot.

Another requirement was that the same app had to integrate with a second measurement device manufactured by InBody, the BPBIO 320, which reads systolic blood pressure, diastolic blood pressure and pulse. We modified the .NET app to work with both InBody devices, using condition-based code that looks at the application settings and accordingly saves the data to either the production or the staging environment. The BPBIO 320 requires a different baud rate from that of the InBody 570, so we adapted the application so that the user selects the test type and the baud rate is then set automatically by the software. We also process status information contained within the data stream, such as “Measurement started”, “Measurement interrupted due to use of the stop button” and “Measurement interrupted due to error”. These cases are handled in the application and clear information is passed on to the user.
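The test-type-to-baud-rate mapping can be sketched as below. The numeric values here are placeholders for illustration, not the real rates from the two devices' specifications.

```csharp
using System;

static class PortSettings
{
    // Hypothetical mapping from the user-selected test type to the
    // serial port baud rate; the values are placeholders only.
    public static int BaudRateFor(string testType)
    {
        switch (testType)
        {
            case "BodyComposition": return 9600;   // InBody 570 (placeholder)
            case "BloodPressure":   return 38400;  // BPBIO 320 (placeholder)
            default:
                throw new ArgumentException("Unknown test type", nameof(testType));
        }
    }
}
```

Keeping the device-specific settings behind a single helper like this means adding a third device only touches one place in the code.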

We used the PuTTY tool to simulate the device stream in the development environment, as the physical device was not always available. Testing against the device itself was performed after development was complete, and we made the necessary changes to the application before the launch.

We will be glad to help anyone interested in similar kinds of integration in their software. For more information, see https://www.metasyssoftware.com/case-study-dotnet

SQL Server on Linux!? – Meet SQL Server 2017

Over the last decade, data has become the new oil. Increasing digitization has caused data to explode exponentially, but several other factors have also contributed: the cost of data storage has dropped substantially, and enterprises are unwilling to delete any of their data. Data is now a corporate asset, archived rather than deleted.
In the “always online” world, systems accumulate huge amounts of data in real time. Analysts and experts want to analyze these mountains of data, popularly known as “Big Data”. Businesses today believe that in order to prosper and grow, they must be able to spot emerging trends and patterns before they become mainstream.
This has led to a spectacular increase in the use of the Apache Hadoop platform on Linux for analytics; data scientists across the world have been using it to manage and analyze Big Data, since traditional SQL databases were not suitable in this environment. In the past, Microsoft provided interfaces to these technologies. To aid its own expansion and capture market share in these emerging technologies, Microsoft released R Server for Hadoop and Linux, and included R Services starting with SQL Server 2016. It is further fortifying this position with the launch of SQL Server 2017 RC1.
Microsoft has introduced SQL Server 2017 RC1, which supports Linux. Interestingly, Microsoft has chosen to support the Red Hat, SUSE and Ubuntu flavors of Linux, which cover most current installations. SQL Server 2017 on Linux also supports Active Directory authentication for both Windows and Linux domain clients, allowing users to be authenticated across different platforms.
SQL Server 2017 includes analytics using built-in Python and R services! Microsoft has always focused on security: SQL Server 2017 has TLS support, so the data between SQL Server and its clients can always be encrypted.
Some of the features are not yet supported in this release of SQL Server. You can find the details in the Microsoft release notes here.
MetaSys Software has been delivering high-performance and secure database solutions using earlier SQL Server versions, and recent undertakings are on SQL Server 2016. If you are looking for web or database solutions, please connect with us.