Installing SSAS Template in Visual Studio 2017

If you did not select the data processing templates during your Visual Studio installation, then you need to download SSDT for VS 2017 from the following URL:

https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017

Installation

On running the downloaded executable, you will get the installer prompt. Choose the first option if you want to work with all templates in the current Visual Studio instance, and make sure the Analysis Services option is also selected.


Click the Next button to start the installation.


Once the installation is complete, you will get a confirmation message.


Inside Visual Studio

Open Visual Studio and open the New Project dialog. If the Analysis Services templates are listed, you are good to go!


Summary

This post is part of the Starting with Azure Analysis Services series. Please check the other articles in the series for continuity.

AutoMapper vs. QuickMapper vs. Reflection

In this post I would like to compare the mapping performance of:

  • AutoMapper
  • Reflection
  • Manual Mapper

AutoMapper

AutoMapper is a well-known framework for mapping properties between class instances. It is very useful for DTO-to-entity mapping and vice versa.
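
A minimal usage sketch (this uses the static Mapper API from AutoMapper versions prior to 9.0, the same API the test code below uses):

// One-time configuration: map Entity to Dto by matching property names.
Mapper.Initialize(cfg => cfg.CreateMap<Entity, Dto>());

// Map a single instance or a whole list.
Dto dto = Mapper.Map<Dto>(entity);
IList<Dto> dtos = Mapper.Map<IList<Dto>>(entities);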

Reflection

Here I am writing my own mapping code using .NET reflection.

Manual Mapper

Here I will be assigning the property values manually, one by one.

Scenario

I am using an entity class with 10 properties and creating 100K instances. Let us see whether AutoMapper performs better than raw reflection code.

Following is the Entity class.

public class Entity
{
     public string Property1 { get; set; }
     public string Property2 { get; set; }
     public string Property3 { get; set; }
     public string Property4 { get; set; }
     public string Property5 { get; set; }
     public string Property6 { get; set; }
     public string Property7 { get; set; }
     public string Property8 { get; set; }
     public string Property9 { get; set; }
     public string Property10 { get; set; }
}

Following is the Dto class.

public class Dto
{
     public string Property1 { get; set; }
     public string Property2 { get; set; }
     public string Property3 { get; set; }
     public string Property4 { get; set; }
     public string Property5 { get; set; }
     public string Property6 { get; set; }
     public string Property7 { get; set; }
     public string Property8 { get; set; }
     public string Property9 { get; set; }
     public string Property10 { get; set; }
}

The NuGet package to install is AutoMapper.


Following is the Reflection code.

public class ReflectionMapper
{
    public static List<TResult> Map<TSource, TResult>(IList<TSource> sourceList) where TResult : new()
    {
        var result = new List<TResult>(sourceList.Count);

        // Read the property metadata of both types once, outside the loop.
        PropertyDescriptorCollection psrc = TypeDescriptor.GetProperties(typeof(TSource));
        PropertyDescriptorCollection presult = TypeDescriptor.GetProperties(typeof(TResult));

        foreach (TSource item in sourceList)
        {
            TResult obj = new TResult();

            // For each target property, find the source property with the same name
            // and copy the value across.
            for (int iResult = 0; iResult < presult.Count; iResult++)
            {
                PropertyDescriptor propResult = presult[iResult];

                for (int ix = 0; ix < psrc.Count; ix++)
                {
                    PropertyDescriptor propSource = psrc[ix];

                    if (propResult.Name == propSource.Name)
                    {
                        propResult.SetValue(obj, propSource.GetValue(item));
                    }
                }
            }

            result.Add(obj);
        }

        return result;
    }
}

Following is the Manual Mapping code.

// Called from the test code below as ManualMap(entities).
public static IList<Dto> ManualMap(IList<Entity> sourceList)
{
    var result = new List<Dto>(sourceList.Count);

    foreach (Entity item in sourceList)
    {
        // Plain property-by-property assignment; no reflection involved.
        result.Add(new Dto
        {
            Property1 = item.Property1,
            Property2 = item.Property2,
            Property3 = item.Property3,
            Property4 = item.Property4,
            Property5 = item.Property5,
            Property6 = item.Property6,
            Property7 = item.Property7,
            Property8 = item.Property8,
            Property9 = item.Property9,
            Property10 = item.Property10
        });
    }

    return result;
}

On Your Marks!

I have used a Stopwatch to capture the elapsed milliseconds after each operation. Following is the testing code.

Mapper.Initialize(cfg => cfg.CreateMap<Entity, Dto>());

IList<Entity> entities = new List<Entity>();

Stopwatch watch = Stopwatch.StartNew();

for (int i = 1; i <= 100000; i++)
{
    Entity entity = new Entity()
    {
        Property1 = "test value",
        Property2 = "test value",
        Property3 = "test value",
        Property4 = "test value",
        Property5 = "test value",
        Property6 = "test value",
        Property7 = "test value",
        Property8 = "test value",
        Property9 = "test value",
        Property10 = "test value",
    };
    entities.Add(entity);
}
Console.WriteLine("List Creation: " + watch.ElapsedMilliseconds);

// Restart() resets the elapsed time so that each mapper is measured independently.
watch.Restart();
IList<Dto> dtosManual = ManualMap(entities);
Console.WriteLine("Manual Mapper: " + watch.ElapsedMilliseconds);

watch.Restart();
IList<Dto> dtos = Mapper.Map<IList<Dto>>(entities);
Console.WriteLine("Auto Mapper: " + watch.ElapsedMilliseconds);

watch.Restart();
IList<Dto> dtos2 = ReflectionMapper.Map<Entity, Dto>(entities);
Console.WriteLine("Reflection Mapper: " + watch.ElapsedMilliseconds);

Console.ReadKey(false);

Running the console application prints the elapsed milliseconds for each approach.


Summary

Manual mapping is the fastest. It is recommended for very high-volume mapping scenarios.

AutoMapper is next. Its mapping overhead is small enough to be negligible on current high-powered machines, even with scalability in mind.

The raw reflection code is the slowest of the three.
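
The reflection approach pays for a GetValue/SetValue call on every property of every one of the 100K instances, which is what makes it slow. AutoMapper avoids most of that cost by compiling the mapping logic once up front, which is why it sits between the other two.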

Azure VM – Save Cost by Auto-Shutdown

Azure VMs definitely give a lot of flexibility, such as:

  • High-end hardware configurations
  • Ready-to-use software
  • High-speed internet
  • Quick availability

At the same time, the cost can be quite high if you choose a high-end configuration with costly software licenses.

Save Cost

Most of the time, I have noticed that users actually use a VM for less than 6 hours per day. The rest of the time it sits idle, yet it keeps billing every hour; multiply the hours by the days in a month and the waste adds up quickly.
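
As an illustrative calculation, assuming a rate of $1 per hour: a VM left running 24x7 costs about 24 x 30 = $720 per month, while the same VM running only 6 hours a day costs about 6 x 30 = $180, a saving of 75%.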

Auto-Shutdown Feature

This is where the Auto-Shutdown feature comes in handy. Here are the steps to use it:

Open the VM in the Azure portal and select the Auto-shutdown option (under Operations).


In the blade that appears, choose the Enabled option and set the scheduled shutdown time.


You can also set an email address to receive a notification before the shutdown.

Adding GZip Compression to .NET Core

In this article I would like to explore the use of GZip compression on the server side of a .NET Core web application.

Scenario

I am sending a JSON list consisting of one thousand items. In the ordinary, uncompressed response format, it takes 1 MB of payload and 1 second to receive on the client side.

I am using an Azure S2 App Service plan for deployment and testing.

The Challenge

The Chrome DevTools Network tab confirms the numbers above: roughly 1 MB transferred in about 1 second.


The Solution

Now we are going to solve this using ASP.NET Core response compression. For this we need to add the Microsoft.AspNetCore.ResponseCompression NuGet package to the application.


The Code

In the Startup.cs file, add the following lines of code.

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    // Register the response compression services and opt JSON content types in.
    services.AddResponseCompression(options =>
    {
        options.Providers.Add<GzipCompressionProvider>();
        options.MimeTypes =
            ResponseCompressionDefaults.MimeTypes.Concat(
                new[] { "text/json", "application/json" });
    });

    services.Configure<GzipCompressionProviderOptions>(options =>
    {
        options.Level = CompressionLevel.Optimal;
    });
}

// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    app.UseResponseCompression();

    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseMvc();
}
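
Note that app.UseResponseCompression() is registered at the top of the pipeline, before app.UseMvc(), so that MVC responses actually pass through the compression middleware.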

Now compile, deploy, and retest with the Chrome browser.

You can see a 90% reduction in the size of the response!

The response time is also reduced by 70%.


The Client Code

// Enable transparent decompression of gzip/deflate responses.
HttpClientHandler handler = new HttpClientHandler()
{
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

var client = new HttpClient(handler);
client.BaseAddress = new Uri(URL);
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

Stopwatch watch = Stopwatch.StartNew();

HttpResponseMessage response = client.GetAsync("api/kpi/list").Result;
response.EnsureSuccessStatusCode();

double ms = watch.ElapsedMilliseconds;

Console.WriteLine("Elapsed Milliseconds: " + ms.ToString());
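
Because AutomaticDecompression is enabled, HttpClientHandler decompresses the body transparently, so the measured time covers both the (much smaller) download and the client-side decompression.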

Summary

The above code shows the components and code required to add JSON compression to your .NET Core application.

TDD Test Sample

In this post I would like to demonstrate a few of my TDD (Test-Driven Development) skills!

TDD

I started TDD in 2010 at a product development company. I have been a big fan and advocate of TDD ever since.

Following are the properties and advantages of TDD:

  1. In TDD we first write the unit test, then the actual method. As the test evolves, the method under test evolves too (a minimal sketch follows this list).
  2. TDD ensures the code is released with enough testing.
  3. TDD is an investment for the future.
  4. TDD is essential for products.
  5. TDD gives flexibility for refactoring and more confident deployments.
  6. TDD is used mostly on the backend side.
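
Here is that minimal test-first sketch. The Calculator class is hypothetical and only illustrates the red-green flow; the actual coding test follows below.

[TestClass]
public class CalculatorTest
{
    [TestMethod]
    public void Add_TwoNumbers_ReturnsSum()
    {
        // Red: this test is written before Calculator.Add exists, so it fails first.
        var calculator = new Calculator();
        Assert.AreEqual(5, calculator.Add(2, 3));
    }
}

public class Calculator
{
    // Green: the simplest implementation that makes the test pass.
    public int Add(int a, int b) => a + b;
}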

Coding Test

We want to expose a service that calculates the total spend given a supplier ID.

public class SpendService
{
    public SpendSummary GetTotalSpend(int supplierId) { … }
}

The business logic is quite straightforward: the total spend is the sum of all the invoices, grouped by year. However, some of the suppliers are working with a separate branch of the company, which has its own invoicing platform.

Therefore, the flow should work as follows:

1) A SupplierService returns supplier data, which can be used to understand whether a supplier is external or not

2) If the supplier is not external, invoice data can be retrieved through the InvoiceRepository class

3) If the supplier is external, invoice data can be retrieved through the ExternalInvoiceService class

4) ExternalInvoiceService invokes a separate system, which might fail. However, data from this system is regularly backed up in a failover storage. A FailoverInvoiceService class gives access to that storage. It is ok to return failover data when ExternalInvoiceService fails.

5) Failover data might not be fresh. A timestamp property indicates when it was originally stored. If this date is older than a month, it means the data has not been refreshed; in this case, the GetTotalSpend method should fail.

6) When ExternalInvoiceService is offline, usually calls tend to timeout, which means that the method takes long to complete. Therefore, after 3 consecutive errors, we want to bypass ExternalInvoiceService and go to FailoverInvoiceService directly, with the same logic as before. After 1 minute, we can try to re-enable ExternalInvoiceService again.
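
Requirement 6 is essentially the classic circuit breaker pattern; a sketch of one possible implementation is shown after the ICircuitBreaker interface below.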

Solution

[TestClass]
    public class SpendServiceTest
    {
        private UnityContainer container;

        [TestInitialize]
        public void TestInit()
        {
            container = new UnityContainer();
            container.RegisterType<ISupplierDataService, SupplierDataServiceStub>();
            container.RegisterType<ISupplierService, SupplierService>();
            container.RegisterType<IInvoiceRepository, InvoiceRepositoryStub>();
            container.RegisterType<IExternalSpendService, ExternalInvoiceServiceStub>();
            container.RegisterType<ICircuitBreaker, ExternalSpendServiceInvoker>();
            container.RegisterType<IFailoverInvoiceService, FailoverInvoiceServiceStub>();
        }

        [TestMethod]
        public void SpendService_InternalCustomer_Name_Test()
        {
            // Arrange
            int supplierId = 1;
            SpendService spendService = container.Resolve<InternalSpendService>();

            // Act
            SpendSummary result = spendService.GetTotalSpend(supplierId);

            // Assert
            Assert.IsNotNull(result);
Assert.AreEqual("Supplier Internal", result.Name);
        }

        [TestMethod]
        public void SpendService_InternalCustomer_Spend_Test()
        {
            // Arrange
            int supplierId = 1;
            SpendService spendService = container.Resolve<InternalSpendService>();

            // Act
            SpendSummary result = spendService.GetTotalSpend(supplierId);

            // Assert
            Assert.IsNotNull(result);
            Assert.AreEqual(2, result.Years.Count);
            Assert.AreEqual(2000, result.Years[0].TotalSpend);
            Assert.AreEqual(1000, result.Years[1].TotalSpend);
        }

        [TestMethod]
        public void SpendService_ExternalCustomer_Name_Test()
        {
            // Arrange
            int supplierId = 2;
            SpendService spendService = container.Resolve<ExternalSpendService>();

            // Act
            SpendSummary result = spendService.GetTotalSpend(supplierId);

            // Assert
            Assert.IsNotNull(result);
Assert.AreEqual("Supplier External", result.Name);
        }

}

Following is the circuit breaker interface.

public interface ICircuitBreaker
{
    int MaxTries { get; }

    int ClosedTimeSeconds { get; }

    int ObsoleteDaysLimit { get; }

    bool IsOpenState { get; set; }

    DateTime FailedTimestamp { get; set; }

    void Reset();

    List<SpendDetail> GetSpendDetail(int supplierId);
}
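
To make the contract concrete, here is a minimal sketch of how an ExternalSpendServiceInvoker could satisfy the requirements. The IExternalSpendService member and the SpendDetail properties shown here are assumptions made for illustration; the actual implementation is in the GitHub repository linked below.

public class ExternalSpendServiceInvoker : ICircuitBreaker
{
    private readonly IExternalSpendService externalService;
    private readonly IFailoverInvoiceService failoverService;
    private int consecutiveFailures;

    public ExternalSpendServiceInvoker(IExternalSpendService externalService,
                                       IFailoverInvoiceService failoverService)
    {
        this.externalService = externalService;
        this.failoverService = failoverService;
    }

    public int MaxTries => 3;             // open the circuit after 3 consecutive errors
    public int ClosedTimeSeconds => 60;   // re-enable the external service after 1 minute
    public int ObsoleteDaysLimit => 30;   // failover data older than a month is stale
    public bool IsOpenState { get; set; }
    public DateTime FailedTimestamp { get; set; }

    public void Reset()
    {
        consecutiveFailures = 0;
        IsOpenState = false;
    }

    public List<SpendDetail> GetSpendDetail(int supplierId)
    {
        // After the cool-down period, allow the external service to be tried again.
        if (IsOpenState && DateTime.Now >= FailedTimestamp.AddSeconds(ClosedTimeSeconds))
        {
            Reset();
        }

        if (!IsOpenState)
        {
            try
            {
                // Assumed member on IExternalSpendService.
                List<SpendDetail> details = externalService.GetSpendDetail(supplierId);
                consecutiveFailures = 0; // a success resets the error streak
                return details;
            }
            catch
            {
                consecutiveFailures++;
                if (consecutiveFailures >= MaxTries)
                {
                    IsOpenState = true; // bypass the external service from now on
                    FailedTimestamp = DateTime.Now;
                }
            }
        }

        // Fall back to the failover storage, but fail on stale data.
        FailoverInvoiceCollection backup = failoverService.GetInvoices(supplierId);
        if (backup.Timestamp < DateTime.Now.AddDays(-ObsoleteDaysLimit))
        {
            throw new InvalidOperationException("Failover data is obsolete.");
        }

        // SpendDetail property names are assumed for illustration.
        return backup.Invoices
            .Select(i => new SpendDetail { Year = i.Year, TotalSpend = i.TotalAmount })
            .ToList();
    }
}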

Following is one stub sample.

public class FailoverInvoiceServiceStub : IFailoverInvoiceService
    {
        public FailoverInvoiceCollection GetInvoices(int supplierId)
        {
            FailoverInvoiceCollection result = new FailoverInvoiceCollection();

            if (supplierId == 2)
            {
                IList<ExternalInvoice> list = new List<ExternalInvoice>();
                list.Add(new ExternalInvoice() { Year = 2018, TotalAmount = 900 }); // Only 1 Year
                result.Timestamp = DateTime.Now;
                result.Invoices = list.ToArray();
            }
            else if (supplierId == 3)
            {
                IList<ExternalInvoice> list = new List<ExternalInvoice>();
                list.Add(new ExternalInvoice() { Year = 2018, TotalAmount = 900 }); // Only 1 Year
                result.Timestamp = DateTime.Now;
                result.Invoices = list.ToArray();
            }
            else if (supplierId == 4)
            {
                IList<ExternalInvoice> list = new List<ExternalInvoice>();
                list.Add(new ExternalInvoice() { Year = 2017, TotalAmount = 800 });
                result.Timestamp = DateTime.Now.AddDays(-32); // Obsolete Date
                result.Invoices = list.ToArray();
            }

            return result;
        }
    }

Strategies

I have been using the following frameworks and strategies:

  1. Unit tests written in C#
  2. The Moq framework for mocking (see the sketch after this list)
  3. Stubbing using plain C# classes
  4. Wrapper methods for unit testing static methods
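
A minimal Moq sketch of the mocking style (the ISupplierService member, the Supplier type, and the constructor injection shown are hypothetical, purely to illustrate the setup pattern):

var supplierService = new Mock<ISupplierService>();
supplierService
    .Setup(s => s.GetSupplier(1))                          // hypothetical member
    .Returns(new Supplier { Id = 1, IsExternal = false }); // hypothetical type

// The mocked instance is injected into the class under test.
var spendService = new SpendService(supplierService.Object);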

References

You can find the code on GitHub.

https://github.com/jeanpaulva/jp-tdd

Summary

This is a TDD code demonstration to showcase my skills.

Thank you all!

Azure Database – Service Tiers – Performance

In this post we can explore the advantage of Azure SQL Database service tier scalability.

Scalability

Azure provides the flexibility of scaling up or down based on demand.

Create a Database

Create an Azure SQL database with the default options. It will be created in the Standard service tier.


Performance Test

You can create a table with 10 columns and insert 1 million (10 lakh) records.

It took around 2 hours for me.

Scale Up

Now we need to scale up and test again.

In the database blade, choose the Configure (pricing tier) option.


In the window that appears, choose the Premium tier.

Choose the 500 DTU option and click the Apply button. Wait a few minutes for the upgrade to complete.

DTU

DTU stands for Database Transaction Unit, a blended measure of CPU, memory, and I/O. The more DTUs, the more performance.
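
For context, the Standard tier starts at 10 DTUs (S0), while the Premium tier ranges from 125 DTUs (P1) into the thousands; the 500 DTU Premium option chosen above therefore offers many times the capacity of an entry-level Standard database.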

Run the Test Again

Now run the same insert workload again. Performance improved by 50%, with the run completing in 1 hour.

Summary

Increasing the service tier / DTU count gives a quick jump in performance during peak hours.

Note

Do not forget to scale back down once you are done; otherwise the bill will grow huge as well.

Mobile App Advantages

In this post we can consider the advantages of the Mobile Apps feature of Azure App Service.

Offline Storage

Offline storage provides local data storage during periods without internet connectivity.

Improved Responsiveness

Critical applications that require read/write responses within 1 second can achieve this through offline storage of data, making reads and writes seamless.

Push Notifications

Azure Notification Hubs supports sending messages to multiple devices and platforms: Android, iOS, Windows, etc.

Identity Management

Azure AD support enables easier security integration.

Social Media Integration

Facebook and Twitter integration for federated authentication, post integration, etc.

Scalability

Large numbers of devices can be supported through the autoscale feature.

Staged Deployments

We can do staged deployments, where switching a staging deployment into production is done by swapping deployment slots.

Continuous Integration

Continuous integration with Visual Studio and GitHub is possible.

Create Mobile App Service

We can create a Mobile App service from the Azure portal.
