Installing SSAS Template in Visual Studio 2017

If you did not select the data processing templates during your Visual Studio installation, you need to download SSDT for VS 2017 from the following URL:

https://docs.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017

Installation

On running the downloaded executable, you will get the following prompt.  Choose the first option if you want to work with all templates in the current Visual Studio instance.  Also, select the Analysis Services option.

image

Click the Next button to start the installation.

image

Once the installation is complete, you will see the following message.

image

Inside Visual Studio

Open Visual Studio. If you can see the following templates, you are good to go!

image

Summary

This post is part of the Starting with Azure Analysis Services series. Please check the other articles for continuity.

My DAX Days…

In this post I would like to record my DAX learnings from my SQL Server Analysis Services days.

SQL Server Analysis Services

It can be dramatically faster than querying SQL Server directly (figures up to 1000x are often quoted), since the data is stored in memory in a compressed format.

DAX

DAX stands for Data Analysis Expressions.  It is widely used in Analysis Services, Power BI, etc.

DAX syntax looks similar to Excel formulas.

DAX Measure

A DAX measure is a named formula that is evaluated in the context of the current query or report.
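As a minimal sketch, a measure can be defined inline in a query with DEFINE MEASURE.  The measure name Total Sales is illustrative, using the FactSales table from the examples below:

DEFINE
    MEASURE FactSales[Total Sales] = SUM ( FactSales[Price] )

EVALUATE
    ROW ( "Total Sales", [Total Sales] )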

DAX Variables

DAX variables are evaluated lazily: a variable is computed only when it is first used, and its result is then reused.
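A minimal sketch of a query-scoped variable, assuming the FactSales table used in the later examples (the price threshold of 100 is an arbitrary illustration):

DEFINE
    VAR HighValueSales = FILTER ( FactSales, FactSales[Price] > 100 )

EVALUATE
    ROW (
        "Row Count", COUNTROWS ( HighValueSales ),
        "Quantity", SUMX ( HighValueSales, FactSales[Quantity] )
    )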

EVALUATE

Runs a table expression and returns its result.

Eg: EVALUATE ( 'DimProducts' )

Eg: EVALUATE ( 'DimProducts' ) ORDER BY 'DimProducts'[Date] DESC  -- sorting example

For scalar values, use the EVALUATE ( ROW ( "Column", Variable ) ) syntax.

Note: every DAX query must begin with an EVALUATE in front of the expression.

ROW

Returns a table with a single row.

Eg: EVALUATE ROW (
        "Total Sales", SUM ( FactSales[Price] ),
        "Total Quantity", SUM ( FactSales[Quantity] )
    )

image

SUMMARIZE

Returns a summary table.

EVALUATE
SUMMARIZE (
    DimProducts,
    DimProducts[Id],
    DimProducts[Class],
    DimProducts[SubClass],
    "Sales Total", SUM ( FactSales[Price] ),
    "Quantity Total", SUM ( FactSales[Quantity] )
)

image

CALCULATE

Evaluates an expression in a filter context modified by the given filters.
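A minimal sketch using the FactSales table from the other examples (the StoreID value of 1 is an assumption for illustration):

EVALUATE
ROW (
    "All Sales", SUM ( FactSales[Price] ),
    "Store 1 Sales", CALCULATE (
        SUM ( FactSales[Price] ),
        FactSales[StoreID] = 1
    )
)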

FILTER

Returns a subset of a table or table expression.

Eg: EVALUATE
FILTER ( DimProducts, DimProducts[Class] = "Home Theatre" )

image

Note the double quotes around the string literal.

More Filters: https://bit.ly/2L3iOdr

SUM

SUM is an aggregator: it adds up the values of a single column.

Eg: EVALUATE
SUMMARIZE (
    DimProducts,
    "Sum", SUM ( DimProducts[Price] )
)

image

SUMX

SUMX is an iterator: it evaluates an expression row by row, so you can include calculations such as multiplication.

Eg: EVALUATE
SUMMARIZE (
    DimProducts,
    "Sum", SUMX ( DimProducts, DimProducts[Price] * 100 )
)

image

ADDCOLUMNS

Adds calculated columns to the given table.
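A minimal sketch against the DimProducts table used above (the 10% uplift is an arbitrary illustration):

EVALUATE
ADDCOLUMNS (
    DimProducts,
    "Price With Tax", DimProducts[Price] * 1.1
)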

References

https://www.sqlbi.com/

https://msdn.microsoft.com/en-us/query-bi/dax

More DAX Days

In this post I would like to share a few more DAX scenarios I have encountered.

Join Tables

We can use the CROSSJOIN function on tables so that all the columns required by the SUMMARIZE function are available.

DEFINE
    VAR JoinTable =
        FILTER (
            CROSSJOIN ( DimProducts, FactSales ),
            DimProducts[Id] = FactSales[ProductId]
        )

    VAR ResultTable =
        SUMMARIZE (
            JoinTable,
            DimProducts[Choice],
            DimProducts[Id],
            FactSales[StoreID],
            FactSales[WeekId],
            "Quantity", SUMX (
                FILTER ( 'FactSales', 'DimProducts'[Id] = 'FactSales'[ProductId] ),
                FactSales[Quantity]
            )
        )

EVALUATE
ResultTable

SUMMARIZECOLUMNS

SUMMARIZECOLUMNS is recommended over SUMMARIZE because it is faster and better optimized.

SUMMARIZE still exists for backward compatibility.

Let us see the Cross Join using SUMMARIZECOLUMNS.

Cross Join with Matching ProductID

DEFINE
    VAR Results =
        SUMMARIZECOLUMNS (
            'DimProducts'[ProductID],
            "Sales Amount", SUMX (
                FILTER (
                    CROSSJOIN ( DimProducts, FactSales ),
                    DimProducts[ProductId] = FactSales[ProductId]
                ),
                FactSales[Quantity] * FactSales[Price]
            )
        )

EVALUATE Results

image

Another way to see the results.

EVALUATE
FILTER (
    CROSSJOIN ( DimProducts, FactSales ),
    DimProducts[ProductId] = FactSales[ProductId]
)

-- Returns as many rows as the FactSales table

image

Working on Table Variables

We can operate on table variables to perform further aggregations. An example is given below.

For example, to find an average, use the AVERAGEX function.

VAR ResultsTable =
    ADDCOLUMNS (
        SUMMARIZE (
            ProductJoinSales,
            DimProducts[Choice],
            DimProducts[Id]
        ),
        "CalcValue", DIVIDE (
            SUM ( FactSales[Quantity] ),
            COUNT ( FactSales[StoreID] )
        )
    )

VAR Result = AVERAGEX ( ResultsTable, [CalcValue] )

Note: make sure to display the scalar value using the ROW function.

EVALUATE ROW ( "Average", Result )

OR Condition in Filter

OR conditions in filters are pretty easy: use the || operator, as in C#.

FILTER ( FactSales, FactSales[Price] > 0 || FactSales[Quantity] > 0)

Reference

https://azure-arena.com/2018/08/03/my-dax-days/

https://azure-arena.com/2018/08/01/sql-tables-used-in-dax/

SQL Tables used in DAX

The following is the schema of the SQL tables used in the DAX exercises.

CREATE TABLE [dbo].[DimProducts](
    [ProductID] [varchar](50) NOT NULL,
    [Name] [varchar](50) NULL,
    [Class] [varchar](50) NULL,
    [SubClass] [varchar](50) NULL,
    [Price] [float] NULL,
    [Choice] [varchar](50) NULL,
CONSTRAINT [PK_DimProducts] PRIMARY KEY CLUSTERED
(
    [ProductID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

 

CREATE TABLE [dbo].[FactSales](
    [SalesId] [varchar](50) NOT NULL,
    [Date] [date] NULL,
    [Quantity] [int] NULL,
    [Price] [float] NULL,
    [ProductID] [varchar](50) NULL,
    [WeekID] [int] NULL,
    [StoreID] [int] NULL,
CONSTRAINT [PK_FactSales] PRIMARY KEY CLUSTERED
(
    [SalesId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

Relations

The one-to-many relationship is shown below.

image

AutoMapper vs. QuickMapper vs. Reflection

In this post I would like to compare the speed of three mapping approaches:

  • AutoMapper
  • Reflection
  • Manual Mapper

AutoMapper

AutoMapper is a well-known framework for mapping properties between class instances.  It is very useful for DTO-to-entity mapping and vice versa.

Reflection

Here I am writing my own mapping code using .NET reflection.

Manual Mapper

Here I will be using manually written code to assign the property values.

Scenario

I am using an Entity class with 10 properties and creating 100K instances.  Let us see whether AutoMapper performs better than raw reflection code.

Following is the Entity class.

public class Entity
{
     public string Property1 { get; set; }
     public string Property2 { get; set; }
     public string Property3 { get; set; }
     public string Property4 { get; set; }
     public string Property5 { get; set; }
     public string Property6 { get; set; }
     public string Property7 { get; set; }
     public string Property8 { get; set; }
     public string Property9 { get; set; }
     public string Property10 { get; set; }
}

Following is the Dto class.

public class Dto
{
     public string Property1 { get; set; }
     public string Property2 { get; set; }
     public string Property3 { get; set; }
     public string Property4 { get; set; }
     public string Property5 { get; set; }
     public string Property6 { get; set; }
     public string Property7 { get; set; }
     public string Property8 { get; set; }
     public string Property9 { get; set; }
     public string Property10 { get; set; }
}

Following is the AutoMapper NuGet package name.

image

Following is the Reflection code.

public class ReflectionMapper
{
    public static List<TResult> Map<TSource, TResult>(IList<TSource> sourceList) where TResult : new()
    {
        var result = new List<TResult>(sourceList.Count);

        PropertyDescriptorCollection psrc = TypeDescriptor.GetProperties(typeof(TSource));
        PropertyDescriptorCollection presult = TypeDescriptor.GetProperties(typeof(TResult));

        foreach (TSource item in sourceList)
        {
            TResult obj = new TResult();

            // For each target property, find the source property with the same name.
            for (int iResult = 0; iResult < presult.Count; iResult++)
            {
                PropertyDescriptor propResult = presult[iResult];

                for (int ix = 0; ix < psrc.Count; ix++)
                {
                    PropertyDescriptor propSource = psrc[ix];

                    if (propResult.Name == propSource.Name)
                    {
                        propResult.SetValue(obj, propSource.GetValue(item));
                    }
                }
            }

            result.Add(obj);
        }

        return result;
    }
}

Following is the Manual Mapping code.

public static IList<Dto> ManualMap(IList<Entity> entities)
{
    var result = new List<Dto>(entities.Count);

    foreach (Entity entity in entities)
    {
        result.Add(new Dto
        {
            Property1 = entity.Property1,
            Property2 = entity.Property2,
            Property3 = entity.Property3,
            Property4 = entity.Property4,
            Property5 = entity.Property5,
            Property6 = entity.Property6,
            Property7 = entity.Property7,
            Property8 = entity.Property8,
            Property9 = entity.Property9,
            Property10 = entity.Property10
        });
    }

    return result;
}

On The Marks!

I have used a Stopwatch to capture the elapsed milliseconds after each operation.  Following is the testing code.

Mapper.Initialize(cfg => cfg.CreateMap<Entity, Dto>());

IList<Entity> entities = new List<Entity>();

Stopwatch watch = Stopwatch.StartNew();

for (int i = 1; i <= 100000; i++)
{
    Entity entity = new Entity()
    {
        Property1 = "test value",
        Property2 = "test value",
        Property3 = "test value",
        Property4 = "test value",
        Property5 = "test value",
        Property6 = "test value",
        Property7 = "test value",
        Property8 = "test value",
        Property9 = "test value",
        Property10 = "test value",
    };
    entities.Add(entity);
}
Console.WriteLine("List Creation: " + watch.ElapsedMilliseconds);

// Restart the stopwatch so each mapper is timed independently.
watch.Restart();
IList<Dto> dtosManual = ManualMap(entities);
Console.WriteLine("Manual Mapper: " + watch.ElapsedMilliseconds);

watch.Restart();
IList<Dto> dtos = Mapper.Map<IList<Dto>>(entities);
Console.WriteLine("Auto Mapper: " + watch.ElapsedMilliseconds);

watch.Restart();
IList<Dto> dtos2 = ReflectionMapper.Map<Entity, Dto>(entities);
Console.WriteLine("Reflection Mapper: " + watch.ElapsedMilliseconds);

Console.ReadKey(false);

Following are the results.

image

Summary

Manual mapping is the fastest, and is recommended for performance-critical, high-volume mapping scenarios.

AutoMapper comes next.  Its mapping overhead is small, and usually negligible on today's high-powered machines, even with scalability in mind.

The reflection code is the slowest of the three.

Azure VM: Save Cost by Auto-Shutdown

An Azure VM definitely gives a lot of flexibility, such as:

  • High-end Hardware Configuration
  • Ready Software
  • High-speed Internet
  • Quick Availability

At the same time, the cost can be quite high for a high-end configuration with costly software licenses.

Save Cost

In my experience, users often use a VM for less than 6 hours per day; the rest of the time it sits idle.  That idle time adds up to a significant cost when you multiply the hours by the days in a month.

Auto-Shutdown Feature

This is where the Auto-Shutdown feature comes in handy. Here are the steps to use it:

Open the VM in the Azure portal and locate the Auto-shutdown option.

image

In the blade that appears, choose the Enabled option.

image

You can also set an email address to be notified before the shutdown.

Adding Gzip Compression to .Net Core

In this article I would like to explore the use of GZip compression on the server side of a .NET Core web application.

Scenario

I am sending a JSON list of 1,000 items.  In the uncompressed response format, it takes 1 MB and about 1 second to receive on the client side.

I am using an Azure S2 App Service for deployment and testing.

The Challenge

Following is Chrome's display of the request statistics.

image

The Solution

Now let us solve this using ASP.NET Core response compression.  For this, we need to add the response-compression library shown below to the application.

image

The Code

In the Startup.cs file, add the following highlighted lines of code.

public void ConfigureServices(IServiceCollection services)
{
    services.AddMvc();

    services.AddResponseCompression(options =>
    {
        options.Providers.Add<GzipCompressionProvider>();
        options.MimeTypes =
            ResponseCompressionDefaults.MimeTypes.Concat(
                new[] { "text/json", "application/json" });
    });

    services.Configure<GzipCompressionProviderOptions>(options =>
    {
        options.Level = CompressionLevel.Optimal;
    });
}

// This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
public void Configure(IApplicationBuilder app, IHostingEnvironment env)
{
    app.UseResponseCompression();

    if (env.IsDevelopment())
    {
        app.UseDeveloperExceptionPage();
    }

    app.UseMvc();
}

Now compile, deploy, and retest in the Chrome browser.

You can see there is a 90% reduction in the size of the response!

The response time also got reduced by 70%.

image

The Client Code

HttpClientHandler handler = new HttpClientHandler()
{
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

var client = new HttpClient(handler);
client.BaseAddress = new Uri(URL);
client.DefaultRequestHeaders.Accept.Clear();
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

Stopwatch watch = Stopwatch.StartNew();

HttpResponseMessage response = client.GetAsync("api/kpi/list").Result;
response.EnsureSuccessStatusCode();

double ms = watch.ElapsedMilliseconds;

Console.WriteLine("Elapsed Milliseconds: " + ms.ToString());

Summary

The above code shows the components and code required to add JSON compression to your .NET Core application.