My DAX Days…

In this post I would like to record my DAX learnings from my SQL Server Analysis Services days.

SQL Server Analysis Services

It can be dramatically faster than querying SQL Server directly, since the data is stored in memory in a compressed, columnar format.

DAX

DAX stands for Data Analysis Expressions.  It is widely used in Analysis Services, Power BI etc. 

DAX looks similar to Excel Formulas.

DAX Measure

A DAX measure is a named formula that is evaluated in the context of a query or report.

DAX Variables

DAX variables are evaluated only once, at the point where they are defined, and the stored result is reused wherever the variable is referenced.
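A minimal sketch of this behavior, reusing the FactSales sample table and Quantity column that appear in the examples later in this post:

```
DEFINE
    -- Evaluated once; the result is reused below
    VAR TotalQuantity = SUM ( FactSales[Quantity] )
EVALUATE
    ROW (
        "Quantity", TotalQuantity,
        "Double Quantity", TotalQuantity * 2
    )
```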

EVALUATE

Evaluates the Expression.

Eg: EVALUATE ( 'DimProducts' )

Eg: EVALUATE ( 'DimProducts' ) ORDER BY 'DimProducts'[Date] DESC  -- Sorting example

For scalar values, use the EVALUATE ROW ( "Column", Variable ) syntax, since EVALUATE expects a table expression and ROW wraps the scalar in a one-row table.

Note: every DAX query must start with EVALUATE.

ROW

Returns a Table with a Single Row.

Eg: EVALUATE ROW("Total Sales", SUM(FactSales[Price]),
         "Total Quantity", SUM(FactSales[Quantity]))

image

SUMMARIZE

Returns a summary table.

EVALUATE
SUMMARIZE(
        DimProducts 
      , DimProducts[Id] 
      , DimProducts[Class] 
      , DimProducts[SubClass] 
      , "Sales Total", SUM(FactSales[Price]) 
      , "Quantity Total",  SUM(FactSales[Quantity]) 
      ) 

image

CALCULATE

Evaluates an expression in a filter context modified by the supplied filters.
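A minimal sketch, assuming the sample DimProducts and FactSales tables used elsewhere in this post are related on the product id:

```
-- Total sales for the "Home Theatre" class only;
-- CALCULATE evaluates SUM in a filter context narrowed by the predicate.
EVALUATE
ROW (
    "Home Theatre Sales",
    CALCULATE (
        SUM ( FactSales[Price] ),
        DimProducts[Class] = "Home Theatre"
    )
)
```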

FILTER

Returns a subset of a table or table expression.

Eg: EVALUATE
FILTER(DimProducts, DimProducts[Class] = "Home Theatre")

image

*Note the double quotes

More Filters: https://bit.ly/2L3iOdr

SUM

Sum is an Aggregator.

Eg: EVALUATE
SUMMARIZE (
DimProducts,
"Sum", SUM ( DimProducts[Price] )
)

image

SUMX

SUMX is an iterator: it evaluates an expression row by row, so you can include row-level calculations such as multiplications.

Eg: EVALUATE
SUMMARIZE (
DimProducts,
"Sum", SUMX (DimProducts, DimProducts[Price] * 100)
)

image

ADDCOLUMNS

Adds calculated columns to a given table or table expression.
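A minimal sketch using the sample DimProducts table; the 10% tax rate is just an assumed illustration value:

```
EVALUATE
ADDCOLUMNS (
    DimProducts,
    -- New column computed per row of DimProducts
    "Price With Tax", DimProducts[Price] * 1.1
)
```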

References

https://www.sqlbi.com/

https://msdn.microsoft.com/en-us/query-bi/dax

More DAX Days

In this post I would like to share a few more DAX scenarios I have encountered.

Join Tables

We can use the CROSSJOIN function to combine tables so that all the columns required by the SUMMARIZE function are available.

DEFINE
    VAR JoinTable =
        FILTER (
            CROSSJOIN ( DimProducts, FactSales ),
            DimProducts[Id] = FactSales[ProductId]
        )

    VAR ResultTable =
        SUMMARIZE (
            JoinTable,
            DimProducts[Choice],
            DimProducts[Id],
            FactSales[StoreID],
            FactSales[WeekId],
            "Quantity", SUMX (
                FILTER ( 'FactSales', 'DimProducts'[Id] = 'FactSales'[ProductId] ),
                FactSales[Quantity]
            )
        )

 

EVALUATE
ResultTable

SUMMARIZECOLUMNS

SUMMARIZECOLUMNS is recommended over SUMMARIZE because it is faster and better optimized.

SUMMARIZE still exists for backward compatibility.

Let us see the Cross Join using SUMMARIZECOLUMNS.

Cross Join with Matching ProductID

DEFINE VAR Results =
SUMMARIZECOLUMNS (
    'DimProducts'[ProductId],
    "Sales Amount",
        SUMX (
            FILTER (
                CROSSJOIN ( DimProducts, FactSales ),
                DimProducts[ProductId] = FactSales[ProductId]
            ),
            FactSales[Quantity] * FactSales[Price]
        )
)

EVALUATE Results

image

Another way to see the results.

EVALUATE FILTER(CROSSJOIN (
                        DimProducts,
                        FactSales
                    ),
                    DimProducts[ProductId] = FactSales[ProductId]

)

-- Returns as many rows as the FactSales table

image

Working on Table Variables

We can operate on table variables to do further aggregations. For example, to find an average, use the AVERAGEX function.

VAR ResultsTable =
        ADDCOLUMNS (
            SUMMARIZE (
                ProductJoinSales,
                DimProducts[Choice],
                DimProducts[Id]
            ),
            "CalcValue", DIVIDE (
                SUM ( FactSales[Quantity] ),
                COUNT ( FactSales[StoreID] )
            )
        )

VAR Result = AVERAGEX (ResultsTable, [CalcValue]) 

Note: use the ROW function to display the resulting scalar value.

EVALUATE ROW("Average", Result)

OR Condition in Filter

OR conditions in filters are pretty easy: DAX uses the same || operator as C#.

FILTER ( FactSales, FactSales[Price] > 0 || FactSales[Quantity] > 0)

Reference

https://azure-arena.com/2018/08/03/my-dax-days/

https://azure-arena.com/2018/08/01/sql-tables-used-in-dax/

Azure VM–Save Cost by Auto-Shutdown

An Azure VM definitely gives a lot of flexibility, such as:

  • High-end Hardware Configuration
  • Ready Software
  • High-speed Internet
  • Quick Availability

At the same time, the cost can be quite high for high-end configurations with costly software licenses.

Save Cost

Most of the time, I have noticed that users work on a VM for less than 6 hours per day; the rest of the time it sits idle. That adds up: roughly 18 idle hours a day times 30 days is about 540 billable-but-idle hours per month.

Auto-Shutdown Feature

This is where the Auto-Shutdown feature comes in handy. Here are the steps to use it:

Open the VM blade in the Azure portal and select the Auto-shutdown option.

image

In the blade that appears, choose the Enabled option and set the shutdown time.

image

You can also set an email address to be notified before the shutdown.

Adding Gzip Compression to .Net Core

In this article I would like to explore the use of gzip compression on the server side of a .NET Core web application.

Scenario

I am sending a JSON list of 1,000 items. As an uncompressed response, it takes 1 MB and about 1 second to receive on the client side.

I am using an Azure S2 App Service plan for deployment and testing.

The Challenge

Below is what Chrome's network tab shows for the request.

image

The Solution

The solution uses ASP.NET Core Response Compression. For this we need to add the Microsoft.AspNetCore.ResponseCompression package to the application.

image

The Code

In the Startup.cs file, add the following lines of code.

public void ConfigureServices(IServiceCollection services)
         {
             services.AddMvc();

            services.AddResponseCompression(options =>
             {
                 options.Providers.Add<GzipCompressionProvider>();
                 options.MimeTypes =
                     ResponseCompressionDefaults.MimeTypes.Concat(
                         new[] { "text/json", "application/json" });
             });


            services.Configure<GzipCompressionProviderOptions>(options =>
             {
                 options.Level = CompressionLevel.Optimal;
             });

         }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
         public void Configure(IApplicationBuilder app, IHostingEnvironment env)
         {
            app.UseResponseCompression();

            if (env.IsDevelopment())
             {
                 app.UseDeveloperExceptionPage();
             }

            app.UseMvc();
         }

Now compile, deploy & retest with the Chrome browser.

You can see there is 90% reduction in size of the response!

The response time was also reduced by 70%.

image

The Client Code

HttpClientHandler handler = new HttpClientHandler()
                {
                    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate

               };
                var client = new HttpClient(handler);
                client.BaseAddress = new Uri(URL);
                client.DefaultRequestHeaders.Accept.Clear();
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

               Stopwatch watch = Stopwatch.StartNew();

               HttpResponseMessage response = client.GetAsync("api/kpi/list").Result;
                response.EnsureSuccessStatusCode();

               double ms = watch.ElapsedMilliseconds;

               Console.WriteLine("Elapsed Milliseconds: " + ms.ToString());

Summary

The above code shows the components & code required to add JSON compression to your .Net Core application.

Azure Database – Service Tiers–Performance

In this post we can explore the advantages of Azure SQL Database service tier scalability.

Scalability

Azure provides the flexibility to scale up or down based on demand.

Create a Database

Create an Azure SQL database with the default options. It will be created in the Standard service tier.

image

Performance Test

You can create a table with 10 columns and insert 10 lakh (1 million) records.

It took around 2 hours for me.

Scale Up

Now we need to Scale Up and Test.

Choose the database Configuration as shown below.

image

In the window that appears, choose Premium.

Choose the 500 DTU option and click the Apply button. Wait a few minutes for the upgrade to complete.

image

DTU

DTU stands for Database Transaction Unit, a blended measure of CPU, memory, and I/O. The higher the DTU, the higher the performance.

Re-run the Test

Now run the same query again. Performance improved by 50%, completing in 1 hour.

Summary

Increasing the service tier / DTU gives a quick performance boost during peak hours.

Note

Do not forget to scale down once you are done; otherwise the bills will be huge as well.

Mobile App Advantages

In this post we can consider the advantages of the Mobile Apps feature of Azure App Service.

Offline Storage

Offline storage provides local data storage when there is no internet connection.

Improved Responsiveness

Critical applications that require read/write responses within 1 second can achieve this through offline storage, which makes reads and writes seamless.

Push Notifications

Azure Notification Hubs supports sending messages to multiple devices and platforms: Android, iOS, Windows, etc.

Identity Management

Azure AD support enables easier security integration.

Social Media Integration

Facebook and Twitter integration for federated authentication, posting, etc.

Scalability

Large numbers of devices can be supported through the Auto Scale feature.

Staged Deployments

We can do staged deployments, where switching to production is done through a slot swap.

Continuous Integration

Continuous integration with Visual Studio and GitHub is possible.

Create Mobile App Service

We can create a Mobile App service from the Azure portal, as shown below.

image

High Availability for Azure Deployments

The following are high-availability strategies to consider when developing Azure applications.

Multiple Instances

Ensure that application components and services run with multiple instances.

Retry Pattern

For transient failures that should not last long, the application should implement the Retry pattern to repeat the call until it succeeds.

Circuit Breaker Pattern

For prolonged failures, the application should implement the Circuit Breaker pattern so that repeated failing calls do not hang the application. Once the failure is resolved, the circuit breaker closes again and lets calls through.

Message Broker

Implement a message broker for highly critical tasks to ensure that queued tasks are reliably processed.

Automate Deployments

Consider using automated deployments, including for updates.

Deployment Slots

Using deployment slots enables switching between production and staging deployments.

Geo-replicate Database

Geo-Replicate the database for higher availability.

Monitoring Systems

Implement proper monitoring systems to track failures, send notifications on errors, flag long-running tasks, etc.

Disaster Recovery

Create a multi-site recovery plan for mission critical applications.