BI is dead. Long live BI!

As I was riding the bus home from jury duty the other day[1] I saw this tweet come in from Eric Vogelpohl.

 

There’s a lot to unpack here, and I don’t expect to do it all justice in this post, but Eric’s thought-provoking tweet made me want to reply, and I knew my reply wouldn’t fit into 280 characters… so I’ll tackle some of the more important and interesting elements.

First and foremost, Eric tags me before he tags Marco, Chris, or Curbal. I am officially number one, and I will never let Marco or Chris forget it[2].

With that massive ego boost out of the way, let’s get to the BI, which is definitely dead. And also definitely not dead.

Eric’s post starts off with a bold and simple assertion: If you have the reactive/historical insights you need today, you have enough business intelligence and should focus on other things instead. I’m paraphrasing, but I believe this effectively captures the essence of his claim. Let me pick apart some of the assumptions I believe underlie this assertion.

First, this claim seems to assume that all organizations are “good w/ BI.” Although this may be true of an increasing number of mature companies, in my experience it is definitely not something that can be taken for granted. The alignment of business and technology, and the cultural changes required to initiate and maintain this alignment, are not yet ubiquitous.

Should they be? Should we be able to take for granted that in 2019 companies have all the BI they need? [3]

The second major assumption behind Eric’s first point seems to be that “good w/ BI” today translates to “good w/ BI” tomorrow… as if BI capabilities are a blanket solution rather than something scoped and constrained to a specific set of business and data domains. In reality[4], BI capabilities are developed and deployed incrementally based on priorities and constraints, and are then maintained and extended as the priorities and constraints evolve over time.

My job gives me the opportunity to work with large enterprise companies to help them succeed in their efforts related to data, business intelligence, and analytics. Many of these companies have built successful BI architectures and are reaping the benefits of their work. These companies may well be characterized as being “good w/ BI” but none of them are resting on their laurels – they are instead looking for ways to extend the scope of their BI investments, and to optimize what they have.

I don’t believe BI is going anywhere in the near future. Not only are most companies not “good w/ BI” today, the concept of being “good w/ BI” simply doesn’t make sense in the context in which BI exists. So long as business requirements and environments change over time, and so long as businesses need to understand and react, there will be a continuing need for BI. Being “good w/ BI” isn’t a meaningful concept beyond a specific point in time… and time never slows down.

If your refrigerator is stocked with what your family likes to eat, are you “good w/ food”? This may be the case today, but what about when your children become teenagers and eat more? What about when someone in the family develops food allergies? What about when one of your children goes vegan? What about when the kids go off to college? Although this analogy won’t hold up to close inspection[5] it hopefully shows how difficult it is to be “good” over the long term, even for a well-understood problem domain, when faced with easily foreseeable changes over time.

Does any of this mean that BI represents the full set of capabilities that successful organizations need? Definitely not. More and more, BI is becoming “table stakes” for businesses. Without BI it’s becoming more difficult for companies to simply survive, and BI is no longer a true differentiator that assures a competitive advantage. For that advantage, companies need to look at other ways to get value from their data, including predictive and prescriptive analytics, and the development of a data culture that empowers and encourages more people to do more things with more data in the execution of their duties.

And of course, this may well have been Eric’s point from the beginning…

 


[1] I’ve been serving on the jury for a moderately complex civil trial for most of August, and because the trial is in downtown Seattle during business hours I have been working early mornings and evenings in the office, and taking the bus to the courthouse to avoid the traffic and parking woes that plague Seattle. I am very, very tired.

[2] Please remind me to add “thought leader” to my LinkedIn profile. Also maybe something about blockchain.

[3] I’ll leave this as an exercise for the reader.

[4] At least in my reality. Your mileage may vary.

[5] Did this analogy hold up to even distant observation?

Recipe: Chocolate Espresso Sandwich Cookies

This is one of my favorite cookie recipes. Like my candied orange peel recipe, it is adapted from one originally published in the mid-90s in Chocolatier Magazine. The recipe below is double the volume of the original Chocolatier recipe[1], and if you have a mixing bowl big enough it is easy to double or triple for a truly epic batch.

[Photo: the finished cookies]

Ingredients – Cookies

  • 2 cups all-purpose flour (260 g)
  • 1/2 cup non-alkalized cocoa powder (41 g)
  • 1 teaspoon baking soda
  • 1/2 teaspoon salt
  • 3 1/2 teaspoons instant espresso powder
  • 3 teaspoons vanilla extract
  • 8 ounces unsalted butter, softened (227 g)
  • 1 cup granulated sugar (200 g)
  • 1 cup dark brown sugar (239 g)
  • 2 large eggs, at room temperature

Ingredients – Ganache

  • 10 ounces dark chocolate, finely chopped (284 g)
  • 1/2 cup plus 2 tablespoons heavy cream (142 g)
  • 1 tablespoon instant espresso powder

Procedure

Phase 1: Make the cookies

  • Position one rack in the top third and another in the bottom third of the oven and preheat to 350 Fahrenheit (180 Celsius).
  • In a medium bowl, using a wire whisk, stir together the flour, baking soda, cocoa powder and salt until thoroughly blended.
  • In a small cup, combine the espresso powder and vanilla and stir with a small rubber spatula until the espresso is dissolved.
  • In a large bowl, using a hand-held electric mixer set at medium speed, beat the butter for 30 seconds, until creamy. Scrape down the sides of the bowl.
  • Add both sugars and continue beating for two to three minutes, until the mixture is light in texture and color. Scrape down the sides of the bowl.
  • Beat in the eggs until blended. Scrape down the sides of the bowl.
  • Beat in the espresso mixture. Scrape down the sides of the bowl.
  • At low speed, beat in the flour mixture in three additions, scraping down the sides of the bowl after each addition.
  • Drop the dough by slightly rounded measured teaspoonfuls onto ungreased cookie sheets.
  • Bake the cookies for 7 to 9 minutes, until the edges are very lightly browned but the centers are still slightly soft; switch the position of the baking sheets halfway through the baking time for even browning. (For crisper cookies, bake for 8 to 10 minutes until the centers are no longer soft.)
  • Cool the cookies on the baking sheets for 1 to 2 minutes. Using a metal spatula, transfer the cookies to paper towels to cool completely.

Phase 2: Make the ganache

  • In a food processor fitted with the metal blade, process the chocolate until finely chopped.
  • In a small saucepan, combine the cream and the espresso powder and bring to a boil, stirring to dissolve the espresso powder.
  • With the processor running, add the hot cream mixture to the chopped chocolate and process for 25 to 30 seconds, or until completely smooth.
  • Scrape the ganache into a medium bowl and let stand at room temperature until just slightly set and spreadable, about 30 minutes.

Phase 3: Assemble the sandwich cookies

  • Group the cookies into pairs, matching two cookie rounds of similar shape and size.
  • Spread a gently rounded teaspoonful of the ganache filling onto the bottom of one of the cookies.
  • Top with the matching cookie, right-side-up, and very gently press each sandwich together.
  • Repeat with the remaining cookie pairs until all cookies are made into sandwiches.
  • Let sit at room temperature for about 30 minutes to set the ganache.

Storage

The cookies can be stored in an airtight container for two to three days.


Notes

  • You can also use a stand mixer fitted with the paddle attachment instead of a hand-held electric mixer.
  • If possible, line your cookie sheets with silicone baking mats, or with parchment paper. 

 


[1] I can’t imagine making a batch of any cookies that uses one egg and one stick of butter. That’s not right.

Power BI dataflows, Premium, and ADLSg2

Important: This post was written and published in 2019, and the content below may no longer represent the current capabilities of Power BI. Please consider this post to be an historical record and not a technical resource. All content on this site is the personal output of the author and not an official resource from Microsoft.

I received a question today via Twitter, and although I know the information needed to answer it is available online, I don’t believe there’s a single concise answer anywhere[1]. This is the question, along with a brief elaboration following my initial response:

[Embedded tweet: question about how dataflow storage in ADLSg2 is billed]

Here’s the short answer: When you use an organizational ADLSg2 account to store dataflow data, your Azure subscription will be billed for any storage and egress based on however Azure billing works[2].

Here’s the longer answer:

  • Power BI dataflow data counts against the same storage limits as Power BI datasets. Each Pro license grants 10 GB of storage, and a Premium capacity node includes 100 TB of storage.
  • Integrating Power BI dataflows with ADLSg2 is not limited to Power BI Premium.
  • When you’re using Power BI dataflows in their default configuration, dataflow data is stored in this Power BI storage, and counts against the appropriate quota.
  • When dataflow data is saved to Power BI storage, it can only be accessed by Power BI – no other services or applications can read the data.
  • When you configure your dataflows to use an organizational ADLSg2 account, the dataflow data is saved to the Azure resource you specify, and not to the Power BI storage, so it doesn’t count against the Pro or Premium storage quota. This is particularly significant when you’re not using Power BI Premium, as ADLSg2 storage will scale to support any scenario, and not be limited by the 10 GB Pro storage limit.
  • When dataflow data is saved to ADLSg2, the CDM folders can be accessed by any authorized client via Azure APIs, and by Power BI as dataflow entities. This is particularly valuable for enabling collaboration between analysts and other Power BI users, and data scientists and other data professionals using Azure tools.
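
To make that last point a bit more concrete, here is a minimal sketch of what reading a dataflow’s CDM folder from an organizational ADLSg2 account might look like on the Azure side, using the Python azure-identity and azure-storage-file-datalake packages. The storage account URL, the “powerbi” filesystem name, and the workspace/dataflow path are placeholders you would replace with values from your own tenant, and the parsing assumes the standard CDM folder layout of a model.json file describing entities and their partitions. Treat it as an illustration rather than a reference implementation.

    import json

    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder values: replace with your storage account and CDM folder path.
    ACCOUNT_URL = "https://<your-storage-account>.dfs.core.windows.net"
    FILE_SYSTEM = "powerbi"  # filesystem where dataflow CDM folders are written
    MODEL_PATH = "<workspace name>/<dataflow name>/model.json"

    # Authenticate as an identity that has been authorized to read the CDM folder.
    credential = DefaultAzureCredential()
    service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=credential)

    # Download the model.json file that describes the dataflow's entities.
    file_system = service.get_file_system_client(FILE_SYSTEM)
    model_file = file_system.get_file_client(MODEL_PATH)
    model = json.loads(model_file.download_file().readall())

    # List each entity and the data partitions behind it.
    for entity in model.get("entities", []):
        print(entity["name"])
        for partition in entity.get("partitions", []):
            print("   ", partition.get("location"))

The collaboration scenario in that last bullet follows directly from this: data scientists and other data professionals can read the same files with their Azure tools of choice, while analysts keep working with the same data as dataflow entities in Power BI.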

Hopefully this will help clear things up. If you have any questions, please let me know!


[1] Please note that I didn’t actually go looking to make sure, because I was feeling lazy and needed an excuse to blog about something vaguely technical.

[2] I add that final qualifier because I am not an authority on Azure or Power BI billing, or on licensing of any sort. For any specific information on licensing or billing, please look elsewhere for expert advice, because you won’t find it here.