You can’t avoid problems you can’t see

The last post was about the dangers inherent in measuring the wrong thing – choosing a metric that doesn’t truly represent the business outcome[1] you think it does. This post is about different problems – the problems that come up when you don’t truly know the ins and outs of the data itself… but you think you do.

This is another “inspired by Twitter” post – it is specifically inspired by this tweet (and corresponding blog post) from Caitlin Hudon[2]. It’s worth reading her blog post before continuing with this one – you go do that now, and I’ll wait.

Caitlin’s ghost story reminded me of a scary story of my own, back from the days before I specialized in data and BI. Back in the days when I was a werewolf hunter. True story.

Around 15 years ago I was a consultant, working on a project with a company that made point-of-sale hardware and software for the food service industry. I was helping them build a hosted solution for above-store reporting, so customers who had 20 Burger Hut or 100 McTaco restaurants[3] could get insights and analytics from all of them, all in one place. This sounds pretty simple in 2020, but in 2005 it was an exciting first-to-market offering, and a lot of the underlying platform technologies that we can take for granted today simply didn’t exist. In the end, we built a data movement service that took files produced by the in-store back-of-house system and uploaded them over a shared dial-up connection[4] from each restaurant to the data center where they could get processed and warehoused.

The analytics system supported a range of different POS systems, each of which produced files in different formats. This was a fun technical challenge for the team, but it was a challenge we expected. What we didn’t expect was the undocumented failure behavior of one of these systems. Without going into too much detail, this POS system would occasionally produce output files that were incomplete, but which did not indicate failure or violate any documented success criteria.
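
If I were building that data movement service today, I’d treat every inbound file as suspect until it proved otherwise. Here’s a minimal sketch of what that kind of defensive validation might look like – the CSV layout, record types, and field names below are invented for illustration, since the real file formats were proprietary (and, as we learned, not fully documented):

```python
# Hypothetical sketch: cross-check a POS export against its own summary record
# before accepting it for processing. The file format, column names, and the
# idea of a SUMMARY trailer row are illustrative assumptions, not the actual
# system's behavior.
import csv
from decimal import Decimal

def validate_export(path: str) -> list[str]:
    """Return a list of validation problems; an empty list means the file reconciles."""
    problems = []
    detail_count = 0
    detail_total = Decimal("0")
    summary = None

    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["record_type"] == "DETAIL":
                detail_count += 1
                detail_total += Decimal(row["amount"])
            elif row["record_type"] == "SUMMARY":
                summary = row  # trailer record carrying the store's own control totals

    if summary is None:
        problems.append("no summary record found - the file may be truncated")
    else:
        if detail_count != int(summary["expected_count"]):
            problems.append(
                f"record count mismatch: {detail_count} detail rows, "
                f"summary claims {summary['expected_count']}"
            )
        if detail_total != Decimal(summary["expected_total"]):
            problems.append(
                f"amount mismatch: detail rows sum to {detail_total}, "
                f"summary claims {summary['expected_total']}"
            )
    return problems
```

The specific checks matter less than the principle: if a source can silently hand you an incomplete file, the only safe default is to quarantine anything that can’t be reconciled against its own control totals.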

To make a long story short[5], because we learned about the complexities of this system very late in the game, we had some very unhappy customers and some very long nights. During a retrospective we engaged with one of the project sponsors for the analytics solution, because he had – years earlier – worked with the development group that built this POS system. (For the purposes of this story I will call the system “Steve” because I need a proper noun for his quote.)

The project sponsor reviewed all we’d done from a reliability perspective – all the validation, all the error handling, all the logging. He looked at this, then he looked at the project team and he said:

You guys planned for wolves. ‘Steve’ is werewolves.

Even after all these years, I still remember the deadpan delivery of this line. And it was so true.

We’d gone in thinking we were prepared for all of the usual problems – and we were. But we weren’t prepared for the horrifying reality of the data problems that were lying in wait. We weren’t prepared for werewolves.

Digging through my email from those days, I found a document I’d sent to this project sponsor, planning for some follow-up efforts, and was reminded that for the rest of the projects I did for this client, “werewolves” became part of the team vocabulary.

[Screenshot of the follow-up planning document]

What’s the moral of this story? Back in 2008 I thought the moral was to test early and often. Although this is still true, I now believe that what Past Matthew really needed was a data catalog or data dictionary with information that clearly said DANGER: WEREWOLVES in big red letters.
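
Even a lightweight entry would have done the job. Something like this hypothetical record – whether it lives in a purpose-built catalog tool, a wiki, or a humble spreadsheet – captures the tribal knowledge we only stumbled onto by luck. All of the names, fields, and values below are invented for illustration:

```python
# A hypothetical, minimal data dictionary entry - the kind of record that would
# have saved Past Matthew. Every name and value here is invented for illustration;
# use whatever structure your catalog (or spreadsheet) supports.
steve_pos_export = {
    "source": "Steve POS nightly export",
    "owner": "In-store back-of-house team",
    "refresh": "Nightly, uploaded per restaurant over the shared dial-up line",
    "format": "Delimited flat files, one per store per business day",
    "known_issues": [
        "DANGER: WEREWOLVES - files can be silently incomplete. A file that "
        "parses cleanly and meets the documented success criteria is NOT proof "
        "that the day's data is complete. Reconcile against in-store control "
        "totals before loading.",
    ],
    "last_reviewed": "2005-10-31",
}
```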

This line from Caitlin’s blog post could not be more wise, or more true:

The best defense I’ve found against relying on an oral history is creating a written one.

The thing that ended up saving us back in 2005 was knowing someone who knew something – we happened to have a project stakeholder who had insider knowledge about a key data source and its undocumented behavior. What could have been better? Some actual <<expletive>> documentation.

Even in 2020, and even in mature enterprise organizations, having a reliable data catalog or data dictionary that is available to the people who could get value from it is still the exception, not the rule. Business-critical data sources and processes rely on tribal knowledge, time after time and team after team.

I won’t try to supplement or repeat the best practices in Caitlin’s post – they’re all important and they’re all good and I could not agree more with her guidance. (If you skipped reading her post earlier, this is the perfect time for you to go read it.) I will, however, supplement her wisdom with one of my favorite posts from the Informatica blog, from back in 2017.

I’m sharing this second link because some people will read Caitlin’s story and dismiss it because she talks about using Google Sheets. Some people will say “that’s not an enterprise data catalog.” Don’t be those people.

Regardless of the tools you’re using, and regardless of the scope of the data you’re documenting, some things remain universally true:

  • Tribal knowledge can’t be relied upon at any meaningful scale or across any meaningful timeline
  • Not all data is created equal – catalog and document the important things first, and don’t try to boil the ocean
  • The catalog needs to be known by and accessible to the people who need to use the data it describes
  • Someone needs to own the catalog and keep it current – if its content is outdated or inaccurate, people won’t trust it, and if they don’t trust it they won’t use it
  • Sooner or later you’ll run into werewolves of your own, and unless you’re prepared in advance the werewolves will eat you

When I started to share this story I figured I would find a place to fit in a “unless you’re careful, your data will turn into a house when the moon is full” joke without forcing it too much, but sadly this was not the case. Still – who doesn’t love a good data werehouse joke?[6]

Maybe next time…


[1] Or whatever it is you’re tracking. You do you.

[2] Apparently I started this post last Halloween. Have I mentioned that the past months have been busy?

[3] Or Pizza Bell… you get the idea.

[4] Each restaurant typically had a single “data” phone line that used the same modem for processing credit card transactions. I swear I’m not making this up.

[5] Or at least short-ish. Brevity is not my forte.

[6] Or this data werehouse joke, for that matter?

Successfully measuring / measuring success

In 2008 I was hired to solve a problem.

At this point almost 12 years later, the problem itself is no longer relevant[1]. While digging around on an unrelated task today I found this chart, which is. You should look at it now.

[Chart: Before Power BI we had PowerPoint, but data was still Power, even back then]

The scope of the problem is measured by the blue series on this chart. You should look at it again. Just look at it!

Both the blue series and the yellow series are net satisfaction (NSAT) scores. There’s a lot of context behind the numbers[2], but for the purposes of this post let’s say that on this scale anything over 150 is “time for a team party and a big round of bonuses” and anything under 100 is “you probably won’t include this job on your resume, and you’re thinking about this a lot because you’ve been sending your resume out a lot this week.”

There are two stories that leap out from this chart.

The first story is pretty obvious: something changed in FY06. That change had a dramatically negative impact on the blue series, and a small (and probably acceptable) negative impact on the yellow series.

The second story may not be as obvious, but it’s vitally important: the yellow series was being used to track the impact of the change. Something changed in FY06, and the people that made the change were measuring its impact.

They were tracking the wrong thing.

Until I joined the team, no one had a chart like this. It wasn’t that the blue series wasn’t being tracked – it was. It just wasn’t recognized as the true success metric until things were well into resume-polishing territory.[3]


The lesson here isn’t that someone made a bad decision and didn’t realize it. The lesson is that sometimes the metric you’re tracking doesn’t mean what you think it means.

As is the case in my personal story, the problem is usually quite obvious in retrospect, but it’s also usually quite opaque in the moment. Although most large companies have a culture of measurement, it’s more rare to see a culture that consistently questions those measurements. Although this approach may not work for everyone, I recommend using this three-year-old approach to defining your most important metrics.

I don’t mean that the approach is three years old. I mean that you should approach the problem like a three-year-old would: by repeatedly asking “why?”

When someone[4] suggests measuring using a given metric, ask why. “Why do you think this is the right way to measure this thing?” When you get an answer, ask why again. “Why do you believe that?” Keep asking why – the more important the metric, the more times you should ask why and expect to get a well-considered answer[5]. And if the answers aren’t forthcoming or aren’t credible… that is an important point to recognize before you’ve invested too much in a project or solution, isn’t it?


[1] Which is why I’m not going to talk about the problem or the solution here, except in the most general, hand-wavey terms.

[2] You can read this article if you’re curious.

[3] I should also point out that I wasn’t the person who figured out that we’d been measuring the wrong thing. The person who hired me had figured it out, which was why I was hired. Credit where credit is due.

[4] This someone may or may not be you. But definitely question yourself in the same way, because it’s always hardest to see your own biases.

[5] The person who introduced me to this idea called it “five whys” but I wouldn’t read too much into that specific number. He also never explained what he meant by this, and for months I thought he was referring to some five word phrase where each word started with the letter Y. True story.

Viral adoption: Self-service BI and COVID-19

I live 2.6 miles (4.2 km) from the epicenter of the coronavirus outbreak in Washington state. You know, the nursing home that’s been in the news, where over 10 people have died, and dozens more are infected.[1]

As you can imagine, this has started me thinking about self-service BI.

[Screenshot: Where can I find information I can trust?[2]]

When the news started to come out covering the US outbreak, there was something I immediately noticed: authoritative information was very difficult to find. Here’s a quote from that last link.

This escalation “raises our level of concern about the immediate threat of COVID-19 for certain communities,” Dr. Nancy Messonnier, director of the CDC’s National Center for Immunization and Respiratory Diseases, said in the briefing. Still, the risk to the general public not in these areas is considered to be low, she said.

That’s great, but what about the general public in these areas?

What about me and my family?

When most of what I saw on Twitter was people making jokes about Jira tickets[3], I was trying to figure out what was going on, and what I needed to do. What actions should I take to stay safe? What actions were unnecessary or unhelpful?

Before I could answer these questions, I needed to find sources of information. This was surprisingly difficult.

Specifically, I needed to find sources of information that I could trust. There was already a surge in misinformation, some of it presumably well-intentioned, and some from deliberately malicious actors. I needed to explore, validate, confirm, cross-check, act, and repeat. And I was doing this while everyone around me seemed to be treating the emerging pandemic as a joke or a curiosity.

I did this work and made my decisions because I was a highly-motivated stakeholder, while others in otherwise similar positions were farther away from the problem, and were naturally less motivated at the time.[4]

And this is what got me thinking about self-service BI.

In many organizations, self-service BI tools like Power BI will spread virally. A highly-motivated business user will find a tool, find some data, explore, iterate, refine, and repeat. They will work with untrusted – and sometimes untrustworthy – data sources to find the information they need to use, and to make the decisions they need to make. And they do it before people in similar positions are motivated enough to act.

But before long, scraping together whatever data is available isn’t enough anymore. As the number of users relying on the insights being produced increases – even if the insights are being produced by a self-service BI solution – the need for trusted data increases as well.

Where an individual might successfully use disparate unmanaged sources, a population needs a trusted source of truth.

At some point a central authority needs to step up, to make available the data that can serve as that single source of truth. This is easier said than done[5], but it must be done. And this isn’t even the hard part.

The hard part is getting everyone to stop using the unofficial and untrusted sources that they’ve been using to make decisions, and to use the trusted source instead. This is difficult because these users are invested in their current sources, and believe that they are good enough. They may not be ideal, but they work, right? They got me this far, so why should I have to stop using them just because someone says so?

This brings me back to those malicious actors mentioned earlier. Why would someone deliberately share false information about public health issues when lies could potentially cost people their lives? They would do it when the lies would help forward an agenda they value more than they value other people’s lives.

In most business situations, lives aren’t at stake, but people still have their own agendas. I’ve often seen situations where the lack of a single source of truth allows stakeholders to present their own numbers, skewed to make their efforts look more successful than they actually are. Some people don’t want to have to rebuild their reports – but some people want to use falsified numbers so they can get a promotion, or a bonus, or a raise.

Regardless of the reason for using untrusted sources, their use is damaging and should be reduced and eliminated. This is true of business data and analytics, and it is true of the current global health crisis. In both arenas, let’s all be part of the solution, not part of the problem.

Let us be a part of the cure, never part of the plague – we’ll only be remembered for what we create.[6]


[1] Before you ask, yes, my family and I are healthy and well. I’ve been working from home for over a week now, which is a nice silver lining; I have a small but comfortable home office, and can avoid the obnoxious Seattle-area commute.

[2] This article is the best single source I know of. It’s not an authoritative source for the subject, but it is aggregating and citing authoritative sources and presenting their information in a form closer to the solution domain than to the problem domain.

[3] This is why I’ve been practicing social media distancing.

[4] This is where the “personal pandemic parable” part of the blog post ends. From here on it’s all about SSBI. If you’re actually curious, I erred on the side of caution and started working from home and avoiding crowds before it was recommended or mandated. I still don’t know if all of the actions I’ve taken were necessary, but I’m glad I took them and I hope you all stay safe as well.

[5] As anyone who has ever implemented a single source of truth for any non-trivial data domain can attest.

[6] You can enjoy the lyrics even if Kreator’s awesome music isn’t to your taste.

Common sense solutions to simple problems

Have you ever looked at a problem and immediately known how to solve it?

Have you ever seen that there’s already a group of people who are responsible for solving this problem, and although they’re working on a fix they obviously have not seen the common sense solution that is so apparent to you?

[Image: I knew we could park here! Problem solved!]

Yeah[1].

In situations like these, there are three[2] potential reasons for what you’re seeing:

  1. The other people are just too stupid to see the simple and obvious solution that’s been sitting right under their noses all this time.
  2. The other people, due to the time that they have spent closely examining the problem and potential solutions, have realized that the simple and obvious solution doesn’t actually address the problem – even though common sense suggests that it should – or that the solution has unintended side-effects that outweigh its benefits.
  3. The other people are responsible for additional problems and solutions across a broader or more strategic scope, and although the simple solution may make sense in isolation, implementing it has been deferred to enable work on other, higher priority problems.

(At this point I feel compelled to point out that this is something I’ve thought about for decades, and I’ve had a draft blog post on this subject for almost a year. I’m not trying to subtweet[3] anyone. I’ve had several people recently reach out to me to make sure that they’re not the reason I tweet periodic reminders that lazy communication is theft, so a disclaimer here feels appropriate.)

In my experience, many people seem to automatically gravitate to the first choice, and that bias seems exacerbated by differences in perspective. If the observer is in a business group and the solution team is in IT – or vice versa[4] – it’s easy for them to miss the complexity and nuance in something that appears simple and straightforward from the outside.

Part of this challenge is inherent in this type of relationship. It’s natural and appropriate to hide internal complexity from external stakeholders. I’m pretty sure I even learned about this in college… not that I remember college[5].

But part of it is the choice we make when we engage in the cross-team relationship. Every time we see a “simple problem” with an “obvious” or “easy” or “common sense” solution we get to decide how to react. We can assume that we know everything important and that the “other guy” is an idiot… or we can consider the option that there might be something we don’t already know.

The choice is ours.


[1] This is less of an actual footnote than a tool to enforce a dramatic pause as you read. Assuming you read footnotes as you go through the post, rather than reading them all at the end. I may need more telemetry to better understand user behavior. Why aren’t any simple problems actually as simple as they first seem?

[2] If you can think of other reasons that aren’t just variations on these three, I would love to hear about them in the comments. My list used to only include the first two reasons, but over the last few years I’ve added the third.

[3] Sub-blog? Is that a thing?

[4] There are scores of other variations on this theme – I picked this one because it’s likely to be familiar to the people I imagine read my blog.

[5] Maybe this?

Data culture and the centerline

I’m running behind on my own YouTube publishing duties[1], but that doesn’t keep me from watching[2] the occasional data culture YouTube video produced by others.

Like this one:

Ok… you may be confused. You may believe this video is not actually about data culture. This is an easy mistake to make, and you can be forgiven for making it, but the content of the video makes its true subject very clear:

A new technology is introduced that changes the way people work and live. This new technology replaces existing and established technologies; it lets people do what they used to do in a new way – easier, faster, and further. It also lets people do things they couldn’t do before, and opens up new horizons of possibility.

The technology also brings risk and challenge. Some of this is because of the new capabilities, and some is because of the collision[3] between the new way and the old way of doing things. The old way and the new way aren’t completely compatible, but they use shared resources and sometimes things go wrong.

At the root of these challenges are users moving faster than any relevant authorities. Increasing numbers of people are seeing the value of the new technology, assuming the inherent risk[4], and embracing its capabilities while hoping for the best.

Different groups see the rising costs and devise solutions for these challenges. Some solutions are tactical, some are strategic. And eventually some champions emerge to push for the creation of standard solutions. Or standards plural, because there always seems to be more than one of those darned things.

Not everyone buys into the standards at first, but over time the standards are refined and… actually standardized.

This process doesn’t slow down the technology adoption. The process and the standards instead provide the necessary shape and structure for adoption to take place as safely as possible.

With the passage of time, users take for granted the safety standards as much as they take for granted the capabilities of the technology… and can’t imagine using one without the other.

For the life of me I can’t imagine why they kept doubling down on the “lane markings” analogy, but I’m actually happy they did. This approach may get more people paying attention – I can’t find any other data culture videos on YouTube with 488K views…



[1] Part of this is because my wife has been out of town, and my increased parental responsibilities have reduced the free time I would normally spend filming and editing… but it’s mainly because I’m finding that talking coherently about data culture is harder for me than writing about data culture. I’ll get better, I assume. I hope.

[2] In this case, I watched while I was folding laundry. As one does.

[3] Yes, pun intended. No, I’m not sorry.

[4] Either through knowledge or through ignorance.

Tough love in the data culture

When I shared on Twitter the most recent installment in the BI Polar “data culture” series, BI developer Chuy Varela replied to agree with some key points from the video on executive sponsorship… and to share some of his frustrations as well[1].

[Screenshot of the Twitter conversation]

Chuy’s comments made me think of advice that I received a few years back from my manager at the time. She told me:

Letting others fail is a Principal-level behavior.

Before I tie this tough love into the context of a data culture and executive sponsorship, I’d like to share the context in which the advice was given.

At the time I had just finished a few 80-hour work weeks, including working 12+ hour days through a long holiday weekend. Much of that time was spent performing tasks that weren’t my responsibility – another member of the extended team was dropping the ball, and with a big customer-facing milestone coming up I wanted to make everything as close to perfect as possible.

Once the milestone had been met, my manager pulled me aside and let me know how unhappy she was that I had been making myself ill to deliver on someone else’s commitments[4]. She went on to elaborate that because of my well-intentioned extra effort, there were two negative consequences:

  1. The extended team member would not learn from the experience, so future teams were more likely to have the same challenges.
  2. The executive sponsors for the current effort would believe that the current level of funding and support was enough to produce the high quality and timely deliverables that actually required extra and unsustainable work.

She was right, of course, and I’ve added two phrases to my professional vocabulary because of her advice.

The first phrase[5] gets used when someone is asking me to do work:

That sounds very important, and I hope you find the right folks to get it done. It sounds too important for me to take on, because I don’t have the bandwidth to do it right, and it needs to be done right.

The second phrase gets used when I need to stop working on something:

After this transition I will no longer be performing this work. If the work is still important to you, we’ll need to identify and train someone to replace me. If the work is no longer important, no action is required and the work will no longer be done.

Now let’s think about data culture and executive sponsorship in a bottom-up organization. How does a sponsor – or potential sponsor – react when a BI developer or analyst is delivering amazing insights, even though it’s not really their job?

If the sponsor is bought in to the value of a data culture, their reaction is likely to include making it their job, and ensuring that the analyst gets the support and resources to make this happen. This can take many different forms[6], but should always include an explicit recognition of the work and the value it delivers.

And what if the analyst who has enabled more data-driven decisions is ready to move on to new challenges? What happens to the solutions they’ve built and the processes that rely on their work? What then?

Again, if the sponsor is committed to a data culture, then their reaction will involve making sure that it becomes someone’s responsibility to move the analyst’s work forward. They will assign the necessary resources – data resources, human resources, financial resources – to ensure that the data-driven insights continue.

If you’re working on helping to build a data culture from the bottom up, there are a few opportunities to apply this advice to that end. Please understand that there is risk involved in some of these approaches, so be certain to take the full context into consideration before taking action.

  1. Work with your immediate manager(s) to amplify awareness of your efforts. If your managers believe in the work that you’re doing, they should be willing to help increase visibility.
  2. Ensure that everyone understands that your work needs support to be sustained. You’ve done something because it was the right thing to do, but you can’t keep doing it on top of all of your other responsibilities.
  3. When it comes time to change roles, ask who will be taking over the solutions you’ve built in your current role, and how you can transition responsibility to them.

Each of these potential actions is the first step down a more complicated path, and the organization’s response[7] will tell you a lot about the current state of the data culture.

For the first potential course of action, managerial support in communicating the value and impact of current efforts can demonstrate the art of the possible[8] to a potential executive sponsor who may not be fully engaged. If you can do this for your current team or department, imagine what a few more empowered folks could do for the business unit! You see where I’m going…

If this first course of action is the carrot, the other two might be the stick. Not everyone will respond to our efforts the way we might like. If you’re already building these solutions today without any additional funding or support, why would I give you more resources? Right? In these cases, the withdrawal or removal of existing capabilities may be what’s necessary to communicate the value of work that’s already been completed.

Wow, this post ended up being much longer than planned. I’ll stop now. Please let me know what you think!


[1] I don’t actually believe in New Year’s resolutions, but I am now retroactively adding “start a blog post with a sentence that contains six[2] or more hyperlinks” to my goals for 2020[3].

[2] Yes, I had to go back to count them.

[3] Nested footnotes achievement unlocked!

[4] If your reaction at this point is “what sort of manager would yell at you for doing extra work?” the answer is “an awesome manager.” This may have been the best advice I ever received.

[5] I’ve probably never said either phrase exactly like this, but the sentiment is there.

[6] Including promotion, transfer to a new role, and/or adding capacity to the analyst’s team so the analyst can focus more on the new BI work.

[7] Remember that a culture is what people do, not what people say – have I mentioned that before?

[8] The art of the possible is likely to warrant its own post and video before too long, so I won’t go into too much depth here.

Data culture: Executive sponsorship

Continuing our series on data culture, we’re examining the importance of having an executive sponsor. This is one of the least exciting success factors for implementing Power BI and getting more insights from more data to deliver more value to the business… but it’s also one of the most important factors.

Let’s check it out:

Ok, what did we just watch?

This video (and the series it’s part of) includes patterns for success I’ve observed as part of my role on the Power BI CAT team[1], and will complement the guidance being shared in the Power BI Adoption Framework.

The presence of an executive sponsor is one of the most significant factors for a successful data culture. An executive sponsor is:

  • Someone in a position of authority who shares the goal of having important business decisions driven by accurate and timely data
  • A leader who can help remove barriers and make connections necessary to build enterprise data solutions
  • A source of budgetary[2] and organizational support for data initiatives

[Image: Because executives fly on planes, right?]

Without an executive sponsor, the organizational scope of the data culture is often limited by the visibility that departmental BI successes can achieve. The data culture will grow gradually and may eventually attract executive attention… or may not.

Without an executive sponsor, the lifetime of a data culture is often limited by the individuals involved. When key users move to new roles or take on new challenges and priorities, the solutions they’ve developed can struggle to find new owners.

Without an executive sponsor, all of the efforts you take to build and sustain a data culture in your organization will be harder, and will be more likely to fail.

Who is your executive sponsor?

Update: A Twitter conversation about this video sparked a follow-up post. You can check it out here: Tough Love in the data culture.


[1] This is your periodic reminder that this is my personal blog, and all posts and opinions are mine and mine alone, and do not reflect the opinions of my employer or my teenage children.

[2] This aspect of sponsorship is a bigger deal than we’re going to cover in this post and video – organizations fund what’s important to them, and they don’t fund what’s not.

Building a data culture

tl;dr – to kick off 2020 we’re starting a new BI Polar video series focusing on building a data culture, and the first video introduces the series. You should watch it and share it.

Succeeding with a tool like Power BI is easy – self-service BI tools let more users do more things with data more easily, and can help reduce the reporting burden on IT teams.

Succeeding at scale with a tool like Power BI is not easy. It’s very difficult, not because of the technology, but because of the context in which the technology is used. Organizations adopt self-service BI tools because their existing approaches to working with data are no longer successful – and because the cost and pain[1] of change is now outweighed by the cost and pain of staying the course.

Tool adoption may be top-down, encouraged or mandated by senior management as a broad organization-wide effort. Adoption may be bottom-up, growing organically and virally in the teams and departments least well served by the existing tools and processes in place.

Both of these approaches[2] can be successful, and both of these approaches can fail. The most important success factor is a data culture in which the proper use of self-service BI tools can deliver the greatest value for the organization.

The most important success factor is a data culture

[Image: There must be a data culture on the other side of this door.]

Without an organizational culture that values, encourages, recognizes, and rewards users and teams for their use of data, no tool and no amount of effort and skill is enough to achieve the full potential of the tools – or of the data.

In this new video series we’ll be covering practices that will help build a data culture. More specifically, we’ll introduce common practices that are exhibited by large organizations that have mature and successful data cultures. Each culture is unique, but there are enough commonalities to identify patterns and anti-patterns.

The content in this series will be informed by my work with enterprise Power BI customers as part of my day job[3], and will complement nicely[4] the content and guidance in the Power BI Adoption Framework.

Back in November when the 100th BI Polar blog post was published, I asked what everyone wanted to read about in the next 100 posts. There were lots of different ideas and suggestions, but the most common theme was around guidance like this. Hopefully you’ll enjoy the result – and hopefully you’ll let me know either way.


[1] I strongly believe that pain is a fundamental precursor to significant change. If there is no pain, there is no motivation to change. Only when the pain of not changing exceeds the perceived pain of going through the change will most people and organizations consider giving up the status quo. There are occasional exceptions, but in my experience these are very rare.

[2] Including any number of variations – these approaches are common points on a wide spectrum, but should not be interpreted as the only ways to adopt Power BI or other self-service BI tools.

[3] By day I’m a masked crime-fighter. Or a member of the Power BI customer advisory team. Or both. It varies from day to day.

[4] Hopefully this will be true. I’m at least as interested in seeing where this ends up as you are.