Dataflows enhanced compute engine – will it fold?

I’ve been seeing more questions lately about the dataflows enhanced compute engine in Power BI. Although I published a video overview and although there is feature coverage in the Power BI documentation, there are a few questions that I haven’t seen readily answered online.

A lot of these questions can be phrased as “what happens when I turn on the enhanced compute engine for my Power BI Premium capacity?”

Most of my responses start with the phrase “it depends” – so let’s look at some of the factors the answer depends on.

First, let’s look at a set of connected dataflows in a Premium workspace:

This workspace has three dataflows we’re interested in:

  1. A “staging” dataflow that contains un- or minimally-transformed data from the external data source or sources
  2. A “cleansed” dataflow that uses linked and computed entities to apply data cleansing logic to the data in the staging dataflow
  3. A final dataflow that uses linked and computed entities to transform the cleansed data into a star schema for analysis[1]

Next, let’s look at the settings for one of these dataflows:

For a given dataflow, the owner can configure the enhanced compute engine to be Off/Disabled, On/Enabled, or "Optimized", which means that the Power BI service will turn it on or off for entities in the dataflow depending on how each entity is used by other dataflows.

This dataflow-level setting, combined with how the dataflows are used, determines whether the enhanced compute engine is enabled on entities in the dataflow:

  • Disabled: The enhanced compute engine is not enabled for any entities in the dataflow
  • Optimized: The enhanced compute engine is enabled for any entity in the dataflow that is referenced by a linked entity in another dataflow
  • On: The enhanced compute engine is enabled for all entities in the dataflow

Important point: To connect to a dataflow using DirectQuery you must explicitly set the enhanced compute engine to “On” even if entities in the dataflow are being referenced by linked entities.
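These enablement rules, including the DirectQuery exception, can be sketched as a small Python function. This is purely illustrative: the function names and parameters are my own invention for this sketch, not an actual Power BI API.

```python
def engine_enabled_for_entity(dataflow_setting, referenced_by_linked_entity):
    """Return True if the enhanced compute engine is enabled for an entity.

    dataflow_setting: "Disabled", "Optimized", or "On" (the dataflow-level setting).
    referenced_by_linked_entity: True if a linked entity in another dataflow
    references this entity.
    """
    if dataflow_setting == "On":
        return True  # enabled for all entities in the dataflow
    if dataflow_setting == "Optimized":
        # enabled only for entities referenced by linked entities elsewhere
        return referenced_by_linked_entity
    return False  # "Disabled": never enabled


def directquery_supported(dataflow_setting):
    # DirectQuery requires the explicit "On" setting, even if entities
    # are already referenced by linked entities under "Optimized"
    return dataflow_setting == "On"
```

Running the three dataflows from the example through `engine_enabled_for_entity` with the default "Optimized" setting reproduces the behavior described below: only entities that other dataflows link to get the engine.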

Now let’s put these two factors together, considering that all three dataflows in the first image are using the default setting: Optimized.

  1. The “staging” dataflow uses the enhanced compute engine for any entities that are referenced by linked entities in another dataflow
  2. The “cleansed” dataflow uses the enhanced compute engine for any entities that are referenced by linked entities in another dataflow
  3. The final dataflow does not use the enhanced compute engine because none of its entities is referenced by linked entities in another dataflow[2]

Now we’re ready to revisit the actual question we’re trying to answer[3]: “what happens when I turn on the enhanced compute engine for my Power BI Premium capacity?”

Once the enhanced compute engine is enabled in the Power BI Premium capacity settings, and the dataflow settings and configuration (as illustrated above) dictate that the engine is used for a given dataflow, this is what happens:

  • When the dataflow is refreshed, the Power Query for each entity is executed, and the output of the query is persisted in the dataflow’s CDM folder as CSV data and JSON metadata
  • For any entity for which the enhanced compute engine is enabled, the output of the entity’s Power Query is also loaded into a table in a SQL database instance managed by the Power BI service

This second bullet is where the magic happens. Because the data is now in a storage format that includes a compute engine, supported queries can use the SQL cache instead of the underlying CSV files, and get the increased performance that comes with query folding.

Having the data in SQL also means that the dataflow can serve as a DirectQuery data source – without the enhanced compute engine a dataflow can only be used as an Import mode data source.

The next logical question is “what exactly do you mean by supported queries?”

These queries are supported, meaning that they will use the SQL data if the enhanced compute engine is enabled for a dataflow:

  • Dataflow refresh against a dataflow with the enhanced compute engine enabled – for example, the “Cleansed” dataflow or “Final” dataflow in the image above
  • Authoring in Power BI Desktop, when using DirectQuery mode
  • User activity in the Power BI service, when using DirectQuery mode

These queries are not supported, and will always use the CSV data even if the enhanced compute engine is enabled for a dataflow:

  • Editing a dataflow in Power Query Online
  • Authoring in Power BI Desktop, when using Import mode
  • Dataset refresh in the Power BI service, when using Import mode
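Putting the two lists together, the "will this query hit the SQL cache?" decision can be sketched the same way. Again, this is just an illustration – the scenario names are ones I made up for this sketch:

```python
# Scenarios that read from the SQL cache vs. the CSV data, assuming the
# enhanced compute engine is enabled for the dataflow being queried.
USES_SQL_CACHE = {
    "dataflow_refresh_downstream": True,   # refresh of a dataflow that links to this one
    "desktop_authoring_directquery": True,
    "service_activity_directquery": True,
    "power_query_online_editing": False,   # always reads the CSV data
    "desktop_authoring_import": False,
    "dataset_refresh_import": False,
}


def query_source(scenario, engine_enabled):
    """Return which storage a given query scenario reads from."""
    if engine_enabled and USES_SQL_CACHE.get(scenario, False):
        return "sql"
    return "csv"
```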

The final question[4] is “what Power Query operations will actually fold against the SQL data and take full advantage of the compute capabilities?”

The answer is in this blog post from Cristian Petculescu, the architect of dataflows and much of Power BI. Cristian enumerates well over 100 M constructs and the SQL to which they fold so I’m not going to try to summarize them all here. Go take a look for yourself if you need more information than what’s in this post.

Was this helpful? Did you learn anything new?

If you have any more questions about the enhanced compute engine, please let me know!


[1] This pattern of separating dataflows based on the logical role of the data preparation logic they implement is a general best practice, in addition to aligning well with dataflows best practices.

[2] If we wanted the enhanced compute engine to be used for this final dataflow even though it is not referenced by any linked entities, we would need to change the setting from “Optimized” to “On.”

[3] Did you still remember what the question was? I copied it here because I’d forgotten and was concerned you might have too. It’s September 2020 and the world is on fire, which can make it surprisingly difficult to think…

[4] Which almost no one ever actually asks me, but which fits into the theme of the post, so I’m including it here for completeness.

Data Culture: Now you’re thinking with portal

In an ideal world, everyone knows where to find the resources and tools they need to be successful.

We don’t live in that world.

I’m not even sure we can see that world from here. But if we could see it, we’d be seeing it through a portal[1].

One of the most common themes from my conversations with enterprise Power BI customers is that organizations that are successfully building and growing their data cultures have implemented portals where they share the resources, tools, and information that their users need. These mature companies also treat their portal as a true priority – the portal is a key part of their strategy, not an afterthought.

This is why:

In every organization of non-trivial size there are obstacles that keep people from finding and using the resources, information, and data they need.

Much of the time people don’t know what they need, nor do they know what’s available. They don’t know what questions to ask[2], much less know where to go to get the answers. This isn’t their fault – it’s a natural consequence of working in a complex environment that changes over time on many different dimensions.

As I try to do in these accompanying-the-video blog posts, I will let the video speak for itself, but there are a few key points I want to emphasize here as well.

  1. You need a place where people can go for all of the resources created and curated by your center of excellence
  2. You need to engage with your community of practice to ensure that you’re providing the resources they need, and not just the resources you think they need
  3. You need to keep directing users to the portal, again and again and again, until it becomes habit and they start to refer their peers

The last point is worth emphasizing and explaining. If community members don’t use the portal, it won’t do what you need it to do, and you won’t get the return you need on your investments.

Users will continue to use traditional “known good” channels to get information – such as sending you an email or IM – if you let them. You need to not let them.


[1] See what I did there?

[2] Even though they will often argue vehemently against this fact.

Standardizing on Power BI

Last week’s post on migrating to Power BI was intended to be a stand-alone post, but this excellent comment from Matthew Choo via LinkedIn made me realize I had more to say.

I could not agree more, Matthew![1]

In my conversations with leaders from enterprise Power BI customers, if they mention to me that they’re standardizing on Power BI as their analytics tool and platform, I try to ask two questions:

  1. Why?
  2. Why now?

The answer to the first question is almost always a combination of reasons from the previous post on migration and the fact that the customer sees Power BI as the best choice for the organization’s needs moving forward. There’s very little variation in the answers over dozens of conversations.

The answers to the second question are more diverse, but there are some common themes that I hear again and again.

One theme is that the selection of multiple tools happened organically. Sometimes a new BI tool is adopted through a merger or acquisition. Sometimes a new CIO or other senior leader mandates the adoption of their favorite tool, and as leaders change the old tools are left behind while the new tools are added to the stable. Sometimes a key data source only works well with the data source vendor’s reporting tool because the vendor refuses to open their APIs to 3rd parties.[3] Often the plan has been to eliminate excess tools at some point, but the point hasn’t been now… until now.

Very often the factor that makes standardization a priority is pain, which brings me back to a point I made in the introductory post in the “building a data culture” series:

I strongly believe that pain is a fundamental precursor to significant change. If there is no pain, there is no motivation to change. Only when the pain of not changing exceeds the perceived pain of going through the change will most people and organizations consider giving up the status quo.

The most common general reasons for this “pain balance” shifting[4] involve money. Organizations need to eliminate inefficiencies so they can invest in areas of strategic importance. A leader may proactively seek out inefficiencies to eliminate, but it’s more typical for me to hear about external market pressures[5] necessitating a more reactive consolidation.

The other common theme in responses to the question "why now?" is that the organization has a new chief data officer, that the CDO is focused on building a data culture, and for the reasons listed in last week's post has made consolidation a priority.

What’s interesting about this theme is that the hiring of a CDO is part of a larger strategic shift in how the organization thinks about data. The C-level executives know that they need to start treating data as a strategic asset, and they realize that it’s too important to be rolled up into the responsibilities of the CIO or another existing leader. Very often, they already have a good idea of the types of changes that need to happen, but want to hire a senior leader who will make the final decisions and own the actions and their outcomes. In other words, the hiring of a CDO is often an early-but-lagging indicator that there’s executive support for a data culture. That’s a good thing.

Before making a big change, it's always important to understand what you hope to achieve, but it's also important to take a little time to examine how you got to the point where change is necessary, so you can better avoid the mistakes of the past…


[1] Usually when I say this I’m agreeing with myself[2] so it’s nice to be referring to a different Matthew here.

[2] Not that I agree with myself very often, mind you.

[3] Yeah, you know who I’m talking about.

[4] I wasn’t expecting to coin this phrase, but as soon as I typed it, I loved it. I think I may quit my day job and start a new business as a consultant delivering expensive “Shifting the Pain Balance(tm)” workshops for chief data officers and other senior executives.

[5] Since it’s still 2020, I should point out that COVID-19 has been at the root of many external market pressures. I’ve heard dozens of companies say that they’re increasing their investment in building a data culture because of pandemic-induced challenges.

Data Culture: Showcasing the Art of the Possible

The last post and video in this series looked at the broad topic of training. This post looks at a specific aspect of this topic: letting people know what is possible, and sparking their imagination to do amazing things.

A lot of content and training materials will focus on capabilities: here is a feature, this is what it does, and this is how you use it. Although this type of content is important, it isn’t enough on its own to accelerate the growth of a data culture.

The most successful organizations I’ve worked with have included in their community of practice content specifically targeting the art of the possible. This might be a monthly presentation by community champions across the business. It might be someone from the center of excellence highlighting new features, or the integration between features and tools. The most important thing is planting the seed of an idea in the minds of people who will say “I had no idea you could do that!”

My colleagues Miguel and Chris are some of my greatest personal sources of inspiration for building reports[1] because each of them does amazing things with Power BI that make it powerful, usable, and beautiful – but they’re just two of the many people out there showing me new techniques and possibilities.

Who will you inspire today?


[1] And by now you probably realize that I need all the inspiration I can get for anything related to data visualization.

Migrating to Power BI

One aspect of building a data culture is selecting the right tools for the job. If you want more people working with more data, giving them the tools they need to do that work is an obvious[1] requirement. But how many tools do you need, and which tools are the right tools?

Migrating to the cloud

It should be equally obvious that the answer is “it depends.” This is the answer to practically every interesting question. The right tools for an organization depend on the data sources it uses, the people who work with that data, the history that has gotten the organization to the current decision point, and the goals the organization needs to achieve or enable with the tools it selects.

With that said, it’s increasingly common[2] to see large organizations actively working to reduce the number of BI tools they support[3]. The reasons for this move to standardization are often the same:

  • Reduce licensing costs
  • Reduce support costs
  • Reduce training costs
  • Reduce friction involved in driving the behaviors needed to build and grow a data culture

Other than reducing the licensing costs[4], most of these motivations revolve around simplification. Having fewer tools means learning and using fewer tools. It means everyone learning and using fewer tools, which often results in less time and money spent to get more value from the use of those tools.

One of the challenges in eliminating a BI tool is ensuring that the purpose that tool fulfilled is now effectively fulfilled by the tool that replaces it. This is where migration comes in.

The Power BI team at Microsoft has published a set of guidance articles focused specifically on migrating from other BI tools to Power BI.

This documentation was written by the inestimable Melissa Coates of Coates Data Strategies, with input and technical review by the Power BI customer advisory team. If you’re preparing to retire another BI tool and move its workload to Power BI – or if you’re wondering where to start – I can’t recommend it highly enough.


[1] If this isn’t obvious to a given organization or individual, I’m reasonably confident that they’re not actively trying to build a data culture, and not reading this blog.

[2] I’m not a market analyst but I do get to talk to BI, data, and analytics leaders at large companies around the world, and I suspect that my sample size is large and diverse enough to be meaningful.

[3] I’m using the word “support” here – and not “use” – deliberately. It’s also quite common to see companies remove internal IT support from deprecated BI tools, but also let individual business units continue to use them – but also to pay for the tools and support out of their own budgets. This is typically a way to allow reluctant “laggard” internal customer groups to align with the strategic direction, but to do it on their own schedules.

[4] I’m pretty consistent in saying I don’t know anything about licensing, but even I understand that paying for two things costs more than paying for one of those things.

Data Culture: Training for the Community of Practice

The last few posts and videos in this series have introduced the importance of a community where your data culture can grow, and ways to help motivate members of the community, so your data culture can thrive.

But what about training? How do we give people the skills, knowledge, and guidance that they need before they are able to work with data and participate in the data culture you need them to help build?

Training is a key aspect of any successful data culture, but it isn’t always recognized as a priority. In fact the opposite is often true.

I’ve worked in tech long enough, and have spent enough of that time close to training to know that training budgets are often among the first things cut during an economic downturn. These short-term savings often produce long-term costs that could be avoided, and more mature organizations are beginning to realize this.

In my conversations with enterprise Power BI customers this year, I’ve noticed a trend emerging. When I ask how the COVID-19 pandemic is affecting how they work with data, I hear “we’re accelerating our efforts around self-service BI and building a data culture because we know this is now more important than ever” a lot more than I hear “we’re cutting back on training to save money.” There’s also a clear correlation between the maturity of the organizations I’m talking with and the response I get. Successful data cultures understand the value of training.

I’ll let the video speak for itself, but I do want to call out a few key points:

  1. Training on tools is necessary, but it isn’t enough. Your users need to know how to use Power BI[1], but they also need to know how to follow organizational processes and work with organizational data sources.
  2. Training material should be located as close as possible to where learners are already working – the people who need it the most will not go out of their way to look for it or to change their daily habits.
  3. There is a wealth of free Power BI training available from Microsoft (link | link | link) as well as a broad ecosystem of free and paid training from partners.

The most successful customers I work with use all of the resources that are available. Typically they will develop internal online training courses that include links to Microsoft-developed training material, Microsoft product documentation, and community-developed content, in a format and structure[2] that they develop and maintain themselves, based on their understanding of the specific needs of their data culture.

Start as small as necessary, listen and grow, and iterate as necessary. There’s no time like the present.


[1] Or whatever your self-service BI tool of choice may be – if you’re reading this blog, odds are it’s Power BI.

[2] I’m tempted to use the term “curriculum” here, but this carries extra baggage that I don’t want to include. Your training solution can be simple or complex and still be successful – a lot of this will depend on your company culture, and the needs of the learners you’re targeting.

Webcast: Patterns for adopting dataflows in Power BI

I haven’t been posting a lot about dataflows in recent months, but that doesn’t mean I love them any less. On Wednesday September 23rd, I’ll be sharing some of that love via a free webcast hosted by the Istanbul Power BI user group[1]. You can sign up here.

In this webcast I’ll be presenting practices for successfully incorporating dataflows into Power BI applications, based on my experience working with enterprise Power BI customers. If you’re interested in common patterns for success, common challenges to avoid, and answers to the most frequently asked dataflows questions, please sign up today.

This webcast won’t cover dataflows basics, so if you’re new to dataflows in Power BI or just need a refresher, please watch this tutorial before joining!


[1] In case it needs to be said, yes, the session will be delivered in English.

Sometimes culture is life or death

Back in January I shared a video that wasn’t technically about data culture, but which I believed was a near-perfect analogy for the evolution of a data culture. Now I’d like to share another one. It’s a short and thoughtful six minute video that I hope you’ll take the time to watch.

Consider this question from the video: “How much freedom is too much? How much is not enough?” Then consider the answer: it depends.

In the first post in my data culture series, I included this footnote:

I strongly believe that pain is a fundamental precursor to significant change. If there is no pain, there is no motivation to change. Only when the pain of not changing exceeds the perceived pain of going through the change will most people and organizations consider giving up the status quo. There are occasional exceptions, but in my experience these are very rare.

Bermuda changed, because the pain of not changing was too great. They realized that the traditional, centralized approach[1] would not work for them, so they developed a distributed, decentralized approach that would work.

This change meant that individuals needed to do some of the things that most of us would expect a government agency to do. This change meant that individuals gave up some freedom that most of us[2] have always taken for granted.

This change also kept those individuals from dying.


If you skipped over the video and just read to this point, please go back up and watch it now. Go. Listen to the words about Bermuda, and think about how your organization uses data. Think about how hard change is – who accepts it, and who pushes back.

Evan Hadfield, the young man behind the Rare Earth channel on YouTube, touches on a lot of the nuance and balance and conflict that makes culture change so difficult. A lot of his videos touch on painful historical topics which he explores and questions, but often without answers to those questions. I love it[3], and watch every video he releases. If you like this blog for more than just the data stuff, odds are you’ll love it too.


[1] For them it was about water management. For you it might be about data. Work with me here.

[2] If you have a homeowners association that mandates and restricts the exterior of your home, you may be in the exception on this one.

[3] I first discovered the channel when YouTube recommended this video. I ignored it for weeks, but when I finally gave in and watched it the first time I was instantly hooked.

Data Culture: Motivation and Encouragement

The last post in our ongoing series on building a data culture focused on the importance of community, and on ways organizations can create and promote successful communities around data. But while a community is where the data culture can grow, how can you motivate people to participate, to contribute, and to be part of that growth?

Business intelligence is more about business than it is about technology, and business is really about people. Despite this, many BI professionals focus their learning largely on the technology – not the people.

Do you remember the first time you were involved in a performance tuning and optimization effort? The learning process involved looking at the different parts of the tech stack, and in understanding what each part does, how it does it, and how it relates to all of the other parts. Only when you understood these “internals” could you start looking at applying your knowledge to optimizing a specific query or workload.

You need to know how a system works before you can make it work for you. This is true of human systems too.

This video[1] looks at motivation in the workplace, and how you can motivate the citizen analysts in your data culture to help it – and them – grow and thrive. If you think about these techniques as “performance tuning and optimization” for the human components in a data culture, you’re on the right track.

This image makes a lot more sense after you’ve watched the video – I promise

People are motivated by extrinsic motivators (doing something to get rewards) and intrinsic motivators (doing something because doing it makes them happy)[2], and while it’s important to understand both types of motivators, it’s the intrinsic motivators that are more likely to be interesting – and that’s where we spend the most time in the video.

When you’re done with the video, you probably want to take a moment to read this Psychology Today article, and maybe not stop there. People are complicated, and if you’re working to build a data culture, you need to understand how you can make people more likely to want to participate. Even with an engaged executive sponsor, it can be difficult to drive personal change.

In my personal experience, task identity and task significance are the biggest success factors when motivating people to contribute in a data culture. If someone knows that their work is a vital part of an important strategic effort, and if they know that their work makes other people’s lives better, they’re more likely to go the extra mile, and to willingly change their daily habits. That’s a big deal.


[1] If you’re not old enough to recognize the opening line in the video, please take a moment to appreciate how awesome commercials were in the 1980s.

[2] Yes, I’m oversimplifying.

Representation and visibility

tl;dr: We need more representation in tech, in part because career opportunities come from kicking ass where people can see you, and that comes from knowing that you belong, and that knowing comes from seeing people like you already belonging.


Representation is a complex topic. Being a straight, white, cisgender, American man, I don’t have a lot of personal experiences with the lack of representation. But I do have one, and it involves swords.

Back in 2004 a friend of mine sent me a copy of the first edition of The Swordsman’s Companion by Guy Windsor. At this point I had no idea that people were recreating medieval martial arts and fighting with steel weapons[1], so I read through the book and eventually sold it at a yard sale. It was interesting, and I loved the idea of being able to do what the people in the book were doing, but deep inside I knew that people like me didn’t actually do things like that. I was wrong, of course, but at the time the totality of my life experience told me I was right, and I simply didn’t question this knowledge.[2] Ten years passed before this opportunity presented itself again – ten years in which I could have been studying, practicing, competing, improving.

Now take this almost-trivial example and apply it to your career. What if you had never seen someone like yourself in a job role? How would you know that this was something that you could do, that this was a path that was open for you to travel?

When I look back on my career in IT, I can see tipping points – places where everything changed, and my life was forever improved. All but one of them (we’ll come back to this one) happened because someone saw me kicking ass and said “I want you to come kick ass with me.”

  • In 1996 when Dave saw me excelling as an applications trainer and offered me the opportunity to become a technical trainer and MCT.
  • In 1997 when Jeff saw me teaching Windows NT networking and offered me a job with a much higher salary and responsibilities that included consulting and application development.
  • In 2003 when Carolyn offered me a full-time contract as I was leaving my then-collapsing employer and starting my own consultancy, and when I simply didn’t know how I would pay my mortgage in six months, or pay the hospital bills for my impending second child.
  • In 2005 when Corey saw me teaching a beta SQL Server 2005 course, and said “you need to come work at TechEd this year.”
  • In 2007 when I quoted Ted almost double my then-standard daily rate to get him to quit nagging me about working for him, and he instantly agreed and asked how soon I could start.
  • In 2008 when Ken and Dave told Shakil that I was the only person who could do the job he needed done, and he ended up offering me my first job at Microsoft.
  • In 2011 when Matt learned I was looking for a new role and said “we have an open PM position on the SSIS team – let me introduce you to the hiring manager so you can see if you want to apply.”
  • In 2017 when Kasper kept subtly mentioning how much he loved the team he was on, until I finally figured out he wanted me to join it.

This is my story, and I’m including names both to say “thank you” and to make it real – these are the people who enabled me to be who I am today, where I am today, doing what I love so much today. I’ve heard variations on this story from many of my colleagues and peers, but this one is mine.

Please don’t get me wrong – I don’t believe anyone was giving me any handouts. At every step along the way I was doing the work – I was working hard, pushing, trying, struggling to kick as much ass as I possibly could. I was really good, because I had studied and practiced and put in the long hours to learn and improve – but without these people seeing and recognizing my hard work, the opportunities simply would not have existed for me. At any of these tipping points, if I hadn’t been where these people could see me, the moment would have passed, and the opportunity would have been lost. Each opportunity was dependent on the opportunities that came before it – each one depended on my being where I was, being visible and being seen.

This brings me back to that very first tipping point. In 1996 I saw a classified ad[3] for a job teaching people how to use Microsoft Windows and Office, and I applied for it.

And this brings us back to representation.

I applied for that first job because I could see myself in the role – I knew that I could do it. I had no difficulty picturing my 20-something Christian white male self in that role, and neither did the folks doing the hiring.[4]

But what if I was female, or Black, or transgender? What if I looked at an industry that was overwhelmingly male and overwhelmingly white and overwhelmingly cis, and knew that I didn’t belong, because there was no one like me doing things like that? Would I have opened that very first door, on which every later door depended?

I can’t answer that, but looking at the still-white and still-male faces around me every day in tech, I have every reason to believe that many candidates would see this lack of representation and infer from it a lack of opportunity – and never take that first step. And for those who do take the first steps I only need to look around to see the barriers that the tech industry put in place for women and minorities, letting them know every day that they’re not welcome.

Back in July I shared a post that referenced an amazing webcast by UX Designer Jasmine Orange on “Designing for the Ten Percent.” In that post the link was buried in a footnote, so I want to call it out here explicitly – if you made it this far into my post, you really want to watch this webcast. Jasmine’s premise[5] is that by designing for underrepresented users/communities/audiences/people you end up making better products for everyone.

I believe the same is true of tech in general – by building organizations, teams, and cultures that are welcoming to people who have been traditionally excluded, we build organizations, teams, and cultures that are better and more welcoming for people who have never been excluded. Diverse teams are stronger, more resilient, more agile, and more productive. Diverse cultures thrive and grow where monocultures collapse and die.

Why do I care? Does any of this even affect me directly?

Yes, it does.

For one thing, being part of a diverse team means that I will be more likely to be part of a successful team, with the financial rewards that come with success. For another, I feel more welcome and more at home as part of a diverse team.

Still, it’s not about me, is it?

If you’re considering a role in tech and you’re not sure if you should apply for a job, do it. Even if you don’t feel like you’re a perfect fit, or don’t believe you meet every single requirement in the job posting. The white guys are going to apply because they know they belong. You belong too, even if you don’t know it yet.

If you’re coming into tech from an underrepresented minority, you not only belong, you’re vitally needed. You bring something that no one else can bring – your background, your experiences, your perspectives – all the lessons you learned the hard way that an all-male, all-white team might pay an expensive consultant to tell them about after they’ve failed enough times.

Not every employer will recognize this value, but that doesn’t mean it’s not there, or that it’s not real. Some employers will judge every candidate against the “tech bro ideal” and will try to make every hire fit into this mold.

If you’re an employer, don’t be this guy. Recognize the strategic value of having a diverse team. And recognize that this diverse team won’t happen on its own. Support organizations like Black Girls Code, because this is where your hiring pipeline will come from. Value the diverse perspectives that are already represented on your team, and promote them. Give authority and power to people who will hire diverse candidates. Give authority and power to people who look like the people you’re trying to recruit.

And if you’re working in tech today, please be the person who notices. When you see someone kicking ass – doing an amazing job and demonstrating talent and potential to do more – be the person who says something. Even if you can’t see yourself in that person. Even if they don’t look like you and the rest of your team.

Representation is important, because it helps more people know that they can open that first door. Visibility is important, because it provides the opportunity for change and growth – visibility is what opens the second door, and the third, and….

Ok…

I started off writing a completely different post, but this is the one that wanted to come out today. At some point in the next few weeks[6] I’ll follow up with a post on community and how community is a key technique for increasing your visibility, but this post has gotten long enough, and then some.

Thanks for sticking with me – I’d love to hear what you think.


[1] I learned in 2014 when I first saw this video and fell in love.

[2] Underlying this false knowledge was likely the experience of growing up rural and poor, and not having a lot of opportunities as a child, or many experiences as an adult. It wasn’t until 2005 when I first travelled to Europe for a Manowar concert that my mindset started to change, but that’s probably not relevant to this post.

[3] Classified ads are what we called Craigslist before it was LinkedIn. Something like that.

[4] At this point I feel compelled to mention my friend Kathy, who actually got the job I applied for. I didn’t get the job, so I ended up taking a job as a bank teller. When the training center had another opening a few weeks later they called me back to ask if I was still available, and I said yes. True story. Hi Kathy!

[5] More accurately, this is my interpretation of Jasmine’s premise.

[6] LOL @ me, and LOL @ you if you haven’t learned by now not to trust any predictions of future output on my part. We should both know better by now.