AI Is Here. What Does It Mean for Your Team?
I just finished a webinar that packed over an hour of AI tips and tricks, and I won’t lie – some of it blew my mind. In minutes, AI generated social media videos from LinkedIn posts; it built simple websites and then “made them even prettier.” And that was just the beginning. When I watched it analyze, in minutes, Excel data that had cost me a year of my life in grad school, I could see how different the future will be for millions of students. As I sit here reflecting on the power of AI, my immediate response is to share what will become most valuable to leaders and teams right here, right now.
You’ve heard some version of the phrase, “In the age of AI, we still need EI (emotional intelligence).” Agreed. But what does that mean for teams in your organization, now and in the very near future?
The Three Most Important Skills for AI-Savvy Teams in the Near Future:
1. Leaders with Good Judgment Matter More Than Leaders with Good Skills: Many leaders these days delegate more and more of the work in the domain of their own skill set. This makes sense: the senior leaders I work with zoom from meeting to meeting, physically some days and by clicking into call after call on others – they don’t have time to do the “technical” work of their function. So the most important leadership skill you need from your team is the ability to understand, or “grok,” enormous amounts of information by asking incisive questions and then making skillful judgments about the meaning of what they are hearing. For your team to be successful, you need to develop these skills in your leaders. Your 1-1s are worthless if you’re not developing their critical thinking skills. I’m not saying you need to adopt the law school practice of the Socratic method, but it’s worth thinking about how you could adapt it within your team.
For help, I asked ChatGPT to describe the Socratic method. Here’s what it shared:
The Socratic method of teaching is based on asking a series of open-ended, probing questions to stimulate critical thinking and encourage deeper understanding. Key components include:
1. **Questioning**: Instead of providing answers, the teacher asks questions that challenge assumptions, clarify concepts, and explore underlying principles.
2. **Dialogical Engagement**: It promotes a collaborative dialogue between teacher and students, where both parties contribute to the discovery of knowledge.
3. **Critical Thinking**: The method encourages students to critically evaluate their beliefs, identify contradictions, and refine their reasoning.
4. **Eliciting Definitions and Clarifications**: Teachers often ask students to define terms or concepts, leading them to refine and solidify their understanding.
5. **Reflection and Self-Examination**: Students are encouraged to reflect on their own thought processes, assumptions, and conclusions, fostering self-awareness and intellectual humility.
The goal is not to impart knowledge directly, but to guide students toward self-discovery and a deeper understanding of complex ideas.
Now let’s imagine replacing “Teacher” with “Leader” and “Students” with “Team.” The result is a leader whose team is thriving in the world of AI. So – what would you have to change in your team meetings to bring more of the Socratic method to your collaborations?
2. Promote Ethics and Integrity: AI is pretty smart at this point. Not only does it work quickly, but there’s no back-talk or attitude. For many leaders out there, AI might be a better direct report than some of their current team. However, the ethics of AI are still quite ambiguous. AI still tends to hallucinate, making things up rather than confessing it doesn’t have enough information. And the ethics built into AI are hotly debated by philosophers – what calculus should a self-driving car make to save the life of a child in the street? A dog? In the philosophers’ famous Trolley Problem, would AI switch the trolley so that it kills one person instead of five? Because AI is always learning, its ethics sit inside a black box, making it hard to imagine AI with integrity. AI is trained on all sorts of data, and much of that data is inherently flawed. That’s why, as we rely on the judgment of our leaders, we want the trust and confidence that our leaders have moral backbones and take ethical stances that honor our organizations as well as our people.
AI is flawed. But so are people. There’s no formula that removes bias and always finds the “right” answer. This makes it even more important to have leaders who can admit and address their biases, who can think through the cause and effect of different scenarios from a logistical as well as an ethical perspective, and who are willing to engage in the hard work of “being” human. Because that’s a team that can use and engage with AI without losing the human compass – a compass you ignore at work only at your peril!
3. Develop Leaders Who Can Manage Through Complexity: Whether you’ve heard that we live in a VUCA (volatile, uncertain, complex, ambiguous) world or a BANI (brittle, anxious, nonlinear, incomprehensible) world, the reality is that you and your leaders experience these factors every day. It could be picking which product updates to invest in, deciding whether another restructuring is worth it, or choosing which L&D programs will be most useful for your workforce – in every case, these are questions that require the nuanced judgment and integrity mentioned above, not simple answers you can type into a Google search.
In the world of complexity, there aren’t “right” solutions to find, there are experiments that move you closer to or further away from your goal and provide learning. You need to make sure that you are deepening the ability of your team to step back, design experiments to understand the complexity in your space, and then process the results of these experiments to learn more and iterate new experiments.
AI can look for patterns and is often better than humans at picking them out, so ideally your team can harness the power of AI pattern recognition, combine it with their own experiments and takeaways, and come up with more educated next steps to take or decisions to make. This blending of AI usage with human discernment is an art, not a science – that’s why in complex marketplaces, you need innovative teams that are willing to make thoughtful guesses and experiment, not employees who just want to be told the right answer. The more you can empower your team to take calculated risks and make thoughtful guesses, the more you are developing a complexity-adept team – just the kind of team that can use AI well alongside its own experience and wisdom.
It’s increasingly common to hear stories of how companies and teams are using AI to automate and augment their work. If you want your team to be using AI well in the future, it’s worth investing now in creating a team that can blend their strengths seamlessly with the ever-increasing capabilities of AI.